TensorFlow's tf.variable_scope values parameter meaning

Submitted by 爷,独闯天下 on 2020-01-20 08:10:46

Question


I am currently reading the source code of the slim library, which is built on TensorFlow, and it uses the values argument of variable_scope a lot, like here.

From the API page I can see:

This context manager validates that the (optional) values are from the same graph, ensures that graph is the default graph, and pushes a name scope and a variable scope.

My question is: are the variables passed in values only checked to be from the same graph? What are the use cases for this, and why would someone need it?
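
For context, the pattern being asked about looks roughly like this. This is a minimal sketch of a slim-style layer using the TF 1.x API; my_dense is a hypothetical layer name, not taken from the slim source:

```python
import tensorflow as tf  # TF 1.x API

def my_dense(inputs, units, scope=None):
    # The third positional argument is `values`: variable_scope checks
    # that all tensors in it come from one graph and makes that graph
    # the default, so the layer's variables land in the inputs' graph.
    with tf.variable_scope(scope, 'my_dense', [inputs]):
        w = tf.get_variable('w', [inputs.shape[-1], units])
        b = tf.get_variable('b', [units], initializer=tf.zeros_initializer())
        return tf.matmul(inputs, w) + b
```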


Answer 1:


The variable_scope context manager helps ensure the uniqueness of variables, and the reuse of variables where reuse is desired.
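
As an illustration, here is a minimal sketch using the TF 1.x API (the scope and variable names are made up):

```python
import tensorflow as tf  # TF 1.x API

with tf.variable_scope("layer"):
    v1 = tf.get_variable("w", [3])    # creates variable "layer/w"

with tf.variable_scope("layer", reuse=True):
    v2 = tf.get_variable("w", [3])    # fetches the same "layer/w"

assert v1 is v2

with tf.variable_scope("layer"):      # reuse not requested, so ...
    tf.get_variable("w", [3])         # ... raises ValueError: already exists
```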

Yes, if you create two or more different computation graphs, then they won't necessarily share the same variable scope; however, there are ways to share variables across graphs, so the option is there.
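
The values check guards against exactly this kind of mix-up. A minimal sketch of what it catches (TF 1.x API, hypothetical placeholder names):

```python
import tensorflow as tf  # TF 1.x API

g1, g2 = tf.Graph(), tf.Graph()

with g1.as_default():
    x = tf.placeholder(tf.float32, [None, 3], name="x")
with g2.as_default():
    y = tf.placeholder(tf.float32, [None, 3], name="y")

# variable_scope validates that all `values` come from one graph, so
# mixing tensors from g1 and g2 raises a ValueError up front instead
# of producing a silently inconsistent graph.
with tf.variable_scope("mixed", values=[x, y]):
    pass
```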

Primary use cases for variable scope are RNNs, where many of the weights are tied and reused. That's one reason someone would need it. The other main reason is to ensure that you are reusing the same variables when you explicitly mean to, and not by accident. (In distributed settings this can become a concern.)
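
For the RNN case, a minimal sketch of weight tying across time steps (TF 1.x API; rnn_step is a hypothetical cell, and tf.AUTO_REUSE requires TF 1.4+):

```python
import tensorflow as tf  # TF 1.x API

def rnn_step(x, h, scope="cell"):
    # Every call re-enters the same scope, so all time steps share
    # one weight matrix instead of creating a new one per step.
    with tf.variable_scope(scope, reuse=tf.AUTO_REUSE):
        w = tf.get_variable("w", [int(x.shape[-1] + h.shape[-1]), int(h.shape[-1])])
        return tf.tanh(tf.matmul(tf.concat([x, h], axis=1), w))

x_t = tf.placeholder(tf.float32, [1, 8])
h = tf.zeros([1, 16])
for _ in range(3):    # three time steps, one shared "cell/w"
    h = rnn_step(x_t, h)
```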



Source: https://stackoverflow.com/questions/40164583/tensorflows-tensorflow-variable-scope-values-parameter-meaning
