Fixing a subset of weights in a neural network during training

Submitted by 浪子不回头ぞ on 2019-12-12 16:49:46

Question


Recently, I have been considering creating a customized neural network. The basic structure is the same as usual, but I want to truncate the connections between layers. For example, if I construct a network with two hidden layers, I would like to delete some weights and keep the others, like the picture below: Structure of customized neural networks (sorry, I cannot embed pictures here, only links).

This is not dropout for avoiding overfitting. The remaining weights (connections) are specified and fixed, and the corresponding structure is designed intentionally.

Is there any way to do this in Python, with TensorFlow, PyTorch, Theano, or any other module?


Answer 1:


Yes, you can do this in TensorFlow.

Somewhere in your TensorFlow code you would have a layer that looks something like this:

m = tf.Variable(tf.random_normal([width, height]), dtype=tf.float32)  # weight matrix
b = tf.Variable(tf.zeros([height]), dtype=tf.float32)                 # bias vector
h = tf.sigmoid(tf.matmul(x, m) + b)

What you want is a new matrix; let's call it k, for kill. It is going to kill specific neural connections. The neural connections themselves are defined in m. This would be your new configuration:

k = tf.constant(kill_matrix, dtype=tf.float32)  # fixed mask of 1s (keep) and 0s (kill)
m = tf.Variable(tf.random_normal([width, height]), dtype=tf.float32)
b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
h = tf.sigmoid(tf.matmul(x, tf.multiply(m, k)) + b)

Your kill_matrix is a matrix of 1's and 0's. Insert a 1 for every neural connection you want to keep and a 0 for every one you want to kill.
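
For completeness, here is a minimal end-to-end sketch of this approach, assuming TensorFlow 1.x; the layer sizes, the particular kill_matrix pattern, and the toy training data are made up purely for illustration:

import numpy as np
import tensorflow as tf

width, height = 4, 3  # input size and layer size (arbitrary for this example)

# 1 keeps a connection, 0 kills it; this particular pattern is arbitrary.
kill_matrix = np.array([[1, 0, 0],
                        [1, 1, 0],
                        [0, 1, 1],
                        [0, 0, 1]], dtype=np.float32)

x = tf.placeholder(tf.float32, [None, width])
y = tf.placeholder(tf.float32, [None, height])

k = tf.constant(kill_matrix, dtype=tf.float32)
m = tf.Variable(tf.random_normal([width, height]), dtype=tf.float32)
b = tf.Variable(tf.zeros([height]), dtype=tf.float32)
h = tf.sigmoid(tf.matmul(x, tf.multiply(m, k)) + b)

loss = tf.reduce_mean(tf.square(h - y))
train_op = tf.train.GradientDescentOptimizer(0.1).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    xs = np.random.rand(8, width).astype(np.float32)   # toy inputs
    ys = np.random.rand(8, height).astype(np.float32)  # toy targets
    sess.run(train_op, feed_dict={x: xs, y: ys})

Because the mask multiplies m inside the graph, the gradient reaching each killed entry of m is multiplied by the same zero, so those entries never receive updates and the deleted connections stay dead throughout training.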
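Since the question also mentions PyTorch, the same masking idea can be sketched there as well. This is not part of the answer above; the module below is a hypothetical example that stores a fixed binary mask as a buffer and applies it in the forward pass:

import torch
import torch.nn as nn

class MaskedLinear(nn.Module):
    def __init__(self, mask):
        super().__init__()
        # mask: float tensor of shape [in_features, out_features] with 1s and 0s
        self.register_buffer("mask", mask)
        self.weight = nn.Parameter(torch.randn(mask.shape))
        self.bias = nn.Parameter(torch.zeros(mask.shape[1]))

    def forward(self, x):
        # Multiplying by the fixed mask zeroes both the killed weights and
        # their gradients, so the deleted connections never come back.
        return torch.sigmoid(x @ (self.weight * self.mask) + self.bias)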



Source: https://stackoverflow.com/questions/43851657/fixing-a-subset-weights-in-neural-network-during-training
