Neural Network sigmoid function

Submitted anonymously (unverified) on 2019-12-03 01:08:02

Question:

I'm trying to make a neural network and I have a couple of questions:

My sigmoid function looks like this:

    s = 1 / (1 + (2.7183 ** (-self.values)))
    if s > self.weight:
        self.value = 1
    else:
        self.value = 0

self.values is the sum of the values of the connected nodes. For instance, the HNs (hidden nodes) in HL (hidden layer) 1 are connected to all the input nodes, so their self.values is sum(inputnodes.values).

The HNs in HL2 are connected to all HNs in HL1, and their self.values is sum(HL1.values).

The problem is that every node ends up with a value of 1, no matter its weight (unless the weight is very high, around 0.90~0.99).

My Neural Network is set like so:

    NeuralNetwork(inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output_nodes)

where inputs is a list of binary values:

Here's a log that shows this behavior.

    >>NeuralNetwork([1,0,1,1,1,0,0],3,3,1)  # 3 layers, 3 nodes each, 1 output
    Layer1
    Node: y1 Sum: 4, Sigmoid: 0.98, Weight: 0.10, self.value: 1
    Node: y2 Sum: 4, Sigmoid: 0.98, Weight: 0.59, self.value: 1
    Node: y3 Sum: 4, Sigmoid: 0.98, Weight: 0.74, self.value: 1
    Layer2
    Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.30, self.value: 1
    Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.37, self.value: 1
    Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.80, self.value: 1
    Layer3
    Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.70, self.value: 1
    Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.56, self.value: 1
    Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.28, self.value: 1

Even if I try using floating-point values in the input, it turns out the same:

    >>NeuralNetwork([0.64, 0.57, 0.59, 0.87, 0.56],3,3,1)
    Layer1
    Node: y1 Sum: 3.23, Sigmoid: 0.96, Weight: 0.77, self.value: 1
    Node: y2 Sum: 3.23, Sigmoid: 0.96, Weight: 0.45, self.value: 1
    Node: y3 Sum: 3.23, Sigmoid: 0.96, Weight: 0.83, self.value: 1
    Layer2
    Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.26, self.value: 1
    Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.39, self.value: 1
    Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.53, self.value: 1
    Layer3
    Node: y1 Sum: 3, Sigmoid: 0.95, Weight: 0.43, self.value: 1
    Node: y2 Sum: 3, Sigmoid: 0.95, Weight: 0.52, self.value: 1
    Node: y3 Sum: 3, Sigmoid: 0.95, Weight: 0.96, self.value: 0

Note that Node y3 in Layer3 is the only one that returned a 0 after the sigmoid.

What am I doing wrong?

Also, is it really necessary to connect every node to every node in the previous layer? Isn't it better to let the connections be random?

EDIT: Forgot to mention, this is an in-development NN; I'll be using a genetic algorithm to train the network.

EDIT2:

    class NeuralNetwork:
        def __init__(self, inputs, num_hidden_layers, num_hidden_nodes_per_layer, num_output):
            self.input_nodes = inputs
            self.num_inputs = len(inputs)
            self.num_hidden_layers = num_hidden_layers
            self.num_hidden_nodes_per_layer = num_hidden_nodes_per_layer
            self.num_output = num_output

            self.createNodes()
            self.weights = self.getWeights()
            self.connectNodes()
            self.updateNodes()

        def createNodes(self):
            self._input_nodes = []
            for i, v in enumerate(self.input_nodes):
                node = InputNode("x" + str(i + 1), v)
                self._input_nodes.append(node)

            self._hidden_layers = []
            for n in xrange(self.num_hidden_layers):
                layer = HiddenLayer("Layer" + str(n + 1), self.num_hidden_nodes_per_layer)
                self._hidden_layers.append(layer)

        def getWeights(self):
            weights = []
            for node in self._input_nodes:
                weights.append(node.weight)

            for layer in self._hidden_layers:
                for node in layer.hidden_nodes:
                    weights.append(node.weight)
            return weights

        def connectNodes(self):
            for i, layer in enumerate(self._hidden_layers):
                for hidden_node in layer.hidden_nodes:
                    if i == 0:
                        for input_node in self._input_nodes:
                            hidden_node.connections.append(input_node)
                    else:
                        for previous_node in self._hidden_layers[i - 1].hidden_nodes:
                            hidden_node.connections.append(previous_node)

        def updateNodes(self):
            for layer in self._hidden_layers:
                for node in layer.hidden_nodes:
                    node.updateValue()

And here's the updateValue() method of the nodes:

    def updateValue(self):
        value = 0
        for node in self.connections:
            value += node.value
        self.sigmoid(value)  # the function at the beginning of the question

The nodes just have a value, a name, and a weight (random at start).

Answer 1:

You are mashing together several different NN concepts.

The logistic function (the most common sigmoid) already serves as a threshold. Specifically, it is a differentiable threshold, which is essential for the backpropagation learning algorithm. So you don't need the piecewise threshold function (the if statement).
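
In code, dropping the threshold means the sigmoid output itself becomes the node's value. A minimal sketch (using math.exp rather than the hard-coded 2.7183):

    import math

    def sigmoid(x):
        # The logistic function already squashes any real input into (0, 1),
        # so its output can be used directly as the node's activation;
        # no if/else comparison against the weight is needed.
        return 1.0 / (1.0 + math.exp(-x))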

The weights are analogues of synaptic strength and are applied during summation (i.e., during feedforward propagation). So each connection between a pair of nodes has a weight that is multiplied by the sending node's activation level (the output of the threshold function).
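
Applied to your updateValue(), that looks roughly like the sketch below. Note that connection_weights is a hypothetical per-connection list; your posted code stores only a single weight per node:

    def updateValue(self):
        # Weight each incoming activation by its connection's strength,
        # then squash the weighted sum with the logistic function.
        # connection_weights is hypothetical: one weight per connection.
        total = sum(w * node.value
                    for w, node in zip(self.connection_weights, self.connections))
        self.value = sigmoid(total)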

Finally, even with these changes, a fully-connected neural network with all positive weights will probably still produce all 1's for the output. You can either include negative weights corresponding to inhibitory nodes, or reduce connectivity significantly (e.g. with a 0.1 probability that a node in layer n connects to a node in layer n+1).
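
As a sketch, both suggestions could be folded into your connectNodes(); this assumes the hypothetical connection_weights list from above:

    import random

    def connect(hidden_node, previous_nodes, p=0.1):
        # Create each candidate connection only with probability p (sparse
        # connectivity), and draw its weight from [-1, 1] so that some
        # connections are inhibitory (negative).
        for previous_node in previous_nodes:
            if random.random() < p:
                hidden_node.connections.append(previous_node)
                hidden_node.connection_weights.append(random.uniform(-1.0, 1.0))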


