Neural network in MATLAB

心在旅途 2020-12-03 12:13

I have trained an XOR neural network in MATLAB and got these weights:

iw: [-2.162 2.1706; 2.1565 -2.1688]

lw: [-3.9174 -3.9183]

b{1}: [2.001; 2.0033]

b{2}: [3.
2 Answers
  • 2020-12-03 12:59

    You usually don't use a sigmoid on your output layer. Are you sure you should have the tansig on out3? And are you sure you are looking at the weights of the appropriately trained network? It looks like you've got a network trained to do XOR on [1,1], [1,-1], [-1,1], and [-1,-1], with +1 meaning "xor" and -1 meaning "same"; the sketch below bears this out.
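
    As a sanity check, here is a minimal sketch that runs the posted weights forward on those bipolar inputs. Note that b{2} is assumed to be 3 here, since the posted value is cut off:

    iw = [-2.162 2.1706; 2.1565 -2.1688];
    lw = [-3.9174 -3.9183];
    b1 = [2.001; 2.0033];
    b2 = 3;                                   %# assumed; the posted value is truncated
    X  = [1 1 -1 -1; 1 -1 1 -1];              %# each column is a bipolar input pair
    out = tansig( lw * tansig(iw*X + repmat(b1,1,4)) + b2 )
    %# out is approximately [-1 1 1 -1]: +1 for "xor", -1 for "same"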

  • 2020-12-03 13:14

    I wrote a simple example of an XOR network. I used newpr, which defaults to the tansig transfer function for both the hidden and output layers.

    input = [0 0 1 1; 0 1 0 1];               %# each column is an input vector
    outputActual = [0 1 1 0];                 %# XOR target for each column

    net = newpr(input, outputActual, 2);      %# 1 hidden layer with 2 neurons
    net.divideFcn = '';                       %# use the entire input for training

    net = init(net);                          %# initialize net
    net = train(net, input, outputActual);    %# train
    outputPredicted = sim(net, input);        %# predict
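
    The trained weights that correspond to iw, lw, and b in the question can be read straight off the network object (these are the same fields used in the manual computation below):

    net.IW{1}                                 %# input-to-hidden weights (2x2)
    net.LW{2,1}                               %# hidden-to-output weights (1x2)
    net.b{1}                                  %# hidden layer biases (2x1)
    net.b{2}                                  %# output layer bias (scalar)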
    

    Then we check the result by computing the output ourselves. The important thing to remember is that, by default, inputs and outputs are scaled to the [-1,1] range:

    scaledIn = (2*input - 1);           %# from [0,1] to [-1,1]
    hidden = zeros(2,1);                %# preallocate
    out = zeros(1,size(input,2));
    for i=1:size(input,2)
        in = scaledIn(:,i);             %# i-th input vector
        hidden(1) = tansig( net.IW{1}(1,1)*in(1) + net.IW{1}(1,2)*in(2) + net.b{1}(1) );
        hidden(2) = tansig( net.IW{1}(2,1)*in(1) + net.IW{1}(2,2)*in(2) + net.b{1}(2) );
        out(i) = tansig( hidden(1)*net.LW{2,1}(1) + hidden(2)*net.LW{2,1}(2) + net.b{2} );
    end
    scaledOut = (out+1)/2;              %# from [-1,1] to [0,1]
    

    Or, more efficiently, the same computation expressed as a matrix product in one line:

    scaledIn = (2*input - 1);           %# from [0,1] to [-1,1]
    out = tansig( net.LW{2,1} * tansig( net.IW{1}*scaledIn + repmat(net.b{1},1,size(input,2)) ) + repmat(net.b{2},1,size(input,2)) );
    scaledOut = (1 + out)/2;            %# from [-1,1] to [0,1]
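
    Either way, the manually computed output should match the sim prediction from above up to numerical precision:

    max(abs(scaledOut - outputPredicted))     %# expect something close to 0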
    