Simple Linear Neural Network Weights from Training are not compatible with training results

Submitted by 一曲冷凌霜 on 2019-12-25 03:26:31

Question


The weights that I get from training, when applied directly to the input, return different results! I'll show this with a very simple example. Say we have an input vector x = 0:0.01:1 and a target vector t = x.^2 (I know it would be better to use a nonlinear network). After training a 2-layer linear network with one neuron in each layer, we get:

sim(net,0.95) = 0.7850 (some training error is expected, that's fine). The weights from net.IW, net.LW and net.b are:

IW =

0.4547

LW =

2.1993

b =

0.3328   -1.0620

If I use the weights directly: Out = purelin(purelin(0.95*IW+b(1))*LW+b(2)) = 0.6200, which differs from the result of sim! How can that be? What's wrong?
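For reference, the mismatch can be reproduced from the weights quoted above (a quick re-check in Python rather than MATLAB; purelin is just the identity function):

```python
# Weights quoted in the question (rounded to 4 decimals)
IW, LW = 0.4547, 2.1993
b = (0.3328, -1.0620)

def purelin(n):
    """MATLAB's purelin transfer function: the identity, a = n."""
    return n

# Forward pass without any input/output scaling, as in the question
out = purelin(purelin(0.95 * IW + b[0]) * LW + b[1])
# out is ~0.62, matching the question's manual result but far from sim()'s 0.7850
```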

The code:

%Main_TestWeights
close all
clear all
clc


t1 = 0:0.01:1;      % network input
x  = t1.^2;         % target: square of the input

hiddenSizes = 1;
net = feedforwardnet(hiddenSizes);


[Xs,Xi,Ai,Ts,EWs,shift] = preparets(net,con2seq(t1),con2seq(x));
net.layers{1,1}.transferFcn = 'purelin';   % make the hidden layer linear
[net,tr,Y,E,Pf,Af] = train(net,Xs,Ts,Xi,Ai);
view(net);


IW = cat(2,net.IW{1});
LW = cat(2,net.LW{2,1});
b  = cat(2,[net.b{1,1},net.b{2,1}]);

%Result from Sim
t2 = 0.95;
Yk = sim(net,t2)

%Result from Weights
x1 = IW*t2' + b(1)
x1out = purelin(x1)
x2 = purelin(x1out*LW + b(2))

Answer 1:


The Neural Network Toolbox rescales inputs and targets to the [-1,1] range by default (via mapminmax). You must therefore rescale the input and unscale the output so that your manual computation matches sim()'s output:
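mapminmax applies y = (ymax - ymin)*(x - xmin)/(xmax - xmin) + ymin with defaults ymin = -1, ymax = 1; for data spanning [0,1], as here, this reduces to y = 2x - 1. A small Python sketch of that formula (the function names are mine, not the toolbox's):

```python
def mapminmax_apply(x, xmin, xmax, ymin=-1.0, ymax=1.0):
    """Linear rescale of x from [xmin, xmax] to [ymin, ymax] (mapminmax's formula)."""
    return (ymax - ymin) * (x - xmin) / (xmax - xmin) + ymin

def mapminmax_reverse(y, xmin, xmax, ymin=-1.0, ymax=1.0):
    """Inverse mapping from [ymin, ymax] back to [xmin, xmax]."""
    return (y - ymin) * (xmax - xmin) / (ymax - ymin) + xmin

# For the question's data range [0, 1], this is just y = 2x - 1:
scaled = mapminmax_apply(0.95, 0.0, 1.0)   # ~0.9
```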

 %Result from Weights
 x1 = 2*t2 - 1;                  % rescale input from [0,1] to [-1,1]
 x1 = IW*x1 + b(1);
 x1out = purelin(x1);
 x2 = purelin(x1out*LW + b(2));
 x2 = (x2+1)/2                   % unscale output from [-1,1] back to [0,1]

then

>> x2 == Yk

ans =

     1
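Plugging the question's quoted weights into this rescaled forward pass does reproduce sim()'s value (a quick Python re-check of the MATLAB arithmetic; purelin is the identity):

```python
# Weights quoted in the question (rounded to 4 decimals)
IW, LW = 0.4547, 2.1993
b = (0.3328, -1.0620)
t2 = 0.95

x1 = 2 * t2 - 1           # rescale input from [0,1] to [-1,1]
x1out = x1 * IW + b[0]    # layer 1, purelin (identity) transfer
x2 = x1out * LW + b[1]    # layer 2, purelin (identity) transfer
out = (x2 + 1) / 2        # unscale output from [-1,1] back to [0,1]
# out is ~0.7850, matching sim(net, 0.95) from the question
```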


Source: https://stackoverflow.com/questions/11807184/simple-linear-neural-network-weights-from-training-are-not-compatible-with-train
