I have trained an XOR neural network in MATLAB and got these weights:
iw: [-2.162 2.1706; 2.1565 -2.1688]
lw: [-3.9174 -3.9183]
b{1}: [2.001; 2.0033]
b{2}: [3.
You usually don't use a sigmoid on your output layer; are you sure you should have the tansig on out3? And are you sure you are looking at the weights of the appropriately trained network? It looks like you've got a network trained to do XOR on [1,1], [1,-1], [-1,1], and [-1,-1], with +1 meaning "xor" and -1 meaning "same".
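A quick way to check that interpretation is to run the forward pass by hand with the posted weights. A minimal NumPy sketch (note: the question's output bias b{2} is truncated, so the 3.9 below is an assumed value, not from the post; tansig is just tanh):

```python
import numpy as np

# Weights and biases from the question (MATLAB net.IW, net.LW, net.b)
iw = np.array([[-2.162, 2.1706],
               [2.1565, -2.1688]])   # input -> hidden weights
lw = np.array([-3.9174, -3.9183])    # hidden -> output weights
b1 = np.array([2.001, 2.0033])       # hidden-layer biases
b2 = 3.9  # ASSUMED: the output bias is cut off in the question ("[3.")

def net(x):
    """Forward pass with tansig (tanh) on both layers."""
    hidden = np.tanh(iw @ x + b1)
    return np.tanh(lw @ hidden + b2)

for x in ([1, 1], [1, -1], [-1, 1], [-1, -1]):
    print(x, round(float(net(np.array(x))), 3))
```

With that assumed bias, [1,-1] and [-1,1] come out near +1 ("xor") and [1,1], [-1,-1] near -1 ("same"), which is consistent with the reading above.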