neural-network

LSTM Autoencoder problems

Submitted by 守給你的承諾、 on 2021-02-06 16:14:30
Question

TL;DR: the autoencoder underfits the time-series reconstruction and just predicts the average value.

Question set-up: here is a summary of my attempt at a sequence-to-sequence autoencoder. The diagram comes from this paper: https://arxiv.org/pdf/1607.00148.pdf

Encoder: standard LSTM layer. The input sequence is encoded in the final hidden state.
Decoder: LSTM cell (I think!). Reconstruct the sequence one element at a time, starting with the last element x[N]. The decoder algorithm is as follows for a sequence …
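As a reference point, here is a minimal sketch of the architecture described above, assuming PyTorch (the mention of an "LSTM Cell" suggests torch.nn.LSTMCell). The class name, the zero input at the first decoder step, and the choice of feeding the decoder its own previous prediction are illustrative assumptions, not taken from the original post:

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    """Seq2seq autoencoder in the style of Malhotra et al.
    (https://arxiv.org/pdf/1607.00148.pdf): the encoder summarizes the
    whole sequence in its final hidden state, and the decoder rebuilds
    the sequence element by element, last element first."""

    def __init__(self, n_features, hidden_size):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden_size, batch_first=True)
        self.decoder_cell = nn.LSTMCell(n_features, hidden_size)
        self.output_layer = nn.Linear(hidden_size, n_features)

    def forward(self, x):
        # x: (batch, seq_len, n_features)
        seq_len = x.size(1)
        # Encoder: keep only the final hidden/cell state.
        _, (h, c) = self.encoder(x)
        h, c = h.squeeze(0), c.squeeze(0)
        # Decoder: emit x'[N] first, then feed each prediction back in.
        inp = torch.zeros_like(x[:, 0, :])  # assumed zero input at step one
        outputs = []
        for _ in range(seq_len):
            h, c = self.decoder_cell(inp, (h, c))
            inp = self.output_layer(h)
            outputs.append(inp)
        # The decoder produces the sequence in reverse order; flip it back.
        return torch.stack(outputs[::-1], dim=1)
```

One common design choice here is teacher forcing: during training, feed the decoder the true previous element instead of its own prediction, which tends to make this kind of decoder easier to train.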

Unable to solve the XOR problem with just two hidden neurons in Python

Submitted by 可紊 on 2021-02-06 11:29:32
Question

I have a small, three-layer neural network with two input neurons, two hidden neurons, and one output neuron. I am trying to stick to the format below, using only two hidden neurons. I want to show how this can behave as the XOR logic gate, but with just two hidden neurons I get the following poor output after 1,000,000 iterations:

Input: 0 0   Output: [0.01039096]
Input: 1 0   Output: [0.93708829]
Input: 0 1   Output: [0.93599738]
Input: 1 1   Output: [0.51917667]

If I use three …
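For what it's worth, a 2-2-1 sigmoid network with bias terms on both layers is sufficient to represent XOR exactly, but plain gradient descent on this minimal topology is known to get stuck in local minima that map (1, 1) to roughly 0.5, which matches the output above. Below is a minimal NumPy sketch; the learning rate, iteration count, and random initialization are illustrative assumptions, not taken from the original post:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR training data: 4 examples, 2 inputs, 1 target each.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# 2-2-1 topology with bias terms on both layers.
W1 = rng.normal(size=(2, 2))
b1 = np.zeros(2)
W2 = rng.normal(size=(2, 1))
b2 = np.zeros(1)
lr = 0.5

for _ in range(20_000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)        # hidden activations, shape (4, 2)
    out = sigmoid(h @ W2 + b2)      # predictions, shape (4, 1)
    # Backward pass for a mean-squared-error loss.
    d_out = (out - y) * out * (1 - out)   # delta at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # delta at the hidden layer
    # Gradient-descent updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(3))  # with a favorable seed: close to [[0], [1], [1], [0]]
```

Re-running with a few different random seeds typically lands on the exact XOR mapping; success with this minimal topology is sensitive to initialization, which is why larger hidden layers train more reliably.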
