recurrent-neural-network

What exactly is a timestep in an LSTM model?

拥有回忆 · Submitted on 2021-02-18 00:57:40
Question: I am a newbie to LSTMs and RNNs as a whole, and I've been racking my brain trying to understand what exactly a timestep is. I would really appreciate an intuitive explanation. Answer 1: Let's start with a great image from Chris Olah's blog (a highly recommended read, btw): In a recurrent neural network you have multiple repetitions of the same cell. Inference works like this: you take some input (x_0) and pass it through the cell to get some output 1 (depicted by the black arrow pointing to the right in the picture) …
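To make the timestep concrete, here is a minimal NumPy sketch (the 24-hour, 3-feature setup is an invented example, not part of the answer): a timestep is one position along the sequence axis, and the same LSTM cell is applied once per timestep.

```python
import numpy as np

# Invented setup: 3 features (say temperature, humidity, pressure)
# recorded once per hour, with the last 24 hours fed to the network.
# Each hourly reading is one timestep, so one sample has shape (24, 3).
timesteps, features = 24, 3
batch_size = 32

# A batch of sequences in the Keras layout: (batch, timesteps, features).
x = np.random.rand(batch_size, timesteps, features)

# The cell is applied once per timestep; x[:, t, :] is the input x_t.
print(x.shape)           # (32, 24, 3)
print(x[:, 0, :].shape)  # x_0 for the whole batch: (32, 3)
```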

Why set return_sequences=True and stateful=True for tf.keras.layers.LSTM?

为君一笑 · Submitted on 2021-02-17 16:38:22
Question: I am learning TensorFlow 2.0 and following the tutorial. In the RNN example, I found this code:

```python
def build_model(vocab_size, embedding_dim, rnn_units, batch_size):
    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(vocab_size, embedding_dim,
                                  batch_input_shape=[batch_size, None]),
        tf.keras.layers.LSTM(rnn_units,
                            return_sequences=True,
                            stateful=True,
                            recurrent_initializer='glorot_uniform'),
        tf.keras.layers.Dense(vocab_size)
    ])
    return model
```

My question is: why does the code set the arguments return_sequences=True and stateful=True?
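To illustrate roughly what return_sequences controls, here is a toy NumPy RNN (simple_rnn is a hypothetical helper written for this sketch, not the Keras implementation; stateful, which carries the final state over into the next batch, is not modelled here):

```python
import numpy as np

def simple_rnn(x, units, return_sequences=False, seed=0):
    """Toy tanh RNN illustrating the return_sequences flag (hypothetical helper)."""
    rng = np.random.default_rng(seed)
    batch, timesteps, features = x.shape
    Wx = rng.standard_normal((features, units))
    Wh = rng.standard_normal((units, units))
    h = np.zeros((batch, units))
    outputs = []
    for t in range(timesteps):
        h = np.tanh(x[:, t, :] @ Wx + h @ Wh)  # one cell application per timestep
        outputs.append(h)
    if return_sequences:
        return np.stack(outputs, axis=1)  # every timestep: (batch, timesteps, units)
    return h                              # last timestep only: (batch, units)

x = np.random.rand(4, 10, 3)
print(simple_rnn(x, 8).shape)                         # (4, 8)
print(simple_rnn(x, 8, return_sequences=True).shape)  # (4, 10, 8)
```

In the tutorial's character model, return_sequences=True lets the final Dense layer produce a prediction at every position of the sequence rather than only at the last one.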

Keras Recurrent Neural Networks For Multivariate Time Series

蹲街弑〆低调 · Submitted on 2021-02-16 09:15:25
Question: I have been reading about Keras RNN models (LSTMs and GRUs), and authors seem to focus largely on language data or univariate time series whose training instances are composed of previous time steps. My data is a bit different: I have 20 variables measured every year for 10 years for 100,000 persons as input data, and the same 20 variables measured in year 11 as output data. What I would like to do is predict the value of one of those variables (not the other 19) for the 11th year. I have my …
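A minimal sketch of the data layout this describes (array sizes scaled down from the question's 100,000 persons, and the variable names are my own):

```python
import numpy as np

# 20 variables per year, 10 years of input per person, one target in year 11.
# Using 1,000 persons here instead of 100,000 just to keep the arrays small.
n_persons, n_years, n_vars = 1_000, 10, 20
X = np.random.rand(n_persons, n_years, n_vars)  # (samples, timesteps, features)
y = np.random.rand(n_persons)                   # year-11 value of the one target variable

# Keras recurrent layers expect exactly this 3-D layout, e.g. a model like
# LSTM(units, input_shape=(10, 20)) followed by Dense(1) fits it directly.
print(X.shape, y.shape)  # (1000, 10, 20) (1000,)
```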

LSTM to forecast numerical data by having categorical data as input

别来无恙 · Submitted on 2021-02-11 12:50:53
Question: I have a DataFrame similar to this:

```python
df = pd.DataFrame([
    {'date': '2021-01-15', 'value': 145, 'label': 'negative'},
    {'date': '2021-01-16', 'value': 144, 'label': 'positive'},
    {'date': '2021-01-17', 'value': 147, 'label': 'positive'},
    {'date': '2021-01-18', 'value': 146, 'label': 'negative'},
    {'date': '2021-01-19', 'value': 155, 'label': 'negative'},
    {'date': '2021-01-20', 'value': 157, 'label': 'positive'},
    {'date': '2021-01-21', 'value': 158, 'label': 'positive'},
    {'date': '2021-01-22', 'value': 157, 'label': 'negative'},
    # … (truncated)
```
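One common way to feed the categorical column to an LSTM is to one-hot encode it first; a small pandas sketch (the indicator column names are standard get_dummies output):

```python
import pandas as pd

df = pd.DataFrame([
    {'date': '2021-01-15', 'value': 145, 'label': 'negative'},
    {'date': '2021-01-16', 'value': 144, 'label': 'positive'},
    {'date': '2021-01-17', 'value': 147, 'label': 'positive'},
])

# One-hot encode the categorical column so the network receives numeric input;
# 'value' plus the two indicator columns then form the per-timestep features.
encoded = pd.get_dummies(df, columns=['label'])
print(list(encoded.columns))  # ['date', 'value', 'label_negative', 'label_positive']
```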

Setting initial state in dynamic RNN

萝らか妹 · Submitted on 2021-02-11 01:51:14
Question: Based on this link: https://www.tensorflow.org/api_docs/python/tf/nn/dynamic_rnn — in the examples there, the "initial state" is defined in the first example but not in the second. Could anyone please explain the purpose of the initial state? What is the difference between setting it and not setting it? Is it only required for a single RNN cell and not for a stacked cell like the one in the example provided at the link? I'm currently debugging my RNN model, as it seemed to classify …
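As a toy illustration of what the initial state does (plain NumPy, not the tf.nn.dynamic_rnn API): the initial state is simply the hidden state used at t = 0. If you don't provide one, it defaults to zeros; providing a different one (e.g. a state carried over from a previous batch) changes every subsequent output.

```python
import numpy as np

rng = np.random.default_rng(42)
units, features, timesteps = 4, 3, 5
Wx = rng.standard_normal((features, units))
Wh = rng.standard_normal((units, units))
x = rng.standard_normal((timesteps, features))

def run(h0):
    """Unroll a toy tanh RNN over the sequence, starting from state h0."""
    h = h0
    for t in range(timesteps):
        h = np.tanh(x[t] @ Wx + h @ Wh)
    return h

default = run(np.zeros(units))       # what you get when the initial state is unset
carried = run(0.5 * np.ones(units))  # e.g. a state carried in from a previous batch
print(np.allclose(default, carried)) # the two final states differ
```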

Result changes every time I run Neural Network code

烈酒焚心 · Submitted on 2021-02-10 22:47:28
Question: I got these results by running the code provided in this link: Neural Network – Predicting Values of Multiple Variables. I was able to compute losses, accuracy, etc. However, every time I run this code I get a new result. Is it possible to get the same (consistent) result? Answer 1: The code is full of random.randint() calls everywhere! Furthermore, the weights are usually randomly initialized as well, and the batch_size also has an influence (although a pretty minor one) on the result. Y_train, X_test, X_train …
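The standard fix is to seed every source of randomness before building and training the model; a minimal sketch (for TensorFlow/Keras you would additionally call tf.random.set_seed, not shown here):

```python
import random
import numpy as np

random.seed(0)     # Python's own RNG (covers the random.randint() calls)
np.random.seed(0)  # NumPy's global RNG (covers randomness done via NumPy)

a = np.random.rand(3)

np.random.seed(0)  # re-seeding reproduces the exact same draws
b = np.random.rand(3)
print(np.array_equal(a, b))  # True: same seed, same "random" numbers
```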
