When does keras reset an LSTM state?


Question


I read all sorts of texts about it, and none seem to answer this very basic question. It's always ambiguous:

In a stateful = False LSTM layer, does keras reset states after:

  • Each sequence; or
  • Each batch?

Suppose I have X_train shaped as (1000,20,1), meaning 1000 sequences of 20 steps of a single value. If I make:

model.fit(X_train, y_train, batch_size=200, epochs=15)

Will it reset states for every single sequence (resets states 1000 times)?
Or will it reset states for every batch (resets states 5 times)?


Answer 1:


Checking with some tests, I reached the following conclusion, which agrees with the documentation and with Nassim's answer:

First, there isn't a single state in a layer, but one state per sample in the batch. There are batch_size parallel states in such a layer.
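
Not part of the original answer, but a quick way to check this claim yourself (a sketch assuming the tf.keras API; the layer size of 32 is illustrative). A stateful layer keeps its states around between batches, so you can inspect them: both the hidden state h and the cell memory c have one row per sample in the batch.

import numpy as np
from tensorflow.keras.layers import LSTM

# Build a stateful LSTM for batches of 10 sequences of 7 steps of 1 value.
lstm = LSTM(32, stateful=True, batch_input_shape=(10, 7, 1))
lstm(np.random.rand(10, 7, 1).astype("float32"))  # run one batch through it

# One state per sample: h and c each have shape (10, 32).
h, c = lstm.states
print(h.shape, c.shape)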

Stateful=False

In the stateful=False case, all the states are reset together after each batch.

  • A batch with 10 sequences creates 10 states, and all 10 states are reset automatically after it is processed.

  • The next batch with 10 sequences creates 10 new states, which will also be reset after that batch is processed.

If all those sequences have length (timesteps) = 7, the practical result of these two batches is:

20 individual sequences, each with length 7

None of the sequences are related to each other. But of course, the weights (not the states) are shared by the whole layer, and represent what the layer has learned from all the sequences.

  • A state is: Where am I now inside a sequence? Which time step is it? How has this particular sequence been behaving from its beginning up to now?
  • A weight is: What do I know about the general behavior of all sequences I've seen so far?
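
Tying this back to the question's numbers, here is a minimal stateful=False sketch (assuming the tf.keras API; the layer size and the random data are illustrative):

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

X_train = np.random.rand(1000, 20, 1)  # 1000 sequences, 20 steps, 1 value
y_train = np.random.rand(1000, 1)

model = Sequential([
    LSTM(32, input_shape=(20, 1)),  # stateful=False is the default
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Each batch of 200 samples gets 200 fresh zero states, which are
# discarded when the batch ends: 5 automatic resets per epoch.
model.fit(X_train, y_train, batch_size=200, epochs=15)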

Stateful=True

In this case, there is also the same number of parallel states, but they are simply not reset at all.

  • A batch with 10 sequences will create 10 states that will remain as they are at the end of the batch.

  • The next batch with 10 sequences (it's required to be 10, since the first was 10) will reuse the same 10 states that were created before.

The practical result is: the 10 sequences in the second batch are just continuing the 10 sequences of the first batch, as if there had been no interruption at all.

If each sequence has length (timesteps) = 7, then the actual meaning is:

10 individual sequences, each with length 14

When you see that you have reached the total length of your sequences, call model.reset_states(). This means you will not continue the previous sequences anymore; now you will start feeding new sequences.
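
A minimal sketch of this continuation behavior (assuming the tf.keras API; the layer size, the random data, and the split into two halves are illustrative):

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

model = Sequential([
    LSTM(32, stateful=True, batch_input_shape=(10, 7, 1)),
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")

first_halves = np.random.rand(10, 7, 1)   # steps 0..6 of 10 sequences
second_halves = np.random.rand(10, 7, 1)  # steps 7..13 of the same sequences
y = np.random.rand(10, 1)

# Sample i of the second batch continues sample i of the first batch:
model.train_on_batch(first_halves, y)
model.train_on_batch(second_halves, y)  # reuses the 10 surviving states

# The length-14 sequences are finished; start fresh ones from zero states:
model.reset_states()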




Answer 2:


In the documentation of the RNN code you can read this:

Note on using statefulness in RNNs:

You can set RNN layers to be 'stateful', which means that the states computed for the samples in one batch will be reused as initial states for the samples in the next batch. This assumes a one-to-one mapping between samples in different successive batches.

I know this doesn't directly answer your question, but to me it confirms what I was thinking: when an LSTM is not stateful, the state is reset after every sample. Samples don't carry state across a batch; the idea of a batch is that every sample is independent of the others.

So you have 1,000 resets of the state in your example.




Answer 3:


In Keras there are two modes for maintaining states: 1) The default mode (stateful=False), where the state is reset after each batch. AFAIK the state is still maintained per sample within a batch. So for your example, the state would be reset 5 times in each epoch.

2) The stateful mode, where the state is never reset automatically. It is up to the user to reset the state before a new epoch; Keras itself won't reset it. In this mode the state is propagated from sample "i" of one batch to sample "i" of the next batch. Generally it is recommended to reset the state after each epoch, as otherwise the state carries on for too long and can become unstable. However, in my experience with small datasets (20,000 to 40,000 samples), resetting or not resetting the state after an epoch does not make much of a difference to the end result. For bigger datasets it may make a difference.

A stateful model is useful if you have patterns that span hundreds of time steps; otherwise the default mode is sufficient. In my experience, setting the batch size roughly equal to the size (in time steps) of the patterns in the data also helps.

The stateful setup can be quite difficult to grasp at first. One would expect the state to be transferred from the last sample of one batch to the first sample of the next batch. But the state is actually propagated across batches between same-numbered samples. The authors had two choices and chose the latter. Read about this here. Also look at the relevant Keras FAQ section on stateful RNNs.
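
For the per-epoch reset recommended above, a small callback works (a sketch; the callback class name is my own, not a built-in Keras one):

from tensorflow.keras.callbacks import Callback

class ResetStatesAtEpochEnd(Callback):
    # Clear the stateful layers' states whenever an epoch finishes.
    def on_epoch_end(self, epoch, logs=None):
        self.model.reset_states()

# Usage: shuffle=False preserves the one-to-one mapping between
# sample i of successive batches that stateful mode relies on.
# model.fit(X, y, batch_size=10, epochs=15, shuffle=False,
#           callbacks=[ResetStatesAtEpochEnd()])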




Answer 4:


Expanding on @Nassim_Ben's answer, it is true that each sequence is considered independent for each instance of the batch. However, keep in mind that the RNN's hidden state and cell memory get passed along to the next cell for 20 steps. The hidden state and cell memory are typically set to zero for the very first of the 20 cells.

After the 20th cell, and after the hidden state (only, not the cell memory) gets passed on to the layers above the RNN, the state gets reset. I'm going to assume that by "state" they mean both the cell memory and the hidden state here.

So yes, it does get reset for all 1,000 instances. However, considering that your batch_size=200, it gets reset 5 times, with each batch's 200 states being reset after they are done passing information through those 20 steps. Hopefully that helps you get your head around this.
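
To see the two pieces of state this answer distinguishes, Keras can return them explicitly (a sketch assuming the tf.keras API; the layer size is illustrative):

import numpy as np
from tensorflow.keras.layers import LSTM

x = np.random.rand(200, 20, 1).astype("float32")  # one batch of 200 samples

# return_state=True makes the layer return its output plus the final
# hidden state h and cell memory c after the 20 steps.
lstm = LSTM(32, return_state=True)
output, state_h, state_c = lstm(x)

# h is what gets passed on to the layers above; c stays inside the cell.
print(output.shape, state_h.shape, state_c.shape)  # each (200, 32)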

Here's a project I did where I had the same question. Pay special attention to cell 15 and its explanation in the blurb after cell 11. I kept appending letters because the state was getting reset otherwise.




Answer 5:


Everyone seems to be making it too confusing. Keras LSTM resets state after every batch.

Here is a good blog: https://machinelearningmastery.com/understanding-stateful-lstm-recurrent-neural-networks-python-keras/

Read the "LSTM State Within A Batch" and "Stateful LSTM for a One-Char to One-Char Mapping" sections of that blog. They show why the state must be reset after each batch only.



Source: https://stackoverflow.com/questions/43882796/when-does-keras-reset-an-lstm-state
