Time Series Prediction via Neural Networks

Submitted by 谁说胖子不能爱 on 2019-11-28 17:05:52

I think that you've got the basic idea: a "sliding window" approach where a network is trained to use the last k values of a series (Tn-k ... Tn-1) to predict the current value (Tn).

There are a lot of ways you can do this, however. For example:

  • How big should that window be?
  • Should the data be preprocessed in any way (e.g. to remove outliers)?
  • What network configuration (e.g. # of hidden nodes, # of layers) and algorithm should be used?

Often people end up figuring out the best way to learn from their particular data by trial and error.
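As a concrete starting point, here is a minimal sketch (Python with NumPy; the function name and parameters are my own, not from the original answer) of turning a raw series into sliding-window training pairs:

```python
import numpy as np

def make_windows(series, k):
    """Turn a 1-D series into (X, y) pairs: the last k values predict the next one."""
    X = np.array([series[i:i + k] for i in range(len(series) - k)])
    y = np.array(series[k:])
    return X, y

series = [1, 2, 3, 4, 5, 6]
X, y = make_windows(series, k=3)
# X[0] is [1, 2, 3] and its target y[0] is 4, and so on down the series.
```

Choosing `k` here is exactly the "how big should the window be?" question above; it is usually settled empirically.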

There are a fair number of publicly accessible papers about this. Start with those, then follow their citations and the papers that cite them via Google Scholar, and you should have plenty to read.

There is a kind of neural network called a recurrent neural network (RNN). One advantage of these models is that you do not have to define a sliding window for the input examples. A variant of RNNs known as Long Short-Term Memory (LSTM) can potentially take into account many instances from previous time steps, and a "forget gate" is used to allow or suppress remembering results from those earlier time steps.
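To make the forget gate concrete, here is a minimal NumPy sketch of a single LSTM step (the weight layout and names are illustrative assumptions, not any particular library's API):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, c_prev, W, b):
    """One LSTM step. W maps [h_prev, x_t] to the four gate pre-activations."""
    z = W @ np.concatenate([h_prev, x_t]) + b
    n = len(h_prev)
    f = sigmoid(z[0*n:1*n])   # forget gate: how much of the old memory c_prev to keep
    i = sigmoid(z[1*n:2*n])   # input gate: how much of the new candidate to add
    o = sigmoid(z[2*n:3*n])   # output gate
    g = np.tanh(z[3*n:4*n])   # candidate cell state
    c = f * c_prev + i * g    # the forget gate scales the memory carried forward
    h = o * np.tanh(c)
    return h, c

# Toy dimensions: 1-D input, 2-D hidden state; weights are random, for illustration only.
rng = np.random.default_rng(0)
n_h, n_x = 2, 1
W = rng.normal(size=(4 * n_h, n_h + n_x))
b = np.zeros(4 * n_h)
h, c = np.zeros(n_h), np.zeros(n_h)
for x_t in [0.5, -0.1, 0.3]:  # feed the series one value per time step, no window needed
    h, c = lstm_step(np.array([x_t]), h, c, W, b)
```

The loop at the bottom is the point: the series is consumed one value at a time, and the cell state `c` carries whatever history the gates decide to keep.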

Technically this is the same as your digit recognition: the network recognizes a pattern and returns what it was...

Well, now your inputs are the previous steps (T-5 ... T-1), and your output or outputs are the predicted steps (T0, T1, ...).
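That input/output framing, including predicting several steps at once, can be sketched as a small helper (Python/NumPy; the function name and parameters are illustrative):

```python
import numpy as np

def make_multistep(series, n_in, n_out):
    """Each row of X holds n_in past values; each row of Y holds the n_out values that follow."""
    X, Y = [], []
    for i in range(len(series) - n_in - n_out + 1):
        X.append(series[i:i + n_in])
        Y.append(series[i + n_in:i + n_in + n_out])
    return np.array(X), np.array(Y)

series = list(range(10))
X, Y = make_multistep(series, n_in=5, n_out=2)
# X[0] = [0, 1, 2, 3, 4] (T-5 ... T-1) and Y[0] = [5, 6] (T0, T1)
```

A network with `n_out` output nodes can then be trained on these pairs directly.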

The mechanics inside the ANN itself are the same: you train each layer to detect features and correct its reconstruction, so that the network's output matches what actually happens next.
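As an illustration of those mechanics, here is a hedged sketch (NumPy, plain full-batch gradient descent, all names my own) of training a one-hidden-layer network on sliding windows of a sine wave:

```python
import numpy as np

# Sliding windows over a sine wave: 5 past values predict the next one.
rng = np.random.default_rng(1)
t = np.arange(200)
series = np.sin(0.1 * t)
k = 5
X = np.array([series[i:i + k] for i in range(len(series) - k)])
y = series[k:].reshape(-1, 1)

# One hidden layer with tanh activation, trained by gradient descent on mean squared error.
W1 = rng.normal(scale=0.5, size=(k, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1)); b2 = np.zeros(1)
lr = 0.1
for epoch in range(2000):
    H = np.tanh(X @ W1 + b1)          # forward pass
    pred = H @ W2 + b2
    err = pred - y
    # backward pass: chain rule through both layers
    dW2 = H.T @ err / len(X); db2 = err.mean(axis=0)
    dH = err @ W2.T * (1 - H ** 2)    # derivative of tanh is 1 - tanh^2
    dW1 = X.T @ dH / len(X); db1 = dH.mean(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

The forward pass, error, and weight updates are identical to what a digit classifier would use; only the meaning of the inputs and targets has changed.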

(Some more info about what I mean: tech talk.)
