I want to predict values that follow a weekly pattern (low SNR). I need to predict a whole year of the time series, one value per week of the year (52 values; see the figure).
I have data from 10 years. If each training example is the values from 4 weeks used to predict the 5th, and I keep shifting that window, I can get almost 52 × 9 examples to train the model and 52 to predict (the last year).
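The sliding-window construction described above can be sketched in a few lines of NumPy (the random series here is just a stand-in for the real weekly data):

```python
import numpy as np

# Stand-in for 10 years of weekly values, flattened to 520 points.
rng = np.random.default_rng(0)
series = rng.normal(size=10 * 52)

window = 4  # use 4 weeks of values to predict the 5th
X, y = [], []
for i in range(len(series) - window):
    X.append(series[i:i + window])   # inputs: 4 consecutive weeks
    y.append(series[i + window])     # target: the following week
X, y = np.array(X), np.array(y)
print(X.shape, y.shape)  # (516, 4) (516,)
```

Note that consecutive windows share 3 of their 4 input values, which is the overlap issue raised in the answer below.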
This actually means you have only 9 training examples with 52 features each (unless you want to train on highly overlapping windows). Either way, I don't think this is nearly enough data to merit training an LSTM.
I would suggest trying a much simpler model. Your input and output data is of fixed size, so you could try sklearn.linear_model.LinearRegression which handles multiple input features (in your case 52) per training example, and multiple targets (also 52).
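A minimal sketch of that setup, using random arrays in place of your real data (the year-to-year framing is one possible arrangement; shapes are what matter here):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# 9 training examples: each year's 52 weekly values predicting
# the following year's 52 weekly values.
X_train = rng.normal(size=(9, 52))   # years 1-9
y_train = rng.normal(size=(9, 52))   # years 2-10

model = LinearRegression()
model.fit(X_train, y_train)          # 52 targets handled natively

# Predict all 52 weeks of the held-out year at once.
pred = model.predict(rng.normal(size=(1, 52)))
print(pred.shape)  # (1, 52)
```

Unlike an iterative one-step forecaster, this predicts the whole year in a single shot, so prediction errors cannot feed back into the inputs.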
Update: If you must use an LSTM, then take a look at LSTM Neural Network for Time Series Prediction, a Keras LSTM implementation that supports multiple future predictions, either all at once or iteratively by feeding each prediction back in as input. Based on your comments, this should be exactly what you want.
The architecture of the network in this implementation is:
    from keras.models import Sequential
    from keras.layers import LSTM, Dropout, Dense, Activation

    model = Sequential()
    # First LSTM layer returns the full sequence to feed the next LSTM.
    model.add(LSTM(layers[1], input_shape=(layers[1], layers[0]),
                   return_sequences=True))
    model.add(Dropout(0.2))
    # Second LSTM layer returns only its final output.
    model.add(LSTM(layers[2], return_sequences=False))
    model.add(Dropout(0.2))
    # Linear output layer for regression.
    model.add(Dense(layers[3]))
    model.add(Activation("linear"))
However, I would still recommend running a linear regression, or perhaps a simple feed-forward network with one hidden layer, and comparing its accuracy against the LSTM's. Especially if you predict one output at a time and feed each prediction back in as input, your errors can easily accumulate, giving you very bad predictions further along the horizon.