Error when checking target: expected time_distributed_5 to have 3 dimensions, but got array with shape (14724, 1)


Okay guys, I think I found a fix. According to https://keras.io/layers/wrappers/, TimeDistributed applies the Dense layer to each timestep (in my case I have 48 timesteps). So the output of my final layer below would be (batch_size, timesteps, dimensions):

output = TimeDistributed(Dense(1, activation='linear'))(output)

That is, (?, 48, 1), hence the dimension mismatch with my targets of shape (14724, 1). However, since I want a single regression output, I have to flatten the final TimeDistributed layer.

So I added the following lines to fix it:

output = Flatten()(output)
output = Dense(1, activation='linear')(output)

So now the flattened TimeDistributed layer feeds 49 inputs (48 timestep outputs plus what looks like a bias input) into the final Dense layer, which produces a single output.
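For reference, here is a minimal end-to-end sketch of this fix; the LSTM size, feature count, and optimizer are assumptions, since the original model code isn't shown:

from tensorflow.keras.layers import Input, LSTM, TimeDistributed, Dense, Flatten
from tensorflow.keras.models import Model

timesteps, n_features = 48, 10   # 48 timesteps as in the question; 10 features is hypothetical

inputs = Input(shape=(timesteps, n_features))
x = LSTM(64, return_sequences=True)(inputs)              # (batch, 48, 64)
x = TimeDistributed(Dense(1, activation='linear'))(x)    # (batch, 48, 1)
x = Flatten()(x)                                          # (batch, 48)
output = Dense(1, activation='linear')(x)                 # (batch, 1) -- matches targets shaped (14724, 1)

model = Model(inputs, output)
model.compile(optimizer='adam', loss='mse')
model.summary()   # the final Dense layer reports 49 params: 48 weights + 1 bias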

Okay, the code works fine and I am getting proper results (the model learns). My only doubt is whether it is mathematically okay to flatten the TimeDistributed layer into a simple Dense layer to get my result, as described above?

Can you provide more context on your problem? Test data, or at least more code. Why are you choosing this architecture in the first place? Would a simpler architecture (just the LSTM) do the trick? What are you regressing? Stacking multiple TimeDistributed Dense layers with linear activation functions probably isn't adding much to the model.
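A minimal sketch of that simpler alternative (layer size and feature count are assumptions): let the LSTM return only its last hidden state and regress directly from it.

from tensorflow.keras.layers import Input, LSTM, Dense
from tensorflow.keras.models import Model

inputs = Input(shape=(48, 10))                # 48 timesteps; 10 features is hypothetical
x = LSTM(64)(inputs)                          # return_sequences=False -> (batch, 64)
output = Dense(1, activation='linear')(x)     # (batch, 1); no Flatten or TimeDistributed needed

model = Model(inputs, output)
model.compile(optimizer='adam', loss='mse')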
