LSTM

Keras: Wrong Input Shape in LSTM Neural Network

Submitted by 我是研究僧i on 2021-02-19 08:26:35
Question: I am trying to train an LSTM recurrent neural network for sequence classification. My data has the following format:

Input: [1,5,2,3,6,2, ...] -> Output: 1
Input: [2,10,4,6,12,4, ...] -> Output: 1
Input: [4,1,7,1,9,2, ...] -> Output: 2
Input: [1,3,5,9,10,20, ...] -> Output: 3
...

So basically I want to provide a sequence as input and get an integer as output. Each input sequence has length 2000 (float numbers), and I have around 1485 samples for training. The output is just an …
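The reshape this question is asking about can be sketched as follows (the sample count, sequence length, and label count are taken from the question; the data itself is random placeholder values). Keras recurrent layers expect 3-D input of shape (samples, timesteps, features), so each 2000-float sequence becomes 2000 timesteps of 1 feature:

```python
import numpy as np

# Placeholder data standing in for the question's dataset:
# 1485 sequences of 2000 floats each, one integer label per sequence.
n_samples, seq_len = 1485, 2000
X = np.random.rand(n_samples, seq_len).astype("float32")
y = np.random.randint(0, 3, size=n_samples)

# Keras LSTM layers expect (samples, timesteps, features); each scalar
# in a sequence is one timestep carrying a single feature, so the first
# LSTM layer would then be declared with input_shape=(2000, 1).
X = X.reshape(n_samples, seq_len, 1)

print(X.shape)  # (1485, 2000, 1)
```

The key point is only the last line: the 2-D (samples, sequence) matrix gains a trailing feature axis before it is handed to the network.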

Delayed echo of sin - cannot reproduce Tensorflow result in Keras

Submitted by 早过忘川 on 2021-02-19 04:54:20
Question: I am experimenting with LSTMs in Keras with little to no luck. At some point I decided to scale back to the most basic problems in order to finally achieve some positive result. However, even with the simplest problems I find that Keras is unable to converge, while the implementation of the same problem in TensorFlow gives a stable result. I am unwilling to just switch to TensorFlow without understanding why Keras keeps diverging on any problem I attempt. My problem is a many-to-many sequence …
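The "delayed echo of sin" dataset in the title can be constructed like this (a sketch under the assumption that the task is: the target at step t equals the input at step t − delay, with zeros before that; the delay and length are arbitrary):

```python
import numpy as np

delay, seq_len = 3, 50                       # illustrative values
x = np.sin(np.linspace(0.0, 4.0 * np.pi, seq_len))

# Target: the input shifted right by `delay` steps, zero-padded at the start.
y = np.zeros_like(x)
y[delay:] = x[:-delay]
```

A many-to-many recurrent model would then be trained to map the sequence x to the sequence y, which forces it to remember its input for `delay` timesteps.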

PyTorch LSTM - using word embeddings instead of nn.Embedding()

Submitted by 左心房为你撑大大i on 2021-02-18 18:15:16
Question: Is nn.Embedding() essential for learning with an LSTM? I am using an LSTM in PyTorch to predict NER; an example of a similar task is here: https://pytorch.org/tutorials/beginner/nlp/sequence_models_tutorial.html Code-wise, I am using code almost identical to the code in the tutorial above. The only difference is that I am using word2vec instead of nn.Embedding(). So I remove the nn.Embedding() layer and feed the forward function the features from word2vec directly. The RNN does not learn.
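What "using word2vec directly" amounts to is a plain row lookup into the pre-trained matrix, sketched below with numpy (the table here is random, not real word2vec weights; 300 is word2vec's usual dimensionality). In PyTorch the idiomatic way to get the same behaviour while keeping the module interface is nn.Embedding.from_pretrained(weights, freeze=True):

```python
import numpy as np

# Stand-in for a pre-trained word2vec matrix: a 10-word vocab with
# 300-dim vectors. Values are random placeholders for illustration.
vocab_size, emb_dim = 10, 300
pretrained = np.random.rand(vocab_size, emb_dim).astype("float32")

token_ids = [1, 4, 7, 2]              # an example sentence as word indices
features = pretrained[token_ids]      # row lookup replaces nn.Embedding
features = features[np.newaxis, ...]  # add a batch axis: (1, seq_len, emb_dim)

print(features.shape)  # (1, 4, 300)
```

The resulting (batch, seq_len, emb_dim) tensor is what the LSTM's forward function receives in place of the nn.Embedding output.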

Why is "import cntk as C" not working in Google Colab?

Submitted by 杀马特。学长 韩版系。学妹 on 2021-02-18 12:54:45
Question: I installed OpenCV version 3.4.4, installed CNTK, and importing in Google Colab gives the following:

import cntk as C
/usr/local/lib/python3.6/dist-packages/cntk/cntk_py_init.py:56: UserWarning: Unsupported Linux distribution (ubuntu-18.04). CNTK supports Ubuntu 16.04 and above, only.
  warnings.warn('Unsupported Linux distribution (%s-%s). CNTK supports Ubuntu 16.04 and above, only.' % (__my_distro__, __my_distro_ver__))
/usr/local/lib/python3.6/dist-packages/cntk/cntk_py_init.py:102: …
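Note that the lines shown are UserWarnings, not errors, so by themselves they do not stop the import; any actual failure comes later in the output. If the import otherwise succeeds, the distribution warning can be silenced with the standard warnings machinery, sketched here with a stub standing in for the real cntk import:

```python
import warnings

def import_cntk_stub():
    # Stand-in for `import cntk as C`: it emits the same UserWarning that
    # cntk_py_init.py produces on Ubuntu 18.04, then "succeeds".
    warnings.warn(
        "Unsupported Linux distribution (ubuntu-18.04). "
        "CNTK supports Ubuntu 16.04 and above, only.",
        UserWarning,
    )
    return "cntk"

with warnings.catch_warnings(record=True) as caught:
    # Ignore only the distribution warning; anything else still surfaces.
    warnings.filterwarnings("ignore", message="Unsupported Linux distribution.*")
    module = import_cntk_stub()

print(module, len(caught))  # cntk 0
```

Suppressing the warning is cosmetic; if `C` is genuinely unusable afterwards, the root cause is elsewhere (e.g. an unsupported wheel for that environment).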

How to build a Neural Network with sentence embedding concatenated to pre-trained CNN

Submitted by 醉酒当歌 on 2021-02-18 08:48:40
Question: I want to build a neural network that will take the feature map from the last layer of a CNN (VGG or ResNet, for example), concatenate an additional vector (for example, a 1x768 BERT vector), and re-train the last layer on a classification problem. So the architecture should look like in the figure (omitted here), but I want to concat an additional vector to each feature vector (I have a sentence describing each frame). I have 5 possible labels and 100 input frames. Can someone help me as to how to …
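The concatenation step itself can be sketched in a few lines. The 768 matches the BERT vector in the question; the 512-dim pooled CNN feature is an assumption (VGG and ResNet heads differ), and the data is random placeholder values:

```python
import numpy as np

n_frames = 100                        # frames per input, per the question
cnn_dim, bert_dim = 512, 768          # cnn_dim is an assumed pooled size

cnn_feats = np.random.rand(n_frames, cnn_dim).astype("float32")
bert_vecs = np.random.rand(n_frames, bert_dim).astype("float32")

# Per-frame fusion: concatenate along the feature axis. The fused
# vectors then feed the retrained classification layer (5 labels).
fused = np.concatenate([cnn_feats, bert_vecs], axis=1)

print(fused.shape)  # (100, 1280)
```

In Keras the same fusion would be a Concatenate layer joining the CNN branch and the sentence-embedding input before the final Dense classifier.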

What exactly is timestep in an LSTM Model?

Submitted by 拥有回忆 on 2021-02-18 00:57:40
Question: I am a newbie to LSTMs and RNNs as a whole, and I've been racking my brain trying to understand what exactly a timestep is. I would really appreciate an intuitive explanation. Answer 1: Let's start with a great image from Chris Olah's blog (a highly recommended read, btw): in a recurrent neural network you have multiple repetitions of the same cell. The way inference goes is: you take some input (x0), pass it through the cell to get some output 1 (depicted with the black arrow to the right in the picture) …
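The unrolling described in the answer can be sketched numerically: one loop iteration is one timestep, and the same weights are reused at every step. (The dimensions below are arbitrary, and this is a vanilla RNN cell rather than an LSTM, to keep the sketch short.)

```python
import numpy as np

rng = np.random.default_rng(0)
timesteps, input_dim, hidden_dim = 5, 3, 4   # arbitrary illustrative sizes

W_x = rng.normal(size=(input_dim, hidden_dim))
W_h = rng.normal(size=(hidden_dim, hidden_dim))
x_seq = rng.normal(size=(timesteps, input_dim))

h = np.zeros(hidden_dim)        # state carried from one timestep to the next
outputs = []
for t in range(timesteps):      # one loop iteration == one timestep
    # The SAME cell (same W_x, W_h) is applied at every step; only the
    # input x_seq[t] and the carried state h change.
    h = np.tanh(x_seq[t] @ W_x + h @ W_h)
    outputs.append(h)

outputs = np.stack(outputs)     # (timesteps, hidden_dim)
print(outputs.shape)  # (5, 4)
```

So "timesteps" in a Keras input_shape is simply how many times this loop runs per sample.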

Keras EarlyStopping: Which min_delta and patience to use?

Submitted by 流过昼夜 on 2021-02-17 19:13:56
Question: I am new to deep learning and Keras, and one of the improvements I am trying to make to my model-training process is to use Keras's keras.callbacks.EarlyStopping callback. Based on the output from training my model, does it seem reasonable to use the following parameters for EarlyStopping?

EarlyStopping(monitor='val_loss', min_delta=0.0001, patience=5, verbose=0, mode='auto')

Also, why does it appear to stop sooner than it should if it were to wait for 5 consecutive epochs …
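What those two parameters mean can be reproduced in a few lines. This sketch mirrors the min-mode logic: an epoch only counts as an improvement if the monitored loss drops by more than min_delta below the best value seen so far, and after `patience` non-improving epochs in a row training stops. It is a reimplementation for illustration, not Keras's actual code:

```python
def early_stopping_epoch(losses, min_delta=1e-4, patience=5):
    """Return the epoch index at which training would stop, or None.

    Pure-Python sketch of EarlyStopping's behaviour for a minimized
    metric such as val_loss.
    """
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses):
        if loss < best - min_delta:   # improvement must exceed min_delta
            best = loss
            wait = 0
        else:
            wait += 1                 # plateau (or improvement below min_delta)
            if wait >= patience:
                return epoch
    return None

# Loss improves twice, then plateaus: stops 5 epochs into the plateau.
print(early_stopping_epoch([1.0, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]))  # 6
```

Note that a run of epochs with tiny improvements (each smaller than min_delta) still counts toward patience, which is one common reason training stops "sooner than it should".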
