deep-learning

tensorflow:Your input ran out of data

二次信任 submitted on 2020-12-25 09:35:14
Question: I am working on a seq2seq Keras/TensorFlow 2.0 model. Every time the user inputs something, my model prints the response perfectly fine. However, on the last line of each response I get this: You: WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least steps_per_epoch * epochs batches (in this case, 2 batches). You may need to use the repeat() function when building your dataset. The "You:" is my last output, before…
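A minimal sketch of the usual fix, built on dummy data rather than the asker's seq2seq pipeline: the warning fires when the tf.data pipeline yields fewer than steps_per_epoch * epochs batches, so either call repeat() on the dataset or size steps_per_epoch to what one pass actually contains.

```python
import numpy as np
import tensorflow as tf

# Dummy data standing in for the asker's encoder/decoder batches.
x = np.random.rand(100, 10).astype("float32")
y = np.random.rand(100, 1).astype("float32")

batch_size = 64
# repeat() lets the pipeline cycle indefinitely instead of running dry mid-training.
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(batch_size).repeat()

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(10,))])
model.compile(optimizer="adam", loss="mse")

# With repeat(), steps_per_epoch only has to say how many batches one epoch should cover.
model.fit(dataset, epochs=3, steps_per_epoch=int(np.ceil(len(x) / batch_size)))
```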

What does model.eval() do in PyTorch?

孤街醉人 submitted on 2020-12-24 04:00:07
Question: I am using this code, and saw model.eval() in some cases. I understand it is supposed to allow me to "evaluate my model", but I don't understand when I should and shouldn't use it, or how to turn it off. I would like to run the above code to train the network, and also be able to run validation every epoch. I still wasn't able to do it. Answer 1: model.eval() is a kind of switch for some specific layers/parts of the model that behave differently during training and inference (evaluation) time. For…
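A minimal sketch, not the asker's network: eval() switches layers such as Dropout and BatchNorm into inference behaviour, and train() switches them back, which is the usual pattern for running validation every epoch.

```python
import torch
from torch import nn

# Toy model containing a layer (Dropout) whose behaviour depends on train/eval mode.
model = nn.Sequential(nn.Linear(10, 10), nn.Dropout(0.5), nn.Linear(10, 2))

def validate(model, loader, criterion):
    model.eval()                  # dropout off, batchnorm uses running statistics
    total = 0.0
    with torch.no_grad():         # validation also does not need gradient tracking
        for x, y in loader:
            total += criterion(model(x), y).item()
    model.train()                 # "turn it off": back to training behaviour
    return total / len(loader)
```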

Why is the model training on only 1875 training-set images when there are 60000 images in the MNIST dataset? [duplicate]

折月煮酒 submitted on 2020-12-21 04:22:20
Question: This question already has answers here: "TensorFlow Only running on 1/32 of the Training data provided" (1 answer); "Keras not training on entire dataset" (2 answers); "Training MNIST data set in google colab issue [duplicate]" (2 answers); "actual training samples and samples visible in epochs while running are different [duplicate]" (1 answer); "Model fitting doesn't use all of the provided data [duplicate]" (1 answer). Closed 7 months ago. I am trying to create a simple CNN to classify images in MNIST…
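A minimal sketch of what the linked duplicates explain: the Keras progress bar counts batches (steps), not images, so with the default batch_size of 32 the 60000 MNIST images show up as 60000 / 32 = 1875 steps per epoch. The model below is a stand-in, not the asker's CNN.

```python
import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add channel dim, scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

model.fit(x_train, y_train, epochs=1)                    # progress bar shows 1875/1875
model.fit(x_train, y_train, epochs=1, batch_size=100)    # now it shows 600/600
```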

NumPy unavailable even if eager execution is enabled

£可爱£侵袭症+ submitted on 2020-12-15 19:16:09
Question: My code works well with VGG19 loaded from a .mat file and used in the following function (I use tensorflow 1.14.0):
# initial method: load VGG19 via file and function conv2D_relu
VGG19 = scipy.io.loadmat('imagenet-vgg-verydeep-19.mat')
VGG19_layers = VGG19['layers'][0]
def conv2d_relu_old(prev_layer, n_layer, layer_name, VGG19_layers):
    # get weights for this layer:
    weights = VGG19_layers[n_layer][0][0][2][0][0]
    W = tf.constant(weights)
    bias = VGG19_layers[n_layer][0][0][2][0][1]…
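The excerpt is cut off, but a frequent cause of "NumPy unavailable" errors in TF 1.14 is that tf.enable_eager_execution() was never called, or was called after graph ops had already been built, so tensors stay symbolic and have no .numpy() method. A minimal sketch under that assumption, reusing the question's .mat indexing:

```python
import numpy as np
import scipy.io
import tensorflow as tf

# In TF 1.x, eager execution must be enabled before any other TF op is created.
tf.enable_eager_execution()

VGG19 = scipy.io.loadmat('imagenet-vgg-verydeep-19.mat')
VGG19_layers = VGG19['layers'][0]

def conv2d_relu(prev_layer, n_layer, VGG19_layers):
    # indices follow the .mat layout used in the question
    weights = VGG19_layers[n_layer][0][0][2][0][0]
    bias = VGG19_layers[n_layer][0][0][2][0][1]
    W = tf.constant(weights)
    b = tf.constant(np.reshape(bias, bias.size))
    conv = tf.nn.conv2d(prev_layer, W, strides=[1, 1, 1, 1], padding='SAME')
    return tf.nn.relu(conv + b)   # eager tensors here expose .numpy()
```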

AutoTokenizer.from_pretrained fails to load locally saved pretrained tokenizer (PyTorch)

◇◆丶佛笑我妖孽 submitted on 2020-12-15 09:05:40
Question: I am new to PyTorch and recently I have been trying to work with Transformers. I am using pretrained tokenizers provided by HuggingFace. I am successful in downloading and running them, but if I try to save them and load them again, an error occurs. If I use AutoTokenizer.from_pretrained to download a tokenizer, then it works.
[1]: tokenizer = AutoTokenizer.from_pretrained('distilroberta-base')
     text = "Hello there"
     enc = tokenizer.encode_plus(text)
     enc.keys()
Out[1]: dict_keys(['input_ids'…
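A minimal sketch of the save-then-reload round trip (the local directory name is hypothetical): save_pretrained writes the tokenizer files to a folder, and AutoTokenizer.from_pretrained can then be pointed at that folder instead of a model name.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('distilroberta-base')
tokenizer.save_pretrained('./local_tokenizer')      # writes vocab/merges + tokenizer config

# Reload from the directory that save_pretrained produced.
reloaded = AutoTokenizer.from_pretrained('./local_tokenizer')
enc = reloaded.encode_plus("Hello there")
print(enc.keys())
```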

Predicting future values in a multivariate time-series forecasting LSTM model

好久不见. submitted on 2020-12-15 08:31:07
Question: I am confused about how to predict future results with a multivariate time-series LSTM model. I am trying to build a model for stock market prediction and I have the following data features: Date, DailyHighPrice, DailyLowPrice, Volume, ClosePrice. If I train my model on 5 years of data up until today and I want to predict tomorrow's ClosePrice, essentially I will need to predict all the data features for tomorrow. This is where I am confused... because if all the data features are dependent on one…
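A minimal sketch of one common way out of that circularity, not the asker's model: let the LSTM output all features for t+1 and feed each prediction back in as the next input step (recursive forecasting). The window length and layer sizes below are hypothetical.

```python
import numpy as np
import tensorflow as tf

n_features = 4          # e.g. DailyHighPrice, DailyLowPrice, Volume, ClosePrice
window = 30             # hypothetical look-back length

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(window, n_features)),
    tf.keras.layers.Dense(n_features),   # predict every feature, not just ClosePrice
])
model.compile(optimizer="adam", loss="mse")

def forecast(model, history, steps):
    """Roll the model forward `steps` days by feeding predictions back as inputs."""
    window_data = history[-window:].copy()               # shape (window, n_features)
    preds = []
    for _ in range(steps):
        nxt = model.predict(window_data[np.newaxis], verbose=0)[0]
        preds.append(nxt)
        window_data = np.vstack([window_data[1:], nxt])  # slide the window forward
    return np.array(preds)
```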

How can I create a dataset in tensorflow with multiple outputs and data sources? [closed]

谁说我不能喝 submitted on 2020-12-15 05:32:24
Question: (Closed: this question needs details or clarity and is not currently accepting answers. Closed 5 days ago.) I have a structure like this: file01, file02, ... file_output (all dictionaries). Each file is a dataframe with features as columns, and a single output file holds 4 numbers that represent the output (the y of my network). How can I feed multiple folders like…
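A minimal sketch under an assumed file layout (the names, shapes, and target lookup are hypothetical, since the question is closed and truncated): a generator walks the per-sample feature files, pairs each one with its 4-number target, and tf.data.Dataset.from_generator turns that into a dataset Keras can consume.

```python
import numpy as np
import pandas as pd
import tensorflow as tf

def sample_generator(feature_paths, targets):
    # targets: dict mapping a feature file path -> its length-4 output vector
    for path in feature_paths:
        features = pd.read_csv(path).to_numpy(dtype=np.float32)
        yield features, np.asarray(targets[path], dtype=np.float32)

def make_dataset(feature_paths, targets, n_rows, n_features, batch_size=8):
    return tf.data.Dataset.from_generator(
        lambda: sample_generator(feature_paths, targets),
        output_signature=(
            tf.TensorSpec(shape=(n_rows, n_features), dtype=tf.float32),  # per-file features
            tf.TensorSpec(shape=(4,), dtype=tf.float32),                  # 4-number output
        ),
    ).batch(batch_size)
```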
