mnist

Reshape and write ImageDataGenerator output to CSV file

跟風遠走 submitted on 2021-01-07 01:05:55

Question: I'm working with the MNIST data set. I have the training data vectors in one CSV file (60,000 rows, each with 784 columns) and the labels in a separate CSV file. I want to bulk up the amount of training data and append the new rows to the CSV. It has to be done this way because the CSV file is then fed into a separate pipeline. I originally wrote this script:

import keras
from keras.preprocessing.image import ImageDataGenerator
import pandas as pd
X_train = pd.read_csv('train-images
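The round trip the question describes can be sketched with NumPy alone. This is a minimal sketch, not the poster's script: the file name is hypothetical, the stand-in data replaces the real CSV, and a one-pixel shift stands in for the ImageDataGenerator output (the generator's batches have the same `(N, 28, 28, 1)` shape and would be flattened the same way).

```python
import numpy as np

# Stand-in data: 4 flat 784-pixel rows, as they would come from the CSV
# (the real file has 60,000 such rows).
X_train = np.random.randint(0, 256, size=(4, 784), dtype=np.uint8)

# 1) Reshape the flat vectors to the (N, 28, 28, 1) layout that
#    ImageDataGenerator.flow() expects.
images = X_train.reshape(-1, 28, 28, 1)

# 2) Augment. A one-pixel horizontal shift stands in here for the
#    ImageDataGenerator output; each augmented batch keeps the same shape.
augmented = np.roll(images, shift=1, axis=2)

# 3) Flatten back to one 784-column row per image and append to the CSV.
rows = augmented.reshape(-1, 784)
with open("train-augmented.csv", "a") as f:   # hypothetical file name
    np.savetxt(f, rows, fmt="%d", delimiter=",")
```

Appending in `"a"` mode keeps the original rows intact, so the downstream pipeline sees old and new samples in one file.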

Convert MNIST data set from CSV to ubyte format

北战南征 submitted on 2020-12-15 02:02:55

Question: I'm working with the MNIST data set. I pulled down the original binary files (the -ubyte files; 784 columns × 60,000 rows for the training set) and converted them to CSV so I could do some processing on them. Now I want to convert the CSV files back to ubyte so I can upload them to a pipeline I'm testing. I found this code, but I would have thought converting .csv to ubyte would be a common process, particularly as the MNIST data set is so famous, and I'm wondering whether I'm missing something and if there
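The "-ubyte" files use the IDX format: a big-endian header (magic number, then each dimension as a 32-bit integer) followed by raw unsigned bytes. A sketch of writing CSV rows back into that format, with hypothetical file names and a small stand-in for the real 60,000-row data:

```python
import struct
import numpy as np

def write_idx_images(path, images):
    """Write uint8 images of shape (N, 28, 28) in the MNIST IDX3 format:
    big-endian magic 2051, then N, rows, cols, then the raw pixel bytes."""
    n, rows, cols = images.shape
    with open(path, "wb") as f:
        f.write(struct.pack(">IIII", 2051, n, rows, cols))
        f.write(images.astype(np.uint8).tobytes())

def write_idx_labels(path, labels):
    """Write uint8 labels of shape (N,) in the MNIST IDX1 format (magic 2049)."""
    with open(path, "wb") as f:
        f.write(struct.pack(">II", 2049, len(labels)))
        f.write(np.asarray(labels, dtype=np.uint8).tobytes())

# Stand-in for rows loaded from the CSV (real file: 60,000 x 784).
csv_rows = np.random.randint(0, 256, size=(10, 784), dtype=np.uint8)
write_idx_images("train-images-idx3-ubyte", csv_rows.reshape(-1, 28, 28))
write_idx_labels("train-labels-idx1-ubyte", np.arange(10, dtype=np.uint8))
```

Because the header is just four big-endian integers, the same pair of functions reverses any CSV-side processing as long as the pixel values still fit in a byte.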

Tensorflow 2.0 InvalidArgumentError: assertion failed: [Condition x == y did not hold element-wise:]

我们两清 submitted on 2020-12-06 12:20:24

Question: I am training an MNIST CNN. When I run my code, this error comes up. I tried other answers but they did not work. I am new to TensorFlow, so could someone explain this error to me? Here is my code. I am using PyCharm 2020.2 and Python 3.6 in Anaconda. There is no help I could find.

import tensorflow as tf
from tensorflow.keras.models import Sequential
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = tf.keras.utils.normalize(x_train, axis=1)
x
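The excerpt cuts off before the model and loss, but this `x == y` assertion commonly fires when the label shape does not match the loss: `sparse_categorical_crossentropy` expects integer labels of shape `(N,)`, while `categorical_crossentropy` expects one-hot rows of shape `(N, 10)`. A NumPy sketch of the two label forms (names and sample values are illustrative, not from the post):

```python
import numpy as np

y_train = np.array([5, 0, 4, 1])   # integer class labels, shape (N,)
num_classes = 10

# sparse_categorical_crossentropy expects exactly this (N,) integer form.
# categorical_crossentropy instead expects one-hot rows of shape (N, 10);
# indexing an identity matrix is a compact way to convert:
y_one_hot = np.eye(num_classes)[y_train]

print(y_train.shape, y_one_hot.shape)   # (4,) (4, 10)
```

Checking which of the two shapes your `y_train` actually has, and pairing it with the matching loss, is usually enough to clear this assertion.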

Should I use softmax as output when using cross entropy loss in pytorch?

≯℡__Kan透↙ submitted on 2020-07-18 04:24:48

Question: I have a problem classifying the MNIST dataset with a fully connected deep neural net with two hidden layers in PyTorch. I want to use tanh as the activation in both hidden layers, but at the end I should use softmax. For the loss I am choosing nn.CrossEntropyLoss() in PyTorch, which (as I have found out) does not want one-hot encoded labels as true labels, but takes a LongTensor of class indices instead. My model is nn.Sequential(), and when I use softmax at the end, it gives me worse
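The worse results are expected: `nn.CrossEntropyLoss` already applies log-softmax internally, so the network should output raw logits; adding an explicit softmax layer squashes the logits and effectively applies softmax twice. A NumPy sketch of the arithmetic (function names are mine, but the formula is what `nn.CrossEntropyLoss` computes):

```python
import numpy as np

def log_softmax(z):
    z = z - z.max(axis=1, keepdims=True)   # subtract max for numerical stability
    return z - np.log(np.exp(z).sum(axis=1, keepdims=True))

def cross_entropy(logits, targets):
    """What nn.CrossEntropyLoss computes: mean NLL of log_softmax(logits)
    at the integer class targets (LongTensor-style, not one-hot)."""
    return -log_softmax(logits)[np.arange(len(targets)), targets].mean()

logits = np.array([[2.0, 0.5, -1.0],
                   [0.1, 3.0,  0.2]])
targets = np.array([0, 1])                 # class indices

loss_raw = cross_entropy(logits, targets)

# Feeding softmax output back in applies softmax twice: the probabilities
# get flattened toward uniform and the loss on the correct class rises.
softmaxed = np.exp(log_softmax(logits))
loss_double = cross_entropy(softmaxed, targets)
print(loss_raw < loss_double)  # True
```

So the usual pattern is: no softmax in the `nn.Sequential` during training, and apply softmax (or just `argmax`) only at inference time when actual probabilities are needed.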

ModuleNotFoundError: No module named 'tensorflow.examples'

独自空忆成欢 submitted on 2020-06-27 08:53:15

Question: When I import TensorFlow (import tensorflow as tf) I don't get an error. However, I do get the error below. I'm using Spyder, if that helps. As per other questions, I made sure TensorFlow was up to date (v1.8) using both conda and then pip installs. This didn't resolve the issue. Please assist.

import tensorflow.examples.tutorials.mnist.input_data as input_data
ModuleNotFoundError: No module named 'tensorflow.examples'

Answer 1: Sometimes on downloading TF, the examples directory might not be available.
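A related note: in TensorFlow 2.x the `tensorflow.examples` tutorials package was removed entirely, so this import fails even when TF itself is installed correctly. The bundled replacement for the MNIST loader is `tf.keras.datasets.mnist`. A defensive sketch (the `load_data()` call is commented out because it downloads the data set on first use):

```python
# The old import:
#   import tensorflow.examples.tutorials.mnist.input_data as input_data
# has no equivalent package in TF >= 2.0; tf.keras.datasets.mnist is the
# bundled replacement.
try:
    import tensorflow as tf
    mnist_loader = tf.keras.datasets.mnist
    # (x_train, y_train), (x_test, y_test) = mnist_loader.load_data()
except ImportError:
    mnist_loader = None   # TF itself is not installed in this environment

print("replacement loader found:", mnist_loader is not None)
```

`load_data()` returns the same 60,000/10,000 train/test split as the old `input_data` reader, already as NumPy arrays, so no `read_data_sets` call is needed.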

Tensorflow weight initialization

烈酒焚心 submitted on 2020-06-09 08:29:05

Question: Regarding the MNIST tutorial on the TensorFlow website, I ran an experiment (gist) to see what effect different weight initializations would have on learning. I noticed that, contrary to what I read in the popular Xavier initialization paper [Glorot & Bengio, 2010], learning is just fine regardless of weight initialization. The different curves represent different values of w used to initialize the weights of the convolutional and fully connected layers. Note that all values of w work fine, even though 0.3 and 1.0
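The Glorot argument is about keeping activation variance stable as depth grows, so in a network as shallow as the MNIST tutorial's the init scale has little room to hurt. A NumPy sketch of the variance effect under stated assumptions (tanh layers, i.i.d. N(0, w^2) weights, depth and width chosen by me to make the trend visible):

```python
import numpy as np

rng = np.random.default_rng(0)

def activation_std(w, depth=10, width=256):
    """Std of activations after `depth` tanh layers whose weights are
    drawn i.i.d. from N(0, w^2) -- the experiment's style of initializer."""
    x = rng.standard_normal((64, width))
    for _ in range(depth):
        W = rng.standard_normal((width, width)) * w
        x = np.tanh(x @ W)
    return x.std()

small = activation_std(0.01)                # far below the Glorot scale
glorot = activation_std((1 / 256) ** 0.5)   # sqrt(1 / fan_in)
large = activation_std(0.3)                 # above the Glorot scale
print(small, glorot, large)

# With w = 0.01 the signal shrinks geometrically and the activations collapse
# toward 0; at the Glorot-style scale they stay in a healthy range; for large
# w the tanh saturates near +/-1, which bounds the damage and is consistent
# with learning still working in the shallow tutorial net for w up to ~1.0.
```

Repeating the measurement with `depth=2` (roughly the tutorial's depth) shows much smaller gaps between the three scales, which matches the observation that every curve in the experiment trained fine.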