mnist

Convolutional neural network outputting equal probabilities for all labels

有些话、适合烂在心里 submitted on 2019-12-02 07:28:08
I am currently training a CNN on MNIST, and the output probabilities (softmax) are giving [0.1, 0.1, ..., 0.1] as training goes on. The initial values aren't uniform, so I can't figure out whether I'm doing something stupid here. I'm only training for 15 steps, just to see how training progresses; even though that's a low number, I don't think that should result in uniform predictions?

import numpy as np
import tensorflow as tf
import imageio
from sklearn.datasets import fetch_mldata

mnist = fetch_mldata('MNIST original')  # Getting data

from sklearn.model_selection import train_test_split

def one_hot
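The excerpt breaks off at the one-hot helper; a minimal sketch of what such a helper usually looks like is given below (the body is an assumption for illustration, not the poster's original code):

import numpy as np

def one_hot(labels, num_classes=10):
    # e.g. label 3 becomes [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
    encoded = np.zeros((labels.shape[0], num_classes))
    encoded[np.arange(labels.shape[0]), labels.astype(int)] = 1
    return encoded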

Unexpected increase in validation error in MNIST Pytorch

蓝咒 submitted on 2019-12-02 05:13:34
I'm a bit new to the whole field and thus decided to work on the MNIST dataset. I pretty much adapted the whole code from https://github.com/pytorch/examples/blob/master/mnist/main.py , with only one significant change: data loading. I didn't want to use the pre-loaded dataset within Torchvision, so I used MNIST in CSV. I loaded the data from the CSV file by inheriting from Dataset and making a new dataloader. Here's the relevant code:

mean = 33.318421449829934
sd = 78.56749081851163
# mean = 0.1307
# sd = 0.3081

import numpy as np
from torch.utils.data import Dataset, DataLoader

class dataset
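The class definition is cut off above; the sketch below shows one way such a CSV-backed Dataset can be written, using the mean/sd from the excerpt (the class name, the header row, and the label-first column order are assumptions):

import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

class MnistCsvDataset(Dataset):
    def __init__(self, csv_path, mean=33.318421449829934, sd=78.56749081851163):
        # each row: label, then 784 pixel values in 0..255; skiprows=1 assumes a header row
        data = np.loadtxt(csv_path, delimiter=',', skiprows=1)
        self.labels = torch.tensor(data[:, 0], dtype=torch.long)
        pixels = (data[:, 1:] - mean) / sd  # normalize the raw pixel values
        self.images = torch.tensor(pixels, dtype=torch.float32).view(-1, 1, 28, 28)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, idx):
        return self.images[idx], self.labels[idx]

# loader = DataLoader(MnistCsvDataset("mnist_train.csv"), batch_size=64, shuffle=True)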

Implementing a CNN for MNIST recognition with TensorFlow

淺唱寂寞╮ submitted on 2019-12-01 17:14:53
This is my first CNN program. I am not yet familiar with backpropagation in CNNs, but working through this code gave me an initial understanding of how TensorFlow executes.

'''
softmax classifier for mnist

created on 2019.9.28
author: vince
'''
import math
import logging
import numpy
import random
import matplotlib.pyplot as plt
import tensorflow as tf
from tensorflow.contrib.learn.python.learn.datasets.mnist import read_data_sets
from sklearn.metrics import accuracy_score

def weight_bais_variable(shape):
    init = tf.random.truncated_normal(shape=shape, stddev=0.01);
    return tf.Variable(init);

def bais_variable(shape):
    init = tf.constant(0.1, shape
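The excerpt stops inside bais_variable; a sketch of how such a TF1 CNN typically continues, reusing the helper names from the excerpt (the layer sizes and placeholder shape are assumptions):

def conv2d(x, w):
    return tf.nn.conv2d(x, w, strides=[1, 1, 1, 1], padding='SAME')

def max_pool_2x2(x):
    return tf.nn.max_pool(x, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

x = tf.placeholder(tf.float32, [None, 784])
x_image = tf.reshape(x, [-1, 28, 28, 1])

# first convolutional layer: 1 input channel -> 32 feature maps
w_conv1 = weight_bais_variable([5, 5, 1, 32])
b_conv1 = bais_variable([32])
h_conv1 = tf.nn.relu(conv2d(x_image, w_conv1) + b_conv1)
h_pool1 = max_pool_2x2(h_conv1)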

tensorboard_embedding

时间秒杀一切 submitted on 2019-12-01 16:42:41
from tensorboardX import SummaryWriter
import torchvision

writer = SummaryWriter(log_dir="embedding")
mnist = torchvision.datasets.MNIST("mnist", download=True)
writer.add_embedding(
    mat=mnist.train_data.reshape((-1, 28*28))[:100, :],
    metadata=mnist.train_labels[:100],
    label_img=mnist.train_data[:100, :, :].reshape((-1, 1, 28, 28)).float()/255,
    global_step=0
)

add_embedding(mat, metadata=None, label_img=None, global_step=None, tag='default', metadata_header=None)
mat (torch.Tensor or numpy.array): a matrix in which every row represents one data point in the feature space
metadata (list or torch.Tensor or numpy.array, optional): a one-dimensional list; mat
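To inspect the result, TensorBoard is then pointed at the same directory used above (log_dir="embedding"), e.g. with the command tensorboard --logdir embedding, and the embedding appears under the PROJECTOR tab.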

Handwritten digit recognition with TensorFlow 2.0

旧时模样 submitted on 2019-12-01 16:02:06
import tensorflow as tf
import matplotlib.pyplot as plt
import numpy as np

datapath = r'D:\data\ml\mnist.npz'
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data(datapath)
x_train = tf.keras.utils.normalize(x_train, axis=1)
x_test = tf.keras.utils.normalize(x_test, axis=1)

model = tf.keras.models.Sequential()
model.add(tf.keras.layers.Flatten())
model.add(tf.keras.layers.Dense(128, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(128, activation=tf.nn.relu))
model.add(tf.keras.layers.Dense(10, activation=tf.nn.softmax))
model.compile(optimizer='adam', loss='sparse
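The excerpt is cut off inside model.compile; a typical continuation for this kind of Keras model is sketched below (the loss name follows standard Keras usage, and the epoch count is an assumption):

model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=3)  # train on the normalized images

# evaluate on the held-out test split
val_loss, val_acc = model.evaluate(x_test, y_test)
print(val_loss, val_acc)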

Advanced TensorBoard usage and the TensorFlow Playground

霸气de小男生 submitted on 2019-12-01 13:21:36
Main topics: 1. Going further with TensorBoard

1.1 Displaying images
When the model's input layer is built, a placeholder x is defined; the placeholder receives the image data of the training samples. To show these images in TensorBoard, they have to be added to the summary. This is done with tf.summary.image(): the first argument is an identifying tag, the second is the image data, and the third is 10, meaning at most 10 images are displayed. image_shaped_input must be four-dimensional; the last three dimensions are the image height, width and number of color channels (1 for grayscale), and the first dimension is the number of rows of data in one batch, where -1 means it is left unspecified and is inferred from the total amount of data brought in.

1.2 Displaying tensors
Tensors involved in the computation can be shown in TensorBoard as histograms via tf.summary.histogram(). Its first argument is again an identifying tag, and the second is the tensor to display.

1.3 Displaying scalars
Once the loss function is defined, tf.summary.scalar() can display the loss value as a scalar. The accuracy can be displayed as a scalar in the same way.

1.4 Displaying the training process
While training the model, all the summary operations defined above need to be merged. A merged_summary_op is defined here by directly calling tf
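A minimal sketch of the summary calls described above, in TF1 style; the tensor names (x, loss, accuracy) are placeholders assumed for illustration:

import tensorflow as tf

x = tf.placeholder(tf.float32, [None, 784], name="x")
# reshape the flat input back into batches of 28x28 grayscale images
image_shaped_input = tf.reshape(x, [-1, 28, 28, 1])
tf.summary.image("input", image_shaped_input, 10)  # show at most 10 images

# ... model definition producing logits, loss and accuracy ...
# tf.summary.histogram("logits", logits)
# tf.summary.scalar("loss", loss)
# tf.summary.scalar("accuracy", accuracy)

# merge all summary operations so a single run() fetches everything
merged_summary_op = tf.summary.merge_all()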

Tensorflow export estimators for prediction

家住魔仙堡 submitted on 2019-12-01 10:41:12
Question: I wonder how I can export the estimator and then import it for prediction, following the MNIST tutorial on TensorFlow's site. Thank you!

Answer 1: The Estimator has a model_dir argument specifying where the model is saved. So during prediction we use the Estimator and call the predict method, which recreates the graph and loads the checkpoints. For the MNIST example, the prediction code would be:

tf.reset_default_graph()

# An input-function to predict the class of new data.
predict_input_fn = tf.estimator.inputs
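A hedged sketch of how that prediction step is usually completed; the model_fn, model directory and new_images array below are illustrative assumptions, not taken from the excerpt:

import numpy as np
import tensorflow as tf

# Rebuild the Estimator, pointing model_dir at the directory that holds the checkpoints.
classifier = tf.estimator.Estimator(model_fn=model_fn, model_dir="./mnist_model")

# An input function feeding a batch of new images (flattened 28x28 floats).
predict_input_fn = tf.estimator.inputs.numpy_input_fn(
    x={"x": new_images.astype(np.float32)},
    num_epochs=1,
    shuffle=False)

# predict() restores the latest checkpoint and yields one result per input example.
for pred in classifier.predict(input_fn=predict_input_fn):
    print(pred)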

TensorFlow details - P312 - PROJECTOR

你说的曾经没有我的故事 submitted on 2019-12-01 10:27:25
First the data is preprocessed: a .tsv metadata file and a .jpg sprite image need to be generated.

import matplotlib.pyplot as plt
import numpy as np
import os
from tensorflow.examples.tutorials.mnist import input_data

LOG_DIR = 'log'
SPRITE_FILE = 'mnist_sprite.jpg'
META_FIEL = "mnist_meta.tsv"  # stores indices and labels

def create_sprite_image(images):
    if isinstance(images, list):
        images = np.array(images)
    img_h = images.shape[1]
    img_w = images.shape[2]
    n_plots = int(np.ceil(np.sqrt(images.shape[0])))
    spriteimage = np.ones((img_h * n_plots, img_w * n_plots))
    for i in range(n_plots):
        for j in range(n_plots):
            this_filter = i * n_plots + j
            if this_filter < images.shape[0]:  # number of images
                this
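A sketch of how such a preprocessing script typically finishes, writing the sprite image and the metadata .tsv; it reuses the names from the excerpt, and everything past the truncation point is an assumption:

# Assemble the sprite from the first 100 test images and write both artifacts.
mnist = input_data.read_data_sets("mnist_data", one_hot=False)
to_visualise = np.reshape(mnist.test.images[:100], (-1, 28, 28))
sprite_image = create_sprite_image(to_visualise)

# The PROJECTOR plugin cuts individual thumbnails out of this single image.
plt.imsave(os.path.join(LOG_DIR, SPRITE_FILE), sprite_image, cmap='gray')

# One label per line, in the same order as the rows of the embedding matrix.
with open(os.path.join(LOG_DIR, META_FIEL), 'w') as f:
    f.write("Index\tLabel\n")
    for index, label in enumerate(mnist.test.labels[:100]):
        f.write("%d\t%d\n" % (index, label))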

TensorFlow 2.0 introductory tutorials and hands-on examples

只谈情不闲聊 submitted on 2019-12-01 10:11:50
Chinese documentation: TensorFlow 2 / 2.0 Chinese documentation. Zhihu column: you are welcome to follow the column at https://zhuanlan.zhihu.com/geektutu

1. Hands-on tutorials: reinforcement learning
TensorFlow 2.0 (9) - Reinforcement learning: Policy Gradient hands-on in 70 lines of code
TensorFlow 2.0 (8) - Reinforcement learning: DQN plays gym Mountain Car
TensorFlow 2.0 (7) - Reinforcement learning: Q-Learning plays OpenAI gym
TensorFlow 2.0 (6) - Supervised learning plays an OpenAI gym game

2. Hands-on tutorials: image recognition
TensorFlow 2.0 (5) - MNIST handwritten digit recognition (CNN convolutional neural network)
TensorFlow introduction (4) - MNIST handwritten digit recognition (building an h5py training set)
TensorFlow introduction (3) - MNIST handwritten digit recognition (visualizing training)
TensorFlow introduction (2) - MNIST handwritten digit recognition (saving and loading the model)
TensorFlow introduction (1) - MNIST handwritten digit recognition (building the network)

3. GitHub source code
Github - TensorFlow 2.0 Tutorial
Github - TensorFlow 2 / 2.0 Chinese documentation

Tensorflow Slim restore model and predict

≡放荡痞女 submitted on 2019-12-01 07:29:53
Question: I'm currently trying to learn how to use TF-Slim and I'm following this tutorial: https://github.com/mnuke/tf-slim-mnist. Assuming that I already have a trained model saved in a checkpoint, how do I now use that model and apply it? For example, in the tutorial, how do I take my trained MNIST model, feed in a new set of MNIST images, and print the predictions?

Answer 1: You can try a workflow like:

# obtain the checkpoint file
checkpoint_file = tf.train.latest_checkpoint("./log")

# Construct a model as such
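A sketch of how that workflow can continue, restoring the checkpoint and running a prediction; the model-building function my_model and the array new_images are hypothetical placeholders for whatever network and data the tutorial uses:

import tensorflow as tf

# Rebuild the same graph that was used during training.
images = tf.placeholder(tf.float32, [None, 28, 28, 1])
logits = my_model(images)          # hypothetical model-building function
predictions = tf.argmax(logits, 1)

saver = tf.train.Saver()
with tf.Session() as sess:
    saver.restore(sess, checkpoint_file)  # load the trained weights
    preds = sess.run(predictions, feed_dict={images: new_images})
    print(preds)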