neural-network

Data shuffling for Image Classification

╄→гoц情女王★ submitted on 2020-04-17 22:50:44
Question: I want to develop a CNN model to identify 24 hand signs in American Sign Language. I created a custom dataset that contains 3,000 images for each hand sign, i.e. 72,000 images in total. For training the model, I would use an 80/20 split (2,400 images per hand sign in the training set and 600 per hand sign in the validation set). My question is: should I randomly shuffle the images when creating the dataset, and why? Based on my previous experience, it led to validation loss…
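
A minimal sketch of why shuffling matters, using NumPy stand-ins for the images (the array sizes here are scaled down and hypothetical): a dataset built class-by-class is ordered, so an unshuffled 80/20 split would leave some classes entirely out of the validation set.

```python
import numpy as np

# Hypothetical stand-ins for the images and their 24 class labels.
rng = np.random.default_rng(0)
images = rng.random((720, 8))          # scaled down: 720 "images", 8 features each
labels = np.repeat(np.arange(24), 30)  # 30 examples per class, grouped by class

# Shuffle once with a single permutation so images and labels stay aligned,
# then take an 80/20 split.
perm = rng.permutation(len(images))
images, labels = images[perm], labels[perm]
split = int(0.8 * len(images))
x_train, x_val = images[:split], images[split:]
y_train, y_val = labels[:split], labels[split:]

# Without the shuffle, the tail 20% would contain only the last few classes,
# so the validation set would not represent the training distribution.
```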

Big difference between val-acc and prediction accuracy in Keras Neural Network

社会主义新天地 submitted on 2020-04-17 20:59:08
Question: I have a dataset that I used to build an NN model in Keras. I took 2,000 rows from that dataset to use as validation data; those 2,000 rows are the ones passed to the .predict function. I wrote the Keras NN code and so far it works well, but I noticed something that is very strange to me. It gives me very good accuracy of more than 83% and a loss around 0.12, but when I make predictions on the unseen data (those 2,000 rows), it is only correct about 65% of the time on average. When I add Dropout…
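
One frequent cause of such a gap (an assumption here, since the question's code is truncated) is computing the hold-out accuracy by hand in a way that differs from Keras's accuracy metric, e.g. rounding per-class probabilities instead of taking the argmax. A small NumPy sketch of the argmax conversion:

```python
import numpy as np

# Toy predicted probabilities for 4 samples over 3 classes (rows sum to 1).
probs = np.array([
    [0.6, 0.3, 0.1],
    [0.2, 0.5, 0.3],
    [0.1, 0.2, 0.7],
    [0.4, 0.4, 0.2],
])
y_true = np.array([0, 1, 2, 1])

# Pick the most probable class per row, which is how Keras's accuracy
# metric scores multi-class predictions; rounding each column would not
# yield a single class label per sample.
y_pred = probs.argmax(axis=1)
accuracy = (y_pred == y_true).mean()
```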

Self Organizing Map isn't working properly, always outputs the same class

可紊 submitted on 2020-04-16 05:14:12
Question: I want to train and test a Kohonen network, which is a kind of self-organizing map. My problem is that I get the same values for all outputs, either 0000 or 1111, every time, even though I'm using a random weight matrix that differs on each run of the code! My dataset is 3 tiny text files at the link below; note that I'm first using samples from my training data to check whether my code is correct, before using the test data. data-sets link
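
Without the original code, one common reason a Kohonen layer keeps electing the same winner is unnormalized inputs or weights. A minimal, self-contained 1-D SOM sketch (all names and sizes hypothetical) that normalizes both:

```python
import numpy as np

rng = np.random.default_rng(42)

def train_som(data, n_units=4, epochs=50, lr=0.5):
    """Minimal 1-D Kohonen layer: winner-take-all with a decaying learning rate."""
    # Normalize inputs to unit length; unnormalized magnitudes can make a
    # single unit win for every sample, producing the same output each run.
    data = data / np.linalg.norm(data, axis=1, keepdims=True)
    w = rng.random((n_units, data.shape[1]))
    w /= np.linalg.norm(w, axis=1, keepdims=True)
    for epoch in range(epochs):
        eta = lr * (1 - epoch / epochs)  # decaying learning rate
        for x in data:
            winner = int(np.argmin(np.linalg.norm(w - x, axis=1)))
            w[winner] += eta * (x - w[winner])      # pull only the winner toward x
            w[winner] /= np.linalg.norm(w[winner])  # keep it on the unit sphere
    return w

# Four loose clusters, one per basis direction (stand-in for the text files).
data = rng.random((20, 5)) + np.repeat(np.eye(5)[:4], 5, axis=0)
w = train_som(data)
unit = data / np.linalg.norm(data, axis=1, keepdims=True)
winners = [int(np.argmin(np.linalg.norm(w - x, axis=1))) for x in unit]
```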

How can I use tf.keras.Model.summary to see the layers of a child model inside a parent model?

人走茶凉 submitted on 2020-04-11 07:38:09
Question: I have a subclass model of tf.keras.Model; the code is as follows:

import tensorflow as tf

class Mymodel(tf.keras.Model):
    def __init__(self, classes, backbone_model, *args, **kwargs):
        super(Mymodel, self).__init__(self, args, kwargs)
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes, activation='sigmoid')

    def call(self, inputs):
        x = self.backbone(inputs)
        x = self.classify_layer(x)
        return x

inputs = tf.keras.Input(shape=(224, 224, 3))
model = Mymodel(inputs=inputs,…
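
For reference, a common pattern, sketched here with a tiny stand-in backbone (the layer sizes are assumptions): a subclassed model shows a nested model as a single row in summary(), so you build the parent by calling it once and then call summary() on the child model attribute directly. Note that the question's super(Mymodel, self).__init__(self, args, kwargs) also passes self and the argument tuples positionally, which the sketch avoids.

```python
import tensorflow as tf

# A small stand-in backbone; the question's backbone_model is assumed to be
# any functional Keras model.
backbone_in = tf.keras.Input(shape=(8,))
backbone_out = tf.keras.layers.Dense(4, activation="relu", name="feat")(backbone_in)
backbone = tf.keras.Model(backbone_in, backbone_out, name="backbone")

class MyModel(tf.keras.Model):
    def __init__(self, classes, backbone_model, **kwargs):
        super().__init__(**kwargs)  # do not forward self or args positionally
        self.backbone = backbone_model
        self.classify_layer = tf.keras.layers.Dense(classes, activation="sigmoid")

    def call(self, inputs):
        return self.classify_layer(self.backbone(inputs))

model = MyModel(classes=3, backbone_model=backbone)
model(tf.zeros((1, 8)))   # build the model by calling it once
model.summary()           # shows "backbone" as a single layer...
model.backbone.summary()  # ...so summarize the child model directly
```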

How to disable dropout while prediction in keras?

早过忘川 submitted on 2020-04-10 07:07:48
Question: I am using dropout in a neural network model in Keras. A bit of the code looks like: model.add(Dropout(0.5)) model.add(Dense(classes)). For testing, I am using preds = model_1.predict_proba(image). But while testing, Dropout also participates in predicting the score, which should not happen. I searched a lot for a way to disable the dropout but didn't find any hint yet. Does anyone have a solution to disable Dropout while testing in Keras? Answer 1: Keras does this by default. In Keras, dropout is disabled in test…
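
The answer can be checked directly. In a minimal functional model (hypothetical shapes), predict() runs in inference mode with dropout disabled, so repeated calls agree; passing training=True re-enables dropout, e.g. for Monte Carlo dropout:

```python
import numpy as np
import tensorflow as tf

inputs = tf.keras.Input(shape=(10,))
x = tf.keras.layers.Dropout(0.5)(inputs)
outputs = tf.keras.layers.Dense(3)(x)
model = tf.keras.Model(inputs, outputs)

data = np.ones((4, 10), dtype="float32")

# predict() runs in inference mode: dropout is a no-op, so results repeat.
p1 = model.predict(data, verbose=0)
p2 = model.predict(data, verbose=0)

# Passing training=True forces dropout on (useful for Monte Carlo dropout).
stochastic = model(data, training=True)
```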

How can I use the output of intermediate layer of one model as input to another model?

时光总嘲笑我的痴心妄想 submitted on 2020-04-07 19:07:50
Question: I train a model A and try to use the output of the intermediate layer named "layer_x" as an additional input for model B. I tried to obtain the intermediate layer's output as described in the Keras FAQ: https://keras.io/getting-started/faq/#how-can-i-obtain-the-output-of-an-intermediate-layer. Model A:

inputs = Input(shape=(100,))
dnn = Dense(1024, activation='relu')(inputs)
dnn = Dense(128, activation='relu', name="layer_x")(dnn)
dnn = Dense(1024, activation='relu')(dnn)
output = Dense(10,…
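
The usual pattern from that FAQ entry, sketched on the model A fragment in the question (its final layer is truncated there, so the 10-way softmax is an assumption): build a second Model that reuses model A's graph up to "layer_x", and feed that model's output to model B.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model

# Model A, reconstructed from the question; the softmax output is assumed.
inputs = Input(shape=(100,))
dnn = Dense(1024, activation="relu")(inputs)
dnn = Dense(128, activation="relu", name="layer_x")(dnn)
dnn = Dense(1024, activation="relu")(dnn)
output = Dense(10, activation="softmax")(dnn)
model_a = Model(inputs, output)

# A second model sharing model A's layers but stopping at "layer_x";
# its output can then serve as an extra input to model B.
feature_extractor = Model(inputs=model_a.input,
                          outputs=model_a.get_layer("layer_x").output)

features = feature_extractor(np.zeros((2, 100), dtype="float32"))
```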

How to create variable names in loop for layers in pytorch neural network

拥有回忆 submitted on 2020-04-05 06:40:11
Question: I am implementing a straightforward feedforward neural network in PyTorch. However, I am wondering whether there is a nicer way to add a flexible number of layers to the network? Maybe by naming them in a loop, but I heard that's impossible? Currently I am doing it like this:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self, input_dim, output_dim, hidden_dim):
        super(Net, self).__init__()
        self.input_dim = input_dim
        self.output_dim = output_dim…
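
A common alternative (a sketch, not the asker's full code): register the repeated layers in an nn.ModuleList rather than generating attribute names in a loop; PyTorch then tracks their parameters automatically.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    """Feedforward net with a configurable number of hidden layers.

    Layers stored in an nn.ModuleList are registered submodules, so their
    parameters show up in net.parameters() for the optimizer.
    """
    def __init__(self, input_dim, output_dim, hidden_dim, n_hidden=3):
        super().__init__()
        dims = [input_dim] + [hidden_dim] * n_hidden
        self.hidden = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(len(dims) - 1)
        )
        self.out = nn.Linear(hidden_dim, output_dim)

    def forward(self, x):
        for layer in self.hidden:
            x = F.relu(layer(x))
        return self.out(x)

net = Net(input_dim=8, output_dim=2, hidden_dim=16, n_hidden=4)
y = net(torch.zeros(5, 8))
```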

Can the sigmoid activation function be used to solve regression problems in Keras?

感情迁移 submitted on 2020-03-21 20:15:54
Question: I have implemented simple neural networks with R, but this is my first time doing so with Keras, so I would appreciate some advice. I developed a neural network function in Keras to predict car sales (the dataset is available here); CarSales is the dependent variable. As far as I'm aware, Keras is used to develop neural networks for classification purposes rather than regression, and in all the examples I have seen so far the output is bounded between 0 and 1. Here is the code I developed, and you…
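
Keras handles regression fine; the sigmoid is what bounds outputs to (0, 1). A sketch with toy data standing in for the car-sales set (the real columns aren't reproduced here): give the output layer a linear activation and train with a mean-squared-error loss, so predictions are unbounded.

```python
import numpy as np
import tensorflow as tf

# Toy regression data: one continuous target per row (hypothetical stand-in).
rng = np.random.default_rng(0)
x = rng.random((200, 4)).astype("float32")
y = (x @ np.array([3.0, -2.0, 1.0, 0.5], dtype="float32")).reshape(-1, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1),  # linear activation: output is unbounded
])
model.compile(optimizer="adam", loss="mse")  # regression loss, not cross-entropy
model.fit(x, y, epochs=5, verbose=0)
preds = model.predict(x, verbose=0)
```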