cross-validation

Why should I build separate graphs for training and validation in TensorFlow?

走远了吗 · Submitted on 2019-12-01 10:33:01
I've been using TensorFlow for a while now. At first I had stuff like this: def myModel(training): with tf.scope_variables('model', reuse=not training): do model return model training_model = myModel(True) validation_model = myModel(False) Mostly because I started with some MOOCs that taught me to do that. But they also didn't use TFRecords or Queues, and I didn't know why I was using two separate models. I tried building only one and feeding the data with feed_dict: everything worked. Ever since, I've usually been using only one model. My inputs are always placeholders and I just input
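
A common alternative to the two-model pattern in the excerpt is a single graph whose train/eval behaviour is switched by a boolean placeholder. A minimal TF 1.x sketch, assuming hypothetical layer sizes and dummy data rather than the asker's actual model:

    import numpy as np
    import tensorflow as tf  # assumes TensorFlow 1.x APIs

    x = tf.placeholder(tf.float32, [None, 32], name="x")
    is_training = tf.placeholder(tf.bool, name="is_training")

    with tf.variable_scope("model"):
        hidden = tf.layers.dense(x, 64, activation=tf.nn.relu)
        # Dropout is active only when is_training is fed as True.
        hidden = tf.layers.dropout(hidden, rate=0.5, training=is_training)
        logits = tf.layers.dense(hidden, 3)

    train_batch = np.random.rand(8, 32).astype(np.float32)   # dummy data
    valid_batch = np.random.rand(8, 32).astype(np.float32)

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        # One set of variables serves both phases; only the flag and the data change.
        train_logits = sess.run(logits, {x: train_batch, is_training: True})
        valid_logits = sess.run(logits, {x: valid_batch, is_training: False})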

How to create leave-one-out cross-validation in MATLAB? [duplicate]

风格不统一 · Submitted on 2019-12-01 10:11:21
Question: This question already has an answer here: Leave one out cross validation algorithm in matlab (1 answer). Closed 6 years ago. I am still confused with my code. I tried to implement leave-one-out cross-validation in MATLAB for classification, where I take one sample out of the training set to use as test data. I have already written the code in MATLAB, but I am not sure it is correct because the result is wrong. Can someone help me correct it? Thank you very much. This is my code: clc [C,F] = train('D
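
The MATLAB code in the excerpt is cut off, but the leave-one-out procedure itself is easy to illustrate. A minimal scikit-learn sketch of the same idea, using a generic classifier and the built-in iris data rather than the asker's own train/test functions:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import LeaveOneOut, cross_val_score
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    loo = LeaveOneOut()                    # each fold holds out exactly one sample
    clf = KNeighborsClassifier(n_neighbors=3)

    scores = cross_val_score(clf, X, y, cv=loo)   # one 0/1 score per held-out sample
    print("LOOCV accuracy:", scores.mean())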

Example of 10-fold cross-validation with Neural network classification in MATLAB

为君一笑 · Submitted on 2019-12-01 08:54:58
Question: I am looking for an example of applying 10-fold cross-validation to a neural network. I need something like the answer to this question: Example of 10-fold SVM classification in MATLAB. I would like to classify all 3 classes, while in that example only two classes were considered. Edit: here is the code I wrote for the iris example: load fisheriris %# load iris dataset k=10; cvFolds = crossvalind('Kfold', species, k); %# get indices of 10-fold CV net = feedforwardnet(10); for i = 1:k %# for each fold
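
For comparison, the same experiment (10-fold cross-validation of a small neural-network classifier on all three iris classes) can be sketched in a few lines of Python with scikit-learn; this is an illustrative analogue, not the feedforwardnet/crossvalind MATLAB code being asked about:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.neural_network import MLPClassifier

    X, y = load_iris(return_X_y=True)            # 3 classes
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)

    scores = cross_val_score(net, X, y, cv=cv)   # one accuracy value per fold
    print("per-fold accuracy:", scores)
    print("mean accuracy:", scores.mean())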

How to calculate feature importance in each model of cross-validation in sklearn

天大地大妈咪最大 · Submitted on 2019-12-01 05:35:16
Question: I am using RandomForestClassifier() with 10-fold cross-validation as follows: clf=RandomForestClassifier(random_state = 42, class_weight="balanced") k_fold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42) accuracy = cross_val_score(clf, X, y, cv=k_fold, scoring = 'accuracy') print(accuracy.mean()) I want to identify the important features in my feature space. It seems to be straightforward to get the feature importance for a single classifier as follows: print("Features sorted
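
One way to get a per-fold importance vector with the same objects the excerpt already uses is cross_validate with return_estimator=True (available in scikit-learn 0.20+), which returns the fitted classifier from each fold. A short sketch with toy data standing in for the asker's X and y:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import StratifiedKFold, cross_validate

    X, y = make_classification(n_samples=200, n_features=8, random_state=42)
    clf = RandomForestClassifier(random_state=42, class_weight="balanced")
    k_fold = StratifiedKFold(n_splits=10, shuffle=True, random_state=42)

    res = cross_validate(clf, X, y, cv=k_fold, scoring="accuracy",
                         return_estimator=True)

    for fold, fitted in enumerate(res["estimator"]):
        # feature_importances_ is available because each returned estimator is fitted
        print(f"fold {fold}: {fitted.feature_importances_}")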

Creating a table with individual trials from a frequency table in R (inverse of table function)

二次信任 · Submitted on 2019-12-01 03:46:54
I have a frequency table of data in a data.frame in R listing factor levels and counts of successes and failures. I would like to turn it from a frequency table into a list of events, i.e. the opposite of the "table" command. Specifically, I would like to turn this:

factor.A  factor.B  success.count  fail.count
--------  --------  -------------  ----------
0         1         0              2
1         1         2              1

into this:

factor.A  factor.B  result
--------  --------  ------
0         1         0
0         1         0
1         1         1
1         1         1
1         1         0

It seems to me that reshape ought to do this, or even some obscure base function that I have not heard of, but I've had no luck. Even
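
The question is about R, but the expansion itself is simple to show. A small pandas sketch (an illustrative Python analogue, not an R/reshape answer) that reproduces the example tables above:

    import pandas as pd

    freq = pd.DataFrame({
        "factor.A": [0, 1],
        "factor.B": [1, 1],
        "success.count": [0, 2],
        "fail.count": [2, 1],
    })

    rows = []
    for _, r in freq.iterrows():
        # one row per success (result = 1) and one per failure (result = 0)
        rows += [{"factor.A": r["factor.A"], "factor.B": r["factor.B"], "result": 1}] * int(r["success.count"])
        rows += [{"factor.A": r["factor.A"], "factor.B": r["factor.B"], "result": 0}] * int(r["fail.count"])

    trials = pd.DataFrame(rows)
    print(trials)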

Using neuralnet with caret train and adjusting the parameters

ぐ巨炮叔叔 · Submitted on 2019-12-01 00:31:23
So I've read a paper that used neural networks to model a dataset which is similar to the dataset I'm currently using. I have 160 descriptor variables that I want to model for 160 cases (regression modelling). The paper I read used the following parameters: 'For each split, a model was developed for each of the 10 individual train-test folds. A three-layer back-propagation net with 33 input neurons and 16 hidden neurons was used with online weight updates, 0.25 learning rate, and 0.9 momentum. For each fold, learning was conducted from a total of 50 different random initial weight
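
The caret/neuralnet call being asked about is not shown, but the paper's settings (10 folds, one hidden layer of 16 neurons, learning rate 0.25, momentum 0.9) map fairly directly onto a scikit-learn sketch; this uses hypothetical toy data and an MLPRegressor rather than the R packages in question:

    from sklearn.datasets import make_regression
    from sklearn.model_selection import KFold, cross_val_score
    from sklearn.neural_network import MLPRegressor

    X, y = make_regression(n_samples=160, n_features=33, noise=0.1, random_state=0)

    net = MLPRegressor(hidden_layer_sizes=(16,),   # 16 hidden neurons
                       solver="sgd",               # stochastic weight updates
                       learning_rate_init=0.25,    # 0.25 learning rate
                       momentum=0.9,               # 0.9 momentum
                       max_iter=2000, random_state=0)

    cv = KFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(net, X, y, cv=cv, scoring="r2")
    print("per-fold R^2:", scores)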

Selecting SVM parameters using cross validation and F1-scores

牧云@^-^@ · Submitted on 2019-11-30 23:36:02
I need to keep track of the F1-scores while tuning C & sigma in SVM. For example, the following code keeps track of the accuracy; I need to change it to the F1-score, but I was not able to do that. %# read some training data [labels,data] = libsvmread('./heart_scale'); %# grid of parameters folds = 5; [C,gamma] = meshgrid(-5:2:15, -15:2:3); %# grid search, and cross-validation cv_acc = zeros(numel(C),1); for i=1:numel(C) cv_acc(i) = svmtrain(labels, data, ... sprintf('-c %f -g %f -v %d', 2^C(i), 2^gamma(i), folds)); end %# pair (C,gamma) with best accuracy [~,idx] = max(cv_acc); %# now you can
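
One way to make the grid search optimize F1 instead of accuracy (shown here with scikit-learn rather than the libsvm MATLAB interface in the excerpt, and with toy data) is to pass scoring='f1' to GridSearchCV over the same C/gamma grid:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, n_features=13, random_state=0)

    # same exponent ranges as meshgrid(-5:2:15, -15:2:3) in the excerpt
    param_grid = {"C": 2.0 ** np.arange(-5, 16, 2),
                  "gamma": 2.0 ** np.arange(-15, 4, 2)}

    # 5-fold cross-validated grid search scored by F1 instead of accuracy
    grid = GridSearchCV(SVC(kernel="rbf"), param_grid, scoring="f1", cv=5)
    grid.fit(X, y)

    print("best F1:", grid.best_score_)
    print("best params:", grid.best_params_)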

Fitting in nested cross-validation with cross_val_score with pipeline and GridSearch

你说的曾经没有我的故事 · Submitted on 2019-11-30 20:06:13
Question: I am working in scikit-learn and I am trying to tune my XGBoost model. I attempted a nested cross-validation, using a pipeline for the rescaling of the training folds (to avoid data leakage and overfitting), together with GridSearchCV for parameter tuning and cross_val_score to get the roc_auc score at the end. from imblearn.pipeline import Pipeline from sklearn.model_selection import RepeatedKFold from sklearn.model_selection import GridSearchCV from sklearn.model_selection import cross
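
The general wiring the excerpt describes, with a scaling step inside the pipeline, GridSearchCV as the inner loop, and cross_val_score over an outer RepeatedKFold scored by ROC AUC, looks roughly like this sketch (toy data, an arbitrary small parameter grid, sklearn's Pipeline standing in for the imblearn one, and the assumption that the xgboost package is installed):

    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV, RepeatedKFold, cross_val_score
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from xgboost import XGBClassifier

    X, y = make_classification(n_samples=300, n_features=20, random_state=42)

    # Rescaling lives inside the pipeline, so it is re-fit on each training fold only.
    pipe = Pipeline([("scale", StandardScaler()),
                     ("xgb", XGBClassifier(random_state=42))])

    param_grid = {"xgb__max_depth": [3, 5],
                  "xgb__n_estimators": [100, 200]}

    inner = GridSearchCV(pipe, param_grid, cv=3, scoring="roc_auc")   # parameter tuning
    outer = RepeatedKFold(n_splits=5, n_repeats=2, random_state=42)   # outer evaluation
    scores = cross_val_score(inner, X, y, cv=outer, scoring="roc_auc")
    print("nested-CV ROC AUC:", scores.mean())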
