neural-network

TPU training freezes in the middle of training

余生颓废 · Submitted on 2021-02-11 12:32:39
Question: I'm trying to train a CNN regression net in TF 1.12, on a TPU v3-8 1.12 instance. The model successfully compiles with XLA and starts the training process, but somewhere past the halfway point of the first epoch it freezes and does nothing. I cannot find the root of the problem.

def read_tfrecord(example):
    features = {
        'image': tf.FixedLenFeature([], tf.string),
        'labels': tf.FixedLenFeature([], tf.string)
    }
    sample = tf.parse_single_example(example, features)
    image = tf.image.decode_jpeg(sample[
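The question is truncated before the full input pipeline is shown, but one frequent cause of a TPU job stalling mid-epoch (an assumption here, not something the post confirms) is the input dataset running out of full batches: TPUs require fixed shapes, so batching typically uses drop_remainder=True, and if the dataset is not repeated while the step count assumes it is, the input queue empties and training hangs. The arithmetic can be sketched with made-up numbers (the post does not state the dataset or batch size):

```python
# Hypothetical numbers -- the post does not give the dataset size or batch size.
num_train_examples = 10_000
global_batch_size = 1024  # e.g. per-core batch of 128 across 8 TPU v3-8 cores

# With drop_remainder=True (needed for the fixed shapes TPUs require),
# a finite, non-repeated dataset yields only this many full batches:
steps_per_full_pass = num_train_examples // global_batch_size

# If the training loop asks for more steps than this per epoch and the
# dataset is not repeated, the input pipeline is exhausted mid-epoch and
# the job appears to freeze. The usual fix is roughly:
#   dataset = dataset.repeat().batch(global_batch_size, drop_remainder=True)
# with steps_per_epoch set explicitly to steps_per_full_pass.
print(steps_per_full_pass)
```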

Plotting 3D brain image Python

自古美人都是妖i · Submitted on 2021-02-11 12:28:43
Question: I have a brain data set in .nii.gz format for a neural-network project. I used the nilearn package to read the file and get the image data; the shape of the image is (256, 256, 150). I can plot a 2D image by slicing the volume with Matplotlib, but how can I plot the image in 3D using Python? Thanks. Source: https://stackoverflow.com/questions/62256355/plotting-3d-brain-image-python
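No answer is included in this excerpt; one common approach (an illustrative sketch, not from the original thread) is to threshold the volume and scatter the surviving voxel coordinates on a 3D axes. Random stand-in data is used below, since the actual .nii.gz file is not available:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

# Stand-in volume with the same shape as the brain image in the question.
rng = np.random.default_rng(0)
vol = rng.random((256, 256, 150))

# Keep only the brightest voxels so the scatter plot stays legible.
coords = np.argwhere(vol > 0.999)  # (N, 3) array of x, y, z voxel indices

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.scatter(coords[:, 0], coords[:, 1], coords[:, 2], s=1)
ax.set_xlabel("x"); ax.set_ylabel("y"); ax.set_zlabel("z")
fig.savefig("brain3d.png")
```

For real NIfTI data, nilearn's own plotting helpers (e.g. nilearn.plotting.plot_glass_brain) are usually a better fit than a raw voxel scatter, since they handle anatomical orientation for you.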

Can not squeeze dim[1], expected a dimension of 1, got 5

。_饼干妹妹 · Submitted on 2021-02-11 09:41:29
Question: I tried different solutions but am still facing the issue. I am new to ML/DL (Python). In which cases do we face the error "Can not squeeze dim[1], expected a dimension of 1, got 5"? Please help me understand what I am doing wrong here and what is correct. Here is the traceback:

InvalidArgumentError Traceback (most recent call last)
---------------------------------------------------------------------------
<ipython-input-9-0826122252c2> in <module>()
98 model.summary()
99 model.compile(loss='sparse_categorical
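The excerpt cuts off before any answer, but the visible compile line is informative: sparse_categorical_crossentropy expects integer class labels of shape (batch,), while one-hot labels for 5 classes have shape (batch, 5) and cannot be squeezed down to one column — which matches the error text exactly. A minimal sketch of the two usual fixes, using NumPy stand-ins for the labels (the real data is not shown in the post):

```python
import numpy as np

# One-hot labels for 5 classes, shape (batch, 5) -- the shape that triggers
# "Can not squeeze dim[1], expected a dimension of 1, got 5".
one_hot = np.array([[0, 0, 1, 0, 0],
                    [1, 0, 0, 0, 0],
                    [0, 0, 0, 0, 1]])

# Fix 1: convert to integer labels of shape (batch,) and keep
# loss='sparse_categorical_crossentropy'.
int_labels = one_hot.argmax(axis=1)
print(int_labels)  # [2 0 4]

# Fix 2: keep the one-hot labels unchanged and switch the compile() call
# to loss='categorical_crossentropy' instead.
```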

How to implement Batch Norm with SWA in TensorFlow?

元气小坏坏 · Submitted on 2021-02-11 06:17:18
Question: I am using Stochastic Weight Averaging (SWA) with Batch Normalization layers in TensorFlow 2.2. For Batch Norm I use tf.keras.layers.BatchNormalization. For SWA I use my own code to average the weights (I wrote my code before tfa.optimizers.SWA appeared). I have read in multiple sources that when using batch norm with SWA we must run a forward pass to make certain data (the running mean and standard deviation of the activations, and/or momentum values?) available to the batch norm layers. What I do not
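The question is cut off here, but the "certain data" it refers to are BatchNormalization's non-trainable moving statistics (moving_mean, moving_variance): averaging weights from several checkpoints leaves these stale, so the SWA paper's recipe is to re-estimate them with forward passes over training batches. The exponential-moving-average update those layers perform can be sketched in plain NumPy (momentum 0.99 is the Keras default; this is an illustration, not the asker's code):

```python
import numpy as np

def update_moving_stats(batches, momentum=0.99):
    """Re-estimate BN moving statistics by streaming over batches,
    mimicking what a training-mode forward pass does per batch."""
    moving_mean, moving_var = 0.0, 1.0  # Keras initial values
    for batch in batches:
        batch_mean = batch.mean()
        batch_var = batch.var()
        moving_mean = momentum * moving_mean + (1 - momentum) * batch_mean
        moving_var = momentum * moving_var + (1 - momentum) * batch_var
    return moving_mean, moving_var

# Synthetic activations with true mean 5 and variance 4; after enough
# batches the moving statistics drift toward those true values.
rng = np.random.default_rng(0)
data = [rng.normal(loc=5.0, scale=2.0, size=256) for _ in range(500)]
mean, var = update_moving_stats(data)
print(round(mean, 1), round(var, 1))
```

In Keras terms, the equivalent of this loop is simply calling the averaged model on training batches with training=True (gradients disabled) before evaluation.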

Result changes every time I run Neural Network code

烈酒焚心 · Submitted on 2021-02-10 22:47:28
Question: I got the results by running the code provided in this link: Neural Network – Predicting Values of Multiple Variables. I was able to compute losses, accuracy, etc. However, every time I run this code I get a new result. Is it possible to get the same (consistent) result? Answer 1: The code is full of random.randint() calls everywhere! Furthermore, the weights are usually randomly initialized as well, and the batch_size also has an influence (although a pretty minor one) on the result. Y_train, X_test, X_train
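The answer is truncated, but the standard remedy it is leading toward is pinning every random seed before the run. A minimal sketch (the tf.random.set_seed line applies to TF 2.x and is commented out so the snippet runs without TensorFlow installed):

```python
import random
import numpy as np

def seed_everything(seed=42):
    """Pin the random sources a typical Keras script draws from."""
    random.seed(seed)           # Python's random.randint(), shuffle(), ...
    np.random.seed(seed)        # NumPy-based weight inits and data splits
    # tf.random.set_seed(seed)  # TensorFlow ops, when TF is in use

seed_everything(42)
a = (random.randint(0, 100), np.random.rand())
seed_everything(42)
b = (random.randint(0, 100), np.random.rand())
print(a == b)  # True: reseeding reproduces the same draws
```

Note that even with all seeds fixed, some GPU kernels are non-deterministic, so run-to-run results may still differ slightly on GPU hardware.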

Plot ROC curve using Keras

左心房为你撑大大i · Submitted on 2021-02-10 20:51:57
Question: I have a neural network model, and I am using KerasClassifier and then KFold for cross-validation. Now I am having issues plotting the ROC curve. I have tried a few snippets, but most of them give me an error that the multilabel format is not supported. I have the following code, up to the point where my neural network produces the accuracy; I would be thankful if anyone could help me with the later part of the code.

import numpy as np
import pandas as pd
from keras.layers import Dense, Input
from keras.models
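The code is cut off before the ROC part, but that "multilabel" error usually means roc_curve was handed one-hot or multi-class targets directly; roc_curve is binary, so for multi-class output you score one class column at a time (or binarize the labels first). A sketch with synthetic scores standing in for the model's predictions (scikit-learn is assumed available; none of these arrays come from the original post):

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Synthetic binary ground truth and predicted probabilities, standing in
# for y_test and the positive-class column of model.predict(...).
y_true = np.array([0, 0, 1, 1, 0, 1])
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.9])

# roc_curve expects a 1-D binary target; for one-hot multi-class labels,
# pass a single column, e.g. roc_curve(y_onehot[:, k], probs[:, k]).
fpr, tpr, thresholds = roc_curve(y_true, y_score)
roc_auc = auc(fpr, tpr)
print(round(roc_auc, 3))

# Plotting with matplotlib would then be, per fold or per class:
#   plt.plot(fpr, tpr, label=f"AUC = {roc_auc:.2f}")
```

In a KFold loop you would compute (fpr, tpr) per fold inside the loop and overlay the curves on one axes, rather than averaging the raw predictions.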
