Question
I've recently started to use Google Colab, and wanted to train my first Convolutional NN. I imported the images from my Google Drive thanks to the answer I got here.
Then I pasted my code to create the CNN into Colab and started the process. Here is the complete code:
Part 1: Setting up Colab to import pictures from my Drive
(Part 1 is copied from here, as it worked as expected for me.)
Step 1: install google-drive-ocamlfuse and its dependencies
!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
Step 2: authenticate your Google account
from google.colab import auth
auth.authenticate_user()
Step 3: generate credentials and authorize google-drive-ocamlfuse
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
Step 4: mount your Drive
!mkdir -p drive
!google-drive-ocamlfuse drive
Step 5: check that the Drive is mounted
print('Files in Drive:')
!ls drive/
Part 2: Copy pasting my CNN
I created this CNN with tutorials from a Udemy course. It uses Keras with TensorFlow as the backend. For the sake of simplicity I uploaded a really simple version, which is more than enough to demonstrate my problem.
from keras.models import Sequential
from keras.layers import Conv2D
from keras.layers import MaxPooling2D
from keras.layers import Flatten
from keras.layers import Dense
from keras.layers import Dropout
from keras.optimizers import Adam
from keras.preprocessing.image import ImageDataGenerator
parameters
imageSize=32
batchSize=64
epochAmount=50
CNN
classifier=Sequential()
classifier.add(Conv2D(32, (3, 3), input_shape = (imageSize, imageSize, 3), activation = 'relu')) #convolutional layer
classifier.add(MaxPooling2D(pool_size = (2, 2))) #pooling layer
classifier.add(Flatten())
ANN
classifier.add(Dense(units=64, activation='relu')) #hidden layer
classifier.add(Dense(units=1, activation='sigmoid')) #output layer
classifier.compile(optimizer = "adam", loss = 'binary_crossentropy', metrics = ['accuracy']) #training method
image preprocessing
train_datagen = ImageDataGenerator(rescale = 1./255,
shear_range = 0.2,
zoom_range = 0.2,
horizontal_flip = True)
test_datagen = ImageDataGenerator(rescale = 1./255)
training_set = train_datagen.flow_from_directory('drive/School/sem-2-2018/BSP2/UdemyCourse/CNN/dataset/training_set',
target_size = (imageSize, imageSize),
batch_size = batchSize,
class_mode = 'binary')
test_set = test_datagen.flow_from_directory('drive/School/sem-2-2018/BSP2/UdemyCourse/CNN/dataset/test_set',
target_size = (imageSize, imageSize),
batch_size = batchSize,
class_mode = 'binary')
classifier.fit_generator(training_set,
steps_per_epoch = (8000//batchSize),
epochs = epochAmount,
validation_data = test_set,
validation_steps = (2000//batchSize))
Now comes my problem.
First off, the dataset I used is a database of 10,000 dog and cat pictures of various resolutions (8,000 in the training_set, 2,000 in the test_set).
I ran this CNN on Google Colab (with GPU support enabled) and on my PC (tensorflow-gpu on a GTX 1060).
This is an intermediate result from my PC:
Epoch 2/50
63/125 [==============>...............] - ETA: 2s - loss: 0.6382 - acc: 0.6520
And this from Colab:
Epoch 1/50
13/125 [==>...........................] - ETA: 1:00:51 - loss: 0.7265 - acc: 0.4916
Why is Google Colab so slow in my case?
Personally, I suspect a bottleneck from pulling and then reading the images from my Drive, but I don't know how to solve this other than choosing a different method to import the database.
Answer 1:
As @Feng has already noted, reading files from Drive is very slow. This tutorial suggests using a memory-mapped file format such as HDF5 or LMDB to overcome this issue. That way, the I/O operations are much faster (for a complete explanation of the speed gain of the HDF5 format, see this).
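For illustration, here is a minimal sketch of the HDF5 approach with h5py (preinstalled on Colab); the file path and dataset names are assumptions, and images/labels stand in for numpy arrays you have already built from the raw pictures:
# One-time conversion: pack the whole dataset into a single HDF5 file
# on the Colab local disk, so training does one large sequential read
# instead of thousands of small reads through the Drive mount.
# Assumes images/labels are numpy arrays you have already loaded.
import h5py
import numpy as np
with h5py.File('/content/dataset.h5', 'w') as f:
    f.create_dataset('images', data=images)  # e.g. shape (10000, 32, 32, 3), uint8
    f.create_dataset('labels', data=labels)  # e.g. shape (10000,)
# Training-time access: one fast local read of the whole array.
with h5py.File('/content/dataset.h5', 'r') as f:
    x = f['images'][:] / 255.0
    y = f['labels'][:]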
Answer 2:
Reading files from Google Drive is very slow.
For example, I have one big file (39 GB).
Copying it with '!cp drive/big.file /content/' took more than 10 minutes.
After I shared the file and got its URL from Google Drive, downloading it with '!wget -c -O big.file http://share.url.from.drive' took 5 minutes, at speeds of up to 130 MB/s.
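As a variation on the same idea, here is a minimal sketch using the third-party gdown package instead of wget; FILE_ID is a placeholder for the ID in your own share link:
# Download a publicly shared Drive file straight to the Colab local
# disk, bypassing the slow FUSE mount. FILE_ID is a placeholder.
!pip install -q gdown
import gdown
url = 'https://drive.google.com/uc?id=FILE_ID'
gdown.download(url, '/content/big.file', quiet=False)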
Answer 3:
I have the same question as to why the GPU on Colab seems to take at least as long as my local PC, so I can't really be of help there. That being said, if you want your data available locally on the Colab machine, I have found the following process to be significantly faster than just using the upload function provided in Colab.
1.) Mount your Google Drive
# Run this cell to mount your Google Drive.
from google.colab import drive
drive.mount('/content/drive')
2.) Create a folder outside of the Google Drive folder where you want your data to be stored.
3.) Use the following command to link the contents of your desired Google Drive folder to the folder you created (note that this creates a symbolic link rather than a physical copy):
!ln -s "/content/drive/My Drive/path_to_folder_desired" "/path/to/the_folder/you created"
(This is adapted from another Stack Overflow answer that I used to solve a similar issue.)
4.) Your data is now available to you at the path "/path/to/the_folder/you created".
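If you prefer a physical copy on the Colab VM over a symbolic link, so that every read during training is truly local, a minimal sketch along the same lines (the paths reuse the placeholders above):
# One-time copy of the dataset from the mounted Drive to local disk.
# Slow once, but all subsequent reads during training are local.
import shutil
shutil.copytree('/content/drive/My Drive/path_to_folder_desired',
                '/content/the_folder_local')  # destination must not exist yet
# Then point flow_from_directory at '/content/the_folder_local' instead.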
Answer 4:
You can load your data as a numpy array (.npy format) and use the flow method instead of flow_from_directory. Colab provides 25 GB of RAM, so even for big datasets you can load all of your data into memory. I found the speedup to be around 2.5x, with the same data generation steps (even faster than data stored on Colab's local disk, i.e. '/content', or on Google Drive).
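A minimal sketch of this approach, assuming the images have already been saved as .npy arrays (the file names are placeholders) and reusing the question's generator settings:
# Load the entire dataset into RAM once, then augment from memory
# with flow() instead of reading files with flow_from_directory().
import numpy as np
from keras.preprocessing.image import ImageDataGenerator
x_train = np.load('/content/x_train.npy')  # e.g. shape (8000, 32, 32, 3)
y_train = np.load('/content/y_train.npy')  # e.g. shape (8000,)
train_datagen = ImageDataGenerator(rescale=1./255, shear_range=0.2,
                                   zoom_range=0.2, horizontal_flip=True)
training_set = train_datagen.flow(x_train, y_train, batch_size=64)  # 64 = the question's batchSize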
Since Colab provides only a single-core CPU (2 threads per core), there seems to be a bottleneck in CPU-GPU data transfer (with, say, a K80 or T4 GPU), especially if you use a data generator for heavy preprocessing or data augmentation. You can also try setting different values for parameters like 'workers', 'use_multiprocessing', and 'max_queue_size' in the fit_generator method.
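For example, a sketch of the question's fit_generator call with those parameters made explicit; the values are starting points to experiment with, not recommendations:
# Same training call as in the question, plus the throughput-related
# parameters. All three exist in Keras's fit_generator signature.
classifier.fit_generator(training_set,
                         steps_per_epoch=8000 // batchSize,
                         epochs=epochAmount,
                         validation_data=test_set,
                         validation_steps=2000 // batchSize,
                         workers=2,                 # threads filling the batch queue
                         use_multiprocessing=False, # process-based workers can help or hurt; measure
                         max_queue_size=20)         # batches prefetched ahead of the GPU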
Source: https://stackoverflow.com/questions/49360888/google-colab-is-very-slow-compared-to-my-pc