Question
I am using Keras for a CNN, but the problem is that I run out of memory. The error is:
anushreej@cpusrv-gpu-109:~/12EC35005/MTP_Workspace/MTP$ python cnn_implement.py
Using Theano backend.
[INFO] compiling model...
Traceback (most recent call last):
File "cnn_implement.py", line 23, in <module>
model = CNNModel.build(width=150, height=150, depth=3)
File "/home/ms/anushreej/12EC35005/MTP_Workspace/MTP/cnn/networks/model_define.py", line 27, in build
model.add(Dense(depth*height*width))
File "/home/ms/anushreej/anaconda3/lib/python3.5/site-packages/keras/models.py", line 146, in add
output_tensor = layer(self.outputs[0])
File "/home/ms/anushreej/anaconda3/lib/python3.5/site-packages/keras/engine/topology.py", line 458, in __call__
self.build(input_shapes[0])
File "/home/ms/anushreej/anaconda3/lib/python3.5/site-packages/keras/layers/core.py", line 604, in build
name='{}_W'.format(self.name))
File "/home/ms/anushreej/anaconda3/lib/python3.5/site-packages/keras/initializations.py", line 61, in glorot_uniform
return uniform(shape, s, name=name)
File "/home/ms/anushreej/anaconda3/lib/python3.5/site-packages/keras/initializations.py", line 32, in uniform
return K.variable(np.random.uniform(low=-scale, high=scale, size=shape),
File "mtrand.pyx", line 1255, in mtrand.RandomState.uniform (numpy/random/mtrand/mtrand.c:13575)
File "mtrand.pyx", line 220, in mtrand.cont2_array_sc (numpy/random/mtrand/mtrand.c:2902)
MemoryError
Now I am unable to understand why this is happening. My training images are very small, of size 150*150*3.
The code is:
# import the necessary packages
from keras.models import Sequential
from keras.layers.convolutional import Convolution2D
from keras.layers.core import Activation
from keras.layers.core import Flatten
from keras.layers.core import Dense

class CNNModel:
    @staticmethod
    def build(width, height, depth):
        # initialize the model
        model = Sequential()

        # first set of CONV => RELU
        model.add(Convolution2D(50, 5, 5, border_mode="same", batch_input_shape=(None, depth, height, width)))
        model.add(Activation("relu"))

        # second set of CONV => RELU
        # model.add(Convolution2D(50, 5, 5, border_mode="same"))
        # model.add(Activation("relu"))

        # third set of CONV => RELU
        # model.add(Convolution2D(50, 5, 5, border_mode="same"))
        # model.add(Activation("relu"))

        model.add(Flatten())
        model.add(Dense(depth*height*width))

        # if weightsPath is not None:
        #     model.load_weights(weightsPath)

        return model
Answer 1:
I faced the same problem. I think the issue is that the number of data points just before the Flatten layer is more than your system can handle (I tried on different systems: one with more RAM worked, and one with less RAM gave this error). Just add more CNN layers (with pooling) to reduce the size before the Flatten layer and it works.
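To put rough numbers on this for the model in the question (my own back-of-the-envelope estimate, not part of the original answer): the single "same"-padded conv layer keeps the spatial size at 150x150, so the Flatten output has 50 * 150 * 150 = 1,125,000 units, and Dense(depth*height*width) = Dense(67,500) then needs a weight matrix of roughly 76 billion floats, which is the array that glorot_uniform fails to allocate:
# Back-of-the-envelope check (my estimate, Theano "channels first" ordering)
flatten_units = 50 * 150 * 150           # conv output flattened: 1,125,000 units
dense_units = 3 * 150 * 150              # Dense(depth*height*width): 67,500 units
weights = flatten_units * dense_units    # 75,937,500,000 parameters
print(weights * 4 / 2 ** 30, "GiB")      # ~283 GiB as float32 -> MemoryError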
This gave me an error:
model = Sequential()
model.add(Convolution2D(32, 3, 3,border_mode='same',input_shape=(1, 96, 96),activation='relu'))
model.add(Convolution2D(64, 3, 3,border_mode='same',activation='relu'))
model.add(MaxPooling2D((2,2), strides=(2,2)))
model.add(Flatten())
model.add(Dense(1000,activation='relu'))
model.add(Dense(97,activation='softmax'))
This didn't give an error:
model = Sequential()
model.add(Convolution2D(32, 3, 3,border_mode='same',input_shape=(1, 96, 96),activation='relu'))
model.add(Convolution2D(64, 3, 3,border_mode='same',activation='relu'))
model.add(MaxPooling2D((2,2), strides=(2,2)))
model.add(Convolution2D(64, 3, 3,border_mode='same',activation='relu'))
model.add(Convolution2D(128, 3, 3,border_mode='same',activation='relu'))
model.add(MaxPooling2D((2,2), strides=(2,2)))
model.add(Flatten())
model.add(Dense(1000,activation='relu'))
model.add(Dense(97,activation='softmax'))
Hope it helps.
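For completeness, here is a minimal, untested sketch of how the same idea could be applied to the build() method from the question. The MaxPooling2D layers and the extra conv blocks are my additions, not part of the original code, and whether the final Dense layer fits still depends on how much RAM you have:
# Sketch: the question's build() with pooling added to shrink the Flatten output
from keras.models import Sequential
from keras.layers.convolutional import Convolution2D
from keras.layers.core import Activation, Flatten, Dense
from keras.layers import MaxPooling2D

def build(width, height, depth):
    model = Sequential()

    # CONV => RELU => POOL, repeated so the spatial size drops 150 -> 75 -> 37 -> 18
    model.add(Convolution2D(50, 5, 5, border_mode="same", batch_input_shape=(None, depth, height, width)))
    model.add(Activation("relu"))
    model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

    model.add(Convolution2D(50, 5, 5, border_mode="same"))
    model.add(Activation("relu"))
    model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

    model.add(Convolution2D(50, 5, 5, border_mode="same"))
    model.add(Activation("relu"))
    model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

    # Flatten is now 50 * 18 * 18 = 16,200 units instead of 1,125,000, so the
    # Dense weight matrix is about 70x smaller (though still large if the
    # output stays at depth * height * width = 67,500 units)
    model.add(Flatten())
    model.add(Dense(depth * height * width))
    return model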
Source: https://stackoverflow.com/questions/38889352/memory-error-while-using-keras