I am using a DBN (deep belief network) from nolearn based on scikit-learn.
I have already built a network which can classify my data very well; now I am interested in exporting the model so that I can load it later and make predictions without retraining.
First, install joblib.
You can use:
>>> import joblib
>>> joblib.dump(clf, 'my_model.pkl', compress=9)
And then later, on the prediction server:
>>> import joblib
>>> model_clone = joblib.load('my_model.pkl')
This is basically a Python pickle with optimized handling for large NumPy arrays. It has the same limitations as regular pickle with respect to code changes: if the class structure of the pickled object changes, you might no longer be able to unpickle it with newer versions of nolearn or scikit-learn.
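One way to soften that caveat (a sketch, not a joblib feature; the metadata filename is arbitrary) is to record the library version next to the pickle so you can recreate a matching environment before loading. You could record nolearn's version the same way if you depend on it:
>>> import json, sklearn
>>> with open('my_model_meta.json', 'w') as f:
...     json.dump({'sklearn_version': sklearn.__version__}, f)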
If you want a long-term, robust way of storing your model parameters, you might need to write your own I/O layer (e.g. using binary serialization tools such as Protocol Buffers or Avro, or an inefficient yet portable text representation such as JSON, XML, or PMML).
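For illustration only, here is a rough sketch of such a hand-rolled layer using JSON for a plain scikit-learn linear model. A nolearn DBN has far more parameters, so this only shows the idea; the function names below are made up:
>>> import json
>>> import numpy as np
>>> from sklearn.linear_model import LogisticRegression
>>> def export_linear_model(clf, path):
...     # Store only the learned parameters as plain lists.
...     params = {'coef': clf.coef_.tolist(),
...               'intercept': clf.intercept_.tolist(),
...               'classes': clf.classes_.tolist()}
...     with open(path, 'w') as f:
...         json.dump(params, f)
...
>>> def import_linear_model(path):
...     # Rebuild an estimator and restore its learned attributes.
...     with open(path) as f:
...         params = json.load(f)
...     clf = LogisticRegression()
...     clf.coef_ = np.array(params['coef'])
...     clf.intercept_ = np.array(params['intercept'])
...     clf.classes_ = np.array(params['classes'])
...     return clf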