Let's say I am doing a larger data analysis in a Jupyter/IPython notebook, with lots of time-consuming computations already done. Then, for some reason, I have to shut down the Jupyter server.
(I'd rather comment than offer this as an actual answer, but I need more reputation to comment.)
You can store most data-like variables in a systematic way. What I usually do is store all dataframes, arrays, etc. in a pandas.HDFStore. At the beginning of the notebook, declare
import pandas as pd
backup = pd.HDFStore('backup.h5')
and then store any new variables as you produce them
backup['var1'] = var1
At the end, it's probably a good idea to do
backup.close()
before turning off the server. The next time you want to continue with the notebook:
backup = pd.HDFStore('backup.h5')
var1 = backup['var1']
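Putting the snippets above together, a self-contained round trip might look like this (a minimal sketch; it assumes PyTables is installed, which pandas.HDFStore requires, and uses a made-up example dataframe):

```python
import numpy as np
import pandas as pd

# Open (or create) the backup file; requires the PyTables package.
backup = pd.HDFStore('backup.h5')

# Store results as they are produced (example dataframe for illustration).
var1 = pd.DataFrame(np.arange(6).reshape(3, 2), columns=['a', 'b'])
backup['var1'] = var1

# Flush everything to disk before shutting down the server.
backup.close()

# Next session: reopen the store and read the variables back.
backup = pd.HDFStore('backup.h5')
restored = backup['var1']
backup.close()
```

The store behaves like a dict of dataframes/series backed by an HDF5 file, so each `backup['name'] = value` is persisted independently.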
Truth be told, I'd prefer built-in functionality in the IPython notebook, too. You can't save everything this way (e.g. objects, connections), and it's hard to keep the notebook organized with so much boilerplate code.
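For plain Python objects that HDFStore can't hold, a pickle file is one possible workaround — a minimal sketch, assuming the objects are picklable (connections and open file handles still won't survive):

```python
import pickle

# Hypothetical example state; HDFStore only handles pandas/numpy-like data,
# but pickle can serialize most ordinary Python objects.
state = {
    'params': {'alpha': 0.1, 'n_iter': 500},
    'labels': ['train', 'test'],
}

# Save before shutting down the server.
with open('backup.pkl', 'wb') as f:
    pickle.dump(state, f)

# Restore in the next session.
with open('backup.pkl', 'rb') as f:
    restored = pickle.load(f)
```

This still doesn't save the whole notebook namespace automatically, so the boilerplate problem remains.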