Python: how do you store a sparse matrix using python?

Submitted by 不羁岁月 on 2019-12-03 03:13:34

Note: This answer is in response to the revised question that now provides code.

You should not call cPickle.dump() in your function. Create the sparse matrix and then dump its contents to the file.

Try:

import cPickle
from nltk.corpus import brown
from scipy.sparse import lil_matrix

def markov(L):
    c = len(text1)
    for i in range(c - 1):  # visit each adjacent word pair
        h = L.index(text1[i])
        k = L.index(text1[i + 1])
        mat[h, k] = mat[h, k] + 1  # increment the transition count

text = [w for g in brown.categories() for w in brown.words(categories=g)]
text1 = text[1:500]
arr = list(set(text1))  # the distinct words, used as matrix indices
mat = lil_matrix((len(arr), len(arr)))
markov(arr)

f = open('spmatrix.pkl', 'wb')
cPickle.dump(mat, f, -1)
f.close()

Assuming you have a NumPy matrix or ndarray, as your question and tags imply, you can use the dump method and the load function:

your_matrix.dump('output.mat')
another_matrix = numpy.load('output.mat')
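A caveat: dump/load pickle the array, and newer NumPy releases require allow_pickle=True to read such files back. They also do not cover SciPy sparse matrices; for those, SciPy 0.19 and later (an addition that postdates this answer) provide save_npz/load_npz. A minimal sketch:

```python
import numpy as np
from scipy import sparse

# build a small random sparse matrix in CSR format
m = sparse.random(50, 50, density=0.1, format='csr')

# save_npz/load_npz round-trip the matrix through a compressed .npz file
sparse.save_npz('matrix.npz', m)
m2 = sparse.load_npz('matrix.npz')

assert np.array_equal(m.toarray(), m2.toarray())
```

load_npz restores the matrix in its original sparse format, so no conversion step is needed afterwards.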

pyTables is the Python interface to the HDF5 data model; it is a popular choice that is well integrated with NumPy and SciPy. pyTables lets you access slices of on-disk arrays without loading the entire array back into memory.

I don't have any specific experience with sparse matrices per se, and a quick Google search neither confirmed nor denied that sparse matrices are supported.
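For what it's worth, one workable pattern (a sketch, assuming pyTables is installed; sparse matrices are not a native HDF5 type) is to store the COO component arrays of the matrix as separate HDF5 arrays and rebuild the matrix on load:

```python
import numpy as np
import tables
from scipy import sparse

m = sparse.random(100, 100, density=0.05, format='coo')

# store the three COO component arrays, plus the shape as an attribute
with tables.open_file('sparse.h5', mode='w') as h5:
    h5.create_array('/', 'row', m.row)
    h5.create_array('/', 'col', m.col)
    h5.create_array('/', 'data', m.data)
    h5.root._v_attrs.shape = m.shape

# read the components back and reassemble the matrix
with tables.open_file('sparse.h5', mode='r') as h5:
    m2 = sparse.coo_matrix(
        (h5.root.data[:], (h5.root.row[:], h5.root.col[:])),
        shape=h5.root._v_attrs.shape)

assert np.array_equal(m.toarray(), m2.toarray())
```

Because the components live in separate HDF5 arrays, you can also read just a slice of the data without pulling the whole matrix into memory, which is the main draw of pyTables here.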

Adding to the HDF5 support, Python also has NetCDF support, which is well suited to matrix-style data storage and quick access for both sparse and dense data. It is included in Python(x,y) for Windows, which many scientific Python users end up with.

More NumPy-based examples can be found in this cookbook.

For very big sparse matrices on clusters, you might use PyTrilinos: it has an HDF5 interface that can dump a sparse matrix to disk, and it also works if the matrix is distributed across different nodes.

http://trilinos.sandia.gov/packages/pytrilinos/development/EpetraExt.html#input-output-classes

Depending on the size of the sparse matrix, I tend to just use cPickle to pickle the array:

import cPickle
f = open('spmatrix.pkl','wb')
cPickle.dump(your_matrix,f,-1)
f.close()
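On Python 3 the cPickle module no longer exists under that name, but the same approach works with the built-in pickle module (a sketch; protocol=-1 again selects the highest available protocol):

```python
import pickle
from scipy.sparse import lil_matrix

mat = lil_matrix((4, 4))
mat[0, 1] = 5.0
mat[2, 3] = -2.0

# protocol=-1 is equivalent to the -1 argument in the cPickle call above
with open('spmatrix.pkl', 'wb') as f:
    pickle.dump(mat, f, protocol=-1)

with open('spmatrix.pkl', 'rb') as f:
    loaded = pickle.load(f)

assert (mat.toarray() == loaded.toarray()).all()
```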

If I'm dealing with really large datasets, then I tend to use netcdf4-python.

Edit:

To then access the file again you would:

f = open('spmatrix.pkl','rb') # open the file in read binary mode
# load the data in the .pkl file into a new variable spmat
spmat = cPickle.load(f) 
f.close()

For me, passing -1 as the protocol argument to cPickle.dump caused the pickled file to not be loadable afterwards.

The object I dumped through cPickle was an instance of scipy.sparse.dok_matrix.

Using only two arguments did the trick for me; the documentation for pickle.dump() states that the default value of the protocol parameter is 0.

Working on Windows 7 with Python 2.7.2 (64-bit) and cPickle v1.71.

Example:

>>> import cPickle
>>> print cPickle.__version__
1.71
>>> from scipy import sparse
>>> H = sparse.dok_matrix((135, 654), dtype='int32')
>>> H[33, 44] = 8
>>> H[123, 321] = -99
>>> print str(H)
  (123, 321)    -99
  (33, 44)  8
>>> fname = 'dok_matrix.pkl'
>>> f = open(fname, mode="wb")
>>> cPickle.dump(H, f)
>>> f.close()
>>> f = open(fname, mode="rb")
>>> M = cPickle.load(f)
>>> f.close()
>>> print str(M)
  (123, 321)    -99
  (33, 44)  8
>>> M == H
True
>>> 