How can scikit-learn perform PCA on sparse data in libsvm format?

Submitted by 空扰寡人 on 2020-01-28 06:47:45

Question


I am using scikit-learn for a dimensionality reduction task. My training/test data is in libsvm format: a large sparse matrix with half a million columns.

I load the data with the load_svmlight_file function, but when I apply SparsePCA, scikit-learn raises an exception about the input data.
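For context, a minimal sketch of what the question seems to describe (the file name and number of components are placeholders, not from the original post):

    # Reproduction sketch: SparsePCA expects a dense array, so fitting it
    # on the sparse matrix returned by load_svmlight_file raises an error.
    from sklearn.datasets import load_svmlight_file
    from sklearn.decomposition import SparsePCA

    X, y = load_svmlight_file("train.libsvm")  # X is a scipy.sparse CSR matrix
    SparsePCA(n_components=10).fit(X)          # fails: sparse input not accepted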

How can I fix this?


Answer 1:


Sparse PCA is an algorithm for finding a sparse decomposition (the components have a sparsity constraint) of dense data.

If you want to do vanilla PCA on sparse data, you should use sklearn.decomposition.RandomizedPCA, which implements a scalable approximate method that works on both sparse and dense data.

IIRC sklearn.decomposition.PCA only works on dense data at the moment. Support for sparse data could be added in the future by delegating the SVD computation on the sparse data matrix to arpack, for instance.

Edit: as noted in the comments, sparse input for RandomizedPCA is deprecated; instead you should use sklearn.decomposition.TruncatedSVD, which does precisely what RandomizedPCA used to do on sparse data but should not have been called PCA in the first place.

To clarify: PCA is mathematically defined as centering the data (subtracting the mean value of each feature) and then applying truncated SVD to the centered data.
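To illustrate that definition, here is a small sketch on dense data (array sizes and the number of components are chosen arbitrarily); up to the sign of each component, centering followed by TruncatedSVD matches PCA:

    # PCA == centering + truncated SVD, demonstrated on a small dense array.
    import numpy as np
    from sklearn.decomposition import PCA, TruncatedSVD

    rng = np.random.RandomState(0)
    X = rng.rand(100, 20)

    X_pca = PCA(n_components=5).fit_transform(X)
    X_svd = TruncatedSVD(n_components=5, algorithm="arpack").fit_transform(
        X - X.mean(axis=0)  # explicit centering
    )

    # Components may differ by sign, hence the abs() before comparing.
    print(np.allclose(np.abs(X_pca), np.abs(X_svd)))  # True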

As centering the data would destroy the sparsity and force a dense representation that often does not fit in memory any more, it is common to directly do truncated SVD on sparse data (without centering). This resembles PCA but it's not exactly the same. This is implemented in scikit-learn as sklearn.decomposition.TruncatedSVD.
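A minimal sketch of this approach on libsvm-formatted data (the file name and number of components are placeholders):

    # Truncated SVD ("PCA without centering") directly on the sparse matrix.
    from sklearn.datasets import load_svmlight_file
    from sklearn.decomposition import TruncatedSVD

    X, y = load_svmlight_file("train.libsvm")  # X is a scipy.sparse CSR matrix

    svd = TruncatedSVD(n_components=100)
    X_reduced = svd.fit_transform(X)  # X is never densified, so memory stays bounded
    print(X_reduced.shape, svd.explained_variance_ratio_.sum())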

Edit (March 2019): There is ongoing work to implement PCA on sparse data with implicit centering: https://github.com/scikit-learn/scikit-learn/pull/12841



Source: https://stackoverflow.com/questions/11809686/how-can-scikit-learning-perform-pca-on-sparse-data-in-libsvm-format
