Python scikit learn pca.explained_variance_ratio_ cutoff

甜味超标 · 2020-12-23 21:05

When choosing the number of principal components (k), we choose k to be the smallest value such that, for example, 99% of the variance is retained.

However, in the Python scikit-learn implementation, how do I use pca.explained_variance_ratio_ to apply this cutoff?
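
A minimal sketch of that selection rule, assuming a placeholder feature matrix X (not part of the original question): fit PCA with all components, take the cumulative sum of explained_variance_ratio_, and keep the smallest k that reaches the threshold.

    import numpy as np
    from sklearn.decomposition import PCA

    X = np.random.rand(100, 10)  # placeholder data; replace with the real feature matrix

    pca = PCA().fit(X)  # keep all components so the ratios cover 100% of the variance
    cumulative = np.cumsum(pca.explained_variance_ratio_)

    # smallest k whose cumulative explained variance reaches 99%
    k = int(np.argmax(cumulative >= 0.99)) + 1
    print(k, cumulative[k - 1])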

3 Answers
  •  失恋的感觉 · 2020-12-23 21:36

    This worked for me with even less typing in the PCA section. The rest is added for convenience; only 'data' needs to be defined beforehand.

    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    # Standardize the features, then let PCA choose the number of components
    # needed to retain 80% of the variance (a float n_components is a
    # variance-ratio threshold, not a component count).
    scaler = StandardScaler()
    st = scaler.fit_transform(data)
    pca = PCA(0.80)
    pc = pca.fit_transform(st)  # << the retained components

    # pca.explained_variance_ratio_  # per-component ratios, if needed
    print("Components =", pca.n_components_,
          ";\nTotal explained variance =",
          round(pca.explained_variance_ratio_.sum(), 5))
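
    A quick usage note (new_data is a placeholder, not part of the answer above): because scaler and pca are fitted objects, additional rows with the same columns as data can be projected into the same reduced space with transform rather than fit_transform.

    new_pc = pca.transform(scaler.transform(new_data))  # placeholder new_data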
    
