svd

PCA of RGB Image

情到浓时终转凉″ submitted on 2020-01-02 04:34:08
Question: I'm trying to figure out how to use PCA to decorrelate an RGB image in Python. I'm using the code found in the O'Reilly Computer Vision book:

    from PIL import Image
    from numpy import *

    def pca(X):
        # Principal Component Analysis
        # input: X, matrix with training data as flattened arrays in rows
        # return: projection matrix (with important dimensions first),
        # variance and mean

        # get dimensions
        num_data, dim = X.shape

        # center data
        mean_X = X.mean(axis=0)
        for i in range(num_data):
            X[i] -= mean_X

        if
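
For context, here is a minimal sketch of one way such decorrelation can be done with numpy's SVD, assuming the goal is to decorrelate the three color channels of a single RGB image (the file name and variable names are illustrative, not from the book):

    import numpy as np
    from PIL import Image

    # View the image as a (num_pixels, 3) matrix: one row per pixel, one column per channel.
    img = np.asarray(Image.open("photo.jpg"), dtype=float)  # hypothetical file name
    h, w, _ = img.shape
    X = img.reshape(-1, 3)

    # Center each channel, then take the SVD of the centered data.
    mean = X.mean(axis=0)
    Xc = X - mean
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

    # Projecting onto the right singular vectors (the principal axes)
    # gives decorrelated channels.
    decorrelated = Xc @ Vt.T
    decorrelated_img = decorrelated.reshape(h, w, 3)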

Can't run correspondence analysis on two-way contingency table using FactoMineR

筅森魡賤 submitted on 2019-12-31 03:15:11
Question: It does not appear to work on this table, named mytable:

                   0       1    2   3   4   5    7
    Click_No  242854   91661  102  21  65  51  291
    Click_Yes  48274   20785   14   2  19   4  146

However, it works on this table:

                0       1    2   3   4   5    7
    Row1        4       0    0   0   0   0   11
    Row2       35       2    0   0   0   0    0
    Row3    18364      14    0   0   0   0    0
    Row4       13       0    0   0   0   0    7
    Row5     1497    1521    6   0   0   0    0
    Row6      686       2    0   0   0   0  393
    Row7   270167  110512  110  23  84  54    0
    Row8        1       0    0   0   0   0   26
    Row9      361     395    0   0   0   1    0

I used the FactoMineR function:

    res.ca <- CA(mytable)

Does CA not work on specific types of

Comparing svd and princomp in R

给你一囗甜甜゛ submitted on 2019-12-31 02:03:46
Question: I want to get the singular values of a matrix in R to obtain the principal components, and then run princomp(x) as well to compare the results. I know princomp() would give the principal components. Question: how do I get the principal components from $d, $u, and $v (the result of s = svd(x))?

Answer 1: One way or another, you should probably look into prcomp, which calculates PCA using svd instead of eigen (as in princomp). That way, if all you want is the PCA output, but calculated using svd, you're golden. Also, if
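
To illustrate the relationship the question asks about, here is a small numpy sketch (not R, but the same algebra): for a column-centered data matrix X with SVD X = U D V^T, the principal axes are the columns of V and the component scores are U D (equivalently X V); the $u, $d and $v returned by R's svd() on a centered matrix play the same roles.

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))
    Xc = X - X.mean(axis=0)                # center the columns first

    U, d, Vt = np.linalg.svd(Xc, full_matrices=False)

    loadings = Vt.T                        # columns of V: the principal axes
    scores = U * d                         # equals Xc @ Vt.T: the principal components
    variances = d**2 / (Xc.shape[0] - 1)   # variance explained by each component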

A Detailed Explanation of SVD (Singular Value Decomposition)

时光总嘲笑我的痴心妄想 submitted on 2019-12-25 15:58:11
Copyright notice: This article was published by LeftNotEasy at http://leftnoteasy.cnblogs.com. It may be reposted in full or used in part, but please cite the source; if you have any questions, please contact wheeleast@gmail.com

Preface: Last time I wrote about PCA and LDA. PCA is generally implemented in one of two ways: with eigenvalue decomposition or with singular value decomposition, and the previous article gave an explanation based on eigenvalue decomposition. In most people's minds, eigenvalues and singular values remain purely a matter of mathematical computation, and linear algebra and matrix theory courses rarely discuss any application background for them. Singular value decomposition, however, is a method with a very clear physical meaning: it can represent a fairly complicated matrix as the product of a few smaller, simpler matrices, and those small matrices describe the matrix's important characteristics. It is like describing a person: tell someone that this person has thick eyebrows and big eyes, a square face, a full beard, and black-framed glasses, and those few features alone give the listener a reasonably clear mental picture. In reality a face has countless features; we can describe it this way only because humans are naturally very good at extracting the important ones. SVD is an important method for teaching machines to extract important features. In machine learning, quite a few applications are connected to singular values in one way or another, for example PCA for feature reduction, data compression algorithms (image compression being the representative example), and LSI (Latent Semantic
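
To make the idea of "a product of a few smaller, simpler matrices" concrete, here is a minimal numpy sketch (not part of the article) of the rank-k approximation that a truncated SVD provides:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.normal(size=(100, 50))

    # Full (thin) SVD: A = U @ diag(s) @ Vt
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    # Keep only the k largest singular values: three much smaller matrices
    # whose product approximates A.
    k = 10
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # The spectral-norm error equals the first discarded singular value.
    print(np.linalg.norm(A - A_k, 2), s[k])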

different results for PCA, truncated_svd and svds on numpy and sklearn

谁说我不能喝 submitted on 2019-12-24 12:18:22
Question: In sklearn and numpy there are different ways to compute the first principal component. I obtain different results for each method. Why?

    import matplotlib.pyplot as pl
    from sklearn import decomposition
    import scipy as sp
    import sklearn.preprocessing
    import numpy as np
    import sklearn as sk

    def gen_data_3_1():
        #### generate the data 3.1
        m = 1000  # number of samples
        n = 10    # number of variables
        d1 = np.random.normal(loc=0, scale=100, size=(m, 1))
        d2 = np.random.normal(loc=0, scale=121, size=(m, 1))
        d3 = -0.2
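
As a point of comparison (not the poster's code), here is a minimal sketch of computing the first principal component with sklearn's PCA and with numpy's SVD; two common sources of apparent disagreement are forgetting to center the data before the SVD and the arbitrary sign of singular vectors:

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10)) * np.arange(1, 11)   # columns with different scales

    # sklearn centers the data internally.
    pc1_sklearn = PCA(n_components=1).fit(X).components_[0]

    # With numpy, center explicitly before taking the SVD.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    pc1_svd = Vt[0]

    # The two agree up to an overall sign flip.
    print(np.allclose(pc1_sklearn, pc1_svd) or np.allclose(pc1_sklearn, -pc1_svd))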

Calculate SVD in R error: missing or infinite values

*爱你&永不变心* submitted on 2019-12-24 06:48:14
Question: I have a similar problem with svd(XTR). The data looks like this:

    1       0.3045  0.1448 -0.0714 -0.038  -0.0838 -0.1433 -0.1071 -0.1988 -0.1076 -0.0313 -0.157  -0.1032 -0.137  -0.0802  0.1244  0.0701  0.0457 -0.0634  0.0401  0.1643  0.3056  0.3956  0.4533  0.1557
    0.3045  0.9999  0.3197  0.1328  0.093  -0.0846 -0.132   0.0046 -0.004  -0.0197 -0.1469 -0.1143 -0.2016 -0.1    -0.0316  0.0044 -0.0589 -0.0589  0.0277  0.0314  0.078   0.0104  0.0692  0.1858  0.0217
    0.1448  0.3197  1       0.3487  0.2811  0.0786 -0.1421 -0.1326 -0.2056 -0.1109  0.0385 -0

Svd recomposition with Mathnet numerics library seems wrong

折月煮酒 submitted on 2019-12-24 03:08:28
Question: I'm doing non-regression testing between Mathnet.Iridium and Mathnet.Numerics. Here is my code, using Mathnet.Numerics:

    double[][] symJaggedArray = new double[5][];
    symJaggedArray[0] = new double[] { 3, 0,  0,  0,  0 };
    symJaggedArray[1] = new double[] { 0, 2,  4,  0,  0 };
    symJaggedArray[2] = new double[] { 0, 4,  5, -4,  5 };
    symJaggedArray[3] = new double[] { 0, 0, -4, -8, 12 };
    symJaggedArray[4] = new double[] { 0, 0,  5, 12, -5 };
    symDenseMatrix = DenseMatrix.OfArray(new Matrix(symJaggedArray)

SVD algorithm implementation

我们两清 submitted on 2019-12-23 15:59:33
Question: Does anyone know a good, scalable implementation of SVD in C# for very big matrices?

Answer 1: ILNumerics.net seems to have SVD among other things. Feature list:

Frameworks: .NET 1.1, .NET 2.0, available soon: mono 1.2.3
Languages: all CLI conformant: C# (recommended), managed C++, Visual Basic ...
Array objects
* Full OO class design
* Generic typed container classes
* single object for arbitrary array dimensions: scalar, vector, matrices, n-dim arrays
* full support for flexible array modification:

Singular Value Decomposition (SVD): Principles and Applications

若如初见. submitted on 2019-12-23 13:45:11
I. Fundamentals of singular values and eigenvalues

Eigenvalue decomposition and singular value decomposition are both methods you see everywhere in machine learning. The two are closely related, as I will discuss below, and their purpose is the same: to extract the most important features of a matrix. Let's start with eigenvalue decomposition.

1) Eigenvalues:

If a vector v is an eigenvector of a square matrix A, it can always be written in the following form:

    A v = λ v

Here λ is called the eigenvalue corresponding to the eigenvector v, and (for a symmetric matrix) the eigenvectors form a set of orthogonal vectors. Eigenvalue decomposition factors a matrix into the following form:

    A = Q Σ Q^(-1)

where Q is the matrix whose columns are the eigenvectors of A, and Σ is a diagonal matrix whose diagonal entries are the eigenvalues. I will borrow some material from the references to explain this. First of all, it should be clear that a matrix is really just a linear transformation, because multiplying a matrix by a vector yields another vector, which amounts to applying a linear transformation to the original vector. Take, for example, a symmetric (diagonal) 2x2 matrix M: the linear transformation it corresponds to is a stretch along the x and y axes, because multiplying M by a vector (x, y) scales each coordinate by the corresponding diagonal entry (each diagonal entry stretches one dimension: a value > 1 lengthens it, a value < 1 shortens it). When the matrix is not symmetric, the transformation it describes is no longer an axis-aligned stretch; it is in fact a stretch along some other axis in the plane (indicated by the blue arrow in the original figure). In that figure, the blue arrow is the most important direction of change (there may be more than one such direction), and if we want to describe a transformation well
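
As a small check on the eigendecomposition described above, here is a numpy sketch (not from the article): for a symmetric matrix the eigenvectors are orthogonal, so Q^(-1) = Q^T, and both A v = λ v and A = Q Σ Q^(-1) can be verified directly.

    import numpy as np

    # A small symmetric matrix, so its eigenvectors are orthogonal.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])

    eigvals, Q = np.linalg.eigh(A)   # columns of Q are eigenvectors
    Sigma = np.diag(eigvals)

    # Each eigenpair satisfies A v = lambda v ...
    print(np.allclose(A @ Q[:, 0], eigvals[0] * Q[:, 0]))
    # ... and the matrix factors as A = Q Sigma Q^(-1); Q is orthogonal, so Q^(-1) = Q.T.
    print(np.allclose(A, Q @ Sigma @ Q.T))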