eigenvector

Power Method in MATLAB

末鹿安然 submitted on 2019-11-29 11:21:47
I would like to implement the Power Method for determining the dominant eigenvalue and eigenvector of a matrix in MATLAB. Here's what I wrote so far:

%function to implement power method to compute dominant
%eigenvalue/eigenvector
function [m,y_final]=power_method(A,x)
m=0;
n=length(x);
y_final=zeros(n,1);
y_final=x;
tol=1e-3;
while(1)
    mold=m;
    y_final=A*y_final;
    m=max(y_final);
    y_final=y_final/m;
    if abs(m-mold)<tol
        break;
    end
end
end

With the above code, here is a numerical example:

A=[1 1 -2;-1 2 1; 0 1 -1]
A =
     1     1    -2
    -1     2     1
     0     1    -1
>> x=[1 1 1];
>> x=x';
>> [m,y_final]=power_method(A,x);
>> A*x
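For comparison outside MATLAB, here is a minimal NumPy sketch of the same power iteration (my own variable names, and a residual-based stopping test instead of checking only the change in the eigenvalue estimate). The example matrix has eigenvalues 2, 1 and -1, so the iteration should settle on the dominant eigenvalue 2:

```python
import numpy as np

def power_method(A, x, tol=1e-8, max_iter=1000):
    """Power iteration: estimate the dominant eigenvalue m and an
    eigenvector y of A, stopping when the residual ||A y - m y|| is small."""
    y = np.asarray(x, dtype=float)
    m = 0.0
    for _ in range(max_iter):
        z = A @ y
        m = z[np.argmax(np.abs(z))]   # signed entry of largest magnitude
        y = z / m                     # normalize so that entry becomes 1
        if np.linalg.norm(A @ y - m * y) < tol:
            break
    return m, y

A = np.array([[1.0, 1.0, -2.0],
              [-1.0, 2.0, 1.0],
              [0.0, 1.0, -1.0]])
m, v = power_method(A, [1.0, 1.0, 1.0])
# Eigenvalues of A are 2, 1, -1, so m should converge to 2.
```

Checking the residual rather than successive eigenvalue estimates avoids a pitfall of the MATLAB code above: the estimate m can stabilize before the eigenvector has converged.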

How to use eigenvectors obtained through PCA to reproject my data?

只谈情不闲聊 submitted on 2019-11-29 07:42:25
I am using PCA on 100 images. My training data is a 442368x100 double matrix: 442368 features and 100 images. Here is my code for finding the eigenvectors:

[rows, cols] = size(training);
maxVec = rows;
maxVec = min(maxVec, rows);
train_mean = mean(training, 2);
A = training - train_mean*ones(1, cols);
A = A'*A;
[evec, eval] = eig(A);
[eval, ind] = sort(-1*diag(eval));
evec = evec(:, ind(1:100));

Now evec is a 100x100 double eigenvector matrix, and I have got 100 sorted eigenvectors. Question: if I want to transform my testing data using the eigenvectors calculated above, then how
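One way to finish the approach in the question (a sketch, not the asker's code): `eig(A'*A)` gives eigenvectors in the small "image space"; multiplying by the centered data maps them back to feature space, and any new data is then centered with the *training* mean and projected. Toy NumPy data stands in for the real 442368x100 matrix:

```python
import numpy as np

# Toy stand-ins for the question's data (rows = features, cols = images);
# the real matrices would be 442368x100 for training.
rng = np.random.default_rng(0)
training = rng.standard_normal((50, 10))
testing = rng.standard_normal((50, 3))

k = 5  # number of principal components to keep
train_mean = training.mean(axis=1, keepdims=True)
A = training - train_mean                  # centered training data

# Gram-matrix trick (as in the question): eigendecompose the small
# cols-by-cols matrix A'*A instead of the huge covariance A*A'.
eval_small, evec_small = np.linalg.eigh(A.T @ A)
order = np.argsort(eval_small)[::-1][:k]   # largest eigenvalues first
evec_k = evec_small[:, order]

# Map eigenvectors back to feature space: u_i = A v_i, then normalize.
U = A @ evec_k
U /= np.linalg.norm(U, axis=0)

# Reproject: center with the TRAINING mean, then multiply by U'.
train_proj = U.T @ A
test_proj = U.T @ (testing - train_mean)
```

The columns of U come out orthonormal (they are the left singular vectors of the centered data), so the projection is a proper change of basis.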

Singular Value Decomposition (SVD) in PHP

青春壹個敷衍的年華 submitted on 2019-11-28 21:41:37
I would like to implement Singular Value Decomposition (SVD) in PHP. I know that there are several external libraries which could do this for me, but I have two questions concerning PHP: 1) Do you think it's possible and/or reasonable to code the SVD in PHP? 2) If (1) is yes: can you help me code it in PHP? I've already coded some parts of the SVD myself. Here's the code, with comments describing the course of action; some parts of it aren't completely correct. It would be great if you could help me. Thank you very much in advance! si28719e: SVD-python is a very clear,
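Whether it is *reasonable* in PHP is debatable (pure-PHP numerics will be slow), but the algorithm itself is small. As a language-neutral sketch one could port, here is a one-sided Jacobi SVD written with nothing but scalar loops and arithmetic, which translate line-for-line to PHP arrays; all names here are my own, not from the asker's code:

```python
import math

def jacobi_svd(A, sweeps=30, eps=1e-12):
    """One-sided Jacobi SVD: returns (U, S, V) with A = U * diag(S) * V^T.
    Repeatedly rotates pairs of columns of A until all are orthogonal;
    the column norms are then the singular values."""
    m, n = len(A), len(A[0])
    A = [row[:] for row in A]                     # work on a copy
    V = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    for _ in range(sweeps):
        off = 0.0                                 # measures remaining non-orthogonality
        for p in range(n - 1):
            for q in range(p + 1, n):
                alpha = sum(A[k][p] * A[k][p] for k in range(m))
                beta = sum(A[k][q] * A[k][q] for k in range(m))
                gamma = sum(A[k][p] * A[k][q] for k in range(m))
                off += gamma * gamma
                if abs(gamma) < eps:
                    continue
                # Givens rotation that makes columns p and q orthogonal
                zeta = (beta - alpha) / (2.0 * gamma)
                t = math.copysign(1.0, zeta) / (abs(zeta) + math.sqrt(1.0 + zeta * zeta))
                c = 1.0 / math.sqrt(1.0 + t * t)
                s = c * t
                for M in (A, V):
                    for k in range(m if M is A else n):
                        Mp, Mq = M[k][p], M[k][q]
                        M[k][p] = c * Mp - s * Mq
                        M[k][q] = s * Mp + c * Mq
        if off < eps:
            break
    S = [math.sqrt(sum(A[k][j] * A[k][j] for k in range(m))) for j in range(n)]
    U = [[A[k][j] / S[j] if S[j] > eps else 0.0 for j in range(n)] for k in range(m)]
    return U, S, V
```

For production use a library is still the better answer, but this shows the whole method fits in a page of portable code.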

How to calculate 2^n modulo 1000000007, n = 10^9

蹲街弑〆低调 submitted on 2019-11-28 12:28:38
What is the fastest method to calculate this? I saw some people using matrices, and when I searched on the internet they talked about eigenvalues and eigenvectors (no idea about this stuff). There was a question which reduced to the recursive equation f(n) = 2*f(n-1) + 2 with f(1) = 1, and n could be up to 10^9. I already tried DP, storing up to 1000000 values, and the common fast exponentiation method, but it all timed out. I'm generally weak in these modulo questions, which require computing large values. f(n) = 2*f(n-1) + 2 with f(1) = 1 is equivalent to (f(n)+2) = 2*(f(n-1)+2) = ...
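Following that hint: since f(n)+2 = 2*(f(n-1)+2) and f(1)+2 = 3, we get f(n) = 3*2^(n-1) - 2, so the whole problem reduces to one modular exponentiation, which takes O(log n) multiplications by repeated squaring. A sketch:

```python
MOD = 1_000_000_007

def pow_mod(base, exp, mod):
    """Binary (fast) exponentiation: O(log exp) multiplications mod `mod`."""
    result = 1
    base %= mod
    while exp > 0:
        if exp & 1:                      # current bit of exp is set
            result = result * base % mod
        base = base * base % mod         # square for the next bit
        exp >>= 1
    return result

def f(n):
    # f(n) + 2 = 2*(f(n-1) + 2) and f(1) + 2 = 3  =>  f(n) = 3*2^(n-1) - 2
    return (3 * pow_mod(2, n - 1, MOD) - 2) % MOD
```

For n = 10^9 this runs in about 30 loop iterations, so there is no need for DP tables or matrix eigendecompositions here.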

Mapping array back to an existing Eigen matrix

爷,独闯天下 submitted on 2019-11-28 10:20:20
I want to map an array of double to an existing MatrixXd structure. So far I've managed to map the Eigen matrix to a simple array, but I can't find the way to do it back.

void foo(MatrixXd matrix, int n){
    double* arrayd = new double[n*n];
    // map the input matrix to an array
    Map<MatrixXd>(arrayd, n, n) = matrix;
    // do something with the array
    .......
    // map array back to the existing matrix
}

I'm not sure what you want, but I'll try to explain. You're mixing double and float in your code (a MatrixXf is a matrix where every entry is a float). I'll assume for the moment that this was unintentional

R and MATLAB returning different eigenvectors

半城伤御伤魂 submitted on 2019-11-28 08:41:54
Question: I'm missing something obvious, but here goes. In R:

dput(M)
structure(c(-2.77555756156289e-16, 9.63770703841896e-16, 0,
9.63770703841896e-16, 10.6543192562307, 4.11228781751498e-14,
0, 4.11228781751498e-14, 275.591724761168), .Dim = c(3L, 3L),
.Dimnames = list(c("", "", ""), c("", "", "")))

# thus M is
-2.775558e-16  9.637707e-16  0.000000e+00
 9.637707e-16  1.065432e+01  4.112288e-14
 0.000000e+00  4.112288e-14  2.755917e+02

eigen(M)
$values
[1]  2.755917e+02  1.065432e+01 -2.775558e-16

$vectors
[,1] [

Could we get different solutions for eigenvectors from a matrix?

有些话、适合烂在心里 submitted on 2019-11-27 09:37:47
My purpose is to find the eigenvectors of a matrix. In MATLAB, [V,D] = eig(M) returns the eigenvectors of M. I also used the website WolframAlpha to double-check my results. We have a 10x10 matrix called M:

0.736538062307847 -0.638137874226607 -0.409041107160722 -0.221115060391256 -0.947102932298308 0.0307937582853794 1.23891356582639 1.23213871779652 0.763885436104244 -0.805948245321096
-1.00495215920171 -0.563583317483057 -0.250162608745252 0.0837145788064272 -0.201241986127792 -0.0351472158148094 -1.36303599752928 0.00983020375259212 -0
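Yes, different tools can return different but equally valid answers: an eigenvector is only defined up to a nonzero scalar, so sign flips and different normalizations between MATLAB and WolframAlpha are expected. A quick NumPy illustration (my own toy matrix, chosen for clarity):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigvals, eigvecs = np.linalg.eig(A)
lam = eigvals[0]
v = eigvecs[:, 0]

# Any nonzero scalar multiple of v is again an eigenvector for lam,
# so different scalings are all "correct" answers.
for scale in (1.0, -1.0, 5.0):
    w = scale * v
    assert np.allclose(A @ w, lam * w)
```

NumPy and MATLAB normalize eigenvectors to unit length, while WolframAlpha often scales them so one entry equals 1; comparing raw columns therefore makes identical eigenspaces look different.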