svd

Toy R function for solving ordinary least squares by singular value decomposition

99封情书 submitted on 2019-12-12 22:17:49
Question: I'm trying to write a function for multiple regression analysis ( y = Xb + e ) using the singular value decomposition of matrices. y and X must be the inputs, and the regression coefficient vector b, the residual vector e, and the variance accounted for (R2) must be the outputs. Below is what I have so far, and I'm totally stuck. The labels part of the weights also gives me an error. What is this labels part? Can anybody give me some tips to help me proceed? Test <- function(X, y) { x <- t(A) %*% A duv <- svd(x)
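For reference, here is a minimal sketch of the same computation in Python/NumPy rather than R; the function name `ols_svd` and the simulated data are illustrative, not taken from the question. The coefficients come from the SVD pseudoinverse, and the residuals and R2 follow from them.

```python
import numpy as np

def ols_svd(X, y):
    """Least squares via SVD: b = V @ diag(1/s) @ U.T @ y."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    b = Vt.T @ ((U.T @ y) / s)              # regression coefficients
    e = y - X @ b                           # residual vector
    ss_res = float(e @ e)
    ss_tot = float(((y - y.mean()) ** 2).sum())
    r2 = 1.0 - ss_res / ss_tot              # variance accounted for
    return b, e, r2

# quick check against a known linear relation
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(100), rng.normal(size=(100, 2))])
y = X @ np.array([1.0, 2.0, -3.0]) + rng.normal(scale=0.1, size=100)
b, e, r2 = ols_svd(X, y)
print(b, r2)
```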

scikit-learn TruncatedSVD's explained variance ratio not in descending order

孤街醉人 submitted on 2019-12-12 07:47:10
Question: TruncatedSVD's explained variance ratio is not in descending order, unlike sklearn's PCA. I looked at the source code, and it seems they use different ways of calculating the explained variance ratio. TruncatedSVD: U, Sigma, VT = randomized_svd(X, self.n_components, n_iter=self.n_iter, random_state=random_state) X_transformed = np.dot(U, np.diag(Sigma)) self.explained_variance_ = exp_var = np.var(X_transformed, axis=0) if sp.issparse(X): _, full_var = mean_variance_axis(X, axis=0) full_var
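For context, TruncatedSVD does not center the data the way PCA does, so its `explained_variance_ratio_` is not guaranteed to be sorted. A small illustrative sketch (random data, arbitrary parameter values) that reorders the ratios when a PCA-like ordering is wanted:

```python
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.RandomState(0)
X = rng.rand(100, 20)

svd = TruncatedSVD(n_components=5, random_state=0)
svd.fit(X)

# without centering, the ratios need not come out in descending order
print(svd.explained_variance_ratio_)

# reorder the components by explained variance if that ordering is needed
order = np.argsort(svd.explained_variance_ratio_)[::-1]
print(svd.explained_variance_ratio_[order])
```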

Convert std::list to cv::Mat in C++ using OpenCV

无人久伴 submitted on 2019-12-12 02:57:14
Question: I'm trying to solve an equation system using SVD: cv::SVD::solveZ(A, x); but A needs to be a cv::Mat. OpenCV doesn't offer any conversion of a std::list to cv::Mat, so my question is whether there is a smart way to convert it without first converting the std::list to a std::vector. The matrix A is a 3xN matrix. My list contains N cv::Point3d elements. My code looks something like this: std::list<cv::Point3d> points; // length: N cv::Mat A = cv::Mat(points).reshape(1); // that's how
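As an aside on what cv::SVD::solveZ computes: it returns the unit vector x minimizing |A·x|, i.e. the right singular vector belonging to the smallest singular value. A Python/NumPy sketch of that computation, with an illustrative point list standing in for the std::list<cv::Point3d> (this is not the asker's C++ code):

```python
import numpy as np

# illustrative list of N 3-D points (stand-in for std::list<cv::Point3d>)
points = [(1.0, 2.0, 3.0), (2.0, 4.1, 6.0), (0.5, 1.0, 1.4)]

A = np.asarray(points)      # N x 3, one point per row; transpose for a 3 x N layout
_, _, Vt = np.linalg.svd(A)
x = Vt[-1]                  # right singular vector of the smallest singular value:
                            # minimizes ||A @ x|| subject to ||x|| = 1
print(x, np.linalg.norm(A @ x))
```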

How to cluster syllable types with Python?

痞子三分冷 submitted on 2019-12-12 02:47:28
Question: This is my second question on Stack Overflow. I don't have too much experience with Python, but I had excellent results with my first question and was able to implement the code from the answer, so I will try again with this new problem: I am trying to classify syllable types from a canary song, in order to use each type as a template to find and classify large sets of data with similar behavior. I use the envelope of the singing. My data is a sampled array, with time and amplitude (a plot of
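The question is cut off above. Purely as an illustration of one possible direction (not the accepted answer), here is a sketch that resamples each syllable envelope to a fixed length and clusters the resulting feature vectors with k-means; the syllable segments and the cluster count are made up for the example:

```python
import numpy as np
from scipy.signal import resample
from sklearn.cluster import KMeans

# illustrative: a list of 1-D amplitude segments, one per detected syllable
# (segmenting the song envelope into syllables is assumed to be done already)
syllables = [np.abs(np.sin(np.linspace(0, np.pi, n))) for n in (90, 110, 200, 210)]

# resample every syllable envelope to a common length so shapes can be compared
features = np.vstack([resample(s, 64) for s in syllables])

# cluster the envelope shapes; the number of syllable types is a guess here
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)
```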

Can I get data spread (noise) from singular value decomposition?

风流意气都作罢 submitted on 2019-12-12 01:54:37
Question: I was hoping to use singular value decomposition to estimate the standard deviation of ellipsoid data. I'm not sure if this is the best approach, and I may be overthinking the entire process, so I need some help. I simulated some data using the following script... from matplotlib import pyplot as plt import numpy def svd_example(): # simulate some data... # x values have standard deviation 3000 xdata = numpy.random.normal(0, 3000, 5000).reshape(-1, 1) # y values standard deviation 300 ydata =
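One way to read the question: for zero-mean data, the singular values of the data matrix are related to the standard deviations along the principal axes by s_i / sqrt(n − 1). A sketch under that assumption (the rotation angle and sample size are illustrative, not from the question's script):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# two independent axes with standard deviations 3000 and 300, then rotated
data = np.column_stack([rng.normal(0, 3000, n), rng.normal(0, 300, n)])
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
data = data @ rot.T

# center the data; the singular values then give the spread along the
# principal axes: std_i ~= s_i / sqrt(n - 1)
centered = data - data.mean(axis=0)
_, s, _ = np.linalg.svd(centered, full_matrices=False)
print(s / np.sqrt(n - 1))   # approximately [3000, 300]
```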

Plot more than 50 components with the Rssa package in R

白昼怎懂夜的黑 submitted on 2019-12-11 18:24:27
Question: require(Rssa) t=ssa(co2,200) #200 here should be the number of components plot(t) # this way it plots only the first 50, not 200! The above code produces a graph of the first 50 components only. I need to plot more than 50 components. I tried plot(t$sigma[1:200],type='l',log='y') but it didn't work! Example: similar to this case: accessing eigenvalues in RSSA package in R. Answer 1: Looking at the help page for ?ssa we see a parameter named neig, which is documented as: integer, number of desired

Singular Value Decomposition (SVD) and Principal Component Analysis (PCA)

给你一囗甜甜゛ submitted on 2019-12-11 18:14:30
1 Algorithm introduction. Singular value decomposition (SVD) is a matrix factorization method from linear algebra. Given a matrix A, its singular value decomposition writes it as a product of three matrices: an m-dimensional orthogonal matrix on the left, an m×n diagonal matrix in the middle, and an n-dimensional orthogonal matrix on the right: $A = U\Sigma V^{T}$. The sizes of these three matrices are shown in the figure. [figure: sizes of U, Σ and V^T] Every entry of $\Sigma$ off the diagonal is zero, and the diagonal entries are sorted in descending order: the leading ones are relatively large, while many of the trailing ones are close to zero. These diagonal entries are the singular values. ($u_i$ denotes an m-dimensional vector, $v_i$ an n-dimensional vector.) $\Sigma$ contains n singular values, but since many of the trailing ones are close to zero, we can keep only the largest r singular values and drop the remaining n−r from all three matrices. After this truncation we obtain new matrices. [figure: the truncated decomposition, ./Img/fig2.png] In the new decomposition, $\Sigma$ keeps only the r largest singular values. In practice, storing just these three much smaller matrices is enough to represent A, which not only saves storage but also reduces computation. SVD is used in information retrieval (latent semantic indexing
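A short NumPy sketch of the rank-r truncation described above; the matrix size and the value of r are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(8, 6))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

r = 3                                          # number of singular values to keep
A_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]    # rank-r approximation of A

# the spectral-norm error of the truncation equals the first discarded singular value
print(np.linalg.norm(A - A_r, 2), s[r])
```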

Singular Value Decomposition: Different results with Jama, PColt and NumPy

风格不统一 submitted on 2019-12-11 08:36:51
Question: I want to perform Singular Value Decomposition on a large (sparse) matrix. In order to choose the best (most accurate) library, I tried replicating the SVD example provided here using different Java and Python libraries. Strangely, I am getting different results with each library. Here's the original example matrix and its decomposed (U, S and VT) matrices: A =2.0 0.0 8.0 6.0 0.0 1.0 6.0 0.0 1.0 7.0 5.0 0.0 7.0 4.0 0.0 7.0 0.0 8.0 5.0 0.0 0.0 10.0 0.0 0.0 7.0 U =-0.54 0.07 0.82 -0.11 0.12 -0.10
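Part of such discrepancies is usually benign: the SVD is only unique up to simultaneous sign flips of a column of U and the matching row of VT (and up to rotations within groups of equal singular values), so different libraries can print different-looking factors that are all correct. A NumPy check, assuming the flattened values above form a 5×5 matrix read row by row (an assumption, since the question's original layout is lost here):

```python
import numpy as np

# the 25 values from the question, assumed to form a 5x5 matrix read row by row
A = np.array([2.0, 0.0, 8.0, 6.0, 0.0,
              1.0, 6.0, 0.0, 1.0, 7.0,
              5.0, 0.0, 7.0, 4.0, 0.0,
              7.0, 0.0, 8.0, 5.0, 0.0,
              0.0, 10.0, 0.0, 0.0, 7.0]).reshape(5, 5)

U, s, Vt = np.linalg.svd(A)

# signs of U's columns / Vt's rows may differ between libraries; the singular
# values and the reconstruction are what must agree
print(s)
print(np.allclose(U @ np.diag(s) @ Vt, A))
```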

The principle of SVD-based coordinate frame transformation

喜你入骨 submitted on 2019-12-11 07:32:05
The SVD method takes the coordinates of several points measured in two different coordinate frames, builds a matrix from the point pairs, and obtains the attitude (rotation) matrix from the singular value decomposition of that matrix. The two frames are related by ${}^1P = {}_1^2R\,{}^2P + {}_1^2T$. First, a matrix $H$ is constructed from the two point sets: $H = \sum\limits_{i=1}^{m} \left( ({}^2P_i - {}^2\bar P) \cdot ({}^1P_i - {}^1\bar P)^T \right)$. Then $H$ is decomposed by SVD: $U \cdot \Delta \cdot V^T = \mathrm{SVD}(H)$. Then the sign of $|V \cdot U^T|$ is computed; if it is greater than 0, then ${}_1^2R$
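A Python sketch of the procedure described above (essentially the Kabsch/Arun rigid-registration method); the function name and the self-check data are illustrative, not from the post:

```python
import numpy as np

def rigid_transform(P2, P1):
    """Estimate R, T with P1 ~= R @ P2 + T from matched 3-D points (one per row)."""
    c2, c1 = P2.mean(axis=0), P1.mean(axis=0)
    H = (P2 - c2).T @ (P1 - c1)                # the H matrix from the post
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    T = c1 - R @ c2
    return R, T

# quick self-check with a known rotation and translation
rng = np.random.default_rng(0)
P2 = rng.normal(size=(10, 3))
a = 0.7
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([1.0, -2.0, 0.5])
P1 = P2 @ R_true.T + T_true
R, T = rigid_transform(P2, P1)
print(np.allclose(R, R_true), np.allclose(T, T_true))
```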

Simple (working) handwritten digit recognition: how to improve it?

自作多情 submitted on 2019-12-10 23:27:37
Question: I just wrote this very simple handwritten digit recognition. Here is an 8 KB archive with the following code + ten .PNG image files. It works: [image] is well recognized as [image]. In short, each digit of the database (50x50 pixels = 2500 coefficients) is summarized into a 10-coefficient vector (by keeping the 10 biggest singular values, see Low-rank approximation with SVD). Then, for the digit to be recognized, we minimize the distance with the digits in the database. from scipy import misc import numpy as np
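A minimal sketch of the idea described above, not the code from the linked archive: each image is summarized by its largest singular values, and the query is matched to the database entry with the nearest descriptor. The dictionary of random 50×50 arrays stands in for the PNG files:

```python
import numpy as np

def descriptor(img, k=10):
    """Summarize a 2-D grayscale image by its k largest singular values."""
    s = np.linalg.svd(np.asarray(img, dtype=float), compute_uv=False)
    return s[:k]

def recognize(img, database):
    """Return the label of the database digit whose descriptor is closest."""
    d = descriptor(img)
    return min(database,
               key=lambda label: np.linalg.norm(descriptor(database[label]) - d))

# illustrative 50x50 stand-ins for the ten PNG files in the archive
rng = np.random.default_rng(0)
database = {str(digit): rng.random((50, 50)) for digit in range(10)}
test = database["3"] + 0.05 * rng.random((50, 50))   # noisy copy of "3"
print(recognize(test, database))
```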