A^k for matrix multiplication in R?

故里飘歌 2020-12-31 08:22

Suppose A is some square matrix. How can I easily exponentiate this matrix in R?

I already tried two approaches: Trial 1 with a for-loop hack, and Trial 2

6 answers
  • 2020-12-31 08:50

    The expm package provides a %^% operator:

    library("sos")            # used to search for the right package
    findFn("{matrix power}")
    install.packages("expm")
    library("expm")
    ?matpow                   # help page for the %^% operator
    set.seed(10)
    a <- t(matrix(rnorm(16), ncol = 4, nrow = 4))
    a %^% 8
    
  • 2020-12-31 08:50

    Indeed, the expm package's %^% uses exponentiation by squaring.

    In pure R, the same algorithm can be implemented rather efficiently:

    "%^%" <- function(mat, power){
        # Exponentiation by squaring; assumes power >= 1
        base = mat
        out = diag(nrow(mat))       # identity-matrix accumulator
        while(power > 1){
            if(power %% 2 == 1){    # odd exponent: fold the current base in
                out = out %*% base
            }
            base = base %*% base    # square the base
            power = power %/% 2     # halve the exponent
        }
        out %*% base
    }
    

    Timing this,

    m0 <- diag(1, nrow=3,ncol=3)
    system.time(replicate(10000, m0%^%4000))#expm's %^% function
    user  system elapsed 
    0.31    0.00    0.31 
    system.time(replicate(10000, m0%^%4000))# my %^% function
    user  system elapsed 
    0.28    0.00    0.28 
    

    So, as expected, they run at the same speed because they use the same algorithm; the overhead of the interpreted R loop does not make a significant difference.

    So, if you don't want to depend on expm but need that performance, you can use this instead, provided you don't mind the imperative style.
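
    As a quick correctness check, the squaring operator can be compared against naive repeated multiplication. This is a minimal sketch; the operator definition from above is repeated so the snippet runs on its own:

```r
# Exponentiation by squaring (same definition as above); assumes power >= 1
"%^%" <- function(mat, power) {
    base <- mat
    out <- diag(nrow(mat))
    while (power > 1) {
        if (power %% 2 == 1) out <- out %*% base
        base <- base %*% base
        power <- power %/% 2
    }
    out %*% base
}

# Naive reference implementation: multiply (power - 1) times
naive_power <- function(mat, power) {
    out <- mat
    for (i in seq_len(power - 1)) out <- out %*% mat
    out
}

set.seed(1)
A <- matrix(rnorm(9), 3, 3)
for (k in c(1, 2, 5, 8)) {
    stopifnot(isTRUE(all.equal(A %^% k, naive_power(A, k))))
}
```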

  • 2020-12-31 08:55

    If A is diagonalizable, you could use an eigenvalue decomposition:

    matrix.power <- function(A, n) {   # only works for diagonalizable matrices
       e <- eigen(A)
       M <- e$vectors   # matrix for changing basis
       d <- e$values    # eigenvalues
       return(M %*% diag(d^n) %*% solve(M))
    }
    

    When A is not diagonalizable, the matrix M of eigenvectors is singular. For example, with A = matrix(c(0,1,0,0),2,2) the call fails with Error in solve.default(M) : system is computationally singular.
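
    The approach can be sanity-checked on a matrix that is certainly diagonalizable, e.g. a symmetric one. A minimal sketch, repeating the definition above so it runs standalone:

```r
# Eigendecomposition-based power (same definition as above)
matrix.power <- function(A, n) {
    e <- eigen(A)
    M <- e$vectors
    d <- e$values
    M %*% diag(d^n) %*% solve(M)
}

A <- matrix(c(2, 1, 1, 2), 2, 2)   # symmetric, hence diagonalizable
stopifnot(isTRUE(all.equal(matrix.power(A, 3), A %*% A %*% A)))
```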

  • 2020-12-31 09:01

    A simple solution using repeated multiplication:

    `%^%` <- function(A, n) {
      # Repeated multiplication; assumes n >= 1 (n = 0 is not handled)
      A1 <- A
      for(i in seq_len(n - 1)){
        A <- A %*% A1
      }
      return(A)
    }
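
    A quick usage check for the loop version (a minimal sketch; the operator is repeated so the snippet is self-contained, and it assumes n >= 1):

```r
# Repeated multiplication; assumes n >= 1
`%^%` <- function(A, n) {
  A1 <- A
  for (i in seq_len(n - 1)) A <- A %*% A1
  A
}

A <- matrix(1:4, 2, 2)
stopifnot(isTRUE(all.equal(A %^% 3, A %*% A %*% A)))
```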
    
  • 2020-12-31 09:06

    A shorter solution with eigenvalue decomposition. Note that it uses t(vectors) in place of solve(vectors), so it is valid only for symmetric S, whose eigenvector matrix is orthogonal:

    "%^%" <- function(S, power)
             with(eigen(S), vectors %*% (values^power * t(vectors)))
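
    A quick check on a symmetric matrix, for which the one-liner is valid (a minimal sketch repeating the definition so it runs standalone):

```r
# Valid only for symmetric S: t(vectors) is then the inverse of vectors
"%^%" <- function(S, power)
    with(eigen(S), vectors %*% (values^power * t(vectors)))

S <- matrix(c(2, 1, 1, 2), 2, 2)   # symmetric
stopifnot(isTRUE(all.equal(S %^% 3, S %*% S %*% S)))
```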
    
  • 2020-12-31 09:10

    Although Reduce is more elegant, a for-loop solution is faster and seems to be as fast as expm's %^%:

    m1 <- matrix(1:9, 3)
    m2 <- matrix(1:9, 3)
    m3 <- matrix(1:9, 3)
    system.time(replicate(1000, Reduce("%*%" , list(m1,m1,m1) ) ) )
    #   user  system elapsed 
    #  0.026   0.000   0.037 
    mlist <- list(m1,m2,m3)
    m0 <- diag(1, nrow=3,ncol=3)
    system.time(replicate(1000, for (i in 1:3 ) {m0 <- m0 %*% m1 } ) )
    #   user  system elapsed 
    #  0.013   0.000   0.014 
    
    library(expm)  # and I think this may be imported with pkg:Matrix
    system.time(replicate(1000, m0%^%3))
    # user  system elapsed 
    #0.011   0.000   0.017 
    

    On the other hand, the matrix.power solution is much, much slower:

    system.time(replicate(1000, matrix.power(m1, 4)) )
       user  system elapsed 
      0.677   0.013   1.037 
    

    @BenBolker is correct (yet again). The for-loop's time appears to grow linearly with the exponent, whereas expm's %^% appears to scale even better than log(exponent):

    > m0 <- diag(1, nrow=3,ncol=3)
    > system.time(replicate(1000, for (i in 1:400 ) {m0 <- m0 %*% m1 } ) )
       user  system elapsed 
      0.678   0.037   0.708 
    > system.time(replicate(1000, m0%^%400))
       user  system elapsed 
      0.006   0.000   0.006 
    