Einstein-convention-like implicit summation with SymPy


Question


Given some SymPy tensor

    A = sympy.tensor.array.Array( <3rd-rank data consisting of sympy expressions> )

and matrices T and T1 = T.inv() representing a basis transformation, is it somehow possible to use a notation like

    B[i,j,k] = T[i,a] * A[a,b,c] * T1[b,j] * T1[c,k]

to calculate the transformed tensor?
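In other words, the goal is the following summation, written here as explicit loops (a plain-Python sketch only, assuming T and T1 are 4×4 matrices and A is a 4×4×4 array):

    # Intended semantics of the index notation above, spelled out as loops
    # (illustrative sketch; assumes 4x4 matrices T, T1 and a 4x4x4 array A):
    n = 4
    B = [[[sum(T[i, a] * A[a, b, c] * T1[b, j] * T1[c, k]
               for a in range(n) for b in range(n) for c in range(n))
           for k in range(n)]
          for j in range(n)]
         for i in range(n)]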

It seems that it is, in principle, possible to use the Einstein summation convention in SymPy, but I am running into multiple problems:

  • A code snippet

    from sympy import symbols, IndexedBase, EinsteinSum 
    TX, A, x = symbols('TX A x', cls=IndexedBase)
    i, j = symbols('i j')
    ein_sum = EinsteinSum(A[i, j] * x[j])
    

    that I found doesn't work, because EinsteinSum doesn't seem to exist anymore.

  • When trying to use expressions like

    var("i")
    Sum(A[i,i,i],(i,1,3))
    

    a TypeError is raised (apparently Array doesn't allow symbolic indices).


Answer 1:


Some relevant features have recently been added to the development version of SymPy; they will be available in the next release.

NOTE: EinsteinSum has never existed. There was just a proposal to add it, IIRC.

Let's suppose you have a rank-3 array A given by:

In [13]: A
Out[13]: 
⎡⎡1  0  0   0 ⎤  ⎡0   0   0  1⎤  ⎡0   0  0  -ⅈ⎤  ⎡0   0  1  0 ⎤⎤
⎢⎢            ⎥  ⎢            ⎥  ⎢            ⎥  ⎢            ⎥⎥
⎢⎢0  1  0   0 ⎥  ⎢0   0   1  0⎥  ⎢0   0  ⅈ  0 ⎥  ⎢0   0  0  -1⎥⎥
⎢⎢            ⎥  ⎢            ⎥  ⎢            ⎥  ⎢            ⎥⎥
⎢⎢0  0  -1  0 ⎥  ⎢0   -1  0  0⎥  ⎢0   ⅈ  0  0 ⎥  ⎢-1  0  0  0 ⎥⎥
⎢⎢            ⎥  ⎢            ⎥  ⎢            ⎥  ⎢            ⎥⎥
⎣⎣0  0  0   -1⎦  ⎣-1  0   0  0⎦  ⎣-ⅈ  0  0  0 ⎦  ⎣0   1  0  0 ⎦⎦

and a simple matrix T given by:

In [15]: T
Out[15]: 
⎡2  0  0  0⎤
⎢          ⎥
⎢0  2  0  0⎥
⎢          ⎥
⎢0  0  2  0⎥
⎢          ⎥
⎣0  0  0  2⎦

So that:

In [17]: T.inv()
Out[17]: 
⎡1/2   0    0    0 ⎤
⎢                  ⎥
⎢ 0   1/2   0    0 ⎥
⎢                  ⎥
⎢ 0    0   1/2   0 ⎥
⎢                  ⎥
⎣ 0    0    0   1/2⎦

I suggest converting everything to Array, as I fear that Matrix and Array objects won't interoperate that well.
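For example, a minimal conversion sketch (assuming T is the Matrix above; if your version's Array does not accept a Matrix directly, pass T.tolist() instead):

    from sympy import Array
    # Work with Array objects throughout, so that tensorproduct and
    # tensorcontraction can be applied uniformly:
    TA  = Array(T)        # rank-2 array built from the matrix T
    T1A = Array(T.inv())  # rank-2 array built from the inverse of T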

Symbolic indices are now supported:

In [28]: T[i, j]
Out[28]: 
⎡2  0  0  0⎤      
⎢          ⎥      
⎢0  2  0  0⎥      
⎢          ⎥[i, j]
⎢0  0  2  0⎥      
⎢          ⎥      
⎣0  0  0  2⎦ 

As soon as the values i and j are replaced with numbers, the tensor expression will be evaluated.
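For example (relying on the development-version behaviour just described):

    from sympy import symbols
    i, j = symbols('i j')
    # Substituting integers for the symbolic indices should collapse the
    # indexed expression to the corresponding entry of T:
    T[i, j].subs({i: 0, j: 0})   # expected to evaluate to 2 for the T above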

If you want to operate on tensor products of arrays, use the tensorproduct and tensorcontraction functions. To emulate matrix multiplication on Array objects:

In [29]: tensorcontraction(tensorproduct(T, T1), (1, 2))
Out[29]: 
⎡1  0  0  0⎤
⎢          ⎥
⎢0  1  0  0⎥
⎢          ⎥
⎢0  0  1  0⎥
⎢          ⎥
⎣0  0  0  1⎦

NOTE: the latest released SymPy version has a bug in tensorcontraction when contracting over many axes at the same time. This should be fixed in the development branch.
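Putting this together, the transformation B[i,j,k] = T[i,a]*A[a,b,c]*T1[b,j]*T1[c,k] from the question can be written with tensorproduct and tensorcontraction. A sketch, assuming T, T1 and A have all been converted to Array; the axis numbers follow the combined index order of the tensor product, and contracting one pair of axes per call also sidesteps the multi-axis bug mentioned above:

    from sympy import tensorproduct, tensorcontraction

    # Step-by-step contraction (one axis pair per call):
    C = tensorcontraction(tensorproduct(T, A), (1, 2))    # C[i,b,c] = sum_a T[i,a]*A[a,b,c]
    D = tensorcontraction(tensorproduct(C, T1), (1, 3))   # D[i,c,j] = sum_b C[i,b,c]*T1[b,j]
    B = tensorcontraction(tensorproduct(D, T1), (1, 3))   # B[i,j,k] = sum_c D[i,c,j]*T1[c,k]

    # On the development branch, the same result should be obtainable in one
    # call, contracting all three index pairs at once:
    # B = tensorcontraction(tensorproduct(T, A, T1, T1), (1, 2), (3, 5), (4, 7))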

The expression Sum(A[i,i,i], (i, 0, 3)) works (NOTE: indices start at 0, not 1, in Python).
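For instance:

    from sympy import Sum, symbols
    i = symbols('i')
    # Finite sum over the "diagonal" entries A[i, i, i]; doit() evaluates it
    # once the bounds are concrete (note the 0-based limits):
    Sum(A[i, i, i], (i, 0, 3)).doit()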

The expression Sum(T[i, j]*T1[j, k], (j, 0, 3)) does not give a nice result, as SymPy does not understand what to simplify. Compare:

In [38]: Sum(T[i, j]*T1[j, k], (j, 0, 3))
Out[38]: 
  3                                                 
______                                              
╲                                                   
 ╲     ⎡1/2   0    0    0 ⎤       ⎡2  0  0  0⎤      
  ╲    ⎢                  ⎥       ⎢          ⎥      
   ╲   ⎢ 0   1/2   0    0 ⎥       ⎢0  2  0  0⎥      
    ╲  ⎢                  ⎥[j, k]⋅⎢          ⎥[i, j]
    ╱  ⎢ 0    0   1/2   0 ⎥       ⎢0  0  2  0⎥      
   ╱   ⎢                  ⎥       ⎢          ⎥      
  ╱    ⎣ 0    0    0   1/2⎦       ⎣0  0  0  2⎦      
 ╱                                                  
╱                                                   
‾‾‾‾‾‾                                              
j = 0   

If you evaluate it, you get:

In [39]: Sum(T[i, j]*T1[j, k], (j, 0, 3)).doit()
Out[39]: 
⎡1/2   0    0    0 ⎤       ⎡2  0  0  0⎤         ⎡1/2   0    0    0 ⎤       ⎡2  0  0  0⎤         ⎡1/2   0    0    0 ⎤       ⎡2  0  0  0⎤   
⎢                  ⎥       ⎢          ⎥         ⎢                  ⎥       ⎢          ⎥         ⎢                  ⎥       ⎢          ⎥   
⎢ 0   1/2   0    0 ⎥       ⎢0  2  0  0⎥         ⎢ 0   1/2   0    0 ⎥       ⎢0  2  0  0⎥         ⎢ 0   1/2   0    0 ⎥       ⎢0  2  0  0⎥   
⎢                  ⎥[0, k]⋅⎢          ⎥[i, 0] + ⎢                  ⎥[1, k]⋅⎢          ⎥[i, 1] + ⎢                  ⎥[2, k]⋅⎢          ⎥[i,
⎢ 0    0   1/2   0 ⎥       ⎢0  0  2  0⎥         ⎢ 0    0   1/2   0 ⎥       ⎢0  0  2  0⎥         ⎢ 0    0   1/2   0 ⎥       ⎢0  0  2  0⎥   
⎢                  ⎥       ⎢          ⎥         ⎢                  ⎥       ⎢          ⎥         ⎢                  ⎥       ⎢          ⎥   
⎣ 0    0    0   1/2⎦       ⎣0  0  0  2⎦         ⎣ 0    0    0   1/2⎦       ⎣0  0  0  2⎦         ⎣ 0    0    0   1/2⎦       ⎣0  0  0  2⎦   

      ⎡1/2   0    0    0 ⎤       ⎡2  0  0  0⎤      
      ⎢                  ⎥       ⎢          ⎥      
      ⎢ 0   1/2   0    0 ⎥       ⎢0  2  0  0⎥      
 2] + ⎢                  ⎥[3, k]⋅⎢          ⎥[i, 3]
      ⎢ 0    0   1/2   0 ⎥       ⎢0  0  2  0⎥      
      ⎢                  ⎥       ⎢          ⎥      
      ⎣ 0    0    0   1/2⎦       ⎣0  0  0  2⎦ 

which is the correct result, but the representation is not nice-looking: SymPy fails to recognize the simplifications that could be applied.

To get back a matrix in a nice form, you could use:

In [45]: Matrix([[Sum(T[i, j]*T1[j, k], (j, 0, 3)).doit().subs({i: ni, k: nk}) for ni in range(4)] for nk in range(4)])
Out[45]: 
⎡1  0  0  0⎤
⎢          ⎥
⎢0  1  0  0⎥
⎢          ⎥
⎢0  0  1  0⎥
⎢          ⎥
⎣0  0  0  1⎦
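The same substitution approach extends to the rank-3 transformation from the question; a hedged sketch (relying on symbolic Array indices as above, so the summand only evaluates once concrete index values are substituted):

    from sympy import Array, Sum, symbols
    i, j, k, a, b, c = symbols('i j k a b c')

    # Transformation law written as an explicit triple Sum:
    B_expr = Sum(T[i, a] * A[a, b, c] * T1[b, j] * T1[c, k],
                 (a, 0, 3), (b, 0, 3), (c, 0, 3)).doit()

    # Rebuild an explicit 4x4x4 Array by substituting concrete index values:
    B = Array([[[B_expr.subs({i: ni, j: nj, k: nk}) for nk in range(4)]
                for nj in range(4)]
               for ni in range(4)])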


Source: https://stackoverflow.com/questions/42907090/einstein-convention-like-implicit-summation-with-sympy
