Calculate covariance matrix for complex data in two channels (no complex data type)

Submitted by 两盒软妹~ on 2021-01-20 20:32:08

Question


I have complex-valued data given in 2 channels of a matrix (one holds the real part, the other the imaginary part), so the matrix dimensions are (height, width, 2), since PyTorch does not have a native complex data type. I now want to calculate the covariance matrix. The stripped-down numpy calculation adapted for PyTorch is this:

import torch

def cov(m, y=None):

    if m.ndimension() > 2:
        raise ValueError("m has more than 2 dimensions")

    if y is not None and y.ndimension() > 2:
        raise ValueError("y has more than 2 dimensions")

    X = m
    if X.shape[0] == 0:
        return torch.tensor([]).reshape(0, 0)
    if y is not None:
        X = torch.cat((X, y), dim=0)

    ddof = 1

    avg = torch.mean(X, dim=1)

    fact = X.shape[1] - ddof

    if fact <= 0:
        import warnings
        warnings.warn("Degrees of freedom <= 0 for slice",
                      RuntimeWarning, stacklevel=2)
        fact = 0.0

    # Center the rows without modifying the input in place
    X = X - avg[:, None]
    X_T = X.t()
    c = torch.mm(X, X_T)
    c *= 1. / fact
    return c.squeeze()

Now in numpy, this would transparently work with complex numbers, but I cannot simply feed a 3-d array with the last dimension being (real, imag) and hope it will work.
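For reference, plain numpy does handle the complex case directly (a small illustrative example, with arbitrary values):

import numpy as np

# np.cov accepts complex input and uses the conjugate transpose internally
M = np.array([[1 + 2j, 3 - 1j, 0 + 4j],
              [2 + 0j, 1 + 1j, 5 - 2j]])
print(np.cov(M))  # 2x2 complex covariance matrix of the two row-variables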

How can I adapt the calculation to obtain the complex covariance matrix with real and imaginary channels?


Answer 1:


[For a PyTorch implementation of cov() for complex matrices, skip the explanations and go straight to the last snippet.]


Description of cov() operation

Let M be an HxW matrix, where each of the H rows corresponds to a variable of W complex observations.

Now let cov(M) be the HxH covariance matrix of the H variables of M (the definition used by numpy.cov()). Assuming each row of M has already had its mean subtracted, it can be computed as follows (ignoring edge cases):

cov(M) = 1 / (W - 1) . M * M.H

with * the matrix multiplication operator and M.H the conjugate transpose of M (for real-valued data this is simply the transpose M.T).

Note: to clarify the next equations, let cov_prod(X, Y) = 1 / (W - 1) . X * Y.T for real-valued, mean-centered HxW matrices X and Y. For a real matrix M we thus have cov(M) = cov_prod(M, M).
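In PyTorch terms, cov_prod is just a scaled cross-product; a minimal sketch (cov_prod is the helper name from this answer, not a library function):

import torch

def cov_prod(x, y):
    # x, y: real HxW tensors whose rows are already mean-centered
    return x.mm(y.t()) / (x.size(1) - 1)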

So far, nothing new; this corresponds to the code you wrote (minus the edge-case checks). Let's double-check that the PyTorch implementation of this formula matches the numpy one for real-valued data:

import torch
import numpy as np

def cov(m, y=None):
    if y is not None:
        m = torch.cat((m, y), dim=0)
    m_exp = torch.mean(m, dim=1)
    x = m - m_exp[:, None]
    cov = 1 / (x.size(1) - 1) * x.mm(x.t())
    return cov

# Real-valued matrix:
M_np = np.random.rand(3, 2)  
# Same matrix as torch.Tensor:
M = torch.from_numpy(M_np)

cov_real_np = np.cov(M_np)
cov_real = cov(M)
eq = np.allclose(cov_real_np, cov_real.numpy())
print("Numpy & Torch real covariance results equal? > {}".format(eq))
# Numpy & PyTorch real covariance results equal? > True

Extension to complex matrices

Now, how does this work for complex matrices? From here on, let M be complex-valued, i.e. composed of H row-variables of W complex observations. Furthermore, let A and B be the real-valued matrices such that M = A + i.B.

I will not go into the mathematical demonstration, which you can find here thanks to @zimzam, but in that case cov(M) can be decomposed as:

cov(M) = [cov_prod(A, A) + cov_prod(B, B)] + i.[-cov_prod(A, B) + cov_prod(B, A)]

This makes it straightforward to compute separately the real and imaginary components of cov(M), given the real and imaginary components of M (A and B).
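For completeness, the expansion behind this decomposition is short. Writing X = M - (row means) = A + i.B, with A and B real and already mean-centered, and conj(X).T for the conjugate transpose that numpy.cov uses for complex input:

cov(M) = 1 / (W - 1) . X * conj(X).T
       = 1 / (W - 1) . (A + i.B) * (A.T - i.B.T)
       = 1 / (W - 1) . [(A*A.T + B*B.T) + i.(B*A.T - A*B.T)]
       = [cov_prod(A, A) + cov_prod(B, B)] + i.[-cov_prod(A, B) + cov_prod(B, A)]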


Implementation & Validation

Find below an optimized implementation:

import torch
import numpy as np

def cov_complex(m_comp):
    # (adding further parameters such as `y` is left as an exercise)
    # Supposing real and img are stored separately in the last dim:
    real, img = m_comp[..., 0], m_comp[..., 1] 
    x_real = real - torch.mean(real, dim=1)[:, None]
    x_img = img - torch.mean(img, dim=1)[:, None]
    x_real_T = x_real.t()
    x_img_T = x_img.t()
    frac = 1 / (x_real.size(1) - 1)

    cov_real = frac * (x_real.mm(x_real_T) + x_img.mm(x_img_T))
    cov_img = frac * (-x_real.mm(x_img_T) + x_img.mm(x_real_T))
    return torch.stack((cov_real, cov_img), dim=-1)

# Matrix with real/img values stored separately in last dimension:
M_np = np.random.rand(3, 2, 2)  
# Same matrix reinterpreted as a numpy complex array (complex128):
M_comp_np = M_np.view(dtype=np.complex128)[...,0]
# Same matrix as torch.Tensor:
M = torch.from_numpy(M_np)

cov_com_np = np.cov(M_comp_np)
cov_com = cov_complex(M)
eq = np.allclose(cov_com_np, cov_com.numpy().view(dtype=np.complex128)[...,0])
print("Numpy & Torch complex covariance results equal? > {}".format(eq))
# Numpy & PyTorch complex covariance results equal? > True
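As a side note, recent PyTorch releases ship native complex tensors, so the manual real/imaginary split can be avoided entirely. A minimal sketch, assuming a version that provides torch.view_as_complex and complex matrix multiplication (the function name cov_native_complex is just illustrative):

import torch

def cov_native_complex(m_comp):
    # Reinterpret the (..., 2) real/imag layout as a single complex tensor
    m = torch.view_as_complex(m_comp.contiguous())
    x = m - torch.mean(m, dim=1, keepdim=True)
    # Conjugate transpose, as numpy.cov does for complex input
    return 1 / (x.size(1) - 1) * x.mm(x.conj().t())

Its output can be compared against np.cov(M_comp_np) with np.allclose, just like above.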


Source: https://stackoverflow.com/questions/51416825/calculate-covariance-matrix-for-complex-data-in-two-channels-no-complex-data-ty
