Error while creating object from templated class


Question


I've been trying to find a way to sample random vectors from a multivariate normal distribution in C++, given both the mean vector and the covariance matrix, much like Matlab's mvnrnd function works. I found relevant code for a class that implements this on this page, but I've been having some problems compiling it. I've created a header file that is included in my main.cpp, and I'm trying to create an object of the EigenMultivariateNormal class:

MatrixXd MN(10,1);
MatrixXd CVM(10,10);

EigenMultivariateNormal <double,int> (&MN,&CVM) mvn;

The problem is I get a template-related error when compiling:

error: type/value mismatch at argument 2 in template parameter list for ‘template<class _Scalar, int _size> class EigenMultivariateNormal’    
error:   expected a constant of type ‘int’, got ‘int’    
error: expected ‘;’ before ‘mvn’

I only have a superficial idea of how to work with templates, and I am by no means a C++ expert, so I was wondering what exactly I am doing wrong? Apparently I should have a const somewhere in my code.


Answer 1:


template<class _Scalar, int _size> class EigenMultivariateNormal is a templated class. The first parameter, class _Scalar, asks for a type, but int _size asks for an int that is a compile-time constant.

You should instantiate it with a constant int instead of the type int as you did. Secondly, your syntax for creating an instance of EigenMultivariateNormal is wrong. Try this instead:

EigenMultivariateNormal<double, 10> mvn (&MN, &CVM); // where 10 is the size
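
The same rule applies to any non-type template parameter: the argument must be a constant expression known at compile time. Here is a minimal sketch illustrating the rule, using a hypothetical Foo class rather than the class from the question:

// Hypothetical example: a non-type template parameter requires a compile-time constant.
template<class Scalar, int Size>
struct Foo {};

int main()
{
    Foo<double, 10> a;        // OK: 10 is a constant expression
    const int n = 10;         // (or constexpr in C++11)
    Foo<double, n>  b;        // OK: n is a compile-time constant
    int m = 10;
    // Foo<double, m> c;      // error: m is not a constant expression
    (void)a; (void)b; (void)m;
    return 0;
}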



Answer 2:


That code's a bit old. Here's a newer, possibly improved version. There are probably still some bad things; for example, I think it should be changed to use MatrixBase instead of an actual Matrix, which might let it optimize and better decide when it needs to allocate storage space. It also uses the internal namespace, which is probably frowned upon, but that seems necessary to make use of Eigen's NullaryExpr, which seems like the right tool for the job. There's usage of the dreaded mutable keyword; that's necessary because of what Eigen thinks should be const when used in a NullaryExpr. It's also a little annoying to rely on boost, since in C++11 the necessary functions have become standard. Below the class code, there's a short usage sample.
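
As a side note, since the boost dependency is only needed for the random-number pieces, here is a minimal sketch of what the functor might look like using the C++11 <random> facilities instead (this assumes a C++11 compiler; the rest of the class below would stay unchanged):

#include <random>

namespace Eigen {
namespace internal {
template<typename Scalar>
struct scalar_normal_dist_op
{
  static std::mt19937 rng;                        // The uniform pseudo-random engine
  mutable std::normal_distribution<Scalar> norm;  // The gaussian combinator

  EIGEN_EMPTY_STRUCT_CTOR(scalar_normal_dist_op)

  template<typename Index>
  inline const Scalar operator() (Index, Index = 0) const { return norm(rng); }
};

template<typename Scalar>
std::mt19937 scalar_normal_dist_op<Scalar>::rng;
} // end namespace internal
} // end namespace Eigen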

The class eigenmultivariatenormal.hpp

#ifndef EIGENMULTIVARIATENORMAL_HPP
#define EIGENMULTIVARIATENORMAL_HPP

#include <Eigen/Dense>
#include <boost/random/mersenne_twister.hpp>
#include <boost/random/normal_distribution.hpp>    

/*
  We need a functor that can pretend it's const,
  but to be a good random number generator 
  it needs mutable state.  The standard Eigen function 
  Random() just calls rand(), which changes a global
  variable.
*/
namespace Eigen {
namespace internal {
template<typename Scalar> 
struct scalar_normal_dist_op 
{
  static boost::mt19937 rng;                        // The uniform pseudo-random algorithm
  mutable boost::normal_distribution<Scalar> norm;  // The gaussian combinator

  EIGEN_EMPTY_STRUCT_CTOR(scalar_normal_dist_op)

  template<typename Index>
  inline const Scalar operator() (Index, Index = 0) const { return norm(rng); }
};

template<typename Scalar> 
boost::mt19937 scalar_normal_dist_op<Scalar>::rng;

template<typename Scalar>
struct functor_traits<scalar_normal_dist_op<Scalar> >
{ enum { Cost = 50 * NumTraits<Scalar>::MulCost, PacketAccess = false, IsRepeatable = false }; };

} // end namespace internal
/**
    Find the eigen-decomposition of the covariance matrix
    and then store it for sampling from a multi-variate normal 
*/
template<typename Scalar, int Size>
class EigenMultivariateNormal
{
  Matrix<Scalar,Size,Size> _covar;
  Matrix<Scalar,Size,Size> _transform;
  Matrix< Scalar, Size, 1> _mean;
  internal::scalar_normal_dist_op<Scalar> randN; // Gaussian functor


public:
  EigenMultivariateNormal(const Matrix<Scalar,Size,1>& mean,const Matrix<Scalar,Size,Size>& covar)
  {
    setMean(mean);
    setCovar(covar);
  }

  void setMean(const Matrix<Scalar,Size,1>& mean) { _mean = mean; }
  void setCovar(const Matrix<Scalar,Size,Size>& covar) 
  {
    _covar = covar;

    // Assuming that we'll be using this repeatedly,
    // compute the transformation matrix that will
    // be applied to unit-variance independent normals

    /*
    Eigen::LDLT<Eigen::Matrix<Scalar,Size,Size> > cholSolver(_covar);
    // We can only use the cholesky decomposition if 
    // the covariance matrix is symmetric, pos-definite.
    // But a covariance matrix might be pos-semi-definite.
    // In that case, we'll go to an EigenSolver
    if (cholSolver.info()==Eigen::Success) {
      // Use cholesky solver
      _transform = cholSolver.matrixL();
    } else {*/
      SelfAdjointEigenSolver<Matrix<Scalar,Size,Size> > eigenSolver(_covar);
      _transform = eigenSolver.eigenvectors()*eigenSolver.eigenvalues().cwiseMax(0).cwiseSqrt().asDiagonal();
    /*}*/

  }

  /// Draw nn samples from the gaussian and return them
  /// as columns in a Size by nn matrix
  Matrix<Scalar,Size,Dynamic> samples(int nn)
  {
    return (_transform * Matrix<Scalar,Size,Dynamic>::NullaryExpr(Size,nn,randN)).colwise() + _mean;
  }
}; // end class EigenMultivariateNormal
} // end namespace Eigen
#endif

Here's a simple program that uses it:

#include <fstream>
#include "eigenmultivariatenormal.hpp"
#ifndef M_PI
#define M_PI 3.1415926535897932384626433832795029
#endif

/**
  Take a pair of un-correlated variances.
  Create a covariance matrix by correlating 
  them, sandwiching them in a rotation matrix.
*/
Eigen::Matrix2d genCovar(double v0,double v1,double theta)
{
  Eigen::Matrix2d rot = Eigen::Rotation2Dd(theta).matrix();
  return rot*Eigen::DiagonalMatrix<double,2,2>(v0,v1)*rot.transpose();
}

int main()
{
  Eigen::Vector2d mean;
  Eigen::Matrix2d covar;
  mean << -1,0.5; // Set the mean
  // Create a covariance matrix
  // Much wider than it is tall
  // and rotated clockwise by a bit
  covar = genCovar(3,0.1,M_PI/5.0);

  // Create a bivariate gaussian distribution of doubles.
  // with our chosen mean and covariance
  Eigen::EigenMultivariateNormal<double,2> normX(mean,covar);
  std::ofstream file("samples.txt");

  // Generate some samples and write them out to file 
  // for plotting
  file << normX.samples(1000).transpose() << std::endl;
}

And here's a plot showing the results.

[Plot: two 2D Gaussian distributions]

Using the SelfAdjointEigenSolver is probably a lot slower than a Cholesky decomposition, but it is stable even if the covariance matrix is singular. If you know that your covariance matrices will always be full rank, then you could use the Cholesky decomposition instead. However, if you create the distribution rarely and sample from it often, then that's probably not a big deal.
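
For reference, here is a sketch of what setCovar could look like if you want to try the Cholesky path first and only fall back to the eigen-decomposition when it fails, along the lines of the commented-out code in the class above (an assumption about a possible variant, not the version the class actually uses):

  void setCovar(const Matrix<Scalar,Size,Size>& covar)
  {
    _covar = covar;

    // The Cholesky factorization only succeeds when the covariance
    // matrix is symmetric positive-definite.
    LLT<Matrix<Scalar,Size,Size> > cholSolver(_covar);
    if (cholSolver.info() == Success) {
      _transform = cholSolver.matrixL();
    } else {
      // Fall back to the eigen-decomposition, which also handles
      // positive semi-definite (singular) covariance matrices.
      SelfAdjointEigenSolver<Matrix<Scalar,Size,Size> > eigenSolver(_covar);
      _transform = eigenSolver.eigenvectors()
                 * eigenSolver.eigenvalues().cwiseMax(0).cwiseSqrt().asDiagonal();
    }
  }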



Source: https://stackoverflow.com/questions/16361226/error-while-creating-object-from-templated-class
