Eigen tensor reshape then broadcast results in gibberish numbers when going from rank 0 to rank 1 tensor

Time:10-22

I have the following code, which finds the maximum element of a rank 1 tensor (reducing it to a rank 0 tensor) and then broadcasts that maximum back out to the full length of the rank 1 tensor, so I can use it in further computations involving the original tensor.

// reduces the rank 1 tensor to a rank 0 tensor
Tensor<double,0> columnmaximum = input_tensor.maximum(this->imposed_dim).eval();

std::cout << "colmax is\n" << columnmaximum << std::endl;

this->columnbroadcast = Eigen::array<int,1> ({M});
this->rank1base = Eigen::array<int,1> ({1});

// expands it back out to a full column of length M
Tensor<double,1> columnmaximum_rk2 = columnmaximum.reshape(this->rank1base).broadcast(this->columnbroadcast);
std::cout << "colmaxrk2 is\n" << columnmaximum_rk2 << std::endl;

Running it, I noticed the following strange output:

colmax is
-2
colmaxrk2 is
          -2
    0.238402
3.91433e-310
    -3.33086
          -2

Something went wrong when broadcasting. My idea was to elevate the rank 0 tensor to a rank 1 tensor (of length one), and then broadcast in the single dimension to replicate the maximum as many times as I need to be able to subtract it from something else.

What is going wrong with those three numbers in the middle when printing the enlarged tensor? I know that in this special case I could use the setConstant method, but I would like to use the reshape-then-broadcast trick for higher-dimensional tensors as well, where the summary statistic is less trivial to replicate, e.g. a rank 2 tensor reduced along one dimension.

Can anyone explain to me where these nonsensical numbers come from? Am I making a basic mistake? The absurdly small number looks a bit like uninitialized memory to me.

Thank you so much!

CodePudding user response:

It appears to be a bug in Eigen 3.4.0 related to tensors of dimension zero. Reproducible example (godbolt):

#include <iostream>
#include <Eigen/../unsupported/Eigen/CXX11/Tensor>

int main()
{
  Eigen::Tensor<double, 3> MaxTest(4, 4, 4);
  MaxTest.setRandom();
  Eigen::Tensor<double, 0> columnmaximum = MaxTest.maximum();

  std::cout << "colmax is\n" << columnmaximum << std::endl;

  Eigen::array<Eigen::Index, 1> columnbroadcast({6});
  Eigen::array<Eigen::Index, 1> rank1base({1});

  Eigen::Tensor<double, 1> columnmaximum_rk2 = columnmaximum.reshape(rank1base).broadcast(columnbroadcast);
  std::cout << "colmaxrk2 is\n" << columnmaximum_rk2 << std::endl;
}

If the debug checks in the standard library are enabled (-D_GLIBCXX_DEBUG for gcc's libstdc++), the evaluation of the assignment (operator=()) to columnmaximum_rk2 fails with an assertion. Without the debug checks, it prints random values, as shown in the original post.

Putting an eval() between the reshape() and the broadcast() prevents the issue (godbolt):

Eigen::Tensor<double, 1> columnmaximum_rk2 =
    columnmaximum.reshape(rank1base).eval().broadcast(columnbroadcast);

Interestingly, Eigen trunk is no longer affected by the issue (godbolt); it appears to have been fixed since Eigen 3.4.0. Indeed, there is one commit that at least touches the tensor broadcast code (though I am not sure whether that commit specifically fixes this issue). Thus, another workaround could be to use the current trunk version.
