Say I had a numpy array A
of shape (55, 50, 2)
, and I would like to do the following operation
B = np.dot(A[0, :, 0][:, None], A[0, :, 0][None, :])
i.e. compute the outer product of each row with itself (over the indices i and k).
Without using np.einsum
(and, of course, without any for loops), how can this operation be done with pure broadcasting and reshaping (if needed)?
Note:
I'm tagging eigen3
here because essentially I want to rewrite this operation with Eigen::Tensor
in terms of .broadcast()
and .reshape()
(it's easier for me to write it out in numpy
first to get the general picture). So a direct Eigen
solution would be much appreciated, but since Python, and hence numpy, is more popular, I would accept a numpy solution as well.
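For concreteness, here is a sketch of the loop version I am trying to avoid (assuming `A` has shape (55, 50, 2) as above):

```python
import numpy as np

A = np.random.rand(55, 50, 2)

# Loop version to avoid: for each batch index i and channel k,
# take the length-50 column A[i, :, k] and form its 50x50 outer product.
B_loop = np.empty((55, 50, 50, 2))
for i in range(A.shape[0]):
    for k in range(A.shape[2]):
        col = A[i, :, k]
        B_loop[i, :, :, k] = np.outer(col, col)

# The einsum formulation of the same operation:
assert np.allclose(B_loop, np.einsum('ijk,ilk->ijlk', A, A))
```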
CodePudding user response:
So what you need is an outer product for each row.
If you are using C++ Eigen, it is better to write a loop over rows and do a matrix multiplication for each; you can dispatch to BLAS for performance.
If you want a way using Numpy, a messy solution is

B = A[..., np.newaxis].transpose(0, 2, 1, 3)
C = B @ B.transpose(0, 1, 3, 2)  # batched matrix multiplication
C = C.transpose(0, 3, 2, 1)
np.array_equal(np.einsum('ijk,ilk->ijlk', A, A), C)  # check that both are identical
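A self-contained check of the matmul route, plus a pure-broadcasting variant (my own sketch, not part of the original answer): inserting singleton axes so that the row index appears twice lets an elementwise multiply form every pairwise product directly.

```python
import numpy as np

A = np.random.rand(55, 50, 2)

# Matmul route: move the channel axis next to the batch axis, then
# multiply a (50, 1) column by a (1, 50) row for each (batch, channel).
B = A[..., np.newaxis].transpose(0, 2, 1, 3)   # shape (55, 2, 50, 1)
C = B @ B.transpose(0, 1, 3, 2)                # shape (55, 2, 50, 50)
C = C.transpose(0, 3, 2, 1)                    # shape (55, 50, 50, 2)

# Pure-broadcasting alternative: singleton axes at positions 2 and 1
# broadcast against each other, giving A[i, j, k] * A[i, l, k].
D = A[:, :, None, :] * A[:, None, :, :]        # shape (55, 50, 50, 2)

assert np.allclose(np.einsum('ijk,ilk->ijlk', A, A), D)
assert np.allclose(C, D)
```

The broadcasting form maps more directly onto Eigen::Tensor's `.reshape()` and `.broadcast()` than the matmul route does, since it is just a reshape of each operand followed by an elementwise product.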