How to use the eigen decomposition in numpy?


In the following algorithm, weights is a 2017x2017 symmetric matrix of rank 7.

I'm trying to do something with the columns of the eigendecomposition, but I can't even get them to sum back to the original matrix. neoweights1 equals weights, but neoweights comes out as something totally different. What's the error here?

w,v = np.linalg.eig(weights)

neoweights1 = w2 * (v2 @ v2.T)
neoweights = np.zeros((matsize,matsize))
for i in range(7):
    neoweights += np.real(w2[i] * (v2[i] @ v2[i].T))

CodePudding user response:

Your example doesn't show where w2 and v2 are coming from, but I guess they are also the result of np.linalg.eig. In that case, note that each column in v2 is an eigenvector, but v2[i] gives you a row.

Following that, v2[i] @ v2[i].T is a vector dot product (.T is a no-op on a 1-D array), so it produces a scalar, and the whole expression neoweights += np.real(w2[i] * (v2[i] @ v2[i].T)) just adds that scalar to every element of the matrix.
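Here is a minimal sketch of that distinction, using a small random symmetric matrix as a stand-in for weights:

import numpy as np

rng = np.random.default_rng(0)
m = rng.standard_normal((5, 5))
A = (m + m.T) / 2                     # small symmetric stand-in for weights

w, v = np.linalg.eig(A)

# Each COLUMN of v is an eigenvector: A @ v[:, i] == w[i] * v[:, i]
i = 0
print(np.allclose(A @ v[:, i], w[i] * v[:, i]))   # True

# v[i] is a ROW, a 1-D array; .T does nothing to it, so
# v[i] @ v[i].T is a plain dot product: a scalar, not a matrix.
print((v[i] @ v[i].T).shape)                      # () -- zero-dimensional
print(np.outer(v[:, i], v[:, i]).shape)           # (5, 5) -- the rank-1 piece you want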

This is what you want instead:

neoweights = sum(np.real(np.outer(w[i] * v[:,i], v[:,i])) for i in range(7))
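One caveat: np.linalg.eig returns the eigenvalues in no particular order, so range(7) only picks up the right terms if the seven nonzero eigenvalues happen to come first. A sketch (on a synthetic rank-7 matrix, since the original weights isn't available) that sorts by magnitude before truncating:

import numpy as np

rng = np.random.default_rng(0)
n, rank = 50, 7
b = rng.standard_normal((n, rank))
weights = b @ b.T                     # symmetric, rank 7 by construction

w, v = np.linalg.eig(weights)

# Pick the 7 largest-magnitude eigenvalues explicitly instead of the first 7.
top = np.argsort(-np.abs(w))[:rank]
neoweights = sum(np.real(np.outer(w[i] * v[:, i], v[:, i])) for i in top)

print(np.allclose(neoweights, weights))           # True

Since weights is symmetric, np.linalg.eigh is also worth a look: it guarantees real output sorted in ascending order, so no np.real is needed.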

And for the matrix expression, you have to be careful about when and how you do the multiplication with the eigenvalues. This should work: np.real(w * v @ v.T)
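In case the grouping is surprising: * and @ have equal precedence in Python and associate left to right, so w * v @ v.T parses as (w * v) @ v.T, where w * v scales column i of v by w[i]. That is the usual V @ diag(w) @ V.T reconstruction. A quick check on the same kind of small symmetric stand-in:

import numpy as np

rng = np.random.default_rng(0)
m = rng.standard_normal((5, 5))
A = (m + m.T) / 2

w, v = np.linalg.eig(A)

full = np.real(w * v @ v.T)           # parses as (w * v) @ v.T
explicit = v @ np.diag(w) @ v.T       # the textbook V diag(w) V^T form

print(np.allclose(full, A))           # True -- reconstructs the matrix
print(np.allclose(full, explicit))    # True -- same computation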
