I have a NumPy problem that I've been working on for three hours. It's a five-part problem and I've figured out four of the five parts, but this last one has me stuck. Given a 3-dimensional array called "X", how would you find the index of the row with the smallest standard deviation in each layer?
So far I have this:
min_std_row_layer = X.std(axis = "").argmin()
But I don't know if that's even a good starting point.
CodePudding user response:
Let's say the first dimension of X is the layer, the second is the row, and the third is the column.
There is a very compact way to do this:
print(np.argmin(np.std(X, 2), 1))
The 2 in np.std(X, 2) specifies that the standard deviation is taken along the third axis (the axes are numbered 0, 1, 2), i.e. across the columns within each row. That collapses each row to a single number, giving one standard deviation per row in every layer, so the call returns a 2-D array with shape (num_layers, num_rows).
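As a quick illustration of the shapes (the array here is hypothetical, just for demonstration):
import numpy as np

# Hypothetical array: 3 layers, 4 rows, 5 columns.
X = np.arange(60, dtype=float).reshape(3, 4, 5)
row_stds = np.std(X, 2)   # one standard deviation per row, per layer
print(row_stds.shape)     # (3, 4) -> (num_layers, num_rows)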
We now need to find, within each layer, the index of the row with the smallest of those standard deviations:
np.argmin(np.std(X, 2), 1)
The 1 tells argmin to search along axis 1, the row axis, within each layer. The final result is a 1-D array of length num_layers in which each element is the index of the row in that layer of X with the smallest standard deviation.
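Putting it together, here is a minimal end-to-end sketch; the seeded random array and the variable name min_std_rows are made up for illustration:
import numpy as np

rng = np.random.default_rng(0)   # seeded so the output is reproducible
X = rng.normal(size=(3, 4, 5))   # hypothetical shape: 3 layers, 4 rows, 5 columns

min_std_rows = np.argmin(np.std(X, 2), 1)
print(min_std_rows)              # length-3 array: one row index per layer

# Sanity check on layer 0: the reported row really has the smallest std.
assert np.isclose(np.std(X[0, min_std_rows[0]]), np.std(X[0], axis=1).min())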