I want to calculate the sum of the elements around a given element in a 3-D array — for example, the sum of all neighboring elements within 5 units along each of the x, y, z directions. I wrote a loop to do this: the function below computes the mean of a block of the 3-D array. The shape of the array is (159, 191, 159).
It works, but because it will be called inside another loop, I want it to run at least an order of magnitude faster.
How can I use NumPy (or any other approach) to make this more efficient — for example, some kind of conditional np.sum()? Can anyone give me a simple, efficient example of computing this mean?
def patch_mean(coordinate_x, coordinate_y, coordinate_z, image, patch_radius):
    total = 0  # running sum (renamed to avoid shadowing the built-in sum)
    count = 0
    for a in range(coordinate_x - patch_radius, coordinate_x + patch_radius):
        for b in range(coordinate_y - patch_radius, coordinate_y + patch_radius):
            for c in range(coordinate_z - patch_radius, coordinate_z + patch_radius):
                # stay inside the (159, 191, 159) array and skip zero voxels
                if 0 <= a < 159 and 0 <= b < 191 and 0 <= c < 159:
                    if image[a][b][c] != 0:
                        total = total + image[a][b][c]
                        count = count + 1
    if count == 0:
        mean = 0
    else:
        mean = total / count
    return mean
CodePudding user response:
You can use a convolution approach.
(However, I am not sure about its performance.)
Here is a simple example for a 2-D array. This example is adapted from the following two posts:
In numpy, how to efficiently list all fixed-size submatrices?
Convolve2d just by using Numpy
import numpy as np
from numpy.lib.stride_tricks import as_strided
data = np.arange(48).reshape(6, 8)
data =
[[ 0 1 2 3 4 5 6 7]
[ 8 9 10 11 12 13 14 15]
[16 17 18 19 20 21 22 23]
[24 25 26 27 28 29 30 31]
[32 33 34 35 36 37 38 39]
[40 41 42 43 44 45 46 47]]
mean_filter_shape = (3, 4)
data_new_shape = tuple(np.subtract(data.shape, mean_filter_shape) + 1) + mean_filter_shape
data_new = as_strided(data, data_new_shape, data.strides * 2)
data_new =
[[[[ 0 1 2 3]
[ 8 9 10 11]
[16 17 18 19]]
...
[[28 29 30 31]
[36 37 38 39]
[44 45 46 47]]]]
mean_filter = np.ones(mean_filter_shape)
data_mean = np.einsum('ij,klij->kl', mean_filter, data_new) / np.prod(mean_filter_shape)
data_mean =
[[ 9.5 10.5 11.5 12.5 13.5]
[17.5 18.5 19.5 20.5 21.5]
[25.5 26.5 27.5 28.5 29.5]
[33.5 34.5 35.5 36.5 37.5]]
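For the 3-D case in the question, the same windowing idea is easier to express with numpy.lib.stride_tricks.sliding_window_view (available in NumPy >= 1.20, which wraps the as_strided trick safely). A minimal sketch on a small stand-in array:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

data = np.arange(8000, dtype=float).reshape(20, 20, 20)  # small stand-in for the (159, 191, 159) image
w = 11  # window edge length = 2*patch_radius + 1 for patch_radius = 5

# read-only view of every w*w*w window; shape (10, 10, 10, 11, 11, 11), no data copied
windows = sliding_window_view(data, (w, w, w))

# averaging over the last three (window) axes gives one mean per window position
means = windows.mean(axis=(-3, -2, -1))
```

Note that reducing over the window axes materializes the computation, so memory stays bounded even though the view itself describes many overlapping windows.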
CodePudding user response:
You can use scipy.signal.convolve with a numpy.ones kernel.
import numpy as np
from scipy.signal import convolve
data = np.random.random((159,191,159))
patch_radius = 5
kernel = np.ones((2*patch_radius + 1, 2*patch_radius + 1, 2*patch_radius + 1))
# convolve alone gives the patch sums; divide by the kernel size for the mean
data_mean = convolve(data, kernel, mode='same') / kernel.sum()
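One caveat: the original loop also skips zero voxels, which a plain convolution does not. A sketch of that variant (variable names here are illustrative) divides a sum convolution by a count convolution, so each position is averaged only over its nonzero neighbors:

```python
import numpy as np
from scipy.signal import convolve

rng = np.random.default_rng(0)
data = rng.random((20, 20, 20))  # small stand-in for the real image
patch_radius = 2
kernel = np.ones((2 * patch_radius + 1,) * 3)

# numerator: sum of values in each patch
sums = convolve(data, kernel, mode='same')
# denominator: number of nonzero voxels in each patch
counts = convolve((data != 0).astype(float), kernel, mode='same')

# mean over nonzero neighbors; 0 where a patch contains no nonzero voxels
means = np.divide(sums, counts, out=np.zeros_like(sums), where=counts > 0.5)
```

Near the array edges, mode='same' zero-pads, so counts automatically shrinks there and the division still yields the mean over the voxels actually inside the array.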