Is there a faster way to compute np.convolve(A, B)[::N] in NumPy? It feels wasteful to compute the full convolution and then throw N - 1 out of every N values away. I could use a for loop or a list comprehension, but I assumed it would be faster to stick to native NumPy methods.
EDIT
Or does NumPy do lazy evaluation? I just saw this in a JS library; something similar would be awesome for NumPy as well:
// Get first 3 unique values
const arr = [1, 2, 2, 3, 3, 4, 5, 6];
const result = R.pipe(
  arr,
  R.map(x => {
    console.log('iterate', x);
    return x;
  }),
  R.uniq(),
  R.take(3)
); // => [1, 2, 3]
/**
 * Console output:
 * iterate 1
 * iterate 2
 * iterate 2
 * iterate 3
 */
CodePudding user response:
A convolution multiplies the (flipped) kernel elementwise with a window on your array, then sums. Since B here is symmetric, the flip is a no-op and you can achieve the same result manually using a rolling window.
First, let's set up a dummy example:
A = np.arange(30)
B = np.ones(6)
N = 3
out = np.convolve(A, B)[::N]
print(out)
output: [ 0. 6. 21. 39. 57. 75. 93. 111. 129. 147. 135. 57.]
Now we do the same with a rolling view, padding, and slicing:
from numpy.lib.stride_tricks import sliding_window_view as swv
out = (swv(np.pad(A, B.shape[0]-1), B.shape[0])[::N]*B).sum(axis=1)
print(out)
output: [ 0. 6. 21. 39. 57. 75. 93. 111. 129. 147. 135. 57.]
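One caveat worth hedging: np.convolve reverses the kernel, while the rolling-window product above does not. That only matters for a non-symmetric B (the all-ones kernel above hides it). A minimal sketch with a made-up asymmetric kernel, flipping B to match np.convolve:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view as swv

A = np.arange(30)
B = np.array([1.0, 2.0, 3.0])  # hypothetical non-symmetric kernel
N = 3

# np.convolve flips the kernel, so the windowed version must use B[::-1]
reference = np.convolve(A, B)[::N]
windowed = (swv(np.pad(A, B.shape[0] - 1), B.shape[0])[::N] * B[::-1]).sum(axis=1)

print(np.allclose(reference, windowed))  # True
```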
Intermediate sliding view:
swv(np.pad(A, B.shape[0]-1), B.shape[0])
array([[ 0, 0, 0, 0, 0, 0],
[ 0, 0, 0, 0, 0, 1],
[ 0, 0, 0, 0, 1, 2],
[ 0, 0, 0, 1, 2, 3],
[ 0, 0, 1, 2, 3, 4],
[ 0, 1, 2, 3, 4, 5],
[ 1, 2, 3, 4, 5, 6],
[ 2, 3, 4, 5, 6, 7],
...
[24, 25, 26, 27, 28, 29],
[25, 26, 27, 28, 29, 0],
[26, 27, 28, 29, 0, 0],
[27, 28, 29, 0, 0, 0],
[28, 29, 0, 0, 0, 0],
[29, 0, 0, 0, 0, 0]])
# with slicing
swv(np.pad(A, B.shape[0]-1), B.shape[0])[::N]
array([[ 0, 0, 0, 0, 0, 0],
[ 0, 0, 0, 1, 2, 3],
[ 1, 2, 3, 4, 5, 6],
[ 4, 5, 6, 7, 8, 9],
[ 7, 8, 9, 10, 11, 12],
[10, 11, 12, 13, 14, 15],
[13, 14, 15, 16, 17, 18],
[16, 17, 18, 19, 20, 21],
[19, 20, 21, 22, 23, 24],
[22, 23, 24, 25, 26, 27],
[25, 26, 27, 28, 29, 0],
[28, 29, 0, 0, 0, 0]])
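The whole approach can be wrapped in a small helper (the name strided_convolve is my own; it assumes you want every N-th sample of the full-mode output, like np.convolve(A, B)[::N]):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view as swv

def strided_convolve(A, B, N):
    """Every N-th sample of the full convolution of A and B,
    via a strided sliding window (kernel flipped to match np.convolve)."""
    pad = B.shape[0] - 1
    windows = swv(np.pad(A, pad), B.shape[0])[::N]
    return windows @ B[::-1]

A = np.arange(30)
B = np.ones(6)
print(strided_convolve(A, B, 3))
# [  0.   6.  21.  39.  57.  75.  93. 111. 129. 147. 135.  57.]
```

Because the slicing happens before the multiply-and-sum, only 1/N of the window products are ever computed, which is where the saving over np.convolve(A, B)[::N] comes from.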