[Numpy-discussion] reduce array by computing min/max every n samples
Thu Jun 17 16:50:24 CDT 2010
I have a 1D array with >100k samples that I would like to reduce by
computing the min/max of each "chunk" of n samples. Right now, my
code is as follows:
n = 100
offset = array.size % n
array_min = array[offset:].reshape((-1, n)).min(-1)
array_max = array[offset:].reshape((-1, n)).max(-1)
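For reference, here is the snippet above as a self-contained, runnable sketch (the random data is a stand-in for the streamed samples; `offset` trims the leading remainder so the length divides evenly by the chunk size):

```python
import numpy as np

# Stand-in for the real-time hardware samples (assumption for illustration).
rng = np.random.default_rng(0)
array = rng.standard_normal(100_000)

n = 100                      # samples per chunk
offset = array.size % n      # drop leading samples so size is a multiple of n
chunks = array[offset:].reshape(-1, n)  # reshape is a view, so this is cheap

array_min = chunks.min(axis=-1)
array_max = chunks.max(axis=-1)
```

Computing the reshape once and reusing it for both reductions avoids building the intermediate view twice.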
However, this appears to run rather slowly. The array holds data
streamed in real time from external hardware devices, and I need to
downsample it and compute the min/max for plotting. I'd like to speed
this up so that I can plot updates as quickly as new data comes in.
Are there recommendations for faster ways to perform the downsampling?