[SciPy-User] Help optimizing an algorithm
Zachary Pincus
zachary.pincus@yale....
Thu Jan 31 18:00:21 CST 2013
> For example, if my exposure times are (10, 20, 25), and for a given pixel, map_coordinates tells me 1.5, then that means that that pixel's exposure time is 22.5 (halfway between the exposure times with indices 1 and 2).
>
> Let's say that my sampling is just 2 4x3 images:
>
> >>> a = np.arange(24.0).reshape(2,4,3)
> >>> a
> array([[[ 0.,  1.,  2.],
>         [ 3.,  4.,  5.],
>         [ 6.,  7.,  8.],
>         [ 9., 10., 11.]],
>
>        [[12., 13., 14.],
>         [15., 16., 17.],
>         [18., 19., 20.],
>         [21., 22., 23.]]])
>
> and I want to find the proper value for the (0, 0) pixel if its reported value was 6. What I want in this case is actually .5 (i.e. halfway between the two images). This is where I'm getting stuck, unfortunately. I'm missing some conversion or something, I think. Help would be appreciated.
Let's go back a few steps to make sure we're on the same page... You have a series of flat-field images acquired at different exposure times, which together define a per-pixel gain function, right? Then for each new image you want to calculate the "effective exposure time" for the count at a given pixel. Which is to say, the light input. Is this all correct?
So for each pixel, you are estimating the gain function f(exposure) -> value from your series of flat-field calibration images.
Because it's monotonic, you can invert this to g(value) -> exposure.
Then for any given value in an input image, you want to apply function g().
Again, is this all correct?
If so then you're almost there. The problem is one that you point out initially:
> I don't have uniform spacing for my sampling
Without uniform spacing, you can't convert a sample value into an index in the array without doing a search through the elements in the array to figure out where your value will fit. This is why Josef was talking about searchsorted() et al. So you need to first resample your gain function to have uniform sample spacing.
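(For concreteness, here's the bisection step searchsorted() does -- this is exactly the per-value search we want to avoid repeating for every pixel; the numbers are just the one-pixel example from below:)

```python
import numpy as np

# Non-uniformly spaced sample values (must be sorted for searchsorted).
values = np.array([100, 110, 200, 250, 275])

# Bisect to find where 260 would be inserted to keep the array sorted:
# values[3] = 250 < 260 < values[4] = 275, so the answer is 4.
i = np.searchsorted(values, 260)
print(i)  # 4
```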
Let's do a one-pixel case first:
import numpy
exposures = numpy.array([0., 1., 10., 20., 25.])
values = numpy.array([100., 110., 200., 250., 275.])
input_value = 260
Now, you could just use numpy.interp() to figure out the exposure time that is the linear interpolation from this:
output_exposure = numpy.interp(input_value, values, exposures)
Except that under the hood this does a linear search through the values array to find the nearest neighbors of input_value, and then does the standard linear interpolation. This is going to be slow to do for every pixel in an image, unless you code it in C or cython. (Which actually wouldn't be that bad.)
Instead let's resample the exposures and values to be uniform:
num_samples = 10
vmin, vmax = values.min(), values.max()
uniform_values = numpy.linspace(vmin, vmax, num_samples)
uniform_exposures = numpy.interp(uniform_values, values, exposures)
Note that we're still using numpy.interp() here: we still have to do the linear search! No free lunch. But we can do it just once and pre-compute the lookup table for a range of values, and then subsequently just calculate the correct index into it:
value_index = (num_samples - 1) * (input_value - vmin) / float(vmax - vmin)
Now we can do linear interpolation with map_coordinates():
exposure_estimate = scipy.ndimage.map_coordinates(uniform_exposures, [[value_index]], order=1)[0] # extra packing/unpacking just for scalar case.
Or just do the linear interpolation directly:
fraction, index = numpy.modf(value_index)
index = int(index)
l, h = uniform_exposures[[index, index + 1]]
exposure_estimate = h*fraction + l*(1 - fraction)
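(Here's the whole one-pixel pipeline in one runnable piece, with the same numbers as above. For this particular input the lookup-table answer matches numpy.interp() exactly, because both bracketing uniform samples happen to land inside a single original segment; in general the table is only an approximation whose accuracy depends on num_samples.)

```python
import numpy as np

exposures = np.array([0., 1., 10., 20., 25.])
values = np.array([100., 110., 200., 250., 275.])
input_value = 260.0

# Direct answer for reference: linear search + interpolation per lookup.
direct = np.interp(input_value, values, exposures)

# Precompute a uniformly spaced lookup table (the search happens here, once).
num_samples = 10
vmin, vmax = values.min(), values.max()
uniform_values = np.linspace(vmin, vmax, num_samples)
uniform_exposures = np.interp(uniform_values, values, exposures)

# At lookup time, index arithmetic replaces the search entirely.
value_index = (num_samples - 1) * (input_value - vmin) / (vmax - vmin)
fraction, index = np.modf(value_index)
index = int(index)
# Guard the input_value == vmax edge case, where index + 1 would overrun.
hi_index = min(index + 1, num_samples - 1)
lo, hi = uniform_exposures[index], uniform_exposures[hi_index]
estimate = hi * fraction + lo * (1 - fraction)

print(direct, estimate)  # both 22.0 for these numbers
```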
So you still need to loop through pixel by pixel and do numpy.interp(), but just once to get a uniformly spaced input array. Then you can use that for map_coordinates() as I described earlier. Remember that in the 2D case, you need to not only provide the appropriate value_index, but also the x- and y-indices, again as I described in the previous email. If you are still stuck, I'll write out example code equivalent to the above but for the 2d case.
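(In case it helps in the meantime, here's a rough sketch of that 2D version. The flat-field "data" below is fabricated from a simple per-pixel linear gain model just so there's something to run; the structure -- per-pixel interp loop to build the table, then one map_coordinates() call per image -- is the part that matters.)

```python
import numpy as np
from scipy import ndimage

# Fake calibration data: n_exp flat fields of shape (H, W) at these
# non-uniform exposure times, with a made-up linear gain per pixel.
exposures = np.array([0., 1., 10., 20., 25.])
H, W = 4, 3
rng = np.random.default_rng(0)
gain = rng.uniform(5.0, 15.0, size=(H, W))
offset = rng.uniform(90.0, 110.0, size=(H, W))
flats = offset + gain * exposures[:, None, None]   # shape (n_exp, H, W)

# Build the uniformly resampled inverse g(value) -> exposure, per pixel.
# This is the one slow, per-pixel numpy.interp() pass.
num_samples = 50
vmin = flats.min(axis=0)                           # (H, W)
vmax = flats.max(axis=0)
table = np.empty((num_samples, H, W))
for y in range(H):
    for x in range(W):
        uv = np.linspace(vmin[y, x], vmax[y, x], num_samples)
        table[:, y, x] = np.interp(uv, flats[:, y, x], exposures)

# Now any input image is one vectorized lookup: value_index along axis 0,
# plus the pixel's own y- and x-indices along the other two axes.
image = offset + gain * 17.3       # every pixel "saw" an exposure of 17.3
value_index = (num_samples - 1) * (image - vmin) / (vmax - vmin)
yy, xx = np.indices((H, W))
est = ndimage.map_coordinates(table, [value_index, yy, xx], order=1)
# est recovers ~17.3 at every pixel for this synthetic linear model.
```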
This all clear? I'm happy to explain anything in further detail!
Zach