[SciPy-User] Projecting volumes down to 2D

Christoph Gohlke cgohlke@uci....
Thu Sep 1 16:37:38 CDT 2011



On 9/1/2011 2:24 PM, Chris Weisiger wrote:
> On Thu, Sep 1, 2011 at 8:15 AM, Chris Weisiger <cweisiger@msg.ucsf.edu> wrote:
>> On Thu, Sep 1, 2011 at 1:34 AM, Christoph Gohlke <cgohlke@uci.edu> wrote:
>>>
>>> This looks like "maximum intensity projection" visualization
>>> <http://en.wikipedia.org/wiki/Maximum_intensity_projection>. MIP can be
>>> efficiently implemented using OpenGL by blending together multiple
>>> slices, oriented perpendicular to the projection direction, through a 3D
>>> texture (xyz data). Also consider VTK's vtkVolumeRayCastMIPFunction class.
>>
>> Interesting, and I didn't know that OpenGL could do that. However, I'd
>> already considered and rejected using 3D textures for the application
>> as a whole, because my image data can be so large -- upwards of
>> 512x512x60 for a single timepoint, and not only can there be many
>> timepoints, but users can also request projections through time. So we
>> could be talking gigabytes of texture data here. Currently this
>> program runs well on my rather underpowered laptop, and I'd like to
>> keep things that way if possible.
>
> Just to followup, the maximum size of a 3D texture on this laptop is
> only 128 pixels in any direction, so I'd have to do some nasty
> stitching together of texture blocks to use OpenGL to solve this
> problem. Nice idea, though. Of course, practically every computer here
> is more powerful than my laptop, but they don't always have unusually
> strong graphics cards, and I'd rather not restrict what computers my
> code can run on.
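
Incidentally, for an axis-aligned projection direction a MIP needs neither a
GPU nor the whole stack in memory; it is just a running maximum over slices.
A minimal NumPy sketch (the file name, dtype, and the 60x512x512 shape are
illustrative assumptions, not taken from your setup):

import numpy as np

# Illustrative assumption: one timepoint stored as a raw uint16 stack of
# shape (60, 512, 512) in "volume.dat"; np.memmap reads it lazily from disk.
vol = np.memmap("volume.dat", dtype=np.uint16, mode="r", shape=(60, 512, 512))

# Running maximum over Z: only one 512x512 slice plus the accumulator is
# resident in memory at any time.
mip = np.array(vol[0])
for z in range(1, vol.shape[0]):
    np.maximum(mip, vol[z], out=mip)

# mip now equals vol.max(axis=0), the 512x512 maximum intensity projection.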

Fair enough. Just wondering how your underpowered laptop can run well while 
processing multi-gigabyte volumes with ndimage.map_coordinates :)
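
For a projection direction that is not axis-aligned, one memory-bounded
variation (a sketch under assumptions, not necessarily what you are doing)
samples each slice at sheared ray positions with ndimage.map_coordinates and
folds the samples into the same running maximum; the tilted_mip name and the
shear parameters below are made up for illustration:

import numpy as np
from scipy import ndimage

def tilted_mip(volume, dy_per_z=0.0, dx_per_z=0.0, order=1):
    """Sketch of a MIP of a (Z, Y, X) volume along the tilted direction
    (1, dy_per_z, dx_per_z) in index units. Assumes non-negative intensities,
    since out-of-bounds samples are filled with 0."""
    nz, ny, nx = volume.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    mip = np.zeros((ny, nx))
    for z in range(nz):
        # (z, y, x) coordinates of this slice's ray samples, sheared per slice.
        coords = np.array([np.full((ny, nx), float(z)),
                           yy + dy_per_z * z,
                           xx + dx_per_z * z])
        sample = ndimage.map_coordinates(volume, coords, output=float,
                                         order=order, mode='constant', cval=0.0)
        np.maximum(mip, sample, out=mip)
    return mip

With zero shears this reproduces the slice-by-slice maximum above (as floats);
nonzero dy_per_z and dx_per_z tilt the projection direction away from the Z
axis, at the cost of one map_coordinates call per slice.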

Christoph

>
> -Chris

