[Numpy-discussion] is numerical python only for prototyping ?
perry at stsci.edu
Tue Jul 26 10:21:26 CDT 2005
On Jul 26, 2005, at 12:41 PM, Sebastian Haase wrote:
> This is not supposed to be an evil question; instead I'm hoping for the
> answer: "No, generally we get >=95% the speed of a pure C/fortran
> implementation" ;-)
> But as I am the strongest Python/numarray advocate in our group I get
> the answer that Matlab is (of course) also very convenient, but its
> memory handling and overall execution performance are so bad that for a
> final implementation one would generally have to reimplement in C.
> We are a bio-physics group at UCSF developing new algorithms for
> deconvolution (often in 3D). Our data sets are regularly bigger than
> 100MB. When deciding for numarray I was assuming that the "Hubble
> Crowd" had a similar situation and that all the operations are
> therefore well suited for this type of data.
> Is 95% a reasonable number to hope for? I did wrap my own version
> (with "plan-caching"), which should give 100% of the C speed. But
> concerns arise from expressions like "a=b+c*a" (think "convenience"!):
> if a, b, c are each 3D data stacks, creating temporary data arrays for
> 'c*a' AND then again for 'b+...' would be very costly. (I think this
> is at least the case for Numeric - I don't know about Matlab and
> numarray.)
Is it speed or memory usage you are worried about? Where are you
actually seeing unacceptable performance?
Offhand, I would say the temporaries are not likely to be serious speed
issues (unless you are running out of memory). We did envision at some
point (though we haven't done it yet) recognizing situations where the
temporaries could be reused.
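To illustrate the temporaries question: an expression like "a=b+c*a"
allocates an intermediate array for 'c*a' and another for the sum. A
minimal sketch of how one can avoid those allocations today is to use
the ufuncs' output-argument form (shown here with NumPy's `out=`
keyword; the array shapes and values are illustrative, not from the
original post):

```python
import numpy as np

# "a = b + c*a" builds two temporaries: one for c*a, one for b + (c*a).
# Writing the result in place via ufunc output arguments avoids
# allocating any intermediate arrays for the large 3D stacks.
a = np.ones((4, 4, 4))
b = np.full((4, 4, 4), 2.0)
c = np.full((4, 4, 4), 3.0)

np.multiply(c, a, out=a)  # a <- c * a, computed in place
np.add(b, a, out=a)       # a <- b + (c * a), still in place

# Every element is now 2 + 3*1 = 5
print(a[0, 0, 0])  # -> 5.0
```

The trade-off is readability: the in-place form spells out the
evaluation order explicitly, which is exactly the "convenience" that
the infix expression hides.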
As for 95% speed, it's not what's required for our work (I think that
what an acceptable speed ratio is depends on the problem). In many
cases being within 50% is good enough, except for heavily used things
where it would need to be faster. But yes, we do plan on using it (and
indeed already do) for large data sets where speed is important. We
generally don't compare the numarray speed to C speed all the time (if
we were writing C equivalents every time, that would defeat the purpose
of using Python :-). Perhaps you could give a more specific example
with some measurements? I don't think I would promise anyone that all
one's code could be done in Python. Undoubtedly, there are going to be
some cases where coding in C or similar is going to be needed. I'd just
argue that Python lets you keep as much as possible in a higher-level
language and as little as necessary in a low-level language such as C.