[Numpy-discussion] Fwd: GPU Numpy
Thu Sep 10 10:46:25 CDT 2009
On Thu, Sep 10, 2009 at 07:28, Francesc Alted<firstname.lastname@example.org> wrote:
> On Thursday 10 September 2009 11:37:24 Gael Varoquaux wrote:
>> On Thu, Sep 10, 2009 at 11:29:49AM +0200, Francesc Alted wrote:
>> > The point is: are GPUs prepared to compete with general-purpose CPUs
>> > in all-round operations, like evaluating transcendental functions and
>> > conditionals, all of this with a rich set of data types? I would like to
>> > believe that this is the case, but I don't think so (at least not yet).
>> I believe (this is very foggy) that GPUs can implement non-trivial logic
>> in their base processing units, so that conditionals and transcendental
>> functions are indeed possible. Where it gets hard is when you don't have
>> problems that can be expressed in an embarrassingly parallel manner.
> But NumPy is about embarrassingly parallel calculations, right? I mean:
> a = np.cos(b)
> where b is a 10000x10000 matrix is *very* embarrassing (in the parallel
> meaning of the term ;-)
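As a small illustration of why this case is so embarrassing (the array here is made up for the sketch): an elementwise ufunc like np.cos touches each output element independently, so the work can be split into arbitrary chunks with no coordination at all and still produce bit-identical results.

```python
import numpy as np

# Each output element of np.cos(b) depends on exactly one input element,
# so the computation can be partitioned any way we like.
b = np.random.default_rng(0).normal(size=(1000, 1000))

# Compute each block separately, then stitch the results back together.
# No communication between chunks is needed, and the result is
# bit-identical to computing the whole array at once.
chunks = [np.cos(block) for block in np.array_split(b, 4)]
assert np.array_equal(np.vstack(chunks), np.cos(b))
```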
Yes. However, it is worth making the distinction between
embarrassingly parallel problems and SIMD problems. Not all
embarrassingly parallel problems are SIMD-capable. GPUs do SIMD, not
embarrassingly parallel problems in general. If there are branches, as
would be necessary for many special functions, the GPU does not perform
as well. Basically, every unit has to execute both branches, because
all units must run the same instruction at the same time, even though
the data on each unit gets processed by only one branch.
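The both-branches cost can actually be mimicked in NumPy itself: np.where evaluates both of its argument arrays in full before selecting per element, much like SIMD lanes executing both sides of a branch under a predication mask (the arrays and branch expressions here are invented for the sketch):

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 5)
mask = x >= 0

# Both "branches" are computed for every element...
pos_branch = x * 2.0
neg_branch = -x

# ...and a per-element mask then selects which result survives,
# much as SIMD lanes are masked off during the branch they don't take.
# The work of the untaken branch is done anyway.
result = np.where(mask, pos_branch, neg_branch)
assert np.array_equal(result, np.array([2.0, 1.0, 0.0, 2.0, 4.0]))
```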
cos() is easy. Or at least it is so necessary to graphics computing that
it is already a primitive in all (most?) GPU languages. Googling
around shows SIMD code for the basic transcendental functions. I
believe you have to code them differently than you would on a CPU.
Other special functions would simply be hard to do efficiently.
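As a rough idea of what "coding them differently" means (this sketch, its name, and its coefficient count are illustrative assumptions, not how any particular GPU library actually implements cosine): replace data-dependent branching with range reduction plus a fixed polynomial, so every lane executes the same straight-line arithmetic.

```python
import numpy as np

def cos_poly(x):
    """Branch-free cosine sketch (illustrative, not production quality).

    Range-reduce into [-pi, pi), then evaluate a truncated Taylor
    series: pure straight-line arithmetic with no data-dependent
    branches, the style that SIMD hardware rewards.
    """
    x = np.remainder(x + np.pi, 2.0 * np.pi) - np.pi
    x2 = x * x
    # cos(x) ~= 1 - x^2/2! + x^4/4! - x^6/6! + x^8/8! - x^10/10!
    return (1.0 - x2 / 2.0 + x2**2 / 24.0 - x2**3 / 720.0
            + x2**4 / 40320.0 - x2**5 / 3628800.0)

# Accurate to a few parts in a thousand over the reduced range.
xs = np.linspace(-10.0, 10.0, 201)
assert np.max(np.abs(cos_poly(xs) - np.cos(xs))) < 0.01
```

Real implementations use minimax coefficients and more careful range reduction, but the shape is the same: trade branches for a handful of extra multiplies.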
"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
-- Umberto Eco