[Numpy-discussion] .T Transpose shortcut for arrays again

Bill Baxter wbaxter at gmail.com
Thu Jul 6 21:56:11 CDT 2006


On 7/7/06, Robert Kern <robert.kern at gmail.com> wrote:
>
> Bill Baxter wrote:
> > Robert Kern wrote:
> >
> >
> > The slippery slope argument only applies to the .M, not the .T or .H.
>
> No, it was the "Let's have a .T attribute. And if we're going to do
> that, then we should also do this. And this. And this."


There's no slippery slope there.  It's just "Let's have a .T attribute,
and if we have that then we should have .H also."  Period.  The slope
stops there.  The .M and .A are a separate issue.
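
Just to be concrete about what's actually being asked for, here's roughly
what the two shortcuts would abbreviate on a plain 2-d array (only a
sketch, with made-up values):

    import numpy as np

    a = np.array([[1+2j, 3-1j],
                  [0+1j, 2+0j]])

    at = a.transpose()          # what a.T would spell
    ah = a.conj().transpose()   # what a.H would spell
                                # (the conjugate transpose)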

> > >     I don't think that just because arrays are often used for linear
> > >     algebra that linear algebra assumptions should be built in to the
> > >     core array type.
> >
> > It's not just that "arrays can be used for linear algebra".  It's that
> > linear algebra is the single most popular kind of numerical computing
> > in the world!  It's the foundation for countless fields.  What you're
> > saying is like "grocery stores shouldn't devote so much shelf space to
> > food, because food is just one of the products people buy", or [etc.]
>
> I'm sorry, but the argument-by-inappropriate-analogy is not convincing.
> Just because linear algebra is "the base" for a lot of numerical
> computing does not mean that everyone is using numpy arrays for linear
> algebra all the time.  Much less does it mean that all of those
> conventions you've devised should be shoved into the core array type.  I
> hold a higher standard for the design of the core array type than I do
> for the stuff around it.  "It's convenient for what I do," just doesn't
> rise to that level.  There has to be more of an argument for it.


My argument is not that "it's convenient for what I do", it's that "it's
convenient for what 90% of users want to do".  But unfortunately I can't
think of a good way to back up that claim with any sort of numbers.

Here's one data point, though: the download statistics for the various
numerical libraries on netlib.org, at
http://www.netlib.org/master_counts2.html

The top 4 are all linear algebra related:

    /lapack       http://www.netlib.org/lapack/        37,373,505
    /lapack/lug   http://www.netlib.org/lapack/lug/    19,908,865
    /scalapack    http://www.netlib.org/scalapack/     14,418,172
    /linalg       http://www.netlib.org/linalg/        11,091,511

The next three are more like general computing concerns (a
parallelization library, performance monitoring, and benchmarks):

    /pvm3         http://www.netlib.org/pvm3/          10,360,012
    /performance  http://www.netlib.org/performance/    7,999,140
    /benchmark    http://www.netlib.org/benchmark/      7,775,600
The next one after that is again linear algebra, and the pattern seems to
hold pretty far down the list: it's mostly stuff that's either linear
algebra related or parallelization/benchmarking related.

And as another example, there's the success of higher-level numerical
environments like Matlab (and maybe R, S, Mathematica, and Maple?) that
have strong support for linear algebra right in the core, without
requiring users to go into some syntax/library ghetto to get at that
functionality.
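
With numpy as it stands, getting the nicer linear algebra spellings means
switching over to the matrix class.  A small sketch of the two spellings
side by side, just as an illustration:

    import numpy as np

    a = np.array([[1., 2.],
                  [3., 4.]])
    b = np.array([[5.],
                  [6.]])

    # plain array spelling of a^T b
    x1 = np.dot(a.transpose(), b)

    # matrix-class spelling, where .T already exists
    am = np.asmatrix(a)
    bm = np.asmatrix(b)
    x2 = am.T * bm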

I am also curious, given the number of times I've heard this nebulous
argument of "there are lots of kinds of numerical computing that don't
involve linear algebra", that no one ever seems to name any of these
"lots of kinds".  Statistics, maybe?  But you can find lots of linear
algebra in statistics.
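
For instance, even something as basic as ordinary least squares comes
down to transposes and matrix products via the normal equations (an
illustrative sketch with made-up data):

    import numpy as np

    X = np.array([[1., 0.],
                  [1., 1.],
                  [1., 2.]])      # design matrix with an intercept column
    y = np.array([1., 2., 2.9])

    # solve (X^T X) beta = X^T y
    beta = np.linalg.solve(np.dot(X.transpose(), X),
                           np.dot(X.transpose(), y))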

--bb