[Numpy-discussion] Optimized sum of squares
josef.pktd@gmai...
Tue Oct 20 14:28:45 CDT 2009
On Tue, Oct 20, 2009 at 3:09 PM, Anne Archibald
<peridot.faceted@gmail.com> wrote:
> 2009/10/20 <josef.pktd@gmail.com>:
>> On Sun, Oct 18, 2009 at 6:06 AM, Gary Ruben <gruben@bigpond.net.au> wrote:
>>> Hi Gaël,
>>>
>>> If you've got a 1D array/vector called "a", I think the normal idiom is
>>>
>>> np.dot(a,a)
>>>
>>> For the more general case, I think
>>> np.tensordot(a, a, axes=something_else)
>>> should do it, where you should be able to figure out something_else for
>>> your particular case.
>>
>> Is it really possible to get the same as np.sum(a*a, axis) with
>> tensordot if a.ndim == 2?
>> Any way I try the "something_else", I get extra terms, as in np.dot(a.T, a).
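To illustrate the "extra terms" (my own example, not from the thread): contracting with tensordot over axis 0 produces the full a.T @ a Gram matrix, whose diagonal holds the per-column sums of squares but whose off-diagonal entries are the unwanted cross terms.

```python
import numpy as np

a = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Elementwise square-and-sum along axis 0: one value per column.
col_ss = (a * a).sum(axis=0)             # shape (3,)

# tensordot contracting over axis 0 computes a.T @ a: the full
# Gram matrix, whose diagonal holds the column sums of squares.
gram = np.tensordot(a, a, axes=(0, 0))   # shape (3, 3)

print(col_ss)          # [17. 29. 45.]
print(np.diag(gram))   # [17. 29. 45.] -- plus off-diagonal cross terms
```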
>
> It seems like this would be a good place to apply numpy's
> higher-dimensional ufuncs: what you want seems to just be the vector
> inner product, broadcast over all other dimensions. In fact I believe
> this is implemented in numpy as a demo: numpy.umath_tests.inner1d
> should do the job.
Thanks, this works well; note it needs "core" in the name.
(I might have to learn how to swap or roll axes to use this for more than 2-D.)
>>> np.core.umath_tests.inner1d(a.T, b.T)
array([12, 8, 16])
>>> (a*b).sum(0)
array([12, 8, 16])
>>> (a*a).sum(0)
array([126, 166, 214])
>>> np.core.umath_tests.inner1d(a.T, a.T)
array([126, 166, 214])
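A hedged sketch of the more-than-2-D case mentioned above (my addition, not from the thread; the einsum fallback merely stands in for inner1d where the internal module is unavailable): since inner1d contracts over the last axis, rolling the target axis to the end generalizes it to any axis.

```python
import numpy as np

try:
    # Internal, undocumented module discussed in the thread; it may be
    # absent or relocated in other NumPy releases.
    from numpy.core.umath_tests import inner1d
except Exception:
    # Fallback with the same contract-over-the-last-axis behavior.
    def inner1d(x, y):
        return np.einsum('...i,...i->...', x, y)

a = np.arange(24.0).reshape(2, 3, 4)

def sum_of_squares(x, axis=-1):
    # inner1d contracts over the LAST axis, so roll the target axis
    # to the end first (the "swap or roll" step mentioned above).
    y = np.rollaxis(x, axis, x.ndim)
    return inner1d(y, y)

print(np.allclose(sum_of_squares(a, axis=1), (a * a).sum(axis=1)))  # True
```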
What's the status of these functions? They don't show up in the docs
or help, except for a brief mention in the C API:
http://docs.scipy.org/numpy/docs/numpy-docs/reference/c-api.generalized-ufuncs.rst/
Are they intended for public consumption, and should they go into the docs?
Or do they remain a hidden secret, to force users to read the mailing lists?
Josef
>
> Anne
> _______________________________________________
> NumPy-Discussion mailing list
> NumPy-Discussion@scipy.org
> http://mail.scipy.org/mailman/listinfo/numpy-discussion
>