[Numpy-discussion] Dot/inner products with broadcasting?
Fri Mar 15 09:22:21 CDT 2013
In fact, there is already an inner1d implemented in NumPy:
from numpy.core.umath_tests import inner1d
It should do the trick :)
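A quick sketch of what inner1d does: it sums over the last axis and broadcasts all the leading axes, exactly the np.sum(A*B, axis=-1) behaviour asked for below. Note that numpy.core.umath_tests was never public API and has been removed in later NumPy releases, so the sketch falls back to the einsum equivalent (einsum is mentioned further down in this thread):

```python
import numpy as np

A = np.random.randn(2, 3, 4)
B = np.random.randn(3, 4)

try:
    # available in NumPy around the time of this thread (2013)
    from numpy.core.umath_tests import inner1d
    result = inner1d(A, B)
except ImportError:
    # portable equivalent: sum over the shared last axis,
    # broadcasting the leading axes
    result = np.einsum('...i,...i->...', A, B)

print(result.shape)  # (2, 3)
```

Either path matches np.sum(A * B, axis=-1) elementwise.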
On Thu, Mar 14, 2013 at 12:54 PM, Jaakko Luttinen wrote:
> Answering to myself, this pull request seems to implement an inner
> product with broadcasting (inner1d) and many other useful functions:
> On 03/13/2013 04:21 PM, Jaakko Luttinen wrote:
>> How can I compute a dot product (or a similar multiply-and-sum
>> operation) efficiently so that broadcasting is used?
>> For multi-dimensional arrays, NumPy's inner and dot functions do not
>> broadcast over the matching leading axes; instead, the result gets
>> first the leading axes of the first input array and then the leading
>> axes of the second input array.
>> For instance, I would like to compute the following inner-product:
>> np.sum(A*B, axis=-1)
>> But numpy.inner gives:
>> A = np.random.randn(2,3,4)
>> B = np.random.randn(3,4)
>> np.inner(A, B).shape
>> # -> (2, 3, 3) instead of (2, 3)
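[A runnable version of the shape mismatch described above, using the same arrays:]

```python
import numpy as np

A = np.random.randn(2, 3, 4)
B = np.random.randn(3, 4)

# np.inner pairs every combination of the two inputs' leading axes
# rather than broadcasting them together
print(np.inner(A, B).shape)        # (2, 3, 3)

# the desired broadcasting inner product
print(np.sum(A * B, axis=-1).shape)  # (2, 3)
```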
>> Similarly for dot product, I would like to compute for instance:
>> np.sum(A[...,:,:,np.newaxis]*B[...,np.newaxis,:,:], axis=-2)
>> But numpy.dot gives:
>> A = np.random.randn(2,3,4); B = np.random.randn(2,4,5)
>> np.dot(A,B).shape
>> # -> (2, 3, 2, 5) instead of (2, 3, 5)
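[The broadcasting matrix product asked for here can be written with einsum, which the author mentions next; np.matmul, added in NumPy 1.10 after this thread, broadcasts the leading axes the same way:]

```python
import numpy as np

A = np.random.randn(2, 3, 4)
B = np.random.randn(2, 4, 5)

# np.dot stacks the leading axes of both inputs
print(np.dot(A, B).shape)  # (2, 3, 2, 5)

# broadcasting matrix product: match the leading axes, contract j
C = np.einsum('...ij,...jk->...ik', A, B)
print(C.shape)             # (2, 3, 5)

# np.matmul gives the same broadcasting behaviour
assert np.allclose(C, np.matmul(A, B))
```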
>> I could use einsum for these operations, but I'm not sure whether that's
>> as efficient as using some BLAS-supported(?) dot products.
>> I couldn't find any function that performs this kind of operation.
>> NumPy's functions seem to either flatten the input arrays
>> (vdot, outer) or just use the axes of the input arrays separately (dot,
>> inner, tensordot).
>> Any help?
>> Best regards,
>> NumPy-Discussion mailing list