[SciPy-User] In which numpy modules should MKL be better than ATLAS

Klonuo Umom klonuo@gmail....
Tue Oct 4 10:10:33 CDT 2011


OK, I'm not a skilled programmer, but since ATLAS autotunes itself during
the build while MKL just installs out of the box, I suspected that ATLAS
could be better optimized.

This is not a serious test, as it's just one calculation. FYI, still
speaking of old single-core processors: I installed the exact same packages,
using the same procedure and switches, on a 2.4GHz Pentium Celeron:

atlas_threads_info:
    libraries = ['lapack', 'lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/local/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.8.4\\""')]
    language = f77
blas_opt_info:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/local/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.8.4\\""')]
    language = c
atlas_blas_threads_info:
    libraries = ['lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/local/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.8.4\\""')]
    language = c
lapack_opt_info:
    libraries = ['lapack', 'lapack', 'f77blas', 'cblas', 'atlas']
    library_dirs = ['/usr/local/lib']
    define_macros = [('ATLAS_INFO', '"\\"3.8.4\\""')]
    language = f77
lapack_mkl_info:
  NOT AVAILABLE
blas_mkl_info:
  NOT AVAILABLE
mkl_info:
  NOT AVAILABLE
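For reference, the linkage information above can also be printed at runtime; `np.show_config()` is the standard way to do it:

```python
import numpy as np

# Print which BLAS/LAPACK libraries this NumPy build is linked against;
# sections such as atlas_threads_info or mkl_info show the active backend.
np.show_config()
```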

The result was 2 times slower performance compared to the 3GHz Celeron - or
the same as using MKL :D

Intel(R) Celeron(R) CPU 2.40GHz
cache size : 128 KB
address sizes : 36 bits physical, 32 bits virtual

%timeit np.dot(A, B)
1 loops, best of 3: 982 ms per loop

vs

Intel(R) Celeron(R) D CPU 3.06GHz
cache size : 512 KB
address sizes : 36 bits physical, 48 bits virtual

%timeit np.dot(A, B)
1 loops, best of 3: 453 ms per loop
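For anyone without IPython, the `%timeit` measurement above can be reproduced with the stdlib timeit module; a minimal sketch of the same 1000x1000 dot-product test:

```python
import timeit
import numpy as np

# Same test case as in the thread: dot product of two 1000x1000 matrices.
A = np.ones((1000, 1000))
B = np.ones((1000, 1000))

# repeat=3, number=1 approximates IPython's "1 loops, best of 3".
best = min(timeit.repeat(lambda: np.dot(A, B), repeat=3, number=1))
print("best of 3: %.0f ms per loop" % (best * 1e3))
```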


These are old PCs, but I won't throw them away just yet ;)
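One thing worth checking, per the thread-count point in the reply quoted below, is how many threads each library uses. A sketch: MKL typically honors the MKL_NUM_THREADS / OMP_NUM_THREADS environment variables if they are set before NumPy is imported, while ATLAS's thread count is usually fixed when ATLAS itself is built:

```python
import os

# Cap the BLAS thread count before importing numpy; MKL reads these
# variables at load time. (With ATLAS this usually has no effect, since
# its thread count is decided at build time.)
os.environ["MKL_NUM_THREADS"] = "1"
os.environ["OMP_NUM_THREADS"] = "1"

import numpy as np

A = np.ones((1000, 1000))
print(np.dot(A, A)[0, 0])  # sanity check: each entry is the sum of 1000 ones
```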


2011/10/4 Frédéric Bastien <nouiz@nouiz.org>

> Hi,
>
> ATLAS will usually get close to MKL speed, but it takes more time and
> you must compile it yourself. So on recent CPUs, MKL should be faster.
>
> There are 2 cases that could make a difference: did ATLAS and MKL use the
> same number of threads? You should count only real cores on the CPU, not
> hyperthreaded cores. Using more threads than the number of real cores will
> probably slow things down.
>
> Some versions of MKL have a speed problem with gemm (used by dot). The
> speed penalty is 2x. I don't know which versions are affected.
>
> Fred
> On Oct 1, 2011 6:35 AM, "Klonuo Umom" <klonuo@gmail.com> wrote:
> > It's an old Intel P4 3GHz
> > ATLAS/LAPACK were built from source, so maybe they are more optimized
> >
> >
> > On Sat, Oct 1, 2011 at 12:23 PM, Dag Sverre Seljebotn <
> > d.s.seljebotn@astro.uio.no> wrote:
> >
> >> On 10/01/2011 11:55 AM, Klonuo Umom wrote:
> >> > I had a chance to test this sample on different setups on same PC:
> >> >
> >> > import numpy as np
> >> > A=np.ones((1000,1000))
> >> > B=np.ones((1000,1000))
> >> > %timeit np.dot(A, B)
> >> >
> >> > because of OS reinstalling.
> >> >
> >> > 1x = ATLAS on Linux (reference speed)
> >> > 2x = MKL with GNU compilers on Linux
> >> > 2x = MKL with Intel compilers on Windows 7
> >> > 30x = bare numpy
> >> >
> >> > I didn't plan to do this, so I didn't test additional calculations,
> >> > and I was using the latest version to date for all products.
> >> >
> >> > On the Internet I usually find that MKL should outperform ATLAS. I'm
> >> > curious what linalg module testing would give, but as said, I didn't
> >> > test it. So in which modules should a user expect an impact of MKL
> >> > over ATLAS? In matrix dot product, obviously not.
> >>
> >> What CPU are you on? MKL is tuned for Intel CPUs, perhaps ATLAS
> >> outperforms it on AMD ones.
> >>
> >> Dag Sverre
> >> _______________________________________________
> >> SciPy-User mailing list
> >> SciPy-User@scipy.org
> >> http://mail.scipy.org/mailman/listinfo/scipy-user
> >>
>

