[SciPy-user] Parallel linear solver.

Marian Jakubik mjakubik@ta3...
Fri Jul 25 04:51:29 CDT 2008


Hi Frank,

is it possible to compare the speed of parallel codes written in C and
in Python? I am preparing a parallel code, and until now I had decided
to use C. I mean, the difference in speed between compiled (C) and
interpreted (Python) code is important, isn't it?

Thanks in advance for your response...

Best,
Marian
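[Editor's note: regarding the interpreted-vs-compiled gap Marian raises, much Python numerical code (NumPy, the PETSc bindings) runs its inner loops in compiled C/Fortran, so the gap is often smaller than "interpreted" suggests. A rough, hedged timing sketch, with timings that will vary by machine:]

```python
import time
import numpy as np

n = 1_000_000
data = np.arange(n, dtype=np.float64)

# Pure-Python loop: every addition passes through the interpreter.
t0 = time.perf_counter()
s_py = 0.0
for v in data.tolist():
    s_py += v
t_py = time.perf_counter() - t0

# NumPy reduction: the same loop runs in compiled code.
t0 = time.perf_counter()
s_np = float(data.sum())
t_np = time.perf_counter() - t0

print(f"pure Python: {t_py:.4f}s  NumPy: {t_np:.4f}s")
# Both sums are exact here (integer values well below 2**53).
assert s_py == s_np
```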

On Thu, 24 Jul 2008 12:15:27 -0400
"Frank Lagor" <dfranci@seas.upenn.edu> wrote:

> One last thing--
> 
> If you just want parallel interaction between your Python codes, and you
> think that petsc4py is overkill with all of its solvers and such, then just
> use mpi4py to exchange messages, perform reductions, etc. in parallel.  It
> is written by the same author as petsc4py (but it is not a requirement for
> petsc4py), and you may find it a bit easier to use if you want to run
> basically sequential codes in parallel and exchange results.
> 
> -Frank
> 
> On Thu, Jul 24, 2008 at 12:06 PM, Frank Lagor <dfranci@seas.upenn.edu>
> wrote:
> 
> > Hi Nils,
> >
> > Typically users come to find PETSc when they need it-- that is, if they are
> > doing a lot of scientific computation and therefore definitely need
> > parallel processing. For me, I had access to a small 64-processor cluster
> > and sequential codes that would take days to run, so it was a natural fit.
> > I could definitely see it being used in smaller settings (and I'm sure many
> > people do), like on a desktop machine with a few processors, but that is
> > not what I use it for.  I'm not sure about your needs or your access to a
> > cluster, so you can probably best judge whether it is for you.
> >
> > For software requirements-- PETSc uses BLAS, LAPACK, and an MPI
> > distribution as a minimum.  It can also interface with countless other
> > packages (e.g. SCALAPACK, ATLAS, SPRNG, etc.), but I don't bother with all
> > this.  For me, there was an OpenMPI implementation of MPI already installed
> > on my cluster (as should be the case for most clusters), so I just linked
> > to it.  And the BLAS and LAPACK on the cluster were not currently working,
> > so I told PETSc to download and install BLAS and LAPACK automatically.  It
> > did, and it works fine.  Anyway, that's my story-- I hope it helps.
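[Editor's note: the configure step Frank describes might look roughly like the following. The flag names are from the 2.3.3-era configure script and are an assumption on my part; check `./config/configure.py --help` for your version:]

```shell
# Point PETSc at an existing MPI install and let it download and
# build BLAS/LAPACK itself (flag names approximate; verify locally).
./config/configure.py \
    --with-mpi-dir=/path/to/openmpi \
    --download-f-blas-lapack=1
make all
```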
> >
> > -Frank
> >
> >
> >
> >
> > On Thu, Jul 24, 2008 at 11:48 AM, Nils Wagner <
> > nwagner@iam.uni-stuttgart.de> wrote:
> >
> >> On Thu, 24 Jul 2008 11:31:53 -0400
> >>   "Frank Lagor" <dfranci@seas.upenn.edu> wrote:
> >> > Yes, there will be a new release of PETSc soon (I just asked one of
> >> > the developers), but they don't know exactly how long it will be.
> >> > The 2.3.3 release of PETSc is still current.  They use patches for a
> >> > lot of their small changes.  For example, just two months ago or so,
> >> > I downloaded a version that was 2.3.3-p8.  The current patched
> >> > version is 2.3.3-p13.  It is still very active, and in my opinion,
> >> > major releases are not a good thing, because they may end up
> >> > affecting your code.  Actually, I think this is probably why you
> >> > asked (so you could wait for a new release and wouldn't have to
> >> > worry about code changes for a while).
> >> >
> >> > Hope this helps,
> >> > Frank
> >> >
> >> Frank,
> >>
> >> Thank you very much for your prompt reply.  So far I have used
> >> serial programs solely.
> >> How do I benefit from parallel code in Python?
> >> I mean, what are the minimal requirements (hardware/software) to run
> >> codes in parallel?
> >> What software packages are needed to configure PETSc in that context?
> >>
> >> Nils
> >>
> >>
> >> _______________________________________________
> >> SciPy-user mailing list
> >> SciPy-user@scipy.org
> >> http://projects.scipy.org/mailman/listinfo/scipy-user
> >>
> >
> >
> >
> > --
> > Frank Lagor
> > Ph.D. Candidate
> > Mechanical Engineering and Applied Mechanics
> > University of Pennsylvania
> >
> 
> 
> 

