[SciPy-User] multiprocessing module
Mon Nov 1 18:22:08 CDT 2010
If you want a simple solution that doesn't involve lots of synchronisation
between the processes (depending on how often you write to disk, that could
prove to be quite a bottleneck), you might consider writing to a separate file
from each process (perhaps using the pid as a unique identifier in the
filename) and concatenating them at the end.
----- Original Message ----
From: Robert Kern <email@example.com>
To: SciPy Users List <firstname.lastname@example.org>
Sent: Tue, 2 November, 2010 11:50:09 AM
Subject: Re: [SciPy-User] multiprocessing module
On Mon, Nov 1, 2010 at 17:42, Ted To <email@example.com> wrote:
> On Mon, Nov 1, 2010 at 5:36 PM, Zachary Pincus <firstname.lastname@example.org> wrote:
>>> I'm trying to get multiprocess to do a bunch of independent
>>> calculations and save the results in a file and I'm probably going
>>> about it the wrong way. I have a function defined "computeEq" that
>>> does the calculation and writes the result to a file (outfile) and I
>>> call it using:
>>> po = Pool()
>>> po.map_async(computeEq, product(rules,repeat=N))
>>> This seems to work for the most part but I seem to lose the last few
>>> calculations. Indeed, one of my writes is truncated before the write
>>> is complete.
>> Are you taking proper precautions so that multiple workers aren't
>> trying to write to the file at the same time?
> I'm a bit of a noob as far as multiprocessing goes so no, I'm not.
> How does one do that?
"I have come to believe that the whole world is an enigma, a harmless
enigma that is made terrible by our own mad attempt to interpret it as
though it had an underlying truth."
-- Umberto Eco