[Numpy-discussion] numpy.random and multiprocessing
Thu Dec 11 12:39:32 CST 2008
>> Is the goal to parallelize a big sampler into N tasks of M trials, to
>> produce the same result as a sequential set of M*N trials? Then it does
>> not sound like a trivial task at all. I know there exist libraries
>> explicitly designed for parallel random number generation - maybe this
>> is where we should look, instead of using heuristics which are likely to
>> be bogus and generate wrong results.
Another heuristic: use a pseudo-random seed for each process.
Generate large random integers in the main process, and send them as
seeds to each task. This makes the run replicable if the initial seed
is set, and each process should get independent "pseudo" random numbers.
This works in probability theory, but I don't know about the quality of
the resulting RNG streams.
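A minimal sketch of the heuristic described above, assuming a master RandomState in the parent process that draws one integer seed per worker (the function name `worker` and the pool size are illustrative, not from the original post):

```python
import numpy as np
from multiprocessing import Pool

def worker(seed):
    # Each task builds its own RandomState from a seed drawn in the
    # parent, so streams do not share the default global state.
    rng = np.random.RandomState(seed)
    return rng.standard_normal(1000).mean()

if __name__ == "__main__":
    # Fixed master seed -> the whole parallel run is replicable.
    master = np.random.RandomState(12345)
    seeds = master.randint(0, 2**32 - 1, size=4)  # one seed per task
    with Pool(4) as pool:
        results = pool.map(worker, seeds)
    print(results)
```

Note that nothing here guarantees the per-seed streams are statistically independent; that is exactly the quality question raised above, which dedicated parallel-RNG schemes are designed to answer.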