[IPython-User] max engines?

Brian Granger ellisonbg@gmail....
Mon Aug 9 14:28:50 CDT 2010


There are a couple of things to consider with our current design:

1. Eventually you will hit a per-process file descriptor limit.  You can
usually raise this by adjusting the ulimits (a quick way to check and raise
the limit from Python is sketched after this list).
2. Performance.  The current Twisted-based design doesn't scale very well:
the more engines you connect, the more memory the controller consumes and
the more latency you will see.  Once the latencies get to be as long as the
tasks the engines are performing, you will stop observing speedups.
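For point 1, here is a minimal sketch, not specific to IPython, of checking
and raising the descriptor limit from Python with the standard resource
module.  Raising the hard limit itself usually needs root or a system-wide
setting (e.g. /etc/security/limits.conf on Linux); the value 4096 below is
just an illustrative target.

    import resource

    # Current soft and hard limits on open file descriptors for this process.
    soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
    print("soft limit: %d, hard limit: %d" % (soft, hard))

    # Raise the soft limit toward the hard limit.  Going past the hard limit
    # fails unless you have the privileges to raise it system-wide.
    new_soft = min(4096, hard)
    resource.setrlimit(resource.RLIMIT_NOFILE, (new_soft, hard))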

We are currently working on a new ZeroMQ-based architecture that will
dramatically improve the scalability of the parallel code.  We are not sure
how far it will go, but I am hopeful that the new design will get us up to
500-1000 engines.

On Fri, Aug 6, 2010 at 2:52 PM, Darren Govoni <darren@ontrenet.com> wrote:

> Hi,
>  What's the practical maximum number of engines a controller can
> support? 10? 50? 100?
>
>
As long as you are not sending lots of data between the engines and the
client, I think 64-128 engines is doable with the current design.  Best to
just give it a shot, though; a rough way to test is sketched below.
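Here is a minimal sketch of such a test, assuming the 0.10-era
IPython.kernel API (start the engines first, e.g. with something like
"ipcluster local -n 64"); the no-op task is just a placeholder for your
real workload.

    import time
    from IPython.kernel import client

    # Connect to a running controller with however many engines you started.
    mec = client.MultiEngineClient()
    print("connected engines: %d" % len(mec.get_ids()))

    # Time a trivial command across all engines.  When this round-trip
    # latency approaches the duration of your real tasks, adding more
    # engines stops paying off.
    start = time.time()
    mec.execute("pass")
    print("round trip for a no-op: %.3f s" % (time.time() - start))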

Cheers,

Brian



> thanks
> Darren
>
>



-- 
Brian E. Granger, Ph.D.
Assistant Professor of Physics
Cal Poly State University, San Luis Obispo
bgranger@calpoly.edu
ellisonbg@gmail.com