[IPython-User] [IPython-user] max engines?
Wed Sep 1 00:20:43 CDT 2010
I'm running into limits on the number of engines at around 256, running on a
Linux cluster with PBS. Could you please expand on a couple of points you
raised below, Brian? First, what are the file descriptor limits, and how would
I change them using ulimits? Please provide more detailed instructions on
this. Second, what is the timeline on the ZeroMQ architecture? I'm sure it
will be eagerly awaited!
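For reference on the ulimits question: from a shell, `ulimit -n` prints the current soft file-descriptor limit and `ulimit -n 4096` raises it for the current session (up to the hard limit). The same limits can be inspected and adjusted from within Python via the standard `resource` module; a minimal sketch, assuming a Linux host (the error handling and limits.conf note are my additions, not from the thread):

```python
import resource

# Inspect the per-process file-descriptor limit (soft, hard).
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("current soft limit: %d, hard limit: %d" % (soft, hard))

# An unprivileged process may raise its soft limit up to the hard limit;
# raising the hard limit itself needs root (or an entry in
# /etc/security/limits.conf on typical Linux systems, plus a new login).
if soft < hard:
    try:
        resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
    except (ValueError, OSError):
        pass  # e.g. the hard limit is "unlimited" and cannot be set directly
```

The controller holds at least one descriptor per connected engine, so the soft limit effectively caps the engine count for a single controller process.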
Brian Granger-3 wrote:
> There are a couple of things to consider with our current design:
> 1. Eventually you will hit a file descriptor limit per process. You can
> usually change this by messing with the ulimits.
> 2. Performance. The current Twisted-based design doesn't scale very
> well; the basic idea is that the more engines you connect, the more
> memory the controller will consume and the more latency you will see. Once
> the latencies get to be as long as the tasks the engines are performing,
> you will stop observing speedups.
> We are currently working on a new ZeroMQ-based architecture that will
> dramatically improve the scalability of the parallel stuff. We are not sure
> how far it will go, but I am hopeful that the new stuff will get us up to
> 500-1000 engines.
> On Fri, Aug 6, 2010 at 2:52 PM, Darren Govoni <email@example.com> wrote:
>> What's the practical maximum number of engines a controller can
>> support? 10? 50? 100?
> As long as you are not sending lots of data between the engines and the
> controller, I think 64-128 is doable with the current design. Best to just
> give it a shot, though...
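Brian's latency point above can be illustrated with a toy calculation. This is my own back-of-the-envelope model, not anything from the IPython code: assume each task takes `task_time` seconds of real work plus `latency` seconds of overhead that is serialized at the controller.

```python
# Toy model (an assumption for illustration, not the actual controller
# behavior): N engines each run one task of `task_time` seconds, but the
# controller pays `latency` seconds of serialized overhead per engine.
def speedup(n_engines, task_time, latency):
    serial_time = n_engines * task_time           # one engine doing all tasks
    parallel_time = task_time + n_engines * latency  # overhead accumulates
    return serial_time / parallel_time

# As the accumulated latency approaches the task duration, adding
# engines stops helping.
for n in (8, 64, 256):
    print("%3d engines -> speedup %.1f" % (n, speedup(n, 1.0, 0.01)))
```

With 10 ms of per-engine overhead and 1 s tasks, the marginal benefit of each extra engine shrinks well before 256; longer tasks or lower overhead push that knee further out, which is consistent with the advice above to just try it on your own workload.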
> Brian E. Granger, Ph.D.
> Assistant Professor of Physics
> Cal Poly State University, San Luis Obispo
> IPython-User mailing list
View this message in context: http://old.nabble.com/max-engines--tp29371368p29590065.html