[IPython-User] Launching IPython kernel in a new thread instead of new process

Marc Liyanage marc@entropy...
Tue Mar 19 14:35:10 CDT 2013


Thanks, that clears a few things up.

Suppose I only need one kernel, and I don't need interruption or restartability. Do you think it would be easy enough to change the code so that it avoids the fork/exec and stays in-process? That way, I could fire up a Python interpreter on a background thread, run the IPython kernel in that environment, and the Python code running in that kernel would then have full access to the host app's Objective-C object graph through the Obj-C <-> Python bridge. That is what I'm after.
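
Roughly what I have in mind, as an untested sketch: use IPython's top-level embed_kernel() (which I believe recent versions provide) and start it on a background thread, so the host app keeps its main thread and a notebook or console frontend can connect from outside. Whether the kernel tolerates running off the main thread, given that interrupt/restart is signal-based, is exactly the part I'm unsure about; the thread setup and the shared namespace here are just my assumptions.

import threading
import IPython

def start_kernel(ns):
    # Blocks, serving the kernel's event loop in this thread; `ns` becomes
    # the kernel's user namespace, so whatever the host app puts there is
    # reachable from any connected frontend.
    IPython.embed_kernel(local_ns=ns)

shared = {}  # host-app objects to expose to the kernel go in here
t = threading.Thread(target=start_kernel, args=(shared,))
t.daemon = True
t.start()

# A frontend could then attach with, e.g.:  ipython console --existing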

I am trying to figure out whether it would be possible to make some kind of IPython Notebook-based debugging console for a Cocoa app. Such a console would be a very convenient and powerful way to inspect a Cocoa app at runtime. I should mention that this is unrelated to the IPython Notebook Mac app that I am working on; this is just an idea for some other (Cocoa) projects, where IPython would be used for its nice user interface and would be auxiliary to the app's main purpose.
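
Once a frontend is attached to such an in-process kernel, I picture the inspection going roughly like this (hypothetical, assuming the PyObjC bridge is importable in that interpreter):

# Typed into the connected notebook/console; the objects returned are the
# host app's live Cocoa objects, because the kernel shares its process.
from AppKit import NSApplication

app = NSApplication.sharedApplication()
print(app.windows())      # the app's window objects
print(app.mainWindow())   # main window, and so on down the object graph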




On Mar 19, 2013, at 9:33 AM, MinRK <benjaminrk@gmail.com> wrote:

> 
> 
> On Mon, Mar 18, 2013 at 10:29 PM, Marc Liyanage <marc@entropy.ch> wrote:
> 
> When I launch IPython notebook, a new process is launched for the kernel. I was wondering if it's possible to prevent that and instead run the kernel in a new thread of the same process. I can use the C API and instantiate multiple Python environments if that helps.
> 
> There are two reasons why I'd like to avoid forking:
> 
> 1.) On OS X, the forked child process gets killed if the parent is already multithreaded, and for a GUI app that's always the case
> 2.) I'd like to use the Objective-C bridge and let the Python code access Obj-C objects in the same process
> 
> Is this possible, or is IPython fundamentally based on fork/exec?
> 
> You would definitely need to start a new Python, because there can only be one Kernel in a given Python environment.  I've never thought about multiple Pythons in a single process - you would definitely have problems when users interrupt or restart kernels, since that's based on process signals.
> 
> My guess is that this is probably not going to work without significant restructuring of IPython, which is unlikely to happen.
>  
> 
> 
> _______________________________________________
> IPython-User mailing list
> IPython-User@scipy.org
> http://mail.scipy.org/mailman/listinfo/ipython-user
> 
