[IPython-User] Launching IPython kernel in a new thread instead of new process

Loïc Estève loic.esteve@ymail....
Thu Mar 21 19:33:19 CDT 2013


Hi,

It seems I am doing something similar to what you are trying to do 
(except for a C++ application). I have put together a simplified version 
of my current, very hacky solution:
https://gist.github.com/lesteve/5217994

If I understand MinRK's answer correctly, what I went for is a hacked 
notebook server (via changing app.kernel_manager.kernel_manager_class) 
rather than subclassing it and removing the kernel and notebook 
management parts.

If you get the two files, then running 
embed_kernel_and_connected_notebook.py should embed a kernel and start a 
notebook which can find the embedded kernel. Checking that you can access 
localDict, which is defined in the embedded namespace, is what shows it 
works.
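Stripped of the IPython plumbing, what the localDict check demonstrates is that the embedded kernel sees the host process's objects themselves, not a copy. A minimal sketch of that property, with a plain thread standing in for the blocking IPython.embed_kernel(local_ns=...) call (embed_kernel_in_thread and touched_by_kernel are illustrative names, not IPython API):

```python
import threading

def embed_kernel_in_thread(namespace):
    # Stand-in for IPython.embed_kernel(local_ns=namespace): the real call
    # blocks while serving the kernel; here a thread simply mutates the
    # shared dict to show the namespace is shared in-process, not copied.
    def serve():
        namespace["touched_by_kernel"] = True  # "kernel-side" mutation
    worker = threading.Thread(target=serve)
    worker.start()
    worker.join()
    return namespace

localDict = {"answer": 42}
shared = embed_kernel_in_thread(localDict)
print(shared is localDict, shared["touched_by_kernel"])  # True True
```

A mutation made on either side is immediately visible to the other, which is exactly what an out-of-process kernel cannot give you.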

Credit to https://github.com/ipython/ipython/pull/1220
which gave me hope it was even possible and got me started.

To be honest, I'd be really happy to hear about improvements to, or 
shortcomings of, this solution, because there is almost certainly a 
better way to do it.

In particular, I have been wondering lately whether embedding multiple 
kernels is possible, which in our case would correspond to multiple users 
having independent views of the state of our C++ application.

Cheers,
Loïc

Marc Liyanage wrote:

> 
> Thanks a lot for this implementation suggestion, I will investigate that.
> 
> -Marc
> 
> 
> On Mar 20, 2013, at 12:23 PM, MinRK <benjaminrk@gmail.com> wrote:
> 
>> 
>> 
>> On Tue, Mar 19, 2013 at 12:35 PM, Marc Liyanage <marc@entropy.ch> wrote:
>> 
>> Thanks, that clears a few things up.
>> 
>> Suppose I only need one kernel, and I don't need interruption or
>> restartability. Do you think it would be easy enough to change the code
>> so that it avoids the fork/exec and stays in-process? That way, I could
>> fire up a Python interpreter on a background thread, run the IPython
>> kernel in that environment, and the Python code running in that kernel
>> would then have full access to the host app's Objective-C object graph
>> through the Obj-C <-> Python bridge. That is what I'm after.
>> 
>> I am trying to figure out if it would be possible to make some kind of
>> IPython Notebook-based debugging console for a Cocoa app. Such a console
>> would be a very convenient and powerful way to inspect a Cocoa app at
>> runtime. I should mention that this is unrelated to the IPython Notebook
>> Mac app that I am working on; this is just an idea for some other (Cocoa)
>> projects, where IPython would be used for its nice user interface but
>> would be auxiliary to the app's main purpose.
>> 
>> Ah, this makes more sense.  But I would actually do it a bit differently.
>>  I would embed just the Kernel in your app, but then write a custom
>> subclass of our notebook server that is actually much simpler than our
>> NotebookApp.  You would remove the kernel management part (and possibly
>> the notebook management part), since it can only talk to one kernel that
>> it did not start.  Then I would start this simple webserver in a
>> background process - your UI and Kernel can be in the same Cocoa process,
>> even if the little tornado app that mediates their communication is in a
>> different one.
>> 
>> -MinRK
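The split MinRK describes, with the kernel in one process and a mediating server in another that only relays messages to a kernel it did not start, can be sketched with standard-library stand-ins. multiprocessing.Pipe replaces the ZMQ sockets a real kernel/tornado pair would use, and kernel_side and the message shape are illustrative (fork start method assumed, so POSIX only):

```python
import multiprocessing as mp

def kernel_side(conn):
    # Stand-in for the embedded kernel: it evaluates whatever the
    # mediating server forwards to it, and it was started by the app,
    # not by the server that talks to it.
    request = conn.recv()
    conn.send({"result": eval(request["code"])})

# fork avoids spawn's re-import semantics; POSIX-only assumption.
ctx = mp.get_context("fork")
server_end, kernel_end = ctx.Pipe()
proc = ctx.Process(target=kernel_side, args=(kernel_end,))
proc.start()

# This side plays the simple tornado mediator: no kernel management,
# it only passes messages to the one kernel it was handed.
server_end.send({"code": "1 + 1"})
reply = server_end.recv()
proc.join()
print(reply["result"])  # 2
```

The mediator never calls start/interrupt/restart on anything, which is why MinRK's suggested subclass can drop the kernel management code entirely.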
>> 
>> On Mar 19, 2013, at 9:33 AM, MinRK <benjaminrk@gmail.com> wrote:
>> 
>>> 
>>> 
>>> On Mon, Mar 18, 2013 at 10:29 PM, Marc Liyanage <marc@entropy.ch> wrote:
>>> 
>>> When I launch IPython notebook, a new process is launched for the
>>> kernel. I was wondering if it's possible to prevent that and instead run
>>> the kernel in a new thread of the same process. I can use the C API and
>>> instantiate multiple Python environments if that helps.
>>> 
>>> There are two reasons why I'd like to avoid forking:
>>> 
>>> 1.) On OS X, the process gets killed if it is already multithreaded,
>>> and for a GUI app that's always the case.
>>> 2.) I'd like to use the Objective-C bridge and let the Python code
>>> access Obj-C objects in the same process.
>>> 
>>> Is this possible, or is IPython fundamentally based on fork/exec?
>>> 
>>> You would definitely need to start a new Python, because there can only
>>> be one Kernel in a given Python environment.  I've never thought about
>>> multiple Pythons in a single process - you would definitely have
>>> problems when users interrupt or restart kernels, since that's based on
>>> process signals.
>>> 
>>> My guess is that this is probably not going to work without significant
>>> restructuring of IPython, which is unlikely to happen.
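The signal point can be seen in miniature: interrupting a kernel means delivering SIGINT to the kernel's own process, and an in-process kernel has no separate process to target. A POSIX-only sketch, with a sleeping child script standing in for a kernel (the os.kill call is roughly what the kernel manager's interrupt amounts to):

```python
import os
import signal
import subprocess
import sys
import time

# Stand-in "kernel": a child process that sleeps until interrupted.
child = subprocess.Popen(
    [sys.executable, "-c",
     "import time\n"
     "try:\n"
     "    time.sleep(30)\n"
     "except KeyboardInterrupt:\n"
     "    print('interrupted')\n"],
    stdout=subprocess.PIPE, text=True,
)
time.sleep(0.5)                    # give the child time to reach sleep()
os.kill(child.pid, signal.SIGINT)  # the interrupt targets a *process*
out = child.communicate()[0].strip()
print(out)  # interrupted
```

If the kernel shared the host's process, that SIGINT would land on the Cocoa app itself, which is the problem MinRK is pointing at.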
>>> 
>>> _______________________________________________
>>> IPython-User mailing list
>>> IPython-User@scipy.org
>>> http://mail.scipy.org/mailman/listinfo/ipython-user



