[IPython-User] database magics, saving notebooks in the database and smallest quantum of shareability etc

Nitin Borwankar nborwankar@gmail....
Sat Feb 23 18:38:36 CST 2013


I would like to connect with the IPython team at PyData 2013 in Santa Clara
next month, as I am interested in doing the following and would like to
coordinate:

a) want to create a robust plugin framework for database magics (SQL and
NoSQL)

I have Postgres working (%%PGDB) right now and aim to do MySQL and Mongo
(and anything else based on community feedback).

Essentially  (after some database config) you run

"%PGDB <sql query>"  in IPyNB

This executes the query on a PG instance (from your config) via stdin/stdout
voodoo with the psql client (which needs to be locally installed) and returns
the result set in the output as a text-based table.
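
For a sense of the mechanics, here is a minimal sketch of how such a cell
magic can shell out to psql; the database name and user below are
illustrative assumptions, not the actual %%PGDB code.

    import subprocess
    from IPython.core.magic import register_cell_magic

    @register_cell_magic
    def PGDB(line, cell):
        """Run the cell body as SQL against Postgres via the local psql client."""
        # Assumed connection details; a real version would read these from
        # the user's database config rather than hard-coding them.
        cmd = ["psql", "--dbname", "mydb", "--username", "myuser"]
        proc = subprocess.Popen(cmd, stdin=subprocess.PIPE,
                                stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT)
        out, _ = proc.communicate(cell.encode("utf-8"))
        # psql already renders the result set as a text table, so just show
        # whatever came back, including any error messages from the server.
        print(out.decode("utf-8"))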

I am hoping to put out the PGDB magic on github before PyData.

Note that I make no attempt to parse SQL or understand the string - if you
mess up you will hear from the database at the other end as if you were
sitting at a console in a terminal.  I am just the middle man.
Or rather %%PGDB is.

Some work is needed to return the result set as a Python dict that is visible
in the namespace so it can be used by other code.
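
One possible shape for that, as a sketch only - it assumes the psql output
has already been parsed into a list of dicts called rows, which the current
code does not yet do:

    from IPython import get_ipython

    def push_result(rows, name="_pgdb_result"):
        """Expose the parsed result set as a variable in the notebook namespace."""
        # get_ipython().push() injects names into the interactive namespace,
        # so subsequent cells can refer to _pgdb_result directly.
        get_ipython().push({name: rows})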

There are some issues to deal with regarding keeping a persistent connection,
so that multiple requests don't keep opening new connections - this now works
at a basic (optimistic scenario) level but has not been tested against all
kinds of bad scenarios.
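
One way to handle that, sketched with psycopg2 instead of the psql client
purely to show the connection-caching idea (the DSN below is a placeholder):

    import psycopg2

    _conn = None  # module-level cache so repeated magics reuse one connection

    def get_connection(dsn="dbname=mydb user=myuser"):
        """Return a shared connection, reconnecting only if the old one died."""
        global _conn
        if _conn is None or _conn.closed:
            _conn = psycopg2.connect(dsn)
        return _conn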

I have used the standard "magics" metaphors, dropped the code in the right
place, and did some config. The integration with the IPyNB system was not
trivial, but it is straightforward enough that it is not rocket science.
I was very pleased it worked after some elementary brain twisting :-)
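
For anyone curious what "dropping the code in the right place" looks like,
this is roughly the extension shape IPython expects; the class and method
bodies here are illustrative, not the actual PGDB source:

    from IPython.core.magic import Magics, magics_class, cell_magic

    @magics_class
    class PGDBMagics(Magics):
        @cell_magic
        def PGDB(self, line, cell):
            # Delegate to the query-running code sketched earlier.
            print("would run:", cell)

    def load_ipython_extension(ipython):
        # IPython calls this hook when the user runs %load_ext on the module.
        ipython.register_magics(PGDBMagics)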

Kudos to the IPy team for making extensibility straightforward - this is a
HUGE WIN.

b)  want to save the JSON .ipynb in Mongo, a Postgres key-value store, and
other JSON stores instead of the filesystem in the current directory.
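
Since an .ipynb file is just JSON, the Mongo case could look roughly like
this sketch (pymongo, with made-up database and collection names):

    import json
    from pymongo import MongoClient

    def save_notebook(path):
        """Store a notebook's JSON in MongoDB instead of the filesystem."""
        with open(path) as f:
            nb = json.load(f)  # an .ipynb file is plain JSON
        coll = MongoClient()["notebooks"]["ipynb"]  # assumed names
        coll.replace_one({"name": path},
                         {"name": path, "content": nb},
                         upsert=True)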

c) want to be able to compose (note *compose*, not edit) Notebooks via a web
UI where a user can assemble content chunks (JSON in Mongo) or .... and
then "publish" to .ipynb-compliant JSON.
This will allow massive reuse of working content chunks, especially those
that involve code examples and diagrams needing reproducibility.
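
The "publish" step could be as simple as the following sketch; it assumes
each chunk is a dict holding a list of notebook cell dicts, and the schema
keys shown follow the v4 notebook format, so they would need adjusting for
whatever format version is actually in play:

    import json

    def publish(chunks, out_path):
        """Assemble content chunks into one .ipynb-compliant JSON file."""
        nb = {
            "nbformat": 4,          # assumed format version; adjust as needed
            "nbformat_minor": 0,
            "metadata": {},
            "cells": [cell for chunk in chunks for cell in chunk["cells"]],
        }
        with open(out_path, "w") as f:
            json.dump(nb, f, indent=1)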

My motivation for doing this is:

1) IMHO, filesystem-based storage is subject to OS-level security issues,
and database storage *may* (huge big MAY) provide some mitigation.  The
caveat being: don't naively assume that database security is the whole
answer.

2) While the quantum of shareability right now is the single notebook, which
is awesome in itself, this can be taken even further. If one wishes, a
notebook can be published as a sequence of quasi-atomic chunks which can then
be separately mixed and mashed.  "Quasi-atomic" means that we define one
further level of granularity inside a notebook, with the chunk boundaries
kept orthogonal to the actual content - i.e. we should not, for example, use
a horizontal rule as a chunk marker, since that would be brittle. (One
possibility is sketched after this list.)

3) Content management becomes easy *in some respects* when the granularity
is smaller.
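
To illustrate point 2), one chunk marker that stays orthogonal to the content
itself is a key in each cell's metadata; the key name below is purely
hypothetical:

    def split_into_chunks(nb):
        """Group consecutive cells that share the same 'chunk' metadata value."""
        chunks, current, current_id = [], [], None
        for cell in nb["cells"]:
            cid = cell.get("metadata", {}).get("chunk")  # hypothetical key
            if current and cid != current_id:
                chunks.append(current)
                current = []
            current_id = cid
            current.append(cell)
        if current:
            chunks.append(current)
        return chunks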


Thanks for reading this far.



------------------------------------------------------------------
Nitin Borwankar
nborwankar@gmail.com