[SciPy-user] feed-forward neural network for python

Marek Wojciechowski mwojc at p.lodz.pl
Wed Dec 6 14:38:20 CST 2006

On Wed, 06 Dec 2006 18:00:08 -0000, <scipy-user-request at scipy.org> wrote:

> Marek Wojciechowski wrote:
>> Hi!
>> Hi!
>> I released a feed-forward neural network project for Python at
>> SourceForge (ffnet). I'm announcing it here because it depends on the
>> numpy/scipy tandem, so you, folks, are potential users/testers.
>> If anyone is interested, please visit ffnet.sourceforge.net (and then
>> post comments, if any...)
> Hi Marek,
> thank you for your ANN implementation. I'm currently interested in
> recurrent neural networks, so I hope you'll add them too in the near
> future.
> Anyway, for those interested in ANNs, you can check the conx.py module
> of the PyRobotics project. It's just one Python file (182 KB of Python
> source!) inside a much bigger project. conx.py is self-contained, but
> unfortunately it needs Numeric and not numpy :((
> What about mixing ffnet and conx.py?

Compared to conx.py, ffnet is much, much faster and much, much simpler:
faster thanks to scipy's optimization routines, and simpler because
simplicity was the basic assumption when I created ffnet.
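To illustrate the idea behind that speed claim, here is a hedged sketch (not ffnet's actual code) of handing an entire training problem to a scipy optimizer instead of looping over epochs in Python: flatten the weights of a 2-2-1 network into one vector and let scipy minimize the squared error. The `unpack`/`error` helpers and the use of `method='TNC'` are my own illustrative choices, not taken from ffnet.

```python
import numpy as np
from scipy.optimize import minimize

# XOR-style training data from the post.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[1.], [0.], [0.], [1.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    # 2-2-1 network: 4 + 2 hidden weights/biases, 2 + 1 output ones.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def error(w):
    # Sum-of-squares error of the whole network for weight vector w.
    W1, b1, W2, b2 = unpack(w)
    y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(((y - T) ** 2).sum())

w0 = np.random.RandomState(0).uniform(-1, 1, 9)
res = minimize(error, w0, method='TNC')  # one call does all the iterating
```

The whole inner loop runs inside compiled optimizer code, which is where the speed difference against a pure-Python epoch loop comes from.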

To prove it, I trained a XOR network (2-2-1) in conx with:

from pyrobot.brain import conx
net = conx.Network()         # network construction was omitted in the
net.addThreeLayers(2, 2, 1)  # original post; reconstructed here
net.setInputs([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
net.setTargets([[1.], [0.], [0.], [1.]])

This performed 5000 iterations in 34.6 s (and the tolerance had still not
been reached).

With ffnet it can be done with:

from ffnet import ffnet, mlgraph
conec = mlgraph((2, 2, 1))
net = ffnet(conec)
input = [[0., 0.], [0., 1.], [1., 0.], [1., 1.]]
target = [[1.], [0.], [0.], [1.]]
net.train_momentum(input, target, eta=0.5, momentum=0.8, maxiter=5000)

The above trains the network in 17.9 ms, and the fit is perfect.
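For readers unfamiliar with what `train_momentum` is doing, the rule it names can be sketched in plain numpy: standard batch backpropagation where each step adds `-eta * gradient` to a velocity that keeps `momentum` times the previous step. This is a generic sketch of the technique, not ffnet's implementation; the weight layout and initialization are my own assumptions.

```python
import numpy as np

# Training data from the post (2 inputs, 1 target per pattern).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[1.], [0.], [0.], [1.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.RandomState(0)
W1 = rng.uniform(-1, 1, (2, 2)); b1 = np.zeros(2)   # hidden layer
W2 = rng.uniform(-1, 1, (2, 1)); b2 = np.zeros(1)   # output layer
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)    # velocities
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)
eta, momentum = 0.5, 0.8                            # as in the post

def mse():
    y = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return float(((y - T) ** 2).mean())

mse_before = mse()
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)          # hidden activations
    y = sigmoid(h @ W2 + b2)          # network output
    d2 = (y - T) * y * (1 - y)        # output delta (sigmoid derivative)
    d1 = (d2 @ W2.T) * h * (1 - h)    # hidden delta, backpropagated
    # Momentum update: new step = momentum * old step - eta * gradient.
    vW2 = momentum * vW2 - eta * (h.T @ d2)
    vb2 = momentum * vb2 - eta * d2.sum(axis=0)
    vW1 = momentum * vW1 - eta * (X.T @ d1)
    vb1 = momentum * vb1 - eta * d1.sum(axis=0)
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1
```

The momentum term smooths successive steps, which is why it helps on a small ravine-shaped error surface like XOR's.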

ffnet is almost 2000 times faster than conx in this example! And the code
is simpler.

However, there are some nice testing methods in conx which could be used in
ffnet. Thanks for this tip.

Marek Wojciechowski
