
Neural Networking with ANNIE

Neural networks are computer programs that are loosely inspired by biological neurology. A brain consists of neurological cells, or neurons, and connections between those neurons. Neurons communicate with each other by sending short electrical pulses.
You can read more on the working of the mammalian brain here. Neural networks (or artificial neural networks) are a crude imitation of this basic working of the brain.
Neural networks likewise consist of neurons and connections between neurons, but that is more or less where the analogy ends. The learning mechanisms, or algorithms, of neural networks have very little in common with biological learning.
As a research subject neural networks are very interesting; many questions surround them, many of them fundamental. There are also many variables involved in training a neural network, which makes training one a non-trivial task.
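To make the analogy concrete, here is a minimal sketch of a single artificial neuron: it computes a weighted sum of its inputs plus a bias, and squashes the result through an activation function. The function names and the example weights are illustrative, not taken from any particular implementation.

```python
import math

def sigmoid(x):
    # Squashing activation: maps any input to (0, 1),
    # loosely analogous to a biological neuron's firing rate.
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # An artificial neuron: a weighted sum of its inputs
    # passed through an activation function.
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return sigmoid(total)

# Two inputs, both "firing"; a positive weight excites,
# a negative weight inhibits.
output = neuron([1.0, 1.0], [0.5, -0.3], bias=0.1)
```

A full network is just many of these neurons wired together in layers, with the output of one layer feeding the inputs of the next.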

The software tool ANNIE (A Neural Network Interaction Environment) was developed as an exploration tool for neural networks. It aids in training neural networks, giving feedback on the weight updates of connections and on the overall error of the network, and it helps in discovering what neural networks are good at. Currently I am using ANNIE to train neural networks on training sets from the University of California at Irvine (UCI).
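ANNIE's internals are not shown here, but the kind of feedback it reports during training can be sketched with a tiny example: a single sigmoid neuron trained on the logical AND function with the delta rule, where each epoch yields an overall error and per-connection weight updates. The learning rate, epoch count, and data set below are illustrative assumptions.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Tiny training set: the logical AND function.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]

random.seed(1)
weights = [random.uniform(-0.5, 0.5) for _ in range(2)]
bias = 0.0
rate = 0.5  # learning rate: one of the many training variables

for epoch in range(2000):
    total_error = 0.0
    for inputs, target in data:
        out = sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)
        err = target - out
        total_error += err * err
        # Delta rule: nudge each weight in proportion to its input,
        # the output error, and the sigmoid's slope (gradient descent
        # on the squared error).
        for j, i in enumerate(inputs):
            weights[j] += rate * err * out * (1 - out) * i
        bias += rate * err * out * (1 - out)
    # A tool like ANNIE would surface total_error (and the individual
    # weight updates) after every epoch.
```

Watching how the overall error falls, and how sensitive that is to the learning rate and the initial weights, is exactly what makes training non-trivial.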
There are a number of features I want to implement in the near future. First of all, I want to add the ability to train on image data (two-dimensional data), and possibly produce image data as output. This way I could train a network to perform image segmentation or other image manipulation tasks (I could train my own Photoshop plugins), and I might integrate it with my art generation project Argento. Next, I want to combine neural networks with genetic algorithms, in order to evolve optimal neural network configurations. Finally, I want to implement persistence of neural networks from and to XML using the Predictive Model Markup Language (PMML v3.0) format. This way it will be possible to train a certain network setup on a data set and exchange the settings with anyone else who implements the PMML format; neural networks will become independent of the underlying implementation, which will enhance the reproducibility of training results.
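The persistence idea can be sketched as follows: the trained weights and biases are written out as XML so another tool can reconstruct the same network. This is a deliberately simplified, hypothetical serialization, not conformant PMML 3.0, which defines a much richer schema for neural network models.

```python
import xml.etree.ElementTree as ET

def network_to_xml(weights, bias):
    # Hypothetical, simplified serialization of one neuron's
    # parameters -- real PMML 3.0 specifies a far more detailed
    # document structure than this sketch.
    net = ET.Element("NeuralNetwork")
    neuron = ET.SubElement(net, "Neuron", attrib={"id": "0", "bias": str(bias)})
    for j, w in enumerate(weights):
        # One element per incoming connection, carrying its weight.
        ET.SubElement(neuron, "Con", attrib={"from": str(j), "weight": str(w)})
    return ET.tostring(net, encoding="unicode")

xml_text = network_to_xml([0.5, -0.3], 0.1)
```

Because the XML carries only numbers and structure, any implementation that reads the same format can rebuild the network, which is what decouples a trained model from the tool that trained it.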


© 2000-2007, Eelco den Heijer, Amsterdam, The Netherlands.
Comments and suggestions can be sent to the webmaster.