Welcome to Fractal Forums

Fractal Math, Chaos Theory & Research => (new) Theories & Research => Topic started by: Khaotik on August 10, 2012, 10:35:13 AM




Title: Hebbian neuron fractal?
Post by: Khaotik on August 10, 2012, 10:35:13 AM
Can we use the Hebbian learning rule to "grow" some neuron fractals?
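To be concrete, here's the kind of update I mean - a minimal numpy sketch. It uses Oja's stabilized variant (the plain Hebbian update grows without bound), and the input covariance below is just a made-up toy example:

import numpy as np

rng = np.random.default_rng(0)

# toy input distribution with one dominant direction (made-up numbers)
C = np.array([[3.0, 1.0, 0.0],
              [1.0, 2.0, 0.0],
              [0.0, 0.0, 0.5]])
L = np.linalg.cholesky(C)

w = rng.normal(size=3)            # weights of a single neuron with 3 inputs
eta = 0.005                       # learning rate

for _ in range(20000):
    x = L @ rng.normal(size=3)    # presynaptic activity
    y = w @ x                     # postsynaptic activity (linear neuron)
    w += eta * y * (x - y * w)    # Oja's rule: Hebbian term y*x minus a decay

print(w / np.linalg.norm(w))      # settles near the top principal component of C

The question is what extra rule you would need on top of this so the wiring grows into a fractal shape instead of just converging.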


Title: Re: Hebbian neuron fractal?
Post by: kram1032 on August 10, 2012, 12:46:50 PM
Well, neural networks that have to solve multiple problems, where each problem has to draw on the others, will be highly fractal/scale-free in their organization.
The majority of neurons will work on any one problem, and then fewer and fewer neurons are devoted to bringing in information from less and less related problems.
But if you want physical models of brains, you'll run into the problem that mathematical directed graphs do not need to be constrained by physical limitations like volume. Thus, the actual layout of the graph won't show nearly all the nice features that a brain does, even if you make it learn tasks very similar to what the brain learns.

Funnily enough, the newest episode of SciShow talks about a "map" of the brain, the human connectome, made with the highest-resolution MRI scanners.
http://www.youtube.com/watch?v=TxlV50P6NEI&hd=1&feature=html5

If you want to learn about learning algorithms of all kinds (if you want, even with a certificate at the end that you can actually use to get a job), and to experiment with them in fractal manners, check out one of the several courses on Coursera that deal with that subject, or alternatively go to Udacity, where they also have one:
https://www.coursera.org/category/cs-ai
http://www.udacity.com/

(Note: in the case of Coursera, you need to sign up for most of the courses. Only a few have a "preview" option - but the first one on the list does, and it's by the currently most successful AI guy out there. But I'm sure all those courses are nice.)


Title: Re: Hebbian neuron fractal?
Post by: Khaotik on August 12, 2012, 05:17:23 AM
Thanks kram1032, these courses are great.

" ... mathematical directed graphs do not need to be constraint by physical limitations like volume. ..."

Looks like current ANN models are concerned much more with network dynamics than with neuron topology and morphology. The most "fractalish" design I've seen is the SOM (self-organizing map), which can generate 2D Peano curves - see the sketch below. However, for a single neuron there has to be a rule so that it won't just grow into a sphere. Maybe neuron growth is somewhat like tree growth, forming a fractal structure in order to solve certain constrained optimization tasks.
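Here's roughly what I mean by the Peano-curve effect - a minimal 1D SOM on uniform 2D data; the parameter schedules below are just guesses that happen to work, nothing canonical:

import numpy as np

rng = np.random.default_rng(1)
n_units = 256                                    # a 1D chain of units
w = rng.uniform(0.45, 0.55, size=(n_units, 2))   # start bunched in the middle
idx = np.arange(n_units)
n_steps = 20000

for t in range(n_steps):
    frac = t / n_steps
    eta = 0.5 * (1 - frac) + 0.01                # decaying learning rate
    sigma = (n_units / 4) * (1 - frac) + 1.0     # decaying neighbourhood width
    x = rng.uniform(0, 1, size=2)                # sample a point in the unit square
    bmu = np.argmin(((w - x) ** 2).sum(axis=1))  # best-matching unit
    h = np.exp(-0.5 * ((idx - bmu) / sigma) ** 2)
    w += eta * h[:, None] * (x - w)              # drag the neighbourhood toward x

# plotting w[:, 0] against w[:, 1] as a polyline (in chain order) shows the
# 1D chain folding into a space-filling, Peano-curve-like pattern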


Title: Re: Hebbian neuron fractal?
Post by: kram1032 on August 18, 2012, 06:33:19 PM
You could probably combine neural networks with basic physics/intersection-simulation features to get some kind of spatial organization going - see the sketch below.
Combine that with some sort of genome to get evolving structures and you should be set.
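Something like this crude sketch, maybe - connected units attract like springs, all units repel so they actually take up volume; every constant here is an arbitrary pick, just enough to make it settle:

import numpy as np

rng = np.random.default_rng(2)
n = 100
adj = rng.random((n, n)) < 0.05     # random connectivity, standing in for a learned net
adj = adj | adj.T                   # treat connections as symmetric springs
np.fill_diagonal(adj, False)
pos = rng.normal(size=(n, 3))       # unit positions in 3D space

for _ in range(500):
    diff = pos[:, None, :] - pos[None, :, :]               # pairwise offsets
    dist = np.maximum(np.linalg.norm(diff, axis=-1), 0.2)  # softened distances
    repel = (diff / dist[..., None] ** 3).sum(axis=1)      # 1/r^2 repulsion
    attract = -(adj[..., None] * diff).sum(axis=1)         # linear springs on edges
    pos += 0.01 * (repel + attract)

# pos is now a volume-constrained embedding of the graph; mutating adj between
# runs with some genome-style encoding would give you the evolving structures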

However, my guess is that it would be much more interesting to somehow graph the internal thinking of learning algorithms: for instance, the convergence for one initial condition (Julia-style) or over the entire plane of initial conditions (Mandelbrot-style). For the very basic gradient descent, that would probably end up looking somewhat like a Newton fractal. If you find some datasets somewhere, you might get some interesting patterns; a dataset with 2 features would be ideal for 2D images. You could then plot the distance from the exact representation (say, the error for a given initial position relative to the closest datapoints, interpolated in some manner), or the time to reach a threshold error level (equivalent to bailout), or some kind of thing like that.
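A minimal sketch of the mandelbrot-style version, with the simplest possible "learner" - Newton's method on z^3 - 1 - standing in for gradient descent; resolution, iteration cap and threshold are arbitrary picks:

import numpy as np
import matplotlib.pyplot as plt

res = 800
x, y = np.meshgrid(np.linspace(-2, 2, res), np.linspace(-2, 2, res))
z = x + 1j * y                           # every pixel is an initial condition
steps = np.zeros(z.shape, dtype=int)

for i in range(50):
    err = np.abs(z**3 - 1)
    active = err > 1e-6                  # points that haven't converged yet
    steps[active] = i                    # record "time to reach threshold"
    z[active] -= (z[active]**3 - 1) / (3 * z[active]**2)   # Newton update

plt.imshow(steps, cmap="magma", extent=(-2, 2, -2, 2))
plt.title("steps until |z^3 - 1| < 1e-6")
plt.show()

Swapping the Newton update for an actual gradient descent step on some error function over a 2-feature dataset would give you the version I'm describing.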

You could also try to train a neural network to produce fractal images, by having it process images and rank them.

The possibilities are pretty much infinite, be creative :)
Or let a network be creative for you ;)