Topic: (it works!) Neural Network, Self Organizing Map, and Mandelbrot Set
ker2x « on: November 22, 2010, 02:32:10 PM »

Why? Because I can \o/

It's still an early preview, with too much copy/paste from GPL code.
But I promise to release the source as GPL really soon.

About SOMs: http://en.wikipedia.org/wiki/Self-organizing_map
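For anyone new to SOMs, here is a minimal sketch of a single training step, independent of NeuroDotNet (the 12x12 grid and the [0, 100] coordinate range match the app's defaults; every name here is illustrative, not the library's API; assumes `using System;`):

Code:
            // One SOM training step: find the node closest to a sample,
            // then pull all nodes toward the sample with a Gaussian falloff
            // over their *grid* distance to that winner.
            var rng = new Random();
            int w = 12, h = 12;                       // grid size (the app's default)
            double[,,] node = new double[w, h, 2];    // one 2-D weight per node
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                    for (int d = 0; d < 2; d++)
                        node[x, y, d] = rng.NextDouble() * 100.0;  // random init in [0, 100]

            double lr = 0.1, sigma = 3.0;             // learning rate, neighborhood radius
            double[] sample = { 25.0, 50.0 };         // one (scaled) training point

            // 1. find the best matching unit: the node closest to the sample
            int bx = 0, by = 0;
            double best = double.MaxValue;
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                {
                    double dx = node[x, y, 0] - sample[0];
                    double dy = node[x, y, 1] - sample[1];
                    double dist2 = dx * dx + dy * dy;
                    if (dist2 < best) { best = dist2; bx = x; by = y; }
                }

            // 2. pull every node toward the sample, weighted by a Gaussian
            //    neighborhood around the winner on the grid
            for (int x = 0; x < w; x++)
                for (int y = 0; y < h; y++)
                {
                    double g2 = (x - bx) * (x - bx) + (y - by) * (y - by);
                    double hn = Math.Exp(-g2 / (2.0 * sigma * sigma));
                    for (int d = 0; d < 2; d++)
                        node[x, y, d] += lr * hn * (sample[d] - node[x, y, d]);
                }

Repeating this for many random samples (while shrinking lr and sigma over time) is what makes the grid crawl onto the shape of the training set.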



[Attached image: mandelbrot-SOM.jpg]

often times... there are other approaches which are kinda crappy until you put them in the context of parallel machines
(en) http://www.blog-gpgpu.com/ , (fr) http://www.keru.org/ ,
Sysadmin & DBA @ http://www.over-blog.com/
ker2x « Reply #1 on: November 22, 2010, 03:20:55 PM »

The code to generate the Mandelbrot set edge:

Code:
            // Rejection-sample random points and keep only those near the edge
            // of the Mandelbrot set. mandelbrotPoints, mandelbrotEscapeRadius
            // (compared against |z|^2), mandelbrotMaxIteration, randomGenerator,
            // trainingSet and lineItem are fields of the surrounding class.
            double zr, zi, cr, ci, temp;
            int i = 0;

            while (i < mandelbrotPoints)
            {
                int iter = 0;
                zr = 0.0;
                zi = 0.0;
                // random c uniformly in [-2, 2] x [-2, 2]
                cr = randomGenerator.NextDouble() * 4.0 - 2.0;
                ci = randomGenerator.NextDouble() * 4.0 - 2.0;
                int intermediaire = 50;   // intermediate iteration threshold

                // iterate z = z^2 + c up to the intermediate threshold
                while ((iter < intermediaire) && ((zr * zr + zi * zi) < mandelbrotEscapeRadius))
                {
                    temp = zr * zi;
                    zr = zr * zr - zi * zi + cr;
                    zi = temp + temp + ci;
                    iter++;
                }
                if (iter == intermediaire)   // still bounded at 50: in the set so far
                {
                    // keep iterating, up to the maximum
                    while ((iter < mandelbrotMaxIteration) && ((zr * zr + zi * zi) < mandelbrotEscapeRadius))
                    {
                        temp = zr * zi;
                        zr = zr * zr - zi * zi + cr;
                        zi = temp + temp + ci;
                        iter++;
                    }
                    if (iter < mandelbrotMaxIteration)   // escaped between 50 and max: near the edge
                    {
                        // rescale from [-2, 2] to the SOM's [0, 100] coordinate range
                        trainingSet.Add(new TrainingSample(new double[] { (cr + 2.0) * 25.0, (ci + 2.0) * 25.0 }));
                        lineItem.AddPoint((cr + 2.0) * 25.0, (ci + 2.0) * 25.0);
                        i++;
                    }
                }
            }

Probably not optimal, but very simple. Explanation:

- If the point escapes within the first 50 iterations, it's not in the Mandelbrot set, so I skip it.
- If it hasn't escaped after 50 iterations, it's in the set so far.
- Then I continue up to the maximum iteration count (100 by default) and check whether it escapes.
- If it hasn't escaped at 50 and still hasn't escaped at 100, it could be anywhere inside the set, so I skip it.
- If it hasn't escaped at 50 but has escaped by 100, it's on the edge, so I keep it.
The Rev « Reply #2 on: November 22, 2010, 03:21:56 PM »

I have no idea what you're talking about, but it sounds really cool.

The Rev
ker2x « Reply #3 on: November 22, 2010, 03:31:31 PM »

Quote from: The Rev
I have no idea what you're talking about, but it sounds really cool.

LOL!
I guess it deserves some explanation; I'll try to explain later. I'm at work and too busy to write a good explanation right now.

I can answer the most obvious questions:
- Is it useful? Probably not.
- What's the goal? How can I use this in my fractal software? No idea! I just thought it was a cool thing to do...
The Rev « Reply #4 on: November 22, 2010, 04:08:43 PM »

If it is inspiring, then it's definitely useful!

The Rev
ker2x « Reply #5 on: November 22, 2010, 05:43:20 PM »

The network on the right tries to organize itself to imitate the training set on the left.
...
Flawless victory \o/



[Attached image: som-mandelbrot-flawless.jpg]
ker2x « Reply #6 on: November 22, 2010, 08:02:04 PM »

Here is the binary (Windows executable, "Any CPU"): http://fractals.s3.amazonaws.com/app/mandelbrotSOM.zip
Here is the binary + source (Visual Studio 2008 project): http://fractals.s3.amazonaws.com/app/mandelbrotSOM_src-included.zip

Beware, the source is mayhem:
- It's a modified sample application from NeuroDotNet; I will clean that up.
- I don't know how to correctly distribute a Visual Studio project. It may work or not... but the sources are definitely included.

It's GPL code, since the original sample app is GPL too.

There are a few bugs and missing features (e.g., it doesn't check that MinIter < MaxIter).

Howto:
- Launch the executable and click Start. The network on the right will try to organize itself to look like the training set on the left.
- Change the number of plots from 500 to 1000 and click "Generate Mandelbrot set". The training set will have more points (be more complete). Feel free to click Start again.
- Change Layer Width and Layer Height from 12 to 20. The network on the right will have more nodes. Click Start; it will be slower.
- Uncheck Show Connections: the edges between nodes will no longer be displayed, and computation will be faster.
- Check Show Only Winner: nodes that are not close to the training set will not be displayed; that is faster too.
- Now the map will look like the training set (see the picture in the previous post).

- You can also change the min iteration to 500 and the max iteration to 1000, then click "Generate Mandelbrot set" (it may take some time, and the app will hang while computing the training set). The training set will be finer.

Feel free to play with the other options.
If you play with the layer shape, topology, or neighborhood function, you should enable Show Connections and disable Show Only Winner to understand what each one does; a sketch of the Gaussian case follows below.
If you play with the learning rate and training cycles, I suggest reading up on neural networks and self-organizing maps first.
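For reference, the Gaussian case of a neighborhood function looks roughly like this (an illustrative sketch, not NeuroDotNet's actual API):

Code:
            // Strength with which a node at grid distance d from the winning
            // node is pulled toward the current sample. sigma is the
            // neighborhood radius, which typically shrinks during training.
            double Neighborhood(double d, double sigma)
            {
                return Math.Exp(-(d * d) / (2.0 * sigma * sigma));
            }

A wide sigma drags large regions of the map together (smooth but blobby); a narrow one lets individual nodes settle onto fine details of the edge.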

Enjoy! Feel free to report bugs or post suggestions.


[Attached image: SOMmandelbrot-release.jpg]
ker2x « Reply #7 on: November 22, 2010, 08:10:02 PM »

You will find some documentation about SOM and NeuroDotNet (the library used for this app) here: http://neurondotnet.freehostia.com/manual/index.html
ker2x « Reply #8 on: November 22, 2010, 09:41:14 PM »

If you want to see a "smoother" map transformation, change the learning rate to 0.0001.
cbuchner1 « Reply #9 on: November 23, 2010, 12:03:24 AM »

Now what's the use of such a neural net, once it is trained?

Could it be used to recognize patterns, such as finding minibulbs in the M-set?

Christian
ker2x « Reply #10 on: November 23, 2010, 10:23:44 AM »

Quote from: cbuchner1
Now what's the use of such a neural net, once it is trained?

Dunno.

Quote from: cbuchner1
Could it be used to recognize patterns, such as finding minibulbs in the M-set?

It could probably be done with a (multilayer) perceptron: http://en.wikipedia.org/wiki/Perceptron
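To make that concrete, a minimal single perceptron might look like this (entirely hypothetical, not part of the app; a single unit can only learn a linear boundary over its inputs, which is exactly why the M-set would need the multilayer version with richer features):

Code:
            // One perceptron classifying a point c = (cr, ci) as +1 / -1.
            // Raw coordinates alone cannot separate the M-set; this only
            // illustrates the learning rule itself.
            double w0, w1, w2;   // bias and input weights

            int Predict(double cr, double ci)
            {
                return (w0 + w1 * cr + w2 * ci) >= 0.0 ? 1 : -1;
            }

            void Train(double cr, double ci, int target, double lr)
            {
                int y = Predict(cr, ci);
                if (y != target)
                {
                    // classic perceptron rule: shift the boundary toward the mistake
                    w0 += lr * (target - y);
                    w1 += lr * (target - y) * cr;
                    w2 += lr * (target - y) * ci;
                }
            }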
Tglad « Reply #11 on: November 23, 2010, 12:09:30 PM »

I don't see how any feed-forward neural net could do better than the polynomial lemniscates, which are already a well-defined approximation:
http://mathworld.wolfram.com/MandelbrotSetLemniscate.html
A recurrent neural net might be capable of something cleverer.
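(For reference, the lemniscates in question are the level curves of the iterates $p_0(c) = c$, $p_{n+1}(c) = p_n(c)^2 + c$: the $n$-th lemniscate is the set $\{\, c : |p_n(c)| = 2 \,\}$, and these curves close in on the Mandelbrot set boundary as $n$ grows.)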
ker2x « Reply #12 on: November 23, 2010, 07:27:13 PM »

Quote from: Tglad
I don't see how any feed-forward neural net could do better than the polynomial lemniscates, which are already a well-defined approximation:
http://mathworld.wolfram.com/MandelbrotSetLemniscate.html
A recurrent neural net might be capable of something cleverer.

You're right. Totally right.