Welcome to Fractal Forums

Fractal Software => Programming => Topic started by: nessalc on September 16, 2013, 08:38:03 PM




Title: Mandelbrot "Adaptive" Escape Limit?
Post by: nessalc on September 16, 2013, 08:38:03 PM
Having written probably dozens of Mandelbrot renderers over the years (boredom, new programming languages to try, etc.), one thing has bugged me: the persistent use of a constant escape limit (e.g. 100-10000+ iterations). I'm familiar with the tests for the main cardioid and 0/1 Bulb, but is there some better way to determine, say, based on an image size or zoom factor, how many iterations I ought to use? Is there some "well-known" formula for testing chunks of other bulbs (since they're not strictly circular, I know a complete formula is unrealistic) that I have somehow missed? Basically, I don't want my program wasting precious cycles iterating thousands, or possibly millions of times when there's some better way--my favorite images are typically right along the border, and often feature some decent chunk of the image devoted to points within the Mandelbrot set.
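For reference, the interior tests mentioned above can be sketched as follows. This is a minimal illustration, not anyone's production renderer; the function names and `max_iter` parameter are just for the example. The cardioid and period-2 bulb formulas are the standard published ones.

```python
def in_main_cardioid(x: float, y: float) -> bool:
    """True if c = x + iy lies inside the main cardioid."""
    q = (x - 0.25) ** 2 + y * y
    return q * (q + (x - 0.25)) <= 0.25 * y * y

def in_period2_bulb(x: float, y: float) -> bool:
    """True if c = x + iy lies inside the period-2 bulb
    (the circle of radius 1/4 centered at -1)."""
    return (x + 1.0) ** 2 + y * y <= 0.0625

def escape_count(x: float, y: float, max_iter: int) -> int:
    """Iterations until |z| > 2, or max_iter if no escape was detected."""
    if in_main_cardioid(x, y) or in_period2_bulb(x, y):
        return max_iter  # known interior point: skip iteration entirely
    zx = zy = 0.0
    for n in range(max_iter):
        zx, zy = zx * zx - zy * zy + x, 2.0 * zx * zy + y
        if zx * zx + zy * zy > 4.0:
            return n
    return max_iter
```

The two tests together cover a large fraction of the set's area, which is why they pay off so well on border-heavy images.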

I'm familiar with Mariani/Silver and successive refinement. I've tried a boundary-scanning method, but always run into memory issues. Is there some website that has pseudocode describing a non-recursive implementation that I might be able to utilize (if, that is, this is an appropriate method for me)?
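For what it's worth, the recursion in Mariani/Silver can be replaced with an explicit stack of rectangles, which avoids both deep call stacks and the memory blow-ups that plague some boundary-scanning approaches. The sketch below is a generic illustration, assuming `iterate(px, py)` is any per-pixel escape-time function; all names and the `min_size` cutoff are invented for the example.

```python
def mariani_silver(width, height, iterate, min_size=8):
    """Non-recursive Mariani-Silver: fill a grid with iteration counts,
    subdividing any rectangle whose border is not a single uniform value."""
    img = [[None] * width for _ in range(height)]

    def count(px, py):
        # Memoize per-pixel results so shared rectangle edges cost nothing.
        if img[py][px] is None:
            img[py][px] = iterate(px, py)
        return img[py][px]

    stack = [(0, 0, width, height)]  # (x, y, w, h) rectangles to process
    while stack:
        x, y, w, h = stack.pop()
        border = (
            [(i, y) for i in range(x, x + w)] +
            [(i, y + h - 1) for i in range(x, x + w)] +
            [(x, j) for j in range(y, y + h)] +
            [(x + w - 1, j) for j in range(y, y + h)]
        )
        values = {count(px, py) for px, py in border}
        if len(values) == 1:
            v = values.pop()  # uniform border: flood-fill the interior
            for j in range(y, y + h):
                for i in range(x, x + w):
                    img[j][i] = v
        elif w <= min_size or h <= min_size:
            for j in range(y, y + h):  # too small to split: brute force
                for i in range(x, x + w):
                    count(i, j)
        else:  # split the longer side in two and push both halves
            if w >= h:
                stack.append((x, y, w // 2, h))
                stack.append((x + w // 2, y, w - w // 2, h))
            else:
                stack.append((x, y, w, h // 2))
                stack.append((x, y + h // 2, w, h - h // 2))
    return img
```

Memory stays proportional to the stack depth (logarithmic in image size) plus the image buffer itself.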

Thanks!

J


Title: Re: Mandelbrot "Adaptive" Escape Limit?
Post by: lkmitch on September 16, 2013, 09:54:16 PM
In this work,

http://www.kerrymitchellart.com/articles/area/mandelbrot-area.html

I used a series of masks to greatly reduce the amount of area that actually had to be computed.  The concept is fairly straightforward, but the implementation takes a bit of effort--the centers of the circles are given by roots of complex polynomials, as are the tangency points (and thus, the radii of the masks).  So your problem turns from doing simple iterations to doing much more complicated root-solving.  Or, you could use a lookup table, but the number of components grows so fast that this would not be practical for a general renderer.  Fairly quickly, the overhead of finding the components that can easily be approximated by circles overwhelms the actual iteration.


Title: Re: Mandelbrot "Adaptive" Escape Limit?
Post by: nessalc on September 16, 2013, 10:14:01 PM
Aha! Thanks! At a glance, this seems to be exactly what I was looking for!
 :D


Title: Re: Mandelbrot "Adaptive" Escape Limit?
Post by: DustyMonkey on September 21, 2013, 04:46:48 PM
As far as deciding on a maxiter value goes, I've always felt, from a strictly purist point of view, that having a maxiter at all is simply the wrong way to go about it. "maxiter" is a completely invented notion that serves no purpose but convenience.

The standard implementation is:

for each pixel of image {
do { pixel.z = pixel.z*pixel.z+pixel.c } until pixel.z escapes or arbitrary maxiter reached
}

...but in actuality there is no reason it shouldn't instead be:

do {
for each unescaped pixel of image { pixel.z = pixel.z*pixel.z+pixel.c }
} until the effort exceeds the subjective gains
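That inverted loop can be made concrete along these lines. This is only a sketch: the function and parameter names are invented, and the stopping rule is deliberately left as a pluggable predicate rather than any fixed maxiter, which is the whole point of the restructuring.

```python
def render_by_passes(points, keep_going):
    """points: list of complex c values.
    keep_going(pass_no, escaped_count): decides whether to do another pass.
    Returns the pass number at which each point escaped (None = not yet)."""
    z = [0j] * len(points)
    escape = [None] * len(points)
    active = list(range(len(points)))  # indices of unescaped points
    t = 0
    while active and keep_going(t, len(points) - len(active)):
        t += 1
        still = []
        for i in active:
            z[i] = z[i] * z[i] + points[i]  # one iteration per active point
            if abs(z[i]) > 2.0:
                escape[i] = t
            else:
                still.append(i)
        active = still
    return escape
```

A side benefit of iterating pass-by-pass is that the image can be displayed progressively, with every pixel at the same iteration depth on each refresh.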

When we gauge the cost of the effort as infinitesimal and the benefit of the gains as infinite ("perfectionist"), the perfect heuristic is "until the number of escaped pixels will no longer change". In practice, of course, the costs are not infinitesimal and the gains are not infinite, but now that it's in the form of a cost/benefit analysis, we can decide on a simple universal heuristic where we won't waste any more time trying to correct just a small number of deviations from perfection.

Let's call that ideal "perfectionist" number of escaped pixels N, and the number of escaped pixels the algorithm has seen on any given pass E(T).
We of course don't know N for any given render, but we can observe E(T) over time while rendering and thus collect statistics. We can make projections about where E(T + DT) will likely be, compute confidence intervals, and so on. It's suddenly all very formal, and the only parameters are the actual subjective quantities themselves rather than a seat-of-the-pants maxiter: specifically, the number of deviations from N that we will tolerate, and the confidence level that we are within that margin.
We of course don't know N for any given render, but we can observe E(T) over time while rendering and can thus collect statistics. We can make projections about where E(T + DT) will likely be, compute confidence intervals, and so on. Its suddenly all very formal and the only parameters can be the actual subjective quantities themselves rather than that seat-of-the-pants maxiter, quite specifically the number of deviations from N that we will tolerate and the confidence level that we are within that margin.