Author Topic: Antialiasing fractals - how best to do it?  (Read 8622 times)
Duncan C
« on: March 06, 2008, 04:55:59 PM »

I've got my app, FractalWorks, rendering 3D height maps now, and am overall pretty pleased with the results. Directional lighting and specular highlights add a lot to the look of the plots. I've posted a few of the images to the samples gallery.

Here is a sample image for those that haven't seen the other post:


(click the picture to see the much larger "original" version on pbase.)

I'm using the distance estimate method (DEM) as the basis of my height values (inverted, log-scaled, stretched, etc.).
Very close to the set, the 3D texture of the height map changes so rapidly that it doesn't look good.

Simple supersampling doesn't work for fractals because there is an infinite amount of detail.

What approaches have others used to get smooth lines and cleaner textures? I should be able to apply an approach used for antialiasing 2D plots to my 3D plots.

I'm thinking of calculating a weighted average of the current pixel and its neighbors in a 3x3 grid:
((the sum of the 8 neighboring pixels / 8) + the current pixel) / 2.
Or I could do a simple average of all 9 pixels in a 3x3 grid centered on the current pixel. Either approach would tend to smooth out the "spiky" nature of the areas closest to the Mandelbrot/Julia set.
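A minimal sketch of the 50/50 weighted scheme described above, assuming the height map is a plain 2D list of floats (the function name and border handling are my own choices, not from the post):

```python
def smooth_heightmap(h):
    """Blend each interior cell 50/50 with the mean of its 8 neighbors.

    h: 2D list of floats (the height map). Border cells are left unchanged
    for simplicity; a real renderer would clamp or mirror at the edges.
    """
    rows, cols = len(h), len(h[0])
    out = [row[:] for row in h]  # copy so the input map is untouched
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            neighbors = sum(
                h[y + dy][x + dx]
                for dy in (-1, 0, 1)
                for dx in (-1, 0, 1)
                if (dy, dx) != (0, 0)
            )
            # ((sum of 8 neighbors / 8) + current pixel) / 2
            out[y][x] = (neighbors / 8.0 + h[y][x]) / 2.0
    return out
```

The simple 9-pixel average is the same loop with `(neighbors + h[y][x]) / 9.0` instead; the weighted form preserves more of the center pixel's height.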

Does anybody else have a good suggestion as to how to do this?

Here's a sample image that shows the effect I want to avoid:

This is my attempt to duplicate the cover of the book "The Beauty of Fractals":



Note how the areas in the peaks closest to the baby Mandelbrot are too chaotic (funny, that!) to make much sense visually.


Duncan

Regards,

Duncan C
cKleinhuis (formerly known as 'Trifox')
« Reply #1 on: March 06, 2008, 05:20:33 PM »

I am using a Gaussian blur filter to smooth out my details; at very high resolutions this leads to very good results:
http://en.wikipedia.org/wiki/Gaussian_blur

But all in all it smooths away too much of my detail. I haven't tried a selective filter yet.

Since I am using this method for animations, the chaos gets even worse. To make smooth animations I would have to apply the blur filter in the time direction as well, that is, take the previous and next blursize images into account when smoothing. But I have not done that yet, because it would have strange effects on my movies: it would look like motion blur, and that is not the result I want to achieve.
« Last Edit: March 06, 2008, 05:25:52 PM by Trifox »

---

divide and conquer - iterate and rule - chaos is No random!
Duncan C
« Reply #2 on: March 07, 2008, 01:36:37 AM »

Quote from: lycium
duncanc, you've hit the nail on the head: what you want to do is antialias your heightmap. Blurring is something completely different.

A very simple and effective way to antialias the heightmap is via supersampling: instead of using just 1 sample per heightmap value, divide each "region" of your M x N map into K x K subregions (effectively enlarging it by a factor of K in each dimension, though you don't store all those values), which you evaluate and average to produce your final heightmap value. This is a simple way to bandlimit your signal, discarding all high-frequency data that cannot be represented at the current sampling frequency (i.e. above the Nyquist limit).

trifox, this method will give you much better results than blurring.

lycium,

My app is written to render a 2D fractal, then create a 3D view of that fractal, with one height map (distance estimate) value for each pixel. The easiest thing for me to do will be to create a 2D fractal that's a multiple of the size of the target 3D mesh, then average the height values to create a smaller mesh.

Do you think a 4x average would be enough to get significant antialiasing? If not, how many extra samples would you suggest? And should the K x K subregions overlap between points, or should they tile?
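The K x K tiling scheme under discussion can be sketched like this; `height_fn` is a hypothetical stand-in for whatever per-point height evaluation (e.g. the DEM value) the renderer uses, and sub-cell centers are used rather than a jittered grid:

```python
def supersample(height_fn, width, height, k):
    """Antialias by supersampling: each output cell averages k*k
    evaluations of height_fn taken at the centers of k x k subregions.

    height_fn(x, y): height at image-plane coordinates in [0, 1).
    Returns a height-by-width 2D list of averaged values.
    """
    out = []
    for j in range(height):
        row = []
        for i in range(width):
            total = 0.0
            for sj in range(k):
                for si in range(k):
                    # center of subregion (si, sj) inside cell (i, j)
                    x = (i + (si + 0.5) / k) / width
                    y = (j + (sj + 0.5) / k) / height
                    total += height_fn(x, y)
            row.append(total / (k * k))
        out.append(row)
    return out
```

With k = 2 this is the "4x average" asked about above (4 samples per cell); k = 4 gives 16 samples, and cost grows as k squared.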


Duncan
cKleinhuis (formerly known as 'Trifox')
« Reply #3 on: March 07, 2008, 03:28:30 PM »

I also use supersampling, with random points scattered over the subgrid area, and I smooth it afterwards as well.

Even without the blurring it significantly increases the image quality, but for 3D ...

I have some more questions. I have heard about the Nyquist frequency, but where does it come into play here?

The high-frequency noise increases towards the Mandelbrot area; how can one possibly determine a good subsampling for these areas? You said a 3x3 raster serves in most cases, but I am not satisfied here. Basically it comes down to limiting the iteration depth?

Iteration depth = an indicator for frequencies?

Wouldn't it be possible to use this as a factor for smoothing, e.g. make a histogram over the occurring iteration depths?

Just thoughts ...

Another thing to mention: what kind of grid do you use?

Have you experimented with triangle grids? Or even hexagonal grids?
http://mathworld.wolfram.com/TriangleTiling.html

I thought this could improve the smoothness of the 3D grid ...
cKleinhuis (formerly known as 'Trifox')
« Reply #4 on: March 12, 2008, 10:18:02 AM »

Yes, the current version of MutatorKammer has the oversample feature, which is also used when calculating pixels for heightmaps. As mentioned before, I use a randomized grid as the base, so with 8x oversampling, 64 samples are taken into account for one resulting pixel.

I need the blurring for other things, because no matter how well you do the oversampling, you will always get high peaks in some areas, e.g. the minibrots in some spiral arms of the Mandelbrot ...

Because of this, rendering a single image takes me more than a minute ;( But I always calculate everything on the fly: the antialiased heightmap, the blurring, and then the rendering. Once that is done, I use OpenGL buffers to render the scene, so one can move the camera around ...

But you are right, the blurring costs me far too much, because of the continuity at the borders ...
cKleinhuis (formerly known as 'Trifox')
« Reply #5 on: March 12, 2008, 11:09:00 AM »

1. Huh? This does not sound good to me. Are you talking about the black hole in the middle of the Mandelbrot?

2. Don't worry, I am using the Gaussian trick of reducing O(n^2) to O(n) (separable filtering).

3. Any links for the constant-time version?

But the logarithm compression sounds new to me; I have not thought about it. All in all, I am using an HDR method already, but only for creating the color ranges ...
twinbee
« Reply #6 on: September 24, 2008, 07:20:21 PM »

An old topic, but an interesting one.

I've recently found that for some fractals with sharply contrasted edges, one needs to sample at 16x16 times the desired resolution. This makes it 256 times slower than usual, but the results are definitely worth it. It makes such a difference to the noise level in very detailed areas.
HPDZ
« Reply #7 on: December 18, 2008, 04:51:05 PM »

This is an area I too am very interested in, although for the kind of animations I do, I can't afford the 4X or 16X or 256X rendering time increase that anti-aliasing imposes.

First, I think the image on the cover of the book clips the height at some upper limit... if you look closely, around the little mini-brot everything is flat. I think they do something like if (count > max) count = max, which would obviously remove a lot of the jaggedness.

For anti-aliasing, I think the most important thing is that since the fractal does indeed have infinite spatial detail, there's no perfect way to do it. Some ways are better or faster than others, but none is perfect. There will always be unfilterable noise above the Nyquist frequency (as determined by the size of the sampling grid of your image pixels) in this situation.

When I do use AA, what I do, like everyone else, is supersample and apply a filter (average, windowed average, or median filter) to the supersampled data to generate the final count value for a pixel. I filter the actual count values before they are converted to colors rather than trying to filter the image colors (a much harder problem).

Some variations I've tried:

  • Pseudo-Poisson grid With areas of extreme detail, this can help reduce moire somewhat. The technique is to divide each image pixel into an NxN grid, but instead of calculating the fractal values at the center of each grid box, choose a random location within each grid box. This randomness spreads out the spatial frequencies that would turn into moire and turns them into random noise instead. The Poisson grid is the theoretically ideal way to do this, but it's a hassle to generate a perfect Poisson grid and I've found that the pseudo-Poisson grid is good enough.
  • Selective supersampling The idea here is to select the noisiest pixels and supersample only them. There are a million variations on this, but the basic idea is to compare each pixel to its neighbors, and if it's different enough, then it's a candidate for further oversampling. This can reduce the number of pixels that need oversampling quite a bit.
  • Different filter types Currently I use a median filter to convert the supersampled data to the final image count value. It seems a little "better" (very subjective) than a simple average, especially in areas near the edge of the set, where there is a lot of detail. I've tried averaging and various windowing schemes, and the difference is too subtle to matter much.
  • The Main Thing The more points in the supersampling grid, the better. I almost never go beyond 4x4, but there is a noticeable difference in some images between 4x4 and 16x16. The finer the grid, the more the high-frequency noise gets filtered and the cleaner the image will look. This will affect the final result far more than the details of which type of filter you apply to the supersampled data (for any reasonable filter).
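The pseudo-Poisson (jittered) grid and the filtering step from the list above can be sketched as follows. This is my own illustration, not HPDZ's code; the function names are hypothetical, and `statistics.median` stands in for whatever median routine the renderer uses:

```python
import random
import statistics

def jittered_samples(n, rng=None):
    """Return n*n sample positions inside a unit pixel: one random point
    per sub-cell of an n x n grid (the 'pseudo-Poisson' jitter above).
    Constraining each point to its sub-cell keeps samples roughly evenly
    spread, unlike picking n*n fully independent random points.
    """
    rng = rng or random.Random()
    return [((i + rng.random()) / n, (j + rng.random()) / n)
            for j in range(n) for i in range(n)]

def filter_pixel(samples, mode="median"):
    """Collapse a list of supersampled count values to one pixel value."""
    if mode == "median":
        return statistics.median(samples)
    return sum(samples) / len(samples)  # plain average
```

The median is more robust to the occasional extreme count (a "spike" sample landing near the set) than the mean, which matches the subjective difference reported above.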

Something else to consider is to apply a filter to the final image itself instead of or in addition to oversampling. This can remove a lot of sparkle noise but also can remove a lot of detail.

Another thing that I haven't tried (but I think is done in many animations) is to undersample and interpolate. That is, calculate image points on, say, a 160x120 grid, then use something like bilinear interpolation (or bicubic, or Lanczos -- pick your favorite method) to get a 640x480 image. The undersampling is kind of like applying a low-pass filter, so you get less noise in the final image.
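The undersample-and-interpolate idea can be sketched with bilinear interpolation on a plain 2D grid of count values (a minimal illustration of the technique, not code from the post; bicubic or Lanczos would follow the same shape with wider kernels):

```python
def bilinear_upscale(img, factor):
    """Upscale a 2D grid of values by an integer factor using
    bilinear interpolation between the four nearest source cells.
    """
    rows, cols = len(img), len(img[0])
    out = []
    for j in range(rows * factor):
        # map output row back into source coordinates, clamped at the edge
        sy = min(j / factor, rows - 1)
        y0 = int(sy)
        y1 = min(y0 + 1, rows - 1)
        fy = sy - y0
        row = []
        for i in range(cols * factor):
            sx = min(i / factor, cols - 1)
            x0 = int(sx)
            x1 = min(x0 + 1, cols - 1)
            fx = sx - x0
            # blend horizontally on both source rows, then vertically
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out
```

For the 160x120 to 640x480 example above, `factor` would be 4; the interpolation acts as a crude low-pass filter, trading fine detail for less noise.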
« Last Edit: December 29, 2008, 06:28:18 PM by HPDZ, Reason: Moire is not a name and should not be capitalized! »

Zoom deeply.
www.hpdz.net
twinbee
« Reply #8 on: June 02, 2009, 03:28:57 AM »

Quote
Currently I use a median filter to convert the supersampled data to the final image count value. It seems a little "better" (very subjective) than a simple average,
One way to compare the two types is to render a very highly anti-aliased image (say oversampled at 32x32) and use it as a reference for the other two. You'd measure the differences (perhaps something like abs(red1-red2) + abs(green1-green2) + abs(blue1-blue2) per pixel) and see which picture was more different from the near-perfect oversampled one.
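The per-pixel metric suggested above amounts to a sum of absolute differences; a minimal sketch, assuming both images are 2D lists of (r, g, b) tuples of equal size:

```python
def image_difference(a, b):
    """Total abs(r1-r2) + abs(g1-g2) + abs(b1-b2) over all pixels.

    a, b: 2D lists of (r, g, b) tuples with identical dimensions.
    A smaller total means the image is closer to the reference render.
    """
    total = 0
    for row_a, row_b in zip(a, b):
        for (r1, g1, b1), (r2, g2, b2) in zip(row_a, row_b):
            total += abs(r1 - r2) + abs(g1 - g2) + abs(b1 - b2)
    return total
```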

Quote
Pseudo-Poisson grid
Isn't that just the Monte Carlo method?
HPDZ
« Reply #9 on: June 02, 2009, 03:43:56 AM »

Well, about the first thing, how to compare the two: you'd still have to choose a filtering method even for the 32x32 oversampling. You would think that as the number of oversampling points gets bigger maybe the details of the filter wouldn't matter as much, but it's not clear to me that's necessarily true.

About the second question: I think "Monte Carlo" is not a precisely defined term. True, the Poisson Grid is a random method, but it's not quite the same as just picking N points independently within the pixel to be supersampled. Doing that leaves each point's location unconstrained, while the pseudo-Poisson grid keeps them sort of evenly spaced from each other.
twinbee
« Reply #10 on: June 03, 2009, 12:18:23 AM »

Quote
Well, about the first thing, how to compare the two: you'd still have to choose a filtering method even for the 32x32 oversampling. You would think that as the number of oversampling points gets bigger maybe the details of the filter wouldn't matter as much, but it's not clear to me that's necessarily true.

Well then just use 64x64, as I'm willing to bet that would beat 32x32 no matter what the filtering type.
At 64x64, the noise is so, so small that it really should provide a practically ideal yardstick image. If it's still not enough, then of course there's 128x128 oversampling. Each one is about 4 times as accurate as the last, so when a comparison always produces consistent results (where filtering type A always beats filtering type B according to the yardstick comparison), you know the yardstick is good enough.

Unless I'm mistaken, this seems like a great way to quantitatively compare filtering types.

Quote
You would think that as the number of oversampling points gets bigger maybe the details of the filter wouldn't matter as much, but it's not clear to me that's necessarily true.

Hmm... I very much would think so. The differences would surely get smaller and smaller, converging to no difference with the super-high oversampling versions. Even the 16x version is almost perfect in the last pic from this thread.
« Last Edit: June 03, 2009, 12:34:00 AM by twinbee »
cKleinhuis (formerly known as 'Trifox')
« Reply #11 on: June 03, 2009, 12:52:17 AM »

Ehrm, you have to take the result into account. Are you really proposing that using a 128x128 sub-image to calculate one tiny pixel in your resulting image is considerably different from a 64x64 sub-image covering the same area?!

When talking about such big sub-images, you very quickly hit visible limitations (considering an RGB pixel with 8 bits per channel), because the differences become very small very fast. My experience is that a 4x4 sub-pixel grid (16x calculation time!) leads to very good results.

... had to say something
twinbee
« Reply #12 on: June 03, 2009, 01:22:06 AM »

That's what I'm saying, yes: at such massive resolutions the quality is so good that going any deeper just doesn't make any difference, no matter what the filtering algorithm is. I was just making the point that one could go to deeper resolutions if the filtering algorithms actually give noticeably different results (which I bet wouldn't be the case beyond, say, 16x16 or 32x32).
« Last Edit: June 03, 2009, 01:24:20 AM by twinbee »
HPDZ
« Reply #13 on: September 30, 2009, 03:36:17 AM »

Well, the forum is recommending I start a new topic since nobody's discussed this in over 90 days. That's probably because everyone's been waiting for their 256x256 oversampled test images to render...ha ha

I did in fact make two test images with 256x256 oversampled images ... yes, that is 65536 samples per image pixel! ... one with median filtering, and one with mean filtering. I also did this at 16x16 and 32x32.

I'll start a new thread and post the images to my gallery. They will be JPG images, to keep the sizes reasonable; unfortunately, that means a lot of the detail that differentiates the mean filter result from the median filter result is obscured by the JPG compression. So I have put the original uncompressed BMP files on www.hpdz.net. The link is in the new topic thread.
HPDZ
« Reply #14 on: September 30, 2009, 04:13:59 AM »

BAH! Right after I clicked "Post" on that last message, I decided to just go ahead and keep the thread going as it is.

So here's the thing: I believe one of these two methods is superior. I won't bias anyone more than I already have (review the thread) by saying which one I think is superior, but I think if you look closely at even the JPG images, with all their artifacts, you can tell. Download the BMP images (almost 3 MB each!) if you really want to scrutinize them.

I further believe that the difference in filtering methods persists even at huge oversampling levels like the 256x256 I have done here. One of these methods is just inherently not suited to dealing with the kind of skewed, non-Gaussian noise that we have here (more data on that is coming), and it doesn't matter how much oversampling you do. No matter how large the oversampling is, these two methods do NOT converge to a common "perfect" image as our intuition might lead us to believe. The filtering method definitely matters, even at extreme levels of oversampling like this.

When comparing these test images, don't be distracted by the slightly different coloring; these images were both colorized by the same method, but this method uses the distribution of fractal count data to generate the color map, and since the different filtering techniques generate slightly different count distributions in the final, filtered images, the colorings are slightly different. The important thing is to compare the level of detail between the two. Not in the very central white spot, which is overwhelmed by moire. Check out the peripheral areas to see where more fine structure is evident.

The median filtered images do take longer to render. It is easier to add a whole bunch of elements in a list than it is to find the median element in that list. As the list gets longer, this problem gets larger. It took about 95 hours to render the median filtered image and only about 35 hours for the mean filtered one.

I will post the 16x16 and 32x32 images later. I also want to generate histograms of the two 256x256 images, and maybe try a 2D Fourier transform to see if the obvious visual noise shows up in a power spectrum. And of course, comparing the 16x16 and 32x32 images to the 256x256 images will be helpful too. If there isn't a huge difference between the lower degrees of oversampling and the extreme oversampling, it may not be worth going too crazy with this. These things typically obey the 80-20 rule, since some kind of relationship like Performance = sqrt(Effort) typically shows up somewhere.



This is the mean filtered oversampled image. The raw BMP file is at http://www.hpdz.net/images/TechPics/AA3-256x256-Mean.bmp


This is the median filtered oversampled image. The raw BMP file is at http://www.hpdz.net/images/TechPics/AA3-256x256-Median.bmp
« Last Edit: September 30, 2009, 04:16:49 AM by HPDZ, Reason: typo »

Zoom deeply.
www.hpdz.net