
Author Topic: Buddhabrot reinvented  (Read 2546 times)
cbuchner1
« Reply #15 on: July 28, 2010, 02:59:40 PM »

Quote:
I release this idea with an open source licence with share-alike attribute :D

what, no attribution attribute?
richardrosenman
« Reply #16 on: August 07, 2010, 08:54:20 AM »

Hey guys;

First off, I am tremendously impressed with this spectral mapping technique and the results it yields. Congratulations for such great work!

I have been attempting to implement this on my own as well based on the information provided but I seem to be having some problems:



I have successfully mapped the light spectrum using the proper algorithm over wavelengths 350-780 nm and converted them to RGB. You can see this at the bottom of the image, so I think I have that part correct. However, when it comes to mapping it to the Buddhabrot, I seem to be stuck.

On the left you can see the results with the Buddhabrot technique and on the right with the Nebulabrot. Clearly, none of them come even close to a decent result.

Basically, I am defining the wavelength using the following algorithm:

wavelength = 350 + (float)(430.0*((red+green+blue)/3.0));

The red, green and blue values are the Buddhabrot / Nebulabrot densities per color channel, averaged together and scaled between 0-1. Right now, for simplicity's sake, I am trying to achieve a good result with the simpler Buddhabrot technique rather than the Nebulabrot, which is why I am averaging them into a single result. Then I start the wavelength at 350 and add 430*(0...1), giving a total range of 350-780, the full visible spectrum.

I then go on to calculate the respective RGB value for that particular wavelength, which should also be in the range of 0-1. Finally, I multiply each channel by the corresponding spectral value, which should give the correct result. Something like this:

red=red*abs(rgbwave_red);
green=green*abs(rgbwave_green);
blue=blue*abs(rgbwave_blue);

If you look closely at the images, you'll also notice black speckles, which leads me to believe some values are going out of range despite my attempts to keep them within bounds.

So any thoughts on where I'm going wrong?

-Richard

kram1032
« Reply #17 on: August 07, 2010, 12:04:08 PM »

Not sure, I guess your conversion from spectrum to RGB isn't quite correct...

http://www.fourmilab.ch/documents/specrend/ <- look there. You can find a C file that converts spectral data into RGB values. Maybe it's of use.
The implementation of cbuchner used a converter written in Fortran... The RGB stimuli were pretty sharp. I guess it was easier to handle (not sure^^), but the CIE implementation in that C file is probably better suited if you can use it.

Your mapping doesn't look bad though :)
cbuchner1
« Reply #18 on: August 07, 2010, 01:37:26 PM »

I map short wavelengths (blue, violet) to short orbits. It seems to be
different in your version because the area outside the Buddhabrot appears red.
For me this area is entirely violet...

Note: The total orbit length (iteration count) defines the color of a pixel, not
the current iteration in the orbit. So an entire escape orbit adds the same color
contribution. I tried both variations, but the second one looked boring.

Quote from: richardrosenman
wavelength = 350 + (float)(430.0*((red+green+blue)/3.0));

I cannot really understand what you're doing with this mapping. Instead of
(red+green+blue)/3.0 it should say "orbit length / Max_Iterations",
and the resulting wavelength THEN maps to red, green, blue color
contributions.
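In code, the corrected mapping is just this (a minimal sketch; `orbit_len` and `max_iter` are illustrative names, not identifiers from my actual program):

```c
/* Normalized orbit length -> wavelength. Short orbits get short
   wavelengths (violet), long orbits tend towards red, covering the
   visible range 350...780 nm discussed in this thread. */
static float wavelength_for_orbit(int orbit_len, int max_iter)
{
    float t = (float)orbit_len / (float)max_iter;  /* 0...1 */
    return 350.0f + 430.0f * t;                    /* 350...780 nm */
}
```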

That color contribution is then added to the accumulation buffer. Because my buffer uses
integers of very limited range (~20 bits), I had to use a trick: I add the integer 1 to a
color channel if a uniformly distributed random number between 0 and 1 falls into the
range of that channel's contribution (which is also somewhere between 0 and 1).
If you use floating point buffers, or integers of at least 32 bits per color channel,
there is no problem.
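The random-number trick can be sketched like this (a hypothetical helper on the CPU; the real code runs on the GPU):

```c
#include <stdint.h>
#include <stdlib.h>

/* Dithered accumulation: add 1 to a low-precision integer channel with
   probability equal to the contribution c in [0,1]. Over many orbits the
   channel's expected value is c times the number of additions, so even a
   ~20-bit integer buffer can represent fractional color contributions. */
static void dither_add(uint32_t *channel, float c)
{
    float u = (float)rand() / (float)RAND_MAX;  /* uniform in [0,1] */
    if (u < c)
        *channel += 1;
}
```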

Also, in my version of the spectrum there is a drop in intensity towards the edges
to model the reduced sensitivity of the eye as we go towards ultraviolet and
infrared. In a program update that I posted to the nVidia forums I even let the
intensity drop to 0 (complete invisibility).
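One simple way to model that falloff (the exact curve isn't given here, so the breakpoints and linear ramps below are only an assumption):

```c
/* Fade intensity to zero near the ends of the visible range (350-780 nm)
   to mimic the eye's reduced sensitivity. The breakpoints 420 and 700 nm
   and the linear ramps are assumptions, not the program's actual curve. */
static float edge_attenuation(float wavelength_nm)
{
    if (wavelength_nm < 350.0f || wavelength_nm > 780.0f)
        return 0.0f;                                   /* invisible */
    if (wavelength_nm < 420.0f)
        return (wavelength_nm - 350.0f) / 70.0f;       /* violet ramp */
    if (wavelength_nm > 700.0f)
        return (780.0f - wavelength_nm) / 80.0f;       /* red ramp */
    return 1.0f;                                       /* full sensitivity */
}
```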


« Last Edit: August 07, 2010, 04:53:00 PM by cbuchner1 »
richardrosenman
« Reply #19 on: August 07, 2010, 07:34:05 PM »

Quote from: kram1032
Not sure, I guess your conversion from spectrum to RGB isn't quite correct...

http://www.fourmilab.ch/documents/specrend/ <- look there. You can find a C file that converts spectral data into RGB values. Maybe it's of use.
The implementation of cbuchner used a converter written in Fortran... The RGB stimuli were pretty sharp. I guess it was easier to handle (not sure^^), but the CIE implementation in that C file is probably better suited if you can use it.

Your mapping doesn't look bad though :)

Good point. This is the algorithm I have used: http://miguelmoreno.net/sandbox/wavelengthtoRGB/

It sounds (and looks) correct. What do you think?

-Richard

richardrosenman
« Reply #20 on: August 07, 2010, 07:49:06 PM »

Quote from: cbuchner1
I map short wavelengths (blue, violet) to short orbits. It seems to be different in your version because the area outside the Buddhabrot appears red. For me this area is entirely violet...

Note: The total orbit length (iteration count) defines the color of a pixel, not the current iteration in the orbit. So an entire escape orbit adds the same color contribution. I tried both variations, but the second one looked boring.

wavelength = 350 + (float)(430.0*((red+green+blue)/3.0));

I cannot really understand what you're doing with this mapping. Instead of (red+green+blue)/3.0 it should say "orbit length / Max_Iterations", and the resulting wavelength THEN maps to red, green, blue color contributions.

Ok, so in the traditional Buddhabrot render, you figure out the orbit length and then, when it escapes, you increment the accumulation buffer by one. In a second pass, you convert the accumulated counts into red, green and blue values based on the totals. It sounds like you're doing something else, right? It sounds like you are incrementing the accumulation buffer by the escaped orbit length / Max_Iterations. But this would result in a float < 1.0, not an integer. Do I have it correct?


Quote from: cbuchner1
That color contribution is then added to the accumulation buffer. Because my buffer uses integers of very limited range (~20 bits), I had to use a trick: I add the integer 1 to a color channel if a uniformly distributed random number between 0 and 1 falls into the range of that channel's contribution (which is also somewhere between 0 and 1). If you use floating point buffers, or integers of at least 32 bits per color channel, there is no problem.

I do not get this. Are you calculating color within the orbit-length accumulation routine? Is it not calculated afterwards, once you have gathered a growing total of orbit lengths?

Quote from: cbuchner1
Also, in my version of the spectrum there is a drop in intensity towards the edges to model the reduced sensitivity of the eye as we go towards ultraviolet and infrared. In a program update that I posted to the nVidia forums I even let the intensity drop to 0 (complete invisibility).

The drop should be easy to implement. I hope you can shed some light on the other areas though. ;)

Thanks! Really interested in this...

-Richard

cbuchner1
« Reply #21 on: August 07, 2010, 08:59:17 PM »

Hmm, let me clarify this a bit more:

First I determine the orbit length. When I find that the orbit escapes within the iteration limit, I map the orbit length to a wavelength, and map the wavelength to R,G,B values (each color component in the range 0...1). With statistical sampling (the above-mentioned random number method) I increment the accumulation buffer's red, green and blue channels either by 1 or by 0 (i.e. not at all), individually for every pixel on the orbit.

It is mostly a single pass algorithm, except maybe for the first determination whether or not the orbit escapes.

It's probably easiest to use a uint32 per color channel, or alternatively a float (maybe a double for best quality during long renders). With 32-bit integers you can increment each channel by an 8-bit RGB value (0-255), depending on the color of the orbit. No trickery with random numbers is needed then; the wavelength function should simply return R,G,B values from 0-255.
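Putting the steps together, a minimal CPU sketch with a float buffer might look like this (all names are hypothetical, a crude placeholder ramp stands in for a real wavelength-to-RGB conversion, and no dithering is needed with floats):

```c
#include <math.h>
#include <stdlib.h>

#define W 64
#define H 64
#define MAX_ITER 256

static float acc[W * H * 3];  /* float accumulation buffer: R,G,B per pixel */

/* Crude placeholder ramp across 350-780 nm; a real renderer would use
   CIE curves (e.g. fourmilab's specrend.c). */
static void wavelength_to_rgb(float wl, float rgb[3])
{
    float t = (wl - 350.0f) / 430.0f;        /* 0 = violet, 1 = red */
    rgb[0] = t;
    rgb[1] = 1.0f - fabsf(2.0f * t - 1.0f);  /* green peaks mid-spectrum */
    rgb[2] = 1.0f - t;
}

/* Iterate z = z^2 + c, recording the orbit; return its length if it
   escapes within MAX_ITER, or 0 if not (then nothing is plotted). */
static int orbit_length(double cr, double ci, double *ox, double *oy)
{
    double x = 0.0, y = 0.0;
    for (int i = 0; i < MAX_ITER; i++) {
        double nx = x * x - y * y + cr;
        double ny = 2.0 * x * y + ci;
        x = nx; y = ny;
        ox[i] = x; oy[i] = y;
        if (x * x + y * y > 4.0)
            return i + 1;
    }
    return 0;
}

/* Single pass: sample random c, and for each escaping orbit add the SAME
   color (derived from the total orbit length) to every visited pixel. */
static void render(long samples)
{
    double ox[MAX_ITER], oy[MAX_ITER];
    for (long s = 0; s < samples; s++) {
        double cr = 4.0 * rand() / RAND_MAX - 2.0;
        double ci = 4.0 * rand() / RAND_MAX - 2.0;
        int len = orbit_length(cr, ci, ox, oy);
        if (len == 0) continue;
        float rgb[3];
        wavelength_to_rgb(350.0f + 430.0f * (float)len / MAX_ITER, rgb);
        for (int i = 0; i < len; i++) {
            int px = (int)((ox[i] + 2.0) * (W / 4.0));
            int py = (int)((oy[i] + 2.0) * (H / 4.0));
            if (px < 0 || px >= W || py < 0 || py >= H) continue;
            acc[(py * W + px) * 3 + 0] += rgb[0];
            acc[(py * W + px) * 3 + 1] += rgb[1];
            acc[(py * W + px) * 3 + 2] += rgb[2];
        }
    }
}
```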

Here's how my accumulation buffer works (that's an implementation detail really):

My buffer holds red, green and blue channels interleaved. Bizarrely, I store this in two separate uint32 arrays. Each color channel uses 10 bits within a single uint32; by combining the two uint32 arrays I get 20 bits in total per color channel.

I can increment the color channels of one pixel using a single read-modify-write to one uint32. That's just a single memory transaction to modify three color channels (hooray!).

Overflow from the lower 10 bits to the upper 10 bits of each color channel needs to be dealt with periodically, for example while doing the output processing to show intermediate rendering results on screen: whenever the 10th bit is set in a color channel of the first array, I clear it and increment the second array by 1.
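That carry step can be sketched as follows (the exact bit layout is my guess: three 10-bit channels packed into bits 29..0 of each uint32, with the carry taken from bit 9 of each channel; all names are illustrative):

```c
#include <stdint.h>

/* Assumed layout: [R:10 | G:10 | B:10] in one uint32, duplicated in two
   parallel arrays lo[] and hi[] for ~20 bits per channel. */
enum { R_SHIFT = 20, G_SHIFT = 10, B_SHIFT = 0, TOP_BIT = 9 };

/* One read-modify-write bumps all three channels of a pixel at once;
   r, g, b are each 0 or 1 (the dithered contributions). */
static uint32_t pack_inc(uint32_t lo, int r, int g, int b)
{
    return lo + ((uint32_t)r << R_SHIFT)
              + ((uint32_t)g << G_SHIFT)
              + ((uint32_t)b << B_SHIFT);
}

/* Periodic carry: if a channel's top bit is set in lo, clear it and add
   one unit (worth 512 counts) to the same channel of hi. */
static void carry_channel(uint32_t *lo, uint32_t *hi, int shift)
{
    uint32_t top = 1u << (shift + TOP_BIT);
    if (*lo & top) {
        *lo &= ~top;
        *hi += 1u << shift;
    }
}

/* Recover a channel's full count from the two words. */
static uint32_t channel_value(uint32_t lo, uint32_t hi, int shift)
{
    return ((lo >> shift) & 0x3FFu) + (((hi >> shift) & 0x3FFu) << TOP_BIT);
}
```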

This strange buffer design is the reason I get good performance on the graphics chip; this isn't much slower than rendering a grayscale Buddhabrot. But due to the limited 20-bit precision I cannot increment the colors by large numbers (0-255) or I would soon overflow my buffer.
« Last Edit: August 08, 2010, 12:27:34 AM by cbuchner1 »
kram1032
« Reply #22 on: August 08, 2010, 10:50:30 AM »

Hmm... on that site he said none of them creates a pleasing spectrum...
I wish I could see an example of the corresponding CIE spectrum.

Well, that's fine. It's very simplified though :)
richardrosenman
« Reply #23 on: August 09, 2010, 05:00:57 AM »

Hi cbuchner1;

Thanks for the explanation. It makes a lot more sense now but it would take some significant changes to my program to adapt it to a similar system.

I played around with the spectrum mapping some more but didn't get any better results. However, it led me to wonder what it would look like if I mapped the orbit length onto the Hue channel of an HSV (Hue, Saturation, Value) color model. In effect, this yields a similar result to the wavelength mapping, since we're mapping a hue based on the orbit length. Like the wavelength mapping, it reveals many more details in the Buddhabrot that weren't previously visible. The result is much better than my previous attempts:



I am now experimenting with other color models such as Lab, YUV, YCbCr, etc. I will post the results as I render them.

Cheers,
-Richard

kram1032
« Reply #24 on: August 09, 2010, 08:48:25 PM »

With Lab you wouldn't be far away from spectral, would you?

Very nice results :)
richardrosenman
« Reply #25 on: August 10, 2010, 03:42:15 AM »

Hey guys;

I've added new renders but started a new thread as I'm beginning to feel as if I'm hijacking this one! lol. Anyway, all this info can now be found here.

-Richard

Millennium Nocturne
Guest
« Reply #26 on: August 10, 2010, 11:07:35 PM »

Innocent question :)
Does the Mandelbulb Buddha look like this one?

http://www.youtube.com/v/5ej3dj4x64k&rel=1&fs=1&hd=1
kram1032
« Reply #27 on: August 10, 2010, 11:17:51 PM »

Not so innocent answer: nope.
What you see there is an animation through the full 4D space of the Buddhabrot.
The Buddhabulb would be 6-dimensional during animation.
That's because you have 6 variables...
Millennium Nocturne
Guest
« Reply #28 on: August 10, 2010, 11:44:25 PM »

Oh :(
So, no 3D Mandelbulb-looking Buddhas?
Not even with a multidimensional diet? (using 3 dimensions only?)