Title: Buddhabrot reinvented Post by: cbuchner1 on July 27, 2010, 01:02:35 AM Buddhabrot reinvented
(http://www.fractalforums.com/gallery/3/838_27_07_10_1_02_35.jpeg) http://www.fractalforums.com/index.php?action=gallery;sa=view;id=3037 I added a physics inspired coloring method. Orbit length maps to wavelength which maps to a RGB representation of the spectral color. EDIT: this was first suggested by kram1032 on the Buddhabrot on GPU thread, thank you very much for the general idea. And the effect is mindblowing. Did I mention this image rendered in under a minute? Title: Re: Buddhabrot reinvented Post by: Schlega on July 27, 2010, 01:29:06 AM This is the best Buddhabrot I've seen.:thumbsup1:
Title: Re: Buddhabrot reinvented Post by: Lee Oliver on July 27, 2010, 02:16:04 AM That is amazing! And under a minute as well, wow! :worm:
Title: Re: Buddhabrot reinvented Post by: lycium on July 27, 2010, 10:04:23 AM very nice image chris!! :o
i'm curious as to how you're handling the incoherent histogram writes; atomics? (btw perhaps a little credit is due to kram1032 for the idea to map iteration # -> wavelength?) Title: Re: Buddhabrot reinvented Post by: cbuchner1 on July 27, 2010, 12:51:21 PM i'm curious as to how you're handling the incoherent histogram writes; atomics? (btw perhaps a little credit is due to kram1032 for the idea to map iteration # -> wavelength?) No atomics are used; a few collisions will occur, which may lead to a little loss of brightness at the brightest spots. kram1032 may have brought it up publicly in the GPU buddhabrot thread first, that's true. But unfortunately for him, it's too obvious to be patented ;) Title: Re: Buddhabrot reinvented Post by: ker2x on July 27, 2010, 01:45:30 PM That is amazing! And under a minute as well, wow! :worm: The power of OpenCL/Cuda :D Title: Re: Buddhabrot reinvented Post by: lycium on July 27, 2010, 02:45:21 PM Hmm, I must say I'm quite disappointed at your attitude; when referring to your (incorrect) implementation you use words like "reinvented" and "mindblowing", but with regard to proper attribution of the idea you implemented, you use "obvious". Who said anything about patenting? I was suggesting a respectful nod; kram is always full of good ideas (I know him from the Indigo renderer forums, as I know you from the nvidia forums), and this is an instance where one has proven quite fruitful - that little bit of credit (for an "obvious" idea) I'm sure you can spare.
About the incorrectness, "a few" collisions is on the order of thousands - as you know GPUs are massively parallel, and can have 20 to 30 thousand threads in flight at once. GPU architecture is entirely based around hiding uncached memory latency by doing things out of order, so it's far from the case with a quadcore CPU where you might produce an incorrect value by "only" a factor of four. Compounding this issue is that the Buddhabrot is a dynamical system, whose attractor is exactly that - an attractor of points, so the collision error is not uniformly spread over the image, it's very concentrated in relatively small and bright regions. Title: Re: Buddhabrot reinvented Post by: cbuchner1 on July 27, 2010, 03:10:42 PM About the incorrectness, "a few" collisions is on the order of thousands - as you know GPUs are massively parallel, and can have 20 to 30 thousand threads in flight at once. GPU architecture is entirely based around hiding uncached memory latency by doing things out of order, so it's far from the case with a quadcore CPU where you might produce an incorrect value by "only" a factor of four. Compounding this issue is that the Buddhabrot is a dynamical system, whose attractor is exactly that - an attractor of points, so the collision error is not uniformly spread over the image, it's very concentrated in relatively small and bright regions. Global atomics are slower on GPUs by an order of magnitude (except for the very latest generation of Fermi GPUs). That's why I generally avoid them. If you look at some GPU accelerated fractal flame programs (e.g. FLAM 4), you will see that they use the same strategy, i.e. brute force overlapping writes. 
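[Editor's note] lycium's point about lost updates can be made concrete with a small, purely illustrative Python simulation (not anyone's actual GPU code): within one batch of "concurrent threads", every thread reads the old bin value before any writes back, so K colliding writes to one bin store only +1 instead of +K. The batch size and hit distributions are made-up parameters.

```python
import random

def histogram_lossy(hits, threads_per_cycle):
    """Simulate non-atomic read-modify-write accumulation: all threads in
    one 'cycle' read the old bin value before any of them writes back, so
    duplicate hits within a cycle collapse to a single increment."""
    hist = {}
    for i in range(0, len(hits), threads_per_cycle):
        for b in set(hits[i:i + threads_per_cycle]):  # duplicates collapse
            hist[b] = hist.get(b, 0) + 1
    return hist

random.seed(42)
n = 100_000
# hits concentrated on a few bins (a bright attractor region) vs spread out
concentrated = [random.randrange(8) for _ in range(n)]
uniform = [random.randrange(10_000) for _ in range(n)]

lost_conc = n - sum(histogram_lossy(concentrated, 1024).values())
lost_unif = n - sum(histogram_lossy(uniform, 1024).values())
print(lost_conc, lost_unif)  # far more increments are lost in the bright bins
```

This is exactly the concentration effect described above: the same number of samples loses vastly more brightness when the writes pile onto a small, bright region.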
To quantify the error on the nVidia architecture, one would have to determine the probability of a) two threads within the same half-warp issuing a global memory write to the same location at the same time, and b) threads on different multiprocessors causing an overlapping global read/modify/write memory transaction to the same location. Although both effects can cause incorrect results, I think the likelihood of b) is far greater than that of a), but still below any threshold that would leave a visible effect. I have previously implemented Scott Draves' original Fractal Flame algorithm in CUDA, and I carefully tried to avoid write collisions by sorting the contributions by their screen coordinate before applying them in a collision-free manner. The sorting step became the bottleneck of the entire application. So I came to the conclusion that trying to avoid or mitigate write collisions in statistical image generators doesn't bring enough benefit to warrant the additional overhead. As for the criticism related to attribution of kram's "invention": he did not give an exact formula for how to map escape orbit length to wavelength. I experimented with several linear and nonlinear mappings and first had to find parameters that were aesthetically pleasing. I don't think this wavelength-based coloring method will be the end of it. I will try some more, radically different approaches. And I will have to work on better tone mapping and noise reduction. Title: Re: Buddhabrot reinvented Post by: hobold on July 27, 2010, 03:13:21 PM I'll have to side with Lycium on both the attribution and the correctness issue. Neither is catastrophic, but neither should be glossed over.
Good ideas are always worthy of being valued, no matter if I came up with them or somebody else did. (The use of the word "I" here is meant to prompt every reader to project the situation onto the appropriate case.) Likewise, massively data parallel processors will continue to evolve towards even more parallelism. They might consist not only of more, but also more isolated computational devices, and the collision problems would become noticeable. I am not saying what you did is bad. I am saying that it can be improved. :) Title: Re: Buddhabrot reinvented Post by: cbuchner1 on July 27, 2010, 03:23:59 PM I'll have to side with Lycium on both the attribution and the correctness issue. Neither is catastrophic, but neither should be glossed over. I am not saying what you did is bad. I am saying that it can be improved. :) All righty, it was way past midnight when I posted these images and I wanted to go to bed ASAP, so the attribution issue fell a little short. I've amended the details in the first post. Title: Re: Buddhabrot reinvented Post by: ker2x on July 27, 2010, 05:17:22 PM About the incorrectness, "a few" collisions is on the order of thousands - as you know GPUs are massively parallel, and can have 20 to 30 thousand threads in flight at once. GPU architecture is entirely based around hiding uncached memory latency by doing things out of order, so it's far from the case with a quadcore CPU where you might produce an incorrect value by "only" a factor of four. Compounding this issue is that the Buddhabrot is a dynamical system, whose attractor is exactly that - an attractor of points, so the collision error is not uniformly spread over the image, it's very concentrated in relatively small and bright regions. Considering the results, i think that the "incorrectness" is acceptable if it really speedup the rendering. (and i'm sure it is) Speed vs accuracy : GPGPU was always on the speed side. 
(until very recently, it was 32 bits only, no ECC, "improper (for scientific use)" floating point implementation on the older cards, etc ...) *hugs* Title: Re: Buddhabrot reinvented Post by: kram1032 on July 28, 2010, 02:42:04 AM ...it's too obvious to be patented ;) Aww, and I was just about to patent it... This is amazing indeed :D (Dunno why I didn't stop by earlier...) And you're right, there is quite some red bias in this one... :) But it looks great like that anyway :) It's very nice how basically the full detail spectrum is included. You can see the ~20-iteration buddhabrot spikes as blue-green, for instance :) (Actually surprising... shouldn't that rather be at the other end of the spectrum? Or did you invert it, giving low period frequencies high spectral frequencies and vice versa?) Title: Re: Buddhabrot reinvented Post by: cbuchner1 on July 28, 2010, 03:05:32 AM Aww, and I was just about to patent it... An important point to remember: file the patent first, only then is it safe to publish. ;) And you're right, there is quite some red bias in this one... :) But it looks great like that anyway :) It's very nice how basically the full detail spectrum is included. You can see the ~20-iteration buddhabrot spikes as blue-green, for instance :) (Actually surprising... shouldn't that rather be at the other end of the spectrum? Or did you invert it, giving low period frequencies high spectral frequencies and vice versa?) The default parameters were chosen like this: --maxR=380 --maxG=200 --maxB=1000. The resulting wavelength is maxR + orbit length * 400/maxG, and the highest computed iteration count is maxB. In case you're wondering about the strange naming (maxR, maxG, maxB), the names are simply re-used from the traditional iteration-cutoff-based coloring method. (http://www.dnr.sc.gov/ael/personals/pjpb/lecture/spectrum.gif) So we're coming in from short wavelengths at low iterations (380nm=maxR), and go towards longer wavelengths at high iteration counts.
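[Editor's note] A minimal Python sketch of the mapping cbuchner1 states above. The wavelength-to-RGB step here uses Dan Bruton's well-known piecewise-linear approximation (with the edge-intensity falloff he mentions later in the thread) as a stand-in - it is not the actual converter his program uses.

```python
def orbit_to_wavelength(orbit_len, maxR=380.0, maxG=200.0):
    """cbuchner1's stated mapping: wavelength [nm] = maxR + orbit_len * 400/maxG.
    With the defaults, orbit length 0 -> 380 nm and 200 -> 780 nm."""
    return maxR + orbit_len * 400.0 / maxG

def wavelength_to_rgb(wl):
    """Piecewise-linear spectral colours (after Dan Bruton's approximation),
    each channel in 0..1 -- a stand-in for the thread's converter."""
    if 380 <= wl < 440:    r, g, b = (440 - wl) / 60.0, 0.0, 1.0
    elif 440 <= wl < 490:  r, g, b = 0.0, (wl - 440) / 50.0, 1.0
    elif 490 <= wl < 510:  r, g, b = 0.0, 1.0, (510 - wl) / 20.0
    elif 510 <= wl < 580:  r, g, b = (wl - 510) / 70.0, 1.0, 0.0
    elif 580 <= wl < 645:  r, g, b = 1.0, (645 - wl) / 65.0, 0.0
    elif 645 <= wl <= 780: r, g, b = 1.0, 0.0, 0.0
    else:                  r, g, b = 0.0, 0.0, 0.0
    # model reduced eye sensitivity near the ultraviolet/infrared edges
    if wl < 420:   s = 0.3 + 0.7 * (wl - 380) / 40.0
    elif wl > 700: s = 0.3 + 0.7 * (780 - wl) / 80.0
    else:          s = 1.0
    return (r * s, g * s, b * s)
```

With the default parameters, short orbits land in the violet end and orbits of maxG iterations reach 780 nm, matching the spectrum chart above.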
With my choice of parameters, the infrared (780nm) would be reached at 200 (=maxG) iterations already; in practice, red color tones are reached much earlier (maybe around ~130 iterations)! That explains the red bias as you zoom in. The full view doesn't look that red at all - see attachment. My zoom was into the leftmost minibulb that is directly attached to the period-2 bulb. Eww the compression artefacts on this screenshot are ugly. Title: Re: Buddhabrot reinvented Post by: ker2x on July 28, 2010, 09:35:16 AM Eww the compression artefacts on this screenshot are ugly. - Don't use the windows capture tool. (insane compression) - Use the good old Print Screen key on your keyboard (and save it with the GIMP, good and free) - Host it on Amazon S3 (costs me $0.03/month, and I still have 930MB to use before it costs me $0.152/month :D ) Title: Re: Buddhabrot reinvented Post by: kram1032 on July 28, 2010, 02:57:46 PM I was hosting images on Photobucket for free and it worked pretty well.
My images are still active there... although I haven't been on my account for ages. So, that works too :) Nice full view :) So this basically means the closer you get, the more red it becomes... (as the overall view actually looks purple, which is at the other end of the visible spectrum...) I guess this won't work nearly as well, but maybe you could let the spectrum basically repeat at higher frequencies... - somewhat like a musical octave, where A is red and H is purple and then the next octave starts with A = red again.... The visible spectrum of light is too narrow to be interpreted in octaves... However, if it were, I'm somehow pretty sure we would see in octaves as well :) There would be many kinds of red, orange, yellow, green, cyan, blue, indigo, purple, rather than just one with shades of it. Maybe just put a power-ish distribution on top of that, making the visible spectrum the strongest and halving the intensity of every octave before and after it, successively... I release this idea under an open source licence with share-alike attribute :D Title: Re: Buddhabrot reinvented Post by: cbuchner1 on July 28, 2010, 02:59:40 PM I release this idea under an open source licence with share-alike attribute :D What, no attribution attribute? Title: Re: Buddhabrot reinvented Post by: richardrosenman on August 07, 2010, 08:54:20 AM Hey guys;
First off, I am tremendously impressed with this spectral mapping technique and the results it yields. Congratulations on such great work! I have been attempting to implement this on my own as well based on the information provided, but I seem to be having some problems: (http://richardrosenman.com/gallery/photo/originals/4c5d002fd6dce.jpg) I have successfully mapped the light spectrum using the proper algorithm from wavelengths 350-780 and converted them to RGB. You can see this on the bottom of the image. So I think I have this part correct. However, when it comes to mapping it to the Buddhabrot, I seem to be stuck. On the left you can see the results with the Buddhabrot technique and on the right with the Nebulabrot. Clearly, none of them come even close to a decent result. Basically, I am defining the wavelength using the following algorithm: wavelength = 350 + (float)(430.0*((red+green+blue)/3.0)); The red, green and blue are the different Buddhabrot / Nebulabrot densities per color channel averaged together and scaled between 0-1. Right now, for simplicity's sake, I am trying to achieve a good result with the simpler Buddhabrot technique and not the Nebulabrot, which is why I am averaging them out together for a single result. Then, I'm obviously starting the wavelength at 350 and then adding 430*(0...1) so that I get a total range of 350-780, the full color spectrum. I then go on to calculate the respective RGB value for that particular wavelength, which should also be in the range of 0-1. Finally, I multiply each channel by the result to retrieve what should be the correct result. Something like this: red=red*abs(rgbwave_red); green=green*abs(rgbwave_green); blue=blue*abs(rgbwave_blue); If you look closely at the images, you'll also notice black speckles, which leads me to believe some are going out of range, despite my attempts to keep them in range. So any thoughts on where I'm going wrong?
-Richard Title: Re: Buddhabrot reinvented Post by: kram1032 on August 07, 2010, 12:04:08 PM Not sure, I guess your conversion from Spectrum to RGB isn't quite correct...
http://www.fourmilab.ch/documents/specrend/ <- look there. You can find a C file that converts spectral data into RGB values. Maybe it's of use. The implementation of cbuchner used a converter written in Fortran... The RGB stimuli were pretty sharp. I guess it was easier to handle (not sure^^) but the CIE implementation in that C file is probably better suited if you can use it. Your mapping doesn't look bad though :) Title: Re: Buddhabrot reinvented Post by: cbuchner1 on August 07, 2010, 01:37:26 PM I map short wavelengths (blue, violet) to short orbits. It seems to be
different in your version because the area outside the buddhabrot appears red. For me this area is entirely violet... Note: The total orbit length (iteration count) defines the color of a pixel, not the current iteration in the orbit. So an entire escape orbit adds the same color contribution. I tried both variations, but the second one looked boring. wavelength = 350 + (float)(430.0*((red+green+blue)/3.0)); I cannot really understand what you're doing with this mapping. Instead of ((red+green+blue)/3.0) it should say "orbit length / Max_Iterations", and the resulting wavelength THEN maps to red, green, blue color contributions. That color contribution is then added to the accumulation buffer. Because my buffer uses integers of very limited range (~20 bits) I had to use a trick: I add the integer 1 to a color channel if a uniformly distributed random number between 0 and 1 falls into the range of the color channel's contribution (which is also somewhere between 0 and 1). If you use floating point buffers or integers of at least 32 bits per color channel, then there is no problem. Also, in my version of the spectrum there is a drop in intensity towards the edges to model the reduced sensitivity of the eye as we go towards ultraviolet and infrared. In a program update that I posted to the nVidia forums I even let intensity drop towards 0 (that's complete invisibility). Title: Re: Buddhabrot reinvented Post by: richardrosenman on August 07, 2010, 07:34:05 PM Not sure, I guess your conversion from Spectrum to RGB isn't quite correct... http://www.fourmilab.ch/documents/specrend/ <- look there. You can find a C file that converts spectral data into RGB values. Maybe it's of use. The implementation of cbuchner used a converter written in Fortran... The RGB stimuli were pretty sharp. I guess it was easier to handle (not sure^^) but the CIE implementation in that C file is probably better suited if you can use it. Your mapping doesn't look bad though :) Good point.
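[Editor's note] The random-number trick cbuchner1 describes is essentially stochastic rounding: a fractional colour contribution in [0,1] becomes an integer increment of 0 or 1 whose expected value equals the contribution. An illustrative Python sketch (not his CUDA code; the seed and trial count are arbitrary):

```python
import random

def stochastic_add(contribution, rng):
    """Return 1 with probability `contribution` (in 0..1), else 0, so a
    low-precision integer buffer accumulates the right value on average."""
    return 1 if rng.random() < contribution else 0

rng = random.Random(1)
trials = 200_000
total = sum(stochastic_add(0.3, rng) for _ in range(trials))
print(total / trials)  # close to 0.3
```

Over many samples the accumulated integer converges to the fractional contribution, which is why the trick is unnecessary with float or 32-bit integer buffers.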
This is the algorithm I have used: http://miguelmoreno.net/sandbox/wavelengthtoRGB/ It sounds (and looks) correct. What do you think? -Richard Title: Re: Buddhabrot reinvented Post by: richardrosenman on August 07, 2010, 07:49:06 PM I map short wavelengths (blue, violet) to short orbits. It seems to be different in your version because the area outside the buddhabrot appears red. For me this area is entirely violet... Note: The total orbit length (iteration count) defines the color of a pixel, not the current iteration in the orbit. So an entire escape orbit adds the same color contribution. I tried both variations, but the second one looked boring. wavelength = 350 + (float)(430.0*((red+green+blue)/3.0)); I cannot really understand what you're doing with this mapping. Instead of ((red+green+blue)/3.0) it should say "orbit length / Max_Iterations" and the resulting wavelength THEN maps to red, green, blue color contributions. Ok, so in the traditional Buddhabrot render, you figure out the orbit length and then, when it escapes, you increment the accumulation buffer by one. In a second pass, you convert the accumulations into the red, green, blue based on totals. It sounds like you're doing something else, right? It sounds like you are incrementing the accumulation buffer by the escaped orbit length / Max_Iterations. But this would result in a float <1.0, not an integer. Do I have it correct? That color contribution is then added to the accumulation buffer. Because my buffer uses integers of very limited range (~20 bits) I had to use a trick: I add the integer 1 to a color channel if a uniformly distributed random number between 0 and 1 falls into the range of the color channel's contribution (which is also somewhere between 0 and 1). If you use floating point buffers or integers of at least 32 bits per color channel, then there is no problem. I do not get this. Are you calculating color within the orbit length accumulation routine?
Is it not calculated afterwards, once you have gathered a growing total number of orbit lengths? Also in my version of the spectrum there is a drop in intensity towards the edges to model the reduced sensitivity of the eye when we go towards ultraviolet and infrared. In a program update that I posted to the nVidia forums I even let intensity drop towards 0 (that's complete invisibility). The drop should be easy to implement. I hope you can shed some light on the other areas though. ;) Thanks! Really interested in this... -Richard Title: Re: Buddhabrot reinvented Post by: cbuchner1 on August 07, 2010, 08:59:17 PM Hmm, let me clarify this a bit more:
First I determine the orbit length. When I find that the orbit escapes within the iteration limit, I map this orbit to a wavelength. And I map the wavelength to R,G,B values (each color component is in the range 0...1). With statistical sampling (the above-mentioned random number method) I increment my accumulation buffer's red, green and blue channels either by 1 or by 0 (i.e. not at all) for all pixels on this orbit individually. It is mostly a single-pass algorithm, except maybe for the first determination of whether or not the orbit escapes. It's probably the easiest choice to use a uint32 per color channel, alternatively a float (maybe a double for best quality during long renders). With 32-bit integers you may increment each channel with an 8-bit RGB value (0-255), depending on the color of the orbit. No trickery with random numbers is needed then. The wavelength function should then return R,G,B values from 0-255. Here's how my accumulation buffer works (that's an implementation detail really): My buffer holds red, green, blue channels interleaved. Bizarrely, I store this in two separate uint32 arrays. Each color channel uses 10 bits within a single uint32. By combining these two uint32 arrays I get 20 bits in total per color channel. I can increment the color channels of one pixel using a single read-modify-write to one uint32. That's just a single memory transaction to modify three color channels (hooray!). Overflow from the lower 10 bits to the upper 10 bits of each color channel needs to be dealt with periodically, for example while doing the output processing to show intermediate rendering results on screen (whenever the 10th bit is set in a color channel in the first array, I clear it and then increment the second array by 1). This strange buffer design is the reason I get good performance on the graphics chip, and this isn't much slower than rendering a grayscale buddhabrot.
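[Editor's note] A plain-Python sketch of the packed two-array scheme described above (names are invented; Python integers don't overflow, so the 32-bit packing is only illustrative):

```python
CH_BITS = 10
CH_MASK = (1 << CH_BITS) - 1
TOP_BIT = 1 << (CH_BITS - 1)        # "the 10th bit" of a channel lane (= 512)
SHIFTS = (0, CH_BITS, 2 * CH_BITS)  # red, green, blue lanes in one word

def increment_pixel(lo, r, g, b):
    """One read-modify-write adds a 0/1 increment to all three channel lanes
    packed in a single 32-bit word. The periodic carry below must run before
    any lane can reach 1024, or it would spill into its neighbour lane."""
    return lo + (r | (g << CH_BITS) | (b << 2 * CH_BITS))

def carry(lo, hi):
    """Periodic overflow handling: whenever a lane's 10th bit is set in the
    low word, clear it and add 1 to the matching lane of the high word."""
    for s in SHIFTS:
        if (lo >> s) & TOP_BIT:
            lo &= ~(TOP_BIT << s)
            hi += 1 << s
    return lo, hi

def channel_value(lo, hi, shift):
    """Combined ~20-bit count: the high lane counts in units of 512."""
    return ((lo >> shift) & CH_MASK) + ((hi >> shift) & CH_MASK) * TOP_BIT
```

The point of the design survives even in this sketch: one add updates all three channels, and the carry pass only has to touch the second array occasionally.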
But due to the limited 20 bits precision I cannot really increment the colors by large numbers (0-255) or I would soon be overflowing my buffer. Title: Re: Buddhabrot reinvented Post by: kram1032 on August 08, 2010, 10:50:30 AM hmm... on that site he said, none of them creates a pleasing spectrum...
I wish I could see an example of the corresponding CIE spectrum. Well that's fine. It's very simplified though :) Title: Re: Buddhabrot reinvented Post by: richardrosenman on August 09, 2010, 05:00:57 AM Hi cbuchner1;
Thanks for the explanation. It makes a lot more sense now, but it would take some significant changes to my program to adapt it to a similar system. I played around with the spectrum mapping some more but didn't get any better results. However, it led me to wonder what it would look like if I mapped the orbit length to the Hue channel of an HSV color model (Hue, Saturation, Value). This, in effect, would yield a similar result to the wavelength mapping since we're telling it to map a hue based on the orbit length. Like the wavelength mapping, it reveals many more details in the Buddhabrot that weren't previously visible. The result is much better than my previous attempts: (http://richardrosenman.com/gallery/photo/originals/4c5f7012b3248.jpg) I am now experimenting with other color models such as LAB, IUV, YCBCR, etc. I will post the results as I render them. Cheers, -Richard Title: Re: Buddhabrot reinvented Post by: kram1032 on August 09, 2010, 08:48:25 PM With Lab you wouldn't be far away from spectral, would you?
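[Editor's note] Richard's hue-mapping variant can be sketched with Python's standard colorsys module; the exact hue scaling (and the hue_span parameter) is an assumption, not his code:

```python
import colorsys

def orbit_to_rgb_hsv(orbit_len, max_iter, hue_span=0.8):
    """Map orbit length to the Hue channel of an HSV colour at full
    saturation and value. hue_span < 1 keeps the two ends of the hue
    circle (both reddish) from wrapping onto each other."""
    hue = hue_span * min(orbit_len, max_iter) / max_iter
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)
```

Short orbits come out red (hue 0) and long orbits sweep through the hue circle, which is why this recovers detail similarly to the wavelength mapping.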
Very nice results :) Title: Re: Buddhabrot reinvented Post by: richardrosenman on August 10, 2010, 03:42:15 AM Hey guys;
I've added new renders but started a new thread as I'm beginning to feel as if I'm hijacking this one! lol. Anyway, all this info can now be found here. (http://www.fractalforums.com/gallery-b179/buddhabrot-randd-gallery/) -Richard Title: Re: Buddhabrot reinvented Post by: Millennium Nocturne on August 10, 2010, 11:07:35 PM Innocent Question: :)
Mandelbulb buddha looks like this one? http://www.youtube.com/watch?v=5ej3dj4x64k&feature=player_embedded Title: Re: Buddhabrot reinvented Post by: kram1032 on August 10, 2010, 11:17:51 PM not so innocent answer: nope.
What you see there is an animation of the full 4D-space of the Buddhabrot. The Buddhabulb would be 6-dimensional during animation. That's because you have 6 variables... Title: Re: Buddhabrot reinvented Post by: Millennium Nocturne on August 10, 2010, 11:44:25 PM Oh :sad1: So, no 3D mandelbulb looking buddhas? not even with a multidimensional diet? (using 3 dimensions only?) |