Author Topic: True 3D mandelbrot type fractal  (Read 263058 times)
lycium
« Reply #15 on: November 19, 2007, 04:11:40 AM »

hmm, this seems to just be performing a rotation and a scale by r^2...

edit: nono, it's doubling the spherical angle and adding an offset (different ones for phi and theta). hmm i wonder if this can be made more efficient, cuz damn it's slow :|
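a sketch of the iteration being described, for concreteness — assuming the power-2 "doubled angle" spherical form (square the radius, double both spherical angles, add the offset c); the exact offsets and conventions in twinbee's original formula may differ:

```python
import math

def iterate(px, py, pz, cx, cy, cz, max_iter=20, bailout=2.0):
    """Escape-time test for the 'spherical doubling' iteration:
    convert to spherical coordinates, square the radius, double both
    angles, then add the offset c -- a 3D analogue of z -> z^2 + c.
    Returns the iteration count at escape (max_iter = presumed inside)."""
    x, y, z = px, py, pz
    for n in range(max_iter):
        r = math.sqrt(x*x + y*y + z*z)
        if r > bailout:
            return n                                  # escaped
        theta = math.atan2(math.sqrt(x*x + y*y), z)   # polar angle
        phi = math.atan2(y, x)                        # azimuth
        r2 = r * r                                    # sqrt/square cancel
        x = r2 * math.sin(2*theta) * math.cos(2*phi) + cx
        y = r2 * math.sin(2*theta) * math.sin(2*phi) + cy
        z = r2 * math.cos(2*theta) + cz
    return max_iter
```

note the sqrt/sqr cancellation mentioned below: r² is computed directly, never `sqrt(...)**2`.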
« Last Edit: November 19, 2007, 04:22:24 AM by lycium »

lycium
« Reply #16 on: November 19, 2007, 09:55:39 AM »

here's a preview, i'll update it as it gets smoother:

twinbee
« Reply #17 on: November 19, 2007, 05:00:04 PM »

Oh wow, that first one looks ace!! Amazing how mistakes can often result in these things. It looks like a face with the head part being scalped heh :)

Ah, yes, the sqrt/sqr in the r variable can be cancelled. Is the pic above (which also looks really nice by the way) the equivalent of my 2D ortho projection? Doesn't quite look like it, but then I wouldn't know what to expect from a different angle!

Quote
edit: nono, it's doubling the spherical angle and adding an offset (different ones for phi and theta). hmm i wonder if this can be made more efficient, cuz damn it's slow :|

Yep, in fact, the spherical angle is doubled not just in one direction, but in the other too. So it's more of a spherical 'twist' than rotation. I wish I could sort of rotate in the 3rd direction too, but as you know, only 2 rotations are needed to represent any possible sphere angle. Maybe there is a way, but I don't want to start representing angles with multiple possible ways of XYZ rotations...

Haha, I'm dying to see higher res versions now. To be honest, I know these things render so slowly, but I'm amazed they can render at all. Even plotting the simple pixel projection I did took around 2 hours in C++, so I'm quite impressed really, especially as it's raytraced. More CPU speed in computers would still really be nice though hehe.
« Last Edit: November 19, 2007, 11:11:41 PM by twinbee »

lycium
« Reply #18 on: November 20, 2007, 12:42:53 AM »

Is the pic above (which also looks really nice by the way) the equivalent of my 2D ortho projection? Doesn't quite look like it, but then I wouldn't know what to expect from a different angle!

yup. i rendered a quick top-down preview (low precision, different materials):



Even plotting the simple pixel projection I did took around 2 hours in C++, so I'm quite impressed really, especially as it's raytraced.

btw, that's not simple ray tracing :P it's simulating all possible light/surface interactions (based on metropolis light transport), spectral skylight (based on a practical analytic model for daylight) in a completely correct way, which requires 100-1000 or more samples per pixel for clean images... for me fractals are just fodder for my first love, rendering systems ;)

More CPU speed in computers would still really be nice though hehe.
i don't think we'll ever see a 100m times speed improvement, there are physical limits (if you don't believe this, ask yourself: why stop at 100m? why not 1000000000000 quadrillion times faster?) i'd say we have another 1000x or so to go, at most.

oh and about physical limits, do you really think that the speed of light will be broken in the year 2600? :D
« Last Edit: March 18, 2009, 12:18:31 PM by lycium, Reason: fix link from dead fractographer.com »

lycium
« Reply #19 on: November 20, 2007, 01:41:35 AM »

it's also definitely the mandelbrot set, here's a thin slice (from y = -0.15 to +0.15) of the object:



it's really fun taking vertical slices of the shape too, a nice project would be to render and assemble a whole bunch of slices to really get an idea of the 3d structure. i think i finally understand the formula btw, it's basically reproducing the dynamics of the mandelbrot iteration in the 2d complex plane (doubling the angle comes from the z^2, adding the offset comes from the starting point) and doing analogous transformations in the other spherical co-ordinate; because of this there are actually entire families of 3d mandelbrots, depending on your choices of angle coefficient and offset.
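the slicing project is easy to sketch: fix y inside a slab and scan a grid in that plane. the `inside` test below assumes the same doubled-angle iteration as above (Mandelbrot convention: orbit starts at 0, the sample point is c); grid bounds and resolution are arbitrary choices:

```python
import math

def inside(px, py, pz, max_iter=12):
    """Compact escape test for the doubled-angle iteration,
    starting the orbit at the origin with c = (px, py, pz)."""
    x = y = z = 0.0
    for _ in range(max_iter):
        r = math.sqrt(x*x + y*y + z*z)
        if r > 2.0:
            return False
        theta = math.atan2(math.sqrt(x*x + y*y), z)
        phi = math.atan2(y, x)
        x, y, z = (r*r*math.sin(2*theta)*math.cos(2*phi) + px,
                   r*r*math.sin(2*theta)*math.sin(2*phi) + py,
                   r*r*math.cos(2*theta) + pz)
    return True

def slice_image(y=0.0, n=24, lo=-2.0, hi=1.0):
    """Scan an n*n grid in the plane of constant y;
    '#' marks points that stay bounded, '.' marks escapees."""
    rows = []
    for j in range(n):
        zc = lo + (hi - lo) * j / (n - 1)
        row = ''.join('#' if inside(lo + (hi - lo) * i / (n - 1), y, zc)
                      else '.' for i in range(n))
        rows.append(row)
    return '\n'.join(rows)
```

rendering `slice_image(y)` for a sweep of y values and stacking the results gives exactly the "bunch of slices" idea above.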

moreover, i've been thinking of visualising the whole 4d object. yes, 4d: actually, at each point in space there is a scalar potential, which can be interpreted as density (like a cloud). since i plan to simulate atmospheric physics (in particular rayleigh scattering - which is what's responsible for blue skies and red sunsets etc.) anyway, this will serve as a really nice volumetric dataset. i'll definitely pre-render it to a grid though, because it's SO damn slow! that ought to look pretty cool, and would actually be way faster than the method i'm using here.
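the precompute-to-a-grid idea, sketched: treat the fraction of iterations survived as the scalar potential/density and bake it once, so a volume renderer only does grid lookups afterwards (resolution, bounds, and iteration budget here are illustrative):

```python
import math

def potential(px, py, pz, max_iter=10):
    """Scalar 'density' at a point: fraction of iterations survived
    by the doubled-angle orbit (1.0 = never escaped)."""
    x = y = z = 0.0
    for n in range(max_iter):
        r = math.sqrt(x*x + y*y + z*z)
        if r > 2.0:
            return n / max_iter
        theta = math.atan2(math.sqrt(x*x + y*y), z)
        phi = math.atan2(y, x)
        x, y, z = (r*r*math.sin(2*theta)*math.cos(2*phi) + px,
                   r*r*math.sin(2*theta)*math.sin(2*phi) + py,
                   r*r*math.cos(2*theta) + pz)
    return 1.0

def bake_grid(n=16, lo=-1.5, hi=1.5):
    """Precompute the field once on an n^3 lattice."""
    step = (hi - lo) / (n - 1)
    return [[[potential(lo + i*step, lo + j*step, lo + k*step)
              for k in range(n)] for j in range(n)] for i in range(n)]
```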

for the moment, i want to render some more slices of this thing :)
« Last Edit: March 18, 2009, 12:19:52 PM by lycium »

twinbee
« Reply #20 on: November 20, 2007, 09:37:34 AM »

Quote from: lycium
Quote from: twinbee
Is the pic above (which also looks really nice by the way) the equivalent of my 2D ortho projection? Doesn't quite look like it, but then I wouldn't know what to expect from a different angle!
yup.

Oh so it is?? But that previous grey one hardly looks like the next gold one. Not even slightly symmetric for instance. Are you sure?

Quote
i rendered a quick top-down preview (low precision, different materials):
Excellent! This is definitely a 3D version of the ortho one I did :D Can I ask you a favour? Can you render it at higher resolution, and also veer the light source to one side, so that the left part of the fractal is brighter than the right, so that we can see a more '3D' like picture. Maybe also use the grey material too, as it better reflects the subtle blue sky light source.

Quote
btw, that's not simple ray tracing  it's simulating all possible light/surface interactions (based on metropolis light transport),

That's all well and good, but I won't be happy until you render with full brute force photon motion simulation to accurately emulate what really happens. Lol, just kidding... :D :P

Quote
it's simulating all possible light/surface interactions (based on metropolis light transport), spectral skylight (based on a practical analytic model for daylight) in a completely correct way, which requires 100-1000 or more samples per pixel for clean images...

How does that compare to photon mapping or radiosity out of interest?

Quote
.. for me fractals are just fodder for my first love, rendering systems ;)
On a tangent, it'd be interesting to experience 3D in general (or this world with special goggles) with inverse perspective, so that nearby objects are small, and more distant objects are big (the larger, more distant objects would still be 'behind' the nearer stuff). I bet it would look really cool...


Quote
i don't think we'll ever see a 100m times speed improvement, there are physical limits (if you don't believe this, ask yourself: why stop at 100m? why not 1000000000000 quadrillion times faster?) i'd say we have another 1000x or so to go, at most.

Yes, that prediction for 2040 is looking slightly optimistic now (that article was written 2 years ago). It's quite a shame the increases have been so small lately :( At least there's parallel processing, which will be good for raytracing and radiosity-type effects (I hope).

Quote
oh and about physical limits, do you really think that the speed of light will be broken in the year 2600?
Haha :P Yeah, it probably won't, unless we manage to cheat space-time somehow. Haha, one can dream :D :D


Quote
it's also definitely the mandelbrot set, here's a thin slice (from y = -0.15 to +0.15) of the object:

Nice!! I'd die to zoom right into that top-left little cave part next to the spindly part to see how much of the detail has survived the move to 3D. If you have time of course. I don't want to eat up all your CPU time! ;)

Quote
and doing analogous transformations in the other spherical co-ordinate; because of this there are actually entire families of 3d mandelbrots, depending on your choices of angle coefficient and offset.

I wish!... As good as it looks, this beast is far from the real thing, because most of the infinitely complex details haven't survived the Z axis. Unfortunately, they still look as though they've been 'smeared' over or lost completely. However, there's a chance some of the infinite detail has nearly survived in smaller portions of the set. For example, that cave part near the top looks as though there could be some interesting stuff. Again, I doubt it though.

Quote
moreover, i've been thinking of visualising the whole 4d object. yes, 4d: actually, at each point in space there is a scalar potential, which can be interpreted as density (like a cloud). since i plan to simulate atmospheric physics (in particular rayleigh scattering - which is what's responsible for blue skies and red sunsets etc.) anyway, this will serve as a really nice volumetric dataset. i'll definitely pre-render it to a grid though, because it's SO damn slow! that ought to look pretty cool, and would actually be way faster than the method i'm using here.

Ace. Can't wait to see some more pics of this thing!
lycium
« Reply #21 on: November 20, 2007, 11:23:42 AM »

Are you sure?
yup.

Can I ask you a favour? Can you render it at higher resolution,
sure, it'll take a while though, once i've set up the scene to be rendered (i do everything with code, and tweaking the camera xyz etc. is laborious).

and also veer the light source to one side,
there's no light source in the scene per se, it's all just skylight. i do have light sources though, and will use some.

Quote
btw, that's not simple ray tracing :P it's simulating all possible light/surface interactions (based on metropolis light transport),
That's all well and good, but I won't be happy until you render with full brute force photon motion simulation to accurately emulate what really happens. Lol, just kidding... :D :P
they are totally equivalent: radiance (which is the radiometric quantity corresponding to what we see) is invariant along directions, so it doesn't matter whether you simulate light going from the eye to the lights (path tracing), from the light to the eye (light tracing), or both (bi-directional path tracing). what i'm doing there is a 100% accurate simulation of diffuse light transport, which is really easy actually.
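the "totally equivalent" claim can be sanity-checked on the simplest possible scene, the classic "furnace": diffuse surroundings of a given albedo that also emit uniformly, where the exact answer is emission / (1 - albedo). a toy path tracer (not lycium's renderer — russian roulette stands in for the full throughput bookkeeping, and in a furnace the geometry cancels out entirely) converges to it:

```python
import random

def furnace_sample(albedo, emission, rng, max_bounces=64):
    """One eye path in a 'furnace': every ray hits a diffuse surface of
    the given albedo, and every surface emits uniformly. Returns one
    Monte Carlo sample of the radiance seen by the eye."""
    radiance = 0.0
    for _ in range(max_bounces):
        radiance += emission          # emitted light picked up at this hit
        # Russian roulette: continue with probability = albedo, which is
        # exactly what keeps the estimator unbiased (BRDF/pdf cancel for
        # cosine-weighted diffuse bounces, so no throughput factor needed)
        if rng.random() >= albedo:
            break
    return radiance

def furnace_estimate(albedo, emission, n=50_000, seed=1):
    """Average n eye paths; should approach emission / (1 - albedo)."""
    rng = random.Random(seed)
    return sum(furnace_sample(albedo, emission, rng) for _ in range(n)) / n
```

with albedo 0.5 and emission 1, the analytic radiance is 1/(1 - 0.5) = 2, and the estimate lands there to within Monte Carlo noise.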

How does that compare to photon mapping or radiosity out of interest?
photon mapping is a full global illumination algorithm, in that it can simulate all modes of light transport. however it is biased (roughly, you can't be sure it's converging to the right image) and i personally dislike the algorithm, it's quite storage intensive and subject to a lot of approximations that make the result uncrisp.

radiosity only works with polygonal scenes, and only simulates diffuse light transport. so it's not at all applicable to scenes having curved (let alone fractal!) surfaces.

the core algorithms used in my renderer are pretty much state of the art, and i'll be extending its functionality a lot these holidays :)

On a tangent, it'd be interesting to experience 3D in general (or this world with special goggles) with inverse perspective, so that nearby objects are small, and more distant objects are big (the larger, more distant objects would still be 'behind' the nearer stuff). I bet it would look really cool...
i've not heard of inverse perspective before, can you explain how it works?

Yes, that prediction for 2040 is looking slightly optimistic now (that article was written 2 years ago). It's quite a shame the increases have been so small lately :( At least there's parallel processing, which will be good for raytracing and radiosity-type effects (I hope).
yup, but we've already mostly exhausted the "free ghz" ride, and eternal multicore scaling isn't a given. so there definitely is a limit, and lithographic microprocessor manufacturing is slowly approaching some kind of peak too... all other techs (quantum, biological, nano, ...) are pie in the sky so far.

Haha :P Yeah, it probably won't, unless we manage to cheat space-time somehow. Haha, one can dream :D :D
i'd prefer it not to be the case, it's what keeps super advanced intergalactic civilisations out of our cosmic back yard ;)

Nice!! I'd die to zoom right into that top-left little cave part next to the spindly part to see how much of the detail has survived the move to 3D. If you have time of course. I don't want to eat up all your CPU time! ;)
the way to visualise this thing in realtime is to precompute a volume... i should get to doing that.
lycium
« Reply #22 on: November 20, 2007, 11:38:20 AM »

to give an idea of how accurate these rendering methods (based on physical wavelength-based light quantities and reflectances) are, check out http://www.graphics.cornell.edu/online/box/compare.html

that same accuracy extends to less simple scenes, and if you want to see what i'm chasing see http://www.maxwellrender.com/
David Makin
« Reply #23 on: November 21, 2007, 01:04:35 AM »

here's a preview, i'll update it as it gets smoother:

That render doesn't half remind me of this:

http://www.youtube.com/v/jeVDBvE5Bg0&rel=1&fs=1&hd=1
The meaning and purpose of life is to give life purpose and meaning.

http://www.fractalgallery.co.uk/
"Makin' Magic Music" on Jango

David Makin
« Reply #24 on: November 21, 2007, 01:05:48 AM »

edit: nono, it's doubling the spherical angle and adding an offset (different ones for phi and theta). hmm i wonder if this can be made more efficient, cuz damn it's slow :|

Maybe there's a way of working out a distance estimation for it?
lycium
« Reply #25 on: November 21, 2007, 06:08:07 AM »

edit: nono, it's doubling the spherical angle and adding an offset (different ones for phi and theta). hmm i wonder if this can be made more efficient, cuz damn it's slow :|

Maybe there's a way of working out a distance estimation for it?

that would be great, almost necessary even. i don't have a clue as to how to approach it though :/
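with hindsight, the approach later adopted for exactly this kind of fractal tracks a "running derivative" dr alongside the orbit and uses DE = 0.5 · r · ln(r) / dr as a safe step size for sphere tracing. a sketch, not from this thread — the power, bailout, and iteration budget are illustrative:

```python
import math

def distance_estimate(px, py, pz, power=2, max_iter=30, bailout=4.0):
    """Scalar-derivative distance estimate for the power-p spherical
    iteration. Positive values bound the distance to the set from below
    (so a ray can safely march that far); values <= 0 indicate a point
    on or inside the set."""
    x, y, z = px, py, pz
    dr, r = 1.0, 0.0
    for _ in range(max_iter):
        r = math.sqrt(x*x + y*y + z*z)
        if r > bailout:
            break
        theta = math.atan2(math.sqrt(x*x + y*y), z)
        phi = math.atan2(y, x)
        dr = power * r**(power - 1) * dr + 1.0   # running |dz/dc| analogue
        zr = r**power
        x = zr * math.sin(power*theta) * math.cos(power*phi) + px
        y = zr * math.sin(power*theta) * math.sin(power*phi) + py
        z = zr * math.cos(power*theta) + pz
    if r == 0.0:
        return 0.0
    return 0.5 * math.log(r) * r / dr
```

a raymarcher then steps each ray forward by the estimate until it falls below a pixel-sized threshold — hundreds of times faster than blind uniform stepping.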
lycium
« Reply #26 on: November 21, 2007, 06:09:49 AM »

Can I ask you a favour? Can you render it at higher resolution, and also veer the light source to one side, so that the left part of the fractal is brighter than the right, so that we can see a more '3D' like picture. Maybe also use the grey material too, as it better reflects the subtle blue sky light source.

i took out the sky and used a simple area-directional light source; it's not very efficiently sampled, but it works. here's the pic, it's still really grainy even after rendering all night :(

« Last Edit: March 18, 2009, 12:20:25 PM by lycium, Reason: fix url »

twinbee
« Reply #27 on: November 22, 2007, 05:28:36 PM »

Quote
they are totally equivalent: radiance (which is the radiometric quantity corresponding to what we see) is invariant along directions, so it doesn't matter whether you simulate light going from the eye to the lights (path tracing), from the light to the eye (light tracing), or both (bi-directional path tracing). what i'm doing there is a 100% accurate simulation of diffuse light transport, which is really easy actually.

Wow, really? I can't really get one up on that one then :) I'm a bit surprised, because it took 100 Sun SparcStations 1 month to generate some of those pics, but then it was back in 1991...
Does your algorithm even do exotic stuff like account for red/blue shift for rainbow effects?

Quote
photon mapping is a full global illumination algorithm, in that it can simulate all modes of light transport. however it is biased (roughly, you can't be sure it's converging to the right image)

How about if you throw more CPU time at it? Is accuracy increased proportionally to the CPU time?

Quote
the core algorithms used in my renderer are pretty much state of the art, and i'll be extending its functionality a lot these holidays

Sounds ace. I have to say, some of these, although grainy, do look very realistic.

Quote
i've not heard of inverse perspective before, can you explain how it works?

Usually, in 3D, lines of perspective converge to a point. Orthographic images, on the other hand, don't - everything in the back is just as big as everything at the front. Now imagine going one step further than even this: lines of perspective would fly away from each other (actually they would converge, but only behind the viewer).

Parallax would look really strange and interesting like this. I was semi-surprised to see the idea is already out there on Wikipedia.
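the description above can be put into formulas with a toy 1D projection: the normal pinhole branch shrinks with depth, while the "inverse" branch grows with it, the projector lines meeting at z = -f behind the viewer (the z = -f choice is just one convenient way to model it):

```python
def project(x, z, f=1.0, inverse=False):
    """Toy projection: camera at the origin looking down +z, focal
    distance f. Normal perspective: screen = f*x/z (far means small).
    Inverse perspective: screen = x*(f+z)/f (far means LARGE); all
    projections of a line x = const pass through 0 at z = -f, i.e.
    the perspective lines converge behind the viewer."""
    if not inverse:
        return f * x / z
    return x * (f + z) / f
```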

Still love to see a zoom into that top left cave here: http://www.fractographer.com/wip/3dmandel_slice.jpg
Even if you have to forego some of the more realistic lighting effects for quicker speed, I'd love to get a closer eye on the detail.

By the way, what's the building block of these images? Billions of little cubes? Polygons? Pure mathematical curves? I can't imagine the latter, since the difficulty in translating from fractal to curve is daunting (at least for me).

Quote
that same accuracy extends to less simple scenes, and if you want to see what i'm chasing see http://www.maxwellrender.com/

What does that have, that your algorithm lacks?

Quote
That render doesn't half remind me of this:

http://www.youtube.com/v/jeVDBvE5Bg0&rel=1&fs=1&hd=1

Hi David, I'd still love to see that quaternion technique for creating the 3D Mandelbrot set. Any attempts so far?

Meanwhile, here's another 2 failed attempts. As before, brightness represents the Z axis.


And the other...
« Last Edit: November 22, 2007, 07:14:59 PM by twinbee »

lycium
« Reply #28 on: November 23, 2007, 03:43:38 AM »

Quote
they are totally equivalent: radiance (which is the radiometric quantity corresponding to what we see) is invariant along directions, so it doesn't matter whether you simulate light going from the eye to the lights (path tracing), from the light to the eye (light tracing), or both (bi-directional path tracing). what i'm doing there is a 100% accurate simulation of diffuse light transport, which is really easy actually.

Wow, really? I can't really get one up on that one then :) I'm a bit surprised, because it took 100 Sun SparcStations 1 month to generate some of those pics, but then it was back in 1991...
Does your algorithm even do exotic stuff like account for red/blue shift for rainbow effects?

first off, a minor technical correction due to my bad wording: radiance isn't "invariant along direction", it's invariant if you swap the directions; i.e., the (differential) radiance from a to b is the same as the radiance from b to a. so if you have a path of light between the eye and the light source, then it doesn't matter which point is the source and which is the sensor. this symmetry is concisely summarised in this paper.

about 100 computers to render 1 image, that's a bit crazy. any reference on that? if they were just blindly shooting out photons from the light sources and recording those which hit the image plane, then they should have known better in 1991, because path tracing was developed by kajiya in 1986 when he formalised image synthesis in his seminal paper, "the rendering equation"...

regarding spectral effects such as redshift and dispersive refraction, yes my renderer can (efficiently!) do both. even polarisation of light can be taken into account.

Quote
photon mapping is a full global illumination algorithm, in that it can simulate all modes of light transport. however it is biased (roughly, you can't be sure it's converging to the right image)

How about if you throw more CPU time at it? Is accuracy increased proportionally to the CPU time?

that's a very good scientifically-minded question. it depends on how you scale up photon mapping; but yes, photon mapping can be made to approach correctness if you make the photon's region of effect really really small. this is extremely inefficient, but it sort of works (however it's still an approximation since you have a nonzero radius of effect): http://www.winosi.onlinehome.de/
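the bias being described can be shown with a toy version of the gather step: a photon map estimates radiance at a point by averaging over a fixed-radius neighbourhood, so any curvature in the true signal leaks into the answer. the quadratic "radiance" f below is purely illustrative:

```python
def gathered_estimate(f, x0, radius, n=10_001):
    """Photon-gathering analogue in 1D: estimate f(x0) by averaging f
    over [x0 - radius, x0 + radius], as a density query with a fixed
    gather radius does."""
    xs = [x0 - radius + 2 * radius * i / (n - 1) for i in range(n)]
    return sum(f(x) for x in xs) / n

f = lambda x: x * x           # stand-in for the true incoming radiance
bias_wide   = gathered_estimate(f, 1.0, 0.50) - f(1.0)
bias_narrow = gathered_estimate(f, 1.0, 0.25) - f(1.0)
# halving the gather radius quarters the bias (it scales like radius^2),
# but it never vanishes for any nonzero radius -- hence "consistent but
# biased": more photons shrink the variance, only a shrinking radius
# shrinks the bias
```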

be sure to read this excellent article comparing the "accuracy" of various rendering methods: http://www.cgafaq.info/wiki/Bias_in_rendering

I have to say, some of these, although grainy, do look very realistic.

yeah, unfortunately naively rendering fractal surfaces without distance estimation is several hundred times slower than rendering "normal" scenes.

Still love to see a zoom into that top left cave here: http://www.fractographer.com/wip/3dmandel_slice.jpg

i'll look into it now. getting light in there will be tricky...

By the way, what's the building block of these images? Billions of little cubes? Polygons? Pure mathematical curves? I can't imagine the latter, since the difficulty in translating from fractal to curve is daunting (at least for me).

it's working directly with the procedure you gave, same as your images were produced. your rays were going out in parallel bundles, mine leave in perspective and bounce around the scene looking for light sources - that's about it.

Quote
that same accuracy extends to less simple scenes, and if you want to see what i'm chasing see http://www.maxwellrender.com/

What does that have, that your algorithm lacks?

it's difficult to say exactly what technology they're using because they're obviously not going to disclose it, but it's hypothesised that they're also using (some variant of) metropolis light transport. there are a bunch of renderers out there which work in the same way, as does mine.

regarding features, mine can't be compared to any of them as i've hardly worked on it at all and have absolutely no hooks for polygonal data (let alone import from 3ds max, maya, etc.), no material system, no support for textures, no nothing basically ;) this is why i need to work on it. however, wrt the core algorithm i think there's rough parity; i just need to take that powerful engine and do useful things with it, these simple renders i've done don't even scratch the surface (as you'll see looking at maxwell render's gallery).
« Last Edit: November 23, 2007, 03:47:14 AM by lycium »

David Makin
« Reply #29 on: November 23, 2007, 07:20:18 PM »

Quote
That render doesn't half remind me of this:

http://www.youtube.com/v/jeVDBvE5Bg0&rel=1&fs=1&hd=1

Hi David, I'd still love to see that quaternion technique for creating the 3D Mandelbrot set. Any attempts so far?


Not yet, as I'm concentrating on the 3D IFS/RIFS formula I just released for Ultrafractal at the moment :)