Topic: A couple of programming related questions
A Noniem
« on: August 11, 2011, 12:47:57 AM »

So I'm currently writing my own ray tracer in OpenCL and I'm starting to produce some images that look fractal-ish. However, I hit a few bumps in the road and I have a couple of questions concerning coloring, distance estimation, etc. So perhaps you guys can help me out.

- First of all, distance estimation. I've seen the name pass by quite a lot, but what is it? What does it do, how does it work, and (how) does it speed up rendering time? I googled a bit but couldn't find a good article about it. - See next one

- I'm currently using Buddhi's distance estimation formula. It works fine for scale -1.5, but for almost any other scale I get pictures that are nowhere near the ones I see in Mandelbulber.

- Second, the coloring. In 2D fractals you color the places that escape, but in 3D you color the parts of the fractal. How do you give these points a nice-looking color? I've tried a couple of things, but they all look horrible. The nicest pictures I got so far are those from the z-buffer, and I'm sure there are better ways to color a fractal  embarrass

- My main focus is the Mandelbox right now. I've got something that looks like a Mandelbox (even though when I compare its surface shape with other pictures it's not the same, which is a bit weird). However, I expected its center to be at (0,0,0), but in my case (testing with scale -1.5) it's not. Instead (0,0,0) is somewhat near a corner and the rest extends into the negatives (so the opposite corner is (-..,-...,-...)). Is my Mandelbox just plain wrong, or is it actually right, and if so, is there a formula that estimates the center of a Mandelbox with a certain scale? Bug fixed

- When shading fractals, the problem is that you don't have normals, so you have to calculate them from the "heights" of the surrounding area. I thought you could just do this as a second pass: output the first rendering pass into a regular picture and a z-buffer, then in a second pass calculate the normal at each point from the z-buffer and apply the shading to the regular picture (which is basically deferred shading, I believe). However, I stumbled on a site that does this a little differently (http://blog.hvidtfeldts.net/index.php/2011/08/distance-estimated-3d-fractals-ii-lighting-and-coloring/). It uses distance estimation (and since I don't really know how that works...). With this method you have 6 extra points to calculate per pixel drawn, but you can do all the shading in one pass and you can use higher precision than the z-buffer. So my question is: which method is faster? And when using the z-buffer, which has lower precision than the other method, do you see artifacts? Got it working, thanks!
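For reference, the six-evaluation method above is just a central difference of the distance estimator, i.e. an approximation of the gradient of the distance field. In sketch form (DE stands for any distance estimator, eps for a small offset; this is the same pattern the normal.x/y/z lines in the kernel below use):

Code:
// Central-difference normal from a distance estimator DE at position p
float eps = 0.001f;
float4 n;
n.x = DE(p + (float4)(eps,0,0,0)) - DE(p - (float4)(eps,0,0,0));
n.y = DE(p + (float4)(0,eps,0,0)) - DE(p - (float4)(0,eps,0,0));
n.z = DE(p + (float4)(0,0,eps,0)) - DE(p - (float4)(0,0,eps,0));
n.w = 0;
n = normalize(n);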

Well that's a lot of questions. If you can help me with any of these problems (or you have any other tips that might be useful) feel free to help me out and post here. Your help is very much appreciated!

Here is my OpenCL code. It's basically a rip-off of Buddhi's distance estimation. Using absolute values made my -1.5 scale look a lot better; before I put that in, the fractals looked a lot less detailed. I don't get why it has to be in there, though.

Code:
float mandelBoxDistance(float4 v, float scale, int maxIter)
{
    int iter = 0;
    v = fabs(v);
    float4 c = v;
    float distance = scale;
    float fixedRadius = 1.0f;

    float fR2 = fixedRadius * fixedRadius;
    float minRadius = 0.5f;
    float mR2 = minRadius * minRadius;

    while (iter < maxIter)
    {
        // Box fold: reflect each component back towards [-1, 1]
        if (v.x > 1) v.x = 2 - v.x;
        else if (v.x < -1) v.x = -2 - v.x;
        if (v.y > 1) v.y = 2 - v.y;
        else if (v.y < -1) v.y = -2 - v.y;
        if (v.z > 1) v.z = 2 - v.z;
        else if (v.z < -1) v.z = -2 - v.z;

        float r2 = v.x*v.x + v.y*v.y + v.z*v.z;
        if (r2 > 150)
            break;                      // bailout

        // Sphere fold: invert around the fixed radius
        if (r2 < mR2)
        {
            v *= fR2 / mR2;
            distance *= fR2 / mR2;
        }
        else if (r2 < fR2)
        {
            v *= fR2 / r2;
            distance *= fR2 / r2;
        }
        distance *= scale;
        v = v * scale + c;
        iter++;
    }
    // Distance estimate: |v| over the accumulated scaling factor
    return length(v) / fabs(distance);
}

kernel void MandelBox(const float4 camera,
                      const float4 middle,
                      const float4 u,
                      const float4 v,
                      const int maxIter,
                      const int2 size,
                      const float fieldOfView,
                      const float farPlane,
                      const float scale,
                      __global write_only int* output,
                      __global write_only int* zBuffer)
{
    int x = get_global_id(0);
    int y = get_global_id(1);

    int index = 3 * (x + y * size.x);

    // Pixel position on the image plane, relative to its centre
    float relx = -0.5f + (float)x / (float)size.x;
    float rely = -0.5f + (float)y / (float)size.y;

    float4 g = middle + relx * u * fieldOfView + rely * v * fieldOfView;

    float4 direction = normalize(g - camera);

    // Ray march: step forward by the (scaled) distance estimate
    float t = 0.0f;
    int steps = 0;
    while (t < farPlane)
    {
        steps++;
        float4 c = camera + t * direction;
        float distance = 0.5f * mandelBoxDistance(c, scale, maxIter);
        if (distance < 0.001f)
            break;
        t += distance;
    }
    float4 pos = camera + t * direction;

    // Normal from central differences of the distance estimator
    float deltaD = 0.001f;
    float4 deltaX = (float4)(deltaD, 0, 0, 0);
    float4 deltaY = (float4)(0, deltaD, 0, 0);
    float4 deltaZ = (float4)(0, 0, deltaD, 0);
    float4 normal;
    float4 lightSource = camera;        // light placed at the camera
    normal.x = mandelBoxDistance(pos + deltaX, scale, maxIter) - mandelBoxDistance(pos - deltaX, scale, maxIter);
    normal.y = mandelBoxDistance(pos + deltaY, scale, maxIter) - mandelBoxDistance(pos - deltaY, scale, maxIter);
    normal.z = mandelBoxDistance(pos + deltaZ, scale, maxIter) - mandelBoxDistance(pos - deltaZ, scale, maxIter);
    normal.w = 0;
    normal = normalize(normal);

    // Simple diffuse (Lambert) shading
    float4 lightVector = normalize(lightSource - pos);
    float4 halfVector = normalize(direction + lightVector);
    int specularPower = 5;
    float color = fmax(0.0f, dot(normal, lightVector));
    // pown(fmax(0.0f, dot(normal, halfVector)), specularPower)
    // Blinn-Phong shading looks rather ugly atm
    output[index]     = (uchar)(255 * color);
    output[index + 1] = (uchar)(255 * color);
    output[index + 2] = (uchar)(255 * color);
    if (t >= farPlane)                  // ray never hit anything: background
    {
        output[index]     = 0;
        output[index + 1] = 0;
        output[index + 2] = 0;
    }
    zBuffer[index / 3] = (int)((t / farPlane) * 256 * 256 * 256 * 128);
}
« Last Edit: August 12, 2011, 12:36:31 AM by A Noniem »
Syntopia
« Reply #1 on: August 11, 2011, 09:23:07 AM »

Quote
- First of all, distance estimation. I've seen the name pass by quite a lot, but what is it? What does it do, how does it work, and (how) does it speed up rendering time? I googled a bit but couldn't find a good article about it.

- Second, the coloring. In 2D fractals you color the places that escape, but in 3D you color the parts of the fractal. How do you give these points a nice-looking color? I've tried a couple of things, but they all look horrible. The nicest pictures I got so far are those from the z-buffer, and I'm sure there are better ways to color a fractal  embarrass

I see you found my blog posts :-) The first part talks a bit about distance estimation (http://blog.hvidtfeldts.net/index.php/2011/06/distance-estimated-3d-fractals-part-i/). The second part (the one you link to) also mentions the standard ways of coloring 3D fractals: orbit traps, smooth iteration count and so on.
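A minimal sketch of one of those coloring approaches, an orbit trap, in the style of the kernel from the first post. The function name and the choice of trap are purely illustrative (not from the blog or from Buddhi's code): the idea is to run the same fold-and-scale loop the distance estimator already runs, record how close the orbit ever gets to some simple shape (here the origin), and map that value to a color once the ray has hit.

Code:
// Illustrative only: orbit-trap value for a Mandelbox point (fixedRadius = 1, minRadius = 0.5)
float mandelBoxOrbitTrap(float4 v, float scale, int maxIter)
{
    v = fabs(v);
    float4 c = v;
    float trap = 1e20f;                    // smallest distance of the orbit to the trap
    for (int iter = 0; iter < maxIter; iter++)
    {
        // Box fold
        if (v.x > 1) v.x = 2 - v.x; else if (v.x < -1) v.x = -2 - v.x;
        if (v.y > 1) v.y = 2 - v.y; else if (v.y < -1) v.y = -2 - v.y;
        if (v.z > 1) v.z = 2 - v.z; else if (v.z < -1) v.z = -2 - v.z;

        float r2 = v.x*v.x + v.y*v.y + v.z*v.z;
        if (r2 > 150) break;               // bailout
        // Sphere fold
        if (r2 < 0.25f)      v *= 4.0f;
        else if (r2 < 1.0f)  v *= 1.0f / r2;

        v = v * scale + c;
        trap = fmin(trap, length(v.xyz));  // trap shape: distance to the origin
    }
    return trap;
}

Mapping the result through something like trap / (1.0f + trap) onto a grey ramp or a palette already looks a lot less flat than the raw z-buffer; a smooth iteration count works the same way, except it records when the orbit escapes rather than how close it comes.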

Quote
- When shading fractals, the problem is that you don't have normals, so you have to calculate them from the "heights" of the surrounding area. I thought you could just do this as a second pass: output the first rendering pass into a regular picture and a z-buffer, then in a second pass calculate the normal at each point from the z-buffer and apply the shading to the regular picture (which is basically deferred shading, I believe). However, I stumbled on a site that does this a little differently (http://blog.hvidtfeldts.net/index.php/2011/08/distance-estimated-3d-fractals-ii-lighting-and-coloring/). It uses distance estimation (and since I don't really know how that works...). With this method you have 6 extra points to calculate per pixel drawn, but you can do all the shading in one pass and you can use higher precision than the z-buffer. So my question is: which method is faster? And when using the z-buffer, which has lower precision than the other method, do you see artifacts?

Drawing a single pixel requires multiple (perhaps hundreds of) ray steps, and each ray step requires one distance estimator evaluation. So adding six more distance estimator evaluations for the normal vector does not add a lot of overhead and will only slow rendering by a few percent. z-buffer normals will be faster, but I think you might end up with more artifacts (not sure, though). On the other hand, the z-buffer might be useful for other stuff as well (screen-space AO and depth of field, for instance). And in OpenCL you can use whatever resolution you see fit for your z-buffer.
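For reference, here is a rough sketch of the z-buffer variant, written against the buffers of the MandelBox kernel in the first post. The kernel name ShadeFromZBuffer and the helper worldPos are made up, and the depth decoding simply inverts what that kernel writes; the normal comes from the cross product of two screen-space tangents reconstructed from neighbouring depths, which is exactly where the extra artifacts tend to come from (depth quantization, and silhouettes where neighbouring pixels belong to different surfaces).

Code:
// Hypothetical second pass: reconstruct normals from the z-buffer written by the
// MandelBox kernel and shade with a light at the camera. Parameters mirror that kernel.
float4 worldPos(int px, int py, float4 camera, float4 middle, float4 u, float4 v,
                int2 size, float fieldOfView, float farPlane, __global int* zBuffer)
{
    // Undo the depth encoding: the first pass stores (t / farPlane) * 2^31 as an int
    float t = ((float)zBuffer[px + py * size.x] / (256.0f * 256.0f * 256.0f * 128.0f)) * farPlane;
    float relx = -0.5f + (float)px / (float)size.x;
    float rely = -0.5f + (float)py / (float)size.y;
    float4 g = middle + relx * u * fieldOfView + rely * v * fieldOfView;
    return camera + t * normalize(g - camera);   // same ray construction as the render pass
}

kernel void ShadeFromZBuffer(const float4 camera, const float4 middle,
                             const float4 u, const float4 v, const int2 size,
                             const float fieldOfView, const float farPlane,
                             __global int* zBuffer, __global int* output)
{
    int x = get_global_id(0);
    int y = get_global_id(1);
    if (x >= size.x - 1 || y >= size.y - 1) return;   // need right/bottom neighbours

    float4 p  = worldPos(x,     y,     camera, middle, u, v, size, fieldOfView, farPlane, zBuffer);
    float4 px = worldPos(x + 1, y,     camera, middle, u, v, size, fieldOfView, farPlane, zBuffer);
    float4 py = worldPos(x,     y + 1, camera, middle, u, v, size, fieldOfView, farPlane, zBuffer);

    // Normal from the cross product of the two screen-space tangents
    float4 n = normalize(cross(px - p, py - p));
    if (dot(n, camera - p) < 0) n = -n;               // make it face the camera

    float shade = fmax(0.0f, dot(n, normalize(camera - p)));   // light at the camera, as before
    int index = 3 * (x + y * size.x);
    output[index]     = (uchar)(255 * shade);
    output[index + 1] = (uchar)(255 * shade);
    output[index + 2] = (uchar)(255 * shade);
}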
ker2x
« Reply #2 on: August 11, 2011, 11:08:24 AM »

Oh... now that's interesting!

As you probably noticed in the same programming section, I started by learning how to compute/render a Mandelbulb and flowabrot/flowabulb (3D Buddhabrot), then switched to OpenCL, and now I'm starting to learn how to write a raytracer smiley))

And I started learning yesterday by... reading Syntopia's blog and the links there.  grin

About distance estimation, I can't tell you why it works.
What I understand is that, instead of moving your ray by a fixed step (and probing for an intersection at every step, which is very expensive in the case of a fractal), you use the distance estimate to move the ray in large steps while it is far from the fractal (according to the DE) and in smaller steps as it gets close to the surface.
Fewer steps -> less probing -> faster (stronger, better  whistling and rolling eyes )
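In code the marching loop boils down to something like this; it is essentially the loop already in the kernel from the first post, with DE standing for any distance estimator and epsilon for the hit threshold:

Code:
// Sphere tracing: step exactly as far as the distance estimate says is safe
float t = 0.0f;
while (t < farPlane)
{
    float d = DE(camera + t * direction);
    if (d < epsilon) break;   // close enough: treat it as a hit
    t += d;                   // still far away: take a step of that size
}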


What language do you use for your raytracer? Is the source code available?  cheesy

ker2x
« Reply #3 on: August 11, 2011, 11:14:14 AM »

Btw, here is my OpenCL code for the Mandelbulb (not optimized at all, but it's a working early prototype):

Code:
__kernel void computeMandelbulb(
    __global float *DE,
    __global float *cx,
    __global float *cy,
    __global float *cz,
    int maxStep,
    int maxIter,
    __global int *cstep,
    const float p
    )
{

    const int gid = get_global_id(0);

    float x = cx[gid];
    float y = cy[gid];
    float z = -4.0;

    float nx = x;
    float ny = y;
    float nz = z;

    int step = 0;
    int iter = 0;
    float r = 0.0;
    float dr = 1.0;
    r=sqrt(x*x+y*y+z*z);
    float th = atan(y/x)*p;
    float ph = asin(z/r)*p;
    float r2p = pow(r,p);

    // March the ray forward along +z (all rays are parallel; see the note below)
    for(step = 0; step < maxStep && z < 4.0; step++) {
        //r2p = pow(r,p);
        iter = 0;
        r = 0.0;
        dr = 1.0;
        nx = x;
        ny = y;
        nz = z;
        r=sqrt(x*x+y*y+z*z);
        //th = atan(y/x)*p;
        //ph = asin(z/r)*p;
        // Standard Mandelbulb iteration in spherical coordinates, power p
        while(iter < maxIter && r<2.0  ) {
            r2p = pow(r,p);
            th=atan(ny/nx)*p;
            ph=asin(nz/r)*p;
            nx=r2p*cos(ph)*cos(th)+x;
            ny=r2p*cos(ph)*sin(th)+y;
            nz=r2p*sin(ph)+z;
            dr=dr*p*pow(r,p-1.0)+1.0;
            r=sqrt(nx*nx+ny*ny+nz*nz);
            iter++;
        }

        // Distance estimate 0.5*log(r)*r/dr: stop if close enough, otherwise step the ray forward by it
        if(0.5*log(r)*r/dr < 0.00001) {
            break;
        } else {
            z+=0.5*log(r)*r/dr;
        }
    }

    if(0.5*log(r)*r/dr < 0.00001 && 0.5*log(r)*r/dr > -0.000001) {
//      DE[gid] = r;//0.5*log(r)*r/dr;
//      cx[gid] = x;
//      cy[gid] = y;
        cz[gid] = z;
        cstep[gid] = step;
    } else {
        cstep[gid] = 0;

    }

}


(You'll notice that all my rays are parallel, which is probably not exactly what I'm supposed to do for a raytracer (I don't know  Grin with closed eyes ))
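For what it's worth, a small sketch of how perspective rays can be generated instead; this is roughly what the MandelBox kernel in the first post does, and camera, planeCenter, right, up, fieldOfView, width and height are illustrative names rather than variables from this code:

Code:
// Perspective rays: every ray starts at the camera and passes through its own
// point on an image plane in front of it.
float relx = -0.5f + (float)x / (float)width;
float rely = -0.5f + (float)y / (float)height;
float4 onPlane   = planeCenter + relx * fieldOfView * right
                               + rely * fieldOfView * up;
float4 direction = normalize(onPlane - camera);
// then march along: position = camera + t * direction;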
A Noniem
« Reply #4 on: August 11, 2011, 04:28:03 PM »

Unfortunately my OpenCL code looks a lot different, because it has perspective and I'm rendering a Mandelbox instead of a bulb.
This morning I implemented Buddhi's distance estimation formula and added shading (which luckily I had already done before) using Syntopia's blog post. I got a couple of nice images, but some are still a bit grainy (especially negative Mandelboxes), and when I render the same fractal in Mandelbulber it just looks completely different: my fractals are less boxy and have more round shapes. Also, Blinn-Phong shading makes everything look even grainier, which is rather bad.

This is an image of the inside of a scale 2.5 Mandelbox. I still need a nice way to color it, but plain black/white lighting isn't that bad either.
(it's rather large, so i'll just link to it)
http://img196.imageshack.us/img196/3376/44myfractal.jpg
Rendering takes a couple of seconds on my ATI 4350 (which is probably the slowest OpenCL-capable GPU out there  grin)

Edit:
I found a couple of bugs in my formula, which fixed all the spherical stuff, and after I implemented some basic camera movement I saw that some parts of the fractal were stretched, which I also fixed. My scale -1.5 now looks exactly like the one in Mandelbulber; however, other scales look completely different, and for positive scales the distance estimation formula doesn't work that well. I'll look into that tomorrow.
« Last Edit: August 11, 2011, 11:50:22 PM by A Noniem »
ker2x
« Reply #5 on: August 12, 2011, 11:48:45 AM »

Quote
Edit:
I found a couple of bugs in my formula, which fixed all the spherical stuff, and after I implemented some basic camera movement I saw that some parts of the fractal were stretched, which I also fixed. My scale -1.5 now looks exactly like the one in Mandelbulber; however, other scales look completely different, and for positive scales the distance estimation formula doesn't work that well. I'll look into that tomorrow.

no code available ?  cry
A Noniem
« Reply #6 on: August 12, 2011, 01:03:48 PM »

Quote
Edit:
I found a couple of bugs in my formula, which fixed all the spherical stuff, and after I implemented some basic camera movement I saw that some parts of the fractal were stretched, which I also fixed. My scale -1.5 now looks exactly like the one in Mandelbulber; however, other scales look completely different, and for positive scales the distance estimation formula doesn't work that well. I'll look into that tomorrow.

Quote
no code available ?  cry

I put the OpenCL code in the first post. Might help you guys out a little  Azn
Atm the scale -1.5 looks perfect, apart from the single-precision artifacts; however, other scales look completely different. The code should be good (otherwise scale -1.5 wouldn't look the same as the one I made in Mandelbulber, and it's almost a complete copy of Buddhi's), but most other scales are nowhere near their Mandelbulber cousins, or sometimes aren't even boxy (my scale 2 is more of a sphere).
ker2x
« Reply #7 on: August 12, 2011, 01:11:19 PM »

Thank you  angel

Edit:
Nice code. It seems that you managed to avoid the most common performance pitfalls (which is not the case in the code I pasted; it's a very direct port from my CPU code to OpenCL. I'll do that later  embarrass ).
« Last Edit: August 12, 2011, 01:35:09 PM by ker2x »
