TruthSerum
Guest
« on: June 28, 2014, 09:09:18 PM »
Using some algebra software I've derived the Jacobian for the Mandelbulb (specifically for the pow function for vectors - let me know if I should post it here). Then, according to this article, I should be able to use this result to help calculate the distance estimate. I'm using the good old approximation f(x+e) = f(x) + f'(x)e, which leads to the distance estimate d = f(x)/f'(x). Following this, I would expect the distance estimate to be obtainable as d = f(x)/|Jv|, where J is the Jacobian matrix at the point x, v is the direction along which I am tracing, and |.| is the vector length. But while it looks almost correct, it is simply not as accurate as the running-scalar derivative technique. Can anybody explain how to compute the same distance estimate metric using the Jacobian?
« Last Edit: June 28, 2014, 09:12:33 PM by TruthSerum »
TruthSerum
Guest
« Reply #1 on: June 28, 2014, 09:35:41 PM »
I thought I'd post this anyway. It's generating good normals for me; it's a simple cut-and-paste job out of the (discontinued) Derive 6 computer algebra software for Windows:

Matrix dpow(const Vector& p, const Real n) const
{
    const Real r  = p.length();
    const Real x  = p.x;
    const Real y  = p.y;
    const Real z  = p.z;
    const Real pi = 3.14159265359;

    Vector a(-n*z*pow(r,n)*sin(-n*acos(y/r))*sin(pi*n*sign(z)/2 - n*atan2(x,z))/(pow(x,2)+pow(z,2)),
             0,
             n*z*pow(r,n)*sin(-n*acos(y/r))*cos(pi*n*sign(z)/2 - n*atan2(x,z))/(pow(x,2)+pow(z,2)));

    Vector b(-n*pow(r,n)*sign(r)*cos(-n*acos(y/r))*cos(pi*n*sign(z)/2 - n*atan2(x,z))/sqrt(pow(r,2)-pow(y,2)),
             -n*pow(r,n)*sign(r)*sin(-n*acos(y/r))/sqrt(pow(r,2)-pow(y,2)),
             -n*pow(r,n)*sign(r)*cos(-n*acos(y/r))*sin(pi*n*sign(z)/2 - n*atan2(x,z))/sqrt(pow(r,2)-pow(y,2)));

    Vector c(n*x*pow(r,n)*sin(-n*acos(y/r))*sin(pi*n*sign(z)/2 - n*atan2(x,z))/(pow(x,2)+pow(z,2)),
             0,
             -n*x*pow(r,n)*sin(-n*acos(y/r))*cos(pi*n*sign(z)/2 - n*atan2(x,z))/(pow(x,2)+pow(z,2)));

    Matrix j = Matrix::identity();
    j.setRow(0, a);
    j.setRow(1, b);
    j.setRow(2, c);
    return j;
}
hobold
Fractal Bachius
Posts: 573
« Reply #2 on: June 28, 2014, 10:36:34 PM »
On a tangential note: there is an open-source Computer Algebra System called "Maxima" (see http://maxima.sourceforge.net ), based on Macsyma, the grandfather of such systems. It is not as polished as more modern commercial offerings like Maple or Mathematica, but Maxima is still actively maintained to this day and keeps growing in functionality. It is worth checking out for anyone who has ever worked with a computer algebra system and found it useful.
Syntopia
« Reply #3 on: June 29, 2014, 06:26:01 PM »
Quote from: TruthSerum
"But while it looks almost correct, it is simply not as accurate as the running-scalar derivative technique. Can anybody explain how to compute the same distance estimate metric using the Jacobian?"
Remember that the distance estimate is not directional - it is the closest distance from a point P to the fractal boundary in any direction. So if you want the DE, you probably want to use the matrix norm of the Jacobian instead of |Jv|. This is actually described in a bit more detail in the post about the Mandelbox: http://blog.hvidtfeldts.net/index.php/2011/11/distance-estimated-3d-fractals-vi-the-mandelbox/
Notice that the 'un-directional' DE is always less than the directional distance estimate you use, which might explain why the performance is better. Also note that the four-point numerical approximation of the gradient (the Buddhi/Makin approach) performs better than the simple directional two-point numerical derivative.
TruthSerum
Guest
« Reply #4 on: June 29, 2014, 07:32:53 PM »
Thanks - actually, I had come across your article about this too.

Quote from: Syntopia
"Also note that the four-point numerical approximation of the gradient (the Buddhi/Makin approach) performs better than the simple directional two-point numerical derivative."

Are these techniques documented somewhere? I've been told that the following norm is the equivalent of the original |f(x)|/|f'(x)|:

    sqrt[f(x)^t * (J^t * J)^(-1) * f(x)]

and I'm currently experimenting with this.

Quote from: hobold
"Maxima is still actively being maintained to this day, and keeps growing in functionality."

I have seen Maxima before, but I prefer the point-and-click interface of Derive.
« Last Edit: June 29, 2014, 09:29:51 PM by TruthSerum »
TruthSerum
Guest
« Reply #6 on: June 29, 2014, 11:48:31 PM »
Thanks, I have implemented this Buddhi-Makin technique:

- To find an estimate for the distance to the surface from a point c, begin by iterating c:

    Vector Rc = f(c)

- Compute small offsets of h along each axis and iterate these nearby points too:

    Vector Rx = f(c + Vector(h,0,0))
    Vector Ry = f(c + Vector(0,h,0))
    Vector Rz = f(c + Vector(0,0,h))

- Compute the distance of each of these new points to the original point and divide it by the offset:

    Real drx = |Rx - Rc| / h
    Real dry = |Ry - Rc| / h
    Real drz = |Rz - Rc| / h

- Compute the final distance from:

    |Rc| * log|Rc| / sqrt(drx^2 + dry^2 + drz^2)

It is an interesting technique that is very sensitive to the DE step-size used. What still concerns me, though, is that this is an approximation to the true gradient at a point, while it should be possible to calculate the distance from an analytically determined gradient and obtain a better result.
« Last Edit: June 30, 2014, 12:03:29 AM by TruthSerum »
David Makin
« Reply #7 on: July 06, 2014, 02:39:12 AM »
AFAIK, for the standard Triplex Mandy, Jos Leys showed that the fancy Jacobian stuff isn't actually required to get a decent "more" analytical solution. In fact, simply assuming that the Mandelbulb derivative can be used in the same way as, say, a quaternion seems to work fine - i.e. simply using the chain rule with triplex math, at least for simpler triplex formulas. If you start using more complex stuff, e.g. triplex higher functions or maths similar to but not the same as the standard Triplex, things don't necessarily go so well, which is where maybe the Jacobian would help - though the new method of dual numbers is damn useful, I really must try it!!