Welcome to Fractal Forums

Fractal Software => Programming => Topic started by: marius on August 04, 2011, 11:13:59 AM




Title: compile/run glsl shader as C++
Post by: marius on August 04, 2011, 11:13:59 AM
Inspired by eiffie's awesome mashup videos, I tinkered with boxplorer2 to get the GLSL shader code to also compile as C++, so I'd have access to the DE and such on both GPU and CPU.
Watch http://www.youtube.com/user/ytalinflusa to see why one would care about that.

Turned out not to be too bad, with some #define and other fu.
http://code.google.com/p/boxplorer2/source/detail?r=82

Mild modifications were needed to the shader code, like a few #ifdefs, but now the fractal definition is understood by both CPU and GPU.
I'll pull in the fragment main() later as well, but this is a promising start.
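
Roughly, the trick boils down to a small compatibility header that gives the C++ compiler just enough of GLSL's built-in types and functions. A minimal sketch of the idea (not the actual boxplorer2 glsl.h, which covers much more):
Code:
// glsl_compat.h -- minimal sketch of the idea, not the real boxplorer2 glsl.h.
// Supplies just enough of GLSL's built-ins so a shader file can be #included as C++.
#ifndef GLSL_COMPAT_H
#define GLSL_COMPAT_H
#include <cmath>

struct vec3 {
  float x, y, z;
  vec3(float v = 0.0f) : x(v), y(v), z(v) {}
  vec3(float a, float b, float c) : x(a), y(b), z(c) {}
};

inline vec3 operator+(const vec3& a, const vec3& b) { return vec3(a.x + b.x, a.y + b.y, a.z + b.z); }
inline vec3 operator*(const vec3& a, float s) { return vec3(a.x * s, a.y * s, a.z * s); }

inline float dot(const vec3& a, const vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
inline float length(const vec3& a) { return sqrtf(dot(a, a)); }
inline float clamp(float v, float lo, float hi) { return v < lo ? lo : (v > hi ? hi : v); }

// GLSL storage qualifiers mean nothing to the C++ build; compile them away.
#define uniform
#define varying

#endif  // GLSL_COMPAT_H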


Title: Re: compile/run glsl shader as C++
Post by: eiffie on August 04, 2011, 05:42:33 PM
Good start! I am sure I will use the glsl.h file. Very helpful but now I am envisioning an app where the user selects (or types in) a fractal formula and instantly has a full physical simulation of the object. Is that too much to ask? :)


Title: Re: compile/run glsl shader as C++
Post by: marius on August 04, 2011, 07:21:41 PM
Quote
Good start! I am sure I will use the glsl.h file. Very helpful but now I am envisioning an app where the user selects (or types in) a fractal formula and instantly has a full physical simulation of the object. Is that too much to ask? :)

If the user has a C++ compiler and 'instant' means after an edit/compile/run cycle, sure ;-)


Title: Re: compile/run glsl shader as C++
Post by: Syntopia on August 04, 2011, 07:24:00 PM
Nice work - I considered this as well for Fragmentarium, in order to have the option of switching to double precision CPU renders for final output. But I couldn't find a small embeddable C-compiler to dynamically compile the translated code :-)

You should also be aware that there is the quite impressive OpenGL Mathematics (GLM at http://glm.g-truc.net/api-0.9.2/index.html), which offers similar functionality. They've even implemented operator swizzling (as both l- and r-values), by using some kind of C++ magic.
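
For reference, plain GLM code already reads very much like GLSL on the CPU; a minimal sketch against the GLM 0.9.2-era API (the swizzle extension is opt-in and not shown here):
Code:
// Sketch: GLSL-looking CPU code using GLM's vector types and built-in-style functions.
#include <glm/glm.hpp>
#include <cstdio>

int main() {
  glm::vec3 p(0.5f, -1.0f, 2.0f);
  glm::vec3 dir = glm::normalize(glm::vec3(1.0f, 1.0f, 0.0f));
  float d = glm::dot(p, dir);                // same names as the GLSL built-ins
  glm::vec3 q = glm::clamp(p, -1.0f, 1.0f);  // component-wise, as in GLSL
  std::printf("d=%f q=(%f %f %f)\n", d, q.x, q.y, q.z);
  return 0;
}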


Title: Re: compile/run glsl shader as C++
Post by: eiffie on August 04, 2011, 10:50:45 PM
I use the TinyCC C compiler and it works well for me, but it has no C++ syntax, which is what Marius is after in order to reuse code. There must be one out there. I will keep looking.
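
For the C-only case, TinyCC embeds via its libtcc API and can compile a formula string at runtime; a minimal sketch (the DE body and everything around it are made up for illustration, not eiffie's actual setup):
Code:
// Sketch: JIT-compile a user-supplied DE() with libtcc and call it from the host.
#include <libtcc.h>
#include <cstdio>

// Toy distance estimate: squared distance to a unit sphere (illustration only).
static const char* kSource =
    "float DE(float x, float y, float z) {\n"
    "  return x*x + y*y + z*z - 1.0f;\n"
    "}\n";

int main() {
  TCCState* s = tcc_new();
  if (!s) return 1;
  tcc_set_output_type(s, TCC_OUTPUT_MEMORY);               // compile straight into memory
  if (tcc_compile_string(s, kSource) == -1) return 1;
  if (tcc_relocate(s, TCC_RELOCATE_AUTO) < 0) return 1;
  typedef float (*DEFunc)(float, float, float);
  DEFunc de = (DEFunc)tcc_get_symbol(s, "DE");
  if (!de) return 1;
  std::printf("DE(2,0,0) = %f\n", de(2.0f, 0.0f, 0.0f));   // prints 3.0
  tcc_delete(s);
  return 0;
}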


Title: Re: compile/run glsl shader as C++
Post by: marius on August 04, 2011, 11:38:18 PM
Quote
Nice work - I considered this as well for Fragmentarium, in order to have the option of switching to double precision CPU renders for final output. But I couldn't find a small embeddable C-compiler to dynamically compile the translated code :-)

Yeah, the higher-precision appeal is there, up to __float128. Haven't tried it yet, but it's on the todo list to re-render some scenes and see whether the artifacts die down.
The embeddable compiling isn't that big a deal for me; I just run 'make' a bit more often. But for a larger audience, I imagine any quirks and differences between GLSL and the C++-compiled GLSL would get too painful.
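
The precision switch itself can be as simple as a scalar typedef in the compatibility header; a sketch (the 'Float' name and the build flags are hypothetical, not boxplorer2's actual macros):
Code:
// Sketch: choose the scalar type for the CPU build of the shader code.
// GLSL_CPU_QUAD / GLSL_CPU_DOUBLE are made-up build flags for illustration.
#if defined(GLSL_CPU_QUAD)
typedef __float128 Float;   // gcc extension, ~113-bit mantissa
#elif defined(GLSL_CPU_DOUBLE)
typedef double Float;       // 64-bit, usually enough to push artifacts much deeper
#else
typedef float Float;        // 32-bit, matches what the GPU shader uses
#endif

struct vec3 { Float x, y, z; };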

Quote
You should also be aware that there is the quite impressive OpenGL Mathematics (GLM at http://glm.g-truc.net/api-0.9.2/index.html), which offers similar functionality. They've even implemented operator swizzling (as both l- and r-values), by using some kind of C++ magic.

I see. Much more serious. Operator swizzling is nifty. As noted, I had to swap operands to make the C++ compile.
Still, for a quick hack, I was pleasantly surprised how well it worked.


Title: Re: compile/run glsl shader as C++
Post by: Syntopia on August 04, 2011, 11:56:08 PM
At one point I looked at Clang/LLVM: http://clang.llvm.org/ for c++ support, but it seemed terribly complicated to embed.

For me, I think a better future solution would be to change to OpenCL, which can execute on both CPU and GPU without any hacks (and you get multithreading and SSE "for free" on the CPU).
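
To illustrate the point: the very same kernel source can be handed to a CPU device or a GPU device. A stripped-down sketch (no real DE here, and error handling is minimal):
Code:
// Sketch: build one OpenCL kernel source for a CPU device and for a GPU device.
#include <CL/cl.h>
#include <cstdio>

static const char* kSource =
    "__kernel void de(__global float* out) {\n"
    "  size_t i = get_global_id(0);\n"
    "  out[i] = (float)i * 0.5f;  /* toy body, stands in for a DE */\n"
    "}\n";

static void build_for(cl_device_type type, const char* label) {
  cl_platform_id platform;
  if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) return;
  cl_device_id device;
  if (clGetDeviceIDs(platform, type, 1, &device, NULL) != CL_SUCCESS) {
    std::printf("%s: no such device on the first platform\n", label);
    return;
  }
  cl_int err;
  cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
  cl_program prog = clCreateProgramWithSource(ctx, 1, &kSource, NULL, &err);
  err = clBuildProgram(prog, 1, &device, NULL, NULL, NULL);
  std::printf("%s: build %s\n", label, err == CL_SUCCESS ? "ok" : "failed");
  clReleaseProgram(prog);
  clReleaseContext(ctx);
}

int main() {
  build_for(CL_DEVICE_TYPE_CPU, "CPU");
  build_for(CL_DEVICE_TYPE_GPU, "GPU");
  return 0;
}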


Title: Re: compile/run glsl shader as C++
Post by: marius on August 05, 2011, 12:02:35 AM
Quote
At one point I looked at Clang/LLVM: http://clang.llvm.org/ for c++ support, but it seemed terribly complicated to embed.

For me, I think a better future solution would be to change to OpenCL, which can execute on both CPU and GPU without any hacks (and you get multithreading and SSE "for free" on the CPU).

True.
A bit too big of a change at the moment for my appetite. More like a complete rewrite.

I might push the current approach a bit further, at least to the point of being able to re-render scenes or even flights w/ just the CPU code. Probably will get distracted by coding for effie-style animations of fractals-in-fractals ;-)


Title: Re: compile/run glsl shader as C++
Post by: marius on August 05, 2011, 12:04:02 AM
..

s/effie/eiffie/g


Title: Re: compile/run glsl shader as C++
Post by: A Noniem on August 05, 2011, 12:40:14 AM
Quote
At one point I looked at Clang/LLVM: http://clang.llvm.org/ for c++ support, but it seemed terribly complicated to embed.

For me, I think a better future solution would be to change to OpenCL, which can execute on both CPU and GPU without any hacks (and you get multithreading and SSE "for free" on the CPU).


I'm currently writing my own OpenCL implementation and it is much less limiting than shader code (I did a bit of XNA/HLSL and got frustrated that XNA only supports shader model 3, which is a bitch). You won't regret changing your project to OpenCL (just make sure that you have a decent, up-to-date graphics card).


Title: Re: compile/run glsl shader as C++
Post by: AndyAlias on August 15, 2011, 05:20:49 AM
This is a very useful thread :).

I like the look of Clang (compiles C++, BSD-style license, works on most platforms?); might have to give that a go.

My solution, thrown together this morning, is to use GLM (getting rid of my ad hoc vector/matrix classes *sheds single tear*), use the glm namespace (which gets you 95% of the way to compatible code), and use #ifdef/#ifndef in the .glsl file to handle the slight language differences (method declarations, in/out, etc.).

Synopsis:

mandelbulb.h:
Code:
...
#define GLM_SWIZZLE_XYZW
#include <glm/glm.hpp>
...

mandelbulb.cpp:
Code:
#include "mandelbulb.h"
using namespace glm;
#include "../../glsl/fractals/mandelbulb.glsl"
...

mandelbulb.glsl:
Code:
#ifndef GLM_VERSION
uniform float power;
uniform int   max_iterations;
uniform float z_multiplier;
#endif

...

#ifdef GLM_VERSION
float MandelBulb::DE(vec3 p) {
#else
float DE(vec3 p) {
#endif
...
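
Completing the picture, the C++ side of this setup presumably wraps the shader functions in a class whose members replace the uniforms; something like the following guess at mandelbulb.h (sketched from the MandelBulb::DE branch above, not AndyAlias's actual header):
Code:
// Sketch of what mandelbulb.h might look like, as implied by the #ifdef branch above.
#define GLM_SWIZZLE_XYZW
#include <glm/glm.hpp>

class MandelBulb {
 public:
  // The GLSL uniforms become plain members on the CPU side.
  float power;
  int   max_iterations;
  float z_multiplier;

  // Body comes from #including mandelbulb.glsl in mandelbulb.cpp.
  float DE(glm::vec3 p);
};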


Title: Re: compile/run glsl shader as C++
Post by: marius on October 05, 2011, 12:07:57 AM
Quote
My solution, thrown together this morning, is to use GLM (getting rid of my ad hoc vector/matrix classes *sheds single tear*), use the glm namespace (which gets you 95% of the way to compatible code), and use #ifdef/#ifndef in the .glsl file to handle the slight language differences (method declarations, in/out, etc.).

Fwiw, since the last post I've tweaked my approach a bit, still ignoring GLM...  :-\
I minimized the #ifdefs in the shader code, with just a few TODOs left.
See http://code.google.com/p/boxplorer2/source/browse/trunk/cfgs/menger.cfg.data/fragment.glsl
and http://code.google.com/p/boxplorer2/source/browse/trunk/glsl.cc.

You can try the CPU only rendering version of a keyframe pretty easily now with
Code:
make -f Makefile.linux   # or Makefile.osx, or 'nmake -f Makefile.win32'
and for instance
Code:
./glsl cfgs/menger.cfg.data/combi-2.cfg   # output lands in the aptly named ./test.tga

It picks up the DE and coloring definitions from the .cfg file (from a known set declared by fragment.glsl at compile time) with some C++ fu, without recompiling.
The CPU- and GPU-generated images are pretty much identical, at least on Win7 with an AMD 5850.
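
The "C++ fu" for selecting a DE by the name found in the .cfg boils down to a string-to-function-pointer lookup over the compiled-in set; roughly along these lines (a sketch with made-up names, not the actual boxplorer2 code):
Code:
// Sketch: pick one of the compiled-in DE functions by the name read from a .cfg.
#include <cstddef>
#include <cstdio>
#include <cstring>

struct vec3 { float x, y, z; };

// A couple of compiled-in DEs (toy bodies, for illustration only).
static float de_sphere(vec3 p) { return p.x * p.x + p.y * p.y + p.z * p.z - 1.0f; }
static float de_box(vec3 p)    { return p.x > p.y ? p.x : p.y; }

typedef float (*DEFunc)(vec3);
struct DEEntry { const char* name; DEFunc func; };

// The "known set declared at compile time".
static const DEEntry kKnownDEs[] = {
  { "de_sphere", de_sphere },
  { "de_box",    de_box },
};

static DEFunc lookupDE(const char* name) {
  for (size_t i = 0; i < sizeof(kKnownDEs) / sizeof(kKnownDEs[0]); ++i)
    if (std::strcmp(kKnownDEs[i].name, name) == 0) return kKnownDEs[i].func;
  return 0;
}

int main() {
  // Pretend "de_sphere" was read from the .cfg file.
  DEFunc de = lookupDE("de_sphere");
  if (de) {
    vec3 p = { 2.0f, 0.0f, 0.0f };
    std::printf("DE = %f\n", de(p));  // prints 3.000000
  }
  return 0;
}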