
Author Topic: FOV vs. Zoom  (Read 5661 times)
cytotox (Posts: 39)
« on: December 19, 2011, 02:43:41 PM »

Hi Jesse

As I recently bought a new TV with a stereoscopic display (so-called 3D, with shutter glasses), I started to experiment with combining M3D-generated images into MPO files.

I have a question now: if I understand correctly, FOV is the (horizontal?) viewing angle, a value of, say, 60 meaning that the image represents the fractal as seen through an opening that limits the field of view to a total of 60°. But when I increase the zoom in the navigator (after checking the box 'Fixed zoom and steps'), the image is enlarged as if the field of vision had been reduced, and as a result the center of the fractal gets scaled (zoomed) to fill the image. Since the FOV value does not change during this operation (it still reads 60, but is it really still 60°?), is this effect actually achieved by stepping closer to the fractal?

Jesse (Posts: 1013)
« Reply #1 on: December 19, 2011, 03:20:17 PM »

M3D still works with a camera view plane, and the zoom determines its size.
The FOVy (vertical) then propagates from this view plane, so you could even give FOVy negative values! (This is disabled in the navigator, though.)

Someday, when I have a lot of time, I want to change it to the usual pinhole-camera behaviour, but I would have to change most functions of the program, with a lot of testing... that is why I have not done it yet.

If you get some object cutting at the image edge in the navigator, you can choose the fixed-steps option and increase the zoom to make this camera view plane smaller. It is usually scaled automatically with respect to the local distance estimation, but this can fail on some wild formulas.
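Jesse's description can be put in rough code form. This is an illustrative 2D model only, not M3D's actual implementation: I assume a view plane whose half-height shrinks as 1/zoom, with ray directions fanning out from the plane by the FOV. Zooming then shrinks the plane (the image appears enlarged) while leaving the ray angles, and hence the reported FOV, unchanged; a negative FOV would simply make the rays converge instead of diverge.

```python
import math

def viewplane_ray(y_ndc, zoom, fov_deg):
    """Illustrative 2D view-plane camera (NOT M3D's real code).

    y_ndc: normalized image coordinate in [-1, 1], 0 = image center.
    Returns (ray origin y on the plane, (dir_y, dir_z) unit direction).
    """
    half_h = 1.0 / zoom                          # zoom shrinks the view plane
    origin_y = y_ndc * half_h                    # rays start ON the plane
    angle = math.radians(fov_deg / 2.0) * y_ndc  # rays fan out by the FOV
    return origin_y, (math.sin(angle), math.cos(angle))

# Doubling the zoom halves the plane (enlarging the image) while the
# ray directions, i.e. the effective FOV, stay exactly the same:
print(viewplane_ray(1.0, 1.0, 60.0))  # edge ray at zoom 1
print(viewplane_ray(1.0, 2.0, 60.0))  # edge ray at zoom 2, same direction
```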
cytotox (Posts: 39)
« Reply #2 on: December 20, 2011, 01:46:57 PM »

Hi Jesse

Thanks for the reply. So the FOV angle refers to the vertical; how, then, is the horizontal angle determined? Proportionally to the image aspect ratio?

Do I understand you correctly that the projection of the fractal onto the camera view plane works as outlined in the first figure (3a), or, if negative values were allowed, as outlined in the second figure (3b)? There is probably a mistake in my figures, as I have drawn the lines from the object to the view plane parallel to the outer lines. However, if this is true, then I do not really understand the (principal) difference between the current implementation of the camera, which should be equivalent to what is outlined in the third figure (all projection rays meet at a single point behind the view plane; only true if 'rectilinear lens' is switched off?), and a pinhole camera (where all projection rays pass through a single point), as shown in the final figure (2).

(The only difference would be that in the current implementation, parts of the fractal lying between the view plane ("screen") and the "point of view" (convergence point) are not visible, whereas they would be visible if a pinhole camera were placed at that former point of view.)

All in all, the main question for me is whether it is possible to obtain a camera setting from which I can derive two images that can be combined into an artefact-free stereoscopic projection (e.g. am I allowed to use the "rectilinear lens" in this case?)...

* FOV definition 3a.pdf (13.6 KB - downloaded 299 times.)
* FOV definition 3b.pdf (13.66 KB - downloaded 230 times.)
* FOV definition,2.pdf (14.37 KB - downloaded 251 times.)
* FOV definition.pdf (13.96 KB - downloaded 302 times.)
Jesse (Posts: 1013)
« Reply #3 on: December 20, 2011, 08:11:10 PM »

With the rectilinear lens option, the rays converge into a point; the ray vector is calculated as normalized(vec3(x*FOV, y*FOV, zconst)), where x and y are the image coordinates with (0,0) in the center.
In the default option, the ray vector is calculated as normalized(vec3(sin(x), sin(y), cos(x)*cos(y))), where x and y are scaled to fit the FOV.
I don't know how much sense this makes; it just works with arbitrary FOVs.
But this does not create a focus point behind the projection plane!
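For concreteness, the two ray-direction formulas above can be transcribed directly into Python with NumPy. The exact scaling of x and y and the value of `zconst` are assumptions here; only the vector expressions come from the post.

```python
import numpy as np

def ray_dir_rectilinear(x, y, fov, zconst=1.0):
    # Rectilinear lens: all rays converge into a single point.
    # x, y are image coordinates with (0, 0) at the image center.
    v = np.array([x * fov, y * fov, zconst])
    return v / np.linalg.norm(v)

def ray_dir_default(x, y):
    # Default lens: x, y are assumed pre-scaled (in radians) to fit the FOV.
    v = np.array([np.sin(x), np.sin(y), np.cos(x) * np.cos(y)])
    return v / np.linalg.norm(v)

# Both map the image center to the straight-ahead direction (0, 0, 1):
print(ray_dir_rectilinear(0.0, 0.0, 1.0))
print(ray_dir_default(0.0, 0.0))
```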

By pinhole camera I meant specifically that all ray vectors start from the same point, which can then be regarded as the camera location.

This also points me to a fault in my stereo calculation, because I used the camera plane as the position of an eye, not the focus point. I wonder how much difference this makes and whether it can be corrected by simply giving a higher distance to the viewing screen in the stereo mode settings. *Edit: the viewing-screen distance must be lowered, not increased!

Did you have trouble with stereo renderings in M3D?
« Last Edit: December 20, 2011, 08:34:20 PM by Jesse »

cytotox (Posts: 39)
« Reply #4 on: December 21, 2011, 11:43:06 AM »

Hi Jesse

Ok, so if I understand correctly (from: "With the rectilinear lens option, the rays converge into a point; the ray vector is calculated as normalized(vec3(x*FOV, y*FOV, zconst)), where x and y are the image coordinates with (0,0) in the center" and "By pinhole camera I meant specifically that all ray vectors start from the same point, which can then be regarded as the camera location"), I should use the rectilinear lens option to have the rays converge in (= start from) a single point, which essentially produces pinhole-camera behaviour (as stated in my previous post: the two figures shown below result in essentially the same kind of projection).

As for "This also points me to a fault in my stereo calculation [...] the viewing-screen distance must be lowered, not increased!" and "Did you have trouble with stereo renderings in M3D?":

Maybe this can be corrected when you do an update/upgrade of Mandelbulb 3D. I must admit it has been a while since I tested the implemented stereo function of M3D, which I used for cross-eyed viewing on my monitor. However, what works in cross-eyed view does not seem to work (at least not problem-free) when switching to a (significantly larger) stereoscopic display. Here, for example, the horizontal separation of the two images (which can be assembled into an MPO file using the tool StereoPhotoMaker) has to be well defined, taking into account the screen size (for my 55-inch display, the horizontal width is 122 cm) as well as the interocular distance (for me ~5.4 cm, corresponding to 1920 pixels × (5.4 cm / 122 cm) ≈ 85 pixels of L/R image separation).

My current approach is to render two images of the same scene after sliding ~5-10 clicks sideways in the navigator (with fixed zoom & steps) at 4010 × 2160, then scale them down by a factor of two to 2005 × 1080, then crop the left-eye image by 85 pixels on the left and the other image by 85 pixels on the right (resulting in two 1920 × 1080 images), which are assembled into an MPO file. This has to be done so that points in the fractal lying close to infinity are projected behind the screen.
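The arithmetic in this workflow can be parameterized in a small sketch (pure Python; the function names are mine, and it assumes the rule described above: points at infinity end up separated on screen by the viewer's interocular distance).

```python
def separation_px(image_width_px, screen_width_cm, eye_sep_cm):
    # Pixels of L/R separation so that points "at infinity" appear on
    # screen separated by the viewer's interocular distance.
    return round(image_width_px * eye_sep_cm / screen_width_cm)

def crop_boxes(w, h, sep):
    # (left, top, right, bottom) crop boxes, PIL-style:
    # the left-eye image loses `sep` px on its left edge,
    # the right-eye image loses `sep` px on its right edge.
    return (sep, 0, w, h), (0, 0, w - sep, h)

sep = separation_px(1920, 122.0, 5.4)      # the ~85 px from the post
left, right = crop_boxes(2005, 1080, sep)  # both crops end up 1920 px wide
```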

This seems to work fairly well in most instances; however, changing the field of view from 90° to 60° or 30° does not seem to correlate well with the actual viewing angle I try to achieve by changing my distance to the TV panel, and pop-out effects in particular become quite eye-straining, possibly indicating a discrepancy between the horizontal separation (which defines the point of fixation and thereby the perceived position of an object relative to the viewer) and the angle from which the object was actually captured (the 'side views')...
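A hedged aside on matching FOV to viewing distance: for a geometrically consistent reproduction, the screen should subtend the same angle the camera captured. Under that assumption (treating the FOV as horizontal and using the 122 cm screen width mentioned above, both my simplifications), the matching viewing distance follows from basic trigonometry:

```python
import math

def matching_view_distance_cm(screen_width_cm, fov_deg):
    # Distance at which a screen of the given width subtends fov_deg
    # horizontally, so the rendered FOV equals the viewing angle.
    return (screen_width_cm / 2.0) / math.tan(math.radians(fov_deg) / 2.0)

# For a 122 cm wide screen: 90° needs ~61 cm, 60° ~106 cm, 30° ~228 cm,
# which suggests why a fixed seating position cannot match all three FOVs.
for fov in (90, 60, 30):
    print(fov, round(matching_view_distance_cm(122.0, fov), 1))
```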


* viewplane with rectilinear lens.png (55.44 KB, 566x499 - viewed 496 times.)

* pinhole camera projection.png (73.15 KB, 566x695 - viewed 551 times.)