OpenGL - gluPerspective / glFrustum - zNear and zFar problems
I'm writing a space exploration application. I've decided on light years being the units and have accurately modeled the distances between stars. After tinkering and a lot of arduous work (mostly learning the ropes) I have got the camera working correctly from the point of view of a starship traversing through the cosmos.
Initially I paid no attention to the zNear parameter of gluPerspective() until I worked on planetary objects. Since my scale is in light years, I soon realized that with zNear being 1.0f I would not be able to see such objects. After experimentation I arrived at these figures:
#define POV 45
#define zNear 0.0000001f
#define zFar 100000000.0f
gluPerspective(POV, WinWidth / WinHeight, zNear, zFar);
This works exceptionally well, in that I was able to cruise my solar system (at position 0,0,0) and move up close to the planets, which look great lit and texture mapped. However, other systems (not at position 0,0,0) were much harder to cruise through, because objects moved away from the camera in unusual ways.
I noticed, however, that strange visual glitches started to take place when cruising through the universe. Objects behind me would 'wrap around' and show up ahead, and if I swung 180 degrees in the Y direction they would also appear in their original place. So when warping through space, most of the stars parallax correctly, but some appear and travel in the opposite direction (which is disturbing, to say the least).
Changing zNear to 0.1f immediately corrects ALL of these glitches (but then solar-system objects can no longer be resolved). So I'm stuck. I've also tried working with glFrustum, and it produces exactly the same results.
I use the following to view the world:
glTranslatef(pos_x, pos_y, pos_z);
With relevant camera code to orientate as required. Even disabling camera functionality does not change anything. I've even tried gluLookAt(), and again it produces the same results.
Does gluPerspective() have limits when extreme zNear / zFar values are used? I tried to reduce the range but to no avail. I even changed my world units from light years to kilometers by scaling everything up and using a bigger zNear value - nothing. HELP!
The problem is that you want to resolve too much at the same time. You want to view things on the scale of the solar system, while also having semi-galactic scale. That is simply not possible. Not with a real-time renderer.
There is only so much floating-point precision to go around. And with your zNear being incredibly close, you've basically destroyed your depth buffer for anything that is more than about 0.0001 away from your camera.
What you need to do is to draw things based on distance. Near objects (within a solar system's scale) are drawn with one perspective matrix, using one depth range (say, 0 to 0.8). Then more distant objects are drawn with a different perspective matrix and a different depth range (0.8 to 1). That's really the only way you're going to make this work.
Also, you may need to compute the matrices for objects on the CPU in double-precision math, then convert them back to single precision for OpenGL to use.
OpenGL should not be drawing anything farther from the camera than zFar, or closer to the camera than zNear.
But for things in between, OpenGL computes a depth value that is stored in the depth buffer, which it uses to tell whether one object is blocking another. Unfortunately, the depth buffer has limited precision (generally 16 or 24 bits) and, according to this, roughly log2(zFar/zNear) bits of precision are lost. Thus, a zFar/zNear ratio of 10^15 (~50 bits lost) is bound to cause problems. One option would be to slightly increase zNear (if you can). Otherwise, you will need to look into split depth buffers or logarithmic depth buffers.
Nicol Bolas already told you one piece of the story. The other is that you should start thinking about a structured way to store the coordinates: store the position of each object relative to the object that dominates it gravitationally, and use appropriate units for each level.
So you have stars. Distances between stars are measured in light-years. Stars are orbited by planets; distances within a star system are measured in light-minutes to light-hours. Planets are orbited by moons; distances in a planetary system are measured in light-seconds.
To display such scales you need to render in multiple passes. The objects with their scales form a tree. First sort the branches from distant to close, then traverse the tree depth-first. For each branching level, use projection parameters chosen so that the near and far clip planes snugly fit the objects to be rendered. Clear the depth buffer after rendering each level.