Converting a touch-screen point to 3D OpenGL world coordinates on iPhone
I want to be able to click the touch screen and use the point touched as the starting coordinate for a ray to be used for picking.
How do I convert the point returned from touching the screen into something I can use in the GL world coordinates?
A search brings up many confusing possibilities, including gluUnProject, along with conflicting reports about whether it is supported and how to port it.
Can someone lay it out straight for me please?
I'm using Objective-C and Xcode, and I'm compiling for iPhone.
1 Answer
Step 0: Get gluUnproject:
The reports of needing it are true. That function does all the heavy lifting for you. I know at one point the MESA project had an implementation that worked almost perfectly on iOS without modifications. I'm not sure if that's still available. Barring that, you'll just have to do some research on it and either roll your own or port someone else's. It's a bit heavy on the linear algebra, so good luck.
Step 1: Convert from UIKit coordinates to OpenGL coordinates:
This normally involves two things:
Flip the Y-coordinate, because UIKit likes its origins in the top left, whereas OpenGL likes its origins in the bottom left.
Convert from "Screen Units" to pixels. This keeps things consistent across standard and retina display devices.
Step 2: Use gluUnproject on your converted coordinate:
gluUnproject() technically converts a 3D point in window space to a 3D point in world space. So, to get a ray, you'll need to call it twice: once for the near clipping plane and once for the far clipping plane. That will give you two points, from which you can get a ray. To call gluUnproject(), you'll need access to your 2D view coordinate, the current OpenGL viewport, the current OpenGL model view matrix, and the current OpenGL projection matrix. Pseudocode: