iOS: Confusion about texture coordinates when rendering a texture to the screen
I am trying to render a texture generated by the camera on the iPhone screen.
I downloaded the color tracking example by Brad Larson from http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios (direct link to the sample code: http://www.sunsetlakesoftware.com/sites/default/files/ColorTracking.zip).
In the drawFrame method of ColorTrackingViewController, he uses the following code to define the vertices and corresponding texture coordinates for rendering a textured square:
static const GLfloat squareVertices[] = {
    -1.0f, -1.0f,
     1.0f, -1.0f,
    -1.0f,  1.0f,
     1.0f,  1.0f,
};
static const GLfloat textureVertices[] = {
    1.0f, 1.0f,
    1.0f, 0.0f,
    0.0f, 1.0f,
    0.0f, 0.0f,
};
I don't understand why these texture coordinates work correctly.
In my opinion, and based on another example I have seen that also works correctly, they should be:
static const GLfloat textureVertices[] = {
    0.0f, 1.0f,
    1.0f, 1.0f,
    0.0f, 0.0f,
    1.0f, 0.0f,
};
I went through the whole code, but I cannot figure out why the above texture coordinates work correctly. What am I missing?
Comments (1)
I believe it is because the image data from the iPhone camera is always delivered rotated 90° counter-clockwise. To counteract that rotation, he sets the texture coordinates to be rotated 90° counter-clockwise as well. Sometimes two wrongs do make a right?