Why are my textures rendering incorrectly in my OpenGL application?
I'm working with SDL and OpenGL, creating a fairly simple application. I created a basic text rendering function, which maps a generated texture onto a quad for each character. This texture is rendered from a bitmap of each character.
The bitmap is fairly small, about 800x16 pixels. It works absolutely fine on my desktop and laptop, both in and out of a VM (and on both Windows and Linux).
Now, I'm trying it on another computer, and the text becomes all garbled - it appears as though the computer can't handle a very basic thing like this. To see if it was due to the OS, I installed VirtualBox and tested it in the VM - but the result is even worse! Instead of rendering anything (albeit garbled), it just renders a plain white box.
Why is this occurring, and is there any way to solve it?
Some code - how I initialize the texture:
glGenTextures(1, &fontRef);
glBindTexture(GL_TEXTURE_2D, iFont);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, FNT_IMG_W, FNT_IMG_H, 0,
             GL_RGB, GL_UNSIGNED_BYTE, MY_FONT);
Above, MY_FONT is an unsigned char array (the raw image dump from GIMP). When I draw a character:
GLfloat ix = c * (GLfloat) FNT_CHAR_W;
// We just map each corner of the texture to a new vertex.
glTexCoord2d(ix, FNT_CHAR_H); glVertex3d(x, y, 0);
glTexCoord2d(ix + FNT_CHAR_W, FNT_CHAR_H); glVertex3d(x + iCharW, y, 0);
glTexCoord2d(ix + FNT_CHAR_W, 0); glVertex3d(x + iCharW, y + iCharH, 0);
glTexCoord2d(ix, 0); glVertex3d(x, y + iCharH, 0);
That sounds to me as if the graphics card of the machine you are working on only supports power-of-two textures (i.e. 16, 32, 64...). An 800x16 texture certainly would not work on such a card.
You can check the GL_EXTENSIONS string (via glGetString) for ARB_texture_non_power_of_two to see whether the card supports non-power-of-two textures.
Or use GLEW to do that check for you.
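For reference, here is a minimal sketch (not part of the original answer) of what that check could look like in C. It assumes a legacy/compatibility context where glGetString(GL_EXTENSIONS) is still available (which matches the immediate-mode code above); the helper name is just illustrative.

#include <string.h>
#include <GL/gl.h>

/* Returns non-zero if the driver advertises non-power-of-two texture support.
 * A plain substring search is enough here, since no other extension name
 * contains "GL_ARB_texture_non_power_of_two". */
int supports_npot_textures(void)
{
    const char *ext = (const char *) glGetString(GL_EXTENSIONS);
    if (ext == NULL)
        return 0;
    return strstr(ext, "GL_ARB_texture_non_power_of_two") != NULL;
}

/* With GLEW, the same check is a single flag after glewInit():
 *     if (GLEW_ARB_texture_non_power_of_two) { ... }
 */

If the extension is missing, the usual workaround is to pad the font bitmap up to power-of-two dimensions (for example 1024x16) and adjust the texture coordinates accordingly.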