Custom texture mapping for OpenGL with GLUT

Posted 2024-11-10 08:48:20

What I am trying to do is create custom model and texture files that are simple to load and use. Basically the files are lists of floats representing the XYZ and RGB values.
So like:
0.5
0.234
0.1
...
...
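For reference, this is roughly how such a float-list file could be read into memory (a minimal sketch; the file name, the loadFloatFile helper, and the whitespace-separated layout are assumptions of mine, not a fixed part of the format):

#include <stdio.h>
#include <stdlib.h>

/* read 'count' whitespace-separated floats from a plain-text file into a malloc'd array */
float* loadFloatFile(const char* path, int count) {
    FILE* f = fopen(path, "r");
    if (!f) return NULL;
    float* data = (float*)malloc(sizeof(float) * count);
    if (!data) { fclose(f); return NULL; }
    for (int i = 0; i < count; i++) {
        if (fscanf(f, "%f", &data[i]) != 1) {   /* bail out on a short or malformed file */
            free(data);
            fclose(f);
            return NULL;
        }
    }
    fclose(f);
    return data;
}

A 128x128 RGB texture would then be read with something like loadFloatFile("texture.dat", 128*128*3), where "texture.dat" is a placeholder name.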

Problem is I can't get my array of floats to work for the texture.
Here is how I define my array:

float* textureMap;

Here is how I initialize it:

const int SIZE = (128*128*3);

textureMap = (float*)malloc(sizeof(textureMap)*SIZE); /* note: sizeof(textureMap) is the size of a float pointer, not a float, so this over-allocates; sizeof(float) is presumably what was intended */

for (int i = 0; i < SIZE; i++) {
    textureMap[i] = 0.0f;
}

Now, using GLUT, I have created a window that lets me paint into the array and fill it with data. As you can see, all RGB values are initialized to 0.0f, so I would at least expect to see my object as black, but it just stays the default grey colour and never takes on the colours in my texture array.

Here is my call to create the texture:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 130, 130, 0, GL_RGB, GL_FLOAT, textureMap);

I have made the width and height 2^n + 2, as per the guidelines on the official OpenGL webpage, though I am not sure that this is correct given how I am trying to build my array of floats.

I have also tried calling glGetError(), with no success (that is, no errors are reported, and I have made sure that I can trigger errors by setting the width and height to -1).
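For completeness, the way I'm checking is roughly this (a small sketch; the checkGLErrors name is mine, and glGetError() returns one queued error code per call until it reports GL_NO_ERROR):

#include <stdio.h>
#include <GL/gl.h>   /* <OpenGL/gl.h> on macOS */

/* print every error currently queued by OpenGL, tagged with a caller-supplied label */
static void checkGLErrors(const char* where) {
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR) {
        fprintf(stderr, "GL error at %s: 0x%x\n", where, err);
    }
}

called as, for example, checkGLErrors("glTexImage2D") right after the upload.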

I have made sure that I am binding the texture before my call to glBegin() and have even checked these calls for errors to no avail.
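Roughly, the draw path I have in mind looks like this (a simplified sketch, not my exact code; texId and the coordinates are placeholders):

glBindTexture(GL_TEXTURE_2D, texId);            /* texId: a name previously created with glGenTextures */
glBegin(GL_QUADS);
    glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
glEnd();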

Any suggestions/pointers? Has anyone else tried to define their own texture formats before?

BTW am using quads instead of triangles at the moment, that's fine right?

Answers (2)

油焖大侠 2024-11-17 08:48:20

First, make sure you have enabled texture mapping by calling glEnable(GL_TEXTURE_2D) before rendering (I assume you don't use shaders at the moment, otherwise look there for errors).

Second, creating a texture of size 130x130 and filling it with data of size 128x128 is definitely wrong. You seem to have misunderstood those guidelines (perhaps they said something about texture borders, but in your example you don't have any border, so 128x128 should be fine).
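Something along these lines should work (a rough sketch assuming textureMap holds 128*128 RGB floats; the min-filter line is an extra detail worth setting too, since the default minification filter expects mipmaps and an incomplete texture behaves as if texturing were disabled, leaving the object untextured):

glEnable(GL_TEXTURE_2D);

GLuint texId;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);

/* the default min filter is GL_NEAREST_MIPMAP_LINEAR, which needs mipmaps;
   with no mipmaps uploaded the texture is incomplete and is ignored when drawing */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* width/height match the 128*128*3 floats actually allocated, no border */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 128, 128, 0, GL_RGB, GL_FLOAT, textureMap);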

小帐篷 2024-11-17 08:48:20

SOLVED! The problem was that I needed to call glTexImage2D(...) in the same function that draws the polygons (doh!). What I had been doing was this: whenever my texture was edited in the painting window, I called glTexImage2D(...) in that same function and then told the 3D window to refresh using glutPostWindowRedisplay(...).

Ironically, calling glTexImage2D(...) in the init function also works but only the first time for obvious reasons.
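In other words, the re-upload now happens in the 3D window's display callback, along these lines (a sketch; display3D and windowId3D are placeholder names rather than my actual code, and the likely underlying reason is that each GLUT window has its own GL context, so the upload has to happen while the 3D window's context is current):

/* display callback registered for the 3D window via glutDisplayFunc(display3D) */
void display3D(void) {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    /* re-upload the (possibly just edited) texture data here, in this window's context */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 128, 128, 0, GL_RGB, GL_FLOAT, textureMap);

    /* ... bind the texture and draw the quads as before ... */

    glutSwapBuffers();
}

/* in the painting window's code, after editing textureMap: */
glutPostWindowRedisplay(windowId3D);   /* windowId3D: the id returned by glutCreateWindow for the 3D window */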

Thanks everybody!
