Drawing a texture in OpenGL while ignoring its alpha channel

Published 2024-09-08 20:24:23


I have a texture loaded into memory that is of RGBA format with various alpha values.

The image is loaded as so:

 GLuint texture = 0;
 glGenTextures(1, &texture);
 glBindTexture(GL_TEXTURE_2D, texture);
 self.texNum = texture;

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
 glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

 glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, self.imageWidth, self.imageHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, [self.imageData bytes]);

I want to know how I can draw this texture so that the alpha channel in the image is treated as all 1's and the texture is drawn like an RGB image.

Consider the base image:

http://www.ldeo.columbia.edu/~jcoplan/alpha/base.png

This image is a progression from 0 to 255 alpha and has the RGB value of 255,0,0 throughout
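For reference, a buffer matching that description can be synthesized in plain C (a sketch for testing; the function name and dimensions are my own, not from the original post): solid red throughout, with alpha ramping from 0 to 255 across the width.

```c
#include <stdint.h>
#include <stdlib.h>

/* Build a width x height RGBA8 image: solid red, alpha 0..255 left to right.
 * Assumes width >= 2. Caller frees the returned buffer. */
static uint8_t *make_alpha_gradient(int width, int height) {
    uint8_t *px = malloc((size_t)width * height * 4);
    if (!px) return NULL;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            uint8_t *p = px + ((size_t)y * width + x) * 4;
            p[0] = 255;                               /* R */
            p[1] = 0;                                 /* G */
            p[2] = 0;                                 /* B */
            p[3] = (uint8_t)(x * 255 / (width - 1));  /* A: 0 -> 255 */
        }
    }
    return px;
}
```

Such a buffer can be passed directly as the last argument of the glTexImage2D call above to reproduce the problem without loading a PNG.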

However if I draw it with blending disabled I get an image that looks like:
www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

When what I really want is an image that looks like this:
www.ldeo.columbia.edu/~jcoplan/alpha/correct.png

I'd really appreciate some pointers to have it ignore the alpha channel completely. Note that I can't just load the image in as an RGB initially because I do need the alpha channel at other points.

Edit: I tried to use GL_COMBINE to solve my problem, as follows:

glColor4f(1, 1, 1, 1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE);

glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_RGB, GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB, GL_SRC_COLOR);

glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA, GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SRC0_ALPHA, GL_PRIMARY_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA, GL_SRC_ALPHA); 
[self drawTexture];

But still no luck: it still draws the gradient as black to red.


Comments (4)

羅雙樹 2024-09-15 20:24:23


I have a texture loaded into memory that is of RGBA format with various alpha values

glDisable(GL_BLEND)

However if I draw it with blending disabled I get an image that looks like: www.ldeo.columbia.edu/~jcoplan/alpha/no_alpha.png

This happens because in your source image all the transparent pixels are black. It's a problem with your texture/image, or maybe with the loader function, but it is not an OpenGL problem.

You could probably try to fix it using glTexEnv(GL_COMBINE ...) (i.e. mix the texture color with an underlying color based on the alpha channel), but since I haven't done anything like that, I'm not completely sure and can't give you the exact operands. It was possible in Direct3D9 (using D3DTOP_MODULATEALPHA_ADDCOLOR), so most likely there is a way to do it in OpenGL.
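To make the "transparent pixels are black" point concrete, here is a sketch of what many image loaders do when they premultiply (the helper is illustrative, not code from this answer): each channel is stored scaled by alpha, so a fully transparent red pixel is stored as 0,0,0.

```c
#include <stdint.h>

/* Premultiplied-alpha storage: channel is replaced by channel*alpha/255
 * (with rounding). A fully transparent pixel loses its color entirely. */
static uint8_t premultiply(uint8_t channel, uint8_t alpha) {
    return (uint8_t)((channel * alpha + 127) / 255);
}
```

For a red pixel, premultiply(255, 0) yields 0: the red is already gone from the stored bytes, which is why disabling blending at draw time cannot bring it back.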

原来分手还会想你 2024-09-15 20:24:23


You should not disable blending; instead, use glBlendFunc with the proper parameters:

glBlendFunc(GL_ONE, GL_ZERO);
涙—继续流 2024-09-15 20:24:23


Or you could tell OpenGL to drop the alpha channel at upload time: call glTexImage2D with the internal format set to GL_RGB while keeping the source format GL_RGBA. The pixel transfer then discards the fourth byte of every pixel, i.e. the alpha channel, and sampling that texture returns alpha = 1. (Note that GL_UNPACK_ALIGNMENT only controls row padding; it does not skip bytes within a pixel.)
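A portable CPU-side alternative (my own sketch, not from this answer) is to repack the pixels into a tightly packed RGB buffer and upload that with both format and internal format set to GL_RGB; for widths whose 3-byte rows are not 4-byte aligned, also set glPixelStorei(GL_UNPACK_ALIGNMENT, 1).

```c
#include <stdint.h>
#include <stdlib.h>

/* Repack an RGBA8 buffer into a new tightly packed RGB8 buffer,
 * dropping the alpha byte of every pixel. Caller frees the result. */
static uint8_t *rgba_to_rgb(const uint8_t *rgba, int pixel_count) {
    uint8_t *rgb = malloc((size_t)pixel_count * 3);
    if (!rgb) return NULL;
    for (int i = 0; i < pixel_count; i++) {
        rgb[i * 3 + 0] = rgba[i * 4 + 0];
        rgb[i * 3 + 1] = rgba[i * 4 + 1];
        rgb[i * 3 + 2] = rgba[i * 4 + 2];
    }
    return rgb;
}
```

This costs an extra copy per upload, but it keeps the original RGBA data intact for the places where the alpha channel is still needed.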

得不到的就毁灭 2024-09-15 20:24:23


I had a similar problem, and found out that it was because iOS image loading was doing a premultiply on the RGB values (as discussed in some of the other answers and comments here). I'd love to know whether there's a way of disabling pre-multiplication, but in the meantime I'm "un-pre-multiplying" using code derived from this thread and this thread.

    // un pre-multiply
    uint8_t *imageBytes = (uint8_t *)imageData ;
    int byteCount = width*height*4 ;
    for (int i=0; i < byteCount; i+= 4) {
        uint8_t a = imageBytes[i+3] ;
        if (a!=255 && a!=0 ){
            float alphaFactor = 255.0/a ;
            imageBytes[i] *= alphaFactor ; 
            imageBytes[i+1] *= alphaFactor ; 
            imageBytes[i+2] *= alphaFactor ; 
        }
    }
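The loop above can be factored into a per-channel helper (a sketch using the same 255.0/a factor; the clamp is my addition to guard against rounding overflow, not in the original):

```c
#include <stdint.h>

/* Undo premultiplied alpha for one channel: v * (255/a), clamped to 255.
 * a == 0 and a == 255 leave the value unchanged, matching the loop above. */
static uint8_t unpremultiply(uint8_t channel, uint8_t a) {
    if (a == 0 || a == 255) return channel;
    float v = channel * (255.0f / a);
    return (uint8_t)(v > 255.0f ? 255.0f : v);
}
```

Note the result is only exact up to integer rounding: a premultiply followed by an un-premultiply can lose a little precision in the low bits.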