What would cause OpenGL alpha blending differences between Windows and OS X?

Posted 2024-10-17 02:14:47

There are 3 backgrounds in the below image: black, white & grey

There are 3 bars on each one: black -> transparent, white -> transparent, and colors -> transparent

I am using glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); and all my vertex colors are 1,1,1,0.
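With that blend function, the hardware computes out = src.rgb × src.a + dst.rgb × (1 − src.a). A minimal Python sketch of that math (the function name is mine, not from any GL binding), showing that a straight-alpha white source over a white background should stay white at every alpha:

```python
def blend_src_alpha(src_rgb, src_a, dst_rgb):
    """Simulate glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)
    for one pixel, with channels as floats in [0, 1]."""
    return tuple(s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))

white = (1.0, 1.0, 1.0)
# A white, straight-alpha gradient over a white background:
for a in (0.0, 0.25, 0.5, 1.0):
    assert blend_src_alpha(white, a, white) == white  # stays fully white
```

So with correct straight-alpha texture data, the white-on-white bar cannot darken; any grey means the source data or the blend state is not what it appears to be.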

The defect is really visible in the white->transparent on white background.

On Windows XP (and other windows flavors), it works perfectly, and I get fully white. On the Mac however, I get grey in the middle!

What would cause this, and why would it get darker when I'm blending white on white?

Screenshot full size is @ http://dl.dropbox.com/u/9410632/mac-colorbad.png

Screenshot

Updated info:

On Windows, the OpenGL version doesn't seem to matter: 2.0 through 3.2 all work.
On the Mac I have in front of me now, it's 2.1.

The gradients were held in textures, and all the vertexes are colored 1,1,1,1 (white rgb, full alpha). The backgrounds are just 1x1 pixel textures (atlased with the gradients) and the vertexes are colored as needed, with full alpha.

The atlas is created with glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_BGRA, GL_UNSIGNED_BYTE, data); it comes from an ARGB DDS file that I composed myself.

I should also note that everything is drawn using a trivially simple shader:

uniform sampler2D tex1;
uniform float alpha;

void main() {
    gl_FragColor = gl_Color * texture2D(tex1, gl_TexCoord[0].st) * vec4(1.0, 1.0, 1.0, alpha);
}

The alpha uniform is set to 1.0.

Now, I did try to change it so the white gradient was not a texture, but just 4 vertexes where the left ones were solid white and opaque, and the right ones were 1,1,1,0, and that worked!

I have triple checked the texture now, and it is only white, with varying alpha 1.0->0.0.

I'm thinking this may be a defaults issue. The version of opengl or the driver may initialize things differently.

For example, I recently found that everyone has GL_TEXTURE_2D glEnabled by default, but not the Intel GME965.

SOLUTION FOUND

First, a bit more background. This program is actually written in .NET (using Mono on OS X), and the DDS file I'm writing is an atlas automatically generated by compacting a directory of 24 bit PNG files into the smallest texture it can. I am loading those PNGs using System.Drawing.Bitmap and rendering them into a larger Bitmap after determining the layout. That post-layout Bitmap is then locked (to get at its bytes), and those bytes are written out to a DDS by code I wrote.

Upon reading Bahbar's advice, I checked out the textures in memory and they were indeed different! My DDS loader seems to be the culprit, not any OpenGL setting. On a hunch today, I compared the DDS files themselves on the two platforms (byte for byte), and indeed, they were different! When I load up the DDS files using WTV ( http://developer.nvidia.com/object/windows_texture_viewer.html ), they looked identical. However, WTV lets you turn off each channel (R G B A). When I toggled off the alpha channel, on Windows I saw a really bad looking image. No alpha would lead to no antialiased edges, so of course it would look horrible. When I turned off the alpha channel on the OS X DDS, it looked fine!

The PNG loader in Mono is premultiplying, causing all my issues. I entered a ticket for them ( https://bugzilla.novell.com/show_bug.cgi?id=679242 ) and have switched to directly using libpng.
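The arithmetic behind the grey: if the loader premultiplies, a white texel with alpha 0.5 is stored as (0.5, 0.5, 0.5, 0.5), and pushing that through the straight-alpha blend darkens the output. A sketch of the effect (my own illustration, not Mono's code):

```python
def blend_src_alpha(src_rgb, src_a, dst_rgb):
    # glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), one pixel, float channels
    return tuple(s * src_a + d * (1.0 - src_a) for s, d in zip(src_rgb, dst_rgb))

def premultiply(rgb, a):
    # What a premultiplying PNG loader effectively does to each texel
    return tuple(c * a for c in rgb), a

white = (1.0, 1.0, 1.0)
pre_rgb, pre_a = premultiply(white, 0.5)  # stored as (0.5, 0.5, 0.5), alpha 0.5

# Correct straight-alpha data stays white; premultiplied data goes grey:
assert blend_src_alpha(white, 0.5, white) == (1.0, 1.0, 1.0)
assert blend_src_alpha(pre_rgb, pre_a, white) == (0.75, 0.75, 0.75)
```

The alpha ends up being applied twice: once by the loader and once by the blend function, which is exactly the grey mid-gradient in the screenshot.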

Thanks everyone!

独自唱情﹋歌 2024-10-24 02:14:47

This is a bit of a stab in the dark, but check (or explicitly set) the pixel transfer modes.

The output you're getting looks like the results you'd expect if you were using textures with pre-multiplied alpha but then using the blend mode you've set. Something might have set up the pixel transfer modes to multiply alpha into the colour channels when textures are uploaded.

You could also check if the Mac result is correct (with textures) when you set the blend mode to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).
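The check this answer suggests works because glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) computes src.rgb + dst.rgb × (1 − src.a), which is exactly what the straight-alpha blend produces once the multiplication by alpha has already been baked into the texels. A sketch of that equivalence (illustrative names, same single-pixel model as above):

```python
def blend_src_alpha(src_rgb, a, dst_rgb):
    # glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) -- expects straight alpha
    return tuple(s * a + d * (1.0 - a) for s, d in zip(src_rgb, dst_rgb))

def blend_one(src_rgb, a, dst_rgb):
    # glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA) -- expects premultiplied alpha
    return tuple(s + d * (1.0 - a) for s, d in zip(src_rgb, dst_rgb))

a = 0.5
straight_rgb = (1.0, 1.0, 1.0)                    # white texel, straight alpha
premult_rgb = tuple(c * a for c in straight_rgb)  # same texel, premultiplied
dst = (0.2, 0.4, 0.6)                             # arbitrary background colour

assert blend_one(premult_rgb, a, dst) == blend_src_alpha(straight_rgb, a, dst)
```

So if switching the Mac build to (GL_ONE, GL_ONE_MINUS_SRC_ALPHA) makes the rendering correct, the texture data must already be premultiplied somewhere in the loading path.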

后eg是否自 2024-10-24 02:14:47

Check your dds loader. It might be doing the pre-multiplication that John Bartholomew was talking about on load, only on one platform.

An easy way to verify is also to look at the data as it is being loaded into the texture, on the glTexImage call. Is that data completely uniform?
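One way to do that check: dump the bytes passed to glTexImage2D and confirm the colour channels are constant white while only alpha varies. A hedged sketch over a raw BGRA byte buffer (the helper and the buffer layout are my assumptions, not code from the question):

```python
def is_white_straight_alpha(bgra_bytes):
    """Return True if every pixel's B, G, R bytes are 255 (pure white)
    regardless of the alpha byte -- i.e. no premultiplication happened."""
    for i in range(0, len(bgra_bytes), 4):
        if bgra_bytes[i] != 255 or bgra_bytes[i + 1] != 255 or bgra_bytes[i + 2] != 255:
            return False
    return True

straight = bytes([255, 255, 255, 128,  255, 255, 255, 0])  # white, varying alpha
premult  = bytes([128, 128, 128, 128,  0, 0, 0, 0])        # same pixels, premultiplied

assert is_white_straight_alpha(straight)
assert not is_white_straight_alpha(premult)
```

Run against the white-gradient region of the atlas, a failure here would point at the loader rather than at any OpenGL state, which is what the asker ultimately found.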
