OpenGL ES 2.0 PNG Alpha Channel
I am just learning to work with OpenGL ES 2.0 for Android. I have been trying to simply display a texture in the middle of the screen, which was easy enough, but I cannot seem to get the PNG alpha to work properly. The image either shows with a black background, or the entire image is blended slightly into the background color, depending on the settings I use.
The actual tutorials I followed to get to this point never dealt with transparency, so I have tried to work in code I found by searching around, and I have likely just missed one important step. I have searched quite a lot to figure this problem out, though, and I have not seen any answers covering something my setup does not already have. I have tried every combination of glBlendFunc and the like with no luck.
I figured if I tried to paste in all of the code that may be related to this, the question would seem very bloated, so I will be happy to post any bits of code you guys ask for. I'd greatly appreciate any ideas for what I should try next.
EDIT :: Here is my fragment shader, which is what I believe to be the cause. This is the only part for which I never really found a decent example of working with transparency; everything else matches what I have seen elsewhere.
final String fragmentShader =
      "precision mediump float;                        \n"
    + "varying vec2 v_Color;                           \n"
    + "uniform sampler2D s_baseMap;                    \n"
    + "void main()                                     \n"
    + "{                                               \n"
    + "   vec4 baseColor;                              \n"
    + "   baseColor = texture2D( s_baseMap, v_Color ); \n"
    + "   gl_FragColor = baseColor;                    \n"
    + "}                                               \n";
It never does anything with the alpha explicitly (it is from an example that doesn't use it, after all), but I still don't know much about fragment shaders. Because it seemed to "sort of" work when it blended the image into the background, I figured it was handling the alpha in some form and I just had something set wrong.
EDIT :: Here is the "loadTexture" method. It is roughly the same as the example from the OpenGL ES 2.0 book I am trying to learn from, with a few alterations that seem to get the image closer to working properly.
private int loadTexture ( InputStream is )
{
    int[] textureId = new int[1];
    Bitmap bitmap;
    bitmap = BitmapFactory.decodeStream(is);

    byte[] buffer = new byte[bitmap.getWidth() * bitmap.getHeight() * 4];
    for ( int y = 0; y < bitmap.getHeight(); y++ )
        for ( int x = 0; x < bitmap.getWidth(); x++ )
        {
            int pixel = bitmap.getPixel(x, y);
            buffer[(y * bitmap.getWidth() + x) * 4 + 0] = (byte)((pixel >> 16) & 0xFF);
            buffer[(y * bitmap.getWidth() + x) * 4 + 1] = (byte)((pixel >> 8) & 0xFF);
            buffer[(y * bitmap.getWidth() + x) * 4 + 2] = (byte)((pixel >> 0) & 0xFF);
        }

    ByteBuffer byteBuffer = ByteBuffer.allocateDirect(bitmap.getWidth() * bitmap.getHeight() * 4);
    byteBuffer.put(buffer).position(0);

    GLES20.glGenTextures ( 1, textureId, 0 );
    GLES20.glBindTexture ( GLES20.GL_TEXTURE_2D, textureId[0] );
    GLES20.glTexImage2D ( GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, bitmap.getWidth(), bitmap.getHeight(), 0,
                          GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, byteBuffer );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE );

    return textureId[0];
}
I understand what the code is doing, but admittedly it still confuses me a bit, so I may just be missing something obvious due to my lack of knowledge.
I don't see any other parts of my code that look like they could cause the kind of problems I am having, but programming is always full of the unexpected (especially in the world of OpenGL, it seems), so if you think something else is the cause, I'll be sure to post that for you as well. Sorry for all the trouble!
3 Answers
Change … into … to include the alpha information, then simply add … right before you draw the texture. Remember to disable GL_BLEND once you are done.
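The code snippets this answer refers to are not reproduced above. As a rough sketch of the draw-time part it describes (drawTexturedQuad() is a placeholder for whatever call actually renders the quad, not something from the original post):

GLES20.glEnable ( GLES20.GL_BLEND );
GLES20.glBlendFunc ( GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA );
drawTexturedQuad();                      // placeholder: whatever draws the textured quad
GLES20.glDisable ( GLES20.GL_BLEND );    // turn blending back off once the transparent geometry is drawn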
You most likely want to use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) as the blending function, and you have to make sure to also write the alpha value of the texture to the gl_FragColor fragment shader output variable. For all that to work, your uploaded texture data has to contain an alpha value and you must have used a texture format that supports an alpha channel (RGBA, RGBA8, etc.).
You could verify this by simply routing the alpha value to the RGB color components and inspecting the image that you get.
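Applied to the fragment shader from the question, that check could be a one-line change (a debugging sketch, not code from the original answer):

    + "   gl_FragColor = vec4( baseColor.aaa, 1.0 ); \n"

A texel with zero alpha then renders black and a fully opaque one renders white, which makes a missing alpha channel easy to spot.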
EDIT:
In your image loading code you forgot to copy over the alpha channel! Try the suggestion that davebytes gives.
Your initial shader is fine. Alpha is inherent in color ops; it just may not be applied, depending on blend state, mode, etc.
Given that your texture comes out black if you do fragcolor = base.aaa, that implies your texture data is 'bad'.
Looking at your texture load: yeah, it's wrong. You never copy over the alpha, just the RGB. Assuming Java clears the byte array to 0s, all alpha will be zero; that gets you your black box, and it makes the image 'vanish' once you enable alpha blending.
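For the hand-copy loop in the question, the missing write would be the fourth byte of each pixel, taken from the top eight (alpha) bits of the ARGB int returned by getPixel; purely as an illustration:

buffer[(y * bitmap.getWidth() + x) * 4 + 3] = (byte)((pixel >> 24) & 0xFF);   // alpha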
To simplify your life, instead of all the hand copying and stuff, you can simply load the bitmap normally and use the GLUtils helper to upload it instead of calling glTexImage2D directly:
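The code block that originally followed this answer is not shown above; a minimal sketch of that kind of GLUtils-based load, assuming the same InputStream parameter and filter/wrap settings as the question's loadTexture (uses android.opengl.GLUtils and android.graphics.BitmapFactory):

private int loadTexture ( InputStream is )
{
    int[] textureId = new int[1];
    Bitmap bitmap = BitmapFactory.decodeStream ( is );   // decoded with its alpha channel intact

    GLES20.glGenTextures ( 1, textureId, 0 );
    GLES20.glBindTexture ( GLES20.GL_TEXTURE_2D, textureId[0] );

    // GLUtils picks a texture format matching the Bitmap config, so RGBA data
    // (including alpha) is uploaded without any hand copying
    GLUtils.texImage2D ( GLES20.GL_TEXTURE_2D, 0, bitmap, 0 );

    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE );
    GLES20.glTexParameteri ( GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE );

    bitmap.recycle();                                    // the GL texture now owns the pixel data
    return textureId[0];
}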
Something like that. Then enable blending, use a src + inverse-src blend mode (the glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) mentioned above) if the data is not premultiplied, and render; you should get the desired result.