Render to texture OpenGL ES 2.0
I am trying to use a hardware-optimized 2D library to zoom (non-interpolated scaling) in and out of an image. Right now I am:
- Loading the original image
- Making a copy of the original image
- Using the 2D library to "zoom" into the copy
- Generating textures using glTexImage2D from the images
- Applying them to rectangles that I drew
I can't upload images (yet) but here is a link to a screenshot.
http://img.photobucket.com/albums/v336/prankstar008/zoom.png
I would like to zoom in and out of the image on the right by a certain amount every frame, and rather than kill my performance by using glTexImage2D every time, I would like to render to a texture. My questions are:
- Is this a valid application of rendering to a texture? For clarification, the 2D library takes a pointer to a buffer filled with raw RGB(A) data, and returns a pointer to the new data with the 2D operation applied.
- I think most of my confusion has to do with how textures interact with shaders. Can someone explain the simplest way to apply a texture to a surface in GLES2? I obviously have something working, and I can post snippets of code if necessary.
- Also for clarification, although I'm not sure it matters, this is being run on Android.
Thank you.
1 Comment
1) No, it is not. If you just want to zoom in and out of the image (without using your zoom library), then rendering to another texture would be a waste of time. You can zoom directly in your view.
2) Shaders are programs that calculate and transform coordinates (usually done in the vertex shader) and use those coordinates to read from a texture (which, in GLES2, can in practice only be done in the fragment shader).
I believe your shaders could look like this:
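A minimal sketch of such a vertex shader (the attribute and varying names a_position, a_texCoord and v_texCoord are just placeholders; use whatever your program already binds):

    // Pass the vertex position straight through and hand the
    // texture coordinate to the fragment shader.
    attribute vec4 a_position;
    attribute vec2 a_texCoord;
    varying vec2 v_texCoord;

    void main() {
        gl_Position = a_position;
        v_texCoord = a_texCoord;
    }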
That was the vertex shader, and now the fragment shader:
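Again only a sketch; the sampler name u_texture is a placeholder:

    // Sample the bound texture at the interpolated coordinate.
    precision mediump float;
    varying vec2 v_texCoord;
    uniform sampler2D u_texture;

    void main() {
        gl_FragColor = texture2D(u_texture, v_texCoord);
    }

The sampler reads from whichever texture is bound to the active texture unit, so for a single texture you bind it to unit 0 (glActiveTexture(GL_TEXTURE0) plus glBindTexture) and set the sampler uniform to 0 with glUniform1i.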
What you can do is include a zoom parameter (a so-called uniform) in the vertex shader:
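For example (again a sketch, with u_zoom as a made-up uniform name); scaling the coordinates around 0.5 keeps the zoom centred on the middle of the image:

    attribute vec4 a_position;
    attribute vec2 a_texCoord;
    varying vec2 v_texCoord;
    uniform float u_zoom;   // 1.0 = no zoom, 2.0 = 2x zoom in, ...

    void main() {
        gl_Position = a_position;
        // Shrink the sampled region around the texture centre to zoom in.
        v_texCoord = (a_texCoord - 0.5) / u_zoom + 0.5;
    }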
There are no changes in the fragment shader; the vertex shader just hands it different coordinates, and that's how simple it is. To set the zoom, you need to:
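Something along these lines on the application side, assuming program is your linked GLES2 program object and zoom is the float you update each frame:

    GLint zoomLoc = glGetUniformLocation(program, "u_zoom");

    glUseProgram(program);
    glUniform1f(zoomLoc, zoom);   // must be called while the program is current
    // ... then issue your usual glDrawArrays / glDrawElements call

On Android the calls are the same through the GLES20 class if you are working in Java rather than the NDK.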
Now this will cause both textures to zoom in and out. To fix this (I assume you only want the right texture to be zooming), you need to:
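Set the uniform separately for each quad before drawing it, for example (leftTexture, rightTexture, drawLeftQuad and drawRightQuad stand in for whatever you already have):

    glUseProgram(program);

    // Left quad: keep it at its original size.
    glBindTexture(GL_TEXTURE_2D, leftTexture);
    glUniform1f(zoomLoc, 1.0f);
    drawLeftQuad();

    // Right quad: apply the current zoom factor.
    glBindTexture(GL_TEXTURE_2D, rightTexture);
    glUniform1f(zoomLoc, zoom);
    drawRightQuad();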
I hope this helps ...