Can pixel shaders be used when rendering to an offscreen surface?

I'm considering integrating some D3D code I have with WPF via the new D3DImage, as described here:

My question is this: do pixel shaders work on offscreen surfaces?
Comments (3)
Rendering to an offscreen surface is generally less constrained than rendering directly to a back buffer. The only constraint that comes with using an offscreen surface with D3DImage is that it must be in a 32-bit RGB/ARGB format (depending on your platform). Other than that, everything the hardware has to offer is at your disposal.

In fact, many shader effects take advantage of offscreen surfaces for multipass rendering or full-screen post-processing.
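As a concrete illustration, here is a minimal Direct3D 9 sketch of creating a 32-bit ARGB render target suitable for handing to D3DImage. It assumes `device` is an already-created `IDirect3DDevice9*`, and error handling is abbreviated; it is Windows-only, not a drop-in implementation.

```cpp
#include <d3d9.h>

// Sketch: create a 32-bit ARGB offscreen render target. D3DImage requires
// a 32-bit format, which D3DFMT_A8R8G8B8 satisfies.
IDirect3DSurface9* CreateD3DImageSurface(IDirect3DDevice9* device,
                                         UINT width, UINT height)
{
    IDirect3DSurface9* surface = nullptr;

    HRESULT hr = device->CreateRenderTarget(
        width, height,
        D3DFMT_A8R8G8B8,         // 32-bit ARGB, as D3DImage expects
        D3DMULTISAMPLE_NONE, 0,  // no multisampling in this sketch
        FALSE,                   // not lockable
        &surface,
        nullptr);                // no shared handle here

    return SUCCEEDED(hr) ? surface : nullptr;
}
```

On the WPF side, the surface pointer is then passed to `D3DImage.SetBackBuffer` inside a `Lock`/`AddDirtyRect`/`Unlock` sequence; once that is wired up, any pixel shader you bind while rendering into this surface works exactly as it would against a back buffer.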
I don't know if there's anything special about it with WPF, but in general, yes, pixel shaders work on offscreen surfaces.
For some effects, rendering to a separate surface is actually required: glass refraction in front of a shader-rendered scene, for example. Pixel shaders cannot access the current screen contents, so the view must first be rendered to a buffer and then used as a texture in the refraction shader pass, letting the shader take the background colour from a pixel other than the one being calculated.
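The two-pass pattern described above can be sketched in Direct3D 9 as follows. `RenderScene`, `DrawRefractingGeometry`, `sceneTexture`, and `refractionShader` are hypothetical placeholders for the application's own resources; error handling is omitted for brevity.

```cpp
#include <d3d9.h>

void RenderScene(IDirect3DDevice9* device);            // placeholder
void DrawRefractingGeometry(IDirect3DDevice9* device); // placeholder

void RenderWithRefraction(IDirect3DDevice9* device,
                          IDirect3DTexture9* sceneTexture,  // render-target texture
                          IDirect3DPixelShader9* refractionShader)
{
    // Pass 1: render the scene into an offscreen texture.
    IDirect3DSurface9* sceneSurface = nullptr;
    sceneTexture->GetSurfaceLevel(0, &sceneSurface);

    IDirect3DSurface9* backBuffer = nullptr;
    device->GetRenderTarget(0, &backBuffer);  // remember the original target

    device->SetRenderTarget(0, sceneSurface);
    device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                  D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
    RenderScene(device);

    // Pass 2: restore the original target and draw the refracting
    // geometry, sampling the first pass's result so the pixel shader
    // can fetch the background colour from neighbouring pixels.
    device->SetRenderTarget(0, backBuffer);
    device->SetTexture(0, sceneTexture);
    device->SetPixelShader(refractionShader);
    DrawRefractingGeometry(device);

    sceneSurface->Release();
    backBuffer->Release();
}
```

The key point is that the first pass's render target is created as a texture, so the second pass can bind it with `SetTexture` and let the refraction shader sample arbitrary positions in it.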