WebGL2: How to render to a 3D texture via a shader
I read that WebGL2 gives us access to 3D textures. I am trying to use this to perform some GPU-side computation and then store the output in a 64x64x64 3D texture. The render flow is:

compute shader -> render to 3D texture -> read shader -> render to screen

Here is my simple compute shader; the RGB channels of the texture should correspond to the XYZ fragment coordinates.
#version 300 es
precision mediump sampler3D;
precision highp float;
layout(location = 0) out highp vec4 pc_fragColor;
void main() {
    vec3 color = vec3(gl_FragCoord.x / 64.0, gl_FragCoord.y / 64.0, gl_FragDepth);
    pc_fragColor.rgb = color;
    pc_fragColor.a = 1.0;
}
However, this only seems to render to a single "slice" of the 3D texture, at depth 0.0. All subsequent depths from 1 to 63 px remain black.

I have created a working demo below to demonstrate the issue.
var renderer, target3d, camera;
const SIDE = 64;
var computeMaterial, computeMesh;
var readDataMaterial, readDataMesh,
read3dTargetMaterial, read3dTargetMesh;
var textField = document.querySelector("#textField");
function init() {
    // Three.js boilerplate
    renderer = new THREE.WebGLRenderer({antialias: true});
    renderer.setSize(window.innerWidth, window.innerHeight);
    renderer.setClearColor(new THREE.Color(0x000000), 1.0);
    document.body.appendChild(renderer.domElement);
    camera = new THREE.Camera();

    // Create volume material to render to the 3D texture
    computeMaterial = new THREE.RawShaderMaterial({
        vertexShader: SIMPLE_VERTEX,
        fragmentShader: COMPUTE_FRAGMENT,
        uniforms: {
            uZCoord: { value: 0.0 },
        },
        depthTest: false,
    });
    computeMaterial.type = "VolumeShader";
    computeMesh = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), computeMaterial);

    // Left material, reads Data3DTexture
    readDataMaterial = new THREE.RawShaderMaterial({
        vertexShader: SIMPLE_VERTEX,
        fragmentShader: READ_FRAGMENT,
        uniforms: {
            uZCoord: { value: 0.0 },
            tDiffuse: { value: create3dDataTexture() }
        },
        depthTest: false
    });
    readDataMaterial.type = "DebugShader";
    readDataMesh = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), readDataMaterial);

    // Right material, reads WebGL3DRenderTarget texture
    target3d = new THREE.WebGL3DRenderTarget(SIDE, SIDE, SIDE);
    target3d.depthBuffer = false;
    read3dTargetMaterial = readDataMaterial.clone();
    read3dTargetMaterial.uniforms.tDiffuse.value = target3d.texture;
    read3dTargetMesh = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), read3dTargetMaterial);
}
// Creates 3D texture with RGB gradient along the XYZ axes
function create3dDataTexture() {
    const d = new Uint8Array( SIDE * SIDE * SIDE * 4 );
    window.dat = d;
    let i4 = 0;
    for ( let z = 0; z < SIDE; z ++ ) {
        for ( let y = 0; y < SIDE; y ++ ) {
            for ( let x = 0; x < SIDE; x ++ ) {
                d[i4 + 0] = (x / SIDE) * 255;
                d[i4 + 1] = (y / SIDE) * 255;
                d[i4 + 2] = (z / SIDE) * 255;
                d[i4 + 3] = 255; // byte value (0-255), not a normalized float
                i4 += 4;
            }
        }
    }
    const texture = new THREE.Data3DTexture( d, SIDE, SIDE, SIDE );
    texture.format = THREE.RGBAFormat;
    texture.minFilter = THREE.NearestFilter;
    texture.magFilter = THREE.NearestFilter;
    texture.unpackAlignment = 1;
    texture.needsUpdate = true;
    return texture;
}
function onResize() {
    renderer.setSize(window.innerWidth, window.innerHeight);
}
function animate(t) {
    // Render volume shader to target3d buffer
    renderer.setRenderTarget(target3d);
    renderer.render(computeMesh, camera);

    // Update z texture coordinate along sine wave
    renderer.autoClear = false;
    const sinZCoord = Math.sin(t / 1000);
    readDataMaterial.uniforms.uZCoord.value = sinZCoord;
    read3dTargetMaterial.uniforms.uZCoord.value = sinZCoord;
    textField.innerText = sinZCoord.toFixed(4);

    // Render Data3DTexture to screen
    renderer.setViewport(0, window.innerHeight - SIDE * 4, SIDE * 4, SIDE * 4);
    renderer.setRenderTarget(null);
    renderer.render(readDataMesh, camera);

    // Render WebGL3DRenderTarget texture to screen
    renderer.setViewport(SIDE * 4, window.innerHeight - SIDE * 4, SIDE * 4, SIDE * 4);
    renderer.setRenderTarget(null);
    renderer.render(read3dTargetMesh, camera);

    renderer.autoClear = true;
    requestAnimationFrame(animate);
}
init();
window.addEventListener("resize", onResize);
requestAnimationFrame(animate);
html, body {
    width: 100%;
    height: 100%;
    margin: 0;
    overflow: hidden;
}
#title {
    position: absolute;
    top: 0;
    left: 0;
    color: white;
    font-family: sans-serif;
}
h3 {
    margin: 2px;
}
<div id="title">
<h3>texDepth</h3><h3 id="textField"></h3>
</div>
<script src="https://threejs.org/build/three.js"></script>
<script>
/////////////////////////////////////////////////////////////////////////////////////
// Compute frag shader
// It should output an RGB gradient in the XYZ axes to the 3DRenderTarget
// But gl_FragCoord.z is always 0.5 and gl_FragDepth is always 0.0
const COMPUTE_FRAGMENT = `#version 300 es
precision mediump sampler3D;
precision highp float;
precision highp int;
layout(location = 0) out highp vec4 pc_fragColor;
void main() {
    vec3 color = vec3(gl_FragCoord.x / 64.0, gl_FragCoord.y / 64.0, gl_FragDepth);
    pc_fragColor.rgb = color;
    pc_fragColor.a = 1.0;
}`;
/////////////////////////////////////////////////////////////////////////////////////
// Reader frag shader
// Samples the 3D texture along uv.x, uv.y, and uniform Z coordinate
const READ_FRAGMENT = `#version 300 es
precision mediump sampler3D;
precision highp float;
precision highp int;
layout(location = 0) out highp vec4 pc_fragColor;
in vec2 vUv;
uniform sampler3D tDiffuse;
uniform float uZCoord;
void main() {
    vec3 UV3 = vec3(vUv.x, vUv.y, uZCoord);
    vec3 diffuse = texture(tDiffuse, UV3).rgb;
    pc_fragColor.rgb = diffuse;
    pc_fragColor.a = 1.0;
}
`;
/////////////////////////////////////////////////////////////////////////////////////
// Simple vertex shader,
// renders a full-screen quad with UVs without any transformations
const SIMPLE_VERTEX = `#version 300 es
precision highp float;
precision highp int;
in vec2 uv;
in vec3 position;
out vec2 vUv;
void main() {
    vUv = uv;
    gl_Position = vec4(position, 1.0);
}`;
/////////////////////////////////////////////////////////////////////////////////////
</script>
- On the left, I sample the Data3DTexture created in JavaScript. As expected, the blue channel transitions smoothly as I move up and down the depth axis.
- On the right, I sample the WebGL3DRenderTarget texture rendered by the fragment shader shown above. As you can see, it only renders to the texture when the depth coordinate is 0.0. All the other "slices" stay black.

How can I render my computation to all 64 depth slices? I am using Three.js in this demo, but I would be happy to use any other library, such as TWGL or vanilla WebGL, to achieve the same result.
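My assumption (which may be wrong) is that in vanilla WebGL2 the fix would have to happen at the framebuffer attachment, since gl.framebufferTextureLayer only attaches one Z slice of a 3D texture at a time. A minimal sketch of that attachment step, assuming an existing WebGL2 context `gl`, framebuffer `fbo`, and TEXTURE_3D object `tex3d`:

```javascript
// Sketch: attach a single Z slice of a 3D texture to a framebuffer,
// so that a subsequent draw call writes only into that slice.
// `gl`, `fbo`, and `tex3d` are assumed to already exist.
function attachSlice(gl, fbo, tex3d, layer) {
  gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
  gl.framebufferTextureLayer(
    gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
    tex3d, /* mip level */ 0, /* Z slice */ layer
  );
}
```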
1 Answer
It doesn't look documented, but you can pass a second argument to setRenderTarget to select the "layer" of the 3D render target to render to. Other than that, I don't believe there's a way to render to the full 3D volume of the target in a single draw call. This three.js example shows how to do it, but with render target arrays:
https://threejs.org/examples/?q=array#webgl2_rendertarget_texture2darray
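The layer approach could be sketched like this. This is a minimal, hedged sketch reusing the demo's names (renderer, computeMesh, camera, target3d, SIDE), and it assumes the compute shader derives its blue channel from the existing uZCoord uniform rather than from gl_FragDepth:

```javascript
// Normalized texture-space Z of a slice, sampled at the texel center.
function sliceZ(layer, side) {
  return (layer + 0.5) / side;
}

// Render the full-screen compute quad once per depth slice. The second
// argument to setRenderTarget() selects which layer of the 3D render
// target receives the draw.
function renderVolume(renderer, computeMesh, camera, target3d, side) {
  for (let z = 0; z < side; z++) {
    computeMesh.material.uniforms.uZCoord.value = sliceZ(z, side);
    renderer.setRenderTarget(target3d, z);
    renderer.render(computeMesh, camera);
  }
  renderer.setRenderTarget(null);
}
```

In animate(), the single setRenderTarget(target3d) / render() pair would then be replaced by a call like renderVolume(renderer, computeMesh, camera, target3d, SIDE).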