Using a single image as a cube map in OpenGL GLSL

I am working on an augmented reality application using ARCore. I am drawing an object in my scene (assume it is a sphere) using OpenGL and GLSL shaders, and I want to do environment mapping on the object using the ARCore background texture image. I know that to create an environment map or cube map I need 6 images in OpenGL. In an AR application we only have access to the currently visible area, which the AR SDK provides as a single texture, so I want to find a way to divide this image into 6 logical parts. I have seen some examples of this, such as Convert 2:1 equirectangular panorama to cube map, but could not find anything clear. Also, those examples physically split the image into 6 parts, whereas I would prefer to transform the texture coordinates in the fragment shader to take care of it, rather than actually dividing the image and passing it as a cube map texture uniform every frame.

I am attaching the image specs and how I plan to divide it to map to the 6 faces of the cube.

Here is pseudocode of what I am trying to do in the fragment GLSL shader. I am looking for a way to convert the reflected direction into a 2D texture coordinate. If I had a cube map, I could simply sample it with the 3D direction. Since I only have one image, I want to convert this 3D direction into a 2D texture coordinate, assuming we divide the image into the 6 logical parts of a cube that surrounds the object I am drawing. Please check whether the math looks correct.
uniform sampler2D ARCoreSampler;

const float M_PI = 3.14159265358979; // GLSL does not predefine M_PI

vec3 Normal = normalize(v_normal);
vec3 ViewVector = normalize(v_camera_pos - v_model_pos);
// reflect() expects the incident vector to point towards the surface,
// so the view vector is negated here.
vec3 direction = reflect(-ViewVector, Normal);
direction = normalize(direction);

// The source image is treated as a 3x4 grid of cube-face tiles.
float unitsy = 4.0;
float unitsx = 3.0;

// atan(y, x) returns values in [-PI, PI]; wrap them into [0, 2*PI)
// so the range tests below are reachable.
float ztan = atan(direction.z, direction.x);
if (ztan < 0.0) ztan += 2.0 * M_PI;
float ytan = atan(direction.y, direction.x);
if (ytan < 0.0) ytan += 2.0 * M_PI;

vec2 uv;
// Top face: tile (column 1, row 3)
if (ytan >= M_PI/4.0 && ytan < 3.0*M_PI/4.0) {
    uv = direction.xz * 0.5 + 0.5; // remap [-1,1] to [0,1] inside the face
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.y += 3.0 / unitsy;
    uv.x += 1.0 / unitsx;
}
// Bottom face: tile (1, 1)
else if (ytan >= 5.0*M_PI/4.0 && ytan < 7.0*M_PI/4.0) {
    uv = direction.xz * 0.5 + 0.5;
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.y += 1.0 / unitsy;
    uv.x += 1.0 / unitsx;
}
// Front face: tile (1, 2)
else if (ztan >= M_PI/4.0 && ztan < 3.0*M_PI/4.0) {
    uv = direction.xy * 0.5 + 0.5;
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.y += 2.0 / unitsy;
    uv.x += 1.0 / unitsx;
}
// Left face: tile (0, 2)
else if (ztan >= 3.0*M_PI/4.0 && ztan < 5.0*M_PI/4.0) {
    uv = direction.zy * 0.5 + 0.5;
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.y += 2.0 / unitsy;
}
// Back face: tile (1, 0), so no row offset is added
else if (ztan >= 5.0*M_PI/4.0 && ztan < 7.0*M_PI/4.0) {
    uv = direction.xy * 0.5 + 0.5;
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.x += 1.0 / unitsx;
}
// Right face: tile (2, 2); this range wraps around 0, so use ||, not &&
else if (ztan >= 7.0*M_PI/4.0 || ztan < M_PI/4.0) {
    uv = direction.zy * 0.5 + 0.5;
    uv.x = uv.x / unitsx;
    uv.y = uv.y / unitsy;
    uv.y += 2.0 / unitsy;
    uv.x += 2.0 / unitsx;
}
vec4 envColor = texture(ARCoreSampler, uv);
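One caveat with this classification: atan(direction.y, direction.x) ignores the z component, so a direction dominated by z can still land in the top/bottom branches, and using the raw direction components as in-face coordinates does not actually project the direction onto the face plane. A common alternative avoids atan entirely: pick the face from the direction component with the largest absolute value and divide the other two components by it. Below is a minimal sketch of that approach, assuming the same 3x4 cross layout and tile positions used above; crossLayoutUV is a hypothetical helper name, and the per-face sign flips depend on how the faces are oriented in the source image, so they may need adjusting.

// Hypothetical helper: maps a reflection direction to a uv coordinate in a
// single 3x4 cross-layout texture. Assumed tile positions (column, row):
// +X right (2,2), -X left (0,2), +Y top (1,3), -Y bottom (1,1),
// +Z front (1,2), -Z back (1,0).
vec2 crossLayoutUV(vec3 d) {
    vec3 a = abs(d);
    vec2 face; // coordinates inside the face, each in [-1, 1]
    vec2 tile; // (column, row) of the face tile in the 3x4 grid
    if (a.x >= a.y && a.x >= a.z) {
        // X is the dominant axis: right (+X) or left (-X) face
        face = vec2(-sign(d.x) * d.z, d.y) / a.x;
        tile = (d.x > 0.0) ? vec2(2.0, 2.0) : vec2(0.0, 2.0);
    } else if (a.y >= a.x && a.y >= a.z) {
        // Y is the dominant axis: top (+Y) or bottom (-Y) face
        face = vec2(d.x, -sign(d.y) * d.z) / a.y;
        tile = (d.y > 0.0) ? vec2(1.0, 3.0) : vec2(1.0, 1.0);
    } else {
        // Z is the dominant axis: front (+Z) or back (-Z) face
        face = vec2(sign(d.z) * d.x, d.y) / a.z;
        tile = (d.z > 0.0) ? vec2(1.0, 2.0) : vec2(1.0, 0.0);
    }
    face = face * 0.5 + 0.5;               // [-1,1] -> [0,1] inside the tile
    return (tile + face) / vec2(3.0, 4.0); // position in the whole image
}

With that helper, the sampling line becomes:

vec4 envColor = texture(ARCoreSampler, crossLayoutUV(direction));

Dividing by the dominant component projects the direction onto the unit cube face, which is what real cube-map hardware does internally, so this stays consistent with ordinary samplerCube behavior while still using a single sampler2D.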