Raycasting: how to correctly apply a projection matrix?

Posted on 2024-08-23 11:38:15

I am currently working on some raycasting in GLSL, which works fine. Anyway, I now want to go from orthogonal projection to perspective projection, but I am not sure how to do so properly.
Are there any good links on how to use a projection matrix with raycasting?
I am not even sure what I have to apply the matrix to (probably to the ray direction somehow?). Right now I do it like this (pseudocode):

vec3 rayDir = vec3(0.0, 0.0, -1.0); // down the negative z axis, parallel for every pixel

but now I would like to use a projMatrix which works similarly to the gluPerspective function, so that I can simply define an aspect ratio, fov, and near and far planes.
So basically, can anybody provide me with a chunk of code that sets up a projection matrix similar to what gluPerspective does?
And secondly, tell me whether it is correct to multiply it with the rayDirection?

Comments (5)

漆黑的白昼 2024-08-30 11:38:15

For raytracing in the same scene as a standard render, I have found that the following works for getting a scene-space ray from screen coordinates (e.g. render a full-screen quad from [-1,-1] to [1,1], or some sub-area within that range):

Vertex Shader

uniform mat4 invprojview;
uniform float near;
uniform float far;

attribute vec2 pos; // from [-1,-1] to [1,1]

varying lowp vec3 origin;
varying lowp vec3 ray;

void main() {
    gl_Position = vec4(pos, 0.0, 1.0);
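    // note: for a standard perspective projection times a rigid view matrix,
    // the point unprojected from the near plane comes back with w == 1/near,
    // so multiplying by `near` stands in for the usual divide by w; the same
    // idea (with `far`) underlies the ray calculation below.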
    origin = (invprojview * vec4(pos, -1.0, 1.0) * near).xyz;
    ray = (invprojview * vec4(pos * (far - near), far + near, far - near)).xyz;

    // equivalent calculation:
    // ray = (invprojview * (vec4(pos, 1.0, 1.0) * far - vec4(pos, -1.0, 1.0) * near)).xyz
}

Fragment Shader

varying lowp vec3 origin;
varying lowp vec3 ray;

void main() {
    lowp vec3 rayDir = normalize(ray);
    // Do raytracing from origin in direction rayDir
}

Note that you need to provide the inverted projection-view matrix, as well as the near and far clipping distances. I'm sure there's a way to get those clipping distances from the matrix, but I haven't figured out how.
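
For a standard gluPerspective-style matrix they can in fact be recovered from two elements. A minimal GLSL sketch, assuming that matrix convention and a hypothetical uniform proj that holds the projection matrix on its own (not the combined projection-view):

uniform mat4 proj; // hypothetical: the projection matrix alone

float nearFromProj() { return proj[3][2] / (proj[2][2] - 1.0); }
float farFromProj()  { return proj[3][2] / (proj[2][2] + 1.0); }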

This will define a ray which starts at the near plane, not at the camera's position. This gives the advantage of clipping at the same position where OpenGL clips triangles, making your ray-traced object match the scene. Since the ray variable will be the correct length to reach the far plane, you can clip there as well.
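
A minimal sketch of that far-plane clip, assuming the hit is found at parameter t along the unnormalized ray (so that origin + 1.0 * ray lies exactly on the far plane):

// in the fragment shader, after intersecting origin + t * ray
if (t < 0.0 || t > 1.0)
    discard; // the hit lies outside the [near, far] range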

As for getting a perspective matrix in the first place (and understanding the mathematics behind it), I always use this reference page:

http://www.songho.ca/opengl/gl_projectionmatrix.html

I recommend looking through the derivation on that site, but in case it becomes unavailable, here is the final projection matrix definition:

2n/(r-l)      0      (r+l)/(r-l)      0
    0     2n/(t-b)   (t+b)/(t-b)      0
    0         0     -(f+n)/(f-n)  -2fn/(f-n)
    0         0          -1           0
乜一 2024-08-30 11:38:15

To shoot rays out into the scene, you want to start by putting yourself (mentally) into the world after the projection matrix has been applied. This means that the view frustum is now a 2x2x2 box known as the canonical view volume. (In OpenGL the opposing corners of the box are (-1, -1, -1) and (1, 1, 1).) The rays you generate will (in the post-projection transformed world) start at the origin and hit the far clipping plane (located at z = 1). The "destination" of your first ray should be (-1, 1, 1) - the upper-left-hand corner of the far clipping plane. (Subsequent rays' "destinations" are calculated based on the resolution of your viewport.)

Now that you have this ray in the canonical view volume, you need to get it into standard world coordinates. How do you do this? Simple - just multiply by the inverse of the projection matrix, and then by the inverse of the view matrix, remembering to divide by w afterwards since the transform is homogeneous. This will put your rays into the same coordinate system as the objects in your scene, making ray collision testing nice and easy.
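
A minimal GLSL sketch of that unprojection, assuming a hypothetical uniform invProjView holding inverse(projection * view), with ndc being the pixel position in [-1,1]^2:

uniform mat4 invProjView; // hypothetical: inverse(projection * view)

vec3 rayFromNdc(vec2 ndc, out vec3 origin) {
    vec4 nearPt = invProjView * vec4(ndc, -1.0, 1.0); // point on the near plane
    vec4 farPt  = invProjView * vec4(ndc,  1.0, 1.0); // point on the far plane
    nearPt /= nearPt.w;  // homogeneous divide after unprojecting
    farPt  /= farPt.w;
    origin = nearPt.xyz;
    return normalize(farPt.xyz - nearPt.xyz); // world-space ray direction
}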

剩余の解释 2024-08-30 11:38:15

In perspective projection, the projection matrix describes the mapping from 3D points in the world, as seen from a pinhole camera, to 2D points on the viewport.
The eye-space coordinates in the camera frustum (a truncated pyramid) are mapped to a cube (the normalized device coordinates).

[Figure: the view frustum is mapped to the normalized-device-coordinate cube]
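
Expressed as a minimal GLSL sketch (projMat and eyePos are hypothetical names):

vec3 toNdc(mat4 projMat, vec4 eyePos) {
    vec4 clipPos = projMat * eyePos; // clip-space coordinates
    return clipPos.xyz / clipPos.w;  // perspective divide -> NDC in [-1,1]^3
}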

The perspective projection matrix looks like this (shown in column-major memory order, i.e. the transpose of the usual mathematical notation):

r = right, l = left, b = bottom, t = top, n = near, f = far

2*n/(r-l)      0              0               0
0              2*n/(t-b)      0               0
(r+l)/(r-l)    (t+b)/(t-b)    -(f+n)/(f-n)   -1    
0              0              -2*f*n/(f-n)    0

where:

a  = w / h  (the viewport aspect ratio)
ta = tan( fov_y / 2 )

2 * n / (r-l) = 1 / (ta * a)
2 * n / (t-b) = 1 / ta

If the projection is symmetric, where the line of sight is in the center of the viewport and the field of view is not displaced, then the matrix can be simplified to:

1/(ta*a)   0      0               0
0          1/ta   0               0
0          0     -(f+n)/(f-n)    -1
0          0     -2*f*n/(f-n)     0

The following function will calculate the same projection matrix as gluPerspective does:

#include <array>
#include <cmath>

const float cPI = 3.14159265f;
float ToRad( float deg ) { return deg * cPI / 180.0f; }

using TVec4  = std::array< float, 4 >;
using TMat44 = std::array< TVec4, 4 >;

TMat44 Perspective( float fov_y, float aspect, float near, float far )
{
    float fn  = far + near;
    float f_n = far - near;
    float r   = aspect;
    float t   = 1.0f / std::tan( ToRad( fov_y ) / 2.0f );

    // column-major layout, matching the matrix shown above
    return TMat44{ 
        TVec4{ t / r, 0.0f,  0.0f,                 0.0f },
        TVec4{ 0.0f,  t,     0.0f,                 0.0f },
        TVec4{ 0.0f,  0.0f, -fn / f_n,            -1.0f },
        TVec4{ 0.0f,  0.0f, -2.0f*far*near / f_n,  0.0f }
    };
}


WebGL example:

<script type="text/javascript">

var camera_vert =
"precision mediump float; \n" +
"attribute vec3 inPos; \n" +
"attribute vec3 inCol; \n" +
"varying   vec3 vertCol;" +
"uniform   mat4 u_projectionMat44;" +
"uniform   mat4 u_viewMat44;" +
"uniform   mat4 u_modelMat44;" +
"void main()" +
"{" +
"    vertCol       = inCol;" +
"    vec4 modelPos = u_modelMat44 * vec4( inPos, 1.0 );" +
"    vec4 viewPos  = u_viewMat44 * modelPos;" +
"    gl_Position   = u_projectionMat44 * viewPos;" +
"}";

var camera_frag =
"precision mediump float; \n" +
"varying vec3 vertCol;" +
"void main()" +
"{" +
"    gl_FragColor = vec4( vertCol, 1.0 );" +
"}";

var glArrayType = typeof Float32Array != "undefined" ? Float32Array : ( typeof WebGLFloatArray != "undefined" ? WebGLFloatArray : Array );

function IdentityMat44() {
  var a=new glArrayType(16);
  a[0]=1;a[1]=0;a[2]=0;a[3]=0;a[4]=0;a[5]=1;a[6]=0;a[7]=0;a[8]=0;a[9]=0;a[10]=1;a[11]=0;a[12]=0;a[13]=0;a[14]=0;a[15]=1;
  return a;
};

function Cross( a, b ) { return [ a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0], 0.0 ]; }
function Dot( a, b ) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
function Normalize( v ) {
    var len = Math.sqrt( v[0] * v[0] + v[1] * v[1] + v[2] * v[2] );
    return [ v[0] / len, v[1] / len, v[2] / len ];
}

var Camera = {};
Camera.create = function() {
    this.pos    = [0, 8, 0.5];
    this.target = [0, 0, 0];
    this.up     = [0, 0, 1];
    this.fov_y  = 90;
    this.vp     = [800, 600];
    this.near   = 0.5;
    this.far    = 100.0;
}
Camera.Perspective = function() {
    var fn = this.far + this.near;
    var f_n = this.far - this.near;
    var r = this.vp[0] / this.vp[1];
    var t = 1 / Math.tan( Math.PI * this.fov_y / 360 );
    var m = IdentityMat44();
    m[0]  = t/r; m[1]  = 0; m[2]  =  0;                              m[3]  = 0;
    m[4]  = 0;   m[5]  = t; m[6]  =  0;                              m[7]  = 0;
    m[8]  = 0;   m[9]  = 0; m[10] = -fn / f_n;                       m[11] = -1;
    m[12] = 0;   m[13] = 0; m[14] = -2 * this.far * this.near / f_n; m[15] =  0;
    return m;
}
function ToVP( v ) { return [ v[1], v[2], -v[0] ] }
Camera.LookAt = function() {
    var p = ToVP( this.pos ), t = ToVP( this.target ), u = ToVP( this.up );
    var mx = Normalize( [ t[0]-p[0], t[1]-p[1], t[2]-p[2] ] );
    var my = Normalize( Cross( u, mx ) );
    var mz = Normalize( Cross( mx, my ) );
    var eyeInv = [ -this.pos[0], -this.pos[1], -this.pos[2] ];
    var tx = Dot( eyeInv, [mx[0], my[0], mz[0]] );
    var ty = Dot( eyeInv, [mx[1], my[1], mz[1]] );
    var tz = Dot( eyeInv, [mx[2], my[2], mz[2]] ); 
    var m = IdentityMat44();
    m[0]  = mx[0]; m[1]  = mx[1]; m[2]  = mx[2]; m[3]  = 0;
    m[4]  = my[0]; m[5]  = my[1]; m[6]  = my[2]; m[7]  = 0;
    m[8]  = mz[0]; m[9]  = mz[1]; m[10] = mz[2]; m[11] = 0;
    m[12] = tx;    m[13] = ty;    m[14] = tz;    m[15] = 1; 
    return m;
}

// shader program object
var ShaderProgram = {};
ShaderProgram.Create = function( shaderList, uniformNames ) {
    var shaderObjs = [];
    for ( var i_sh = 0; i_sh < shaderList.length; ++ i_sh ) {
        var shaderObj = this.CompileShader( shaderList[i_sh].source, shaderList[i_sh].stage );
        if ( shaderObj == 0 )
          return 0;
        shaderObjs.push( shaderObj );
    }
    if ( !this.LinkProgram( shaderObjs ) )
      return 0;
    this.uniformLocation = {};
    for ( var i_n = 0; i_n < uniformNames.length; ++ i_n ) {
        var name = uniformNames[i_n];
        this.uniformLocation[name] = gl.getUniformLocation( this.prog, name );
    }
    return this.prog;
}
ShaderProgram.Use = function() { gl.useProgram( this.prog ); }
ShaderProgram.SetUniformMat44 = function( name, mat ) { gl.uniformMatrix4fv( this.uniformLocation[name], false, mat ); }
ShaderProgram.CompileShader = function( source, shaderStage ) {
    var shaderObj = gl.createShader( shaderStage );
    gl.shaderSource( shaderObj, source );
    gl.compileShader( shaderObj );
    return gl.getShaderParameter( shaderObj, gl.COMPILE_STATUS ) ? shaderObj : 0;
} 
ShaderProgram.LinkProgram = function( shaderObjs ) {
    this.prog = gl.createProgram();
    for ( var i_sh = 0; i_sh < shaderObjs.length; ++ i_sh )
        gl.attachShader( this.prog, shaderObjs[i_sh] );
    gl.linkProgram( this.prog );
    return gl.getProgramParameter( this.prog, gl.LINK_STATUS ) ? true : false;
}
        

function drawScene(){

    var canvas = document.getElementById( "camera-canvas" );
    Camera.create();
    Camera.vp = [canvas.width, canvas.height];
    var currentTime = Date.now();   
    var deltaMS = currentTime - startTime;
    Camera.pos = EllipticalPosition( 7, 4, CalcAng( currentTime, 10.0 ) );
        
    gl.viewport( 0, 0, canvas.width, canvas.height );
    gl.enable( gl.DEPTH_TEST );
    gl.clearColor( 0.0, 0.0, 0.0, 1.0 );
    gl.clear( gl.COLOR_BUFFER_BIT | gl.DEPTH_BUFFER_BIT );
    ShaderProgram.Use();
    ShaderProgram.SetUniformMat44( "u_projectionMat44", Camera.Perspective() );
    ShaderProgram.SetUniformMat44( "u_viewMat44", Camera.LookAt() );
        
    ShaderProgram.SetUniformMat44( "u_modelMat44", IdentityMat44() );
    gl.enableVertexAttribArray( prog.inPos );
    gl.bindBuffer( gl.ARRAY_BUFFER, buf.pos );
    gl.vertexAttribPointer( prog.inPos, 3, gl.FLOAT, false, 0, 0 ); 
    gl.enableVertexAttribArray( prog.inCol );
    gl.bindBuffer( gl.ARRAY_BUFFER, buf.col );
    gl.vertexAttribPointer( prog.inCol, 3, gl.FLOAT, false, 0, 0 ); 
    gl.bindBuffer( gl.ELEMENT_ARRAY_BUFFER, buf.inx );
    gl.drawElements( gl.TRIANGLES, 12, gl.UNSIGNED_SHORT, 0 );
    gl.disableVertexAttribArray( prog.inPos );
    gl.disableVertexAttribArray( prog.inCol );
}

var startTime;
function Fract( val ) { 
    return val - Math.trunc( val );
}
function CalcAng( currentTime, interval ) {
    return Fract( (currentTime - startTime) / (1000*interval) ) * 2.0 * Math.PI;
}
function CalcMove( currentTime, interval, range ) {
    var pos = Fract( (currentTime - startTime) / (1000*interval) ) * 2.0;
    pos = pos < 1.0 ? pos : (2.0 - pos);
    return range[0] + (range[1] - range[0]) * pos;
}
function EllipticalPosition( a, b, angRad ) {
    var a_b = a * a - b * b;
    var ea = (a_b <= 0) ? 0 : Math.sqrt( a_b );
    var eb = (a_b >= 0) ? 0 : Math.sqrt( -a_b );
    return [ a * Math.sin( angRad ) - ea, b * Math.cos( angRad ) - eb, 0 ];
}

var gl;
var prog;
var buf = {};
function cameraStart() {

    var canvas = document.getElementById( "camera-canvas");
    gl = canvas.getContext( "experimental-webgl" );
    if ( !gl )
      return;

    prog = ShaderProgram.Create( 
      [ { source : camera_vert, stage : gl.VERTEX_SHADER },
        { source : camera_frag, stage : gl.FRAGMENT_SHADER }
      ],
      [ "u_projectionMat44", "u_viewMat44", "u_modelMat44"] );
    if ( prog == 0 )
        return;
    prog.inPos = gl.getAttribLocation( prog, "inPos" );
    prog.inCol = gl.getAttribLocation( prog, "inCol" );

    var sin120 = 0.8660254
    var pos = [ 0.0, 0.0, 1.0, 0.0, -sin120, -0.5, sin120 * sin120, 0.5 * sin120, -0.5, -sin120 * sin120, 0.5 * sin120, -0.5 ];
    var col = [ 1.0, 0.0, 0.0, 1.0, 1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 1.0, 0.0 ];
    var inx = [ 0, 1, 2, 0, 2, 3, 0, 3, 1, 1, 3, 2 ];
    buf.pos = gl.createBuffer();
    gl.bindBuffer( gl.ARRAY_BUFFER, buf.pos );
    gl.bufferData( gl.ARRAY_BUFFER, new Float32Array( pos ), gl.STATIC_DRAW );
    buf.col = gl.createBuffer();
    gl.bindBuffer( gl.ARRAY_BUFFER, buf.col );
    gl.bufferData( gl.ARRAY_BUFFER, new Float32Array( col ), gl.STATIC_DRAW );
    buf.inx = gl.createBuffer();
    gl.bindBuffer( gl.ELEMENT_ARRAY_BUFFER, buf.inx );
    gl.bufferData( gl.ELEMENT_ARRAY_BUFFER, new Uint16Array( inx ), gl.STATIC_DRAW );

    startTime = Date.now();
    setInterval(drawScene, 50);
}

</script>

<body onload="cameraStart();">
    <canvas id="camera-canvas" style="border: none;" width="512" height="256"></canvas>
</body>

雅心素梦 2024-08-30 11:38:15

Don't try to modify your rays. Instead, do this:

a) create a matrix using the location/rotation of your camera.
b) invert the matrix
c) apply it to all the models in the scene
d) render them using your normal methods.

This is actually the way OpenGL does it as well. Rotating the camera to the right is the same as rotating the world to the left.
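
A minimal GLSL sketch of steps a) and b), assuming the camera transform is a rigid rotation plus translation (all names hypothetical). For a rigid transform the inverse is just the transposed rotation and a negated, rotated translation, so no general matrix inversion is needed (transpose() needs GLSL 1.20 / ES 3.0 or later; the same few lines work on the CPU side too):

mat4 viewFromCamera(mat3 camRot, vec3 camPos) {
    mat3 rT = transpose(camRot); // inverse of a pure rotation
    vec3 t  = -(rT * camPos);    // inverse translation
    return mat4(vec4(rT[0], 0.0),
                vec4(rT[1], 0.0),
                vec4(rT[2], 0.0),
                vec4(t,     1.0));
}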

提笔落墨 2024-08-30 11:38:15

I answer this after arriving here from a Google search.

The existing answers seem to overlook the misunderstanding in the original question.

The idea that you need to apply a projection matrix when raycasting is nonsense.

We create orthographic raycasts by starting from the view plane and tracing the same direction for each pixel; the origin of the ray changes per pixel.

We create perspective raycasts by starting at the eye position, behind the view plane, and tracing a unique direction for each pixel; i.e. the origin of the ray is fixed and the same for every pixel.
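
A minimal GLSL sketch of the two setups, assuming a camera at eye looking down the negative z axis, a view-plane half-size of (halfW, halfH), and uv in [-1,1]^2 for the current pixel (all names hypothetical):

// Orthographic: constant direction, per-pixel origin.
vec3 orthoOrigin(vec2 uv, vec3 eye, float halfW, float halfH) {
    return eye + vec3(uv.x * halfW, uv.y * halfH, 0.0);
}
const vec3 orthoDir = vec3(0.0, 0.0, -1.0);

// Perspective: constant origin (the eye), per-pixel direction.
// planeDist is the distance from the eye to the view plane.
vec3 perspectiveDir(vec2 uv, float halfW, float halfH, float planeDist) {
    return normalize(vec3(uv.x * halfW, uv.y * halfH, -planeDist));
}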

Understand that the projection matrices themselves, and the process they are usually involved in, are derived from raycasting. The perspective matrix encodes a raycast of the kind I described.

Projecting a point onto the screen amounts to casting a ray from the eye to the point and finding its intersection with the view plane...
