How to create a first-person "space flight" camera
I'm currently attempting to create a first-person space flight camera.
First, allow me to define what I mean by that.
Notice that I am currently using Row-Major matrices in my math library (meaning, the basis vectors in my 4x4 matrices are laid out in rows, and the affine translation part is in the fourth row). Hopefully this helps clarify the order in which I multiply my matrices.
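Concretely, that convention means a point is a row vector multiplied from the left, v' = v * M, so a product A * B means "apply A, then B". A camera-to-world transform of the form R * T(position) therefore has the view matrix T(-position) * transpose(R) as its inverse, which is the shape of the matrices below.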
What I Have So Far
So far, I have successfully implemented a simple first-person camera view. The code for this is as follows:
fn fps_camera(&mut self) -> beagle_math::Mat4 {
    // World-to-view rotation is the inverse (transpose) of the camera's orientation.
    let pitch_matrix = beagle_math::Mat4::rotate_x(self.pitch_in_radians);
    let yaw_matrix = beagle_math::Mat4::rotate_y(self.yaw_in_radians);
    let view_matrix = yaw_matrix.get_transposed().mul(&pitch_matrix.get_transposed());

    // Inverse translation: move the world opposite to the camera's position.
    let translate_matrix = beagle_math::Mat4::translate(&self.position.mul(-1.0));

    // Row-vector convention: the translation is applied first, then the rotation.
    translate_matrix.mul(&view_matrix)
}
This works as expected. I am able to walk around and look around with the mouse.
What I Am Attempting to Do
However, an obvious limitation of this implementation is that since pitch and yaw are always defined relative to a global "up" direction, the moment I pitch more than 90 degrees, so that the world is essentially upside-down, my yaw movement is inverted.
What I would like to attempt to implement is what could be seen more as a first-person "space flight" camera. That is, no matter what your current orientation is, pitching up and down with the mouse will always translate into up and down in the game, relative to your current orientation. And yawing left and right with your mouse will always translate into a left and right direction, relative to your current orientation.
Unfortunately, this problem has got me stuck for days now. Bear with me, as I am new to the field of linear algebra and matrix transformations. So I must be misunderstanding or overlooking something fundamental. What I've implemented so far might thus look... stupid and naive :) Probably because it is.
What I've Tried So Far
The way that I always end up coming back to thinking about this problem is to basically redefine the world's orientation every frame. That is, in a frame, you translate, pitch, and yaw the world coordinate space using your view matrix. You then somehow redefine this orientation as being the new default or zero-rotation. By doing this, you can then, in your next frame, apply new pitch and yaw rotations based on this new default orientation. By my thinking, anyway, that would mean mouse movement always translates directly into up, down, left, and right, no matter how you are oriented, because you are basically always redefining the world coordinate space relative to what your previous orientation was, as opposed to the simple first-person camera, which always starts from the same initial coordinate space.
The latest code I have which attempts to implement my idea is as follows:
fn space_camera(&mut self) -> beagle_math::Mat4 {
    // View rotation for the orientation accumulated over previous frames.
    let previous_pitch_matrix = beagle_math::Mat4::rotate_x(self.previous_pitch);
    let previous_yaw_matrix = beagle_math::Mat4::rotate_y(self.previous_yaw);
    let previous_view_matrix = previous_yaw_matrix.get_transposed().mul(&previous_pitch_matrix.get_transposed());

    // View rotation for this frame's deltas only.
    let pitch_matrix = beagle_math::Mat4::rotate_x(self.pitch_in_radians);
    let yaw_matrix = beagle_math::Mat4::rotate_y(self.yaw_in_radians);
    let view_matrix = yaw_matrix.get_transposed().mul(&pitch_matrix.get_transposed());

    let translate_matrix = beagle_math::Mat4::translate(&self.position.mul(-1.0));

    // Save: fold this frame's deltas into the accumulated pitch and yaw.
    self.previous_pitch += self.pitch_in_radians;
    self.previous_yaw += self.yaw_in_radians;

    // Reset the per-frame deltas.
    self.pitch_in_radians = 0.0;
    self.yaw_in_radians = 0.0;

    translate_matrix.mul(&(previous_view_matrix.mul(&view_matrix)))
}
This, however, does nothing to solve the issue. It actually gives the exact same result and problem as the fps camera.
My thinking behind this code is basically: always keep track of an accumulated pitch and yaw (in the code, these are previous_pitch and previous_yaw) based on the deltas each frame. The deltas are pitch_in_radians and yaw_in_radians, which are reset every frame.
I then start off by constructing a view matrix that would represent how the world was orientated previously, that is the previous_view_matrix. I then construct a new view matrix based on the deltas of this frame, that is the view_matrix.
I then attempt to build a view matrix that does the following:
- Translate the world in the opposite direction of what represents the camera's current position. Nothing is different here from the FPS camera.
- Orient that world according to what my orientation has been so far (using the previous_view_matrix). What I would want this to represent is the default starting point for the deltas of my current frame's movement.
- Apply the deltas of the current frame using the current view matrix, represented by view_matrix.
My hope was that in step 3, the previous orientation would be seen as the starting point for a new rotation: if the world was upside-down in the previous orientation, the view_matrix would apply its yaw in terms of the camera's "up", which would then avoid the problem of inverted controls.
I must surely be either attacking the problem from the wrong angle, or misunderstanding essential parts of matrix multiplication with rotations.
Can anyone help pin-point where I'm going wrong?
[EDIT] - Rolling even when you only pitch and yaw the camera
For anyone just stumbling upon this, I fixed it by a combination of the marked answer and Locke's answer (ultimately, in the example given in my question, I also messed up the matrix multiplication order).
Additionally, when you get your camera right, you may stumble upon the odd side-effect that holding the camera stationary, and just pitching and yawing it about (such as moving your mouse around in a circle), will result in your world slowly rolling as well.
This is not a mistake; this is how rotations work in 3D. Kevin added a comment in his answer that explains it, and I also found this GameDev Stack Exchange answer explaining it in further detail.
The problem is that two numbers, pitch and yaw, provide insufficient degrees of freedom to represent consistent free rotation behavior in space without any “horizon”. Two numbers can represent a look-direction vector but they cannot represent the third component of camera orientation, called roll (rotation about the “depth” axis of the screen). As a consequence, no matter how you implement the controls, you will find that in some orientations the camera rolls strangely, because the effect of trying to do the math with this information is that every frame the roll is picked/reconstructed based on the pitch and yaw.
The minimal solution to this is to add a roll component to your camera state. However, this approach (“Euler angles”) is both tricky to compute with and has numerical stability issues (“gimbal lock”).
Instead, you should represent your camera/player orientation as a quaternion, a mathematical structure that is good for representing arbitrary rotations. Quaternions are used somewhat like rotation matrices, but have fewer components; you'll multiply quaternions by quaternions to apply player input, and convert quaternions to matrices to render with.
It is very common for general purpose game engines to use quaternions for describing objects' rotations. I haven't personally written quaternion camera code (yet!) but I'm sure the internet contains many examples and longer explanations you can work from.
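For illustration only, a rough sketch of that approach might look something like the following. It uses the glam crate rather than the question's beagle_math library (which, as far as the question shows, has no quaternion type), glam is column-major rather than row-major, and the struct and field names here are purely illustrative:

use glam::{Mat4, Quat, Vec3};

struct SpaceCamera {
    orientation: Quat, // camera-local -> world rotation, starts as Quat::IDENTITY
    position: Vec3,    // camera position in world space
}

impl SpaceCamera {
    // Apply this frame's mouse deltas relative to the camera's current orientation.
    fn apply_input(&mut self, delta_pitch: f32, delta_yaw: f32) {
        // The order of pitch vs. yaw within a single frame barely matters for small deltas.
        let delta = Quat::from_rotation_y(delta_yaw) * Quat::from_rotation_x(delta_pitch);
        // Right-multiplying applies the delta in the camera's local frame, which is
        // what makes pitch and yaw behave the same no matter how the camera is oriented.
        self.orientation = (self.orientation * delta).normalize();
    }

    // Convert back to a matrix only when rendering.
    fn view_matrix(&self) -> Mat4 {
        // The view matrix is the inverse of the camera's world transform:
        // inverse rotation (conjugate of a unit quaternion) after inverse translation.
        Mat4::from_quat(self.orientation.conjugate()) * Mat4::from_translation(-self.position)
    }
}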
It looks like a lot of the difficulty you are having is due to trying to normalize the transformation to apply the new translation. It seems like this is probably a large part of what is tripping you up. I would suggest changing how you store your position and rotation. Instead, try letting your view matrix define your position.

I made a couple of assumptions about the library:

- Mat4 implements Mul<Self>, so you do not need to call x.mul(y) explicitly and can instead use x * y. The same goes for Sub.
- There is a Mat4::rotate_xy function. If there isn't one, it would be equivalent to Mat4::rotate_xyz(delta_pitch, delta_yaw, 0.0) or Mat4::rotate_x(delta_pitch) * Mat4::rotate_y(delta_yaw).

I'm somewhat eyeballing the equations, so hopefully this is correct. The main idea is to take the delta from the previous inputs and create matrices from that, which can then be added on top of the previous view_matrix. If you attempt to take the difference after creating transformation matrices, it will only be more work for you (and your processor).

As a side note, I see you are using self.position.mul(-1.0). This tells me that your projection matrix is probably backwards. You likely want to adjust your projection matrix by scaling it by a factor of -1 in the z axis.
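As a rough sketch of that idea, restricted to the beagle_math calls already shown in the question (rotate_x, rotate_y, translate, get_transposed, mul), it might look something like the following; the accumulated rotation field and its handling here are illustrative assumptions, not code from the answer:

fn space_camera(&mut self) -> beagle_math::Mat4 {
    // This frame's delta rotation, built from the mouse input.
    let delta_pitch = beagle_math::Mat4::rotate_x(self.pitch_in_radians);
    let delta_yaw = beagle_math::Mat4::rotate_y(self.yaw_in_radians);
    let delta = delta_pitch.mul(&delta_yaw);

    // self.rotation is an assumed Mat4 field holding the camera-to-world rotation
    // accumulated over all previous frames (initialized to the identity matrix).
    // With the row-vector convention, pre-multiplying applies the delta in the
    // camera's current local frame instead of around the global axes.
    self.rotation = delta.mul(&self.rotation);

    // The deltas are now folded into the accumulated rotation, so reset them.
    self.pitch_in_radians = 0.0;
    self.yaw_in_radians = 0.0;

    // View matrix = inverse of the camera's world transform:
    // translate by -position first, then apply the transposed (inverse) rotation.
    let translate_matrix = beagle_math::Mat4::translate(&self.position.mul(-1.0));
    translate_matrix.mul(&self.rotation.get_transposed())
}

The essential difference from the question's space_camera is that the orientation is stored and accumulated as a full rotation rather than as summed pitch and yaw angles, so each frame's delta composes relative to wherever the camera is currently pointing.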