Using Core Motion to determine the altitude and azimuth of the phone
It would seem like this should be something easy, but I'm just not having any luck.
What I want to be able to do is use Core Motion to keep track of where the phone's camera is pointed (for a sort of augmented reality app). The user would point the phone at a reference point (as if taking a picture of it) and hit a button to "align" the phone's position with the object. Then, as the user points the phone's camera toward other objects, I need to determine the phone's altitude (-90º to +90º) and azimuth (0º to 360º).
The Euler angles given by CMAttitude do not appear to be what I need. I've tried caching the CMAttitude obtained when the user hits "align". Then, as the user moves the phone around, I get a new CMAttitude and use multiplyByInverseOfAttitude to determine the difference from the reference attitude.
- (void)checkForMotion {
    CMMotionManager *motionMgr = appDelegate.motionMgr;
    CMDeviceMotion *deviceMotion = motionMgr.deviceMotion;
    CMAttitude *attitude = deviceMotion.attitude;

    if (referenceAttitude != nil)
    {
        // Express the current attitude relative to the cached reference.
        [attitude multiplyByInverseOfAttitude:referenceAttitude];
        double deltaAlt = attitude.pitch;
        double deltaAzm = attitude.roll;
        // Do something with deltaAlt and deltaAzm
    }
    else
    {
        NSLog(@"Getting new referenceAttitude");
        self.referenceAttitude = attitude;
    }
}
But this just isn't yielding the right results. If the phone is held vertically (long axis up), then deltaAlt works fine. As long as the phone is pointed at the horizon (alt = 0), then deltaAzm is correct as well. However, if the phone is pointed (say) 45º into the sky, then as I move the phone in azimuth, the altitude component changes as well. The two are intertwined in a way I can't understand.
Is there some simple math that I need to tease these apart? Again, I would think this would be necessary for an augmented reality app, but I've been unable to find any examples that do this.
In general, your approach sounds reasonable to me. I think it is the same approach used in the Teapot demo app. Maybe the easiest way is to check it out and have a look at it:
CMMotionManager and the Gyroscope on iPhone 4
If you plan to handle more complex motions in the future, I recommend the hard way, i.e. quaternions. Yep, not that easy, but once you get it, it's very convenient.