iOS CoreMotion CMAttitude relative to the north pole

I'm currently using CoreMotion's DeviceMotion to get the orientation (roll, pitch, yaw) of the iPhone. Now I would like to have these values relative to the geographic north pole; so I need a CMAttitude reference object containing the roll, pitch and yaw values that would be reported if the back of the iPhone were facing the north pole (in 3D). CLLocationManager only returns the magnetic heading (x, y, z) in teslas.

Do you have an idea of how to convert those values to roll, pitch and yaw?

Thanks in advance,

Alexander

咋地 2024-10-27 04:10:15

iOS 5 provides the designated method. Look for CMAttitudeReferenceFrameXTrueNorthZVertical in the developer documentation.
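
For reference, a minimal sketch of what using that reference frame could look like (assuming a class that imports CoreMotion and declares a motionManager property; the update interval and the main queue are arbitrary choices):

#import <CoreMotion/CoreMotion.h>

- (void) startTrueNorthUpdates
{
    // motionManager is an assumed CMMotionManager property on this class.
    self.motionManager = [CMMotionManager new];
    self.motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

    // The true-north frame needs the magnetometer (plus location services for
    // the declination), so check availability first.
    if ([CMMotionManager availableAttitudeReferenceFrames] & CMAttitudeReferenceFrameXTrueNorthZVertical)
    {
        [self.motionManager startDeviceMotionUpdatesUsingReferenceFrame:CMAttitudeReferenceFrameXTrueNorthZVertical
                                                                 toQueue:[NSOperationQueue mainQueue]
                                                             withHandler:^(CMDeviceMotion *motion, NSError *error)
        {
            if (motion == nil) return;
            // roll, pitch and yaw are now reported relative to true north / gravity.
            NSLog(@"roll=%f pitch=%f yaw=%f",
                  motion.attitude.roll, motion.attitude.pitch, motion.attitude.yaw);
        }];
    }
}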

指尖上得阳光 2024-10-27 04:10:15

Pseudocode:

  1. Start device motion updates.
  2. Start a camera preview in the background ;)
  3. Capture the current gravity reading from the device as a CMAcceleration... once you have gravity, store it in an instance variable.
  4. Then take the two vectors and get the angle between them, in this case the device reference vector (0,0,-1) and the real gravity vector...
  5. We then turn theta into thetaPrime... a transform that matches the CoreMotion reference orientation.
  6. Set up a timer to animate....
  7. During the animation, get the inverse of the rotationMatrix of the motionManager's deviceMotion property.
  8. Apply the transformations in the correct order to reflect the device's current attitude (yaw, pitch and roll as Euler angles, or the device's quaternion rotation... basically three different ways to say the same thing).

Here is the code:

// Note: motionManager, gravityTimer, animationTimer, referenceAttitude,
// firstGravityReading, currentGravity, thetaOffset, deviceOffsetRotation and
// perspectiveTransformedLayer are assumed to be properties/ivars of this class.
- (void) initMotionCapture
{
    firstGravityReading = NO;
    referenceAttitude = nil;

    if (motionManager == nil)
    {
        self.motionManager = [CMMotionManager new];
    }
    motionManager.deviceMotionUpdateInterval = 0.01;

    // Start the updates before polling them, otherwise deviceMotion stays nil.
    [motionManager startDeviceMotionUpdates];

    self.gravityTimer = [NSTimer scheduledTimerWithTimeInterval:1/60.0 target:self selector:@selector(getFirstGravityReading) userInfo:nil repeats:YES];
}


- (void) getFirstGravityReading
{
    CMDeviceMotion *dm = motionManager.deviceMotion;
    if (dm == nil)
    {
        return; // no sample delivered yet
    }

    referenceAttitude = dm.attitude;
    currentGravity = dm.gravity; // stored in an ivar so setupCompass can read it

    // A real sample has arrived once the gravity vector is no longer all zeros.
    if (currentGravity.x != 0 || currentGravity.y != 0 || currentGravity.z != 0)
    {
        NSLog(@"Gravity = (%f,%f,%f)", currentGravity.x, currentGravity.y, currentGravity.z);

        firstGravityReading = YES;
        [gravityTimer invalidate];
        self.gravityTimer = nil;
        [self setupCompass];
    }
}

- (void) setupCompass
{
    //Draw your cube... I am using a quartz 3D perspective hack!
    CATransform3D initialTransform = perspectiveTransformedLayer.sublayerTransform;
    initialTransform.m34 = 1.0/-10000;


    //HERE IS WHAT YOU GUYS NEED... the vector equations!
    NSLog(@"Gravity = (%f,%f,%f)", currentGravity.x, currentGravity.y, currentGravity.z);

    //we have current gravity vector and our device gravity vector of (0, 0, -1)
    // get the dot product
    float dotProduct = currentGravity.x*0 + currentGravity.y*0 + currentGravity.z*-1;
    float innerMagnitudeProduct = currentGravity.x*currentGravity.x + currentGravity.y*currentGravity.y + currentGravity.z*currentGravity.z;
    float magnitudeCurrentGravity = sqrt(innerMagnitudeProduct);
    float magnitudeDeviceVector = 1; //since (0,0,-1) computes to: 0*0 + 0*0 + -1*-1 = 1

    thetaOffset = acos(dotProduct/(magnitudeCurrentGravity*magnitudeDeviceVector));
    NSLog(@"theta(degrees) = %f", thetaOffset*180.0/M_PI);

    //Now we have the device angle to the gravity vector (0,0,-1)
    //We must transform these coordinates to match our 
    //device's attitude by transforming to theta prime
    float theta_deg = thetaOffset*180.0/M_PI;
    float thetaPrime_deg = -theta_deg + 90; // ThetaPrime = -Theta + 90 <==> y=mx+b

    NSLog(@"thetaPrime(degrees) = %f", thetaOffset*180.0/M_PI);

    deviceOffsetRotation = CATransform3DMakeRotation((thetaPrime_deg) * M_PI / 180.0, 1, 0, 0);
    initialTransform = CATransform3DConcat(deviceOffsetRotation, initialTransform);

    perspectiveTransformedLayer.sublayerTransform = initialTransform;

    self.animationTimer = [NSTimer scheduledTimerWithTimeInterval:1/60.0 target:self selector:@selector(tick) userInfo:nil repeats:YES];

}

- (void) tick
{
    CMRotationMatrix rotation;

    CMDeviceMotion *deviceMotion = motionManager.deviceMotion;
    CMAttitude *attitude = deviceMotion.attitude;

    if (referenceAttitude != nil)
    {
        [attitude multiplyByInverseOfAttitude:referenceAttitude];
    }
    rotation = attitude.rotationMatrix;

    CATransform3D rotationalTransform = perspectiveTransformedLayer.sublayerTransform;

    //inverse (i.e. the transpose) of attitude.rotationMatrix
    rotationalTransform.m11 = rotation.m11;
    rotationalTransform.m12 = rotation.m21;
    rotationalTransform.m13 = rotation.m31;

    rotationalTransform.m21 = rotation.m12;
    rotationalTransform.m22 = rotation.m22;
    rotationalTransform.m23 = rotation.m32;

    rotationalTransform.m31 = rotation.m13;
    rotationalTransform.m32 = rotation.m23;
    rotationalTransform.m33 = rotation.m33;

    rotationalTransform = CATransform3DConcat(deviceOffsetRotation, rotationalTransform);
    rotationalTransform = CATransform3DConcat(rotationalTransform, CATransform3DMakeScale(1.0, -1.0, 1.0));


    perspectiveTransformedLayer.sublayerTransform = rotationalTransform;
}
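
Presumably you would call initMotionCapture once (for example from viewDidLoad) after creating perspectiveTransformedLayer; the gravity timer then hands off to setupCompass, which in turn starts the tick animation timer.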

如何视而不见 2024-10-27 04:10:15

You would need to calibrate your yaw value to the magnetic heading now and then to make sure you are on the right track. Check out this explanation of how to compensate for the shaky compass: Compensating compass lag with the gyroscope on iPhone 4
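
One common way to do that kind of fusion is a complementary filter. Below is a minimal sketch of the idea only (not the linked answer's code); the sign convention, the fusedYawDeg ivar, the method name and the kFilterFactor constant are all assumptions:

// Blend the fast-but-drifting gyro yaw with the slow-but-absolute compass
// heading. fusedYawDeg is an assumed double ivar; dt is the time since the
// last sample.
static const double kFilterFactor = 0.98; // assumed tuning value

- (void) updateFusedYawWithMotion:(CMDeviceMotion *)motion
                          heading:(CLHeading *)heading
                               dt:(NSTimeInterval)dt
{
    // Gyro contribution: rotation rate around z integrated over the timestep,
    // converted from radians to degrees.
    double gyroDeltaDeg = motion.rotationRate.z * dt * 180.0 / M_PI;

    // Compass contribution: absolute heading, clockwise from north in degrees;
    // negated here so it matches the gyro's counter-clockwise-positive sign.
    double compassYawDeg = -heading.trueHeading;

    // A real implementation also has to handle the 0/360 degree wrap-around
    // before blending the two angles.
    fusedYawDeg = kFilterFactor * (fusedYawDeg + gyroDeltaDeg)
                + (1.0 - kFilterFactor) * compassYawDeg;
}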

淡看悲欢离合 2024-10-27 04:10:15

The tesla values are magnetic field strengths, a measure of how much magnetic "pull" is felt along each of the three axes. Only by combining this information with accelerometer data, and doing a bunch of fancy math, can you get an actual heading (which way the device is "pointing" relative to magnetic north). Then add in information from the GPS and do more math to get the true heading (relative to the geographic north pole).

Long story short, you probably don't want to do the math yourself. Luckily, iOS provides both magneticHeading and trueHeading in its CLHeading object, available from the CLLocationManager heading property.

Getting pitch and roll, which describe how the device is tilted, also involves doing math on the same raw data from the magnetometer and accelerometer. Sorry, I don't know of any iOS API for pitch and roll.
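
A minimal sketch of reading those heading values (assuming this class is the location manager's delegate and declares a locationManager property):

#import <CoreLocation/CoreLocation.h>

- (void) startHeadingUpdates
{
    self.locationManager = [CLLocationManager new];
    self.locationManager.delegate = self;
    self.locationManager.headingFilter = 1; // degrees of change before a new event

    if ([CLLocationManager headingAvailable])
    {
        [self.locationManager startUpdatingHeading];
        // trueHeading is only valid while location updates are running, since
        // true north needs the current location to compute the declination.
        [self.locationManager startUpdatingLocation];
    }
}

- (void)locationManager:(CLLocationManager *)manager didUpdateHeading:(CLHeading *)newHeading
{
    if (newHeading.headingAccuracy < 0) return; // invalid reading
    NSLog(@"magnetic=%f true=%f", newHeading.magneticHeading, newHeading.trueHeading);
}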
