Different values between sensors TYPE_ACCELEROMETER/TYPE_MAGNETIC_FIELD and TYPE_ORIENTATION
There are 2 ways to get the 3 rotation values (azimuth, pitch, roll).
One is registering a listener of a type TYPE_ORIENTATION. It's the easiest way and I get a correct range of values from every rotation as the documentation says:
azimuth: [0, 359]
pitch: [-180, 180]
roll: [-90, 90]
The other is more precise, but complex to understand the first time you see it. Android recommends it, so I want to use it, but I get different values:
azimuth: [-180, 180]. -180/180 is S, 0 is N, 90 is E and -90 is W.
pitch: [-90, 90]. 90 is 90, -90 is -90, 0 is 0, but -180/180 (lying with the screen downwards) is 0.
roll: [-180, 180].
I should get the same values but with decimals, right?
I have the following code:
aValues = new float[3];
mValues = new float[3];

sensorListener = new SensorEventListener() {
    public void onSensorChanged(SensorEvent event) {
        switch (event.sensor.getType()) {
            case Sensor.TYPE_ACCELEROMETER:
                aValues = event.values.clone();
                break;
            case Sensor.TYPE_MAGNETIC_FIELD:
                mValues = event.values.clone();
                break;
        }

        float[] R = new float[16];
        float[] orientationValues = new float[3];

        SensorManager.getRotationMatrix(R, null, aValues, mValues);
        SensorManager.getOrientation(R, orientationValues);

        orientationValues[0] = (float) Math.toDegrees(orientationValues[0]);
        orientationValues[1] = (float) Math.toDegrees(orientationValues[1]);
        orientationValues[2] = (float) Math.toDegrees(orientationValues[2]);

        azimuthText.setText("azimuth: " + orientationValues[0]);
        pitchText.setText("pitch: " + orientationValues[1]);
        rollText.setText("roll: " + orientationValues[2]);
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {}
};
Please help; it's very frustrating. Do I have to work with those values as they are, or am I doing something wrong?
Thanks.
Answers (3)
I know I'm playing thread necromancer here, but I've been working on this stuff a lot lately, so I thought I'd throw in my 2¢.
The device doesn't contain a compass or an inclinometer, so it doesn't measure azimuth, pitch, or roll directly. (We call those Euler angles, BTW.) Instead, it uses accelerometers and magnetometers, both of which produce 3-space XYZ vectors. These are used to compute the azimuth, etc. values.
Vectors are in device coordinate space.
World coordinates have Y facing north, X facing east, and Z facing up (see https://developer.android.com/images/axis_globe.png).
Thus, a device's "neutral" orientation is lying flat on its back on a table, with the top of the device facing north.
The accelerometer produces a vector in the "UP" direction. The magnetometer produces a vector in the "north" direction. (Note that in the northern hemisphere, this tends to point downward due to magnetic dip.)
The accelerometer vector and magnetometer vector can be combined mathematically through SensorManager.getRotationMatrix() which returns a 3x3 matrix which will map vectors in device coordinates to world coordinates or vice-versa. For a device in the neutral position, this function would return the identity matrix.
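That combination can be sketched in plain Java. The following is an illustrative re-derivation of the idea, not the actual Android implementation: east = geomagnetic × gravity, north = up × east, and the three normalized vectors form the rows of the matrix, so a device in the neutral position yields the identity.

```java
// Illustrative sketch of the math behind getRotationMatrix() (not Android source):
// the rows of R are the device-frame directions of world East, North, and Up.
class RotationSketch {
    static float[] cross(float[] a, float[] b) {
        return new float[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static float[] normalize(float[] v) {
        float n = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / n, v[1] / n, v[2] / n };
    }

    /** gravity and geomagnetic are 3-vectors in device coordinates. */
    static float[] rotationMatrix(float[] gravity, float[] geomagnetic) {
        float[] east  = normalize(cross(geomagnetic, gravity)); // M x G points east
        float[] up    = normalize(gravity);
        float[] north = normalize(cross(up, east));
        return new float[] {           // row-major 3x3: rows = East, North, Up
            east[0],  east[1],  east[2],
            north[0], north[1], north[2],
            up[0],    up[1],    up[2]
        };
    }
}
```

Feeding it a neutral-position pair (gravity straight up along device Z, magnetic field north and dipping down) returns the identity matrix, matching the description above.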
This matrix does not vary with the screen orientation. This means your application needs to be aware of orientation and compensate accordingly.
SensorManager.getOrientation() takes the transformation matrix and computes azimuth, pitch, and roll values. These are taken relative to a device in the neutral position.
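In code, that extraction amounts to reading a few entries of the row-major 3x3 matrix. This is a sketch of the standard formulas (consistent with getOrientation()'s documented behavior, but not a copy of the Android source); for the identity matrix all three angles come out 0:

```java
class OrientationSketch {
    /** Azimuth, pitch, roll in radians from a row-major 3x3 rotation matrix. */
    static float[] orientationFrom(float[] r) {
        float azimuth = (float) Math.atan2(r[1], r[4]);  // rotation about Z
        float pitch   = (float) Math.asin(-r[7]);        // rotation about X
        float roll    = (float) Math.atan2(-r[6], r[8]); // rotation about Y
        return new float[] { azimuth, pitch, roll };
    }
}
```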
I have no idea what the difference is between calling this function and just using TYPE_ORIENTATION sensor, except that the function lets you manipulate the matrix first.
If the device is tilted up at 90° or near it, then the use of Euler angles falls apart. This is a degenerate case mathematically. In this realm, how is the device supposed to know if you're changing azimuth or roll?
The function SensorManager.remapCoordinateSystem() can be used to manipulate the transformation matrix to compensate for what you may know about the orientation of the device. However, my experiments have shown that this doesn't cover all cases, not even some of the common ones. For example, if you want to remap for a device held upright (e.g. to take a photo), you would want to multiply the transformation matrix by this matrix:
before calling getOrientation(), and this is not one of the orientation remappings that remapCoordinateSystem() supports [someone please correct me if I've missed something here].
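The matrix referred to above is not reproduced here, but the multiply step itself is an ordinary row-major 3x3 product. A sketch, with a 90°-rotation-about-X matrix used purely as a hypothetical placeholder (it is not the matrix from the answer):

```java
class MatrixSketch {
    /** Row-major 3x3 matrix product: out = a * b. */
    static float[] multiply3x3(float[] a, float[] b) {
        float[] out = new float[9];
        for (int row = 0; row < 3; row++)
            for (int col = 0; col < 3; col++)
                for (int k = 0; k < 3; k++)
                    out[3 * row + col] += a[3 * row + k] * b[3 * k + col];
        return out;
    }
}

// Hypothetical usage (remap is a placeholder 90° rotation about X, R is the
// matrix from getRotationMatrix()):
//   float[] remap = { 1, 0, 0,   0, 0, -1,   0, 1, 0 };
//   float[] adjusted = MatrixSketch.multiply3x3(remap, R);
//   SensorManager.getOrientation(adjusted, orientationValues);
```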
OK, so this has all been a long-winded way of saying that if you're using orientation, either from the TYPE_ORIENTATION sensor or from getOrientation(), you're probably doing it wrong. The only time you actually want the Euler angles is to display orientation information in a user-friendly form, to annotate a photograph, to drive flight instrument display, or something similar.
If you want to do computations related to device orientation, you're almost certainly better off using the transformation matrix and working with XYZ vectors.
Working as a consultant, whenever someone comes to me with a problem involving Euler angles, I back up and ask them what they're really trying to do, and then find a way to do it with vectors instead.
Looking back at your original question, getOrientation() should return three values in [-180 180] [-90 90] and [-180 180] (after converting from radians). In practice, we think of azimuth as numbers in [0 360), so you should simply add 360 to any negative numbers you receive. Your code looks correct as written. It would help if I knew exactly what results you were expecting and what you were getting instead.
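That adjustment is a one-liner; a modulo form (a minimal sketch) handles any input angle, not just values in [-180, 180]:

```java
class Azimuth {
    /** Map degrees from getOrientation()'s [-180, 180] onto the compass range [0, 360). */
    static float normalize(float degrees) {
        return (degrees % 360f + 360f) % 360f;
    }
}
```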
Edited to add: A couple more thoughts. Modern versions of Android use something called "sensor fusion", which basically means that all available inputs -- accelerometer, magnetometer, gyro -- are combined together in a mathematical black box (typically a Kalman filter, but it depends on the vendor). All of the different sensors -- acceleration, magnetic field, gyros, gravity, linear acceleration, and orientation -- are taken as outputs from this black box.
Whenever possible, you should use TYPE_GRAVITY rather than TYPE_ACCELEROMETER as the input to getRotationMatrix().
I might be shooting in the dark here, but if I understand your question correctly, you are wondering why you get [-179..179] instead of [0..360]?

Note that -180 is the same as +180, and the same as 180 + N*360, where N is a whole number (integer). In other words, if you want to get the same numbers as with the orientation sensor, you can do this:
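The snippet this refers to boils down to shifting negative angles up by one full turn; a minimal reconstruction consistent with the 180 + N*360 observation above:

```java
class CompassRange {
    /** Shift an angle in [-180, 180] into [0, 360) by adding one full turn when negative. */
    static float toCompassRange(float degrees) {
        return degrees < 0 ? degrees + 360f : degrees;
    }
}
```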
This will give you the values in the [0..360] range as you wanted.
You are missing one critical computation in your calculations: a remapCoordinateSystem() call after you do getRotationMatrix().
Add that to your code and all will be fine.
You can read more about it here.
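As a sketch of what such a remapping does (an illustration of the idea only, not Android's actual remapCoordinateSystem() implementation): re-labelling the device axes permutes the columns of the rotation matrix. For example, taking device (X, Z, -Y) as the new (X, Y, Z) axes, roughly the upright-device case:

```java
class RemapSketch {
    /**
     * Re-express a row-major 3x3 rotation matrix with device axes (X, Z, -Y)
     * taken as the new (X, Y, Z). Illustrative only -- not Android's code.
     */
    static float[] remapUpright(float[] r) {
        return new float[] {
            r[0], r[2], -r[1],
            r[3], r[5], -r[4],
            r[6], r[8], -r[7]
        };
    }
}
```

Applied to the identity matrix (a device in the neutral position), this yields a 90° rotation about the X axis, which is what you would expect for the same physical pose described in a tilted device frame.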