Android phone orientation overview (including compass)
I've been trying to get my head around the Android orientation sensors for a while.
I thought I understood it. Then I realised I didn't. Now I think (hope) I have a better feeling for it again but I am still not 100%. I will try and explain my patchy understanding of it and hopefully people will be able to correct me if I am wrong in parts or fill in any blanks.
I imagine I am standing at 0 degrees longitude (prime meridian) and 0 degrees latitude (equator). This location is actually in the sea off the coast of Africa but bear with me. I hold my phone in front of my face so that the bottom of the phone points to my feet; I am facing North (looking toward Greenwich) so therefore the right hand side of the phone points East towards Africa. In this orientation (with reference to the diagram below) I have the X-axis pointing East, the Z-axis pointing South and the Y-axis pointing to the sky.
Now the sensors on the phone allow you to work out the orientation (not location) of the device in this situation. This part has always confused me, probably because I wanted to understand how something worked before I accepted that it did just work. It seems that the phone works out its orientation using a combination of two different techniques.
Before I get to that, imagine being back standing on that imaginary piece of land at 0 degrees latitude and longitude, facing in the direction mentioned above. Imagine also that you are blindfolded and your shoes are fixed to a playground roundabout. If someone shoves you in the back you will fall forward (toward North) and put both hands out to break your fall. Similarly, if someone shoves your left shoulder you will fall over onto your right hand. Your inner ear has "gravitational sensors" (youtube clip) which allow you to detect if you are falling forward/back, falling left/right, or falling down (or up!!). Therefore humans can detect alignment and rotation around the same X and Z axes as the phone.
Now imagine someone now rotates you 90 degrees on the roundabout so that you are now facing East. You are being rotated around the Y axis. This axis is different because we can't detect it biologically. We know we are angled by a certain amount but we don't know the direction in relation to the planet's magnetic North pole.
Instead we need to use an external tool... a magnetic compass. This allows us to ascertain which direction we are facing. The same is true of our phone.
Now the phone also has a 3-axis accelerometer. I have NO idea how they actually work, but the way I visualise it is to imagine gravity as constant and uniform 'rain' falling from the sky, and to imagine the axes in the figure above as tubes which can detect the amount of rain flowing through them. When the phone is held upright, all the rain will flow through the Y 'tube'. If the phone is gradually rotated so its screen faces the sky, the amount of rain flowing through Y will decrease to zero while the volume through Z steadily increases until the maximum amount of rain is flowing through it. Similarly, if we now tip the phone onto its side, the X tube will eventually collect the maximum amount of rain. Therefore, by measuring the amount of rain flowing through the 3 tubes, you can calculate the orientation of the phone.
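The rain analogy above can be sketched in a few lines of plain Java. This is only an illustration, not real sensor code: it models a single rotation about the X axis (upright to flat-on-back), assumes 9.81 m/s² for gravity, and the class and method names are my own invention.

```java
public class RainAnalogy {
    static final double G = 9.81; // assumed gravitational acceleration, m/s^2

    // Gravity components as the phone tips from upright (0 degrees) to
    // flat on its back (90 degrees), rotating about its X axis only.
    static double[] gravityForTilt(double degrees) {
        double r = Math.toRadians(degrees);
        return new double[] {
            0.0,               // the X tube stays dry in this particular rotation
            G * Math.cos(r),   // "rain" caught by the Y tube
            G * Math.sin(r)    // "rain" caught by the Z tube
        };
    }

    public static void main(String[] args) {
        // Upright: the rain flows through Y. Flat on its back: through Z.
        System.out.println(java.util.Arrays.toString(gravityForTilt(0)));
        System.out.println(java.util.Arrays.toString(gravityForTilt(90)));
    }
}
```

Reading the three components back is, in reverse, how an orientation can be recovered from the accelerometer.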
The phone also has an electronic compass which behaves like a normal compass - its "virtual needle" points to magnetic north. Android merges the information from these two sensors so that whenever a SensorEvent of TYPE_ORIENTATION is generated, its three-element values[] array has:
values[0]: Azimuth - (the compass bearing east of magnetic north)
values[1]: Pitch, rotation around x-axis (is the phone leaning forward or back)
values[2]: Roll, rotation around y-axis (is the phone leaning over on its left or right side)
So I think (ie I don't know) the reason Android gives the azimuth (compass bearing) rather than the reading of the third accelerometer is that the compass bearing is just more useful. I'm not sure why they deprecated this type of sensor, as now it seems you need to register a listener with the system for SensorEvents of type TYPE_MAGNETIC_FIELD. The event's values[] array (together with the accelerometer's values[]) needs to be passed into the SensorManager.getRotationMatrix(..) method to get a rotation matrix (see below), which is then passed into the SensorManager.getOrientation(..) method.
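To make the getRotationMatrix/getOrientation pipeline concrete, here is a simplified pure-Java re-derivation of what, as far as I can tell from the documentation and source, those two calls compute. The real SensorManager versions also handle degenerate cases (free fall, the magnetic vector parallel to gravity) which are omitted here, and the vectors in main are made-up illustrative numbers for a phone lying flat on its back with its top edge pointing magnetic north.

```java
public class OrientationSketch {

    // Roughly what SensorManager.getRotationMatrix(..) builds: an orthonormal
    // basis from the gravity vector A and the geomagnetic vector E.
    // Row 1 (H = E x A) points East, row 2 (M = A x H) points North,
    // row 3 (A, normalized) points up, away from the Earth's centre.
    static double[] rotationMatrix(double[] A, double[] E) {
        double[] h = normalize(cross(E, A));
        double[] up = normalize(A);
        double[] m = cross(up, h);
        return new double[] { h[0], h[1], h[2], m[0], m[1], m[2], up[0], up[1], up[2] };
    }

    // Roughly what SensorManager.getOrientation(..) does: read azimuth,
    // pitch and roll (in radians) straight out of the matrix entries.
    static double[] orientation(double[] R) {
        double azimuth = Math.atan2(R[1], R[4]);
        double pitch   = Math.asin(-R[7]);
        double roll    = Math.atan2(-R[6], R[8]);
        return new double[] { azimuth, pitch, roll };
    }

    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]
        };
    }

    static double[] normalize(double[] v) {
        double n = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new double[] { v[0] / n, v[1] / n, v[2] / n };
    }

    public static void main(String[] args) {
        double[] gravity  = { 0, 0, 9.81 };  // flat on back: all "rain" through Z
        double[] magnetic = { 0, 30, -40 };  // made-up field: north plus downward dip
        double[] o = orientation(rotationMatrix(gravity, magnetic));
        // Device frame aligned with world frame, so all three angles come out ~0
        System.out.printf("azimuth=%.1f pitch=%.1f roll=%.1f%n", o[0], o[1], o[2]);
    }
}
```

In this aligned pose the matrix comes out as the identity, which matches the intuition that no mapping is needed between the device and world frames.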
Does anyone know why the Android team deprecated Sensor.TYPE_ORIENTATION? Is it an efficiency thing? That is what is implied in one of the comments to a similar question, but you still need to register a different type of listener in the development/samples/Compass/src/com/example/android/compass/CompassActivity.java example.
I'd now like to talk about the rotation matrix. (This is where I am most unsure)
So above we have the three figures from the Android documentation; we'll call them A, B and C.
A = SensorManager.getRotationMatrix(..) method figure, representing the world's coordinate system
B = Coordinate system used by the SensorEvent API.
C = SensorManager.getOrientation(..) method figure
So my understanding is that A represents the "world's coordinate system", which I presume refers to the way locations on the planet are given as a (latitude, longitude) pair with an optional altitude. X is the "easting" coordinate, Y is the "northing" coordinate. Z points to the sky and represents altitude.
The phone's coordinate system, shown in figure B, is fixed: its Y axis always points out of the top. The rotation matrix is constantly calculated by the phone and allows mapping between the two. So am I right in thinking that the rotation matrix transforms the coordinate system of B to that of C? So when you call the SensorManager.getOrientation(..) method, you use a values[] array whose values correspond to figure C.
When the phone's screen is pointed to the sky the rotation matrix is the identity matrix (the matrix equivalent of the number 1), which means no mapping is necessary, as the device is aligned with the world's coordinate system.
Ok, I think I'd better stop now. Like I said before, I hope people will tell me where I've messed up, and that this helps people (or confuses them even further!)
You might want to check out the One Screen Turn Deserves Another article. It explains why you need the rotation matrix.
In a nutshell, the phone's sensors always use the same coordinate system, even when the device is rotated.
In applications that are not locked to a single orientation, the screen coordinate system changes when you rotate the device. Thus, when the device is rotated from its default view mode, the sensor coordinate system is no longer the same as the screen coordinate system. The rotation matrix in this case is used to transform A to C (B always remains fixed).
The article includes a code snippet that shows how it can be used.
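Since that snippet didn't survive in this copy, here is a rough, off-device sketch of the usual remapping recipe from the article: pick which device axes should play the role of the screen's X and Y axes based on the current display rotation, then feed them to SensorManager.remapCoordinateSystem(..). The constant values are copied from android.view.Surface and android.hardware.SensorManager so this compiles without the SDK; on a real device you would use the SDK constants directly, and treat the exact axis pairs as my recollection of the article rather than gospel.

```java
public class ScreenRemap {
    // Constant values copied from android.view.Surface and
    // android.hardware.SensorManager so this sketch runs off-device.
    static final int ROTATION_0 = 0, ROTATION_90 = 1, ROTATION_180 = 2, ROTATION_270 = 3;
    static final int AXIS_X = 1, AXIS_Y = 2, AXIS_MINUS_X = 129, AXIS_MINUS_Y = 130;

    // Which device axes should play the role of the screen's X and Y axes
    // for a given display rotation. On a real device you would pass the pair
    // to SensorManager.remapCoordinateSystem(inR, pair[0], pair[1], outR)
    // before calling SensorManager.getOrientation(outR, values).
    static int[] axesForRotation(int displayRotation) {
        switch (displayRotation) {
            case ROTATION_90:  return new int[] { AXIS_Y, AXIS_MINUS_X };
            case ROTATION_180: return new int[] { AXIS_MINUS_X, AXIS_MINUS_Y };
            case ROTATION_270: return new int[] { AXIS_MINUS_Y, AXIS_X };
            default:           return new int[] { AXIS_X, AXIS_Y }; // ROTATION_0
        }
    }

    public static void main(String[] args) {
        int[] pair = axesForRotation(ROTATION_90);
        System.out.println("landscape remap: screenX<-" + pair[0] + " screenY<-" + pair[1]);
    }
}
```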
Roll is a function of gravity: a 90-degree roll puts all of gravity into the x register.
Pitch is the same: a 90-degree pitch up puts all of the gravity component into the y register.
Yaw/heading/azimuth has no effect on gravity; it is ALWAYS at right angles to gravity, hence no matter which way you are facing, your heading is immeasurable from gravity alone.
This is why you need a compass to assess it. Maybe that makes sense?
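The point that yaw is invisible to the accelerometer can be checked numerically: rotating the gravity vector about the vertical axis leaves it completely unchanged. A tiny plain-Java illustration (the class and method names are mine, and 9.81 m/s² is an assumption):

```java
public class YawGravity {
    // Rotate a vector by 'degrees' about the world's vertical axis - the
    // axis gravity lies along - i.e. change only your compass heading.
    static double[] yaw(double[] v, double degrees) {
        double r = Math.toRadians(degrees);
        double c = Math.cos(r), s = Math.sin(r);
        return new double[] { c * v[0] - s * v[1], s * v[0] + c * v[1], v[2] };
    }

    public static void main(String[] args) {
        double[] g = { 0, 0, -9.81 };   // gravity pointing straight down
        double[] turned = yaw(g, 123);  // face some other compass direction
        // The reading is identical, so yaw can't be recovered from gravity.
        System.out.println(java.util.Arrays.toString(turned));
    }
}
```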
Have a look at this: Stackoverflow.com: Q.5202147
You seem to be mostly right until the 3 diagrams A,B,C.
After that you have got yourself confused.
I was having this issue so I mapped out what happens in different directions.
If the device is mounted in landscape fashion, e.g. in a car mount, the 'degrees' from the compass seem to run from 0-275 (going clockwise); above 269 (between west and north) it counts backwards from -90 to 0, then forwards from 0 to 269, so 270 becomes -90.
Still in landscape, but with the device lying on its back, my sensor gives 0-360.
In portrait mode it runs 0-360, both lying on its back and standing up.
Hope that helps someone
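If you hit the negative readings described above and just want a consistent 0-359 heading, one common fix (my own helper, not part of the Android API) is to normalize the angle into a single range:

```java
public class AzimuthNormalize {
    // Map any reading (including negatives like -90 for due west in some
    // mounts) onto the conventional 0 <= degrees < 360 compass range.
    static double normalize(double degrees) {
        return ((degrees % 360) + 360) % 360;
    }

    public static void main(String[] args) {
        System.out.println(normalize(-90));  // prints 270.0
        System.out.println(normalize(450));  // prints 90.0
    }
}
```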