How to get the direction (e.g. north, west) in Android

Posted 2024-12-18 18:16:00 · 0 views · 0 comments


I am new to Android and I want to get the direction my camera is pointing. How can I get this direction information? Could you give me an idea for this?



Comments (4)

乖乖公主 2024-12-25 18:16:01


TYPE_ORIENTATION is deprecated

We cannot use the orientation sensor anymore, but we can use the magnetic field sensor and the accelerometer in tandem to get equivalent functionality. It is more work, but it lets us keep using a callback to handle orientation changes.

Conversion from accelerometer and magnetic field readings to an azimuth:

float azimuth; // in radians; declared as a field so it survives between events
float[] mGravity;
float[] mGeomagnetic;

public void onSensorChanged(SensorEvent event) {

    // Clone the values: the system reuses the event's array between callbacks
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values.clone();

    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = event.values.clone();

    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] I = new float[9];

        if (SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic)) {

            // orientation contains azimuth, pitch and roll, in radians
            float[] orientation = new float[3];
            SensorManager.getOrientation(R, orientation);

            azimuth = orientation[0];
        }
    }
}

To point north you can calculate a rotation in degrees:

float rotation = (float) -Math.toDegrees(azimuth);
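The `getRotationMatrix`/`getOrientation` pair can be desk-checked off-device, since the underlying math is just two cross products and an `atan2`. The sketch below is a plain-Java mirror of that math (not the Android API itself), with hypothetical sensor readings for a device lying flat:

```java
// Plain-Java desk check of the math behind SensorManager.getRotationMatrix
// and SensorManager.getOrientation. The class and method names are ours,
// and the sensor readings are hypothetical.
public class AzimuthCheck {

    /** Mirrors the core of getRotationMatrix for a 3x3 row-major matrix. */
    public static float[] rotationMatrix(float[] gravity, float[] geomagnetic) {
        float[] a = normalize(gravity);
        // H = E x A: points roughly east in device coordinates
        float[] h = normalize(cross(geomagnetic, a));
        // M = A x H: points roughly north in device coordinates
        float[] m = cross(a, h);
        return new float[] { h[0], h[1], h[2], m[0], m[1], m[2], a[0], a[1], a[2] };
    }

    /** Mirrors the azimuth part of getOrientation. */
    public static float azimuth(float[] r) {
        return (float) Math.atan2(r[1], r[4]);
    }

    private static float[] cross(float[] u, float[] v) {
        return new float[] {
            u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]
        };
    }

    private static float[] normalize(float[] v) {
        float n = (float) Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
        return new float[] { v[0] / n, v[1] / n, v[2] / n };
    }

    public static void main(String[] args) {
        // Device flat, top edge pointing north: the field has a northward (+y)
        // and downward (-z) component, gravity reads straight down the z axis.
        float[] gravity = { 0f, 0f, 9.81f };
        float[] northField = { 0f, 22f, -44f };
        System.out.println(azimuth(rotationMatrix(gravity, northField))); // 0.0

        // Same device rotated so the top edge points east: north is now -x.
        float[] eastField = { -22f, 0f, -44f };
        System.out.println(azimuth(rotationMatrix(gravity, eastField))); // ~pi/2
    }
}
```

Feeding the two synthetic readings through the same pipeline yields azimuth 0 for north and +π/2 for east, which is the sign convention the rotation formula above relies on.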
自演自醉 2024-12-25 18:16:01


Here's what I have so far, and it's somewhat working for me: the values returned are between 0 and 360, but I don't think north is properly calibrated. I'm using this on an LG G Pad running Android 5.0.1.

float azimut; // in radians
float[] mGravity;
float[] mGeomagnetic;

public void onSensorChanged(SensorEvent event) {
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
        mGravity = event.values;
    if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
        mGeomagnetic = event.values;
    if (mGravity != null && mGeomagnetic != null) {
        float[] R = new float[9];
        float[] outR = new float[9];
        float[] I = new float[9];

        boolean success = SensorManager.getRotationMatrix(R, I, mGravity, mGeomagnetic);
        if (success) {
            float[] orientation = new float[3];

            SensorManager.remapCoordinateSystem(R, SensorManager.AXIS_X, SensorManager.AXIS_Y, outR);

            SensorManager.getOrientation(outR, orientation);
            azimut = orientation[0];

            // Normalize from [-180, 180) to [0, 360)
            float degree = (float) (Math.toDegrees(azimut) + 360) % 360;

            System.out.println("degree " + degree);
        }
    }
}

I'm sure there are things I've missed but hopefully this is a decent starting point for others. I reviewed a good number of other questions, comments, etc. to get to this point.
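One likely cause of a jumpy or seemingly miscalibrated reading is raw sensor noise. A common mitigation (not part of the answer above, and not an Android API) is a simple exponential low-pass filter applied to each sensor array before it is handed to `getRotationMatrix()`:

```java
// A simple exponential low-pass filter, commonly applied to the raw
// accelerometer and magnetometer arrays to steady the computed heading.
// The class name and ALPHA value are ours, not an Android API.
public class LowPassFilter {
    static final float ALPHA = 0.15f; // smaller = smoother but laggier

    /** Blends the new reading into the running estimate, in place. */
    public static float[] filter(float[] input, float[] output) {
        if (output == null) return input.clone(); // first reading: take as-is
        for (int i = 0; i < input.length; i++) {
            output[i] = output[i] + ALPHA * (input[i] - output[i]);
        }
        return output;
    }

    public static void main(String[] args) {
        // Seed with one reading, then feed a steady new reading: the
        // estimate converges toward the new reading instead of jumping.
        float[] smoothed = filter(new float[] { 0f, 8f, 4f }, null);
        for (int i = 0; i < 50; i++) {
            smoothed = filter(new float[] { 0f, 10f, 4f }, smoothed);
        }
        System.out.println(smoothed[1]); // converges toward 10.0
    }
}
```

In `onSensorChanged()` this would replace the bare assignments, e.g. `mGravity = filter(event.values.clone(), mGravity);`, at the cost of a little extra lag in the displayed heading.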

半步萧音过轻尘 2024-12-25 18:16:01


Not sure what you mean by according to your camera, but you can get it based on the hardware sensors. See the following links:

Compass Example

Also check out the Sensor TYPE_ORIENTATION on this page.

骑趴 2024-12-25 18:16:01


Use the Rotation Vector sensor, which fuses the magnetometer and accelerometer (and the gyroscope, where available). Use SensorManager.getRotationMatrixFromVector to convert the event values into an OpenGL-style rotation matrix, then SensorManager.getOrientation to extract azimuth, pitch, and roll from that matrix. As the documentation says, the first of those values is the compass heading in radians.

By itself, this doesn't solve your problem of knowing the compass direction from the perspective of your camera. Rather, it assumes the screen is parallel to the ground, like an old-fashioned pocket compass, and it reports which way the top of the screen is pointed. Thus, if the top of the screen is facing north, the orientation (event.values[0]) is 0. If the top of the screen is pointed straight up, the orientation is undefined. Unfortunately, this is a common case when using your camera.

In the example listener, I will use an enum to switch between Android's default pocket-compass style orientation and rear camera orientation, so you can see both the use case Android expects and the one you want.

enum class CompassCoordinateSystem { POCKET_COMPASS, REAR_CAMERA }

Then write a SensorEventListener to track changes. For clarity I did not import android.hardware.SensorManager.* in the code below. All the arrays defined below are populated by SensorManager static methods.

/** The latest compass orientation as a 3D vector. */
private var orientation3D = FloatArray(3)
private var coordinateSystem = CompassCoordinateSystem.REAR_CAMERA

fun compassDegrees(): Float = azimuthToDegrees(compassRadians())
fun compassRadians(): Float = orientation3D[0]

/** Convert such that North=0, East=90, South=180, West=270. */
fun azimuthToDegrees(azimuth: Float): Float {
    return ((Math.toDegrees(azimuth.toDouble())+360) % 360).toFloat()
}

override fun onSensorChanged(event: SensorEvent?) {
    if (event?.sensor?.type == Sensor.TYPE_ROTATION_VECTOR) {
         val rotationMatrix = FloatArray(9)
         SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
         when (coordinateSystem) {
            CompassCoordinateSystem.POCKET_COMPASS -> SensorManager.getOrientation(rotationMatrix, orientation3D)
            CompassCoordinateSystem.REAR_CAMERA -> {
                val rearCameraMatrix = FloatArray(9)
                // The axis parameters for remapCoordinateSystem() are
                // from an example in that method's documentation
                SensorManager.remapCoordinateSystem(rotationMatrix,
                    SensorManager.AXIS_X, 
                    SensorManager.AXIS_Z, 
                    rearCameraMatrix)
                SensorManager.getOrientation(rearCameraMatrix, orientation3D)
            }
        }
    }
}
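The original question asks for a direction name such as "north" or "west". Once the heading is in degrees (0 = north, 90 = east, as produced by a conversion like azimuthToDegrees above), mapping it to an 8-point compass name is a small lookup. A sketch in plain Java (the class and method names are ours):

```java
// Maps a heading in degrees (0 = north, 90 = east, already normalized to
// the range [0, 360)) to an 8-point compass name. Not an Android API.
public class CompassPoint {
    private static final String[] POINTS =
        { "N", "NE", "E", "SE", "S", "SW", "W", "NW" };

    public static String fromDegrees(float degrees) {
        // Each sector is 45 degrees wide and centered on its point,
        // so shift by half a sector (22.5) before dividing.
        int index = (int) (((degrees + 22.5f) % 360f) / 45f);
        return POINTS[index];
    }

    public static void main(String[] args) {
        System.out.println(fromDegrees(0f));   // N
        System.out.println(fromDegrees(91f));  // E
        System.out.println(fromDegrees(225f)); // SW
        System.out.println(fromDegrees(350f)); // N
    }
}
```

Note the input must already be normalized to [0, 360); a raw azimuth from getOrientation is in [-π, π] radians and needs the degrees-plus-360-modulo step first.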