Official Kinect SDK and Unity3d

Published 2024-11-16 20:45:09 · 183 characters · 3 views


Does anyone know anything about using Kinect input for Unity3d with the official SDK? I've been assigned a project to try to integrate the two, but my supervisor doesn't want me to use the open Kinect stuff. The last news out of the Unity site was that the Kinect SDK requires .NET 4.0, while Unity3D only supports .NET 3.5.

Workarounds? Point me toward resources if you know anything about it please.


Comments (2)

抹茶夏天i‖ 2024-11-23 20:45:09


The OpenNI bindings for Unity are probably the best way to go. The NITE skeleton is more stable than the Microsoft Kinect SDK's, but it still requires calibration (PrimeSense has mentioned that they'll have a calibration-free skeleton soon).

There are also bindings from the Kinect SDK to OpenNI that make the Kinect SDK work like SensorKinect; this module additionally exposes the Kinect SDK's calibration-free skeleton as an OpenNI module:

https://www.assembla.com/code/kinect-mssdk-openni-bridge/git/nodes/

Because the Kinect SDK also provides ankles and wrists, and OpenNI already supported them (even though NITE didn't), all the OpenNI stuff, including Unity character rigs that include the ankles and wrists, just works without calibration. The Kinect SDK bindings for OpenNI also support using NITE's skeleton and hand trackers, with one caveat: NITE's gesture detection doesn't seem to work with the Kinect SDK yet. The workaround when using the Kinect SDK with NITE's handGenerator is to use skeleton tracking to provide you with a hand point. Unfortunately, you then lose the ability to track hands when your body isn't visible to the sensor.

Still, NITE's skeleton seems more stable and more responsive than the Kinect SDK's.

动听の歌 2024-11-23 20:45:09


How much of the raw Kinect data do you need? For a constrained problem, like just getting limb articulation, have you thought about using an agnostic communication scheme like TcpClient? Just create a simple TCP server in .NET 4.0 that links against the Kinect SDK and pumps out packets with the info you need every 30 ms or so. Then just write a receiving client in Unity. I had a similar problem with a different SDK. I haven't tried the Kinect though, so maybe my suggestion is overkill.

If you want real-time depth/color data, you might need something a bit faster, perhaps using pipes?
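The relay idea above can be sketched end to end. This is a minimal illustration of the wire pattern in Python only (the packet layout, joint count, port, and function names are all assumptions for the sketch); in the actual setup, the server side would be a .NET 4.0 process linked against the Kinect SDK, and the client would be a C# script inside Unity.

```python
# Sketch of the TCP relay pattern: a server that pumps fixed-size joint
# packets, and a client that reads and parses one. Dummy data stands in
# for real Kinect SDK skeleton output.
import socket
import struct
import threading
import time

JOINTS = 20                                   # Kinect SDK tracks 20 skeleton joints
PACKET = struct.Struct("<" + "fff" * JOINTS)  # little-endian x, y, z floats per joint

def serve_one_packet(host="127.0.0.1", port=9050):
    """Accept one client and send a single dummy skeleton packet."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    conn, _ = srv.accept()
    # Dummy joint positions; a real server would poll the Kinect SDK
    # every ~30 ms and send one packet per frame in a loop.
    coords = [0.1 * i for i in range(JOINTS * 3)]
    conn.sendall(PACKET.pack(*coords))
    conn.close()
    srv.close()

def read_skeleton(host="127.0.0.1", port=9050):
    """Connect, read exactly one packet, and return (x, y, z) per joint."""
    cli = socket.create_connection((host, port))
    buf = b""
    while len(buf) < PACKET.size:          # recv() may return partial data
        chunk = cli.recv(PACKET.size - len(buf))
        if not chunk:
            raise ConnectionError("server closed early")
        buf += chunk
    cli.close()
    flat = PACKET.unpack(buf)
    return [flat[i:i + 3] for i in range(0, len(flat), 3)]

if __name__ == "__main__":
    t = threading.Thread(target=serve_one_packet)
    t.start()
    time.sleep(0.2)                        # give the server time to bind
    joints = read_skeleton()
    t.join()
    print(len(joints), joints[0])
```

Because the two processes share nothing but a socket, the .NET version mismatch never matters: the server runs on .NET 4.0, the Unity client on 3.5, and the binary packet format is the only contract between them.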
