@0xalter/mocap4face Documentation

mocap4face by alter

mocap4face by alter is a free, multiplatform SDK for real-time facial motion capture based on the Facial Action Coding System (FACS). It provides real-time FACS-derived blendshape coefficients and a rigid head pose in 3D space from any mobile camera, webcam, photo, or video, enabling live animation of 3D avatars, digital characters, and more.

After fetching the input from one of the sources above, the mocap4face SDK produces data as ARKit-compatible blendshapes, i.e., morph-target weight values as a per-frame expression, shown in the video below. This can be used, for example, to animate a 2D or 3D avatar so that it mimics the user's facial expressions in real time, à la Apple Memoji, but without the need for a hardware-based TrueDepth camera.
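For example, if your avatar's mesh exposes morph targets named after the same ARKit blendshapes, the per-frame weights can be written straight into them. The sketch below assumes a three.js mesh and a plain name-to-weight map; the map is a stand-in for the SDK's actual per-frame result, not its real type.

```typescript
// Sketch only: applying ARKit-style blendshape weights to a three.js mesh
// whose morph targets use the same names (e.g. "jawOpen", "eyeBlinkLeft").
// `weights` is a stand-in for the SDK's per-frame output, not its real type.
import * as THREE from "three";

function applyBlendshapes(
  mesh: THREE.Mesh,
  weights: Record<string, number> // blendshape name -> weight in [0, 1]
): void {
  const dict = mesh.morphTargetDictionary;
  const influences = mesh.morphTargetInfluences;
  if (!dict || !influences) return; // mesh has no morph targets

  for (const [name, weight] of Object.entries(weights)) {
    const index = dict[name];
    if (index !== undefined) {
      influences[index] = weight; // drive the matching morph target
    }
  }
}
```

Calling a function like this once per tracked frame keeps the avatar's expression in sync with the camera feed.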

With mocap4face, you can drive live avatars or NFT PFPs, build Snapchat-like lenses, AR experiences, face filters that trigger actions, VTubing apps, and more, with as little energy impact and CPU/GPU use as possible. As an example, check out how the popular avatar live-streaming app REALITY uses our SDK.

Please star us on GitHub; it motivates us a lot!

Table of Contents

Tech Specs

Key Features

  • 42 tracked facial expressions via blendshapes
  • Eye tracking including eye gaze vector
  • Tongue tracking
  • Light & fast, just 3MB ML model size
  • ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage

Input

  • Any RGB camera
  • Photo
  • Video

Output

  • ARKit-compatible blendshapes
  • Head position and scale in 2D and 3D
  • Head rotation in world coordinates

Performance

  • 50 FPS on Pixel 4
  • 60 FPS on iPhone SE (1st gen)
  • 90 FPS on iPhone X or newer

Installation

Prerequisites

  1. Create a dev account at studio.facemoji.co
  2. Generate a unique API key for your app
  3. Paste the API key into your source code (see the snippet below)
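For step 3, one common pattern is to keep the key in a single constant or an environment variable injected at build time rather than scattering it through the code. This is purely illustrative; the constant name is arbitrary, and how the key is actually passed to the SDK is shown in the platform examples.

```typescript
// Illustrative only: the constant name is arbitrary, and how the key is handed
// to the SDK is defined by the mocap4face examples for your platform.
// In a bundler-based web project the key is often injected at build time
// rather than hard-coded.
export const FACEMOJI_API_KEY: string = "<your API key from studio.facemoji.co>";
```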

Web

  1. Open the sample project under js-example in an editor of your choice
  2. Run npm install && npm run dev to start a local server with the demo
  3. Run npm install && npm run dev_https to start a local server with self-signed HTTPS support
  4. Run npm install @facemoji/mocap4face in your own project to add mocap4face as a dependency (a usage sketch follows after the notes below)

If the webcam button is not working, you might need to use HTTPS for the local dev server. Run npm run dev_https and accept the self-signed certificate in the browser to start the demo in HTTPS mode.

You can also run npm run build to create a production bundle of the demo app.
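To give a feel for how the dependency is typically wired up, here is a minimal sketch of the intended flow: grab a webcam stream, run tracking once per animation frame, and forward the result to whatever drives the avatar. The BlendshapeTracker interface and the shape of its result are assumptions made for illustration; the real entry points and result types are the ones used in the js-example project.

```typescript
// Sketch of the intended flow only, not the package's real API.
// `BlendshapeTracker` is a stand-in for whatever object the js-example
// project constructs from @facemoji/mocap4face with your API key.
interface BlendshapeTracker {
  track(frame: HTMLVideoElement): {
    blendshapes: Record<string, number>; // assumed shape of the per-frame output
    rotation: { pitch: number; yaw: number; roll: number }; // assumed head pose shape
  } | null;
}

async function runWebcamTracking(tracker: BlendshapeTracker): Promise<void> {
  // Feed the webcam into a <video> element.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // Run tracking once per animation frame and hand the result to the avatar.
  const onFrame = () => {
    const result = tracker.track(video);
    if (result) {
      // e.g. applyBlendshapes(avatarMesh, result.blendshapes); // see the earlier sketch
      console.log(result.rotation);
    }
    requestAnimationFrame(onFrame);
  };
  requestAnimationFrame(onFrame);
}
```

In practice, the tracker object would be constructed from the @facemoji/mocap4face exports with your API key (see Prerequisites) and then handed to a loop like this one.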

Use Cases

  • AR for NFT profile pics
  • Live avatar experiences
  • Snapchat-like lenses
  • AR experiences
  • VTubing apps
  • Live streaming apps
  • Face filters
  • AR games with facial triggers
  • Beauty AR
  • Virtual try-on
  • Play-to-earn games

Links

License

This library is provided under the Facemoji SDK License Agreement; see LICENSE. Also make sure to check out our FAQ for more details.

The sample code in this repository is provided under the Facemoji Samples License.

Notices

OSS used in the mocap4face SDK:

Original video by LaBeouf, Rönkkö & Turner.

