@0xalter/alter-core
Core by Alter
Core by Alter is a cross-platform SDK consisting of a real-time 3D avatar system and facial motion capture, built from scratch for web3 interoperability and the open metaverse. Easily pipe avatars into your game, app, or website. It just works. Check out the included code samples to learn how to get started, and try the live demo.
Please star us ⭐⭐⭐ on GitHub! It motivates us a lot.
Table of Contents
Tech Specs
Supported Platforms
- iOS 13+
- Android 8+
- WebGL 2
- macOS (WIP)
- Windows (WIP)
- Unity (Soon)
- Unreal (Soon)
✨ Avatar Formats
- Head only
- A bust with clothing
- Accessories only (e.g. for AR filters) (Soon)
- Full body (Soon)
Variability
- Human and non-human
- From toddler to skeleton
- Genders and non-binary
- Full range of diversity
Motion Capture
✨ Features
- 42 tracked facial expressions via blendshapes
- Eye tracking including eye gaze vector
- Tongue tracking
- ≤ ±50° pitch, ≤ ±40° yaw and ≤ ±30° roll tracking coverage
- 3D reprojection to input photo/video
- Platform-suited API and packaging with internal optimizations
- Simultaneous back and front camera support
- Light & fast, just 3MB ML model size
Input
- Any webcam
- Photo
- Video
- Audio
Output
- ARKit-compatible blendshapes
- Head position and scale in 2D and 3D
- Head rotation in world coordinates
- Eye tracking including eye gaze vector
- 3D reprojection to the input photo/video
- Tongue tracking
⚡ Performance
- 50 FPS on Pixel 4
- 60 FPS on iPhone SE (1st gen)
- 90 FPS on iPhone X or newer
More information
If you only need the facial tracking technology, check out our mocap4face repository!
Installation
Browser/Javascript
To run the example, go to the js-example project and run the `npm install` and `npm run dev` commands.
Don't forget to get your API key at studio.alter.xyz and paste it into the code. Look for "YOUR-API-KEY-HERE".
NPM Installation
Install the dependency via the `npm` or `yarn` command.
npm install @0xalter/alter-core@0.10.0
If you are using a bundler (such as Webpack), make sure to copy the assets from `@0xalter/alter-core` to your serving directory. See our Webpack config for an example of what needs to be copied.
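As one way to set that up, here is a minimal sketch using the widely used `copy-webpack-plugin`. The exact asset paths inside `@0xalter/alter-core` and the `ignore` globs are assumptions for illustration; check the package contents and the repo's own Webpack config for the definitive list.

```javascript
// webpack.config.js — sketch of copying SDK assets to the output directory.
// Requires: npm install --save-dev copy-webpack-plugin
const CopyPlugin = require("copy-webpack-plugin");

module.exports = {
  // ...your existing entry, output, and loader configuration...
  plugins: [
    new CopyPlugin({
      patterns: [
        {
          // Hypothetical source/destination paths — adjust to match
          // where your dev server actually serves static files from.
          from: "node_modules/@0xalter/alter-core",
          to: "alter-core",
          // Skip JS sources and type definitions; the bundler already
          // handles those — only runtime assets need copying (assumption).
          globOptions: { ignore: ["**/*.js", "**/*.d.ts"] },
        },
      ],
    }),
  ],
};
```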