Is it possible to render the OVRCameraRig eyes to a Texture2D on Oculus Quest 2 in Unity?
I am attempting to stream out the current in-game view over a WebRTC connection. My goal is to capture what the user is seeing as an RGB 24bpp byte array. I am currently able to stream an empty Texture2D; I would like to populate that texture with the OVRCameraRig's current in-game view.
I am not a strong Unity developer, but I assumed it might look something like this:
private Texture2D tex;
private RenderTexture rt;
private OVRCameraRig oVRCameraRig;

void Start() {
    // I only have 1 camera rig
    oVRCameraRig = FindObjectOfType<OVRCameraRig>();
    tex = new Texture2D(640, 480, TextureFormat.RGB24, false);
    // The depth buffer must be 0, 16, 24 or 32 bits; 8 is not valid,
    // and R8G8B8_SRGB is not a renderable format on most GPUs.
    rt = new RenderTexture(640, 480, 24, RenderTextureFormat.ARGB32);
    // Redirect the left eye camera into the render texture once,
    // rather than reassigning it every frame.
    oVRCameraRig.leftEyeCamera.targetTexture = rt;
}

public void Update() {
    // Copy the render texture back into the CPU-side Texture2D.
    RenderTexture.active = rt;
    tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    tex.Apply();
    RenderTexture.active = null;
}
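
Since ReadPixels is only valid once the camera has actually rendered into the render texture, I suspect the readback belongs after WaitForEndOfFrame rather than in Update. Here is a minimal sketch of what I mean; the class name EyeCapture, the coroutine CaptureFrames, and the commented-out SendFrame call are placeholder names for my WebRTC send path, not real APIs:

using System.Collections;
using UnityEngine;

public class EyeCapture : MonoBehaviour {
    private Texture2D tex;
    private RenderTexture rt;
    private OVRCameraRig oVRCameraRig;

    void Start() {
        oVRCameraRig = FindObjectOfType<OVRCameraRig>();
        tex = new Texture2D(640, 480, TextureFormat.RGB24, false);
        rt = new RenderTexture(640, 480, 24, RenderTextureFormat.ARGB32);
        oVRCameraRig.leftEyeCamera.targetTexture = rt;
        StartCoroutine(CaptureFrames());
    }

    private IEnumerator CaptureFrames() {
        while (true) {
            // Wait until all rendering for this frame has finished.
            yield return new WaitForEndOfFrame();
            RenderTexture.active = rt;
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();
            RenderTexture.active = null;
            // 640 * 480 * 3 bytes of raw RGB24 pixel data per frame.
            byte[] frame = tex.GetRawTextureData();
            // SendFrame(frame); // hand off to the WebRTC sender here
        }
    }
}

GetRawTextureData on an RGB24 Texture2D should give exactly the RGB 24bpp byte array I want to push into the WebRTC stream, if this approach is sound.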