Is it possible to use video as a texture for GL in iOS?

Posted 2024-10-03 18:31:55


Is it possible using video (pre-rendered, compressed with H.264) as texture for GL in iOS?

If possible, how to do it? And any playback quality/frame-rate or limitations?


Comments (2)

起风了 2024-10-10 18:31:55


As of iOS 4.0, you can use AVCaptureDeviceInput to get the camera as a device input and connect it to an AVCaptureVideoDataOutput with any object you like set as the delegate. By setting a 32bpp BGRA format for the camera, the delegate object will receive each frame from the camera in a format just perfect for handing immediately to glTexImage2D (or glTexSubImage2D if the device doesn't support non-power-of-two textures; I think the MBX devices fall into this category).
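As a rough sketch of the setup described above (Objective-C; the variable names, the delegate queue, and the absence of error handling are illustrative assumptions, not from the original answer):

```objc
// Hedged sketch: configure a capture session to deliver 32bpp BGRA frames
// to a sample-buffer delegate, ready for direct upload via glTexImage2D.
AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
[session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA so the frame bytes can go straight to GL.
output.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[session startRunning];
```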

There are a bunch of frame size and frame rate options; at a guess you'll have to tweak those depending on how much else you want to use the GPU for. I found that a completely trivial scene with just a textured quad showing the latest frame, redrawn only exactly when a new frame arrives on an iPhone 4, was able to display that device's maximum 720p 24fps feed without any noticeable lag. I haven't performed any more thorough benchmarking than that, so hopefully someone else can advise.

In principle, per the API, frames can come back with some in-memory padding between scanlines, which would mean some shuffling of contents before posting off to GL so you do need to implement a code path for that. In practice, speaking purely empirically, it appears that the current version of iOS never returns images in that form so it isn't really a performance issue.
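The padded-scanline code path mentioned above might look like this (a sketch; the texture name `_texture` is an assumption, and `GL_BGRA` relies on the `GL_APPLE_texture_format_BGRA8888` extension):

```objc
// Hedged sketch of the sample-buffer delegate: upload directly when scanlines
// are tightly packed, repack row by row otherwise.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef buf = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(buf, kCVPixelBufferLock_ReadOnly);

    size_t width = CVPixelBufferGetWidth(buf);
    size_t height = CVPixelBufferGetHeight(buf);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(buf);
    const uint8_t *base = CVPixelBufferGetBaseAddress(buf);

    glBindTexture(GL_TEXTURE_2D, _texture);
    if (bytesPerRow == width * 4) {
        // Tightly packed: one upload, no copying.
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, base);
    } else {
        // In-memory padding between scanlines: repack before uploading.
        uint8_t *tight = malloc(width * 4 * height);
        for (size_t y = 0; y < height; y++) {
            memcpy(tight + y * width * 4, base + y * bytesPerRow, width * 4);
        }
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)width, (GLsizei)height,
                     0, GL_BGRA, GL_UNSIGNED_BYTE, tight);
        free(tight);
    }

    CVPixelBufferUnlockBaseAddress(buf, kCVPixelBufferLock_ReadOnly);
}
```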

EDIT: it's now very close to three years later. In the interim Apple has released iOS 5, 6 and 7. With 5 they introduced CVOpenGLESTexture and CVOpenGLESTextureCache, which are now the smart way to pipe video from a capture device into OpenGL. Apple supplies sample code here, from which the particularly interesting parts are in RippleViewController.m, specifically its setupAVCapture and captureOutput:didOutputSampleBuffer:fromConnection: — see lines 196–329. Sadly the terms and conditions prevent a duplication of the code here without attaching the whole project but the step-by-step setup is:

  1. create a CVOpenGLESTextureCache (via CVOpenGLESTextureCacheCreate) and an AVCaptureSession;
  2. grab a suitable AVCaptureDevice for video;
  3. create an AVCaptureDeviceInput with that capture device;
  4. attach an AVCaptureVideoDataOutput and tell it to call you as a sample buffer delegate.
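The setup steps above can be sketched as follows (not the Apple sample code itself; `_context` and `_textureCache` are assumed instance variables, and the bi-planar YUV pixel format is what the Ripple sample requests):

```objc
// Hedged sketch of the one-time setup for the CVOpenGLESTextureCache path.
// Assumes an existing EAGLContext in `_context`.
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                            _context, NULL, &_textureCache);
if (err != kCVReturnSuccess) { /* handle error */ }

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *camera =
    [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input =
    [AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL];
[session addInput:input];

AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
// Bi-planar YUV, so the cache can hand back separate Y and UV textures.
output.videoSettings =
    @{ (id)kCVPixelBufferPixelFormatTypeKey :
       @(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) };
[output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[session startRunning];
```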

Upon receiving each sample buffer:

  1. get the CVImageBufferRef from it;
  2. use CVOpenGLESTextureCacheCreateTextureFromImage to get Y and UV CVOpenGLESTextureRefs from the CV image buffer;
  3. get texture targets and names from the CV OpenGLES texture refs in order to bind them;
  4. combine luminance and chrominance in your shader.
白云悠悠 2024-10-10 18:31:55


Use RosyWriter for a MUCH better example of how to do OpenGL video rendering. Performance is very good, especially if you reduce the framerate (~10% at 1080P/30, >=5% at 1080P/15).
