What is the best way to render video frames?

Posted 2024-11-01 08:10:24

What is the best choice for rendering video frames obtained from a decoder bundled into my app (FFmpeg, etc.)?

I would naturally tend to choose OpenGL as mentioned in Android Video Player Using NDK, OpenGL ES, and FFmpeg.

But in OpenGL in Android for video display, a comment notes that OpenGL isn't the best method for rendering video.

What then? The jnigraphics native library? And a non-GL SurfaceView?

Please note that I would like to use a native API for rendering the frames, such as OpenGL or jnigraphics. But Java code for setting up a SurfaceView and such is ok.

PS: MediaPlayer is irrelevant here, I'm talking about decoding and displaying the frames by myself. I can't rely on the default Android codecs.

3 Answers

你爱我像她 2024-11-08 08:10:24

I'm going to attempt to elaborate on and consolidate the answers here based on my own experiences.

Why OpenGL

When people think of rendering video with OpenGL, most are attempting to exploit the GPU to do color space conversion and alpha blending.

For instance, converting YV12 video frames to RGB. Color space conversions like YV12 -> RGB require you to calculate the value of each pixel individually. Imagine how many operations that ends up being for a 1280 x 720 frame (921,600 pixels).
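To make that concrete, here is a rough scalar sketch of the per-pixel math (fixed-point BT.601 coefficients; the function name is illustrative, and the planes are assumed tightly packed, i.e. stride == width):

```c
#include <stdint.h>

/* Illustrative YV12 -> RGBA conversion. Note that YV12 stores the V plane
 * before the U plane. For a 1280 x 720 frame this inner body runs
 * 921,600 times per frame. */
static void yv12_to_rgba(const uint8_t *y_plane, const uint8_t *v_plane,
                         const uint8_t *u_plane, uint8_t *rgba,
                         int width, int height)
{
    for (int row = 0; row < height; ++row) {
        for (int col = 0; col < width; ++col) {
            /* Chroma is subsampled 2x2 in YV12. */
            int chroma = (row / 2) * (width / 2) + (col / 2);
            int c = y_plane[row * width + col] - 16;
            int d = u_plane[chroma] - 128;
            int e = v_plane[chroma] - 128;

            int r = (298 * c + 409 * e + 128) >> 8;
            int g = (298 * c - 100 * d - 208 * e + 128) >> 8;
            int b = (298 * c + 516 * d + 128) >> 8;

            uint8_t *px = rgba + 4 * (row * width + col);
            px[0] = (uint8_t)(r < 0 ? 0 : r > 255 ? 255 : r);
            px[1] = (uint8_t)(g < 0 ? 0 : g > 255 ? 255 : g);
            px[2] = (uint8_t)(b < 0 ? 0 : b > 255 ? 255 : b);
            px[3] = 255;
        }
    }
}
```

A fragment shader doing the same arithmetic runs one instance per pixel on the GPU instead of looping on the CPU, which is exactly the SIMD point below.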

What I've just described is really what SIMD was made for - performing the same operation on multiple pieces of data in parallel. The GPU is a natural fit for color space conversion.

Why !OpenGL

The downside is the process by which you get texture data to the GPU. Consider that for each frame you have to load the texture data into memory (a CPU operation) and then copy this texture data into the GPU (another CPU operation). It is this load/copy that can make OpenGL slower than the alternatives.
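For reference, the per-frame upload on GLES2 usually boils down to something like this (a sketch only; it assumes the texture was created once up front with glTexImage2D and that rgba points at the already-converted frame):

```c
#include <GLES2/gl2.h>

/* Per-frame upload: this copy from CPU memory into the GPU-owned texture
 * is the cost described above. */
static void upload_frame(GLuint texture, int width, int height,
                         const void *rgba)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba);
    /* ...then draw a textured quad and eglSwapBuffers() as usual. */
}
```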

If you are playing low-resolution videos, then I suppose it's possible you won't see the speed difference because your CPU won't be the bottleneck. However, if you try HD you will more than likely hit this bottleneck and notice a significant performance hit.

The way this bottleneck has traditionally been worked around is by using Pixel Buffer Objects (allocating GPU memory to store texture uploads). Unfortunately, GLES2 does not have Pixel Buffer Objects.

Other Options

For the above reasons, many have chosen to use software decoding combined with available CPU extensions like NEON for the color space conversion. An implementation of YUV to RGB for NEON exists here. The means by which you draw the frames (SDL vs. OpenGL) should not matter for RGB, since you are copying the same number of pixels in both cases.

You can determine whether your target device supports NEON enhancements by running cat /proc/cpuinfo from adb shell and looking for neon in the Features output.
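If you would rather check from native code at runtime, the NDK's cpufeatures module exposes the same information (a sketch; it assumes you compile and link the cpufeatures sources that ship with the NDK):

```c
#include <cpu-features.h>

/* Returns non-zero if we are running on an ARM CPU that advertises NEON. */
static int has_neon(void)
{
    return android_getCpuFamily() == ANDROID_CPU_FAMILY_ARM &&
           (android_getCpuFeatures() & ANDROID_CPU_ARM_FEATURE_NEON) != 0;
}
```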

离旧人 2024-11-08 08:10:24

I have gone down the FFmpeg/OpenGLES path before, and it's not very fun.

You might try porting ffplay.c from the FFmpeg project, which has been done before using an Android port of SDL. That way you aren't building your decoder from scratch, and you won't have to deal with the idiosyncrasies of AudioTrack, an audio API unique to Android.

In any case, it's a good idea to do as little NDK development as possible and rely on porting, since the ndk-gdb debugging experience is pretty lousy right now in my opinion.

That being said, I think OpenGLES performance is the least of your worries. I found the performance to be fine, although I admit I only tested on a few devices. The decoding itself is fairly intensive, and I wasn't able to do very aggressive buffering (from the SD card) while playing the video.

指尖上的星空 2024-11-08 08:10:24

Actually, I have deployed a custom video player system, and almost all of my work was done on the NDK side. We are getting full-frame video at 720p and above, including our custom DRM system. OpenGL is not your answer, because pixel buffers ("Pixbuffers") are not supported on Android, so you are basically blasting your textures every frame, and that screws up OpenGL ES's caching system. Frankly, you need to shove the video frames through the natively supported Bitmap on Froyo and above. Before Froyo you're hosed. I also wrote a lot of NEON intrinsics for color conversion, rescaling, etc. to increase throughput. I can push 50-60 frames through this model on HD video.
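For anyone following this route, the "natively supported Bitmap" is the jnigraphics library (android/bitmap.h, available since Froyo). Below is a minimal sketch of pushing an already RGBA-converted frame into a Java-side ARGB_8888 Bitmap, which the Java code then draws to its view; link with -ljnigraphics. This is just one way to wire it up, not necessarily the exact pipeline described above.

```c
#include <android/bitmap.h>
#include <jni.h>
#include <stdint.h>
#include <string.h>

/* Copies one decoded (already RGBA-converted) frame into a Java Bitmap.
 * The Bitmap is assumed to be ARGB_8888 and sized to match the frame. */
void render_frame(JNIEnv *env, jobject bitmap, const uint8_t *rgba_frame)
{
    AndroidBitmapInfo info;
    void *pixels;

    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return;
    if (AndroidBitmap_lockPixels(env, bitmap, &pixels) < 0)
        return;

    /* Copy row by row to respect the Bitmap's stride. */
    for (uint32_t row = 0; row < info.height; ++row) {
        memcpy((uint8_t *)pixels + row * info.stride,
               rgba_frame + row * info.width * 4,
               info.width * 4);
    }

    AndroidBitmap_unlockPixels(env, bitmap);
    /* The Java side then draws the Bitmap onto its SurfaceView/Canvas. */
}
```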
