Are Core Image filters in iOS 5.0 fast enough for real-time video processing?

Now that Apple has ported the Core Image framework over to iOS 5.0, I'm wondering: is Core Image fast enough to apply live filters and effects to camera video?

Also, what would be a good starting point to learn the Core Image framework for iOS 5.0?


Comments (3)

蓦然回首 2024-11-26 03:47:03

Now that Core Image has been out on iOS for a while, we can talk about some hard performance numbers. I created a benchmark application as part of the testing for my GPUImage framework, and profiled the performance of raw CPU-based filters, Core Image filters, and GPUImage filters with live video feeds. The following were the times (in milliseconds) each took to apply a single gamma filter on a 640x480 video frame from the iPhone's camera (for two different hardware models running two different OS versions):

             iPhone 4 (iOS 5)   | iPhone 4S (iOS 6)
------------------------------------------------
CPU          458 ms (2.2 FPS)     183 ms (5.5 FPS)
Core Image   106 ms (9.4 FPS)     8.2 ms (122 FPS)
GPUImage     2.5 ms (400 FPS)     1.8 ms (555 FPS)

For Core Image, this translates into a maximum of 9.4 FPS for a simple gamma filter on iPhone 4, but well over 60 FPS for the same on an iPhone 4S. This is about the simplest Core Image filter case you can set up, so performance will certainly vary with more complex operations. This would seem to indicate that Core Image cannot do live processing fast enough to match the iPhone's camera rate on the iPhone 4 running iOS 5, but as of iOS 6, it processes video more than fast enough to do live filtering on iPhone 4S and above.
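
For reference, the per-frame setup being timed above looks roughly like the following. This is a minimal sketch rather than the actual benchmark code; the ciContext and previewBounds properties are assumptions here, with the context created via contextWithEAGLContext: so that rendering stays on the GPU.

    #import <AVFoundation/AVFoundation.h>
    #import <CoreImage/CoreImage.h>

    // Called once per camera frame by an AVCaptureVideoDataOutput delegate.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        CIImage *inputImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

        // CIGammaAdjust is about the cheapest built-in filter you can run.
        CIFilter *gammaFilter = [CIFilter filterWithName:@"CIGammaAdjust"];
        [gammaFilter setValue:inputImage forKey:kCIInputImageKey];
        [gammaFilter setValue:[NSNumber numberWithFloat:0.75f]
                       forKey:@"inputPower"];
        CIImage *outputImage = [gammaFilter outputImage];

        // ciContext is assumed to be a long-lived CIContext created with
        // +contextWithEAGLContext: so the render never leaves the GPU.
        [self.ciContext drawImage:outputImage
                           inRect:self.previewBounds
                         fromRect:[outputImage extent]];
    }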

The source for these benchmarks can be found in my GitHub repository, if you wish to see where I got these numbers from.

I've updated this answer from my original, which was too critical of Core Image's performance. The sepia tone filter I was using as a basis of comparison was not performing the same operation as my own, so it was a poor benchmark. The performance of Core Image filters also improved significantly in iOS 6, which helped make them more than fast enough to process live video on iPhone 4S and up. Also, I've since found several cases, like large-radius blurs, where Core Image significantly outperforms my GPUImage framework.

Previous answer, for posterity:

As with any performance-related question, the answer will depend on the complexity of your filters, the image size being filtered, and the performance characteristics of the device you're running on.

Because Core Image has been available for a while on the Mac, I can point you to the Core Image Programming Guide as a resource for learning the framework. I can't comment on the iOS-specific elements, given the NDA, but I highly recommend watching the video for WWDC 2011 Session 422 - Using Core Image on iOS and Mac OS X.

Core Image (mostly) uses the GPU for image processing, so you could look at how fast OpenGL ES 2.0 shaders handle image processing on existing devices. I did some work in this area recently, and found that the iPhone 4 could do 60 FPS processing using a simple shader on realtime video being fed in at a 480 x 320 resolution. You can download my sample application and attempt to customize the shader and/or video input size to determine whether your particular device can handle this processing at a decent frame rate. Core Image may add a little overhead, but it also has some clever optimizations for how it organizes filter chains.
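
To give a sense of what counts as a "simple shader" in those measurements, a hypothetical fragment shader along those lines (not the one from my sample application) is just a per-pixel gamma adjustment:

    // A trivial OpenGL ES 2.0 fragment shader, stored as an Objective-C string
    // constant; the uniform and varying names here are illustrative.
    static NSString *const kGammaFragmentShaderString = @""
        "varying highp vec2 textureCoordinate;\n"
        "uniform sampler2D inputImageTexture;\n"
        "uniform lowp float gamma;\n"
        "\n"
        "void main()\n"
        "{\n"
        "    lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);\n"
        "    gl_FragColor = vec4(pow(color.rgb, vec3(gamma)), color.a);\n"
        "}";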

The slowest compatible devices out there would be the iPhone 3G S and the 3rd generation iPod touch, but they're not that much slower than the iPhone 4. The iPad 2 blows them all away with its massive fragment processing power.

随梦而飞# 2024-11-26 03:47:03


IMHO, since iOS 6.0 Core Image should always be your first option.
There are some good features you're going to like, such as:

  1. Core Image input parameters support glTexture2d;
  2. Core Image output parameters support glTexture2d;
  3. You can choose to use Core Image exclusively on the GPU by initializing a CIContext with, e.g., _CIglContext = [CIContext contextWithEAGLContext:_glContext options:opts]; (see the sketch below);
  4. Many filters are now available on the iOS platform, around 93 of them;
  5. You can choose to process buffers exclusively on the CPU while your application is temporarily running in the background, although this is not recommended.

These are all covered in the Core Image session video from WWDC 2012. Have a look; maybe you will find your solution there.
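
To make points 3 and 5 concrete, setting up the two kinds of contexts looks roughly like this; the property names and option dictionaries below are illustrative, not taken from the WWDC session:

    #import <CoreImage/CoreImage.h>
    #import <OpenGLES/EAGL.h>

    - (void)setUpCoreImageContexts
    {
        // GPU-backed context: output is rendered through an EAGL context, so
        // pixel data never has to round-trip through the CPU.
        EAGLContext *glContext =
            [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        NSDictionary *gpuOptions =
            [NSDictionary dictionaryWithObject:[NSNull null]
                                        forKey:kCIContextWorkingColorSpace];
        self.gpuContext = [CIContext contextWithEAGLContext:glContext
                                                     options:gpuOptions];

        // CPU-only context: the fallback while the app is in the background,
        // where GPU access is not allowed, but noticeably slower.
        NSDictionary *cpuOptions =
            [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                        forKey:kCIContextUseSoftwareRenderer];
        self.cpuContext = [CIContext contextWithOptions:cpuOptions];
    }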

朕就是辣么酷 2024-11-26 03:47:03


It will be almost real-time on the iPhone 4 and later. The iPhone 3GS will be a little choppy. The iPhone 3G and earlier are not recommended.
