How do I create simple custom filters for iOS using the Core Image framework?
I want to use a custom filter in my app. I know that I need to use the Core Image framework, but I'm not sure that's the right way.
The Core Image framework is available on Mac OS and, as of iOS 5.0, on iOS as well, but I'm not sure whether it can be used for custom CIFilter effects.
Can you help me with this issue?
Thanks all!
As Adam states, currently Core Image on iOS does not support custom kernels like the older Mac implementation does. This limits what you can do with the framework to being some kind of combination of existing filters.
(Update: 2/13/2012)
For this reason, I've created an open source framework for iOS called GPUImage, which lets you create custom filters to be applied to images and video using OpenGL ES 2.0 fragment shaders. I describe more about how this framework operates in my post on the topic. Basically, you can supply your own custom OpenGL Shading Language (GLSL) fragment shaders to create a custom filter, and then run that filter against static images or live video. This framework is compatible with all iOS devices that support OpenGL ES 2.0, and can create applications that target iOS 4.0.
For example, you can set up filtering of live video using code like the following:
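A sketch of that setup, using GPUImage's camera, filter, and view classes; the shader filename, view dimensions, and camera preset here are placeholder values:

```objective-c
// Capture from the rear camera
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc]
    initWithSessionPreset:AVCaptureSessionPreset640x480
           cameraPosition:AVCaptureDevicePositionBack];

// Load a custom filter from a fragment shader file (CustomShader.fsh in the bundle)
GPUImageFilter *customFilter = [[GPUImageFilter alloc]
    initWithFragmentShaderFromFile:@"CustomShader"];

// A view to display the filtered output
GPUImageView *filteredVideoView = [[GPUImageView alloc]
    initWithFrame:CGRectMake(0.0, 0.0, 480.0, 640.0)];

// Chain the pipeline: camera -> custom filter -> on-screen view
[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];

[videoCamera startCameraCapture];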
As an example of a custom fragment shader program that defines a filter, the following applies a sepia tone effect:
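A sketch of such a fragment shader, using the commonly used sepia weighting matrix; the varying and uniform names follow GPUImage's conventions for passthrough filters:

```glsl
varying highp vec2 textureCoordinate;
uniform sampler2D inputImageTexture;

void main()
{
    lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
    lowp vec4 outputColor;

    // Standard sepia-tone color matrix
    outputColor.r = (textureColor.r * 0.393) + (textureColor.g * 0.769) + (textureColor.b * 0.189);
    outputColor.g = (textureColor.r * 0.349) + (textureColor.g * 0.686) + (textureColor.b * 0.168);
    outputColor.b = (textureColor.r * 0.272) + (textureColor.g * 0.534) + (textureColor.b * 0.131);
    outputColor.a = 1.0;

    gl_FragColor = outputColor;
}
```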
The language used for writing custom Core Image kernels on the Mac is very similar to GLSL. In fact, you'll be able to do a few things that you can't in desktop Core Image, because Core Image's kernel language lacks some things that GLSL has (like branching).
The originally accepted answer is deprecated. From iOS 8 onward, you can create custom kernels for filters. You can find more information about this in Apple's Core Image documentation.
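As a minimal sketch of the iOS 8+ approach, a kernel can be compiled from a string and applied directly; the color-inversion kernel here is purely illustrative:

```objective-c
#import <CoreImage/CoreImage.h>

// Compile a simple color kernel from Core Image Kernel Language source
CIColorKernel *kernel = [CIColorKernel kernelWithString:
    @"kernel vec4 invertColor(__sample s) { return vec4(1.0 - s.rgb, s.a); }"];

// Apply it to an existing CIImage (inputImage assumed to be defined elsewhere)
CIImage *result = [kernel applyWithExtent:inputImage.extent
                                arguments:@[inputImage]];
```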
OUTDATED
You can't create your own custom kernels/filters in iOS yet. See http://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/CoreImaging/ci_intro/ci_intro.html, specifically:
(Bolding mine)
Custom filters for iOS are easier to create than Image Unit plug-ins for MacOS X, so much so that they would be preferred even if Image Unit plug-ins were supported by iOS. The problem is that you cannot actually "package" them or otherwise bundle them as a resource the way you can with Image Unit plug-ins; you have to expose your source code to the developers that use them. Moreover, they are only useful to developers; you cannot distribute them to end users of iOS graphics apps the way you can for MacOS X graphics apps that import third-party Core Image filters. For that, you must embed them in a Photo Editing Extension.
Still, even processing images with a custom Core Image filter for iOS is easier than with an Image Unit plug-in. There's no importing, followed by the confusing task of configuring .plist and description files and what-not.
A custom Core Image filter for iOS is simply a Cocoa Touch Class that is a subclass of CIFilter; in it, you specify input parameters (always at least the image), custom attributes settings and their defaults, and then any combination of built-in or custom Core Image filters. If you want to add an OpenGL kernel to the image-processing pipeline, you simply add a CIKernel method, which loads the .cikernel you write in a separate file.
The beauty of this particular method for developing a custom Core Image Filter for iOS is that custom filters are instantiated and called the same way as built-in filters:
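For instance, assuming a custom subclass named PrewittKernel (an illustrative name), usage mirrors any built-in filter:

```objective-c
// Instantiate and invoke a custom CIFilter subclass exactly like a built-in one
PrewittKernel *prewitt = [PrewittKernel new];
[prewitt setValue:inputCIImage forKey:kCIInputImageKey];

CIImage *output = prewitt.outputImage;
UIImage *filtered = [UIImage imageWithCIImage:output];
```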
Here's a simple example that uses OpenGL to apply the Prewitt Operator to an image; first, the Cocoa Touch Class (subclassing CIFilter), then, the CIKernel file (containing the OpenGL ES 3.0 code):
The header file:
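A sketch of such a header, assuming a filter class named PrewittKernel with a single image input:

```objective-c
// PrewittKernel.h
#import <CoreImage/CoreImage.h>

@interface PrewittKernel : CIFilter

@property (nonatomic, retain) CIImage *inputImage;

@end
```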
The implementation file:
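A sketch of the implementation, which loads the kernel source from a PrewittKernel.cikernel file in the app bundle and applies it to the input image:

```objective-c
// PrewittKernel.m
#import "PrewittKernel.h"

@implementation PrewittKernel

@synthesize inputImage;

- (CIKernel *)prewittKernel
{
    static CIKernel *kernel = nil;
    static dispatch_once_t once;
    dispatch_once(&once, ^{
        // Load the kernel source written in the separate .cikernel file
        NSString *path = [[NSBundle mainBundle] pathForResource:@"PrewittKernel"
                                                         ofType:@"cikernel"];
        NSString *source = [NSString stringWithContentsOfFile:path
                                                     encoding:NSUTF8StringEncoding
                                                        error:nil];
        kernel = [CIKernel kernelWithString:source];
    });
    return kernel;
}

- (CIImage *)outputImage
{
    CIImage *image = self.inputImage;
    if (image == nil) return nil;

    // The ROI callback maps each destination rect back to the source pixels needed
    return [[self prewittKernel]
        applyWithExtent:image.extent
            roiCallback:^CGRect(int index, CGRect destRect) { return destRect; }
              arguments:@[image]];
}

@end
```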
The CIKernel (OpenGL ES 3.0):
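A sketch of a Prewitt edge-detection kernel in the Core Image Kernel Language: it samples the 3x3 neighborhood around each pixel as luminance values, then combines the horizontal and vertical Prewitt gradients into an edge magnitude:

```glsl
/* PrewittKernel.cikernel */
kernel vec4 prewitt(sampler image)
{
    vec3 lum = vec3(0.299, 0.587, 0.114);
    vec2 d = destCoord();

    // Luminance of the 3x3 neighborhood
    float nw = dot(sample(image, samplerTransform(image, d + vec2(-1.0,  1.0))).rgb, lum);
    float n  = dot(sample(image, samplerTransform(image, d + vec2( 0.0,  1.0))).rgb, lum);
    float ne = dot(sample(image, samplerTransform(image, d + vec2( 1.0,  1.0))).rgb, lum);
    float w  = dot(sample(image, samplerTransform(image, d + vec2(-1.0,  0.0))).rgb, lum);
    float e  = dot(sample(image, samplerTransform(image, d + vec2( 1.0,  0.0))).rgb, lum);
    float sw = dot(sample(image, samplerTransform(image, d + vec2(-1.0, -1.0))).rgb, lum);
    float s  = dot(sample(image, samplerTransform(image, d + vec2( 0.0, -1.0))).rgb, lum);
    float se = dot(sample(image, samplerTransform(image, d + vec2( 1.0, -1.0))).rgb, lum);

    // Prewitt horizontal and vertical gradients
    float gx = (ne + e + se) - (nw + w + sw);
    float gy = (nw + n + ne) - (sw + s + se);

    float mag = clamp(length(vec2(gx, gy)), 0.0, 1.0);
    return vec4(vec3(mag), 1.0);
}
```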
Here's another filter that generates an unsharp mask by subtracting (or, rather, differencing) a Gaussian blurred image from the original using built-in Core Image filters—no Core Image kernel code (OpenGL); it shows how to specify and use a custom attribute, namely, the radius of the Gaussian blur:
The header file:
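A sketch of the header for this filter, assuming a class named UnsharpMask with an image input and a radius attribute:

```objective-c
// UnsharpMask.h
#import <CoreImage/CoreImage.h>

@interface UnsharpMask : CIFilter

@property (nonatomic, retain) CIImage *inputImage;
@property (nonatomic, retain) NSNumber *inputRadius; // Gaussian blur radius

@end
```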
The implementation file:
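A sketch of the implementation, built entirely from built-in Core Image filters as the answer describes: blur the input, difference the blurred copy against the original to form the mask, then composite the mask back onto the original; the default radius of 2.5 is an illustrative value:

```objective-c
// UnsharpMask.m
#import "UnsharpMask.h"

@implementation UnsharpMask

@synthesize inputImage;
@synthesize inputRadius;

// Describe the custom attribute so callers can query its type and default
- (NSDictionary *)customAttributes
{
    return @{ @"inputRadius":
                  @{ kCIAttributeClass: @"NSNumber",
                     kCIAttributeDefault: @2.5,
                     kCIAttributeMin: @0.0,
                     kCIAttributeType: kCIAttributeTypeScalar } };
}

- (void)setDefaults
{
    self.inputRadius = @2.5;
}

- (CIImage *)outputImage
{
    CIImage *image = self.inputImage;
    if (image == nil) return nil;

    // Gaussian-blur the original, cropped back to the original extent
    CIImage *blurred = [[CIFilter filterWithName:@"CIGaussianBlur"
        withInputParameters:@{ kCIInputImageKey: image,
                               kCIInputRadiusKey: self.inputRadius }].outputImage
        imageByCroppingToRect:image.extent];

    // Difference the blurred copy against the original to produce the mask
    CIImage *mask = [CIFilter filterWithName:@"CIDifferenceBlendMode"
        withInputParameters:@{ kCIInputImageKey: blurred,
                               kCIInputBackgroundImageKey: image }].outputImage;

    // Add the mask back onto the original to sharpen edges
    return [CIFilter filterWithName:@"CIAdditionCompositing"
        withInputParameters:@{ kCIInputImageKey: mask,
                               kCIInputBackgroundImageKey: image }].outputImage;
}

@end
```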