How do I create a simple custom filter for iOS using the Core Image framework?

Posted on 2024-12-27 11:24:18

I want to use a custom filter in my app. I know that I need to use the Core Image framework, but I'm not sure that's the right way.
The Core Image framework is available on Mac OS and, as of iOS 5.0, on iOS as well; I'm not sure whether it can be used for custom CIFilter effects.
Can you help me with this issue?
Thanks, all!

Comments (4)

不顾 2025-01-03 11:24:18

As Adam states, currently Core Image on iOS does not support custom kernels like the older Mac implementation does. This limits what you can do with the framework to being some kind of combination of existing filters.

(Update: 2/13/2012)

For this reason, I've created an open source framework for iOS called GPUImage, which lets you create custom filters to be applied to images and video using OpenGL ES 2.0 fragment shaders. I describe more about how this framework operates in my post on the topic. Basically, you supply your own custom OpenGL Shading Language (GLSL) fragment shader to create a custom filter, and then run that filter against static images or live video. The framework is compatible with all iOS devices that support OpenGL ES 2.0 and can be used in applications that target iOS 4.0.

For example, you can set up filtering of live video using code like the following:

GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageFilter *customFilter = [[GPUImageFilter alloc] initWithFragmentShaderFromFile:@"CustomShader"];
GPUImageView *filteredVideoView = [[GPUImageView alloc] initWithFrame:CGRectMake(0.0, 0.0, viewWidth, viewHeight)];

// Add the view somewhere so it's visible

[videoCamera addTarget:customFilter];
[customFilter addTarget:filteredVideoView];

[videoCamera startCameraCapture];

As an example of a custom fragment shader program that defines a filter, the following applies a sepia tone effect:

varying highp vec2 textureCoordinate;

uniform sampler2D inputImageTexture;

void main()
{
    lowp vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
    lowp vec4 outputColor;
    outputColor.r = (textureColor.r * 0.393) + (textureColor.g * 0.769) + (textureColor.b * 0.189);
    outputColor.g = (textureColor.r * 0.349) + (textureColor.g * 0.686) + (textureColor.b * 0.168);    
    outputColor.b = (textureColor.r * 0.272) + (textureColor.g * 0.534) + (textureColor.b * 0.131);

    gl_FragColor = outputColor;
}

The language used for writing custom Core Image kernels on the Mac is very similar to GLSL. In fact, you'll be able to do a few things that you can't in desktop Core Image, because Core Image's kernel language lacks some things that GLSL has (like branching).

一花一树开 2025-01-03 11:24:18

The originally accepted answer is deprecated. As of iOS 8, you can create custom kernels for filters. You can find more information about this in:

十级心震 2025-01-03 11:24:18

OUTDATED

You can't create your own custom kernels/filters in iOS yet. See http://developer.apple.com/library/mac/#documentation/graphicsimaging/Conceptual/CoreImaging/ci_intro/ci_intro.html, specifically:

Although this document is included in the reference library, it has
not been updated in detail for iOS 5.0. A forthcoming revision will
detail the differences in Core Image on iOS. In particular, the key
difference is that Core Image on iOS does not include the ability to
create custom image filters.

(Bolding mine)

清风无影 2025-01-03 11:24:18


You can create custom filters for iOS easier than an Image Unit plug-in for MacOS X, so much so that they would be preferred, even if Image Unit plug-ins were supported by iOS. The problem is you cannot actually "package" them or otherwise bundle them as a resource like Image Unit plug-ins; you have to expose your source code to developers that use them. Moreover, they are only useful to developers; you cannot distribute them to end-users of iOS graphics apps the same way you can for MacOS X graphics apps that import third-party Core Image filters. For that, you must embed them in a Photo Editing Extension.

Still, even processing images with a custom Core Image filter for iOS is easier than with an Image Unit plug-in. There's no importing, followed by the confusing task of configuring .plist and description files and what-not.

A custom Core Image filter for iOS is simply a Cocoa Touch Class that is a subclass of CIFilter; in it, you specify input parameters (always at least the image), custom attributes settings and their defaults, and then any combination of built-in or custom Core Image filters. If you want to add an OpenGL kernel to the image-processing pipeline, you simply add a CIKernel method, which loads the .cikernel you write in a separate file.

The beauty of this particular method for developing a custom Core Image Filter for iOS is that custom filters are instantiated and called the same way as built-in filters:

CIFilter* PrewittKernel = [CIFilter filterWithName:@"PrewittKernel"];

CIImage *result = [CIFilter filterWithName:@"PrewittKernel" keysAndValues:kCIInputImageKey, self.inputImage, nil].outputImage;

Here's a simple example that applies the Prewitt operator to an image; first, the Cocoa Touch class (subclassing CIFilter), then the CIKernel file (written in the Core Image Kernel Language, a GLSL-based dialect):

The header file:

//
//  PrewittKernel.h
//  Photo Filter
//
//  Created by James Alan Bush on 5/23/15.
//
//

#import <CoreImage/CoreImage.h>

@interface PrewittKernel : CIFilter
{
    CIImage *inputImage;
}

@property (retain, nonatomic) CIImage *inputImage;

@end

The implementation file:

//
//  PrewittKernel.m
//  Photo Filter
//
//  Created by James Alan Bush on 5/23/15.
//
//

#import <CoreImage/CoreImage.h>

@interface PrewittKernel : CIFilter
{
    CIImage *inputImage;
}

@property (retain, nonatomic) CIImage *inputImage;

@end


@implementation PrewittKernel

@synthesize inputImage;

- (CIKernel *)prewittKernel
{
    static CIKernel *kernelPrewitt = nil;
    static dispatch_once_t onceToken;

    dispatch_once(&onceToken, ^{
        // Load the kernel source from PrewittKernel.cikernel once and
        // cache the compiled kernel; reading the file outside the
        // dispatch_once block would reload it on every call.
        NSBundle *bundle = [NSBundle bundleForClass:[self class]];
        NSString *path = [bundle pathForResource:@"PrewittKernel" ofType:@"cikernel"];
        NSError *error = nil;
        NSString *code = [NSString stringWithContentsOfFile:path encoding:NSUTF8StringEncoding error:&error];
        kernelPrewitt = [CIKernel kernelWithString:code];
    });

    return kernelPrewitt;
}

- (CIImage *)outputImage
{
    CIImage *result = self.inputImage;
    return [[self prewittKernel] applyWithExtent:result.extent roiCallback:^CGRect(int index, CGRect rect) {
        return CGRectMake(0, 0, CGRectGetWidth(result.extent), CGRectGetHeight(result.extent));
    } arguments:@[result]];
}

@end

The CIKernel (Core Image Kernel Language, a GLSL-based dialect):

/* PrewittKernel.cikernel */

kernel vec4 prewittKernel(sampler image)
{
    vec2 xy = destCoord();
    vec4 bottomLeftIntensity = sample(image, samplerTransform(image, xy + vec2(-1, -1)));
    vec4 topRightIntensity = sample(image, samplerTransform(image, xy + vec2(+1, +1)));
    vec4 topLeftIntensity = sample(image, samplerTransform(image, xy + vec2(+1, -1)));
    vec4 bottomRightIntensity = sample(image, samplerTransform(image, xy + vec2(-1, +1)));
    vec4 leftIntensity = sample(image, samplerTransform(image, xy + vec2(-1, 0)));
    vec4 rightIntensity = sample(image, samplerTransform(image, xy + vec2(+1, 0)));
    vec4 bottomIntensity = sample(image, samplerTransform(image, xy + vec2(0, -1)));
    vec4 topIntensity = sample(image, samplerTransform(image, xy + vec2(0, +1)));
    vec4 h = vec4(-topLeftIntensity - topIntensity - topRightIntensity + bottomLeftIntensity + bottomIntensity + bottomRightIntensity);
    vec4 v = vec4(-bottomLeftIntensity - leftIntensity - topLeftIntensity + bottomRightIntensity + rightIntensity + topRightIntensity);
    float h_max = max(h.r, max(h.g, h.b));
    float v_max = max(v.r, max(v.g, v.b));
    float mag = length(vec2(h_max, v_max)) * 1.0;

    return vec4(vec3(mag), 1.0);
}

Here's another filter that generates an unsharp mask by subtracting (or, rather, differencing) a Gaussian blurred image from the original using built-in Core Image filters—no Core Image kernel code (OpenGL); it shows how to specify and use a custom attribute, namely, the radius of the Gaussian blur:

The header file:

//
//  GaussianKernel.h
//  Chroma
//
//  Created by James Alan Bush on 7/12/15.
//  Copyright © 2015 James Alan Bush. All rights reserved.
//

#import <CoreImage/CoreImage.h>

@interface GaussianKernel : CIFilter
{
    CIImage *inputImage;
    NSNumber *inputRadius;
}

@property (retain, nonatomic) CIImage *inputImage;
@property (retain, nonatomic) NSNumber *inputRadius;

@end

The implementation file:

//
//  GaussianKernel.m
//  Chroma
//
//  Created by James Alan Bush on 7/12/15.
//  Copyright © 2015 James Alan Bush. All rights reserved.
//

#import "GaussianKernel.h"

@implementation GaussianKernel

@synthesize inputImage;
@synthesize inputRadius;

+ (NSDictionary *)customAttributes
{
    return @{
             @"inputRadius" :
                 @{
                     kCIAttributeMin       : @3.0,
                     kCIAttributeMax       : @15.0,
                     kCIAttributeDefault   : @7.5,
                     kCIAttributeType      : kCIAttributeTypeScalar
                     }
             };
}

- (void)setDefaults
{
    self.inputRadius = @7.5;
}

- (CIImage *)outputImage
{
    CIImage *result = self.inputImage;

    CGRect rect = [result extent];
    rect.origin = CGPointZero;
    CGRect cropRectLeft = CGRectMake(0, 0, rect.size.width, rect.size.height);
    CIVector *cropRect = [CIVector vectorWithX:rect.origin.x Y:rect.origin.y Z:rect.size.width W:rect.size.height];

    result = [[CIFilter filterWithName:@"CIGaussianBlur" keysAndValues:kCIInputImageKey, result, @"inputRadius", self.inputRadius, nil].outputImage imageByCroppingToRect:cropRectLeft];

    result = [CIFilter filterWithName:@"CICrop" keysAndValues:@"inputImage", result, @"inputRectangle", cropRect, nil].outputImage;

    // Difference the blurred image against the original input image
    // (differencing the blurred image with itself would yield black).
    result = [CIFilter filterWithName:@"CIDifferenceBlendMode" keysAndValues:kCIInputImageKey, result, kCIInputBackgroundImageKey, self.inputImage, nil].outputImage;

    return result;
}

@end