Does a Quartz CIFilter produce different output depending on whether it is executed on the GPU or the CPU?



I'm stuck on a problem with CIFilters and how to apply them. Either the filter is applied to only 1/3 of the image (GPU), or the filter works only once (CPU). Redoing the same operation on the CPU yields the 1/3 image; redoing it once more with CPU calculations, the CIFilter does not work / is not applied.

I tried to add sample pictures, but as a new user here I'm not allowed to. Please have a look at my original post at Apple: https://discussions.apple.com/message/16412850#16412850 . Sorry for the inconvenience.

Questions:

  • Why is the CIFilter not working deterministically on the CPU?
  • Why does the GPU renderer stop exactly after one third of the image?
  • Is there a way to debug a CIFilter? Quartz Composer only allows this for given kernel code, but is there a debugger/logger? (A small CPU-vs-GPU comparison harness is sketched right after this list.)
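
To at least make the CPU-vs-GPU comparison reproducible, I put together the tiny harness below. It is only a sketch: it assumes wantsSoftRenderer is a plain BOOL instance variable that can be toggled between runs, and it calls the createCGImageAtPathNamed: method shown further down.

// Sketch of a comparison harness: render the same filter graph once with the
// Core Image software renderer and once on the GPU, and keep both JPEGs so
// the results can be compared. Assumes wantsSoftRenderer is a BOOL ivar.
- (void)writeComparisonImagesToFolder:(NSString*)folder
{
    wantsSoftRenderer = YES;    // kCIContextUseSoftwareRenderer = YES -> CPU path
    [self createCGImageAtPathNamed: [folder stringByAppendingPathComponent: @"cpu.jpg"]];

    wantsSoftRenderer = NO;     // GPU path
    [self createCGImageAtPathNamed: [folder stringByAppendingPathComponent: @"gpu.jpg"]];
}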

Thanks in advance,
Dirk

PS: I've also tried switching the graphics card, but that didn't help either.

Here is the code; parts of it are derived from Apple's ImageApp sample.

- (void)createCGImageAtPathNamed:(NSString*)destPath
{
    if (false == didInit)
        return ;

    CGColorSpaceRef myColorspace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);


    size_t height = CGImageGetHeight(leftCGImageRef);
    size_t width = CGImageGetWidth(leftCGImageRef);
    CGRect rect = {{0,0}, {width, height}};
    size_t bitsPerComponent = 8;
    size_t bytesPerRow = rect.size.width*4;         //bytes per row - one byte each for argb
    bytesPerRow += (16 - bytesPerRow%16)%16;    

    CGImageAlphaInfo alphaInfo = kCGImageAlphaPremultipliedFirst;

    CGContextRef context = CGBitmapContextCreate(nil, width, height, bitsPerComponent, bytesPerRow, myColorspace, alphaInfo);

    NSDictionary *contextOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithBool: wantsSoftRenderer],kCIContextUseSoftwareRenderer,nil];


    CIContext* cicontext = [CIContext contextWithCGContext: context options: contextOptions];
    CIImage *ciimgLeft = leftCIImage;
    CIImage *ciimgRight = rightCIImage;


    CIFilter *mFilter;

    [CIPlugIn loadAllPlugIns];
    mFilter = [[CIFilter filterWithName: @"CILightenBlendMode"] retain];
    [mFilter setValue: ciimgLeft forKey: @"inputImage"];
    [mFilter setValue: ciimgRight forKey: @"inputBackgroundImage"];

    CIImage* resultingImage = [mFilter valueForKey: kCIOutputImageKey];

    CGRect extent = [ciimgLeft extent];

    [cicontext drawImage: resultingImage inRect:rect fromRect:extent];

    CGImageRef image = CGBitmapContextCreateImage(context);

    CGContextRelease(context);

    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage: image];
    NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:nil];

    [jpegData writeToFile: destPath atomically: YES];
    [bitmapRep release];
    CGImageRelease(image);      // balance CGBitmapContextCreateImage
    [mFilter release];
    CGColorSpaceRelease(myColorspace);
}
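
In case it helps to narrow things down, here is an untested variant of the method above. It skips the drawImage:inRect:fromRect: / CGBitmapContextCreateImage round trip and instead asks the CIContext for the finished CGImage via createCGImage:fromRect:. It reuses the same instance variables (didInit, leftCGImageRef, leftCIImage, rightCIImage, wantsSoftRenderer); the method name is only made up for this sketch.

// Untested sketch: same filter graph, but the CIContext renders the CGImage
// directly via createCGImage:fromRect: instead of drawing into the bitmap
// context and reading it back with CGBitmapContextCreateImage.
- (void)createCGImageDirectlyAtPathNamed:(NSString*)destPath
{
    if (false == didInit)
        return;

    CGColorSpaceRef myColorspace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);

    size_t height = CGImageGetHeight(leftCGImageRef);
    size_t width  = CGImageGetWidth(leftCGImageRef);
    size_t bytesPerRow = width * 4;             // one byte each for ARGB
    bytesPerRow += (16 - bytesPerRow % 16) % 16;

    // The bitmap context is only needed to obtain a CIContext here; the
    // pixels themselves come from createCGImage:fromRect: below.
    CGContextRef context = CGBitmapContextCreate(nil, width, height, 8, bytesPerRow,
                                                 myColorspace, kCGImageAlphaPremultipliedFirst);

    NSDictionary *contextOptions = [NSDictionary dictionaryWithObjectsAndKeys:
                                     [NSNumber numberWithBool: wantsSoftRenderer], kCIContextUseSoftwareRenderer, nil];
    CIContext* cicontext = [CIContext contextWithCGContext: context options: contextOptions];

    CIFilter *filter = [CIFilter filterWithName: @"CILightenBlendMode"];
    [filter setValue: leftCIImage forKey: @"inputImage"];
    [filter setValue: rightCIImage forKey: @"inputBackgroundImage"];

    CIImage* resultingImage = [filter valueForKey: kCIOutputImageKey];

    // The returned CGImageRef follows the Create rule, so it is released below.
    CGImageRef image = [cicontext createCGImage: resultingImage
                                       fromRect: [leftCIImage extent]];

    NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCGImage: image];
    NSData *jpegData = [bitmapRep representationUsingType:NSJPEGFileType properties:nil];
    [jpegData writeToFile: destPath atomically: YES];

    [bitmapRep release];
    CGImageRelease(image);
    CGContextRelease(context);
    CGColorSpaceRelease(myColorspace);
}

If the 1/3 artifact disappears with this variant, that would point at the CGContext drawing path rather than at the filter itself.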
