How do I get an NSImage from the CMSampleBuffer obtained via -captureStillImageAsynchronouslyFromConnection:completionHandler:?

Posted 2024-12-28 02:11:41

I have a Cocoa app that is intended to capture still images from a USB microscope and then do some post-processing on them before saving them to an image file. At the moment, I am stuck trying to get from the CMSampleBufferRef that's passed to my completionHandler block to an NSImage or some other representation I can manipulate and save using familiar Cocoa APIs.

I found the function imageFromSampleBuffer() in the AVFoundation docs, which purports to convert a CMSampleBuffer to a UIImage (sigh), and revised it appropriately to return an NSImage. But it does not work in this case, as the call to CMSampleBufferGetImageBuffer() returns nil.

Here is a log showing the CMSampleBuffer passed to my completion block:

2012-01-21 19:38:36.293 LabCam[1402:cb0f] CMSampleBuffer 0x100335390 retainCount: 1 allocator: 0x7fff8c78620c
     invalid = NO
     dataReady = YES
     makeDataReadyCallback = 0x0
     makeDataReadyRefcon = 0x0
     buffer-level attachments:
          com.apple.cmio.buffer_attachment.discontinuity_flags(P) = 0
          com.apple.cmio.buffer_attachment.hosttime(P) = 79631546824089
          com.apple.cmio.buffer_attachment.sequence_number(P) = 42
     formatDescription = <CMVideoFormatDescription 0x100335220 [0x7fff782fff40]> {
     mediaType:'vide' 
     mediaSubType:'jpeg' 
     mediaSpecific: {
          codecType: 'jpeg'          dimensions: 640 x 480 
     } 
     extensions: {<CFBasicHash 0x100335160 [0x7fff782fff40]>{type = immutable dict, count = 5,
entries =>
     1 : <CFString 0x7fff773dff48 [0x7fff782fff40]>{contents = "Version"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     2 : <CFString 0x7fff773dff68 [0x7fff782fff40]>{contents = "RevisionLevel"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     3 : <CFString 0x7fff7781ab08 [0x7fff782fff40]>{contents = "CVFieldCount"} = <CFNumber 0x183 [0x7fff782fff40]>{value = +1, type = kCFNumberSInt32Type}
     4 : <CFString 0x7fff773dfdc8 [0x7fff782fff40]>{contents = "FormatName"} = <CFString 0x7fff76d35fb0 [0x7fff782fff40]>{contents = Photo - JPEG"}
     5 : <CFString 0x7fff773dff88 [0x7fff782fff40]>{contents = "Vendor"} = <CFString 0x7fff773dffa8 [0x7fff782fff40]>{contents = "appl"}
}
}
}
     sbufToTrackReadiness = 0x0
     numSamples = 1
     sampleTimingArray[1] = {
          {PTS = {2388943236/30000 = 79631.441, rounded}, DTS = {INVALID}, duration = {3698/30000 = 0.123}},
     }
     sampleSizeArray[1] = {
          sampleSize = 55911,
     }
     dataBuffer = 0x100335300

It clearly appears to contain JPEG data, but how do I get at it? (Preferably keeping the associated metadata along for the ride…)
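For the record, since the buffer's mediaSubType is 'jpeg', the compressed bytes live in the sample buffer's CMBlockBuffer rather than in an image buffer, which is why CMSampleBufferGetImageBuffer() returns nil. One route (a sketch, not verified against this exact capture setup) is to copy the bytes out of the block buffer and hand them to NSImage, which decodes JPEG natively; AVCaptureStillImageOutput also offers +jpegStillImageNSDataRepresentationForJPEGSampleBuffer: for this case:

```objc
// Sketch: extract the compressed JPEG bytes from the CMBlockBuffer.
// Assumes `sampleBuffer` is the CMSampleBufferRef from the completion block.
CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
if (blockBuffer) {
    size_t length = CMBlockBufferGetDataLength(blockBuffer);
    NSMutableData *jpegData = [NSMutableData dataWithLength:length];
    if (CMBlockBufferCopyDataBytes(blockBuffer, 0, length,
                                   jpegData.mutableBytes) == kCMBlockBufferNoErr) {
        // NSImage can decode JPEG data directly.
        NSImage *image = [[NSImage alloc] initWithData:jpegData];
        // ... post-process and save `image` with the usual Cocoa APIs.
    }
}
```

Writing `jpegData` straight to disk would also preserve the JPEG stream as delivered, though any attachment-level metadata would still need to be carried over separately.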

1 Answer

七禾 2025-01-04 02:11:41

I eventually solved this with help from another code example. CMSampleBufferGetImageBuffer only returns a valid result for the uncompressed, native image formats available from the camera. So to get my program to work, I had to configure the AVCaptureStillImageOutput instance to use k32BGRAPixelFormat instead of its default (JPEG) compressed format.

session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;
imageOutput = [[AVCaptureStillImageOutput alloc] init];
// Configure imageOutput to deliver uncompressed 32-bit BGRA pixel buffers.
NSNumber * pixelFormat = [NSNumber numberWithInt:k32BGRAPixelFormat];
[imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat
                                                           forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[session addOutput:imageOutput];
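With BGRA output configured as above, CMSampleBufferGetImageBuffer returns a valid CVPixelBuffer, which Core Image can wrap without copying. A minimal sketch of the remaining step to an NSImage (assuming `sampleBuffer` is the buffer from the completion block):

```objc
// Sketch: wrap the uncompressed BGRA CVPixelBuffer in an NSImage via Core Image.
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
if (imageBuffer) {
    CIImage *ciImage = [CIImage imageWithCVImageBuffer:imageBuffer];
    NSCIImageRep *rep = [NSCIImageRep imageRepWithCIImage:ciImage];
    NSImage *image = [[NSImage alloc] initWithSize:rep.size];
    [image addRepresentation:rep];
    // `image` is now ready for post-processing and saving.
}
```

Going through NSCIImageRep avoids an intermediate bitmap copy; rendering into an NSBitmapImageRep instead would be the route if pixel-level access is needed for the post-processing step.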