Reading a video frame by frame on iOS
I'm looking for a way to retrieve the individual frames of a video using the iOS APIs. I tried using AVAssetImageGenerator, but it seems to only provide frames to the nearest second, which is a bit too coarse for my usage.
From what I understand of the documentation, with a pipeline of AVAssetReader, AVAssetReaderOutput and CMSampleBufferGetImageBuffer I should be able to do something, but I'm stuck with a CVImageBufferRef. From there I'm looking for a way to get a CGImageRef or a UIImage, but I haven't found it.
Real-time is not needed, and the more I can stick to the provided APIs the better.
Thanks a lot!
Edit:
Based on this site: http://www.7twenty7.com/blog/2010/11/video-processing-with-av-foundation and this question: how to convert a CVImageBufferRef to UIImage, I'm nearing a solution. The problem is that the AVAssetReader stops reading after the first copyNextSampleBuffer, without giving me anything (the sampleBuffer is NULL).
The video is readable by MPMoviePlayerController. I don't understand what's wrong.
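For reference, the kind of pipeline I have in mind looks roughly like this. It is only a sketch, assuming ARC, a local file URL, a 32BGRA output setting, and that the asset actually has a video track; the names and error handling are illustrative.

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Sketch: walk the first video track frame by frame and convert each
// CVImageBufferRef into a UIImage via a CGBitmapContext.
// Assumes ARC; Core Foundation objects are still released manually.
static void ReadFrames(NSURL *videoURL)
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetTrack *videoTrack =
        [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    NSError *error = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    // Ask for BGRA so the pixel buffer can be handed straight to Core Graphics.
    NSDictionary *settings = [NSDictionary dictionaryWithObject:
        [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]
        forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    AVAssetReaderTrackOutput *output =
        [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack
                                         outputSettings:settings];
    [reader addOutput:output];
    [reader startReading];

    CMSampleBufferRef sampleBuffer = NULL;
    while ((sampleBuffer = [output copyNextSampleBuffer]) != NULL) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

        CVPixelBufferLockBaseAddress(imageBuffer, 0);
        void *base = CVPixelBufferGetBaseAddress(imageBuffer);
        size_t width = CVPixelBufferGetWidth(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

        // Wrap the BGRA pixels in a bitmap context to get a CGImage/UIImage.
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(base, width, height, 8,
                                                     bytesPerRow, colorSpace,
                                                     kCGBitmapByteOrder32Little |
                                                     kCGImageAlphaPremultipliedFirst);
        CGImageRef cgImage = CGBitmapContextCreateImage(context);
        UIImage *frame = [UIImage imageWithCGImage:cgImage];
        // ... use `frame` here ...

        CGImageRelease(cgImage);
        CGContextRelease(context);
        CGColorSpaceRelease(colorSpace);
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        CFRelease(sampleBuffer);
    }
}
```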
3 Answers
The two links above actually answer my question: the empty copyNextSampleBuffer is an issue with iOS SDK 5.0b3; it works on the device.
AVAssetImageGenerator has very loose default tolerances for the exact frame time that is grabbed. It has two properties that determine the tolerance: requestedTimeToleranceBefore and requestedTimeToleranceAfter. These default to kCMTimePositiveInfinity, so if you want exact times, set them to kCMTimeZero to get exact frames. (Grabbing exact frames may take longer than grabbing approximate frames, but you state that real-time is not an issue.)
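For example, a rough sketch of that setup (this assumes ARC; the asset URL and the requested time are placeholders):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Sketch: grab the frame at an exact time instead of a nearby keyframe.
// `videoURL` is a placeholder for your asset's URL.
static UIImage *ExactFrameAtFiveSeconds(NSURL *videoURL)
{
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator =
        [[AVAssetImageGenerator alloc] initWithAsset:asset];

    // Without these, the generator may return a frame arbitrarily far
    // from the requested time (the default tolerance is infinite).
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter  = kCMTimeZero;

    CMTime requested = CMTimeMake(150, 30);   // 5.0 s in a 30 fps timescale
    CMTime actual;
    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:requested
                                           actualTime:&actual
                                                error:&error];
    if (cgImage == NULL) {
        return nil;                           // inspect `error` in real code
    }
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return frame;
}
```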
Use AVReaderWriter. Though it's OS X Apple sample code, AVFoundation is available on both platforms with only small changes.