How to extract all video frames from a QuickTime movie in YV12 (YUV420) format, in real time?



I have a QTMovie open in QTKit.

I need to get each frame of this video in YV12 format (kYUV420PixelFormat), in real time (i.e. I'm passing it to foreign code which only accepts YV12 and needs to play the video in real time).

It seems The Way It Should Be Done is to call [movie frameImageAtTime: [movie currentTime] withAttributes: error:] for the current frame, then [movie stepForward] to get to the next frame, and so on until I have all the frames (a sketch of that loop follows the list below). However, as much as I look into it, I can't find a way to make QTKit give me the data in YV12 format, or any other YUV format. The frameImageAtTime: call can convert the frame to:

  • NSImage (but NSImage can't store planar YUV),
  • CGImage (same thing),
  • CIImage (same thing),
  • CVPixelBuffer (this one can store YUV, but there seems to be no way to configure the call to get YUV from it. By default it returns ARGB32 data)
  • OpenGL texture (this probably can be configured as well, but I don't need this data in OpenGL, I need it in memory)
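For concreteness, here is a minimal sketch of the stepping loop I mean, requesting a CVPixelBufferRef. The function name and the handler block are mine, and I'm assuming stepForward simply stops advancing currentTime at the last frame; as described above, the buffers come back as ARGB32, not YUV.

#import <QTKit/QTKit.h>
#import <CoreVideo/CoreVideo.h>

// Walk the movie frame by frame and hand each decoded frame to a callback.
// In practice the returned CVPixelBuffer is k32ARGBPixelFormat.
static void EnumerateFrames(QTMovie *movie, void (^handler)(CVPixelBufferRef))
{
    NSDictionary *attrs = [NSDictionary dictionaryWithObjectsAndKeys:
        QTMovieFrameImageTypeCVPixelBufferRef, QTMovieFrameImageType,
        [NSNumber numberWithBool:YES], QTMovieFrameImageHighQuality,
        nil];

    [movie gotoBeginning];
    QTTime lastTime = QTMakeTime(-1, 600);

    // Stop once stepForward no longer moves currentTime (assumed end of movie).
    while (QTTimeCompare([movie currentTime], lastTime) != NSOrderedSame) {
        lastTime = [movie currentTime];

        NSError *error = nil;
        CVPixelBufferRef buffer =
            (CVPixelBufferRef)[movie frameImageAtTime:lastTime
                                       withAttributes:attrs
                                                error:&error];
        if (buffer)
            handler(buffer);       // caller converts / consumes the frame

        [movie stepForward];       // advance to the next frame
    }
}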

So it seems that the only way to use this supposedly new and optimized QTKit technology is to get ARGB data from it, convert each frame to YV12 with custom code (something like the sketch below), and hope it will still be fast enough for real time. Or am I missing something?
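By "custom code" I mean a plain per-pixel conversion along these lines. This is only a naive sketch using textbook BT.601 integer coefficients, with chroma taken from the top-left pixel of each 2x2 block; convertARGBToYV12 is my own hypothetical helper, not an existing API.

#import <CoreVideo/CoreVideo.h>

// Naive ARGB32 -> YV12 (planar YUV 4:2:0, plane order Y, V, U) conversion.
// dst must be width*height*3/2 bytes; width and height are assumed even.
static void convertARGBToYV12(CVPixelBufferRef src, uint8_t *dst)
{
    CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);

    size_t width  = CVPixelBufferGetWidth(src);
    size_t height = CVPixelBufferGetHeight(src);
    size_t stride = CVPixelBufferGetBytesPerRow(src);
    const uint8_t *argb = CVPixelBufferGetBaseAddress(src);

    uint8_t *yPlane = dst;
    uint8_t *vPlane = dst + width * height;                 // YV12: V plane first
    uint8_t *uPlane = vPlane + (width / 2) * (height / 2);  // then U plane

    for (size_t y = 0; y < height; y++) {
        const uint8_t *row = argb + y * stride;             // bytes are A,R,G,B
        for (size_t x = 0; x < width; x++) {
            int r = row[4 * x + 1], g = row[4 * x + 2], b = row[4 * x + 3];

            // BT.601 video-range luma
            yPlane[y * width + x] = (uint8_t)((66 * r + 129 * g + 25 * b + 128) / 256 + 16);

            if ((x % 2 == 0) && (y % 2 == 0)) {             // subsample chroma 2x2
                size_t ci = (y / 2) * (width / 2) + (x / 2);
                uPlane[ci] = (uint8_t)((-38 * r -  74 * g + 112 * b + 128) / 256 + 128);
                vPlane[ci] = (uint8_t)((112 * r -  94 * g -  18 * b + 128) / 256 + 128);
            }
        }
    }

    CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
}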

In old QuickTime it was relatively easy to set up a GWorld with kYUV420PixelFormat, have the Movie render into it, and it just worked (roughly as in the sketch below). But the old QuickTime calls are legacy, deprecated calls, not to be used anymore...
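For comparison, this is roughly what that legacy path looked like, written from memory with all error handling omitted; a sketch of the deprecated approach, not code I'd ship today.

#include <QuickTime/QuickTime.h>

// Deprecated approach: render the Movie into an offscreen GWorld whose
// pixel format is kYUV420PixelFormat, then read the pixels back directly.
static void RenderFrameToYUV420(Movie movie, TimeValue frameTime)
{
    Rect box;
    GetMovieBox(movie, &box);

    GWorldPtr gworld = NULL;
    QTNewGWorld(&gworld, kYUV420PixelFormat, &box, NULL, NULL, 0);

    SetMovieGWorld(movie, gworld, NULL);
    SetMovieTimeValue(movie, frameTime);
    UpdateMovie(movie);
    MoviesTask(movie, 0);                     // let QuickTime draw the frame

    PixMapHandle pixMap = GetGWorldPixMap(gworld);
    LockPixels(pixMap);
    Ptr baseAddr = GetPixBaseAddr(pixMap);    // planar YUV 4:2:0 frame data
    // ... hand baseAddr to the YV12 consumer ...
    UnlockPixels(pixMap);

    DisposeGWorld(gworld);
}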

What should I do to get YUV420 planar frames without the unnecessary conversions?


Comments (1)

残月升风 2024-11-08 05:30:21


Based on this thread on one of the old Apple mailing lists, I'd say this at least used to be impossible. I'm trying now to find out whether it can be done with a lower-level API (a rough sketch of what I have in mind follows the quoted thread below).

http://lists.apple.com/archives/quicktime-api/2008/Nov/msg00049.html

On Nov 5, 2008, at 12:08 PM, Neil Clayton wrote:

I'd like to get a YUV frame out of a movie (the movie is encoded in
YUV, the frames were originally k2vuyPixelFormat and the encoder
format would have been compatible with that - e.g: H264 or AIC etc).

When I do this:

NSError *error = nil;
NSDictionary *dict = [NSDictionary dictionaryWithObjectsAndKeys:
    QTMovieFrameImageTypeCVPixelBufferRef, QTMovieFrameImageType,
    [NSNumber numberWithBool:YES], QTMovieFrameImageHighQuality,
    nil];
CVPixelBufferRef buffer = [qtMovie frameImageAtTime:QTMakeTime(lastFrame, movie.timeScale)
    withAttributes:dict error:&error];

The frame appears valid. It has a correct width and height. But it
seems to be of type k32ARGBPixelFormat when I do:

OSType type = CVPixelBufferGetPixelFormatType(buffer);

Presuming I'm doing this the wrong way - what's the correct method for
getting a frame of type k2vuyPixelFormat from a movie?
Or if this isn't possible, what's the easiest way to perform a RGB-
YUV conversion into a CVPixelBuffer of type k2vuyPixelFormat? I
don't need speed here (it's a one off, one frame operation).

On Nov 7, 2008, Tim Monroe of QuickTime Engineering responds:

Currently there is no way to do what you want via frameImageAtTime. I would suggest filing an enhancement request.
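The lower-level route I'm currently looking at is the QuickTime visual-context API underneath QTKit, along the lines of the sketch below. This is untested and purely an assumption on my part: the calls exist, but I don't know yet whether the decoder will actually honor a request for planar YUV 4:2:0 output.

#import <QTKit/QTKit.h>
#import <QuickTime/QuickTime.h>

// Attach a pixel-buffer visual context that asks for planar YUV 4:2:0 frames,
// instead of letting QTKit pick the pixel format for us.
static QTVisualContextRef AttachYUVContext(QTMovie *qtMovie, size_t width, size_t height)
{
    NSDictionary *pixelBufferAttrs = [NSDictionary dictionaryWithObjectsAndKeys:
        [NSNumber numberWithUnsignedInt:kYUV420PixelFormat],
            (id)kCVPixelBufferPixelFormatTypeKey,
        [NSNumber numberWithUnsignedLong:width],  (id)kCVPixelBufferWidthKey,
        [NSNumber numberWithUnsignedLong:height], (id)kCVPixelBufferHeightKey,
        nil];
    NSDictionary *contextAttrs = [NSDictionary
        dictionaryWithObject:pixelBufferAttrs
                      forKey:(id)kQTVisualContextPixelBufferAttributesKey];

    QTVisualContextRef context = NULL;
    QTPixelBufferContextCreate(kCFAllocatorDefault,
                               (CFDictionaryRef)contextAttrs, &context);

    // Route the movie's video output into our context instead of a view.
    SetMovieVisualContext([qtMovie quickTimeMovie], context);
    return context;
}

// Then, on each timer or display-link tick while the movie plays:
//   if (QTVisualContextIsNewImageAvailable(context, NULL)) {
//       CVImageBufferRef frame = NULL;
//       QTVisualContextCopyImageForTime(context, kCFAllocatorDefault, NULL, &frame);
//       // check CVPixelBufferGetPixelFormatType(frame), hand the planes off, then:
//       CVBufferRelease(frame);
//       QTVisualContextTask(context);
//   }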
