How to record an off-screen NSView at 60fps

Posted on 2025-01-10 12:11:33

I want to record the video output (encoded or not) from an off-screen NSView running on macOS. I'm pretty sure there is no API to do this, however I believe it is feasible by rendering it frame-by-frame into a framebuffer.

The problem is that I can't find a way to render the view at a fast enough rate. Methods I've tried without success (tested on a MacBook M1 Pro running Monterey):

  • [view dataWithPDFInsideRect:] and [view dataWithEPSInsideRect:]: take about 200ms to execute.
  • [view.layer renderInContext:]: takes about 350ms to execute.
  • [view cacheDisplayInRect:toBitmapImageRep:]: takes about 100ms to execute.

I also tried embedding the view in a window and capturing the window. Window-capturing functions (such as CGWindowListCreateImage) are much faster, but they do not work when the window is off-screen.

Considering the view can be rendered at 60fps in a window without issue, why do these methods take so much time? Is there a method I missed for rendering an NSView into a framebuffer?


Comments (2)

夕色琉璃 2025-01-17 12:11:33

I finally found a performant way of doing it. By capturing the view this way I am able to reach 60+ fps.

NSView* view = ...;

// Reusable RGBA bitmap (8 bits per channel, 4 channels, packed rows)
// matching the view's bounds.
NSBitmapImageRep* bitmap = [
    [NSBitmapImageRep alloc]
    initWithBitmapDataPlanes:nil
    pixelsWide:view.bounds.size.width
    pixelsHigh:view.bounds.size.height
    bitsPerSample:8
    samplesPerPixel:4
    hasAlpha:YES
    isPlanar:NO
    colorSpaceName:NSCalibratedRGBColorSpace
    bitmapFormat:0
    bytesPerRow:(4 * view.bounds.size.width)
    bitsPerPixel:32
];

// Draw the view into the bitmap through a graphics context backed by it.
NSGraphicsContext* graphicsContext = [NSGraphicsContext graphicsContextWithBitmapImageRep:bitmap];
[NSGraphicsContext saveGraphicsState];
[NSGraphicsContext setCurrentContext:graphicsContext];
[view displayRectIgnoringOpacity:view.bounds inContext:graphicsContext];
[NSGraphicsContext restoreGraphicsState];

// pixels are in format [R, G, B, A, R, G, B, A, ...]
unsigned char* pixels = [bitmap bitmapData];
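
To turn the per-frame capture above into an actual recording, the draw can be repeated on a timer and each frame handed off for encoding. A minimal sketch, assuming a 60 Hz NSTimer on the main thread; the `FrameHandler` block and `StartViewCapture` function are illustrative names, not from the original answer, and the encoder (e.g. AVAssetWriter) is left out:

```objectivec
#import <AppKit/AppKit.h>

// Hypothetical callback that receives each captured frame's raw RGBA bytes.
typedef void (^FrameHandler)(unsigned char *pixels, NSInteger bytesPerRow);

// Redraws `view` into a reusable bitmap ~60 times per second and passes the
// pixels to `frameHandler`. Must run on the main thread: AppKit drawing is
// not thread-safe. Invalidate the returned timer to stop capturing.
static NSTimer *StartViewCapture(NSView *view, FrameHandler frameHandler) {
    NSBitmapImageRep *bitmap =
        [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:nil
                                                pixelsWide:(NSInteger)view.bounds.size.width
                                                pixelsHigh:(NSInteger)view.bounds.size.height
                                             bitsPerSample:8
                                           samplesPerPixel:4
                                                  hasAlpha:YES
                                                  isPlanar:NO
                                            colorSpaceName:NSCalibratedRGBColorSpace
                                              bitmapFormat:0
                                               bytesPerRow:4 * (NSInteger)view.bounds.size.width
                                              bitsPerPixel:32];
    NSGraphicsContext *ctx =
        [NSGraphicsContext graphicsContextWithBitmapImageRep:bitmap];
    return [NSTimer scheduledTimerWithTimeInterval:1.0 / 60.0
                                           repeats:YES
                                             block:^(NSTimer *timer) {
        // Same draw as above, reusing the bitmap to avoid per-frame allocation.
        [NSGraphicsContext saveGraphicsState];
        [NSGraphicsContext setCurrentContext:ctx];
        [view displayRectIgnoringOpacity:view.bounds inContext:ctx];
        [NSGraphicsContext restoreGraphicsState];
        frameHandler(bitmap.bitmapData, bitmap.bytesPerRow);
    }];
}
```

Reusing one NSBitmapImageRep across frames also matters at 60fps: allocating a fresh backing buffer per frame would add avoidable pressure to the capture loop.
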
蓝眼泪 2025-01-17 12:11:33

Another performant way may be:

NSBitmapImageRep *bm = [view bitmapImageRepForCachingDisplayInRect:view.bounds];
[view cacheDisplayInRect:view.bounds toBitmapImageRep:bm];
// CFRetain so the CGImage outlives the bitmap rep; the caller owns the
// result and must balance it with CGImageRelease.
return (CGImageRef)CFRetain( bm.CGImage );
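
To check what this capture actually produces, the returned CGImage can be written to disk with ImageIO. A minimal sketch; `SaveViewSnapshot` is an illustrative helper name, and the destination URL is supplied by the caller:

```objectivec
#import <AppKit/AppKit.h>
#import <ImageIO/ImageIO.h>

// Snapshots `view` using the approach above and writes it to a PNG file.
// Returns YES on success.
static BOOL SaveViewSnapshot(NSView *view, NSURL *destinationURL) {
    NSBitmapImageRep *bm = [view bitmapImageRepForCachingDisplayInRect:view.bounds];
    [view cacheDisplayInRect:view.bounds toBitmapImageRep:bm];
    CGImageRef image = bm.CGImage;
    if (image == NULL) return NO;

    // "public.png" is the UTI for PNG; one image, no extra properties.
    CGImageDestinationRef dest = CGImageDestinationCreateWithURL(
        (__bridge CFURLRef)destinationURL, CFSTR("public.png"), 1, NULL);
    if (dest == NULL) return NO;
    CGImageDestinationAddImage(dest, image, NULL);
    BOOL ok = CGImageDestinationFinalize(dest);
    CFRelease(dest);
    return ok;
}
```
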