How to record an off-screen NSView at 60fps
I want to record the video output (encoded or not) from an off-screen NSView running on macOS. I'm pretty sure there is no API to do this, however I believe it is feasible by rendering it frame-by-frame into a framebuffer.
The problem is that I can't find a way to render the view at a fast enough rate. Methods I've tried without success (tested on a MacBook M1 Pro running Monterey):

- [view dataWithPDFInsideRect:] and [view dataWithEPSInsideRect:]: each takes about 200 ms to execute.
- [view.layer renderInContext:]: takes about 350 ms to execute.
- [view cacheDisplayInRect:toBitmapImageRep:]: takes about 100 ms to execute.
I also tried embedding the view in a window and capturing the window. Window-capturing functions (such as CGWindowListCreateImage) are much faster, but they do not work when the window is off-screen.

Considering the view can be rendered at 60 fps in a window without issue, why do these methods take so much time? Is there a method I missed for rendering an NSView into a framebuffer?
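For reference, the timings above can be reproduced with a snippet like the following, which times one snapshot through cacheDisplayInRect: (the fastest of the CPU-bound options listed). This is an illustrative sketch, not code from the question; BenchmarkSnapshot is an invented name.

```objc
#import <AppKit/AppKit.h>

// Sketch: measure how long one off-screen snapshot of a view takes.
static void BenchmarkSnapshot(NSView *view) {
    NSBitmapImageRep *rep =
        [view bitmapImageRepForCachingDisplayInRect:view.bounds];
    NSDate *start = [NSDate date];
    [view cacheDisplayInRect:view.bounds toBitmapImageRep:rep];
    NSLog(@"snapshot took %.0f ms", -start.timeIntervalSinceNow * 1000.0);
}
```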
Answers (2)
I finally found a performant way of doing it. By capturing the view this way I am able to reach 60+ fps.
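The answer's code snippet did not survive extraction here. One approach known to reach 60+ fps for a layer-backed off-screen view is rendering its layer with CARenderer into a Metal texture on the GPU, avoiding the slow CPU drawing paths listed in the question. The sketch below is a reconstruction under that assumption (it is not necessarily the answerer's original code); it assumes the view is layer-backed and its layer tree has been laid out, and RenderViewToTexture is an invented name.

```objc
#import <AppKit/AppKit.h>
#import <Metal/Metal.h>
#import <QuartzCore/QuartzCore.h>

// Sketch: render an off-screen NSView's layer into a Metal texture
// with CARenderer; the pixels can then be read back or encoded.
static id<MTLTexture> RenderViewToTexture(NSView *view, id<MTLDevice> device) {
    view.wantsLayer = YES;
    CGSize size = view.bounds.size;

    MTLTextureDescriptor *desc = [MTLTextureDescriptor
        texture2DDescriptorWithPixelFormat:MTLPixelFormatBGRA8Unorm
                                     width:(NSUInteger)size.width
                                    height:(NSUInteger)size.height
                                 mipmapped:NO];
    desc.usage = MTLTextureUsageRenderTarget | MTLTextureUsageShaderRead;
    id<MTLTexture> texture = [device newTextureWithDescriptor:desc];

    CARenderer *renderer = [CARenderer rendererWithMTLTexture:texture
                                                      options:nil];
    renderer.layer = view.layer;
    renderer.bounds = view.bounds;

    [renderer beginFrameAtTime:CACurrentMediaTime() timeStamp:NULL];
    [renderer addUpdateRect:view.bounds];
    [renderer render];
    [renderer endFrame];
    return texture;
}
```

Driving this from a CVDisplayLink and feeding the resulting textures into an AVAssetWriter would give an encoded recording.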
Another performant way may be:
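This answer's snippet is also missing from the extracted page. One plausible alternative on macOS 12.3+ is ScreenCaptureKit: an SCContentFilter created in desktop-independent-window mode captures a specific window's content even when it is occluded (whether it keeps delivering frames for a window moved entirely off-screen is worth verifying on your target OS). The sketch below is an assumption, not the answerer's code; FrameSink is an invented class name.

```objc
#import <ScreenCaptureKit/ScreenCaptureKit.h>

// Sketch: receive frames from a ScreenCaptureKit stream.
@interface FrameSink : NSObject <SCStreamOutput>
@end

@implementation FrameSink
- (void)stream:(SCStream *)stream
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
                   ofType:(SCStreamOutputType)type {
    // Each CMSampleBuffer wraps an IOSurface-backed pixel buffer;
    // hand it to an AVAssetWriterInput for encoding, or read it directly.
}
@end

static void StartCapture(SCWindow *window, FrameSink *sink) {
    SCContentFilter *filter =
        [[SCContentFilter alloc] initWithDesktopIndependentWindow:window];
    SCStreamConfiguration *config = [[SCStreamConfiguration alloc] init];
    config.minimumFrameInterval = CMTimeMake(1, 60); // request 60 fps

    SCStream *stream = [[SCStream alloc] initWithFilter:filter
                                          configuration:config
                                               delegate:nil];
    NSError *error = nil;
    [stream addStreamOutput:sink
                       type:SCStreamOutputTypeScreen
         sampleHandlerQueue:dispatch_get_global_queue(QOS_CLASS_USER_INTERACTIVE, 0)
                      error:&error];
    [stream startCaptureWithCompletionHandler:^(NSError *err) {
        if (err) NSLog(@"capture failed: %@", err);
    }];
}
```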