Capturing an OpenGL view to an AVAssetWriterInputPixelBufferAdaptor



I am trying to create an AVAssetWriter to screen-capture an OpenGL project. I have never written an AVAssetWriter or an AVAssetWriterInputPixelBufferAdaptor before, so I am not sure whether I did everything correctly.

- (id) initWithOutputFileURL:(NSURL *)anOutputFileURL {
    if ((self = [super init])) {
        NSError *error;
        movieWriter = [[AVAssetWriter alloc] initWithURL:anOutputFileURL fileType:AVFileTypeMPEG4 error:&error];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:640], AVVideoWidthKey,
                                       [NSNumber numberWithInt:480], AVVideoHeightKey,
                                       nil];
        writerInput = [[AVAssetWriterInput
                        assetWriterInputWithMediaType:AVMediaTypeVideo
                        outputSettings:videoSettings] retain];
        writer = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                  initWithAssetWriterInput:writerInput
                  sourcePixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                               kCVPixelBufferPixelFormatTypeKey,
                                               nil]];

        [movieWriter addInput:writerInput];
        writerInput.expectsMediaDataInRealTime = YES;
    }

    return self;
}

Other parts of the class:

- (void)getFrame:(CVPixelBufferRef)SampleBuffer:(int64_t)frame{
    frameNumber = frame;
    [writer appendPixelBuffer:SampleBuffer withPresentationTime:CMTimeMake(frame, 24)]; 
}

- (void)startRecording {
   [movieWriter startWriting];
   [movieWriter startSessionAtSourceTime:kCMTimeZero];
}

- (void)stopRecording {
   [writerInput markAsFinished];
   [movieWriter endSessionAtSourceTime:CMTimeMake(frameNumber, 24)];
   [movieWriter finishWriting];
}
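
Note that appendPixelBuffer:withPresentationTime: returns a BOOL that this code ignores, and with expectsMediaDataInRealTime set the input can be not ready when a frame arrives. A minimal sketch of the same getFrame: method with those checks added (the logging is illustrative, the ivars are the ones above):

    - (void)getFrame:(CVPixelBufferRef)sampleBuffer :(int64_t)frame {
        frameNumber = frame;
        // A real-time input may temporarily refuse data; check before appending.
        if ([writerInput isReadyForMoreMediaData]) {
            if (![writer appendPixelBuffer:sampleBuffer
                      withPresentationTime:CMTimeMake(frame, 24)]) {
                NSLog(@"appendPixelBuffer failed: %@", movieWriter.error);
            }
        } else {
            NSLog(@"Dropped frame %lld: input not ready", frame);
        }
    }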

The asset writer is initialized by:

    NSURL *outputFileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
    recorder = [[GLRecorder alloc] initWithOutputFileURL:outputFileURL];
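
One detail that matters here: AVAssetWriter will not overwrite an existing file, so a leftover output.mov from an earlier run makes startWriting fail without any console output. A minimal sketch, assuming the same outputFileURL, removes any stale file before creating the recorder:

    // Sketch: AVAssetWriter cannot overwrite an existing file at its output
    // URL, so delete any previous recording first.
    [[NSFileManager defaultManager] removeItemAtURL:outputFileURL error:NULL];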

The view is recorded this way:

    glReadPixels(0, 0, 480, 320, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    // Flip the frame vertically: glReadPixels returns rows bottom-to-top,
    // while the pixel buffer expects them top-to-bottom.
    for (int y = 0; y < 320; y++) {
        for (int x = 0; x < 480 * 4; x++) {
            int b2 = ((320 - 1 - y) * 480 * 4 + x);
            int b1 = (y * 4 * 480 + x);
            buffer2[b2] = buffer[b1];
        }
    }
    pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(NULL, 480, 320, kCVPixelFormatType_32BGRA, buffer2,
                                 1920, NULL, 0, NULL, &pixelBuffer);
    [recorder getFrame:pixelBuffer :framenumber];
    framenumber++;

Note:

pixelBuffer is a CVPixelBufferRef.
framenumber is an int64_t.
buffer and buffer2 are GLubyte buffers.

I get no errors, but when I finish recording there is no file. Any help or links to help would be greatly appreciated. The OpenGL view renders a live feed from the camera. I've been able to save the screen as a UIImage, but I want to get a movie of what I created.
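
One way to surface what is going wrong: AVAssetWriter signals most failures through BOOL return values and its status and error properties rather than console output. A minimal diagnostic sketch of the same stopRecording method with those checks (same ivars as above):

    - (void)stopRecording {
        [writerInput markAsFinished];
        [movieWriter endSessionAtSourceTime:CMTimeMake(frameNumber, 24)];
        if (![movieWriter finishWriting]) {
            // On failure, status is AVAssetWriterStatusFailed and error says why.
            NSLog(@"finishWriting failed: %@", movieWriter.error);
        }
    }

The NSError from initWithURL:fileType:error: is worth logging too; it would show, for example, whether the .mov extension is being rejected for AVFileTypeMPEG4.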


Comments (1)

白云悠悠 2024-12-21 10:58:21


If you're writing RGBA frames, I think you may need to use an AVAssetWriterInputPixelBufferAdaptor to write them out. This class is supposed to manage a pool of pixel buffers, but I get the impression that it actually massages your data into YUV.

If that works, then I think you'll find that your colours are all swapped, at which point you'll probably have to write a pixel shader to convert them to BGRA. Or (shudder) do it on the CPU. Up to you.
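
To illustrate both suggestions, a rough sketch, assuming the question's writer adaptor, its 480x320 RGBA buffer, and the frameNumber counter (note the adaptor's pixelBufferPool is nil until startWriting and startSessionAtSourceTime: have run): draw a buffer from the pool and do the RGBA-to-BGRA swap on the CPU while flipping the rows:

    CVPixelBufferRef pb = NULL;
    // Reuse buffers from the adaptor's pool instead of CVPixelBufferCreateWithBytes.
    CVReturn result = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                         writer.pixelBufferPool, &pb);
    if (result == kCVReturnSuccess && pb != NULL) {
        CVPixelBufferLockBaseAddress(pb, 0);
        uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(pb);
        size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(pb);
        for (int y = 0; y < 320; y++) {
            // Flip vertically while copying: glReadPixels rows run bottom-to-top.
            uint8_t *srcRow = buffer + (320 - 1 - y) * 480 * 4;
            uint8_t *dstRow = dst + y * dstBytesPerRow;
            for (int x = 0; x < 480; x++) {
                dstRow[x * 4 + 0] = srcRow[x * 4 + 2]; // B
                dstRow[x * 4 + 1] = srcRow[x * 4 + 1]; // G
                dstRow[x * 4 + 2] = srcRow[x * 4 + 0]; // R
                dstRow[x * 4 + 3] = srcRow[x * 4 + 3]; // A
            }
        }
        CVPixelBufferUnlockBaseAddress(pb, 0);
        [writer appendPixelBuffer:pb withPresentationTime:CMTimeMake(frameNumber, 24)];
        CVPixelBufferRelease(pb);
    }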
