Capturing an OpenGL view to an AVAssetWriterInputPixelBufferAdaptor
I am trying to create an AVAssetWriter
to screen-capture an OpenGL project. I have never written an AVAssetWriter
or an AVAssetWriterInputPixelBufferAdaptor before, so I am not sure whether I did anything correctly.
- (id)initWithOutputFileURL:(NSURL *)anOutputFileURL {
    if ((self = [super init])) {
        NSError *error;
        movieWriter = [[AVAssetWriter alloc] initWithURL:anOutputFileURL
                                                fileType:AVFileTypeMPEG4
                                                   error:&error];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:640], AVVideoWidthKey,
                                       [NSNumber numberWithInt:480], AVVideoHeightKey,
                                       nil];
        writerInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                          outputSettings:videoSettings] retain];
        writer = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                  initWithAssetWriterInput:writerInput
                  sourcePixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                               kCVPixelBufferPixelFormatTypeKey,
                                               nil]];
        [movieWriter addInput:writerInput];
        writerInput.expectsMediaDataInRealTime = YES;
    }
    return self;
}
Other parts of the class:
- (void)getFrame:(CVPixelBufferRef)SampleBuffer :(int64_t)frame {
    frameNumber = frame;
    [writer appendPixelBuffer:SampleBuffer
         withPresentationTime:CMTimeMake(frame, 24)];
}

- (void)startRecording {
    [movieWriter startWriting];
    [movieWriter startSessionAtSourceTime:kCMTimeZero];
}

- (void)stopRecording {
    [writerInput markAsFinished];
    [movieWriter endSessionAtSourceTime:CMTimeMake(frameNumber, 24)];
    [movieWriter finishWriting];
}
The asset writer is initialized by:
NSURL *outputFileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
recorder = [[GLRecorder alloc] initWithOutputFileURL:outputFileURL];
The view is recorded this way:
glReadPixels(0, 0, 480, 320, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
for (int y = 0; y < 320; y++) {
    for (int x = 0; x < 480 * 4; x++) {
        int b2 = ((320 - 1 - y) * 480 * 4 + x);
        int b1 = (y * 4 * 480 + x);
        buffer2[b2] = buffer[b1];
    }
}
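The flip loop above copies one byte at a time; since each row is contiguous in memory, the same vertical flip can be done with one memcpy per row. A minimal C sketch under the question's assumptions (RGBA pixels, so 4 bytes each; the function name is mine):

```c
#include <string.h>

/* Flip an RGBA image vertically: one memcpy per row instead of
 * one assignment per byte. */
static void flip_rows(const unsigned char *src, unsigned char *dst,
                      int width, int height) {
    size_t bytesPerRow = (size_t)width * 4;   /* 4 bytes per RGBA pixel */
    for (int y = 0; y < height; y++) {
        memcpy(dst + (size_t)(height - 1 - y) * bytesPerRow,
               src + (size_t)y * bytesPerRow,
               bytesPerRow);
    }
}
```

For the question's 480x320 frame this would be flip_rows(buffer, buffer2, 480, 320).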
pixelBuffer = NULL;
CVPixelBufferCreateWithBytes(NULL, 480, 320, kCVPixelFormatType_32BGRA,
                             buffer2, 1920, NULL, 0, NULL, &pixelBuffer);
[recorder getFrame:pixelBuffer :framenumber];
framenumber++;
Note: pixelBuffer is a CVPixelBufferRef, framenumber is an int64_t, and buffer and buffer2 are GLubyte buffers.
I get no errors, but when I finish recording there is no file. Any help or links to help would be greatly appreciated. The OpenGL view shows a live feed from the camera. I've been able to save the screen as a UIImage,
but I want to get a movie of what I created.
Answer:
If you're writing RGBA frames, I think you may need to use an AVAssetWriterInputPixelBufferAdaptor to write them out. This class is supposed to manage a pool of pixel buffers, but I get the impression that it actually massages your data into YUV. If that works, then I think you'll find that your colours are all swapped, at which point you'll probably have to write a pixel shader to convert them to BGRA. Or (shudder) do it on the CPU. Up to you.
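On the swapped-colours point: glReadPixels in the question returns GL_RGBA bytes, while the pixel buffer is declared kCVPixelFormatType_32BGRA, so red and blue end up exchanged. The CPU-side fix is just swapping bytes 0 and 2 of every 4-byte pixel. A minimal sketch (the function name is mine):

```c
/* Convert RGBA bytes to BGRA in place by swapping the R and B
 * channels of each 4-byte pixel. */
static void rgba_to_bgra(unsigned char *pixels, int pixelCount) {
    for (int i = 0; i < pixelCount; i++) {
        unsigned char r = pixels[i * 4 + 0];
        pixels[i * 4 + 0] = pixels[i * 4 + 2];  /* B moves to byte 0 */
        pixels[i * 4 + 2] = r;                  /* R moves to byte 2 */
    }
}
```

For the question's 480x320 frame this would be rgba_to_bgra(buffer2, 480 * 320), run after the vertical flip and before CVPixelBufferCreateWithBytes.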