AVAssetWriter adds a green frame at the start of the movie output
I'm finishing an iPhone app that saves a movie to the photos album. The movie source is an array of images. It builds the movies and puts them in the photos album quite nicely, but there is always an extra green frame at the start.
Any ideas?
I've re-read Apple's docs, jiggled the wires, and ran tests with numbered images to confirm it isn't dropping a frame or anything like that. Still haven't got it working the right way round.
//build save path
NSError *error = nil;
NSString *fullPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/temporary.mp4"];

//if the file already exists, delete it
NSFileManager *fileMgr = [NSFileManager defaultManager];
if([fileMgr fileExistsAtPath:fullPath]){
    if(![fileMgr removeItemAtPath:fullPath error:&error]){
        //error
    }
}
//prepare to write the movie
AVAssetWriter *videoWriter =
    [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:fullPath]
                              fileType:AVFileTypeMPEG4
                                 error:&error];
NSParameterAssert(videoWriter);
NSDictionary *videoSettings =
    [NSDictionary dictionaryWithObjectsAndKeys:
        AVVideoCodecH264, AVVideoCodecKey,
        [NSNumber numberWithInt:width], AVVideoWidthKey,
        [NSNumber numberWithInt:height], AVVideoHeightKey,
        nil];
AVAssetWriterInput *writerInput =
    [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings] retain];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor
        assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                   sourcePixelBufferAttributes:nil];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];
//start writing
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];
CVPixelBufferRef buffer = NULL;
buffer = [self pixelBufferFromCGImage:[[imageArray objectAtIndex:0] CGImage]
                                     :CGSizeMake(width, height)];
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
[adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
//loop through image array
int y = (int)[imageArray count];
int x = 0;
while(x < y)
{
    if(writerInput.readyForMoreMediaData){
        CMTime frameTime = CMTimeMake(1, 24);
        CMTime lastTime = CMTimeMake(x, 24);
        CMTime presentTime = CMTimeAdd(lastTime, frameTime);
        buffer = [self pixelBufferFromCGImage:[[imageArray objectAtIndex:x] CGImage]
                                             :CGSizeMake(width, height)];
        [adaptor appendPixelBuffer:buffer
              withPresentationTime:presentTime];
        x++;
    }
}
//finish writing
[writerInput markAsFinished];
[videoWriter finishWriting];
//clean up
CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
[videoWriter release];
[writerInput release];
//handle after save, save is asynchronous
void(^completionBlock)(NSURL *, NSError *) =
^(NSURL *assetURL, NSError *error)
{
if(error != nil){
//error
}
//remove temp movie file
if([fileMgr removeItemAtPath:fullPath error:&error] != YES){
//error
}
};
//write the movie to photo album
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
NSURL *filePathURL = [NSURL fileURLWithPath:fullPath isDirectory:NO];
if([library videoAtPathIsCompatibleWithSavedPhotosAlbum:filePathURL]){
[library
writeVideoAtPathToSavedPhotosAlbum:filePathURL
completionBlock:completionBlock];
}
//clean up
[library release];
Your first PTS should be 0/24 s, not 1/24 s.
Oops, sorry, my mistake: your first PTS is zero. I hadn't noticed the CVPixelBufferPoolCreatePixelBuffer and appendPixelBuffer:withPresentationTime: calls, so I've changed my answer. The very first pixel buffer you append has nothing to do with your array of images. Is it undefined? I'd guess it's green. I'm not sure what you're doing with the pixel buffer pool; deleting those two lines and rebasing your loop at zero should get rid of the green frame.
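A minimal sketch of the fix described above, using the asker's own setup (it assumes the asker's `pixelBufferFromCGImage::` helper and the `videoWriter`/`writerInput`/`adaptor` objects created earlier): drop the stray `CVPixelBufferPoolCreatePixelBuffer` call and the extra append at `kCMTimeZero`, and let the loop itself write frame 0 with a PTS of zero.

```objectivec
//start writing; the session starts at time zero
[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

//write every frame from the loop; frame x is presented at x/24 s,
//so the first presentation timestamp is 0/24 and there is no
//leftover uninitialized buffer at the head of the movie
int count = (int)[imageArray count];
int x = 0;
while(x < count)
{
    if(writerInput.readyForMoreMediaData){
        CMTime presentTime = CMTimeMake(x, 24);
        CVPixelBufferRef buffer =
            [self pixelBufferFromCGImage:[[imageArray objectAtIndex:x] CGImage]
                                        :CGSizeMake(width, height)];
        [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
        x++;
    }
}

//finish writing
[writerInput markAsFinished];
[videoWriter finishWriting];
```

The only changes from the question's code are the removal of the two lines the answer flags and the timestamp rebased so the loop's first frame lands at zero instead of 1/24 s.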