How do I write a movie with both video and audio using AVAssetWriter?
I want to export a movie with AVAssetWriter and can't figure out how to include video and audio tracks in sync. Exporting only video works fine, but when I add audio the resulting movie looks like this:
First I see the video (without audio), then the video freezes (showing the last image frame until the end) and after some seconds I hear the audio.
I tried some things with CMSampleBufferSetOutputPresentationTimeStamp for the audio (subtracting the first CMSampleBufferGetPresentationTimeStamp from the current one), but none of it worked, and I don't think it is the right direction anyway, since video and audio in the source movie should already be in sync...
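Concretely, that retiming attempt looked roughly like this inside the audio loop shown further down (firstAudioPTS is an illustrative CMTime variable that holds the timestamp of the first audio buffer; as said, it did not fix the problem):

CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
if (CMTIME_IS_INVALID(firstAudioPTS))   // firstAudioPTS starts out as kCMTimeInvalid
    firstAudioPTS = pts;
// Shift the buffer so the first audio sample lands at time zero.
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, CMTimeSubtract(pts, firstAudioPTS));
[assetWriterAudioInput appendSampleBuffer:sampleBuffer];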
My setup in short: I create an AVAssetReader and two AVAssetReaderTrackOutputs (one for video, one for audio) and add them to the AVAssetReader, then I create an AVAssetWriter and two AVAssetWriterInputs (video and audio) and add them to the AVAssetWriter.
I start everything up with:
[assetReader startReading];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
Then I run two queues to handle the sample buffer copying:
dispatch_queue_t queueVideo = dispatch_queue_create("assetVideoWriterQueue", NULL);
[assetWriterVideoInput requestMediaDataWhenReadyOnQueue:queueVideo usingBlock:^
{
    while ([assetWriterVideoInput isReadyForMoreMediaData])
    {
        // Copy video sample buffers from the reader to the writer.
        CMSampleBufferRef sampleBuffer = [assetReaderVideoOutput copyNextSampleBuffer];
        if (sampleBuffer)
        {
            [assetWriterVideoInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            // No more video samples: finish this input.
            [assetWriterVideoInput markAsFinished];
            dispatch_release(queueVideo);
            videoFinished = YES;
            break;
        }
    }
}];
dispatch_queue_t queueAudio = dispatch_queue_create("assetAudioWriterQueue", NULL);
[assetWriterAudioInput requestMediaDataWhenReadyOnQueue:queueAudio usingBlock:^
{
    while ([assetWriterAudioInput isReadyForMoreMediaData])
    {
        // Copy audio sample buffers from the reader to the writer.
        CMSampleBufferRef sampleBuffer = [assetReaderAudioOutput copyNextSampleBuffer];
        if (sampleBuffer)
        {
            [assetWriterAudioInput appendSampleBuffer:sampleBuffer];
            CFRelease(sampleBuffer);
        }
        else
        {
            // No more audio samples: finish this input.
            [assetWriterAudioInput markAsFinished];
            dispatch_release(queueAudio);
            audioFinished = YES;
            break;
        }
    }
}];
In the main loop I wait until both queues have finished:
while (!videoFinished && !audioFinished)
{
    sleep(1);
}
[assetWriter finishWriting];
Furthermore, I try to save the resulting file to the photo library with the following code...
NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:url])
{
    [library writeVideoAtPathToSavedPhotosAlbum:url completionBlock:^(NSURL *assetURL, NSError *error)
    {
        if (error)
            NSLog(@"error=%@", error.localizedDescription);
        else
            NSLog(@"completed...");
    }];
}
else
    NSLog(@"error, video not saved...");
[library release];
[url release];
...but I get the error:
Video /Users/cb/Library/Application Support/iPhone Simulator/4.2/Applications/E9865BF9-D190-4912-9248-66768B1AB635/Documents/export.mp4 cannot be saved to the saved photos album: Error Domain=NSOSStatusErrorDomain Code=-12950 "Movie could not be played." UserInfo=0x5e4fb90 {NSLocalizedDescription=Movie could not be played.}
This saving code works without problems in another program, so something must be wrong with the exported movie itself...?
2 Answers
You can use this code to merge audio and video.
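One common way to do such a merge is to put both tracks into an AVMutableComposition and export it with AVAssetExportSession. A rough sketch of that approach, with sourceAsset and outputURL as illustrative placeholders:

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compVideoTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compAudioTrack =
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

// Copy the full time range of both source tracks into the composition.
AVAssetTrack *srcVideo = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack *srcAudio = [[sourceAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, sourceAsset.duration);
NSError *error = nil;
[compVideoTrack insertTimeRange:fullRange ofTrack:srcVideo atTime:kCMTimeZero error:&error];
[compAudioTrack insertTimeRange:fullRange ofTrack:srcAudio atTime:kCMTimeZero error:&error];

// Export the composition; the passthrough preset keeps the original encoding.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^
{
    // Check exporter.status and exporter.error here.
}];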
It seems that assetWriterAudioInput ignores the sample buffer time when writing audio.
Do it this way (a rough sketch follows the list):
1) Write the video track.
2) When done, mark it as finished, i.e. [videoWriterInput markAsFinished];
3) Do [assetWriter startSessionAtSourceTime:timeRangeStart];
4) Instantiate the audio reader and start writing the audio.
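A rough, untested sketch of that ordering, reusing the reader/writer variables from the question and assuming the writing session has already been started as in the question (the startSessionAtSourceTime: call is not repeated here):

// Steps 1) and 2): drain the whole video track first, then mark the video input finished.
CMSampleBufferRef buffer = NULL;
while ((buffer = [assetReaderVideoOutput copyNextSampleBuffer]))
{
    while (![assetWriterVideoInput isReadyForMoreMediaData])
        [NSThread sleepForTimeInterval:0.01];   // crude wait; fine for a sketch
    [assetWriterVideoInput appendSampleBuffer:buffer];
    CFRelease(buffer);
}
[assetWriterVideoInput markAsFinished];

// Step 4): only now read and write the audio track.
while ((buffer = [assetReaderAudioOutput copyNextSampleBuffer]))
{
    while (![assetWriterAudioInput isReadyForMoreMediaData])
        [NSThread sleepForTimeInterval:0.01];
    [assetWriterAudioInput appendSampleBuffer:buffer];
    CFRelease(buffer);
}
[assetWriterAudioInput markAsFinished];

[assetWriter finishWriting];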