Recording a "Talking Tom"-style video on iPhone (problem adding audio)

Posted 2024-12-12 12:15:33 · 0 views · 0 comments


I have to record a video of an app exactly similar to "Talking Tom".
Taking help from Here and Here, I have captured the screen and made a video from those images, but it does not have any sound.

I have recorded the sound and video files separately, but I don't know how to combine them.

Can anyone tell me how to add the sound to this video, or how to record the video with sound in the first place?

Can anyone help?


2 Answers

榆西 2024-12-19 12:15:34
- (void)processVideo:(NSURL *)videoUrl
{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];

    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];

    NSError *error = nil;

    // Add one composition audio track per sound clip, offset to its start time.
    for (NSMutableDictionary *audioInfo in appDelegate.audioInfoArray)
    {
        NSString *pathString = [[NSHomeDirectory() stringByAppendingString:@"/Documents/"]
                                stringByAppendingString:[audioInfo objectForKey:@"fileName"]];

        AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

        AVAssetTrack *audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack =
            [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                        preferredTrackID:kCMPersistentTrackID_Invalid];

        NSLog(@"%lf", [[audioInfo objectForKey:@"startTime"] doubleValue]);

        CMTime audioStartTime = CMTimeMake([[audioInfo objectForKey:@"startTime"] doubleValue] * TIME_SCALE,
                                           TIME_SCALE);

        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, urlAsset.duration)
                                       ofTrack:audioAssetTrack
                                        atTime:audioStartTime
                                         error:&error];
    }

    AVMutableCompositionTrack *compositionVideoTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero
                                     error:nil];

    AVAssetExportSession *_assetExport =
        [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                         presetName:AVAssetExportPresetPassthrough];

    NSString *videoName = @"export.mov";

    NSString *exportPath = [[self pathToDocumentsDirectory] stringByAppendingPathComponent:videoName];
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
    {
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
    }

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; // @"com.apple.quicktime-movie"
    NSLog(@"file type %@", _assetExport.outputFileType);
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (_assetExport.status)
        {
            case AVAssetExportSessionStatusCompleted:
                // Export complete.
                NSLog(@"Export Complete");
                //[self uploadToYouTube];
                break;
            case AVAssetExportSessionStatusFailed:
                // Export error (see _assetExport.error).
                NSLog(@"Export Failed");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                break;
            case AVAssetExportSessionStatusCancelled:
                // Export cancelled.
                NSLog(@"Export Cancelled");
                NSLog(@"ExportSessionError: %@", [_assetExport.error localizedDescription]);
                break;
            default:
                break;
        }
    }];
}


Just assign your movie file (i.e. the one without audio) to an NSURL and pass it to the processVideo method above. Before calling processVideo, add the sound files you want merged with the video to audioInfoArray elsewhere in the program. The method will then merge the audio with your video file.

You can also decide where each sound starts playing in the video via the value stored under the "startTime" key in audioInfoArray. In the switch cases of the completion handler, you can play the finished video, upload it to Facebook, etc., as you wish.
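For reference, a caller might populate audioInfoArray and invoke the method like this. This is a sketch under the answer's assumptions: the `audioInfoArray` property on the app delegate, the `pathToDocumentsDirectory` helper, and the file names `tapSound.caf` and `capture.mov` (and the 2.0-second start time) are illustrative.

```objectivec
// Hypothetical setup: register each sound clip by its file name (relative to
// the Documents directory) and the timeline position, in seconds, where it
// should start playing.
AppDelegate *appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
appDelegate.audioInfoArray = [NSMutableArray array];
[appDelegate.audioInfoArray addObject:
    [NSMutableDictionary dictionaryWithObjectsAndKeys:
        @"tapSound.caf", @"fileName",
        [NSNumber numberWithDouble:2.0], @"startTime",
        nil]];

// The silent screen-capture movie produced earlier.
NSURL *movieUrl = [NSURL fileURLWithPath:
    [[self pathToDocumentsDirectory] stringByAppendingPathComponent:@"capture.mov"]];
[self processVideo:movieUrl];
```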

指尖微凉心微凉 2024-12-19 12:15:34


An iOS app can't really record (using any public API) the sound that it itself makes. What an app can do is generate the same audio twice: once for playback and once for streaming to a file. You have to stick to sounds you know how to produce both ways, such as by copying PCM waveforms into buffers.

Once you have your duplicate buffer of audio samples, there should be example code on how to send it to an AVAssetWriter.
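As a rough sketch of that last step (assuming 16-bit interleaved mono PCM at 44.1 kHz; building each `CMSampleBufferRef` from the duplicated raw buffer, and the `outputUrl` and `sampleBuffer` variables, are left as assumptions):

```objectivec
// Sketch only: append pre-built audio CMSampleBuffers to an AVAssetWriter.
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputUrl
                                                 fileType:AVFileTypeCoreAudioFormat
                                                    error:&error];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0f], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
    nil];
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                       outputSettings:settings];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

// For each CMSampleBufferRef built from the duplicated PCM buffer:
if (input.readyForMoreMediaData) {
    [input appendSampleBuffer:sampleBuffer];
}

[input markAsFinished];
[writer finishWriting];
```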
