AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time

Posted on 2024-09-28 06:59:47


I need to be able to have AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working at the same time. The code below works, but the video recording does not: the didFinishRecordingToOutputFileAtURL delegate is called immediately after startRecordingToOutputFileURL is called. Now, if I remove the AVCaptureVideoDataOutput from the AVCaptureSession by simply commenting out the line:

[captureSession addOutput:captureDataOutput];

the video recording works, but then the SampleBufferDelegate is not called (which I need).

How can I go about having AVCaptureVideoDataOutput and AVCaptureMovieFileOutput working simultaneously?

- (void)initCapture {
 AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:NULL]; 

 captureDataOutput = [[AVCaptureVideoDataOutput alloc] init]; 
 [captureDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()]; 

 m_captureFileOutput = [[AVCaptureMovieFileOutput alloc] init];

 NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
 NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
 NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 

 [captureDataOutput setVideoSettings:videoSettings]; 

 captureSession = [[AVCaptureSession alloc] init]; 

 [captureSession addInput:captureInput];
 [captureSession addOutput:m_captureFileOutput]; 
 [captureSession addOutput:captureDataOutput]; 

 [captureSession beginConfiguration]; 
 [captureSession setSessionPreset:AVCaptureSessionPresetLow]; 
 [captureSession commitConfiguration]; 

 [self performSelector:@selector(startRecording) withObject:nil afterDelay:10.0];
 [self performSelector:@selector(stopRecording) withObject:nil afterDelay:15.0];

 [captureSession startRunning];
}


- (void) startRecording
{
    [m_captureFileOutput startRecordingToOutputFileURL:[self tempFileURL] recordingDelegate:self];

}

- (void) stopRecording
{
    if ([m_captureFileOutput isRecording]) {
        [m_captureFileOutput stopRecording];
    }
}


- (NSURL *) tempFileURL
{
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"camera.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *fileManager = [NSFileManager defaultManager];
    if ([fileManager fileExistsAtPath:outputPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
    }
    [outputPath release];
    return [outputURL autorelease];
}



- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections
{
 NSLog(@"start record video");
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
 NSLog(@"end record");
}


- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection 
{ 
      // do stuff with sampleBuffer
}

I should add that I am getting the error:

Error Domain=NSOSStatusErrorDomain Code=-12780 "The operation couldn’t be completed. (OSStatus error -12780.)" UserInfo=0x23fcd0 {AVErrorRecordingSuccessfullyFinishedKey=false}

from

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error

Cheers

Comments (4)

束缚m 2024-10-05 06:59:47

I contacted an engineer at Apple support, and he told me that simultaneous use of AVCaptureVideoDataOutput + AVCaptureMovieFileOutput is not supported. I don't know if they will support it in the future, but his words were "not supported at this time".

I encourage you to file a bug report / feature request for this, as I did (bugreport.apple.com); they measure how strongly people want something, and perhaps we will see this in the near future.

何其悲哀 2024-10-05 06:59:47

Still, nine years later, Apple apparently does not want these two to work together.

But you can easily work with AVAssetWriter instead.

You can't use AVCaptureVideoDataOutput and AVCaptureMovieFileOutput at the same time. But you can use AVCaptureVideoDataOutput, analyze or modify the data, and then use AVAssetWriter to write the frames to a file.

Source: https://developer.apple.com/forums/thread/98113

How to save video with output using AVAssetWriter:

Save AVCaptureVideoDataOutput to movie file using AVAssetWriter in Swift
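
The linked answer describes the approach only in prose. Below is a minimal, untested Swift sketch of the idea; the FrameRecorder class name, the fixed H.264 1280x720 output settings, and the omission of audio and error handling are illustrative assumptions, not part of the original answer.

import AVFoundation

// Sketch: write frames delivered by AVCaptureVideoDataOutput to a movie file with AVAssetWriter.
final class FrameRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var writer: AVAssetWriter?
    private var writerInput: AVAssetWriterInput?
    private var sessionStarted = false

    func startWriting(to url: URL) throws {
        let writer = try AVAssetWriter(outputURL: url, fileType: .mov)
        // Hypothetical output settings; pick a codec and size that match your source.
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ]
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        input.expectsMediaDataInRealTime = true
        writer.add(input)
        writer.startWriting()
        self.writer = writer
        self.writerInput = input
        self.sessionStarted = false
    }

    // Runs on the queue passed to setSampleBufferDelegate(_:queue:).
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let writer = writer, let input = writerInput else { return }
        let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            // Anchor the movie's timeline to the first frame's timestamp.
            writer.startSession(atSourceTime: pts)
            sessionStarted = true
        }
        // Analyze or modify the frame here before handing it to the writer.
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func finishWriting(completion: @escaping () -> Void) {
        writerInput?.markAsFinished()
        writer?.finishWriting(completionHandler: completion)
    }
}

The point is that the data output's delegate becomes the single source of frames: you analyze them and write them to disk yourself, so the movie-file output is no longer needed.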

许你一世情深 2024-10-05 06:59:47

Although you cannot use AVCaptureVideoDataOutput, you can use AVCaptureVideoPreviewLayer simultaneously with AVCaptureMovieFileOutput. See the "AVCam" example on Apple's Website.

In Xamarin.iOS, the code looks like this:

var session = new AVCaptureSession();

var camera = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
var mic = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Audio);
if(camera == null || mic == null){
    throw new Exception("Can't find devices");
}

// Devices must be wrapped in AVCaptureDeviceInput before they can be added to the session
var cameraInput = AVCaptureDeviceInput.FromDevice(camera);
var micInput = AVCaptureDeviceInput.FromDevice(mic);
if(session.CanAddInput(cameraInput)){
    session.AddInput(cameraInput);
}
if(session.CanAddInput(micInput)){
    session.AddInput(micInput);
}

var layer = new AVCaptureVideoPreviewLayer(session);
layer.LayerVideoGravity = AVLayerVideoGravity.ResizeAspectFill;
layer.VideoGravity = AVCaptureVideoPreviewLayer.GravityResizeAspectFill;

cameraView = new UIView();
cameraView.Layer.AddSublayer(layer);

var filePath = System.IO.Path.Combine( Path.GetTempPath(), "temporary.mov");
var fileUrl = NSUrl.FromFilename( filePath );

var movieFileOutput = new AVCaptureMovieFileOutput();
var recordingDelegate = new MyRecordingDelegate();
session.AddOutput(movieFileOutput);

movieFileOutput.StartRecordingToOutputFile( fileUrl, recordingDelegate);
寂寞花火° 2024-10-05 06:59:47

Xcode 14.1 already supports it.

  • Xcode 13.4: does not work
  • Xcode 14.1: works
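
If you want to verify this on your own toolchain and device, a minimal Swift sketch (the device lookup, preset, and variable names are illustrative assumptions) is to ask the session whether it accepts both outputs:

import AVFoundation

// Sketch: probe whether the current session accepts both outputs at once.
let session = AVCaptureSession()
session.sessionPreset = .high

if let camera = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let movieOutput = AVCaptureMovieFileOutput()
let dataOutput = AVCaptureVideoDataOutput()

// On systems that support the combination, both checks pass and the
// sample-buffer delegate and file recording can run side by side.
if session.canAddOutput(movieOutput) { session.addOutput(movieOutput) }
if session.canAddOutput(dataOutput) { session.addOutput(dataOutput) }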