How to set up AVCaptureVideoDataOutput in a library

Posted on 2024-12-11


I'm trying to make a library for the iPhone, so I'm trying to initialize the camera with a single call. The problem comes when I call "self" in this statement:

"[captureOutput setSampleBufferDelegate:self queue:queue];"

because the compiler says "self was not declared in this scope". What do I need to do to set the same class as the "AVCaptureVideoDataOutputSampleBufferDelegate"? At least point me in the right direction :P.

Thank you!!!

Here is the complete function:

bool VideoCamera_Init() {

    // Initialize capture from the camera and show the camera

    /* We set up the input */
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
                                          deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                          error:nil];
    /* We set up the output */
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    /* While a frame is being processed in the -captureOutput:didOutputSampleBuffer:fromConnection: delegate method,
     no other frames are added to the queue. If you don't want this behaviour, set the property to NO. */
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    /* We specify a minimum duration for each frame (play with these settings to avoid having too many frames waiting
     in the queue, because that can cause memory issues). It is the inverse of the maximum framerate.
     Here we set a minimum frame duration of 1/20 second, so a maximum framerate of 20 fps: we say that
     we are not able to process more than 20 frames per second. */
    captureOutput.minFrameDuration = CMTimeMake(1, 20);

    /* We create a serial queue to handle the processing of our frames */
    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    variableconnombrealeatorio = [[VideoCameraThread alloc] init];
    [captureOutput setSampleBufferDelegate:self queue:queue];

    dispatch_release(queue);
    // Set the video output to store frames in BGRA (it is supposed to be faster)
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];
    /* And we create a capture session */
    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    /* We add input and output */
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];
    /* We start the capture */
    [captureSession startRunning];

    return TRUE;
}

I also wrote the following class, but the buffer is empty:

"

#import "VideoCameraThread.h"

CMSampleBufferRef bufferCamara;

@implementation VideoCameraThread

  • (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
    fromConnection:(AVCaptureConnection *)connection
    {
    bufferCamera=sampleBuffer;

    }
    "


Comments (1)

我是男神闪亮亮 2024-12-18 03:32:41


You are writing a C function, which has no concept of Objective-C classes, objects, or the self identifier. You will need to modify your function to take a parameter that accepts the sample buffer delegate you want to use:

bool VideoCamera_Init(id<AVCaptureAudioDataOutputSampleBufferDelegate> sampleBufferDelegate) {
    ...
    [captureOutput setSampleBufferDelegate:sampleBufferDelegate queue:queue];
    ...
}
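
For example, the caller can adopt the protocol itself and pass itself in. A minimal sketch (the CameraReader class and its start method are illustrative, not from the original post):

    #import <AVFoundation/AVFoundation.h>

    @interface CameraReader : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    - (void)start;
    @end

    @implementation CameraReader
    - (void)start {
        // The caller supplies itself as the delegate, so the C function
        // no longer needs to refer to self.
        VideoCamera_Init(self);
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Process the frame here. Note that sampleBuffer is only guaranteed
        // to be valid for the duration of this callback.
    }
    @end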

Or you could write your library with an Objective-C object-oriented interface rather than a C-style interface.

You also have problems with memory management in this function. For instance, you are allocating an AVCaptureSession and assigning it to a local variable. After this function returns, you will have no way of retrieving that AVCaptureSession so that you can stop or release it.
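
Here is a minimal sketch of that object-oriented approach, assuming manual reference counting as in the original code (the VideoCamera class name and its start/stop methods are illustrative): the wrapper keeps the session in an instance variable so it stays reachable and can be stopped and released later, and it acts as its own sample buffer delegate, so self is valid.

    #import <AVFoundation/AVFoundation.h>

    @interface VideoCamera : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate> {
        AVCaptureSession *session;
    }
    - (BOOL)start;
    - (void)stop;
    @end

    @implementation VideoCamera
    - (BOOL)start {
        // The session is stored in an ivar, so it stays reachable after start returns.
        session = [[AVCaptureSession alloc] init];
        session.sessionPreset = AVCaptureSessionPresetMedium;

        AVCaptureDeviceInput *input = [AVCaptureDeviceInput
            deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
            error:nil];
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];

        dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
        [output setSampleBufferDelegate:self queue:queue];  // self is valid here
        dispatch_release(queue);

        [session addInput:input];
        [session addOutput:output];
        [output release];  // the session retains the output

        [session startRunning];
        return YES;
    }

    - (void)stop {
        [session stopRunning];
    }

    - (void)dealloc {
        [session release];
        [super dealloc];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // Handle the frame.
    }
    @end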
