iOS CoreVideo memory leak

Published 2024-10-30 07:50:44


Can somebody help me trace these CoreVideo memory leaks when running Instruments in Xcode?

Basically, the memory leak happens when I press the "Record Video" button on my custom motion JPEG player. I cannot tell exactly which part of my code is leaking, because the Leaks instrument does not point at any of my calls. BTW, I'm using an iPad device to test for the leaks.

Here are the messages from the Leaks instrument:

  • Responsible Library = CoreVideo
  • Responsible Frame:
    CVPixelBufferBacking::initWithPixelBufferDescription(..)
    CVObjectAlloc(...)
    CVBuffer::init()

Here's my code that handles each motion JPEG frame streamed by the server:

-(void)processServerData:(NSData *)data{    

// render the video in the UIImage control
UIImage *image =[UIImage imageWithData:data];
self.imageCtrl.image = image;

// check if we are recording
if (myRecorder.isRecording) {

    //create initial sample: todo:check if this is still needed
    if (counter==0) {

        self.buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
        CVPixelBufferPoolCreatePixelBuffer (NULL, myRecorder.adaptor.pixelBufferPool, &buffer);

        if(buffer) 
        {
            CVBufferRelease(buffer);
        }
    }

    if (counter < myRecorder.maxFrames)
    {
        if([myRecorder.writerInput isReadyForMoreMediaData])
        {
            CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
            CMTime lastTime=CMTimeMake(counter, myRecorder.timeScale); 
            CMTime presentTime=CMTimeAdd(lastTime, frameTime);

            self.buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];

            [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

            if(buffer)
            {
                CVBufferRelease(buffer);
            }

            counter++;

            if (counter==myRecorder.maxFrames)
            {
                [myRecorder finishSession];

                counter=0;
                myRecorder.isRecording = NO;
            }
        }
        else
        {
            NSLog(@"adaptor not ready counter=%d ",counter );
        }
    }
}

}

Here's the pixelBufferFromCGImage function:

+ (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image size:(CGSize) size{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         nil];
CVPixelBufferRef pxbuffer = NULL;

CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                      size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, 
                                      &pxbuffer);
NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                             size.height, 8, 4*size.width, rgbColorSpace, 
                                             kCGImageAlphaNoneSkipFirst);
NSParameterAssert(context);
CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), 
                                       CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

return pxbuffer;

}

Appreciate any help! Thanks.


Comments (2)

魔法唧唧 2024-11-06 07:50:44


I refactored the processFrame method and I'm no longer getting the leaks.

-(void) processFrame:(UIImage *) image {

    if (myRecorder.frameCounter < myRecorder.maxFrames)
    {
        if([myRecorder.writerInput isReadyForMoreMediaData])
        {
            CMTime frameTime = CMTimeMake(1, myRecorder.timeScale);
            CMTime lastTime=CMTimeMake(myRecorder.frameCounter, myRecorder.timeScale); 
            CMTime presentTime=CMTimeAdd(lastTime, frameTime);

            buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];

            if(buffer)
            {
                [myRecorder.adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];

                myRecorder.frameCounter++;

                CVBufferRelease(buffer);

                if (myRecorder.frameCounter==myRecorder.maxFrames)
                {
                    [myRecorder finishSession];

                    myRecorder.frameCounter=0;
                    myRecorder.isRecording = NO;
                }
            }
            else
            {
                NSLog(@"Buffer is empty");
            }
        }
        else
        {
            NSLog(@"adaptor not ready frameCounter=%d ",myRecorder.frameCounter );
        }
    }

}
逆光下的微笑 2024-11-06 07:50:44


I don't see anything too obvious. I did notice that you use both self.buffer and buffer here. If the property is retained, you might be leaking there: if CVPixelBufferPoolCreatePixelBuffer allocates a new buffer on the second line after self.buffer has already retained one on the first line, the first buffer is likely what's leaking.

    self.buffer = [Recorder pixelBufferFromCGImage:image.CGImage size:myRecorder.imageSize];
    CVPixelBufferPoolCreatePixelBuffer (NULL, myRecorder.adaptor.pixelBufferPool, &buffer);

Hope that helps.
