How to fix a CVPixelBuffer leak

Please tell me where the leak is in this code...

// Here I build a video from images in the Documents directory

- (void) testCompressionSession:(NSString *)path
{
if ([[NSFileManager defaultManager] fileExistsAtPath:path]) {
    [[NSFileManager defaultManager] removeItemAtPath:path error:nil];
}
NSArray *array = [dictInfo objectForKey:@"sortedKeys"];

NSString *betaCompressionDirectory = path;
NSError *error = nil;

unlink([betaCompressionDirectory UTF8String]);

NSLog(@"array = %@",array);
NSData *imgDataTmp = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:0]]];
NSLog(@"link : %@",[projectPath stringByAppendingPathComponent:[array objectAtIndex:0]]);
CGSize size = CGSizeMake([UIImage imageWithData:imgDataTmp].size.width, [UIImage imageWithData:imgDataTmp].size.height);
//----initialize compression engine
NSLog(@"size : w : %f, h : %f",size.width,size.height);
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:betaCompressionDirectory]
                                                       fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];
NSParameterAssert(videoWriter);
if(error)
    NSLog(@"error = %@", [error localizedDescription]);

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                               [NSNumber numberWithInt:size.height], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];

NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                       [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                                                                 sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);

if ([videoWriter canAddInput:writerInput])
    NSLog(@"I can add this input");
else
    NSLog(@"i can't add this input");

[videoWriter addInput:writerInput];

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero];

dispatch_queue_t    dispatchQueue = dispatch_queue_create("mediaInputQueue", NULL);

[writerInput requestMediaDataWhenReadyOnQueue:dispatchQueue usingBlock:^{
    //BOOL isEffect = NO;
    int i = 0;
    float totalTime = 0.0f;
    float nextTime = 0;
    if ([writerInput isReadyForMoreMediaData]) {
        while (1)
        {   
            if (i <= [array count] && i > 0) {
                nextTime = [[dictInfo objectForKey:[array objectAtIndex:i-1]] floatValue];
            }
            totalTime += i == 0 ? 0 : nextTime;
            CMTime presentTime=CMTimeMake(totalTime, 1);
            printf("presentTime : %f  ",CMTimeGetSeconds(presentTime));
            if (i >= [array count]) 
            {
                NSData *imgData = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:i-1]]];
                UIImage* tmpImg = [UIImage imageWithData:imgData];
                tmpImg = [self imageWithImage:tmpImg scaledToSize:size];
                while ( !writerInput.readyForMoreMediaData)
                {
                    sleep(0.01);
                }
                CVPixelBufferRef buffer = NULL;
                buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
                [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
                NSLog(@"%f",totalTime-nextTime+(nextTime/2.0));
                [writerInput markAsFinished];
                [videoWriter finishWriting];
                //CVPixelBufferPoolRelease(adaptor.pixelBufferPool);
                [videoWriter release];
                break;
            } else {
                NSData *imgData = [NSData dataWithContentsOfFile:[projectPath stringByAppendingPathComponent:[array objectAtIndex:i]]];
                UIImage* tmpImg = [UIImage imageWithData:imgData];
                //tmpImg = [self imageWithImage:tmpImg scaledToSize:size];
                //UIImageWriteToSavedPhotosAlbum(tmpImg, nil, nil, nil);
                while (!adaptor.assetWriterInput.readyForMoreMediaData && !writerInput.readyForMoreMediaData)
                {
                    sleep(0.01);
                }
                CVPixelBufferRef buffer = NULL;
                buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
                if (buffer)
                {
                    if(![adaptor appendPixelBuffer:buffer withPresentationTime:presentTime])
                        NSLog(@"FAIL");
                    else
                        NSLog(@"Success:%d",i);
                    CVPixelBufferRelease(buffer);
                }
            }
    i++;
        }
    }
}];
}

// And here I create a CVPixelBufferRef from a CGImageRef

- (CVPixelBufferRef )pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey, 
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey, nil];
CVPixelBufferRef pxbuffer = NULL;

CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width, size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options, &pxbuffer);

NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL); 

CVPixelBufferLockBaseAddress(pxbuffer, 0);
void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
NSParameterAssert(pxdata != NULL);

CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(pxdata, size.width, size.height, 8, 4*size.width, rgbColorSpace, kCGImageAlphaPremultipliedFirst);
NSParameterAssert(context);

CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
CGColorSpaceRelease(rgbColorSpace);
CGContextRelease(context);

CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
return pxbuffer;
}

The leak log is:

CVObject CFRetain 00:37.957.985 2 0x1ecae0 0 CoreVideo CVPixelBufferPool::createPixelBuffer(__CFAllocator const*, __CFDictionary const*, int*)
Malloc 96 Bytes Malloc 00:40.015.872 1 0x1f0750 96 CoreVideo CVBuffer::init()
CVPixelBuffer Malloc 00:40.969.716 1 0x1f2570 96 CoreVideo CVObject::alloc(unsigned long, __CFAllocator const*, unsigned long, unsigned long)

Comments (1)

他夏了夏天 2024-12-30 04:23:08

Look here:

CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];

First a pixel buffer gets created and its address is put into the buffer variable; then the same variable gets overwritten by pixelBufferFromCGImage:, so its previous contents can no longer be released.
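
For reference, a minimal sketch of how that removed snippet could avoid orphaning the pool buffer, assuming you actually want to draw into a buffer obtained from the adaptor's pool (the names mirror the code above; this is an illustration, not the asker's method):

CVPixelBufferRef buffer = NULL;
// The pool hands back a +1 (owned) reference, so it must be released after use.
CVReturn ret = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                  adaptor.pixelBufferPool,
                                                  &buffer);
if (ret == kCVReturnSuccess && buffer != NULL) {
    CVPixelBufferLockBaseAddress(buffer, 0);
    // ... draw the CGImage into this buffer instead of creating a second one ...
    CVPixelBufferUnlockBaseAddress(buffer, 0);

    [adaptor appendPixelBuffer:buffer withPresentationTime:presentTime];
    CVPixelBufferRelease(buffer);   // balances the pool's Create
    buffer = NULL;
}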

EDIT

You've just removed the code my answer was based on, so it no longer applies.

Now this part:

CVPixelBufferRef buffer = NULL;
buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
[adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
NSLog(@"%f",totalTime-nextTime+(nextTime/2.0));
...

You have a commented-out CVPixelBufferPoolRelease(adaptor.pixelBufferPool), which is fine, since in this version you are not using a pixel buffer pool, but what is missing here is a call to CVPixelBufferRelease(buffer).
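
In other words, the last-frame branch needs the same release as the else-branch. A sketch of just that spot, under the assumption (as in the code above) that pixelBufferFromCGImage:size: returns an owned (+1) pixel buffer:

CVPixelBufferRef buffer = (CVPixelBufferRef)[self pixelBufferFromCGImage:[tmpImg CGImage] size:size];
if (buffer)
{
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(totalTime-nextTime+(nextTime/2.0), 1)];
    CVPixelBufferRelease(buffer);   // the call the original branch is missing
    buffer = NULL;
}
[writerInput markAsFinished];
[videoWriter finishWriting];
[videoWriter release];

Releasing in every branch keeps each buffer created by the helper balanced, regardless of which path appends it.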
