Aspect ratio of iPhone images captured from AVCaptureSession


I am using somebody's source code to capture images with AVCaptureSession. However, I found that CaptureSessionManager's previewLayer shows less of the scene than the final captured image.

I found that the resulting image always has the ratio 720x1280 = 9:16. Now I want to crop the resulting image to a UIImage with ratio 320:480, so that it only contains the portion visible in the previewLayer. Any ideas? Thanks a lot.

Related questions on Stack Overflow (no good answer yet): Q1, Q2

Source Code:

- (id)init {
    if ((self = [super init])) {
        [self setCaptureSession:[[[AVCaptureSession alloc] init] autorelease]];
    }
    return self;
}

- (void)addVideoPreviewLayer {
    [self setPreviewLayer:[[[AVCaptureVideoPreviewLayer alloc] initWithSession:[self captureSession]] autorelease]];
    [[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];
}

- (void)addVideoInput {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSError *error = nil;

        // Enable continuous autofocus if the device supports it.
        if ([videoDevice isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] && [videoDevice lockForConfiguration:&error]) {
            [videoDevice setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            [videoDevice unlockForConfiguration];
        }

        AVCaptureDeviceInput *videoIn = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
        if (videoIn) {
            if ([[self captureSession] canAddInput:videoIn])
                [[self captureSession] addInput:videoIn];
            else
                NSLog(@"Couldn't add video input");
        }
        else
            NSLog(@"Couldn't create video input: %@", error);
    }
    else
        NSLog(@"Couldn't create video capture device");
}

- (void)addStillImageOutput
{
    [self setStillImageOutput:[[[AVCaptureStillImageOutput alloc] init] autorelease]];
    NSDictionary *outputSettings = [NSDictionary dictionaryWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [[self stillImageOutput] setOutputSettings:outputSettings];

    // Note: this lookup finds nothing here, because the output has no connections
    // until it has been added to the session below; the same search is repeated
    // (and actually used) in captureStillImage.
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [[self captureSession] addOutput:[self stillImageOutput]];
}

- (void)captureStillImage
{
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", [self stillImageOutput]);
    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                         completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            NSLog(@"attachments: %@", exifAttachments);
        } else {
            NSLog(@"no attachments");
        }
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
        UIImage *image = [[UIImage alloc] initWithData:imageData];
        [self setStillImage:image];
        [image release];
        [[NSNotificationCenter defaultCenter] postNotificationName:kImageCapturedSuccessfully object:nil];
    }];
}
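
For context, this is roughly how such a manager is typically wired up from a view controller (a sketch only; CaptureSessionManager is the class name used in the question, and imageCaptured: is a hypothetical handler name, not something defined above):

// Set up the manager and show its preview layer full screen (retain the manager somewhere, e.g. an ivar, under MRC).
CaptureSessionManager *manager = [[CaptureSessionManager alloc] init];
[manager addVideoInput];
[manager addVideoPreviewLayer];
[manager addStillImageOutput];
[[manager previewLayer] setFrame:self.view.bounds];
[self.view.layer addSublayer:[manager previewLayer]];

// Find out when captureStillImage has finished and stillImage has been set.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(imageCaptured:)   // hypothetical handler
                                             name:kImageCapturedSuccessfully
                                           object:nil];
[[manager captureSession] startRunning];

// Later, e.g. from a shutter button:
[manager captureStillImage];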


Edit, after doing some more research and testing:
AVCaptureSession's sessionPreset property has the following constants. I haven't checked every one of them, but noticed that most of them have a ratio of either 9:16 or 3:4:

  • NSString *const AVCaptureSessionPresetPhoto;
  • NSString *const AVCaptureSessionPresetHigh;
  • NSString *const AVCaptureSessionPresetMedium;
  • NSString *const AVCaptureSessionPresetLow;
  • NSString *const AVCaptureSessionPreset352x288;
  • NSString *const AVCaptureSessionPreset640x480;
  • NSString *const AVCaptureSessionPresetiFrame960x540;
  • NSString *const AVCaptureSessionPreset1280x720;
  • NSString *const AVCaptureSessionPresetiFrame1280x720;

In my project I have a fullscreen preview (frame size 320x480), and also:
[[self previewLayer] setVideoGravity:AVLayerVideoGravityResizeAspectFill];

I have done it this way: take the photo at 9:16 and crop it to 320:480 (i.e. 2:3), which is exactly the visible part of the previewLayer. It looks perfect. With AspectFill, a 720x1280 capture shown in a 320x480 layer is scaled by 320/720, so only the middle 720x1080 of it (2:3 = 320:480) is visible; the crop therefore keeps the full width and removes (1280 - 1080) / 2 = 100 pixels from the top and from the bottom.

The piece of code for resizing and cropping that replaces the old code is:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [UIImage imageWithData:imageData];
UIImage *scaledimage = [ImageHelper scaleAndRotateImage:image];
// Crop the 9:16 image to 2:3 (320:480), keeping the width and trimming equal amounts from top and bottom.
float width = scaledimage.size.width;
float height = scaledimage.size.height;
float top_adjust = (height - width * 3 / 2.0) / 2.0;
CGRect rectToCrop = CGRectMake(0, top_adjust, width, width * 3 / 2.0);
[self setStillImage:[scaledimage croppedImage:rectToCrop]];
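
The croppedImage: call above comes from a UIImage helper category that is not shown in the question. A minimal sketch of such a helper, assuming the rect is expressed in the image's own coordinates and ignoring contentScale and orientation, could look like this:

@implementation UIImage (Cropping)
- (UIImage *)croppedImage:(CGRect)rect {
    // CGImageCreateWithImageInRect crops in the backing CGImage's pixel coordinates.
    CGImageRef croppedRef = CGImageCreateWithImageInRect(self.CGImage, rect);
    UIImage *cropped = [UIImage imageWithCGImage:croppedRef];
    CGImageRelease(croppedRef);
    return cropped;
}
@end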


1 Answer

梦忆晨望 2025-01-02 01:46:28


The iPhone's camera is natively 4:3. The 16:9 images you get are already cropped from 4:3, so cropping those 16:9 images again to 4:3 is not what you want. Instead, get the native 4:3 images from the camera by setting self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto (before adding any inputs/outputs to the session).
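
A minimal sketch of how the init from the question could be adjusted to do that (same class as above; the canSetSessionPreset: call is just a safety check):

- (id)init {
    if ((self = [super init])) {
        AVCaptureSession *session = [[[AVCaptureSession alloc] init] autorelease];
        // Ask for the camera's native 4:3 photo format before any inputs/outputs are added.
        if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
            [session setSessionPreset:AVCaptureSessionPresetPhoto];
        }
        [self setCaptureSession:session];
    }
    return self;
}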
