Camera differences between UIImagePickerController and AVCaptureSession on iPhone
I'm trying to build a replacement for UIImagePickerController, using AVCaptureSession with AVCaptureDeviceInput and AVCaptureStillImageOutput as input and output respectively. To preview the camera stream I'm using AVCaptureVideoPreviewLayer.

It's now working correctly for capturing and storing photos just like the default camera. However, I found 3 problems I was unable to solve:

- photos captured don't get the same quality the default camera provides
- the viewing/capture angle is shortened, just like using video capture on the default camera
- no way to control camera-specific options like flash

Is there any way to get to the level of UIImagePickerController using a more customizable approach (i.e. AVFoundation or any other)?
Check out "Session 409 - Using the Camera with AV Foundation" in the WWDC 2010 videos. Based on the video, it looks like you can resolve all three of your issues with AVFoundation. Hope this helps!
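To make that concrete, below is a minimal sketch of how all three issues can be addressed with AVFoundation. It keeps AVCaptureStillImageOutput to match the question, even though Apple later replaced it with AVCapturePhotoOutput; the function name makePhotoSession is mine, and the code assumes a device with a back camera and flash.

```swift
import AVFoundation

// Sketch: build a capture session that behaves like the default camera app.
func makePhotoSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()

    // Issues 1 and 2: the session's default preset is tuned for video,
    // which both lowers still-image quality and crops the field of view.
    // The photo preset captures full-resolution stills with the same
    // angle of view as the built-in camera app.
    session.sessionPreset = .photo

    guard let device = AVCaptureDevice.default(for: .video) else {
        throw NSError(domain: "Camera", code: -1, userInfo: nil)
    }

    // Issue 3: flash is controlled on the AVCaptureDevice itself, and
    // any change must be wrapped in lockForConfiguration() /
    // unlockForConfiguration().
    if device.hasFlash {
        try device.lockForConfiguration()
        device.flashMode = .auto
        device.unlockForConfiguration()
    }

    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }

    let output = AVCaptureStillImageOutput()
    if session.canAddOutput(output) { session.addOutput(output) }

    return session
}
```

The preview layer attaches to the same session (AVCaptureVideoPreviewLayer(session:)), so once the preset is set to .photo the on-screen preview shows the same wider angle the captured stills will have.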