CVPixelBufferRef: video buffer and depth buffer have different orientations
Right now I'm working with the depth camera on iOS, since I want to measure the distance from the camera to certain points in the frame.
I did all the necessary setup in my camera solution, and now I have two CVPixelBufferRefs in hand: one with pixel data and one with depth data.
This is how I fetch both buffers from AVCaptureDataOutputSynchronizer:
- (void)dataOutputSynchronizer:(AVCaptureDataOutputSynchronizer *)synchronizer didOutputSynchronizedDataCollection:(AVCaptureSynchronizedDataCollection *)synchronizedDataCollection
{
    AVCaptureSynchronizedDepthData *syncedDepthData = (AVCaptureSynchronizedDepthData *)[synchronizedDataCollection synchronizedDataForCaptureOutput:depthDataOutput];
    AVCaptureSynchronizedSampleBufferData *syncedVideoData = (AVCaptureSynchronizedSampleBufferData *)[synchronizedDataCollection synchronizedDataForCaptureOutput:dataOutput];

    if (syncedDepthData.depthDataWasDropped || syncedVideoData.sampleBufferWasDropped) {
        return;
    }

    AVDepthData *depthData = syncedDepthData.depthData;
    CVPixelBufferRef depthPixelBuffer = depthData.depthDataMap;

    CMSampleBufferRef sampleBuffer = syncedVideoData.sampleBuffer;
    if (!CMSampleBufferDataIsReady(sampleBuffer)) {
        return;
    }

    //... code continues
}
Before reading any depth data, I decided to check whether the dimensions of my buffers align. I found out that my buffer with pixel data has dimensions 480x640 (portrait, like the orientation of my app), while the buffer with depth data has dimensions 640x480 (landscape).
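For reference, this is roughly how I compared the dimensions (a sketch: I assume the video pixel buffer is extracted from the sample buffer with CMSampleBufferGetImageBuffer, as is usual):

```objc
// Extract the video pixel buffer from the synchronized sample buffer
// and log both buffers' dimensions side by side.
CVPixelBufferRef videoPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
NSLog(@"video: %zux%zu, depth: %zux%zu",
      CVPixelBufferGetWidth(videoPixelBuffer),
      CVPixelBufferGetHeight(videoPixelBuffer),
      CVPixelBufferGetWidth(depthPixelBuffer),
      CVPixelBufferGetHeight(depthPixelBuffer));
```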
Obviously, the buffers are different, and I cannot match pixels to depth values. Do I need to rotate my depth buffer somehow? Is this a known issue?
Please advise how I should solve this problem. Thanks in advance!
Comment (1)
Yes, I see it too; hope this helps. The angle is in radians.
Caller side: I used the angle -(.pi/2)
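A minimal sketch of how that rotation can be applied (my interpretation, not the answerer's exact code): AVDepthData can reorient its own map via an EXIF orientation, so you don't have to rotate the raw CVPixelBuffer by hand. Here `depthData` is the AVDepthData from the question's callback; which orientation constant corresponds to the -(.pi/2) rotation for your session is an assumption you should verify on device.

```objc
#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>

// Rotate the landscape depth map so it lines up with the portrait
// video buffer. kCGImagePropertyOrientationRight is a 90° rotation;
// try kCGImagePropertyOrientationLeft if the result is flipped.
AVDepthData *rotatedDepthData =
    [depthData depthDataByApplyingExifOrientation:kCGImagePropertyOrientationRight];
CVPixelBufferRef rotatedDepthPixelBuffer = rotatedDepthData.depthDataMap;
// rotatedDepthPixelBuffer should now be 480x640, matching the video buffer.
```

An alternative worth checking is setting `videoOrientation` on the depth output's AVCaptureConnection so both outputs are delivered in the same orientation up front.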