OpenCV: Is it possible to time cvQueryFrame to sync with a projector?

Posted on 2024-11-04 09:01:17


When I capture camera images of projected patterns using OpenCV via 'cvQueryFrame', I often end up with an unintended artifact: the projector's scan line. That is, since I'm unable to precisely time when 'cvQueryFrame' captures an image, the image taken does not respect the constant 30 Hz refresh of the projector. The result is the typical horizontal band familiar to those who have turned a video camera onto a TV screen.

Short of resorting to hardware sync, has anyone had some success with approximate (e.g., 'good enough') informal projector-camera sync in OpenCV?

Below are two solutions I'm considering, but I was hoping this is a common enough problem that an elegant solution might exist. My less-than-elegant thoughts are:

  • Add a slider control in the cvWindow displaying the video for the user to control a timing offset from 0 to 1/30th second, then set up a queue timer at this interval. Whenever a frame is needed, rather than calling 'cvQueryFrame' directly, I would request a callback to execute 'cvQueryFrame' at the next firing of the timer. In this way, theoretically the user would be able to use the slider to reduce the scan line artifact, provided that the timer resolution is sufficient (a rough sketch of this follows the list).

  • After receiving a frame via 'cvQueryFrame', examine the frame for the tell-tale horizontal band by looking for a delta in HSV values for a vertical column of pixels. Naturally this would only work when the subject being photographed contains a fiducial strip of uniform color under smoothly varying lighting.
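
To make the first idea concrete, here is a minimal sketch of the slider-plus-offset approach using the C API. It assumes a camera at index 0, uses a crude busy-wait in place of a proper queue timer, and the window name, trackbar name, and 0-33 ms range are placeholders; whether delaying the call actually changes when the frame is exposed will depend on the driver.

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    /* Slider-controlled offset, 0..33 ms (roughly one 30 Hz refresh period). */
    static int g_offset_ms = 0;

    /* Crude busy-wait; a real implementation would use a queue/multimedia timer. */
    static void wait_ms(int ms)
    {
        double start = (double)cvGetTickCount();
        double ticks_per_ms = cvGetTickFrequency() * 1000.0;
        while (((double)cvGetTickCount() - start) / ticks_per_ms < (double)ms)
            ;
    }

    int main(void)
    {
        CvCapture *cap = cvCaptureFromCAM(0);    /* camera index 0 is an assumption */
        if (!cap)
            return 1;

        cvNamedWindow("capture", CV_WINDOW_AUTOSIZE);
        /* User nudges the slider until the scan-line band (hopefully) disappears. */
        cvCreateTrackbar("offset (ms)", "capture", &g_offset_ms, 33, NULL);

        for (;;) {
            wait_ms(g_offset_ms);                 /* delay the query by the chosen offset */
            IplImage *frame = cvQueryFrame(cap);  /* buffer owned by the capture; don't free */
            if (!frame)
                break;
            cvShowImage("capture", frame);
            if (cvWaitKey(1) == 27)               /* ESC quits */
                break;
        }

        cvReleaseCapture(&cap);
        cvDestroyWindow("capture");
        return 0;
    }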
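
The second idea, flagging frames that contain the band, might look roughly like the sketch below. The sampled column and the V-channel threshold are arbitrary placeholders, and it assumes the uniform fiducial strip spans that column.

    #include <math.h>
    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    /* Returns 1 if a large brightness jump is found along column `col`, which
       would suggest the projector's band is cutting through the fiducial strip.
       `thresh` is an arbitrary per-pixel delta on the V channel (0..255). */
    static int has_horizontal_band(const IplImage *bgr, int col, double thresh)
    {
        IplImage *hsv = cvCreateImage(cvGetSize(bgr), IPL_DEPTH_8U, 3);
        int row, banded = 0;

        cvCvtColor(bgr, hsv, CV_BGR2HSV);
        for (row = 1; row < hsv->height; row++) {
            CvScalar above = cvGet2D(hsv, row - 1, col);
            CvScalar here  = cvGet2D(hsv, row, col);
            if (fabs(here.val[2] - above.val[2]) > thresh) {   /* val[2] is V */
                banded = 1;
                break;
            }
        }

        cvReleaseImage(&hsv);
        return banded;
    }

A frame for which has_horizontal_band(frame, frame->width / 2, 40.0) returns 1 could simply be discarded and the next frame queried instead.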

I've used several cameras with OpenCV, most recently a Canon SLR (7D).


Comments (2)

染火枫林 2024-11-11 09:01:17


I don't think that your proposed solution will work. cvQueryFrame basically copies the next available frame from the camera driver's buffer (or advances a pointer in a memory-mapped region, or whatever your driver implementation does).

In any case, the timing of the cvQueryFrame call has no effect on when the image was captured.

So as you suggested, hardware sync is really the only route, unless you have a special camera, like a Point Grey camera, which gives you explicit software control of the frame integration start trigger.
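
One quick way to see this is to time how long each cvQueryFrame call blocks: if frames are simply being handed out of a driver-side buffer, the pacing you observe follows the driver's delivery rate rather than the moment you issue the call. A rough sketch, assuming a camera at index 0:

    #include <stdio.h>
    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main(void)
    {
        CvCapture *cap = cvCaptureFromCAM(0);    /* camera index 0 is an assumption */
        int i;
        if (!cap)
            return 1;

        for (i = 0; i < 30; i++) {
            double t0 = (double)cvGetTickCount();
            IplImage *frame = cvQueryFrame(cap); /* owned by the capture; don't free */
            double ms = ((double)cvGetTickCount() - t0)
                        / (cvGetTickFrequency() * 1000.0);
            if (!frame)
                break;
            /* The blocking time is governed by the driver's own frame delivery,
               not by when the call happened to be issued. */
            printf("frame %2d: cvQueryFrame blocked for %6.1f ms\n", i, ms);
        }

        cvReleaseCapture(&cap);
        return 0;
    }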

趁年轻赶紧闹 2024-11-11 09:01:17


I know this has nothing to do with synchronizing, but have you tried extending the exposure time? Or doing so by intentionally "blending" two or more images into one?
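
For the blending variant, an equal-weight cvAddWeighted of two consecutive frames is one way to soften the band. A rough sketch (the camera index and the 0.5/0.5 weights are arbitrary; averaging more frames would smear the band further):

    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main(void)
    {
        CvCapture *cap = cvCaptureFromCAM(0);    /* camera index 0 is an assumption */
        IplImage *frame, *first, *blended;
        if (!cap)
            return 1;

        /* cvQueryFrame returns an internal buffer, so the first frame must be
           copied before the second query overwrites it. */
        frame = cvQueryFrame(cap);
        if (!frame) { cvReleaseCapture(&cap); return 1; }
        first = cvCloneImage(frame);

        frame = cvQueryFrame(cap);
        if (!frame) { cvReleaseImage(&first); cvReleaseCapture(&cap); return 1; }

        blended = cvCreateImage(cvGetSize(frame), frame->depth, frame->nChannels);
        /* Equal-weight average of two consecutive exposures: each band only
           contributes at half strength, which softens the artifact. */
        cvAddWeighted(first, 0.5, frame, 0.5, 0.0, blended);

        cvNamedWindow("blended", CV_WINDOW_AUTOSIZE);
        cvShowImage("blended", blended);
        cvWaitKey(0);

        cvReleaseImage(&first);
        cvReleaseImage(&blended);
        cvReleaseCapture(&cap);
        return 0;
    }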
