Writing IIDC video packets over the FW bus
This is more of a general question than a specific one, but I'm looking for any input and advice.
I'm looking to spoof an IIDC camera: packetize uncompressed 2VUY video data and stream it over a FireWire/1394 port to another machine. I've been able to do something similar streaming DV packets, but using higher-level libraries (I'm on OS X, so I can use Apple's provided libraries and the sample code from the FW SDK as a basis*). I'm not much of a low-level programmer (more of a graphics/GL programmer), so all of this lower-level stuff is a bit new to me.
Why do I want to do this, and what's my goal? I want to spoof a camera so I can send video from applications over a DCAM/IIDC stream, in uncompressed 2vuy (4:2:2 Y'CbCr "component YUV"), from OpenGL to another computer, so it's seen as a valid video input / camera that the other machine can ingest and do things with. I'm a programmer and a VJ, I write open source video effects software for the Mac, and this could be a cheap, portable, and easy way to mix video between computers.**
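For reference on the pixel format: 2vuy is a byte-interleaved 4:2:2 layout in which each pair of horizontally adjacent pixels shares one Cb and one Cr sample, stored as Cb, Y0, Cr, Y1. Below is a minimal sketch of packing two RGB pixels into one such group, assuming BT.601 video-range coefficients; the rgb_pair_to_2vuy helper is just an illustration, not part of any SDK.

    #include <stdint.h>

    /* Pack two adjacent RGB pixels into one 4-byte 2vuy group (Cb Y0 Cr Y1).
       Assumes BT.601 video-range coefficients; averaging the two pixels'
       chroma stands in for proper 4:2:2 chroma filtering, and clamping is
       omitted for brevity. */
    static void rgb_pair_to_2vuy(const uint8_t rgb[6], uint8_t out[4])
    {
        float y[2], cb[2], cr[2];
        for (int i = 0; i < 2; i++) {
            float r = rgb[3 * i + 0], g = rgb[3 * i + 1], b = rgb[3 * i + 2];
            y[i]  =  16.0f + 0.257f * r + 0.504f * g + 0.098f * b;
            cb[i] = 128.0f - 0.148f * r - 0.291f * g + 0.439f * b;
            cr[i] = 128.0f + 0.439f * r - 0.368f * g - 0.071f * b;
        }
        out[0] = (uint8_t)((cb[0] + cb[1]) * 0.5f); /* Cb, shared by the pair */
        out[1] = (uint8_t)y[0];                     /* Y0 */
        out[2] = (uint8_t)((cr[0] + cr[1]) * 0.5f); /* Cr, shared by the pair */
        out[3] = (uint8_t)y[1];                     /* Y1 */
    }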
I've looked for examples of writing IIDC camera streams, but have found none. I've seen quite a few libraries for reading various IIDC camera inputs and getting a pixel/image buffer out of them, but I want to go the other direction. I'm curious if anyone has any information on how to go about it.
I know I could probably do a ton of work and essentially reverse-engineer something like libdc1394, but part of the issue would be writing the proper FireWire packets, advertising camera capabilities, etc., none of which those libraries do (to my knowledge). So I'm curious whether something exists out there that could help bootstrap this project.
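For context, this is roughly what the receiving end does with libdc1394 v2, and it is the behavior a spoofed camera would have to satisfy: answer the DCAM register reads behind dc1394_camera_new, accept the mode/framerate/transmission writes, and then put correctly timed isochronous packets on the bus. A sketch only, with error checking dropped:

    #include <dc1394/dc1394.h>

    /* Minimal DCAM receiver: a spoofed camera's advertised capabilities
       have to make this sequence of register reads/writes succeed. */
    int main(void)
    {
        dc1394_t *ctx = dc1394_new();
        dc1394camera_list_t *list;
        dc1394_camera_enumerate(ctx, &list);
        if (list->num == 0) return 1;

        dc1394camera_t *cam = dc1394_camera_new(ctx, list->ids[0].guid);
        dc1394_camera_free_list(list);

        dc1394_video_set_iso_speed(cam, DC1394_ISO_SPEED_400);
        dc1394_video_set_mode(cam, DC1394_VIDEO_MODE_640x480_YUV422); /* UYVY, i.e. 2vuy-style 4:2:2 */
        dc1394_video_set_framerate(cam, DC1394_FRAMERATE_30);
        dc1394_capture_setup(cam, 4, DC1394_CAPTURE_FLAGS_DEFAULT);
        dc1394_video_set_transmission(cam, DC1394_ON);

        dc1394video_frame_t *frame;
        dc1394_capture_dequeue(cam, DC1394_CAPTURE_POLICY_WAIT, &frame);
        /* frame->image now holds one 640x480 4:2:2 frame */
        dc1394_capture_enqueue(cam, frame);

        dc1394_video_set_transmission(cam, DC1394_OFF);
        dc1394_capture_stop(cam);
        dc1394_camera_free(cam);
        dc1394_free(ctx);
        return 0;
    }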
If anyone has any pointers, or is familiar with this sort of endeavor, I'd be super appreciative of any information. I did get a note from one person who does something like this as a debug setup to test their company's digitizers, but all of their code is proprietary and not available to the public :(
Thanks again for any info - super curious about this :)
*I was actually fortunate enough to go to WWDC this year and was able to ask the Apple FW team about this. I got odd looks, but also confirmation that it's possible, though it would be a total "roll it yourself" situation, with little in the available high-level SDKs to help.
** There are no truly cheap, portable, HD-capable video mixers or capture cards without issues for VJs. I'm aware of just about all of them, and they all have gotchas. While this is a software solution, and does have issues due to requiring OpenGL readback, it's doable and can be fast assuming you buffer PBO downloads (which, yes, adds latency, but it's worth it for the resolution and FPS); a sketch of that readback pattern is below.
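To illustrate the readback point, here is a minimal sketch of double-buffered PBO readback (desktop GL on OS X, using the BGRA fast path; the 2vuy conversion would then run over the mapped pointer before handing the frame to the FireWire side). The function names and the simple two-buffer scheme are just one way to do it:

    #include <OpenGL/gl.h>  /* OS X header; adjust for other platforms */

    static GLuint pbo[2];

    /* Two pixel-pack buffers: one frame downloads asynchronously while the
       previous frame is mapped and consumed, trading one frame of latency
       for a non-blocking glReadPixels. */
    static void init_readback(int width, int height)
    {
        glGenBuffers(2, pbo);
        for (int i = 0; i < 2; i++) {
            glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[i]);
            glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, NULL, GL_STREAM_READ);
        }
        glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    }

    /* Start an async download of the current frame into one PBO, then map
       the other PBO, which holds the frame requested on the previous call. */
    static void readback_frame(int width, int height, int frame_count)
    {
        int write_idx = frame_count % 2;
        int read_idx  = (frame_count + 1) % 2;

        glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[write_idx]);
        glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV, 0);

        if (frame_count > 0) {
            glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo[read_idx]);
            const void *pixels = glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY);
            if (pixels) {
                /* ... convert BGRA to 2vuy here and queue it for the FW sender ... */
                glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
            }
        }
        glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
    }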