Streaming video to a Raspberry Pi with hardware acceleration
Here's what I want to do: have a Raspberry Pi as a simple dedicated player of streamed video, like a kiosk. I have a small dedicated Ethernet network. On one node, the RPi is connected to an HDMI display. The stream sender is a PC running Ubuntu Linux. I want to stream a video file from there across the Ethernet and display it on the RPi. I've managed to set up a prototype connection with udpsrc and udpsink, but the CPU maxes out and I can't find a way to use the RPi's hardware decode and display. In theory it "should" be possible, because I can play a local file on the RPi with omxplayer. There are examples of similar things everywhere, but I can't get them to work. The most common use case is the RPi doing the sending, not the receiving.
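For reference, the kind of pipeline I mean looks roughly like this (a sketch, assuming GStreamer 1.0 on both ends and an MP4 file with an H.264 track; the file name, the address 192.168.1.20, and port 5000 are placeholders):

    # PC sender: demux the H.264 stream, packetize it as RTP, send over UDP
    gst-launch-1.0 filesrc location=video.mp4 ! qtdemux ! h264parse \
        ! rtph264pay config-interval=1 pt=96 \
        ! udpsink host=192.168.1.20 port=5000

    # RPi receiver: this works, but avdec_h264 is a software decoder,
    # which is what pins the CPU
    gst-launch-1.0 udpsrc port=5000 \
        caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" \
        ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink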
Does anyone have an example of a PC generating streamable video from a file and sending it over the network, with the RPi picking up that stream and displaying it using OMX acceleration? I could do a lot given an example!
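What I imagine the receiving end should look like, if the gstreamer1.0-omx package provides the omxh264dec element on the RPi (an untested sketch on my part; which sink element works best seems to vary by Raspbian release):

    # RPi receiver with hardware decode: omxh264dec is the OMX decoder
    # from gstreamer1.0-omx; if autovideosink picks the wrong sink,
    # eglglessink is the one usually suggested for the Pi
    gst-launch-1.0 udpsrc port=5000 \
        caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)H264,payload=(int)96" \
        ! rtph264depay ! h264parse ! omxh264dec ! autovideosink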
The RPi is a first-iteration board, a Model B, so it doesn't have the raw CPU capacity of the 3/4 models with their multiple cores and higher clock rates.
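An alternative that might sidestep GStreamer on the Pi entirely: omxplayer hands URLs to its built-in ffmpeg demuxer, so it may be able to read an MPEG-TS stream pushed over UDP from the PC (again a sketch; I'm assuming the udp:// URL handling and the --live flag behave as documented):

    # PC: remux the file to MPEG-TS and stream it over UDP; -c copy
    # avoids re-encoding, so the Pi receives the original H.264
    ffmpeg -re -i video.mp4 -c copy -f mpegts udp://192.168.1.20:5000

    # RPi: listen on the same port; omxplayer decodes H.264 in hardware
    # by default (--live is meant for low-latency live streams)
    omxplayer --live udp://0.0.0.0:5000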