RPi TCP video streaming with OpenCV and GStreamer using v4l2h264enc

Published 2025-01-19 13:43:59


I am trying to stream frames using OpenCV and GStreamer in Python. I'm on a 64-bit Bullseye Raspberry Pi 4.
This is the pipeline I am using on the Raspberry:

pipeline = 'appsrc ! "video/x-raw,framerate=25/1,format=BGR,width=640,height=480" ! ' \
           'queue ! v4l2h264enc ! "video/x-h264,level=(string)4" ! h264parse ! ' \
           'rtph264pay ! gdppay ! tcpserversink host=0.0.0.0 port=7000 '
cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, args.fps, (args.width, args.height))

There seems to be some problem with v4l2h264enc. Enabling GST_DEBUG=4 gives me:

0x3e39a00 ERROR           GST_PIPELINE gst/parse/grammar.y:1007:priv_gst_parse_yyparse: no source element for URI "/x-raw,framerate=25/1,format=BGR,width=640,height=480""
0:00:00.087855767 92892      0x3e39a00 ERROR           GST_PIPELINE gst/parse/grammar.y:1007:priv_gst_parse_yyparse: no source element for URI "/x-h264,level=(string)4""

These two errors look most important to me, but you can see the full log here.

Using a similar CLI pipeline, the stream connects just fine (except for some encoding grayness, which isn't the most critical issue for me right now).

# Stream
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    'video/x-raw,framerate=30/1,format=UYVY,width=1280,height=720' ! \
    v4l2h264enc ! 'video/x-h264,level=(string)4' ! h264parse ! \
    rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=7000
# Client
sudo gst-launch-1.0 -v tcpclientsrc host=<raspberry pi ip> port=7000 ! \
    gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! \
    autovideosink sync=false

With appsrc and opencv I also tried writing to a file without success.

The opencv library is compiled with GStreamer support. This is what I get from cv2.getBuildInformation():

Video I/O:
    DC1394:                      NO
    FFMPEG:                      YES
      avcodec:                   YES (58.91.100)
      avformat:                  YES (58.45.100)
      avutil:                    YES (56.51.100)
      swscale:                   YES (5.7.100)
      avresample:                NO
    GStreamer:                   YES (1.18.4)
    v4l/v4l2:                    YES (linux/videodev2.h)

Any help would be most welcome!


小梨窩很甜 2025-01-26 13:43:59


As mentioned by @SeB, BGR frames might not be supported by v4l2h264enc. This leads to the following error, which videoconvert fixes:

opencv/opencv/modules/videoio/src/cap_gstreamer.cpp (2293) writeFrame OpenCV | GStreamer warning: Error pushing buffer to GStreamer pipeline

But the main cause for the no source element for URI errors turned out to be the double quotes around video/x-raw and video/x-h264.

This is the final pipeline that works.

pipeline = 'appsrc ! videoconvert ! v4l2h264enc ! video/x-h264,level=(string)4 ! ' \
          'h264parse ! matroskamux ! tcpserversink host=0.0.0.0 port=7000 '

Also, as @SeB suggested, I used matroskamux instead of rtph264pay ! gdppay, since it gives better stream performance.
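For completeness, here is a minimal sketch of how that pipeline string can be built and passed to cv2.VideoWriter. The helper function, port value, and the commented-out writer call are illustrative assumptions, not from the original post; the key point is that the caps strings are written bare, with no extra double quotes:

```python
def build_pipeline(port=7000):
    # Caps strings are written bare; wrapping them in extra double
    # quotes is what caused the "no source element for URI" errors.
    return ('appsrc ! videoconvert ! v4l2h264enc ! '
            'video/x-h264,level=(string)4 ! '
            'h264parse ! matroskamux ! '
            f'tcpserversink host=0.0.0.0 port={port}')

# Hypothetical usage (requires OpenCV built with GStreamer support):
# import cv2
# writer = cv2.VideoWriter(build_pipeline(), cv2.CAP_GSTREAMER,
#                          0, 25, (640, 480))
# writer.write(frame)  # frame: 640x480 BGR numpy array
```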

人事已非 2025-01-26 13:43:59


Not sure this is the solution for your case, but the following may help:

  1. Don't use RTP for TCP streaming. AFAIK, RTP mostly relies on UDP packetization (although it is not impossible, as RTSP servers do when TCP transport is requested). You may just use a container such as flv, matroska or mpegts:
... ! h264parse ! matroskamux ! tcpserversink
... ! h264parse ! flvmux ! tcpserversink
... ! h264parse ! mpegtsmux ! tcpserversink

and adjust the receiver accordingly, e.g.:

tcpclientsrc ! matroskademux ! h264parse ! ...
tcpclientsrc ! flvdemux ! h264parse ! ...
tcpclientsrc ! tsdemux ! h264parse ! ...
  2. In the gst-launch case, you are receiving UYVY frames and sending them to the h264 encoder, while in the opencv case, you are getting BGR frames that may not be supported as input by the encoder. Just add the videoconvert plugin before the encoder.

  3. You may also set the h264 profile along with the level.
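The container choices above pair up one-to-one between sender and receiver; a small sketch (element names are the ones listed above, the host and port values are placeholders):

```python
# Muxer/demuxer pairs for the suggested containers.
CONTAINERS = {
    'matroska': ('matroskamux', 'matroskademux'),
    'flv':      ('flvmux', 'flvdemux'),
    'mpegts':   ('mpegtsmux', 'tsdemux'),
}

def sender_tail(container, port=7000):
    # Tail of the server pipeline, after the encoder.
    mux, _ = CONTAINERS[container]
    return f'h264parse ! {mux} ! tcpserversink host=0.0.0.0 port={port}'

def receiver_head(container, host='raspberrypi.local', port=7000):
    # Head of the client pipeline, before the decoder.
    _, demux = CONTAINERS[container]
    return f'tcpclientsrc host={host} port={port} ! {demux} ! h264parse'
```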
