GStreamer pipeline for streaming H264 to an NVR
I’m using this pipeline for streaming processed frames:
pipeline = Gst.parse_launch('appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert ! x264enc ! rtph264pay ! udpsink name=m_udpsink')
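The named elements (m_appsrc, m_capsfilter, m_udpsink) are configured separately; those settings are not shown in this post, so the sketch below only illustrates the idea, with placeholder caps and with port 5004 chosen to match the receiving pipeline further down:

# Sketch only: fetch the named elements and set their properties.
# The caps, host and port here are assumptions, not values from the question.
appsrc = pipeline.get_by_name('m_appsrc')
appsrc.set_property('caps', Gst.Caps.from_string(
    'video/x-raw,format=BGR,width=1280,height=720,framerate=30/1'))
udpsink = pipeline.get_by_name('m_udpsink')
udpsink.set_property('host', '127.0.0.1')
udpsink.set_property('port', 5004)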
I can capture the frames with an appsink:
cap = cv2.VideoCapture(
    'udpsrc port=5004 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264"'
    ' ! rtph264depay'
    ' ! avdec_h264'
    ' ! videoconvert'
    ' ! appsink', cv2.CAP_GSTREAMER)
But I want to receive the frames on an NVR, and I want to know the URL for the connection.
When I try to connect with OpenCV using the URL rtsp://127.0.0.1:5004:
cap = cv2.VideoCapture('rtsp://127.0.0.1:5004')
I get this error:
[tcp @ 0x2f0cf80] Connection to tcp://127.0.0.1:5004?timeout=0 failed: Connection refused
How can I find the url to connect to the stream?
Thank you in advance!
UPD: I'm trying to send and receive frames on the same Jetson Nano, but in different Docker containers (run with the flag --net=host).
I found an example for RTSP streaming, added lines 276-283 to my code (a sketch of that block is shown after this update), and ran the pipeline without errors. In the second container I run this script:
cap = cv2.VideoCapture('rtsp://localhost:8554/ds-test', cv2.CAP_FFMPEG)
if cap.isOpened():
    print('opened')
But the video does not open.
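The lines 276-283 referred to above are not reproduced here. Assuming the example is the DeepStream RTSP-out sample, that block typically wraps the UDP/RTP output in a GstRtspServer along these lines (a sketch with assumed names, the port and mount point matching the URLs used above):

import gi
gi.require_version('Gst', '1.0')
gi.require_version('GstRtspServer', '1.0')
from gi.repository import Gst, GstRtspServer, GLib

Gst.init(None)

# Sketch: expose the RTP stream that udpsink sends to port 5004 as an RTSP
# service, so a client can connect to rtsp://<host>:8554/ds-test.
server = GstRtspServer.RTSPServer.new()
server.props.service = '8554'
server.attach(None)

factory = GstRtspServer.RTSPMediaFactory.new()
factory.set_launch(
    '( udpsrc name=pay0 port=5004 buffer-size=524288 '
    'caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" )')
factory.set_shared(True)
server.get_mount_points().add_factory('/ds-test', factory)

# A GLib main loop must be running for the server to accept connections;
# in the real application the pipeline's existing loop would be reused.
GLib.MainLoop().run()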
1 Answer
You may be confusing RTP and RTSP. RTSP is basically an application-layer protocol for providing an SDP that gives the stream properties and for establishing a network transport link for RTP (usually over UDP, but TCP may also be used if requested with an rtspt: URL, by specifying the transport protocol to GStreamer's rtspsrc, or when going through networks that may prevent normal operation). I'm not sure about your case with Docker containers, but you may try to create an SDP file test.sdp with the following content:
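The SDP content itself was not carried over into this post; reconstructed from the description that follows, a minimal test.sdp would be something like:

v=0
o=- 0 0 IN IP4 127.0.0.1
s=H264 stream
c=IN IP4 127.0.0.1
t=0 0
m=video 5004 RTP/AVP 96
a=rtpmap:96 H264/90000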
This says that the RTP stream is video to be received on port 5004, on localhost over IPv4, where payload 96 carries H264-encoded video with a clock rate of 90000.
Then you may be able to receive it with an OpenCV VideoCapture using the FFMPEG backend:
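The exact call was likewise not carried over; assuming the test.sdp above, it would be along the lines of:

import os
import cv2

# FFmpeg may need the file/rtp/udp protocols whitelisted to read an SDP file;
# this is one way to pass that option to OpenCV's FFMPEG backend.
os.environ.setdefault('OPENCV_FFMPEG_CAPTURE_OPTIONS', 'protocol_whitelist;file,rtp,udp')

cap = cv2.VideoCapture('test.sdp', cv2.CAP_FFMPEG)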
similar to your GStreamer-backend OpenCV capture.
Be sure there is no firewall rule preventing UDP/5004 from being sent or received.
Also note that with such a simple SDP the receiver may not be able to get the resolution and other parameters, so it is better to have your sender periodically send its configuration in-band, for example:
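The closing snippet is missing here; with the sender pipeline from the question, the usual way is to set config-interval on rtph264pay so that SPS/PPS are re-sent in-band, roughly:

# config-interval=1 makes rtph264pay re-send SPS/PPS every second, so a
# receiver that joins late (or uses only a minimal SDP) can still decode.
pipeline = Gst.parse_launch(
    'appsrc name=m_appsrc ! capsfilter name=m_capsfilter ! videoconvert '
    '! x264enc ! rtph264pay config-interval=1 ! udpsink name=m_udpsink')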