Synchronizing a video stream with an eye tracker
I have only a little experience with streaming video.
I am currently working on the following task:
I need to synchronize streaming video (currently MPEG-TS + H.264) with data received from an eye tracker.
The eye tracker gives me a timestamp for each logged sample.
My idea is to get the time when each video frame was sent to the network and pick the eye-tracker record closest to it.
Is it possible to get this per-frame information using ffmpeg? I found information about the PTS and DTS of a frame, but those don't seem to be the same thing at all.
I also found that it is possible to get an NTP timestamp from the RTP protocol. Does that mean the information needs to be taken from the transport protocol?
What you are trying to do is similar to lip sync in RTCP.
When an RTCP Sender Report (SR) is received, its data is used to update the time-synchronization state: the SR pairs an RTP timestamp with an NTP (wall-clock) timestamp.
When you get an eye-tracker sample, you can take its incoming timestamp and compute the corresponding real-time timestamp for the video side, then match the two streams on the wall clock.
Of course, this assumes the eye-tracker samples come with a wall-clock timestamp, and that your receiver sees the RTCP SR records (or something similar enough) for the video.
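To illustrate, here is a minimal sketch of the mapping described above. It assumes you already have, from the most recent RTCP SR, the paired RTP timestamp and NTP time (converted to Unix time), plus a sorted list of eye-tracker sample times on the same wall clock; the function names are hypothetical, and 90000 Hz is the standard RTP clock rate for video.

```python
from bisect import bisect_left

RTP_CLOCK_HZ = 90000  # standard RTP clock rate for H.264 video

def ntp_to_unix(ntp_seconds, ntp_fraction):
    """Convert a 64-bit NTP timestamp (seconds since 1900 + 32-bit
    fraction) to Unix time in seconds."""
    NTP_UNIX_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01
    return ntp_seconds - NTP_UNIX_OFFSET + ntp_fraction / 2**32

def rtp_to_wallclock(rtp_ts, sr_rtp_ts, sr_unix_time):
    """Map a frame's RTP timestamp to wall-clock time using the most
    recent RTCP Sender Report, which pairs sr_rtp_ts with sr_unix_time."""
    delta = (rtp_ts - sr_rtp_ts) & 0xFFFFFFFF  # 32-bit wrap-around
    if delta > 0x80000000:                     # frame is before the SR
        delta -= 0x100000000
    return sr_unix_time + delta / RTP_CLOCK_HZ

def nearest_sample(sample_times, t):
    """Return the index of the eye-tracker sample (sorted wall-clock
    times) closest to wall-clock time t."""
    i = bisect_left(sample_times, t)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(sample_times)]
    return min(candidates, key=lambda j: abs(sample_times[j] - t))
```

For example, a frame whose RTP timestamp is 90000 ticks after the SR's maps to exactly one second after the SR's wall-clock time, and that time is then used to pick the closest eye-tracker record.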