In traditional live streaming it is commonplace to send per-frame picture timing information, typically using the H.264 picture timing SEI messages (ITU-T Rec. H.264, Annex D.2.3).
This timing information is later used to synchronize out-of-band events like ad insertion using SCTE 35 splice events (SCTE 35 2023r1, section 6.3). Per-frame picture timing is also widely used in video editing tools.
Implementation/spec-wise, we can use the abs-capture-time header extension, which is exposed as captureTimestamp on RTCRtpContributingSource in the WebRTC Extensions.
Retrieving the picture timing information of the rendered frame is quite clumsy at the moment, as it requires interpolating values from the RTP timestamp of the frame, which is the only timing value exposed by the APIs that surface frames.
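For illustration, a rough sketch of what that looks like today (assumptions: a receiving RTCRtpReceiver with abs-capture-time negotiated, the usual 90 kHz video clock, and recent DOM typings; captureTimestamp comes from the WebRTC Extensions spec and is not in the standard typings, hence the local interface):

```typescript
// Sketch only: estimate the capture time of the frame being rendered.
// Prefers captureTimestamp (abs-capture-time, WebRTC Extensions) and
// falls back to the bare RTP timestamp, which is all the current APIs
// guarantee.
const VIDEO_CLOCK_RATE = 90000; // assumed; read from the negotiated codec in real code

interface SyncSourceWithCapture extends RTCRtpSynchronizationSource {
  captureTimestamp?: number; // NTP-epoch ms, from the WebRTC Extensions
}

function watchRenderedFrames(video: HTMLVideoElement, receiver: RTCRtpReceiver): void {
  const onFrame = (_now: DOMHighResTimeStamp, metadata: VideoFrameCallbackMetadata) => {
    const [src] = receiver.getSynchronizationSources() as SyncSourceWithCapture[];
    if (src && metadata.rtpTimestamp !== undefined) {
      // Distance, in RTP ticks, between the last received packet and the
      // frame now on screen ("| 0" keeps 32-bit wraparound correct).
      const rtpDelta = (metadata.rtpTimestamp - src.rtpTimestamp) | 0;
      if (src.captureTimestamp !== undefined) {
        // Direct path: shift the reported capture time by the RTP distance.
        const captureMs = src.captureTimestamp + (rtpDelta / VIDEO_CLOCK_RATE) * 1000;
        console.log('estimated capture time (NTP ms):', captureMs);
      } else {
        // Clumsy status quo: only a relative RTP timeline is available.
        console.log('frame RTP timestamp:', metadata.rtpTimestamp);
      }
    }
    video.requestVideoFrameCallback(onFrame);
  };
  video.requestVideoFrameCallback(onFrame);
}
```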
An attempt to add the capture timestamp values is being made in different APIs:
Hi! This is a great idea and I see two questions here:
- how to get the original timestamp
- how to keep it across the whole video pipeline
Right now we are developing an end-to-end latency measurement system and looking for an approach that will let us enable it both in WebRTC-published environments and when OBS is publishing RTMP.
Later the frame must pass through transcoders, including third-party ones, without losing timecodes, so it looks like the best way is to put the capture timecode into an SEI message. But SEI picture timing seems to have only one-second resolution, and we need at least milliseconds.
So maybe a user data unregistered SEI (which I don't like).
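For the record, a rough sketch of what that SEI could look like (the UUID is a hypothetical app-defined value; a real deployment would pick its own; emulation prevention is applied so the NAL stays legal in an Annex B/AVCC stream):

```typescript
// Sketch: packing a millisecond capture timestamp into an H.264
// user_data_unregistered SEI NAL unit.
const TIMESTAMP_SEI_UUID = new Uint8Array([
  0x9a, 0x21, 0xf3, 0xbe, 0x31, 0x5c, 0x4d, 0x8a,
  0xa7, 0x7c, 0xd1, 0x6e, 0x2a, 0xf8, 0x44, 0x90,
]); // hypothetical UUID, not from any spec

function buildTimestampSei(captureTimeMs: bigint): Uint8Array {
  // SEI payload: 16-byte UUID + 8-byte big-endian millisecond timestamp.
  const payload = new Uint8Array(24);
  payload.set(TIMESTAMP_SEI_UUID, 0);
  new DataView(payload.buffer).setBigUint64(16, captureTimeMs);

  // RBSP: payload_type = 5 (user_data_unregistered), payload_size = 24,
  // payload bytes, then rbsp_trailing_bits (0x80).
  const rbsp = new Uint8Array([5, payload.length, ...payload, 0x80]);

  // Emulation prevention: 0x00 0x00 0x0[0-3] must become 0x00 0x00 0x03 0x0[0-3].
  const escaped: number[] = [];
  let zeros = 0;
  for (const b of rbsp) {
    if (zeros >= 2 && b <= 3) {
      escaped.push(3);
      zeros = 0;
    }
    escaped.push(b);
    zeros = b === 0 ? zeros + 1 : 0;
  }

  // NAL header: forbidden_zero_bit = 0, nal_ref_idc = 0, nal_unit_type = 6 (SEI).
  return new Uint8Array([0x06, ...escaped]);
}
```

Usage would be something like `buildTimestampSei(BigInt(Date.now()))`, with the resulting NAL attached to each access unit; how it gets injected (encoded insertable streams, the encoder, the muxer) depends on the pipeline.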
About the original timestamp: why not just take the RTP timestamp? It is very precise.
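For reference, turning an RTP timestamp into wall clock still needs an anchor, e.g. the RTP/NTP pair from the last RTCP Sender Report; a rough sketch (hypothetical names; the 90 kHz default should come from the negotiated codec):

```typescript
// Sketch: anchoring an RTP timestamp to wall clock using the RTP/NTP
// pair carried in the last RTCP Sender Report.
function rtpToWallClockMs(
  rtpTimestamp: number,    // RTP timestamp of the frame of interest
  srRtpTimestamp: number,  // RTP timestamp carried in the last SR
  srNtpMs: number,         // NTP time of that SR, converted to ms
  clockRate = 90000,
): number {
  // Signed 32-bit difference survives a single RTP timestamp wraparound.
  const rtpDelta = (rtpTimestamp - srRtpTimestamp) | 0;
  return srNtpMs + (rtpDelta / clockRate) * 1000;
}
```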
Btw, @murillo128, are you sure that the pic timing SEI is a good place for keeping the real-time timestamp of a frame? I'm looking at our Redmine and it seems that this mechanism may be tied to the HRD buffer.