I am creating an application for viewing camera feeds that builds an ABR ladder for the requested resolution dynamically. The following is the static pipeline I have:
gst-launch-1.0 hlssink2 name=ingest1 playlist-location=live/stream.m3u8 location=live/segment%d.ts max-files=10 target-duration=2 \
rtspsrc location="rtsp://myrtspcamurl:554/" name=rtspsrc0 \
rtspsrc0. ! rtph264depay ! tee name=t \
t. ! queue ! ingest1.video \
t. ! queue ! decodebin ! tee name=decode_t \
decode_t. ! queue ! videorate ! videoscale ! video/x-raw,framerate=2/1,width=240,height=240 ! videoconvert ! jpegenc ! multifilesink location=stills/stills%d.jpg \
decode_t. ! queue ! videoscale ! video/x-raw,width=640,height=360 ! openh264enc ! hlssink2 location=360/segment%d.ts playlist-location=360/stream.m3u8
I am trying to programmatically add a dynamic encoder branch like the following:
decode_t. ! queue ! videoscale ! video/x-raw,width=854,height=480 ! openh264enc ! hlssink2 location=480/segment%d.ts playlist-location=480/stream.m3u8
To add this branch, I request a new src pad from the decode_t tee and set a BLOCK_DOWNSTREAM pad probe on it before linking. It all works fine.
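For context, this is roughly what the attach step looks like with gstreamer-rs. It is a minimal sketch, not the exact code from my repo: attach_encoder_branch is a made-up name, and the encoder bin (e.g. built by create_encoder_bin()) is assumed to expose a ghost "sink" pad.

use gstreamer as gst;
use gst::prelude::*;

fn attach_encoder_branch(
    pipeline: &gst::Pipeline,
    decode_t: &gst::Element,
    encoder_bin: &gst::Bin, // e.g. the 854x480 bin built by create_encoder_bin()
) -> Result<(), Box<dyn std::error::Error>> {
    // Request a fresh src pad on the tee and block it so nothing flows
    // into the half-linked branch while it is being set up.
    let tee_src = decode_t
        .request_pad_simple("src_%u")
        .ok_or("no more tee src pads")?;
    let probe_id = tee_src
        .add_probe(gst::PadProbeType::BLOCK_DOWNSTREAM, |_, _| {
            gst::PadProbeReturn::Ok
        })
        .ok_or("failed to install blocking probe")?;

    // Add the new encoder bin, bring it up to the pipeline's state, and link it in.
    pipeline.add(encoder_bin)?;
    encoder_bin.sync_state_with_parent()?;
    let sink_pad = encoder_bin
        .static_pad("sink")
        .ok_or("encoder bin has no sink pad")?;
    tee_src
        .link(&sink_pad)
        .map_err(|e| format!("pad link failed: {:?}", e))?;

    // Unblock: data now flows into the new branch.
    tee_src.remove_probe(probe_id);
    Ok(())
}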
The issue I am having is: when the pipeline has been running for hours ingesting video and I then start a new encoder branch, it blocks the existing 360p ladder until the segment count of the new hlssink2 (480p) catches up to the 360p one. Hence the stream.m3u8 playlists of both streams are unusable until then. It almost seems as if it is trying to synchronize to the pipeline's running time.
Is there a workaround for this? I have a working version of this in Rust as well on GitHub -> https://github.com/ggovindan/abr_ladder
I really appreciate any help with this!
CodePudding user response:
The videorate you have in each of the HLS output bins will always start outputting from running time 0 by default. If you do

videorate.set_property("skip-to-first", true);

in create_encoder_bin(), then this should behave as expected.
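In gst-launch syntax the equivalent is videorate skip-to-first=true. As a rough sketch of where that property would go (the element layout and the launch string are assumptions, not copied from your repo), the relevant part of a create_encoder_bin()-style helper could look like:

use gstreamer as gst;
use gst::prelude::*;

// Hypothetical helper: builds one ladder rung as a bin with a ghosted "sink" pad.
fn create_encoder_bin(width: u32, height: u32, out_dir: &str) -> Result<gst::Bin, gst::glib::Error> {
    let desc = format!(
        "queue ! videorate skip-to-first=true ! videoscale \
         ! video/x-raw,width={w},height={h} ! openh264enc \
         ! hlssink2 location={d}/segment%d.ts playlist-location={d}/stream.m3u8",
        w = width, h = height, d = out_dir
    );
    // `true` ghost-pads the unlinked queue sink so the bin exposes a "sink" pad.
    gst::parse_bin_from_description(&desc, true)
}

With skip-to-first set, the videorate in a newly added rung starts from the first buffer it receives instead of filling in frames back to running time 0, which should avoid the catch-up/blocking behaviour described above.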