RPi tcp video streaming with opencv and gstreamer using v4l2h264enc


I am trying to stream frames using OpenCV and GStreamer in Python. I'm on a 64-bit Bullseye Raspberry Pi 4. This is the pipeline I am using on the Raspberry:

pipeline = 'appsrc ! "video/x-raw,framerate=25/1,format=BGR,width=640,height=480" ! ' \
           'queue ! v4l2h264enc ! "video/x-h264,level=(string)4" ! h264parse ! ' \
           'rtph264pay ! gdppay ! tcpserversink host=0.0.0.0 port=7000 '
cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, args.fps, (args.width, args.height))
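
The writer is opened and frames are pushed roughly like this (a simplified sketch of my script; the camera index and the args values are placeholders):

import cv2

cap = cv2.VideoCapture(0)  # placeholder frame source
out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, args.fps, (args.width, args.height))

while cap.isOpened():
    ret, frame = cap.read()  # frames come out as BGR, matching the caps above
    if not ret:
        break
    out.write(frame)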

There seems to be some problem with v4l2h264enc. Enabling GST_DEBUG=4 gives me:

0x3e39a00 ERROR           GST_PIPELINE gst/parse/grammar.y:1007:priv_gst_parse_yyparse: no source element for URI "/x-raw,framerate=25/1,format=BGR,width=640,height=480""
0:00:00.087855767 92892      0x3e39a00 ERROR           GST_PIPELINE gst/parse/grammar.y:1007:priv_gst_parse_yyparse: no source element for URI "/x-h264,level=(string)4""

These two errors look most important to me, but you can see the full log here.

Using a similar CLI pipeline, the stream connects just fine (except for some encoding grayness, which isn't critical to me right now).

# Stream
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    'video/x-raw,framerate=30/1,format=UYVY,width=1280,height=720' ! \
    v4l2h264enc ! 'video/x-h264,level=(string)4' ! h264parse ! \
    rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=7000
# Client
sudo gst-launch-1.0 -v tcpclientsrc host=<raspberry ip> port=7000 ! \
    gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! \
    autovideosink sync=false

With appsrc and OpenCV I also tried writing to a file, without success.

The OpenCV library is compiled with GStreamer support. This is what I get from cv2.getBuildInformation():

Video I/O:
    DC1394:                      NO
    FFMPEG:                      YES
      avcodec:                   YES (58.91.100)
      avformat:                  YES (58.45.100)
      avutil:                    YES (56.51.100)
      swscale:                   YES (5.7.100)
      avresample:                NO
    GStreamer:                   YES (1.18.4)
    v4l/v4l2:                    YES (linux/videodev2.h)

Any help would be most welcome!

CodePudding user response:

Not sure this is the solution for your case, but the following may help:

  1. Don't use RTP for TCP streaming. AFAIK, RTP mostly relies on UDP packetization (although TCP is not impossible, as done by RTSP servers when TCP transport is requested). You may just use a container such as flv, matroska or mpegts:
... ! h264parse ! matroskamux ! tcpserversink
... ! h264parse ! flvmux ! tcpserversink
... ! h264parse ! mpegtsmux ! tcpserversink

and adjust the receiver accordingly:

tcpclientsrc ! matroskademux ! h264parse ! ...
tcpclientsrc ! flvdemux ! h264parse ! ...
tcpclientsrc ! tsdemux ! h264parse ! ...
  2. In the gst-launch case you are receiving UYVY frames and sending them to the H264 encoder, while in the OpenCV case you are getting BGR frames that may not be supported as input by the encoder. Just add a videoconvert plugin before the encoder.

  3. You may also set the H264 profile together with the level (see the combined sender sketch below).
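
Putting the container and profile suggestions together, the sender side could look like this (a sketch adapted from the question's working CLI pipeline; the main profile is just an example value):

# Stream (sketch: container muxing over TCP, profile set together with level)
gst-launch-1.0 v4l2src device=/dev/video0 ! \
    'video/x-raw,framerate=30/1,format=UYVY,width=1280,height=720' ! \
    v4l2h264enc ! 'video/x-h264,profile=(string)main,level=(string)4' ! \
    h264parse ! matroskamux ! tcpserversink host=0.0.0.0 port=7000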

CodePudding user response:

As mentioned by @SeB, BGR frames might not be supported by v4l2h264enc, which leads to this error that videoconvert fixes:

opencv/opencv/modules/videoio/src/cap_gstreamer.cpp (2293) writeFrame OpenCV | GStreamer warning: Error pushing buffer to GStreamer pipeline

But the main cause for the no source element for URI errors turned out to be the double quotes around video/x-raw and video/x-h264.

This is the final pipeline that works:

pipeline = 'appsrc ! videoconvert ! v4l2h264enc ! video/x-h264,level=(string)4 ! ' \
          'h264parse ! matroskamux ! tcpserversink host=0.0.0.0 port=7000 '
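
For completeness, opening the writer and pushing frames looks roughly like this (a minimal sketch; the camera source and the 25 fps / 640x480 values are assumptions matching the caps from the question):

import cv2

fps, width, height = 25, 640, 480  # assumed to match the original appsrc caps
pipeline = 'appsrc ! videoconvert ! v4l2h264enc ! video/x-h264,level=(string)4 ! ' \
           'h264parse ! matroskamux ! tcpserversink host=0.0.0.0 port=7000 '

cap = cv2.VideoCapture(0)  # assumed camera source
out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, fps, (width, height))

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    out.write(cv2.resize(frame, (width, height)))  # BGR frames; videoconvert handles the format

cap.release()
out.release()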

As @SeB suggested, I also switched to matroskamux instead of rtph264pay ! gdppay, since it gives better stream performance.
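
On the client side, a matching playback pipeline looks like this (a sketch mirroring the original client command, with the RTP/GDP elements replaced by matroskademux; replace the host placeholder with the Raspberry's IP):

gst-launch-1.0 -v tcpclientsrc host=<raspberry ip> port=7000 ! \
    matroskademux ! h264parse ! avdec_h264 ! videoconvert ! \
    autovideosink sync=false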
