How to display GStreamer's udpsrc video (WPF)


I would like to use GStreamer to display udpsrc video in a WPF application, but it does not work.
On the command line the same pipeline works fine.
Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(msg) always returns false.
Please help me...

send command:

gst-launch-1.0.exe autovideosrc ! videoconvert ! vp8enc deadline=1 ! rtpvp8pay pt=96 ! udpsink port=5200 host=<MY-LOCALIP> async=false

recv command:

gst-launch-1.0.exe udpsrc port=5200 caps="application/x-rtp,payload=(int)96" ! rtpjitterbuffer latency=10 ! rtpvp8depay ! avdec_vp8 output-corrupt=false ! videoconvert ! autovideosink
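
(For reference, the working receive command can also be reproduced almost verbatim in gstreamer-sharp with Gst.Parse.Launch; a minimal sketch, where Parse.Launch and Application.Init come from the GstSharp bindings, and autovideosink opens its own window here instead of rendering into the WPF window:)

using Gst;

// Minimal sketch: build the same receive pipeline from a launch string.
Gst.Application.Init();
var recvPipeline = (Pipeline)Parse.Launch(
    "udpsrc port=5200 caps=\"application/x-rtp,payload=(int)96\" ! " +
    "rtpjitterbuffer latency=10 ! rtpvp8depay ! avdec_vp8 output-corrupt=false ! " +
    "videoconvert ! autovideosink");
recvPipeline.SetState(State.Playing);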

csharp program:

private void CreatePipeLine()
{
        if(pipeline != null)
        {
            pipeline.SetState(State.Null);
            islive = false;
            return;
        }
    
        pipeline = new Pipeline("pipeline");//"playback");
    
        var udpsrc = Gst.ElementFactory.Make("udpsrc", "source");
        udpsrc["port"] = 5200;
        //udpsrc["caps"] = "application/x-rtp,payload=96";
    
        var jitter = Gst.ElementFactory.Make("rtpjitterbuffer", "jitter");
        jitter["latency"] = 10;
    
        var depay = Gst.ElementFactory.Make("rtpvp8depay", "depay");
        var avdec = Gst.ElementFactory.Make("avdec_vp8", "avdec");
        avdec["output-corrupt"] = false;
    
        var convert = Gst.ElementFactory.Make("videoconvert", "convert");
    
        var capsRTP = Gst.Global.CapsFromString("application/x-rtp,payload=96");
        var mAppSink = new AppSink("appsink");
        mAppSink["emit-signals"] = true;
        mAppSink["caps"] = capsRTP;
        mAppSink.NewSample += OnNewSample;
    
        ink = Gst.ElementFactory.Make("autovideosink", "ink");
    
        pipeline.Add(udpsrc, jitter, depay, avdec, convert, mAppSink); //ink);
    
             
        udpsrc.Link(jitter);
        jitter.Link(depay);
        depay.Link(avdec);
        avdec.Link(convert);
        convert.Link(ink);
    
        Bus bus = pipeline.Bus;
        bus.AddSignalWatch();
        bus.Message += OnBusMessage;
    
        bus.EnableSyncMessageEmission();
        bus.SyncMessage += new SyncMessageHandler(OnBusSynMessage);
    
        /*
        VideoSink = new AppSink(ink.Handle);
        VideoSink.Drop = true;
        VideoSink.MaxLateness = (1000 / 30) * 1000000;
        VideoSink.MaxBuffers = 1;
        VideoSink.Qos = true;
        VideoSink.EnableLastSample = false;
        VideoSink.Caps = Gst.Caps.FromString("video/x-raw,format=RGBA");
    
        pipeline.Bus.EnableSyncMessageEmission();
        pipeline.Bus.AddSignalWatch();
        pipeline.Bus.SyncMessage += OnBusSynMessage;
        pipeline.Bus.Message += OnBusMessage;
        */
    
        pipeline.SetState(State.Null);
        var ret = pipeline.SetState(State.Ready);
}
private void OnBusSynMessage(object o, SyncMessageArgs sargs)
{
    Gst.Message msg = sargs.Message;
    if (!Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(msg)) { return;}

    Element src = msg.Src as Element;
    if(src == null) { return;}
    try { src["force-aspect-ratio"] = true;}
    catch(PropertyNotFoundException) {  }

    Element overlay = ((Gst.Bin)src).GetByInterface(VideoOverlayAdapter.GType);
    if(overlay == null) 
    {
        System.Diagnostics.Debug.WriteLine($"Overlay is null");
        return;
    }

    _adapter = new VideoOverlayAdapter(overlay.Handle);
    _adapter.WindowHandle = _windowHandle;
    _adapter.HandleEvents(true);
    _isRender = true;

}
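
(For context, _windowHandle is the native HWND of my WPF window. A minimal sketch of how such a handle is obtained; the use of WindowInteropHelper on the top-level window is just illustrative, and a dedicated child HWND, e.g. via HwndHost, is the more common approach:)

using System;
using System.Windows;
using System.Windows.Interop;

public partial class MainWindow : Window
{
    private IntPtr _windowHandle;

    protected override void OnSourceInitialized(EventArgs e)
    {
        base.OnSourceInitialized(e);
        // The HWND only exists once the window source has been initialized.
        _windowHandle = new WindowInteropHelper(this).Handle;
    }
}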

I also tried pulling a sample this way during playback, but "sample" was always null:

var sink = VideoSink;
if(sink == null) { return; }

Sample sample = sink.TryPullSample(5000);
if(sample == null) 
{
    return; 
}
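
(For reference, with emit-signals set to true the appsink is normally consumed through the NewSample event rather than by polling; a minimal sketch of the OnNewSample handler wired up above, assuming the appsink is linked at the end of the decoding pipeline and negotiates raw video:)

using Gst;
using Gst.App;

// Minimal sketch: pull and map each decoded sample delivered by the appsink.
private void OnNewSample(object o, NewSampleArgs args)
{
    var sink = (AppSink)o;
    Sample sample = sink.PullSample();
    if (sample == null)
        return;

    var buffer = sample.Buffer;
    if (buffer.Map(out MapInfo info, MapFlags.Read))
    {
        // The raw frame bytes are now accessible through 'info';
        // copy them into a WriteableBitmap (or similar) on the UI thread.
        buffer.Unmap(info);
    }
    sample.Dispose();
}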

Reference Site:
https://github.com/vladkol/gstreamer-netcore/tree/master/samples/AvaloniaPlayer
how to use application/x-rtp binding with gstreamer-sharp?

I've tried moving the caps onto udpsrc, I have tried autovideosink instead of the appsink, etc.

CodePudding user response:

Gst.Video.Global.IsVideoOverlayPrepareWindowHandleMessage(msg)

will only return true if your pipeline contains a video sink that implements the GstVideoOverlay interface, such as autovideosink or glimagesink. The prepare-window-handle message is only posted by such window-based sinks when they need a window to render into (see the implemented interfaces at https://gstreamer.freedesktop.org/documentation/opengl/glimagesink.html?gi-language=c#hierarchy).

Also note that CreatePipeLine adds mAppSink to the pipeline while convert is linked to ink, so the video sink never actually ends up in the pipeline. Add the elements you link:

pipeline.Add(udpsrc, jitter, depay, avdec, convert, ink);

Why not set your pipeline to Playing instead of Ready?

    var ret = pipeline.SetState(State.Playing);
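
(Putting these points together, the tail of CreatePipeLine could look roughly like this; a sketch only, reusing the question's element variables and event handlers:)

// Add the video sink you actually link, subscribe to the bus with +=,
// and go straight to Playing so the prepare-window-handle message is posted.
pipeline.Add(udpsrc, jitter, depay, avdec, convert, ink);

udpsrc.Link(jitter);
jitter.Link(depay);
depay.Link(avdec);
avdec.Link(convert);
convert.Link(ink);

Bus bus = pipeline.Bus;
bus.AddSignalWatch();
bus.EnableSyncMessageEmission();
bus.Message += OnBusMessage;
bus.SyncMessage += OnBusSynMessage;  // sets VideoOverlayAdapter.WindowHandle

var ret = pipeline.SetState(State.Playing);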