I am trying to stream video with GStreamer on Ubuntu 14.04, but the receiver side does not display the video correctly. I have a sender pipeline that sends MJPEG images, which I start as follows:
gst-launch-1.0 -v videotestsrc ! jpegenc ! rtpjpegpay ! udpsink host=127.0.0.1 port=5200
Its output is:
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:sink: caps = video/x-raw, width=(int)320, height=(int)240, framerate=(fraction)30/1, format=(string)I420, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstJpegEnc:jpegenc0.GstPad:src: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96, ssrc=(uint)1970481773, timestamp-offset=(uint)1012832172, seqnum-offset=(uint)21614
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, sof-marker=(int)0, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 1012832172
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 21614
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
The receiver pipeline is:
gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! xvimagesink
Note that I copied the caps from the sender's output.
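For readability, this is the same receiver command with the caps string pulled into a shell variable (purely a restatement of the command above, assuming bash-style quoting); I deliberately left out the stream-specific ssrc, timestamp-offset and seqnum-offset fields that appear in the sender's udpsink caps:

# caps copied from the sender's udpsink pad, minus the per-stream
# ssrc / timestamp-offset / seqnum-offset fields
CAPS="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96"
gst-launch-1.0 udpsrc port=5200 caps="$CAPS" ! rtpjpegdepay ! jpegdec ! xvimagesink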
However, I keep getting a "could not send sticky events" error and the receiver terminates immediately. What am I doing wrong? My output is as follows:
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
0:00:00.016961467 12229 0x266a400 WARN GST_PADS gstpad.c:3669:gst_pad_peer_query:<jpegdec0:src> could not send sticky events
0:00:00.017297845 12229 0x266a400 WARN basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: Internal data flow error.
0:00:00.017306308 12229 0x266a400 WARN basesrc gstbasesrc.c:2865:gst_base_src_loop:<udpsrc0> error: streaming task paused, reason not-negotiated (-4)
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.000651039
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I have also tried autovideosink and ximagesink, with the same result.
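For completeness, the autovideosink variant I tried looked like this (same caps string, only the sink element swapped; the ximagesink attempt was analogous):

gst-launch-1.0 udpsrc port=5200 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.000000, a-framesize=(string)320-240, payload=(int)96" ! rtpjpegdepay ! jpegdec ! autovideosink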
Thanks