[[PageOutline]]

[=#streaming]
= Streaming =
GStreamer has elements that allow for network streaming. For example, [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server] is an example application that uses the gst-rtsp-server library to create an RTSP stream. However, creating a GStreamer application is not the only way to create a network stream: simple gst-launch pipelines can accomplish this as well, which is often useful for testing purposes.

The following examples are based on GStreamer-1.0 using [wiki:Yocto/gstreamer#gstreamer-imx gstreamer-imx] plugins.

There are several ways to accomplish networked streaming over Internet Protocol (IP):
 * [#udp Raw UDP/IP]
 * [#tcp Raw TCP/IP]
 * [#rtp Real-time Transport Protocol (RTP)]
 * [#rtsp Real Time Streaming Protocol (RTSP)] ('''recommended''')
 * [#abs Adaptive Bitrate Streaming]

To see older/deprecated information, please see [http://trac.gateworks.com/wiki/Yocto/gstreamer/streaming?version=1 this older revision page].

[=#udp]
== Raw UDP ==
Using UDP/IP is the simplest mechanism for streaming and utilizes the least amount of bandwidth. Because UDP does not provide any error detection, packet ordering, or error correction, the bitrate is deterministic and is simply the bitrate of the media you are streaming.

The limitations of raw UDP are:
 * requires a codec that can handle missing/corrupt data (most do these days)
 * does not use headers containing payload type or timestamp info on the stream (making it suitable for only a single type of media, or a pre-muxed type of media)
 * does not fragment packets - it will try to send a raw UDP packet for whatever size buffer the udpsink is passed (which can lead to pipeline errors). To fragment packets, use RTP

The main benefit of using raw UDP is that it is the simplest pipeline you can create for streaming and requires the least amount of dependencies (though you may run into any or all of the above problems).

'''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

The {{{udpsrc}}} element can be used to render/save a stream originated from a {{{udpsink}}} pipeline.

Examples:
 * encode and send H264 video from Ventana:
   1. Start decoder first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 udpsrc port=9001 ! h264parse ! imxvpudec ! imxipuvideosink sync=false
}}}
   2. Start encoder second:
{{{
#!bash
ifconfig eth0 192.168.1.2
gst-launch-1.0 videotestsrc is-live=true ! imxvpuenc_h264 bitrate=1000 ! udpsink host=192.168.1.1 port=9001
}}}

Notes:
 * On the client (stream receiver and renderer) you must use the {{{sync=false}}} property to render frames as they are received, otherwise the stream will stall because there are no headers containing timestamps
 * the decoder (udpsrc) needs to be started first because udpsink will fail if nothing is listening on the socket
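If you want to save a received raw UDP stream instead of rendering it, a minimal sketch like the following should work (assuming matroskamux is present in your image; the output path is just an example):
{{{
#!bash
# receive raw H264 over UDP and mux it into a Matroska file without re-encoding
gst-launch-1.0 -e udpsrc port=9001 ! h264parse ! matroskamux ! filesink location=/tmp/capture.mkv
}}}
The {{{-e}}} flag makes gst-launch send an EOS on Ctrl-C so the muxer can finalize the file.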
[=#tcp]
== TCP ==
Using TCP/IP brings error detection, packet re-ordering, and error correction to the network stream. However, this causes the bitrate to be non-deterministic because as the error rate increases, so do the bitrate and latency.

The limitations of using TCP:
 * non-deterministic bitrate
 * added latency
 * does not use headers containing payload type or timestamp info on the stream (making it suitable for only a single type of media, or a pre-muxed type of media)

'''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

TCP/IP introduces the concept of a socket connection, therefore there must exist a server and a client, and the server must be started first to listen for a connection. You can use a server sink or a server source. The {{{tcpserversrc}}} source can be used to create a TCP server that waits for a connection from a {{{tcpclientsink}}} to render/save. Alternatively, the {{{tcpserversink}}} sink can be used to create a TCP server that waits for a connection from a {{{tcpclientsrc}}} that will send data (see the sketch after this example).

Examples:
 * encode and send H264 video from Ventana with '''decoder as server''':
   1. Start decoder (server) first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 tcpserversrc host=192.168.1.1 port=9001 ! decodebin ! autovideosink sync=false
}}}
   2. Start encoder (client) second:
{{{
#!bash
ifconfig eth0 192.168.1.2
gst-launch-1.0 videotestsrc is-live=true ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! tcpclientsink host=192.168.1.1 port=9001
}}}

Notes:
 * TCP is connection oriented, therefore the TCP 'server' must be started first. You can choose your elements such that the stream originator is the server or the stream renderer is the server, however doing so can be problematic for certain codecs because the client decoding the stream may pick up the stream somewhere in the middle and not know how to parse it.
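For the alternate arrangement with the '''encoder as server''', a sketch based on the element descriptions above (the element roles simply swap; the mid-stream pickup caveat above applies to late-joining clients):
{{{
#!bash
# encoder (server) - start first
ifconfig eth0 192.168.1.2
gst-launch-1.0 videotestsrc is-live=true ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! tcpserversink host=192.168.1.2 port=9001
}}}
{{{
#!bash
# decoder (client) - connect second
ifconfig eth0 192.168.1.1
gst-launch-1.0 tcpclientsrc host=192.168.1.2 port=9001 ! decodebin ! autovideosink sync=false
}}}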
[=#rtp]
== RTP (raw/session-less) ==
The [https://en.wikipedia.org/wiki/Real-time_Transport_Protocol Real-time Transport Protocol (RTP)] is a network protocol for delivering audio and video over IP networks. RTP is used extensively in communication and entertainment systems that involve streaming media, such as telephony, video teleconference applications, television services, and web-based push-to-talk features. The RTP packet type encapsulates multimedia data with a payload type and timestamp and therefore can be used to compensate for jitter, out-of-sequence packets, and time synchronization between streams of different types (ie audio/video lip-sync). RTP is typically used in conjunction with other protocols such as the RTP Control Protocol (RTCP) and the [#rtsp Real Time Streaming Protocol (RTSP)] to manage stream sessions, however it can be used on its own in a raw session-less fashion using the {{{udpsink}}} and {{{udpsrc}}} elements.

The limitations of using raw/session-less RTP:
 * session management needs to be handled manually (a capsfilter is needed to specify the stream format)

'''Note that it is recommended that you use [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

=== Example 1 ===
 * Encode and send H264 video from Ventana:
   1. Start decoder first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 udpsrc port=9001 \
 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \
 ! decodebin ! autovideosink
}}}
   2. Start encoder second:
{{{
#!bash
gst-launch-1.0 videotestsrc is-live=true \
 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay ! udpsink host=192.168.1.1 port=9001
}}}

Notes:
 * when using RTP a capsfilter must be used on the receiver to specify the payload as application/x-rtp, as above. You can determine the required capsfilter by starting the encoder with the verbose flag {{{-v}}} and looking for {{{caps = "application/x-rtp"}}} in the output

=== Example 2 ===
 * Encode and send H264 video from Ventana to a PC with VLC:
   1. Start decoder first:
     a. Create an SDP file like below (the IP address in the example is that of the Ventana board)
{{{
v=0
m=video 5000 RTP/AVP 96
c=IN IP4 172.24.20.207
a=rtpmap:96 H264/90000
}}}
     b. Open the SDP file in VLC
   2. Start encoder (Ventana) second (the IP address in the example below is the IP of the PC):
{{{
gst-launch-1.0 videotestsrc ! imxipuvideotransform ! imxvpuenc_h264 ! rtph264pay config-interval=3 ! udpsink host=172.24.20.26 port=5000
}}}
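If you prefer a GStreamer client on the PC instead of VLC for Example 2, a sketch along these lines should work (this assumes a desktop GStreamer install with avdec_h264 from gst-libav; on another Ventana board substitute imxvpudec):
{{{
#!bash
# receive, depayload, parse, and decode the RTP/H264 stream on the PC
gst-launch-1.0 udpsrc port=5000 \
 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \
 ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink
}}}
Unlike decodebin in Example 1, this spells out the depayloader ({{{rtph264depay}}}) explicitly.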
[=#rtsp]
== RTSP (Real Time Streaming Protocol) '''(recommended)''' ==
The [https://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol Real Time Streaming Protocol (RTSP)] is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. The protocol is used for establishing and controlling media sessions between end points. Clients of media servers issue VCR-style commands, such as play and pause, to facilitate real-time control of playback of media files from the server. This protocol uses the Real-time Transport Protocol (RTP) in conjunction with the Real-time Control Protocol (RTCP) for media stream delivery.

The limitations of using RTSP are:
 * there is no way to create an RTSP server with a simple gst-launch pipeline - you must create or use an existing GStreamer-based application (keep reading below).

Creating an RTSP server is not possible via a simple pipeline to gst-launch, however GStreamer libraries do exist to make writing an RTSP server trivial. The source for gst-rtsp-server contains an example application [http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-launch.c test-launch.c] which provides a simple example that can take a GStreamer 'bin' element consisting of everything but the sink element and serve it via RTSP.

An extension of the gst-rtsp-server test-launch application, [https://github.com/Gateworks/gst-gateworks-apps/blob/master/src/gst-variable-rtsp-server.c gst-variable-rtsp-server], is included on our [wiki:Yocto Yocto BSP] images and will set up an RTSP server, encode the video stream to H264, and allow multiple clients to connect to it. The enhancements made to gst-variable-rtsp-server include a mechanism for auto-adjusting the encoding bitrate depending on the number of clients connected, in addition to serving as a fairly simple example of how to write a GStreamer application.

Notes:
 * refer to [wiki:Yocto/gstreamer#gst-variable-rtsp-server here] for more info on gst-variable-rtsp-server.
 * refer to the [wiki:Yocto/gstreamer/video gstreamer/video] and [wiki:Yocto/gstreamer/audio gstreamer/audio] pages to understand how to first capture [wiki:Yocto/gstreamer/video video] and [wiki:Yocto/gstreamer/audio audio] sources.
 * refer to [wiki:ventana/audio ventana/audio] and [wiki:Yocto/Video_In Yocto/Video_In] for more info on Ventana audio input and video input devices.
 * RTP streams must use a payloader element appropriate for the media type of each stream. Additionally, the payloader {{{name}}} property must be defined with the first stream starting at 0 (ie 'pay0') and the {{{pt}}} property must be set to a value according to [https://tools.ietf.org/html/rfc3551 RFC3551]. Use {{{gst-inspect-1.0 | grep rtp.*pay}}} to see a full list of available payloaders.
 * you can use {{{playbin}}} (ie {{{gst-launch-1.0 playbin uri=rtsp://<ip>:<port>/<stream-name>}}}) as a simple RTSP client if you do not want to specify the various element details such as jitterbuffer latency.

=== Video only ===
encode and send '''H264 video''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# video test source:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96"
}}}
{{{
#!bash
# or alternatively live captured video
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96"
}}}
  * see [wiki:Yocto/Video_In Yocto/Video_In] for details on Ventana video capture devices (typically the first video capture device is HDMI if available, otherwise analog CVBS)
 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# view with gstreamer rtspsrc
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=10 ! decodebin ! autovideosink
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}

=== Audio only ===
encode and send '''audio only''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# audio test source (tone generator) and AC3 audio encoding
gst-variable-rtsp-server -p 9001 -u \
 "audiotestsrc ! avenc_ac3 ! rtpac3pay name=pay0 pt=97"
}}}
{{{
#!bash
# or audio test source (tone generator) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
 "audiotestsrc ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
{{{
#!bash
# or live audio input (of the first audio capture device) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
 "alsasrc device=hw:0,0 ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
{{{
#!bash
# or live audio input of HDMI audio (specified by card name) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
 "alsasrc device=sysdefault:CARD=tda1997xaudio ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
  * use {{{arecord -L}}} to list available audio input devices by name and {{{arecord -l}}} to list them by number - see [wiki:ventana/audio ventana/audio] for more details
 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# via playbin
gst-launch-1.0 -v playbin uri=rtsp://192.168.1.2:9001/stream
}}}
{{{
#!bash
# or via playbin calling out a specific audio output device (HDMI out in this case)
gst-launch-1.0 playbin uri=rtsp://192.168.1.2:9001/stream audio-sink="alsasink device=sysdefault:CARD=imxhdmisoc"
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}
  * use {{{aplay -L}}} to list available audio output devices by name and {{{aplay -l}}} to list them by number - see [wiki:ventana/audio ventana/audio] for more details

Codec Notes:
 * We have seen issues decoding AC3 with GStreamer on Ventana - you may want to use alaw for compatibility.
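To quickly check what a server is actually producing before debugging a client, {{{gst-discoverer-1.0}}} (part of the gst-plugins-base tools, if included in your image) can inspect the stream:
{{{
#!bash
# print the codec and stream details of a running RTSP server
gst-discoverer-1.0 rtsp://192.168.1.2:9001/stream
}}}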
=== Audio + Video ===
encode and stream '''H264 video and encoded audio''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# test video (colorbars) and test audio (tone) via AC3 encoding:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  audiotestsrc ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or test video (colorbars) and test audio (tone) via alaw G711 encoding:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  audiotestsrc ! alawenc ! rtppcmapay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured HDMI alaw audio and H264 video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  alsasrc device=sysdefault:CARD=tda1997xaudio ! alawenc ! rtppcmapay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured HDMI AC3 audio and H264 video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  alsasrc device=sysdefault:CARD=tda1997xaudio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured CVBS audio and video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video1 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  alsasrc device=sysdefault:CARD=sgtl5000audio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured CVBS audio and video on a GW510x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 \
  alsasrc device=sysdefault:CARD=sgtl5000audio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
  * note that the video and audio chains above are separate branches of the same bin: there is no {{{!}}} between the video payloader and the audio source, as a payloader cannot be linked into a source element
  * use {{{arecord -L}}} to list available audio input devices by name and {{{arecord -l}}} to list them by number - see [wiki:ventana/audio ventana/audio] for more details
 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# via playbin
gst-launch-1.0 -v playbin uri=rtsp://192.168.1.2:9001/stream
}}}
{{{
#!bash
# or via playbin calling out a specific audio output device (HDMI out in this case)
gst-launch-1.0 playbin uri=rtsp://192.168.1.2:9001/stream audio-sink="alsasink device=sysdefault:CARD=imxhdmisoc"
}}}
{{{
#!bash
# or manually specifying sinks
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=10 name=demux \
 demux. ! decodebin ! autovideosink sync=true \
 demux. ! decodebin ! autoaudiosink sync=true
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}
  * use {{{aplay -L}}} to list available audio output devices by name and {{{aplay -l}}} to list them by number - see [wiki:ventana/audio ventana/audio] for more details

Codec Notes:
 * We have seen issues decoding AC3 both with GStreamer on Ventana and with VLC when using audio+video - you may want to use alaw for compatibility.
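If you want to record rather than render, the video portion of an RTSP stream can also be saved to a file without re-encoding. A minimal sketch, assuming the server above is sending H264 as pay0 (the output path is just an example):
{{{
#!bash
# depayload the H264 video and mux it into a Matroska file; -e finalizes the file on Ctrl-C
gst-launch-1.0 -e rtspsrc location=rtsp://192.168.1.2:9001/stream latency=10 ! \
 rtph264depay ! h264parse ! matroskamux ! filesink location=/tmp/capture.mkv
}}}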
=== File based Audio + Video ===
stream '''file based audio+video''':
 1. Start server first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
gst-variable-rtsp-server -p 9001 -u \
 "filesrc location=/mnt/usb/open-media/tears_of_steel_1080p.webm typefind=true do-timestamp=true ! \
  matroskademux name=demux \
  demux. ! queue2 ! rtpvorbispay name=pay0 \
  demux. ! queue2 ! rtpvp8pay name=pay1"
}}}
 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=2000 name=demux \
 demux. ! decodebin ! queue2 ! autovideosink sync=true \
 demux. ! decodebin ! queue2 ! autoaudiosink sync=true
}}}
  * the larger latency can help account for audio/video timestamp discontinuities in the encoded file source

== RTMP Youtube Streaming ==
It is possible to stream video to Youtube Live Streaming from the Gateworks board. Please note this requires the rtmpsink GStreamer plugin, which is available on the Gateworks Trusty Multimedia Ubuntu image (which uses the Gateworks 3.14 kernel and GStreamer).

Below is an example pipeline (which needs to be adjusted with the right Youtube RTMP address). The pipeline will play back a colorbar pattern live on Youtube:
{{{
gst-launch-1.0 videotestsrc do-timestamp=true is-live=true ! \
 "video/x-raw,width=640,height=360,framerate=15/1" ! queue ! \
 autovideoconvert ! imxvpuenc_h264 bitrate=600 idr-interval=4 ! \
 h264parse ! "video/x-h264,level=4.1,profile=main" ! queue ! \
 mux. audiotestsrc is-live=true ! \
 "audio/x-raw, format=S16LE, endianness=1234, signed=true, width=16, depth=16, rate=44100,channels=2" ! \
 queue ! voaacenc bitrate=128000 ! aacparse ! \
 audio/mpeg,mpegversion=4,stream-format=raw ! queue ! \
 flvmux streamable=true name=mux ! queue ! \
 rtmpsink location="rtmp://a.rtmp.youtube.com/live2/gwtest-0665.yvr6-25sv-7gth-dff1 live=true"
}}}

Another example below using a !GoPro camera via HDMI input:
{{{
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! \
 imxvpuenc_h264 bitrate=5000 idr-interval=4 ! h264parse ! \
 "video/x-h264,level=4.1,profile=main" ! queue ! \
 mux. audiotestsrc is-live=true ! \
 "audio/x-raw, format=S16LE, endianness=1234, signed=true, width=16, depth=16, rate=44100,channels=2" ! \
 queue ! voaacenc bitrate=128000 ! aacparse ! \
 audio/mpeg,mpegversion=4,stream-format=raw ! queue ! \
 flvmux streamable=true name=mux ! queue ! \
 rtmpsink location="rtmp://a.rtmp.youtube.com/live2/test.ygg4-24rv-54gh-dtt1 live=true"
}}}
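Before going live, you can sanity-check the encode and mux stages locally by swapping rtmpsink for a filesink. A video-only sketch (the output path is just an example):
{{{
#!bash
# encode 300 test frames to an FLV file and stop; play the file back to verify before streaming
gst-launch-1.0 -e videotestsrc num-buffers=300 ! \
 "video/x-raw,width=640,height=360,framerate=15/1" ! \
 imxvpuenc_h264 bitrate=600 idr-interval=4 ! h264parse ! \
 flvmux streamable=true ! filesink location=/tmp/test.flv
}}}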
[=#abs]
== Adaptive Bitrate Streaming ==
Adaptive bitrate streaming is the concept of adjusting video quality based on the quality of the network connection. This is often seen in online media streaming services such as !YouTube and Netflix, where a lower quality connection will receive SD quality video while a higher quality connection will receive HD. Some common protocols that exist are [https://en.wikipedia.org/wiki/HTTP_Live_Streaming HLS (created by Apple Inc.)], [https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP MPEG DASH], and [https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Microsoft_Smooth_Streaming SmoothStreaming (created by Microsoft)]. Please note that these protocols are not provided on any BSPs by Gateworks.

Gateworks has created a sample application that features our implementation of adaptive bitrate live video streaming for our customers. Please see the section below for more details.

[=#gst-variable-rtsp-server]
=== Gateworks Adaptive Bitrate solution (RTSP) ===
For low latency live video streaming, RTSP might be a good choice. Taking the data found on [wiki:Yocto/gstreamer/latency#LatencySummaryTable our latency page], we see that live streaming with RTSP had a low end-to-end latency of just 98ms when capturing with an analog CVBS camera (this includes latency in the camera itself).

The reason we include this information under the "Adaptive Bitrate" section is that our [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server] has the ability to change bitrate on the fly. Our implementation relies on the number of clients currently connected: the quality of the stream decreases as more users join the stream and increases as users leave. This simple GStreamer application is fully open source, so you may use it as a reference for doing something similar, perhaps utilizing other information to determine stream quality. Please visit the [https://github.com/Gateworks/gst-gateworks-apps GitHub page] to get started. For more detail on this application, please visit our [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server wiki page] on the topic.

=== References ===
 * [https://coaxion.net/blog/2014/05/http-adaptive-streaming-with-gstreamer/ HTTP Adaptive Streaming with GStreamer]
 * [https://developer.mozilla.org/en-US/Apps/Build/Audio_and_video_delivery/Live_streaming_web_audio_and_video Live streaming web audio and video by Mozilla]

== Troubleshooting ==
If you're having issues with network streaming:
 * Verify that both sides can ping one another
 * If the message {{{There may be a timestamping problem, or this computer is too slow}}} appears and the video display appears choppy, try the following:
   * Lower the bitrate from the server
   * Place a {{{sync=false}}} on the sink element of both the server and client
 * If video appears choppy, try using UDP instead of TCP
 * Verify that the network is not congested
 * Verify your GStreamer pipeline is correct. The best way to find the element that causes a negotiation failure is to end your pipeline in a fakesink and one-by-one eliminate elements leading up to it until it negotiates successfully (see the sketch below)
 * When encoding streams from video input devices, you will need an imxipuvideotransform element if using HDMI capture in the yuv422smp mode. It doesn't hurt to add one regardless, as it will be skipped if not needed
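As an example of the fakesink technique (element names here are taken from the video capture examples above; adjust for your own pipeline):
{{{
#!bash
# test only the capture and encode stages; -v prints the negotiated caps at each link
gst-launch-1.0 -v imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 ! fakesink
# if this works, add the next element (ie the payloader) and repeat
}}}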