Changes between Initial Version and Version 1 of Yocto/gstreamer/streaming

Timestamp: 10/22/2017 05:28:45 AM
Author: trac

[[PageOutline]]

[=#streaming]
= Streaming =
GStreamer has elements that allow for network streaming to occur. For example, [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server] is an example application that uses the gstreamer-rtsp-plugin to create an RTSP stream.

However, creating a GStreamer application is not the only way to create a network stream. Simple GStreamer pipelines can accomplish this as well, and are often used for testing purposes. The following examples are based on GStreamer-1.0 using [wiki:Yocto/gstreamer#gstreamer-imx gstreamer-imx] plugins.

There are several ways to accomplish networked streaming over Internet Protocol (IP):
 * [#udp Raw UDP/IP]
 * [#tcp Raw TCP/IP]
 * [#rtp Real-time Transport Protocol (RTP)]
 * [#rtsp Real Time Streaming Protocol (RTSP)] ('''recommended''')
 * [#abs Adaptive Bitrate Streaming]

To see older/deprecated information, please see [http://trac.gateworks.com/wiki/Yocto/gstreamer/streaming?version=1 this older revision page].


[=#udp]
== Raw UDP ==
Using UDP/IP is the simplest mechanism for streaming and utilizes the least amount of bandwidth. Because UDP does not provide any error detection, packet ordering, or error correction, the bitrate is deterministic and is simply the bitrate of the media you are streaming.

The limitations of raw UDP are:
 * requires a codec that can handle missing/corrupt data (most do these days)
 * does not use headers containing payload type or timestamp info on the stream (making it suitable for only a single type of media, or a pre-muxed type of media)
 * does not fragment packets - will try to send a raw UDP packet for whatever size buffer the udpsink is passed (which can lead to pipeline errors). To fragment packets use RTP

The only benefit of using raw UDP is that it is the simplest pipeline you can create for streaming and requires the least amount of dependencies (albeit you might run into one or all of the above problems).

'''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

The {{{udpsrc}}} element can be used to render/save a stream originated from a {{{udpsink}}} pipeline.

Examples:
 * encode and send H264 video from Ventana:
  1. Start decoder first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 udpsrc port=9001 ! h264parse ! imxvpudec ! imxipuvideosink sync=false
}}}
  2. Start encoder second:
{{{
#!bash
ifconfig eth0 192.168.1.2
gst-launch-1.0 videotestsrc is-live=true ! imxvpuenc_h264 bitrate=1000 ! udpsink host=192.168.1.1 port=9001
}}}

Notes:
 * On the client (stream receiver and renderer) you must use the {{{sync=false}}} property to render frames as they are received, otherwise the stream will stall because there are no headers containing timestamps
 * the decoder (udpsrc) needs to be started first because udpsink will fail if nothing is listening on the socket
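
The fragmentation limitation above can be illustrated with quick arithmetic: udpsink emits one datagram per buffer, so any encoded buffer larger than the path MTU minus the IP/UDP header overhead cannot be sent as a single packet. The 1500-byte Ethernet MTU and 28-byte header figure below are typical assumptions, not measured values:

```shell
#!/bin/bash
# Rough check of whether an encoded buffer fits in one UDP datagram.
# Assumes a typical 1500-byte Ethernet MTU with 20-byte IP + 8-byte UDP headers.
mtu=1500
max_payload=$((mtu - 20 - 8))   # 1472 bytes of UDP payload

fits_in_one_datagram() {
    local buffer_bytes=$1
    if [ "$buffer_bytes" -le "$max_payload" ]; then
        echo "fits"
    else
        echo "needs fragmentation"
    fi
}

fits_in_one_datagram 1200    # a small encoded frame
fits_in_one_datagram 60000   # a large I-frame, e.g. from a 1080p H264 stream
```

A large I-frame handed to udpsink as one buffer is exactly the case that "can lead to pipeline errors" above; an RTP payloader splits such buffers into MTU-sized packets for you.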


[=#tcp]
== TCP ==
Using TCP/IP brings error detection, packet re-ordering, and error correction to the network stream. However, this causes the bitrate to be non-deterministic: as the error rate increases, so do the bitrate and latency.

The limitations of using TCP:
 * non-deterministic bitrate
 * added latency
 * does not use headers containing payload type or timestamp info on the stream (making it suitable for only a single type of media, or a pre-muxed type of media)

'''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

TCP/IP introduces the concept of a socket connection, so there must be a server and a client, and the server must be started first to listen for a connection. You can use a server sink or a server source: the {{{tcpserversrc}}} source can be used to create a TCP server that waits for a connection from a {{{tcpclientsink}}} to render/save. Alternatively, the {{{tcpserversink}}} sink can be used to create a TCP server that waits for a connection from a {{{tcpclientsrc}}} that will send data.

Examples:
 * encode and send H264 video from Ventana with '''decoder as server''':
  1. Start decoder (server) first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 tcpserversrc host=192.168.1.1 port=9001 ! decodebin ! autovideosink sync=false
}}}
  2. Start encoder (client) second:
{{{
#!bash
ifconfig eth0 192.168.1.2
gst-launch-1.0 videotestsrc is-live=true ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! tcpclientsink host=192.168.1.1 port=9001
}}}

Notes:
 * TCP is connection oriented, therefore the TCP 'server' must be started first. You can choose your elements such that the stream originator is the server or the stream renderer is the server; however, doing so can be problematic for certain codecs because the client decoding the stream may pick up the stream somewhere in the middle and not know how to parse it.


[=#rtp]
== RTP (raw/session-less) ==
The [https://en.wikipedia.org/wiki/Real-time_Transport_Protocol Real-time Transport Protocol (RTP)] is a network protocol for delivering audio and video over IP networks. RTP is used extensively in communication and entertainment systems that involve streaming media, such as telephony, video teleconference applications, television services, and web-based push-to-talk features.

The RTP packet type encapsulates multimedia data with a payload type and timestamp and therefore can be used to compensate for jitter, out-of-sequence packets, and time synchronization between streams of different types (i.e. audio/video lip-sync).

RTP is typically used in conjunction with other protocols such as the RTP Control Protocol (RTCP) and the [#rtsp Real Time Streaming Protocol (RTSP)] to manage stream sessions; however, it can be used on its own in a raw session-less fashion using {{{udpsink}}} and {{{udpsrc}}} elements.

The limitations of using raw/session-less RTP:
 * session management needs to be handled manually (a capsfilter is needed to specify the stream format)

'''Note that it is recommended that you use [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above'''

Examples:
 * encode and send H264 video from Ventana:
  1. Start decoder first:
{{{
#!bash
ifconfig eth0 192.168.1.1
gst-launch-1.0 udpsrc port=9001 \
 caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \
 ! decodebin ! autovideosink
}}}
  2. Start encoder second:
{{{
#!bash
gst-launch-1.0 videotestsrc is-live=true \
 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay ! udpsink host=192.168.1.1 port=9001
}}}

Notes:
 * when using RTP a capsfilter must be used to specify the payload as application/x-rtp as above. You can determine the capsfilter required by starting the encoder with the verbose flag {{{-v}}} and looking for {{{caps = "application/x-rtp"}}}
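
As an illustration of the note above, the caps string can be pulled out of the encoder's verbose output with standard shell tools. The sample line below mimics the format {{{gst-launch-1.0 -v}}} prints for an rtph264pay src pad (the exact caps will differ on your hardware, so the grep/sed pattern is the part to reuse):

```shell
#!/bin/bash
# Sample line in the style of 'gst-launch-1.0 -v ... rtph264pay ! udpsink ...'
# verbose output (illustrative only - run the real pipeline to get your caps):
sample_output='/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = "application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, payload=(int)96"'

# Extract just the caps value to paste into the receiver's udpsrc caps= property
echo "$sample_output" | grep 'application/x-rtp' | sed 's/.*caps = //; s/"//g'
```

The resulting string is what goes into the {{{caps=}}} property of the decoder's {{{udpsrc}}}.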


[=#rtsp]
== RTSP (Real Time Streaming Protocol) '''(recommended)''' ==
The [https://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol Real Time Streaming Protocol (RTSP)] is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. The protocol is used for establishing and controlling media sessions between end points. Clients of media servers issue VCR-style commands, such as play and pause, to facilitate real-time control of playback of media files from the server. This protocol uses the Real-time Transport Protocol (RTP) in conjunction with the Real-time Control Protocol (RTCP) for media stream delivery.

The limitations of using RTSP are:
 * gst-launch has no way of using a simple pipeline to create an RTSP server - you must create or use an existing GStreamer-based application (keep reading below).

Creating an RTSP server is not possible via a simple pipeline to gst-launch; however, GStreamer libraries do exist to make writing an RTSP server trivial. The source for gst-rtsp-server contains an example application, [http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-launch.c test-launch.c], which provides a simple example that can take a GStreamer 'bin' element consisting of everything but the sink element and serve it via RTSP.

An extension of the gst-rtsp-server test-launch application, [https://github.com/Gateworks/gst-gateworks-apps/blob/master/src/gst-variable-rtsp-server.c gst-variable-rtsp-server], is included on our [wiki:Yocto Yocto BSP] images. It will set up an RTSP server, encode the video stream to H264, and allow multiple clients to connect to it. The enhancements made to gst-variable-rtsp-server include a mechanism for auto-adjusting the encoding bitrate depending on the number of clients connected, in addition to serving as a fairly simple example of how to write a GStreamer application.

Notes:
 * refer [wiki:Yocto/gstreamer#gst-variable-rtsp-server here] for more info on gst-variable-rtsp-server.
 * refer to the [wiki:Yocto/gstreamer/video gstreamer/video] and [wiki:Yocto/gstreamer/audio gstreamer/audio] pages to understand how to first capture [wiki:Yocto/gstreamer/video video] and [wiki:Yocto/gstreamer/audio audio] sources.
 * refer to [wiki:ventana/audio ventana/audio] and [wiki:Yocto/Video_In Yocto/Video_In] for more info on Ventana audio input and video input devices.
 * RTP streams must use a payloader element appropriate for the media type of that stream. Additionally, the payloader {{{name}}} property must be defined with the first stream starting at 0 (i.e. 'pay0') and the {{{pt}}} property must be set to a value according to [https://tools.ietf.org/html/rfc3551 RFC3551]. Use {{{gst-inspect-1.0 | grep rtp.*pay}}} to see a full list of available payloaders.
 * you can use {{{playbin}}} (i.e. {{{gst-launch-1.0 playbin uri=rtsp://<server>:<port>/<stream>}}}) as an RTSP client if you do not want to specify the various element details such as jitterbuffer latency.
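
The pay0/pay1 naming convention above can be sketched as a small helper that assembles a server launch string from a list of per-stream pipelines, numbering each payloader sequentially and assigning dynamic payload types starting at 96 (the dynamic range in RFC3551 is 96-127). The helper itself is illustrative and is not part of gst-variable-rtsp-server:

```shell
#!/bin/bash
# Build a launch string for an RTSP server, numbering each payloader
# pay0, pay1, ... and assigning dynamic payload types starting at 96.
build_launch() {
    local launch="" i=0 pt=96 stream
    for stream in "$@"; do
        if [ -n "$launch" ]; then launch="$launch ! "; fi
        launch="${launch}${stream} name=pay${i} pt=${pt}"
        i=$((i + 1)); pt=$((pt + 1))
    done
    echo "$launch"
}

# One video and one audio stream, each ending in its payloader element:
build_launch \
    "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay" \
    "audiotestsrc ! alawenc ! rtppcmapay"
```

The output matches the quoting pattern used in the {{{gst-variable-rtsp-server -u "..."}}} examples in the sections below.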


=== Video only ===
encode and send '''H264 video''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# video test source:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96"
}}}
{{{
#!bash
# or alternatively live captured video
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96"
}}}
  * see [wiki:Yocto/Video_In Yocto/Video_In] for details on Ventana video capture devices (typically the first video capture device is HDMI if available, and otherwise analog CVBS)

 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# view with gstreamer rtspsrc
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=10 ! decodebin ! autovideosink
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}


=== Audio only ===
encode and send '''Audio only''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# audio test source (tone generator) and AC3 audio encoding
gst-variable-rtsp-server -p 9001 -u \
  "audiotestsrc ! avenc_ac3 ! rtpac3pay name=pay0 pt=97"
}}}
{{{
#!bash
# or audio test source (tone generator) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
  "audiotestsrc ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
{{{
#!bash
# or live audio input (of the first audio capture device) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
  "alsasrc device=hw:0,0 ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
{{{
#!bash
# or live audio input of HDMI audio (specified by card name) and alaw G711 audio encoding
gst-variable-rtsp-server -p 9001 -u \
  "alsasrc device=sysdefault:CARD=tda1997xaudio ! alawenc ! rtppcmapay name=pay0 pt=97"
}}}
  * use {{{arecord -L}}} to list available audio input devices by name and {{{arecord -l}}} to list by number - see [wiki:ventana/audio ventana/audio] for more details

 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# via playbin
gst-launch-1.0 -v playbin uri=rtsp://192.168.1.2:9001/stream
}}}
{{{
#!bash
# or via playbin calling out a specific audio output device (HDMI out in this case)
gst-launch-1.0 playbin uri=rtsp://192.168.1.2:9001/stream audio-sink="alsasink device=sysdefault:CARD=imxhdmisoc"
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}
  * use {{{aplay -L}}} to list available audio output devices by name and {{{aplay -l}}} to list by number - see [wiki:ventana/audio ventana/audio] for more details

Codec Notes:
 * We have seen issues decoding AC3 with GStreamer on Ventana - you may want to use alaw for compatibility.


=== Audio + Video ===
encode and stream '''H264 video and encoded audio''' from Ventana:
 1. Start server (encoder) first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
# test video (colorbars) and test audio (tone) via AC3 encoding:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  audiotestsrc ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or test video (colorbars) and test audio (tone) via alaw G711 encoding:
gst-variable-rtsp-server -p 9001 -u \
 "videotestsrc ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  audiotestsrc ! alawenc ! rtppcmapay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured HDMI alaw audio and H264 video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  alsasrc device=sysdefault:CARD=tda1997xaudio ! alawenc ! rtppcmapay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured HDMI AC3 audio and H264 video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  alsasrc device=sysdefault:CARD=tda1997xaudio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured CVBS audio and video on a GW540x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video1 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  alsasrc device=sysdefault:CARD=sgtl5000audio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
{{{
#!bash
# or live captured CVBS audio and video on a GW510x:
gst-variable-rtsp-server -p 9001 -u \
 "imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000 ! rtph264pay name=pay0 pt=96 ! \
  alsasrc device=sysdefault:CARD=sgtl5000audio ! audioconvert ! avenc_ac3 ! rtpac3pay name=pay1 pt=97"
}}}
  * use {{{arecord -L}}} to list available audio input devices by name and {{{arecord -l}}} to list by number - see [wiki:ventana/audio ventana/audio] for more details

 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
# via playbin
gst-launch-1.0 -v playbin uri=rtsp://192.168.1.2:9001/stream
}}}
{{{
#!bash
# or via playbin calling out a specific audio output device (HDMI out in this case)
gst-launch-1.0 playbin uri=rtsp://192.168.1.2:9001/stream audio-sink="alsasink device=sysdefault:CARD=imxhdmisoc"
}}}
{{{
#!bash
# or manually specifying sinks
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=10 name=demux \
 demux. ! decodebin ! autovideosink sync=true \
 demux. ! decodebin ! autoaudiosink sync=true
}}}
{{{
#!bash
# or with vlc
vlc rtsp://192.168.1.2:9001/stream --rtsp-caching=10
}}}
  * use {{{aplay -L}}} to list available audio output devices by name and {{{aplay -l}}} to list by number - see [wiki:ventana/audio ventana/audio] for more details

Codec Notes:
 * We have seen issues decoding AC3 both with GStreamer on Ventana and with VLC when using audio+video - you may want to use alaw for compatibility.

=== File based Audio + Video ===
stream '''file based audio+video''':
 1. Start server first:
{{{
#!bash
ifconfig eth0 192.168.1.2
}}}
{{{
#!bash
gst-variable-rtsp-server -p 9001 -u \
 "filesrc location=/mnt/usb/open-media/tears_of_steel_1080p.webm typefind=true do-timestamp=true ! \
  matroskademux name=demux \
  demux. ! queue2 ! rtpvorbispay name=pay0 \
  demux. ! queue2 ! rtpvp8pay name=pay1"
}}}

 2. Connect decoder client(s) second:
{{{
#!bash
ifconfig eth0 192.168.1.1
}}}
{{{
#!bash
gst-launch-1.0 rtspsrc location=rtsp://192.168.1.2:9001/stream latency=2000 name=demux \
 demux. ! decodebin ! queue2 ! autovideosink sync=true \
 demux. ! decodebin ! queue2 ! autoaudiosink sync=true
}}}
  * the larger latency can help account for audio/video timestamp discontinuities in the encoded file source

[=#abs]
== Adaptive Bitrate Streaming ==
Adaptive bitrate streaming is the concept of a video stream lowering its image quality based on its network quality. This is often seen in online media streaming from services such as !YouTube and Netflix, where a lower quality connection will receive SD quality video while a higher quality connection will receive HD.

Some common protocols that exist are: [https://en.wikipedia.org/wiki/HTTP_Live_Streaming HLS (created by Apple Inc.)], [https://en.wikipedia.org/wiki/Dynamic_Adaptive_Streaming_over_HTTP MPEG-DASH], and [https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Microsoft_Smooth_Streaming Smooth Streaming (created by Microsoft)]. Please note that these protocols are not provided on any BSPs by Gateworks.

Gateworks has created a sample application that features our implementation of adaptive bitrate live video streaming for our customers. Please see the section below for more details.

[=#gst-variable-rtsp-server]
=== Gateworks Adaptive Bitrate solution (RTSP) ===
For low latency live video streaming, RTSP might be a good choice. Taking the data found on [wiki:Yocto/gstreamer/latency#LatencySummaryTable our Latency] page, we see that live streaming with RTSP had a low end-to-end latency of just 98ms when capturing with an analog CVBS camera (this includes latency in the camera itself).

The reason we include this information under the "Adaptive Bitrate" section is that our [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server] has the ability to change bitrate on the fly. Our implementation relies on the number of clients currently connected: the quality of the stream will decrease as more users join the stream and increase with fewer users. This simple GStreamer application is fully open-sourced so you may reference how to do something similar, perhaps utilizing other information to determine stream quality. Please visit the [https://github.com/ GitHub] page [https://github.com/Gateworks/gst-gateworks-apps here] to get started.
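
As a sketch of the client-count idea (the scaling policy, step size, and limits below are illustrative assumptions, not the actual gst-variable-rtsp-server algorithm), a bitrate controller driven by connected-client count can be as simple as:

```shell
#!/bin/bash
# Illustrative client-count-based bitrate policy: start at a maximum bitrate
# and step down as more clients connect, never dropping below a floor.
# All three values here are made-up example numbers.
max_kbps=2000
min_kbps=500
step_kbps=500

bitrate_for_clients() {
    local clients=$1
    local kbps=$((max_kbps - (clients - 1) * step_kbps))
    if [ "$kbps" -lt "$min_kbps" ]; then kbps=$min_kbps; fi
    echo "$kbps"
}

bitrate_for_clients 1    # a single viewer gets full quality
bitrate_for_clients 3    # more viewers, lower bitrate
bitrate_for_clients 10   # clamped at the floor
```

In the real application the recomputed value would be applied to the encoder (e.g. the imxvpuenc_h264 {{{bitrate}}} property) as clients connect and disconnect; see the gst-variable-rtsp-server source for the actual implementation.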

For more detail on this application, please visit our [wiki:Yocto/gstreamer#gst-variable-rtsp-server gst-variable-rtsp-server wiki page] on the topic.

=== References ===
* [https://coaxion.net/blog/2014/05/http-adaptive-streaming-with-gstreamer/ HTTP Adaptive Streaming with GStreamer]
* [https://developer.mozilla.org/en-US/Apps/Build/Audio_and_video_delivery/Live_streaming_web_audio_and_video Live streaming web audio and video by Mozilla]

== Troubleshooting ==
If you're having issues with network streaming:
* Verify that both sides can ping one another.
* If the message {{{There may be a timestamping problem, or this computer is too slow}}} appears and the video display appears choppy, try the following:
 * Lower the bitrate from the server.
 * Place a {{{sync=false}}} on the sink element of the server and client.
* If video appears choppy, try using UDP instead of TCP.
* Verify that the network is not congested.
* Verify your GStreamer pipeline is correct. The best way to find the element that causes a negotiation failure is to end your pipeline in a fakesink and one-by-one eliminate elements leading up to it until it negotiates successfully.
* When encoding streams from video input devices, you will need an imxipuvideotransform if using HDMI capture in the yuv422smp mode. It doesn't hurt to add one regardless, as it will be skipped if not needed.
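
The fakesink technique above can be semi-automated by generating each truncated variant of a pipeline terminated in fakesink, then trying them by hand from shortest to longest until one fails to negotiate. The helper below only builds the candidate command lines (it does not run them), so the pipeline string passed to it is just sample text:

```shell
#!/bin/bash
# Print each truncated variant of a pipeline, terminated in fakesink,
# to try one at a time when hunting a negotiation failure.
fakesink_variants() {
    local pipeline=$1 prefix="" element
    local IFS='!'   # split the pipeline description on '!'
    for element in $pipeline; do
        # trim surrounding whitespace from the element description
        element=$(echo "$element" | sed 's/^ *//; s/ *$//')
        if [ -n "$prefix" ]; then prefix="$prefix ! "; fi
        prefix="${prefix}${element}"
        echo "gst-launch-1.0 $prefix ! fakesink"
    done
}

fakesink_variants "videotestsrc ! imxipuvideotransform ! imxvpuenc_h264 bitrate=1000"
```

The first variant that errors out points at the element (or the link into it) that is failing to negotiate.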