[=#audio-and-video]
= Video and Audio =
Playback of a file that has both audio and video requires a slightly more complex pipeline than the standard [wiki:Yocto/gstreamer/audio audio] and [wiki:Yocto/gstreamer/video video] pipelines.

Generally, a mixed media pipeline will consist of a demuxer (to split audio and video), an individual pipeline per video stream and audio stream, and {{{queue}}} elements to provide asynchronous playback of each stream type (each queue adds a thread boundary so that one element does not block the rest of the pipeline while waiting for more data).

The examples on this page refer to GStreamer-1.0. To see GStreamer-0.10 (deprecated) examples, please see this [http://trac.gateworks.com/wiki/Yocto/gstreamer/multimedia?version=1 older revision page].


[=#named-elements]
== Named Elements, Queues, and Multiple Pipelines with gst-launch ==
When mixing audio and video elements with {{{gst-launch}}} you must make use of multiple pipelines connected via named elements. (When developing a GStreamer application you would instead create 'Bin' elements that group multiple elements together.)

The {{{name}}} property can be set on any element in a pipeline; if it is not specified, GStreamer assigns an automatically generated name. Multiple pipelines can be provided to {{{gst-launch}}} and connected together by their names, either by sourcing a pipeline from a name followed by a '.' or by sinking a pipeline to a name followed by a '.'.

This is best explained with some examples:
 * Encoding a stream with audio and video content into an AVI file (the {{{$VIDEO_CAPABILITIES}}} and {{{$AUDIO_CAPABILITIES}}} placeholders are sketched after this list):
{{{#!bash lineno=1
gst-launch-1.0 \
  v4l2src \
    ! $VIDEO_CAPABILITIES \
    ! mux. \
  alsasrc \
    ! $AUDIO_CAPABILITIES \
    ! mux. \
  avimux name=mux \
    ! filesink location=test.avi
}}}
  - The {{{v4l2src}}} pipeline ends with {{{mux.}}}, which means its output is sent to the pipeline whose name is {{{mux}}}
  - The {{{alsasrc}}} pipeline ends with {{{mux.}}}, which means its output is sent to the pipeline whose name is {{{mux}}}
  - The {{{avimux}}} pipeline specifies {{{name=mux}}}, therefore it takes as its input all pipelines that ended with {{{mux.}}}
 * Decoding a stream with audio and video content from an AVI file:
{{{#!bash lineno=1
gst-launch-1.0 \
  filesrc location=test.avi \
    ! avidemux name=demux \
  demux. ! queue ! ac3parse ! a52dec ! audioconvert ! alsasink \
  demux. ! queue ! mpeg4videoparse ! imxvpudec ! imxipuvideosink
}}}
  - The {{{filesrc}}} pipeline ends with {{{name=demux}}}, which means the output of this pipeline will be sent to all pipelines with a {{{demux.}}} source (whose types have been successfully negotiated)
  - The audio pipeline consisting of the ac3parse element will source buffers that are supported by its sink capabilities (i.e. audio/x-ac3, audio/x-eac3, audio/ac3)
  - The video pipeline consisting of the mpeg4videoparse element will source buffers that are supported by its sink capabilities (i.e. video/mpeg, video/x-divx)
  - Queue elements are used to keep one pipeline or element from blocking another. For example, if the mpeg4videoparse element needs more data from the avidemux element before it can decode a frame, it would stall the pipeline unless a queue element was in place to allow buffering
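
The {{{$VIDEO_CAPABILITIES}}} and {{{$AUDIO_CAPABILITIES}}} placeholders in the encoding example above stand in for capability (caps) strings that must match what your capture hardware can actually produce. As a purely illustrative sketch (the format, resolution, and rate below are assumptions, not values required by any particular board), they could be defined in the shell before running {{{gst-launch-1.0}}}:
{{{#!bash
# hypothetical caps - adjust to what v4l2src and alsasrc report on your hardware
VIDEO_CAPABILITIES="video/x-raw,format=I420,width=640,height=480,framerate=30/1"
AUDIO_CAPABILITIES="audio/x-raw,format=S16LE,rate=48000,channels=2"
}}}
Because the caps strings contain no spaces, the shell expands the variables directly inside the pipeline description.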


[=#mux]
== Muxing Mixed Content ==
Often a multi-media stream will consist of mixed audio and video streams that are multiplexed (aka 'muxed') together into a single bitstream. The GStreamer elements that perform this combining or multiplexing on the stream creation side are called 'Muxers'.

You can use {{{gst-inspect}}} to see a list of most of these using grep:
{{{
gst-inspect-1.0 | grep -i muxer | grep -vi demux
}}}

Some common examples:
- mpegtsmux: MPEG Transport Stream Muxer
- mpegpsmux: MPEG Program Stream Muxer
- matroskamux: Matroska muxer
- avimux: Avi muxer
- qtmux: !QuickTime Muxer
- oggmux: Ogg muxer

To mux mixed content together, include one of these elements following the individual audio and video pipelines.

Examples:
 * Encoding a stream with audio and video content into an AVI file:
{{{#!bash lineno=1
gst-launch-1.0 \
  videotestsrc \
    ! $VIDEO_CAPABILITIES \
    ! mux. \
  audiotestsrc \
    ! $AUDIO_CAPABILITIES \
    ! mux. \
  avimux name=mux \
    ! filesink location=test.avi
}}}
  - The {{{videotestsrc}}} pipeline ends with {{{mux.}}}, which means its output is sent to the pipeline whose name is {{{mux}}} and whose format has been successfully negotiated.
  - The {{{audiotestsrc}}} pipeline ends with {{{mux.}}}, which means its output is sent to the pipeline whose name is {{{mux}}} and whose format has been successfully negotiated.
  - The {{{avimux}}} pipeline specifies {{{name=mux}}}, therefore it takes as its input all pipelines that ended with {{{mux.}}} and it understands how to multiplex the two types of data together into its output, which is written to the file test.avi
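
To quickly sanity-check the resulting {{{test.avi}}}, you can hand it to the {{{gst-play-1.0}}} helper described in the [#gst-play gst-play section below] (assuming it is installed on your image):
{{{#!bash
gst-play-1.0 test.avi
}}}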


=== Example: Capture Video and Audio Through HDMI Port ===

 * Note the -e switch is needed so the stream is properly finalized (EOS is sent on Ctrl-C), allowing the resulting file to be played back in a video player like VLC
{{{#!bash lineno=1
gst-launch-1.0 -e imxv4l2videosrc device=/dev/video0 queue-size=6 do-timestamp=true \
! queue flush-on-eos=true silent=true leaky=0 max-size-buffers=0 max-size-time=0 max-size-bytes=0 \
! imxipuvideotransform \
! imxvpuenc_h264 bitrate=10000 \
! h264parse disable_passthrough=true \
! queue flush-on-eos=true silent=true leaky=0 max-size-buffers=0 max-size-time=0 max-size-bytes=0 \
! mux. alsasrc device="sysdefault:CARD=tda1997xaudio" \
! audioconvert \
! imxmp3audioenc bitrate=128 \
! mux. mp4mux name=mux \
! filesink location=audiovideo.mp4 sync=true
}}}
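
To play the captured file back on the target you can use a pipeline along the lines of the Quicktime example [#ex2 below]; this is only a sketch, assuming the H.264 video and MP3 audio produced by the capture pipeline above:
{{{#!bash
gst-launch-1.0 \
  filesrc location=audiovideo.mp4 \
    ! qtdemux name=demux \
  demux. ! queue ! h264parse ! imxvpudec ! imxipuvideosink \
  demux. ! queue ! mpegaudioparse ! mad ! audioconvert ! alsasink
}}}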

=== Example: Capture MPEG4 Video and MP3 Audio Muxed Together in an AVI File ===
To capture, encode, and output audio and video using MPEG4 video compression, MP3 audio compression, and an AVI file format you could use:
{{{#!bash lineno=1
gst-launch-1.0 \
  imxv4l2videosrc device=/dev/video0 \
    ! imxvpuenc_mpeg4 bitrate=10000 \
    ! mux. \
  alsasrc device="sysdefault:CARD=sgtl5000audio" \
    ! audioconvert ! imxmp3audioenc bitrate=128 \
    ! mux. \
  avimux name=mux \
    ! filesink location=test.avi
}}}
  - The {{{imxv4l2videosrc}}} pipeline ends with {{{mux.}}}, which means its output (video/mpeg) is sent to the pipeline whose name is {{{mux}}}
  - The {{{alsasrc}}} pipeline ends with {{{mux.}}}, which means its output (audio/mpeg) is sent to the pipeline whose name is {{{mux}}}
  - The {{{avimux}}} pipeline specifies {{{name=mux}}}, therefore it takes as its input all pipelines that ended with {{{mux.}}} and it understands how to multiplex the two types of data together into its output, which is written to the file test.avi.
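
Changing the container is simply a matter of swapping in a different muxer from the list above. For example, a sketch of the same capture written to a Matroska file instead (assuming {{{matroskamux}}} is available on your image):
{{{#!bash
gst-launch-1.0 \
  imxv4l2videosrc device=/dev/video0 \
    ! imxvpuenc_mpeg4 bitrate=10000 \
    ! mux. \
  alsasrc device="sysdefault:CARD=sgtl5000audio" \
    ! audioconvert ! imxmp3audioenc bitrate=128 \
    ! mux. \
  matroskamux name=mux \
    ! filesink location=test.mkv
}}}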



[=#demux]
== De-muxing Mixed Content ==
Often a multi-media stream will consist of mixed audio and video streams that are multiplexed (aka 'muxed') together into a single bitstream. The GStreamer elements that perform the de-multiplexing on the stream consumption side are called 'De-Muxers'.

You can use {{{gst-inspect}}} to see a list of most of these using grep:
{{{
gst-inspect-1.0 | grep -i demuxer
}}}

Some common examples:
- tsparse: MPEG transport stream parser
- tsdemux: MPEG transport stream demuxer
- matroskademux: Matroska demuxer
- avidemux: Avi demuxer
- qtdemux: !QuickTime demuxer
- oggdemux: Ogg demuxer

To de-mux mixed content, include one of these elements between the source element and the individual audio and video decode pipelines. Note that, unlike muxing, you typically also need a {{{parser}}} element to parse the bitstream and break it into discrete buffers that the downstream decoder can understand.
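
As with muxers and de-muxers, you can list the parsers available on your image with {{{gst-inspect}}} and grep:
{{{
gst-inspect-1.0 | grep -i parser
}}}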

Some common parsers:
- ogmaudioparse: OGM audio stream parser
- ogmvideoparse: OGM video stream parser
- aacparse: AAC audio stream parser
- amrparse: AMR audio stream parser
- ac3parse: AC3 audio stream parser
- flacparse: FLAC audio parser
- mpegaudioparse: MPEG1 Audio Parser
- h263parse: H.263 parser
- h264parse: H.264 parser
- mpegvideoparse: MPEG video elementary stream parser
- mpeg4videoparse: MPEG 4 video elementary stream parser
- pngparse: PNG parser
- vc1parse: VC1 parser

Examples:
 * Decoding a stream with audio and video content from an AVI file:
{{{#!bash lineno=1
gst-launch-1.0 \
  filesrc location=test.avi \
    ! avidemux name=demux \
  demux. ! queue ! ac3parse ! a52dec ! audioconvert ! alsasink \
  demux. ! queue ! mpeg4videoparse ! imxvpudec ! imxipuvideosink
}}}
  - The {{{filesrc}}} pipeline ends with {{{name=demux}}}, which means the output of this pipeline will be sent to all pipelines with a {{{demux.}}} source (whose types have been successfully negotiated)
  - The audio pipeline consisting of the ac3parse element will source buffers that are supported by its sink capabilities (i.e. audio/x-ac3, audio/x-eac3, audio/ac3)
  - The video pipeline consisting of the mpeg4videoparse element will source buffers that are supported by its sink capabilities (i.e. video/mpeg, video/x-divx)
  - Queue elements are used to keep one pipeline or element from blocking another. For example, if the mpeg4videoparse element needs more data from the avidemux element before it can decode a frame, it would stall the pipeline unless a queue element was in place to allow buffering
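
If a branch still stalls (for example with very large frames), the queue limits can be relaxed so the queue never blocks on size, as is done in the HDMI capture example above. A sketch using the same AVI decode pipeline:
{{{#!bash
gst-launch-1.0 \
  filesrc location=test.avi \
    ! avidemux name=demux \
  demux. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! ac3parse ! a52dec ! audioconvert ! alsasink \
  demux. ! queue max-size-buffers=0 max-size-time=0 max-size-bytes=0 ! mpeg4videoparse ! imxvpudec ! imxipuvideosink
}}}
Setting all three {{{max-size-*}}} properties to 0 removes the size limits on the queue, at the cost of potentially unbounded memory use.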


[=#ex1]
=== Example: Playback Matroska File with VP8 Video and Vorbis Audio ===
An example file consisting of a Matroska (WebM) container with VP8 encoded video and Vorbis encoded audio can be found at the [https://mango.blender.org/download/ Tears of Steel] download site. Tears of Steel is a relatively popular video (Creative Commons License) that is available in several encodings.

Video file downloadable [http://media.xiph.org/mango/tears_of_steel_1080p.webm here]

For this file we can use the {{{filesrc}}} source element, the {{{matroskademux}}} demuxer, the {{{ivorbisdec}}} Vorbis audio decoder, and the {{{imxg2dvideosink}}} sink element:
{{{#!bash
gst-launch-1.0 \
  filesrc location=/media/open-media/tears_of_steel_1080p.webm do-timestamp=true typefind=true ! \
    matroskademux name=d \
  d. ! queue ! ivorbisdec  ! queue ! alsasink device=hw:1,0 \
  d. ! queue ! imxvpudec   ! queue ! imxg2dvideosink framebuffer=/dev/fb0
}}}


[=#ex2]
=== Example: Playback Quicktime File with H.264/AVC Video and AAC Audio ===
An example file consisting of a Quicktime file container that includes H.264/AVC encoded video and AAC encoded audio can be found at the [https://durian.blender.org/download/ Sintel] download site. Sintel is a relatively popular video (Creative Commons License) that is available in several encodings.

Video file downloadable [https://download.blender.org/durian/trailer/sintel_trailer-1080p.mp4 here]

For this file we can use the {{{filesrc}}} source element, the {{{qtdemux}}} demuxer, the {{{h264parse}}} parser element along with the {{{imxvpudec}}} decoder element, and the {{{avdec_aac}}} audio decoder element:
{{{#!bash
gst-launch-1.0 filesrc location=/home/root/sintel_trailer-1080p.mp4 ! \
  qtdemux name=d \
  d. ! queue ! h264parse ! imxvpudec ! imxg2dvideosink \
  d. ! queue ! avdec_aac ! audioconvert ! alsasink
}}}


=== Example: Playback MPEG-TS File ===

Video only:
{{{
gst-launch-1.0 \
  filesrc location=/tmp2/T2C00201_1080p60_Crop.ts ! \
  tsdemux ! mpegvideoparse ! imxvpudec ! imxipuvideosink sync=false async=false
}}}

Video + Audio:
{{{
gst-launch-1.0 \
  filesrc location=/tmp2/T2C00201_1080p60_Crop.ts \
    ! tsdemux name=demux \
  demux. ! queue ! mpegaudioparse ! queue ! mad ! audioconvert ! queue ! alsasink \
  demux. ! queue ! mpegvideoparse ! queue ! imxvpudec ! queue ! imxg2dvideosink sync=false async=false
}}}


[=#bin]
== Bin elements ==
A '''Bin''' element refers to a group of elements strung together and referenced as a single element. GStreamer also provides stand-alone bin elements that use this concept to automatically select and link sub-elements by inspecting the stream; some of these are described below.

[=#playbin]
=== GStreamer {{{playbin}}} ===
The GStreamer {{{playbin}}} element attempts to create a pipeline that will play both the audio and video portions of a file. For example:
{{{
#!bash
gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi
}}}

The above pipeline will attempt to output to the first video and audio devices found. However, you can specify the sinks explicitly, for example:
{{{
#!bash
gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi audio-sink="alsasink device=hw:1,0"
}}}
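
The video sink can be overridden in the same way; for example (a sketch, using the i.MX sinks shown elsewhere on this page):
{{{
#!bash
gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi \
  video-sink=imxipuvideosink audio-sink="alsasink device=hw:1,0"
}}}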

Run {{{gst-inspect-1.0 playbin}}} to see more options.

[=#decodebin]
=== GStreamer {{{decodebin}}} ===
The GStreamer {{{decodebin}}} element is very useful if you are unsure of which decoder to use on a stream. For example, we can replace the video decoder in [#ex1 the first playback example] with the following:
{{{
#!bash
gst-launch-1.0 \
  filesrc location=/media/open-media/tears_of_steel_1080p.webm do-timestamp=true typefind=true ! \
  matroskademux name=d \
  d. ! queue ! ivorbisdec ! queue ! alsasink device=hw:1,0 \
  d. ! queue ! decodebin  ! queue ! imxg2dvideosink framebuffer=/dev/fb0
}}}

Note that {{{decodebin}}} doesn't always choose the decoder you want (it may not use hardware decode, for example), so be wary of this. It is similar to {{{playbin}}} in that it aids in creating a dynamic pipeline.


[=#gst-play]
=== GStreamer {{{gst-play-1.0}}} ===
The stand-alone application {{{gst-play-1.0}}} utilizes the {{{playbin}}} element and thus can be used for playback of many file types. The above example {{{gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi}}} can be replaced with:
{{{
#!bash
gst-play-1.0 /media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi
}}}
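
{{{gst-play-1.0}}} also accepts sink overrides on the command line; check {{{gst-play-1.0 --help}}} for the exact options available on your image, but typically something like the following works:
{{{
#!bash
gst-play-1.0 --videosink=imxipuvideosink --audiosink="alsasink device=hw:1,0" \
  /media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi
}}}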


[=#determining-pipelines]
== How to determine what pipeline is needed to decode and play ==
Sometimes the above Bin elements are not flexible enough and you need to determine exactly what pipeline you can use to decode and play a stream.

The {{{gst-launch}}} application provides a couple of useful debugging tools that can help with this:
 * using {{{GST_DEBUG_DUMP_DOT_DIR}}} and Graphviz
 * using {{{gst-launch -v}}}
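
In addition, the {{{gst-discoverer-1.0}}} tool (part of the gst-plugins-base tools, if installed on your image) can print a file's container and codec topology directly, which often tells you which demuxer, parsers, and decoders you will need:
{{{
gst-discoverer-1.0 /mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi
}}}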

[=#filter-graph]
=== GST_DEBUG_DUMP_DOT_DIR ===
You can set the {{{GST_DEBUG_DUMP_DOT_DIR}}} environment variable to a directory, which will cause {{{gst-launch}}} to output a {{{.dot}}} file for each phase of the pipeline. You can then use a tool such as {{{Graphviz}}} to visualize the {{{.dot}}} files.
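
Note that GStreamer will not create this directory for you, so create it before launching the pipeline:
{{{
mkdir -p /tmp/dot
}}}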

Example:
 * use playbin to playback a file:
{{{
root@ventana:~# GST_DEBUG_DUMP_DOT_DIR=/tmp/dot gst-launch-1.0 playbin uri=file:///mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi
}}}
  - hit Ctrl-C after decoding starts to exit early
 * see the dot files created:
{{{
root@ventana:~# ls /tmp/dot
0.00.00.108710334-gst-launch.NULL_READY.dot
0.00.00.490807334-gst-launch.READY_PAUSED.dot
0.00.00.506736000-gst-launch.PAUSED_PLAYING.dot
0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot
0.00.03.254000001-gst-launch.PAUSED_READY.dot
}}}
 * transfer to a PC and use something like {{{xdot}}} to view (or render it to an image with Graphviz, as shown after this list):
{{{
xdot 0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot
}}}
  - zoom in along the graph and you can see that:
   - {{{GstFileSrc}}} is the source,
   - {{{GstAviDemux}}} is used to demux to audio/x-ac3,
   - {{{GstAc3Parse}}} is used to parse the audio into audio frames,
   - {{{GstMpeg4VParse}}} is used to parse the video into video frames,
   - {{{GstImxVpuDec}}} is used to decode the video from video/mpeg to video/x-raw,
   - {{{GstA52Dec}}} is used to decode the audio from audio/x-ac3 to audio/x-raw,
   - etc.
  - Note that some hunting with {{{gst-inspect}}} must be done to determine which elements correspond to the above class names
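
If you prefer a static image over an interactive viewer, Graphviz can render the same {{{.dot}}} file to a PNG:
{{{
dot -Tpng 0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot -o pipeline.png
}}}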

Reference:
 - http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+11%3A+Debugging+tools

=== gst-launch -v ===
The verbose output from {{{gst-launch -v}}} can show you the negotiation that takes place as a pipeline moves through its stages.

Example:
{{{
gst-launch-1.0 -v playbin uri=file:///mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi
}}}

Examining the verbose output for this file shows the following:
 * container: AVI: avidemux
 * video: MPEG-4, 4481kbps min, 6668kbps max: mpeg4videoparse ! imxvpudec
 * audio: AC3, 48kHz, 5 channels: ac3parse ! a52dec

Therefore you can use these pipelines to decode and play:
 * video only (output to imxipuvideosink fb0)
{{{
gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux ! mpeg4videoparse ! imxvpudec ! imxipuvideosink
}}}
 * audio only (output to the HDMI audio sink)
{{{
gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux ! ac3parse ! a52dec ! audioconvert ! alsasink device="sysdefault:CARD=imxhdmisoc"
}}}
 * both audio and video
{{{
gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux name=d \
  d. ! queue ! mpeg4videoparse ! imxvpudec ! imxipuvideosink \
  d. ! queue ! ac3parse ! a52dec ! audioconvert ! alsasink device="sysdefault:CARD=imxhdmisoc"
}}}