| 1 | [[PageOutline]] |
| 2 | |
| 3 | = GStreamer on Gateworks SBCs |
The goal of this article is to provide enough information for users to create their own GStreamer pipelines. These '''GStreamer pipelines have been created and tested for use on the Gateworks Single Board Computers (SBCs)''', specifically the Ventana family, which utilizes the Freescale i.MX6 processors.
| 5 | |
| 6 | Gateworks SBCs can be viewed at the following link: [http://www.gateworks.com] |
| 7 | |
| 8 | [[Image(http://trac.gateworks.com/raw-attachment/wiki/OpenWrt/wireless/relayd/gw5100realsmall.png,200px)]] |
| 9 | |
| 10 | '''Important Note''' |
| 11 | |
The IMX6 (Ventana family) software has been evolving over the years. This page focuses on mainline Linux kernels, not the legacy Freescale {{{gstreamer-imx}}} GStreamer plugin software that used the 3.14 Freescale vendor kernel. If using the Gateworks 3.14 kernel with the Freescale capture drivers, please see [wiki:Yocto/gstreamer]
| 13 | |
Also note that GStreamer capture devices are {{{/dev/videoN}}} devices, which require some configuration when using processor-specific video input busses such as the IMX6 capture of analog (CVBS) or digital (HDMI) video. If using these capture interfaces, please see [wiki:linux/media] for details on how to use {{{media-ctl}}} to configure those devices.
| 15 | |
| 16 | |
| 17 | = Video4Linux2 Devices |
| 18 | The Video4Linux2 API is the device driver API used for kernel capture drivers as well as video encode and video decode drivers. |
| 19 | |
| 20 | |
| 21 | [=#media-ctl] |
| 22 | == Media Control API |
| 23 | The !MediaControl API is used to configure Video4Linux2 capture devices. See [wiki:linux/media] for more info. |
| 24 | |
| 25 | |
| 26 | [=#v4l2-ctrls] |
| 27 | == V4L2 Controls (brightness, contrast, saturation, rotation, flip, bitrate etc) |
When capturing video from a video4linux2 device you can use the v4l2 API to get and set various controls the device may provide access to, such as:
| 29 | * brightness |
| 30 | * contrast |
| 31 | * hue |
| 32 | * saturation |
| 33 | * horizontal flip |
* vertical flip
| 35 | * rotation |
| 36 | |
Which controls are available depends on the device. Furthermore, when using capture devices that have media-ctl pipelines, controls provided by the various elements within the pipeline are passed on to the capture device.
| 38 | |
| 39 | Examples: |
| 40 | * list available controls of a capture device: |
| 41 | {{{#!bash |
| 42 | media-ctl-setup adv7180 > setup |
| 43 | source setup |
| 44 | v4l2-ctl --device $DEVICE --list-ctrls # list available controls |
| 45 | }}} |
| 46 | * get brightness control for adv7180 analog video capture device: |
| 47 | {{{#!bash |
| 48 | media-ctl-setup adv7180 > setup |
| 49 | source setup |
| 50 | v4l2-ctl --device $DEVICE --get-ctrl=brightness |
| 51 | }}} |
| 52 | * set brightness control for adv7180 analog video capture device: |
| 53 | {{{#!bash |
| 54 | media-ctl-setup adv7180 > setup |
| 55 | source setup |
| 56 | v4l2-ctl --device $DEVICE --set-ctrl=brightness=50 |
| 57 | }}} |
| 58 | * set video bitrate of the v4l2h264enc H264 encoder element: |
| 59 | {{{#!bash |
| 60 | media-ctl-setup adv7180 > setup |
| 61 | source setup |
| 62 | v4l2-ctl --device $ENCODER --set-ctrl=video_bitrate=10000000 |
| 63 | }}} |
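
Multiple controls can also be set in a single invocation by separating them with commas. A minimal sketch, assuming the same adv7180 setup as above and that both control names exist on your device:
{{{#!bash
media-ctl-setup adv7180 > setup
source setup
# set brightness and contrast together in one call
v4l2-ctl --device $DEVICE --set-ctrl=brightness=50,contrast=40
}}}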
| 64 | |
| 65 | |
| 66 | |
| 67 | [=#test-media] |
| 68 | = Test Media Files |
There is a variety of royalty-free media available on the Internet that can be used freely in demos and for codec and file format testing:
| 70 | * [https://peach.blender.org/download/ Big Buck Bunny] - Creative Commons |
| 71 | * [https://durian.blender.org/download/ Sintel] - Creative Commons |
| 72 | * [https://mango.blender.org/download/ Tears of Steel] - Creative Commons |
| 73 | |
| 74 | |
| 75 | = GStreamer |
[http://GStreamer.freedesktop.org GStreamer] is an open-source library and framework created to handle multimedia. Pipelines are used to source and sink media however you would like (e.g. decoding the MP3 audio out of a video file and playing it back through speakers).
| 77 | |
It is important to understand that GStreamer is foremost a library meant for developing applications. While it ships with an extremely useful test application called {{{gst-launch}}} for testing pipelines, that application alone can't solve all problems. There are plenty of resources online, including mailing lists and IRC channels, that can be helpful in understanding GStreamer and writing applications that use it.
| 79 | |
| 80 | To install GStreamer on Ubuntu: |
| 81 | {{{#!bash |
| 82 | apt-get install gstreamer1.0-x gstreamer1.0-tools libgstreamer1.0-0 \ |
| 83 | gstreamer1.0-plugins-base gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly \ |
| 84 | gstreamer1.0-libav gstreamer1.0-alsa gstreamer1.0-gl gstreamer1.0-gtk3 gstreamer1.0-qt5 gstreamer1.0-pulseaudio \ |
| 85 | v4l-utils |
| 86 | }}} |
| 87 | |
| 88 | [=#plugins] |
| 89 | == Plugins and Elements |
GStreamer is built on 'plugins' that provide pipeline 'elements'. A plugin comprises elements that can do work on a media stream. For example, the {{{mpeg2dec}}} element can decode MPEG-1 and MPEG-2 video streams (software-based decoding, not hardware-accelerated).
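
For example, to see which plugin provides an element along with that element's pad capabilities and properties:
{{{#!bash
# show plugin details, pad templates, and properties for the mpeg2dec element
gst-inspect-1.0 mpeg2dec
}}}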
| 91 | |
| 92 | |
| 93 | [=#pipelines] |
| 94 | == Pipeline Construction |
[http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/section-checklist-applications.html gst-launch] is a simple GStreamer application that can be used to test pipelines. A pipeline connects media-handling components such as sources, sinks, decoders, encoders, muxers, demuxers, etc. Each of these elements is made available through a GStreamer plugin. That is, if a certain plugin isn't installed on your device, you won't be able to use that element. To see an element's information, type {{{gst-inspect element}}} where {{{element}}} is the element you are looking for. To see a full listing of installed elements, type just {{{gst-inspect}}}.
| 96 | |
| 97 | Here is a graphical example of a pipeline. |
| 98 | |
| 99 | [[Image(https://gstreamer.freedesktop.org/documentation/tutorials/basic/images/figure-1.png, 512px)]] |
| 100 | |
The most basic and important thing to know is the pipe '!'. This is similar to Linux's pipe '|': it takes the output of one element and feeds it into the next.
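
For example, a minimal pipeline that pipes the output of a test source into an automatically selected display sink:
{{{#!bash
# the '!' pipes videotestsrc's output into autovideosink's input
gst-launch-1.0 videotestsrc ! autovideosink
}}}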
| 102 | |
| 103 | [http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+10%3A+gstreamer+tools gstreamer.com] contains much more information on the construction of pipelines. |
| 104 | |
| 105 | |
| 106 | [=#capsfilters] |
| 107 | == Caps Filters |
| 108 | GStreamer has a concept called [http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/section-caps-api.html caps filters]. A 'cap' (short for capability) is used to describe the type of data that links a src (output) pad from one element to a sink (input) pad of another element. |
| 109 | |
| 110 | Adding a {{{-v}}} flag to {{{gst-launch-1.0}}} will output the capabilities negotiated between elements: |
| 111 | {{{#!bash |
| 112 | # gst-launch-1.0 -v videotestsrc ! fakesink |
| 113 | Setting pipeline to PAUSED ... |
| 114 | Pipeline is PREROLLING ... |
| 115 | /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 116 | /GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 117 | Pipeline is PREROLLED ... |
| 118 | Setting pipeline to PLAYING ... |
| 119 | New clock: GstSystemClock |
| 120 | }}} |
| 121 | |
| 122 | In the output, the {{{caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"}}} are the video caps negotiated between the two. |
| 123 | |
| 124 | You can force a caps filter between two elements by treating it as a pipeline element: |
| 125 | {{{#!bash |
| 126 | ~# gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=UYVY, width=1920, height=1080, framerate=10/1' ! fakesink |
| 127 | Setting pipeline to PAUSED ... |
| 128 | Pipeline is PREROLLING ... |
| 129 | /GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 130 | /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 131 | /GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 132 | /GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive" |
| 133 | Pipeline is PREROLLED ... |
| 134 | Setting pipeline to PLAYING ... |
| 135 | New clock: GstSystemClock |
| 136 | }}} |
| 137 | |
As you can see, the caps filter above changed the video format, resolution, and framerate of the video stream coming out of the {{{videotestsrc}}} element.
| 139 | |
Caps filters are useful when you want to use a specific format where a device supports multiple. For example, when capturing audio from an ALSA sound device that supports a variety of samplerates, channels, and bit formats, you can use a capsfilter to override any default configuration, as shown below.
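
A minimal sketch, assuming an ALSA capture device at {{{hw:0,0}}} that supports 48kHz stereo 16-bit audio:
{{{#!bash
# force 48kHz stereo S16LE capture instead of the device default
gst-launch-1.0 alsasrc device="hw:0,0" ! \
  'audio/x-raw,rate=48000,channels=2,format=S16LE' ! fakesink
}}}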
| 141 | |
| 142 | |
| 143 | |
| 144 | [=#applications] |
| 145 | == Applications |
GStreamer is a library and framework for multimedia processing, not a user-space application. It is easy to forget this because of the usefulness of the {{{gst-launch}}} application, which can be used to connect simple pipelines together for testing. For production use you will very likely need to write a GStreamer application. GStreamer applications use GLib, so one should be somewhat familiar with that.
| 147 | |
GStreamer has an excellent hello world example that clearly explains what each step is meant to do. Please see [http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-helloworld.html here] for more details.
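
As a quick approximation of what the hello world application does (automatic pipeline construction and playback), the {{{playbin}}} element can be driven from the shell; a sketch, substituting your own media path:
{{{#!bash
# playbin builds a complete decode/playback pipeline automatically
gst-launch-1.0 playbin uri=file:///path/to/file.ogg
}}}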
| 149 | |
Furthermore, Gateworks has created a [#gst-variable-rtsp-server gst-variable-rtsp-server] example application that demonstrates how to serve simple GStreamer pipelines via an RTSP server and automatically adjust compression quality based on client connections.
| 151 | |
| 152 | |
| 153 | [=#fakesink] |
| 154 | == fakesink |
The {{{fakesink}}} element is a very useful sink when debugging or learning GStreamer. It takes whatever buffers it is given and drops them, which can help isolate problems when debugging pipelines.
| 156 | |
| 157 | Example: |
| 158 | {{{#!bash |
| 159 | gst-launch-1.0 videotestsrc ! fakesink |
| 160 | }}} |
| 161 | |
| 162 | |
| 163 | == gst-inspect |
The {{{gst-inspect-1.0}}} tool is an excellent resource if you're unfamiliar with the functionality, source/sink formats, or properties of any GStreamer element.
| 165 | |
| 166 | Usage: |
| 167 | {{{#!bash |
| 168 | root@bionic-armhf:~# gst-inspect-1.0 videotestsrc |
| 169 | Factory Details: |
| 170 | Rank none (0) |
| 171 | Long-name Video test source |
| 172 | Klass Source/Video |
| 173 | Description Creates a test video stream |
| 174 | Author David A. Schleef <ds@schleef.org> |
| 175 | |
| 176 | Plugin Details: |
| 177 | Name videotestsrc |
| 178 | Description Creates a test video stream |
| 179 | Filename /usr/lib/arm-linux-gnueabihf/gstreamer-1.0/libgstvideotestsrc.so |
| 180 | Version 1.14.5 |
| 181 | License LGPL |
| 182 | Source module gst-plugins-base |
| 183 | Source release date 2019-05-29 |
| 184 | Binary package GStreamer Base Plugins (Ubuntu) |
| 185 | Origin URL https://launchpad.net/distros/ubuntu/+source/gst-plugins-base1.0 |
| 186 | |
| 187 | GObject |
| 188 | +----GInitiallyUnowned |
| 189 | +----GstObject |
| 190 | +----GstElement |
| 191 | +----GstBaseSrc |
| 192 | +----GstPushSrc |
| 193 | +----GstVideoTestSrc |
| 194 | |
| 195 | Pad Templates: |
| 196 | SRC template: 'src' |
| 197 | Availability: Always |
| 198 | Capabilities: |
| 199 | video/x-raw |
| 200 | format: { (string)I420, (string)YV12, (string)YUY2, (string)UYVY, (string)AYUV, (string)RGBx, (string)BGRx, (string)xRGB, (string)xBGR, (string)RGBA, (string)BGRA, (string)ARGB, (string)ABGR, (string)RGB, (string)BGR, (string)Y41B, (string)Y42B, (string)YVYU, (string)Y444, (string)v210, (string)v216, (string)NV12, (string)NV21, (string)GRAY8, (string)GRAY16_BE, (string)GRAY16_LE, (string)v308, (string)RGB16, (string)BGR16, (string)RGB15, (string)BGR15, (string)UYVP, (string)A420, (string)RGB8P, (string)YUV9, (string)YVU9, (string)IYU1, (string)ARGB64, (string)AYUV64, (string)r210, (string)I420_10BE, (string)I420_10LE, (string)I422_10BE, (string)I422_10LE, (string)Y444_10BE, (string)Y444_10LE, (string)GBR, (string)GBR_10BE, (string)GBR_10LE, (string)NV16, (string)NV24, (string)NV12_64Z32, (string)A420_10BE, (string)A420_10LE, (string)A422_10BE, (string)A422_10LE, (string)A444_10BE, (string)A444_10LE, (string)NV61, (string)P010_10BE, (string)P010_10LE, (string)IYU2, (string)VYUY, (string)GBRA, (string)GBRA_10BE, (string)GBRA_10LE, (string)GBR_12BE, (string)GBR_12LE, (string)GBRA_12BE, (string)GBRA_12LE, (string)I420_12BE, (string)I420_12LE, (string)I422_12BE, (string)I422_12LE, (string)Y444_12BE, (string)Y444_12LE, (string)GRAY10_LE32, (string)NV12_10LE32, (string)NV16_10LE32 } |
| 201 | width: [ 1, 2147483647 ] |
| 202 | height: [ 1, 2147483647 ] |
| 203 | framerate: [ 0/1, 2147483647/1 ] |
| 204 | multiview-mode: { (string)mono, (string)left, (string)right } |
| 205 | video/x-bayer |
| 206 | format: { (string)bggr, (string)rggb, (string)grbg, (string)gbrg } |
| 207 | width: [ 1, 2147483647 ] |
| 208 | height: [ 1, 2147483647 ] |
| 209 | framerate: [ 0/1, 2147483647/1 ] |
| 210 | multiview-mode: { (string)mono, (string)left, (string)right } |
| 211 | |
| 212 | Element has no clocking capabilities. |
| 213 | Element has no URI handling capabilities. |
| 214 | |
| 215 | Pads: |
| 216 | SRC: 'src' |
| 217 | Pad Template: 'src' |
| 218 | |
| 219 | Element Properties: |
| 220 | name : The name of the object |
| 221 | flags: readable, writable |
| 222 | String. Default: "videotestsrc0" |
| 223 | parent : The parent of the object |
| 224 | flags: readable, writable |
| 225 | Object of type "GstObject" |
| 226 | blocksize : Size in bytes to read per buffer (-1 = default) |
| 227 | flags: readable, writable |
| 228 | Unsigned Integer. Range: 0 - 4294967295 Default: 4096 |
| 229 | num-buffers : Number of buffers to output before sending EOS (-1 = unlimited) |
| 230 | flags: readable, writable |
| 231 | Integer. Range: -1 - 2147483647 Default: -1 |
| 232 | typefind : Run typefind before negotiating (deprecated, non-functional) |
| 233 | flags: readable, writable, deprecated |
| 234 | Boolean. Default: false |
| 235 | do-timestamp : Apply current stream time to buffers |
| 236 | flags: readable, writable |
| 237 | Boolean. Default: false |
| 238 | pattern : Type of test pattern to generate |
| 239 | flags: readable, writable |
| 240 | Enum "GstVideoTestSrcPattern" Default: 0, "smpte" |
| 241 | (0): smpte - SMPTE 100% color bars |
| 242 | (1): snow - Random (television snow) |
| 243 | (2): black - 100% Black |
| 244 | (3): white - 100% White |
| 245 | (4): red - Red |
| 246 | (5): green - Green |
| 247 | (6): blue - Blue |
| 248 | (7): checkers-1 - Checkers 1px |
| 249 | (8): checkers-2 - Checkers 2px |
| 250 | (9): checkers-4 - Checkers 4px |
| 251 | (10): checkers-8 - Checkers 8px |
| 252 | (11): circular - Circular |
| 253 | (12): blink - Blink |
| 254 | (13): smpte75 - SMPTE 75% color bars |
| 255 | (14): zone-plate - Zone plate |
| 256 | (15): gamut - Gamut checkers |
| 257 | (16): chroma-zone-plate - Chroma zone plate |
| 258 | (17): solid-color - Solid color |
| 259 | (18): ball - Moving ball |
| 260 | (19): smpte100 - SMPTE 100% color bars |
| 261 | (20): bar - Bar |
| 262 | (21): pinwheel - Pinwheel |
| 263 | (22): spokes - Spokes |
| 264 | (23): gradient - Gradient |
| 265 | (24): colors - Colors |
| 266 | timestamp-offset : An offset added to timestamps set on buffers (in ns) |
| 267 | flags: readable, writable |
| 268 | Integer64. Range: 0 - 2147483646999999999 Default: 0 |
| 269 | is-live : Whether to act as a live source |
| 270 | flags: readable, writable |
| 271 | Boolean. Default: false |
| 272 | k0 : Zoneplate zero order phase, for generating plain fields or phase offsets |
| 273 | flags: readable, writable |
| 274 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 275 | kx : Zoneplate 1st order x phase, for generating constant horizontal frequencies |
| 276 | flags: readable, writable |
| 277 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 278 | ky : Zoneplate 1st order y phase, for generating contant vertical frequencies |
| 279 | flags: readable, writable |
| 280 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 281 | kt : Zoneplate 1st order t phase, for generating phase rotation as a function of time |
| 282 | flags: readable, writable |
| 283 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 284 | kxt : Zoneplate x*t product phase, normalised to kxy/256 cycles per vertical pixel at width/2 from origin |
| 285 | flags: readable, writable |
| 286 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 287 | kyt : Zoneplate y*t product phase |
| 288 | flags: readable, writable |
| 289 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 290 | kxy : Zoneplate x*y product phase |
| 291 | flags: readable, writable |
| 292 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 293 | kx2 : Zoneplate 2nd order x phase, normalised to kx2/256 cycles per horizontal pixel at width/2 from origin |
| 294 | flags: readable, writable |
| 295 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 296 | ky2 : Zoneplate 2nd order y phase, normailsed to ky2/256 cycles per vertical pixel at height/2 from origin |
| 297 | flags: readable, writable |
| 298 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 299 | kt2 : Zoneplate 2nd order t phase, t*t/256 cycles per picture |
| 300 | flags: readable, writable |
| 301 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 302 | xoffset : Zoneplate 2nd order products x offset |
| 303 | flags: readable, writable |
| 304 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 305 | yoffset : Zoneplate 2nd order products y offset |
| 306 | flags: readable, writable |
| 307 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 308 | foreground-color : Foreground color to use (big-endian ARGB) |
| 309 | flags: readable, writable, controllable |
| 310 | Unsigned Integer. Range: 0 - 4294967295 Default: 4294967295 |
| 311 | background-color : Background color to use (big-endian ARGB) |
| 312 | flags: readable, writable, controllable |
| 313 | Unsigned Integer. Range: 0 - 4294967295 Default: 4278190080 |
| 314 | horizontal-speed : Scroll image number of pixels per frame (positive is scroll to the left) |
| 315 | flags: readable, writable |
| 316 | Integer. Range: -2147483648 - 2147483647 Default: 0 |
| 317 | animation-mode : For pattern=ball, which counter defines the position of the ball. |
| 318 | flags: readable, writable |
| 319 | Enum "GstVideoTestSrcAnimationMode" Default: 0, "frames" |
| 320 | (0): frames - frame count |
| 321 | (1): wall-time - wall clock time |
| 322 | (2): running-time - running time |
| 323 | motion : For pattern=ball, what motion the ball does |
| 324 | flags: readable, writable |
| 325 | Enum "GstVideoTestSrcMotionType" Default: 0, "wavy" |
| 326 | (0): wavy - Ball waves back and forth, up and down |
| 327 | (1): sweep - 1 revolution per second |
| 328 | (2): hsweep - 1/2 revolution per second, then reset to top |
| 329 | flip : For pattern=ball, invert colors every second. |
| 330 | flags: readable, writable |
| 331 | Boolean. Default: false |
| 332 | }}} |
| 333 | |
| 334 | |
| 335 | |
| 336 | |
| 337 | [=#video] |
| 338 | = GStreamer Video |
| 339 | This section will introduce several concepts regarding how video media is handled by GStreamer and provide several example pipelines. |
| 340 | |
| 341 | |
| 342 | [=#video-output] |
| 343 | == Video Output |
| 344 | Generally, a GStreamer 'sink' element is one that will take a video stream and output it to a display. |
| 345 | |
| 346 | Some commonly used video output sinks: |
| 347 | * autovideosink: Wrapper video sink for automatically detecting video output device |
| 348 | * kmssink: KMS video device sink |
| 349 | * fbdevsink: Framebuffer video device sink |
| 350 | |
Execute {{{gst-inspect-1.0 | grep sink}}} to see a complete list of available sinks and {{{gst-inspect-1.0 <element>}}} to see specific details about the applicable formats and properties of a specific element.
| 352 | |
**Important Note**: Video display drivers used with video display sinks such as kmssink and fbdevsink for HDMI and LVDS output require proper kernel command-line parameters to configure them. Sometimes the bootloader bootscript will do this for you. Please see [wiki:linux/display] for details, and check the kernel command-line after booting to make sure the desired device is enabled and configured for the proper resolution and mode.
| 354 | |
| 355 | Examples: |
* HDMI enabled and configured for 1080p 60Hz, LVDS disabled:
| 357 | {{{#!bash |
| 358 | root@bionic-armhf:~# for arg in $(cat /proc/cmdline); do echo $arg | grep video; done |
| 359 | video=HDMI-A-1:1920x1080M@60 |
| 360 | video=LVDS-1:d |
| 361 | }}} |
* HDMI disabled and LVDS configured for 1280x800 60Hz:
| 363 | {{{#!bash |
| 364 | root@bionic-armhf:~# for arg in $(cat /proc/cmdline); do echo $arg | grep video; done |
| 365 | video=LVDS-1:1280x800@60M |
| 366 | video=HDMI-A-1:d |
| 367 | }}} |
| 368 | |
| 369 | |
| 370 | [=#autovideosink] |
| 371 | === autovideosink |
| 372 | This GStreamer sink is not really a 'video' sink in the traditional sense. Similar to {{{playbin}}} and {{{decodebin}}}, this element selects what it thinks is the best available video sink and uses it. |
| 373 | |
| 374 | This will typically use {{{kmssink}}} unless format choices require one of the other sinks. |
| 375 | |
| 376 | You can add a verbose flag {{{gst-launch-1.0 -v}}} to see details about the elements and caps chosen when using any type of 'auto' element or 'bin' element. |
| 377 | |
| 378 | Example: |
| 379 | {{{#!bash |
| 380 | gst-launch-1.0 videotestsrc ! autovideosink |
| 381 | }}} |
| 382 | * Note that adding a '-v' flag for verbose mode will show the element configuration to let you know what sink was actually chosen |
| 383 | |
| 384 | |
| 385 | [=#kmssink] |
| 386 | === kmssink |
This sink outputs directly to a KMS (kernel mode setting) driver. This is the most efficient video output method.
| 388 | |
| 389 | Example: |
| 390 | {{{#!bash |
| 391 | gst-launch-1.0 videotestsrc ! kmssink |
| 392 | }}} |
| 393 | |
| 394 | |
[=#fbdevsink]
=== fbdevsink
The framebuffer device sink outputs directly to a Linux framebuffer device if you have one ({{{ls /sys/class/graphics/fb*}}}).
| 397 | |
| 398 | Example: |
| 399 | {{{#!bash |
| 400 | gst-launch-1.0 videotestsrc ! fbdevsink |
| 401 | }}} |
| 402 | |
| 403 | |
| 404 | |
| 405 | [=#video-input] |
| 406 | == Video Input (Capture) |
A video input source is anything coming from an input on the device, e.g. HDMI input or a USB web cam. This is referred to as video capture. In GStreamer terms a video capture device is a 'source' or 'src' element.
| 408 | |
| 409 | Please refer to the [wiki:linux/media linux/media] page for details on the video in devices on the Ventana platform. |
| 410 | |
| 411 | Some commonly used GStreamer video input sources: |
| 412 | * videotestsrc |
| 413 | * v4l2src |
| 414 | |
Execute {{{gst-inspect-1.0 | grep src}}} to see a complete list of available source elements and {{{gst-inspect-1.0 <element>}}} to see specific details about the applicable formats and properties of a specific element.
| 416 | |
If the {{{is-live}}} property is set to true, buffers are discarded while the pipeline is in the paused state and the source will not participate in the PREROLL phase of the pipeline, as illustrated below.
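
For example, {{{videotestsrc}}} is not live by default, but it can be forced to act as a live source to observe this behavior:
{{{#!bash
# run the test source as a live source (buffers produced in real time, no preroll)
gst-launch-1.0 videotestsrc is-live=true ! fakesink
}}}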
| 418 | |
** Important Note: ** Linux video capture devices provided by CPUs such as the IMX6 analog (CVBS) and digital (HDMI) inputs on Ventana boards require a hardware pipeline configuration. See [wiki:linux/media] for details on how to configure the video capture device pipeline.
| 420 | |
| 421 | |
| 422 | [=#videotestsrc] |
| 423 | === videotestsrc |
This is a very useful element for testing pipelines as it creates video content. It can output a huge number of video formats for raw video ({{{video/x-raw}}}) and Bayer video ({{{video/x-bayer}}}). Use {{{gst-inspect-1.0 videotestsrc}}} to see its output formats and properties such as video test pattern, orientation, colors, etc.
| 425 | |
| 426 | Examples: |
| 427 | * display SMPTE pattern to connected and configured display |
| 428 | {{{#!bash |
| 429 | gst-launch-1.0 videotestsrc ! kmssink |
| 430 | }}} |
| 431 | * set the 'pattern' property to display a moving ball: |
| 432 | {{{#!bash |
| 433 | gst-launch-1.0 videotestsrc pattern=18 ! kmssink |
| 434 | }}} |
* Use a '[#capsfilters caps filter]' to force the output of videotestsrc to a specific resolution, format, and framerate (ie 1080p 10fps UYVY):
| 436 | {{{#!bash |
| 437 | gst-launch-1.0 videotestsrc ! 'video/x-raw,format=UYVY,width=1920,height=1080,framerate=10/1' ! kmssink |
| 438 | }}} |
| 439 | |
| 440 | [=#v4l2src] |
| 441 | === v4l2src |
| 442 | This element uses the video4linux2 API to capture video from input sources (/dev/video<n>). |
| 443 | |
| 444 | Note that {{{v4l2src}}} is always live regardless of the is-live property. |
| 445 | |
Note that you need to configure the media-ctl pipeline that feeds the /dev/video<n> devices - see [wiki:linux/media] for details.
| 447 | |
You can use the {{{v4l2-ctl}}} application from the {{{v4l-utils}}} package to interact with the device and get/set various capabilities and controls.
| 449 | |
| 450 | Examples: |
| 451 | * show all video4linux2 devices: |
| 452 | {{{#!bash |
| 453 | root@bionic-armhf:~# v4l2-ctl --list-devices |
| 454 | CODA960 (platform:coda): |
| 455 | /dev/video8 |
| 456 | /dev/video9 |
| 457 | |
| 458 | imx-media-mem2mem (platform:imx-media-mem2mem): |
| 459 | /dev/video10 |
| 460 | |
| 461 | imx-media-capture (platform:ipu1_csi0): |
| 462 | /dev/video4 |
| 463 | |
| 464 | imx-media-capture (platform:ipu1_csi1): |
| 465 | /dev/video5 |
| 466 | |
| 467 | imx-media-capture (platform:ipu1_ic_prpenc): |
| 468 | /dev/video0 |
| 469 | |
| 470 | imx-media-capture (platform:ipu1_ic_prpvf): |
| 471 | /dev/video1 |
| 472 | |
| 473 | imx-media-capture (platform:ipu2_csi0): |
| 474 | /dev/video6 |
| 475 | |
| 476 | imx-media-capture (platform:ipu2_csi1): |
| 477 | /dev/video7 |
| 478 | |
| 479 | imx-media-capture (platform:ipu2_ic_prpenc): |
| 480 | /dev/video2 |
| 481 | |
| 482 | imx-media-capture (platform:ipu2_ic_prpvf): |
| 483 | /dev/video3 |
| 484 | |
| 485 | }}} |
| 486 | * display all details about /dev/video1: |
| 487 | {{{#!bash |
| 488 | v4l2-ctl -d /dev/video1 --all |
| 489 | }}} |
| 490 | * Use Gateworks {{{media-ctl-setup}}} to configure the media-ctl pipeline for Analog Video capture using the adv7180 chip: |
| 491 | {{{#!bash |
| 492 | wget https://raw.githubusercontent.com/Gateworks/media-ctl-setup/master/media-ctl-setup |
| 493 | chmod +x media-ctl-setup |
| 494 | ./media-ctl-setup adv7180 > setup |
| 495 | . ./setup |
| 496 | }}} |
 - you can look at the setup script to see what was done; the details vary between SoC (IMX6DL vs IMX6Q), board, and sensor (adv7180 for analog video capture, tda1997x for digital video capture)
 - note that the setup script exports the video capture device in the DEVICE env variable, which you can use as shown below
| 499 | * After sourcing setup above you can capture and display to the connected and configured output display (video loopback): |
| 500 | {{{#!bash |
| 501 | gst-launch-1.0 v4l2src device=$DEVICE ! kmssink |
| 502 | }}} |
| 503 | |
| 504 | |
| 505 | [=#colorspace] |
| 506 | [=#scaling] |
| 507 | [=#mem2mem] |
== Colorspace Conversion and/or Video Scaling (via IMX6 IPU mem2mem driver)
Oftentimes a colorspace conversion or scaling operation is required in order to link GStreamer elements together. This is often because not all elements can handle every format available.
| 510 | |
| 511 | A Linux V4L2 MEM2MEM imx-media driver exists that allows utilizing the IMX6 IPU Image Converter hardware blocks (IC) to perform hardware colorspace conversion (CSC), scaling, cropping, rotation, and flip operations. |
| 512 | |
The GStreamer {{{video4linux2}}} plugin provides an element that uses this driver to expose these capabilities to GStreamer applications.
| 514 | |
| 515 | Notes: |
| 516 | - for GStreamer-1.14 the name of the element depends on the video device the driver registers with the kernel (ie v4l2video8convert if mem2mem driver registers /dev/video8) |
| 517 | - for GStreamer master (in development) the name of the element is always 'v4l2videoconvert' |
| 518 | - the {{{kmssink}}} examples below need a {{{can-scale=false}}} property to tell GStreamer not to scale via the KMS driver (as the IMX6 KMS driver does not support scaling) |
- ensure that the input format differs from the output format, otherwise GStreamer will bypass the conversion completely; note that GStreamer doesn't understand flipping or rotation as part of the format. GStreamer master (in development) adds a 'disable-passthrough' property to the v4l2videoconvert element that can be set to force the conversion regardless of input and output format
- when using imx entities (ie capture, encode/decode, mem2mem, display) you can specify 'output-io-mode=dmabuf-import' to share dmabuf pointers for a zero-copy pipeline; however, if using non-imx entities (ie videotestsrc) you must omit it, as you cannot ensure the buffers share the alignment/stride necessary to share dmabuf pointers
| 521 | |
| 522 | Examples: |
| 523 | * Ensure mem2mem is in your kernel: |
| 524 | {{{#!bash |
| 525 | ~# dmesg | grep mem2mem |
| 526 | [ 18.356023] imx-media: Registered ipu_ic_pp mem2mem as /dev/video8 |
| 527 | }}} |
| 528 | * Ensure GStreamer element exists: |
| 529 | {{{#!bash |
| 530 | ~# gst-inspect-1.0 | grep -e "v4l2.*convert" |
| 531 | video4linux2: v4l2convert: V4L2 Video Converter |
| 532 | }}} |
| 533 | - Note that for GStreamer-1.14, the name of the element depends on the video device the driver registers with the kernel (video8 in the above example). This changes in GStreamer-1.16 to always be 'v4l2videoconvert' |
* Obtain the name of the element (as it can vary between GStreamer-1.14 and GStreamer-1.16):
| 535 | {{{#!bash |
| 536 | GST_CONVERT=$(gst-inspect-1.0 | grep -e "v4l2.*convert*" | sed -e 's/.*:\s*\(v4l2.*convert\):.*/\1/') |
| 537 | }}} |
* scale/rotate/flip using {{{videotestsrc}}} (cannot use dmabufs for this as it is a non-imx entity)
| 539 | {{{#!bash |
| 540 | # upscale |
| 541 | gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! \ |
| 542 | $GST_CONVERT ! \ |
| 543 | video/x-raw,width=640,height=480 ! kmssink can-scale=false |
| 544 | # downscale |
| 545 | gst-launch-1.0 videotestsrc ! video/x-raw,width=640,height=480 ! \ |
| 546 | $GST_CONVERT ! \ |
| 547 | video/x-raw,width=320,height=240 ! kmssink can-scale=false |
| 548 | # rotate |
| 549 | gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! \ |
| 550 | $GST_CONVERT extra-controls=cid,rotate=90 ! \ |
| 551 | video/x-raw,width=240,height=320 ! kmssink can-scale=false |
| 552 | # hflip |
| 553 | gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! \ |
| 554 | $GST_CONVERT extra-controls=cid,horizontal_flip=1 ! \ |
| 555 | video/x-raw,width=640,height=480 ! kmssink can-scale=false |
| 556 | # vflip |
| 557 | gst-launch-1.0 videotestsrc ! video/x-raw,width=320,height=240 ! \ |
| 558 | $GST_CONVERT extra-controls=cid,vertical_flip=1 ! \ |
| 559 | video/x-raw,width=640,height=480 ! kmssink can-scale=false |
| 560 | }}} |
 - note the above examples force the input format (resolution in this case) to differ from the output format, otherwise GStreamer will bypass the v4l2convert entity, thinking it unnecessary, as GStreamer does not understand the flip/rotation properties. GStreamer master (in development) adds the 'disable-passthrough' property which can be enabled to force-disable passthrough
* scale/rotate/flip using imx-media capture device and KMS display driver (can use dmabufs for this as they are all imx hardware entities):
| 563 | {{{#!bash |
| 564 | # scale sensor input to 720p display |
| 565 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 566 | $GST_CONVERT output-io-mode=dmabuf-import ! \ |
| 567 | video/x-raw,width=1280,height=720 ! \ |
| 568 | kmssink can-scale=false |
| 569 | # scale sensor input to 1080p display |
| 570 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 571 | $GST_CONVERT output-io-mode=dmabuf-import ! \ |
| 572 | video/x-raw,width=1920,height=1080 ! \ |
| 573 | kmssink can-scale=false |
| 574 | # scale/flip |
| 575 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 576 | $GST_CONVERT output-io-mode=dmabuf-import extra-controls=cid,horizontal_flip=1 ! \ |
| 577 | video/x-raw,width=1920,height=1080 ! \ |
| 578 | kmssink can-scale=false |
| 579 | # scale/rotate |
| 580 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 581 | $GST_CONVERT output-io-mode=dmabuf-import extra-controls=cid,rotate=90 ! \ |
| 582 | video/x-raw,width=720,height=1280 ! \ |
| 583 | kmssink can-scale=false |
| 584 | }}} |
* capture, scale, rotate, flip, and encode using the imx-media capture device, mem2mem device, and coda device (can use dmabufs for zero-copy)
| 586 | {{{#!bash |
| 587 | # encode |
| 588 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 589 | $GST_CONVERT output-io-mode=dmabuf-import ! \ |
| 590 | v4l2h264enc output-io-mode=dmabuf-import ! \ |
| 591 | rtph264pay ! udpsink host=$SERVER port=$PORT |
| 592 | # scale/encode |
| 593 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 594 | $GST_CONVERT output-io-mode=dmabuf-import ! \ |
| 595 | video/x-raw,width=1440,height=960 ! \ |
| 596 | v4l2h264enc output-io-mode=dmabuf-import ! \ |
| 597 | rtph264pay ! udpsink host=$SERVER port=$PORT |
| 598 | # scale/flip/encode |
| 599 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 600 | $GST_CONVERT output-io-mode=dmabuf-import extra-controls=cid,horizontal_flip=1 ! \ |
| 601 | video/x-raw,width=1440,height=960 ! \ |
| 602 | v4l2h264enc output-io-mode=dmabuf-import ! \ |
| 603 | rtph264pay ! udpsink host=$SERVER port=$PORT |
| 604 | # scale/rotate/encode |
| 605 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 606 | $GST_CONVERT output-io-mode=dmabuf-import extra-controls=cid,rotate=90 ! \ |
| 607 | video/x-raw,width=1440,height=960 ! \ |
| 608 | v4l2h264enc output-io-mode=dmabuf-import ! \ |
| 609 | rtph264pay ! udpsink host=$SERVER port=$PORT |
| 610 | }}} |
| 611 | |
For anything not handled by the video4linux2 element and mem2mem drivers there are software-based elements that can convert pixel colorspace and/or scale; however, note that these are extremely CPU-intensive (see the example after this list):
| 613 | * autovideoconvert |
| 614 | * videoconvert |
| 615 | * rgb2bayer |
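
A minimal sketch of a software colorspace conversion using {{{videoconvert}}} (no hardware support required, but expect high CPU load at large resolutions):
{{{#!bash
# convert UYVY test video to RGB16 entirely in software
gst-launch-1.0 videotestsrc ! 'video/x-raw,format=UYVY' ! \
  videoconvert ! 'video/x-raw,format=RGB16' ! fakesink
}}}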
| 616 | |
| 617 | |
| 618 | [=#interlaced-video] |
| 619 | == Interlaced Video and Deinterlacing |
| 620 | Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured at two different times. The alternative to interlaced video is called progressive video. |
| 621 | |
While reducing bandwidth, interlacing can cause a perceived flicker effect as well as very apparent artifacts during motion. For example, a car moving horizontally across a scene will show every other line of the image differently: the second field interlaced with the first will be a field period ahead in time of the other. The visual effect can be seen in [https://en.wikipedia.org/wiki/File:Interlaced_video_frame_(car_wheel).jpg this image from Wikipedia]
| 623 | |
Television signals are typically interlaced, or at least were until recently. For example, analog television standards such as NTSC used in North America, as well as the PAL and SECAM formats used abroad, use interlaced video; therefore any analog video decoder, such as the ADV7180 found on many Gateworks Ventana boards, will capture interlaced video and be subject to interlacing artifacts. Interlaced video is still used in high-definition signals as well, and the letter at the end of the format tells you if it is interlaced (ie 480i, 1080i) or progressive (ie 480p, 720p, 1080p).
| 625 | |
| 626 | To deinterlace video captured from the adv7180 analog decoder you need to use media-ctl to configure your pipeline to include the IMX6 Video Deinterlacer block. This is done for you if you use the Gateworks {{{media-ctl-setup}}} script to configure your pipeline. See [wiki:linux/media#media-ctl-setup linux/media] for more info. |
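
For streams that cannot be routed through the IMX6 deinterlacer (for example a pre-recorded interlaced file), GStreamer also provides a software {{{deinterlace}}} element in gst-plugins-good; a sketch, noting that software deinterlacing is CPU-intensive and supported formats depend on your source:
{{{#!bash
# deinterlace in software before display
gst-launch-1.0 v4l2src device=$DEVICE ! deinterlace ! kmssink
}}}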
| 627 | |
| 628 | |
| 629 | References: |
| 630 | * [https://en.wikipedia.org/wiki/Interlaced_video Wikipedia interlaced video - includes several images demonstrating interlacing artifacts] |
* [https://en.wikipedia.org/wiki/Deinterlacing Wikipedia deinterlacing]
| 632 | |
| 633 | |
| 634 | [=#encoding] |
| 635 | == Video Encoding (via IMX6 coda driver and video4linux2 element) |
| 636 | Encoding video means taking the raw captured pixels and converting them to a standard compressed format. |
| 637 | |
| 638 | The Linux CODA driver provides access to the IMX6 hardware encode/decode codecs and the GStreamer {{{video4linux2}}} element provides encode/decode elements that tap into this. |
| 639 | |
| 640 | Notes: |
| 641 | * Ensure the CODA driver is in your kernel (CONFIG_VIDEO_CODA added in Linux 3.16) and that firmware was loaded: |
| 642 | {{{#!bash |
| 643 | ~# dmesg | grep coda |
| 644 | [ 16.721698] coda 2040000.vpu: Direct firmware load for vpu_fw_imx6q.bin failed with error -2 |
| 645 | [ 16.721724] coda 2040000.vpu: Falling back to syfs fallback for: vpu_fw_imx6q.bin |
| 646 | [ 18.381136] coda 2040000.vpu: Using fallback firmware vpu/vpu_fw_imx6q.bin |
| 647 | [ 18.433648] coda 2040000.vpu: Firmware code revision: 570363 |
| 648 | [ 18.433683] coda 2040000.vpu: Initialized CODA960. |
| 649 | [ 18.433706] coda 2040000.vpu: Firmware version: 3.1.1 |
| 650 | [ 18.442312] coda 2040000.vpu: codec registered as /dev/video[9-10] |
| 651 | ~# cat /sys/class/video4linux/video9/name |
| 652 | coda-encoder |
| 653 | ~# cat /sys/class/video4linux/video10/name |
| 654 | coda-decoder |
| 655 | }}} |
* Ensure GStreamer encode elements exist:
| 657 | {{{#!bash |
| 658 | ~# gst-inspect-1.0 | grep -e "v4l2.*enc" |
| 659 | video4linux2: v4l2h264enc: V4L2 H.264 Encoder |
| 660 | video4linux2: v4l2mpeg4enc: V4L2 MPEG4 Encoder |
| 661 | }}} |
| 662 | * The CODA960 encoder requires NV12/I420/YV12 YUV pixel formats with rec709 colorimetry |
| 663 | * The CODA driver requires CMA memory for buffers when it is used. Make sure to provide the kernel with enough CMA memory with the kernel command-line (ie 'cma=64M' for 64MB which should be enough) |
* Encoder and decoder options are exposed through v4l2 control IDs (CIDs) and can be listed with the {{{v4l2-ctl -L}}} and {{{v4l2-ctl -l}}} options on the devices exposed by the coda driver, as shown below:
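{{{#!bash
# assuming the encoder registered as /dev/video9 as in the dmesg output above
v4l2-ctl -d /dev/video9 -L   # list controls including menu values
v4l2-ctl -d /dev/video9 -l   # list controls and current values
}}}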
| 665 | |
| 666 | Examples: |
| 667 | * Encode to H264 (hardware based encode) and stream via RTP/UDP: |
| 668 | {{{#!bash |
| 669 | # stream H264/RTP/UDP |
| 670 | gst-launch-1.0 v4l2src device=$DEVICE ! \ |
| 671 | v4l2video10convert output-io-mode=dmabuf-import ! \ |
| 672 | v4l2h264enc output-io-mode=dmabuf-import ! \ |
| 673 | rtph264pay ! udpsink host=$SERVER port=$PORT |
| 674 | # client on $SERVER:$PORT could be viewing via 'gst-launch-1.0 udpsrc port=$PORT caps=application/x-rtp,payload=96 ! rtph264depay ! decodebin ! autovideosink' |
| 675 | }}} |
| 676 | * Encode to JPEG (software based encode) and stream via RTP/UDP: |
| 677 | {{{#!bash |
| 678 | # stream JPEG/RTP/UDP |
| 679 | gst-launch-1.0 v4l2src device=$DEVICE ! jpegenc ! rtpjpegpay ! udpsink host=$SERVER port=$PORT |
| 680 | # client on $SERVER:$PORT could be viewing via 'gst-launch-1.0 udpsrc port=$PORT ! application/x-rtp,payload=96 ! rtpjpegdepay ! jpegdec ! autovideosink' |
| 681 | }}} |
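
To record to a file instead of streaming, the encoded bitstream can be parsed and written into a container; a sketch, assuming the same {{{media-ctl-setup}}} environment as above:
{{{#!bash
# record ~300 frames of hardware-encoded H264 into a Matroska file
gst-launch-1.0 v4l2src device=$DEVICE num-buffers=300 ! \
  v4l2video10convert output-io-mode=dmabuf-import ! \
  v4l2h264enc output-io-mode=dmabuf-import ! \
  h264parse ! matroskamux ! filesink location=test.mkv
}}}
* Note that {{{num-buffers}}} causes the pipeline to end with an EOS so the muxer can finalize the file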
| 682 | |
| 683 | See also: |
| 684 | * [wiki:ventana/vpu ventana/vpu] |
| 685 | |
| 686 | |
| 687 | |
| 688 | [=#decoding] |
| 689 | == Video Decoding (via IMX6 coda driver and video4linux2 element) |
Decoding video means taking compressed formats such as MPEG and H264 and decoding them into frames composed of raw pixels (typically so they can be displayed on a video display output device).
| 691 | |
| 692 | The Linux CODA driver provides access to the IMX6 hardware encode/decode codecs and the GStreamer {{{video4linux2}}} element provides encode/decode elements that tap into this. |
| 693 | |
| 694 | Notes: |
| 695 | * Ensure the CODA driver is in your kernel (CONFIG_VIDEO_CODA added in Linux 3.16) and that firmware was loaded: |
| 696 | {{{#!bash |
| 697 | ~# dmesg | grep coda |
| 698 | [ 16.721698] coda 2040000.vpu: Direct firmware load for vpu_fw_imx6q.bin failed with error -2 |
| 699 | [ 16.721724] coda 2040000.vpu: Falling back to syfs fallback for: vpu_fw_imx6q.bin |
| 700 | [ 18.381136] coda 2040000.vpu: Using fallback firmware vpu/vpu_fw_imx6q.bin |
| 701 | [ 18.433648] coda 2040000.vpu: Firmware code revision: 570363 |
| 702 | [ 18.433683] coda 2040000.vpu: Initialized CODA960. |
| 703 | [ 18.433706] coda 2040000.vpu: Firmware version: 3.1.1 |
| 704 | [ 18.442312] coda 2040000.vpu: codec registered as /dev/video[9-10] |
| 705 | ~# cat /sys/class/video4linux/video9/name |
| 706 | coda-encoder |
| 707 | ~# cat /sys/class/video4linux/video10/name |
| 708 | coda-decoder |
| 709 | }}} |
* Ensure GStreamer decode elements exist:
| 711 | {{{#!bash |
| 712 | ~# gst-inspect-1.0 | grep -e "v4l2.*dec" |
| 713 | video4linux2: v4l2mpeg4dec: V4L2 MPEG4 Decoder |
| 714 | video4linux2: v4l2mpeg2dec: V4L2 MPEG2 Decoder |
| 715 | video4linux2: v4l2h264dec: V4L2 H264 Decoder |
| 716 | }}} |
| 717 | * The CODA960 decoder outputs NV12/I420/YV12 YUV pixel formats |
| 718 | * The CODA driver requires CMA memory for buffers when it is used. Make sure to provide the kernel with enough CMA memory with the kernel command-line (ie 'cma=64M' for 64MB which should be enough) |
| 719 | |
Note that the following examples assume you are using raw video encoded files, not container formats used for mixed multimedia types (audio + video) such as ogg, avi, or mov (Quicktime). For information on de-muxing container formats see [wiki:gstreamer/multimedia]
| 721 | |
| 722 | Examples: |
| 723 | * Decoding H264 video might look like this: |
| 724 | {{{#!bash |
| 725 | gst-launch-1.0 filesrc location=file.h264 ! h264parse ! v4l2h264dec ! kmssink |
| 726 | }}} |
| 727 | * Decoding MPEG4 video might look like this: |
| 728 | {{{#!bash |
| 729 | gst-launch-1.0 filesrc location=file.mp4v ! mpeg4videoparse ! v4l2mpeg4dec ! kmssink |
| 730 | }}} |
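
To benchmark decode throughput without a display, the decoder can be run as fast as possible into a {{{fakesink}}}; a sketch:
{{{#!bash
# decode as fast as possible and report the achieved framerate in the -v messages
gst-launch-1.0 -v filesrc location=file.h264 ! h264parse ! v4l2h264dec ! \
  fpsdisplaysink video-sink=fakesink text-overlay=false sync=false
}}}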
| 731 | |
| 732 | For more examples including working with 'multimedia' files that contain both audio and video see [wiki:gstreamer/multimedia] |
| 733 | |
| 734 | |
| 735 | |
| 736 | [=#gstreamer-alsa] |
| 737 | [=#audio] |
| 738 | = GStreamer Audio |
The GStreamer {{{alsasrc}}} and {{{alsasink}}} elements provide audio capture and playback for Advanced Linux Sound Architecture (ALSA) devices. ALSA is the modern audio device-driver API for the Linux kernel.
| 740 | |
The GStreamer {{{alsasrc}}} and {{{alsasink}}} elements can be passed a {{{device}}} property, which can be either a device name (as reported by {{{aplay -L}}} for playback or {{{arecord -L}}} for record, ie {{{sysdefault:CARD=sgtl5000audio}}}) or the hw:x,y notation (ie {{{hw:0,0}}} for the first card, first device).
| 742 | |
| 743 | If not specified via the {{{device}}} property, the device used for alsasrc and alsasink will depend on {{{/etc/asound.conf}}} and/or {{{~/.asoundrc}}}. |
| 744 | |
| 745 | See [wiki:ventana/audio#devices here] for details on specifying audio input and output devices on the Ventana product family. |
| 746 | |
| 747 | Playback and Capture also depends on your ALSA mixer ({{{amixer}}}) settings. See [wiki:ventana/audio#Mixermuxgainvolume mixer settings] for more info. |
| 748 | |
| 749 | |
| 750 | [=#audio-output] |
| 751 | == Audio Output via Linux ALSA driver and GStreamer alsasink |
Generally, a 'sink' element is one that will take an audio stream and send it to an audio device. Please refer to [wiki:ventana/audio this audio] page for more details on audio devices on the Ventana platform.
| 753 | |
The {{{alsasink}}} element can accept any of the following formats via {{{audio/x-raw}}}: S8, U8, S16LE, S16BE, U16LE, U16BE, S24_32LE, S24_32BE, U24_32LE, U24_32BE, S32LE, S32BE, U32LE, U32BE, S24LE, S24BE, U24LE, U24BE, S20LE, S20BE, U20LE, U20BE, S18LE, S18BE, U18LE, U18BE, F32LE, F32BE, F64LE, F64BE
| 755 | |
| 756 | Examples: |
| 757 | * Generate and play a 1kHz tone to the 'hw:0,0' device: |
| 758 | {{{#!bash |
| 759 | gst-launch-1.0 audiotestsrc ! alsasink device="hw:0,0" |
| 760 | }}} |
| 761 | |
| 762 | |
| 763 | [=#audio-input] |
| 764 | == Audio Input via Linux ALSA driver and GStreamer alsasrc |
An input source is anything coming from a capture device on the SBC, e.g. HDMI audio in or analog audio in. Please refer to [wiki:ventana/audio this audio] page for more details on audio devices on the Ventana platform.
| 766 | |
| 767 | The {{{alsasrc}}} element can output the following source types in {{{audio/x-raw}}}: S8, U8, S16LE, S16BE, U16LE, U16BE, S24_32LE, S24_32BE, U24_32LE, U24_32BE, S32LE, S32BE, U32LE, U32BE, S24LE, S24BE, U24LE, U24BE, S20LE, S20BE, U20LE, U20BE, S18LE, S18BE, U18LE, U18BE, F32LE, F32BE, F64LE, F64BE |
| 768 | |
| 769 | Examples: |
* Capture 32kHz stereo 16-bit audio from the 'hw:0,0' device and mux it into an AVI file:
| 771 | {{{#!bash |
| 772 | gst-launch-1.0 alsasrc device="hw:0,0" ! "audio/x-raw,rate=32000,channels=2,depth=16" ! \ |
| 773 | audioconvert ! avimux ! filesink location=./audio.avi |
| 774 | }}} |
| 775 | |
| 776 | |
| 777 | [=#audio-test] |
| 778 | == audiotestsrc |
The {{{audiotestsrc}}} is a very useful element for testing. It can output 16 to 64-bit {{{audio/x-raw}}} formats: S16LE, S32LE, F32LE, F64LE

It can generate audio signals ranging from {{{sine}}} to {{{violet-noise}}}, selectable via the {{{wave}}} property.
| 782 | |
| 783 | Examples: |
| 784 | * Generate and play a 1kHz tone to the 'hw:0,0' device: |
| 785 | {{{#!bash |
| 786 | gst-launch-1.0 audiotestsrc ! alsasink device="hw:0,0" |
| 787 | }}} |
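
The {{{wave}}} and {{{freq}}} properties select the signal type and tone frequency; for example, a 440Hz sine tone (assuming the same 'hw:0,0' device):
{{{#!bash
# 440Hz sine tone; try wave=white-noise or wave=ticks for other test signals
gst-launch-1.0 audiotestsrc wave=sine freq=440 ! alsasink device="hw:0,0"
}}}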
| 788 | |
| 789 | |
| 790 | [=#audio-encoding] |
| 791 | == Audio Encoding |
Encoding refers to capturing audio and converting it to a new format type.
| 793 | |
| 794 | Commonly used GStreamer Audio encoders: |
| 795 | * vorbisenc - encode {{{audio/x-raw}}} F32LE to {{{audio/x-vorbis}}} |
| 796 | * mulawenc - encode {{{audio/x-raw}}} S16LE to {{{audio/x-mulaw}}} |
| 797 | * wavenc - encode {{{audio/x-raw}}} S32LE, S24LE, S16LE, U8, F32LE, F64LE, {{{audio/x-alaw}}}, and {{{audio/x-mulaw}}} to {{{audio/x-wav}}} |
| 798 | * alawenc - encode {{{audio/x-raw}}} S16LE to {{{audio/x-alaw}}} |
| 799 | * flacenc - encode {{{audio/x-raw}}} S24LE, S24_32LE, S16LE, S8 to {{{audio/x-flac}}} |
| 800 | * lamemp3enc - encode {{{audio/x-raw}}} S16LE to {{{audio/mpeg}}} |
| 801 | |
There are many more. You can search for a specific one by running a similar search: {{{gst-inspect-1.0 | grep enc}}}.
| 803 | |
Note that each encoder has its own limits on the samples it accepts (S16LE, S24LE, etc). You can find the accepted formats via {{{gst-inspect-1.0 <element>}}} and you can use the [#audioconvert {{{audioconvert}}}] and {{{audioresample}}} elements to convert audio format/bit-width and samplerate (via software algorithms) as needed between two elements.
| 805 | |
| 806 | Examples: |
| 807 | * Capture audio from the 'hw:0,0' device, encode it using the MPEG Layer 3 audio codec and store it to a file: |
| 808 | {{{#!bash |
| 809 | gst-launch-1.0 alsasrc device="hw:0,0" ! audioconvert ! lamemp3enc ! filesink location=file.mp3 |
| 810 | }}} |
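
Similarly, a lossless recording can be made with {{{flacenc}}}; a sketch assuming the same 'hw:0,0' capture device:
{{{#!bash
# capture, convert to a sample format flacenc accepts, and encode to FLAC
gst-launch-1.0 alsasrc device="hw:0,0" ! audioconvert ! flacenc ! filesink location=file.flac
}}}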
| 811 | |
| 812 | |
| 813 | [=#audio-decoding] |
| 814 | == Audio Decoding |
Decoding refers to decoding an encoded audio stream back into a raw audio stream.
| 816 | |
| 817 | Commonly used GStreamer Audio decoders: |
| 818 | * mpg123audiodec - decode {{{audio/mpeg}}} to {{{audio/x-raw}}} |
| 819 | * vorbisdec - decode {{{audio/x-vorbis}}} to {{{audio/x-raw}}} |
| 820 | * ivorbisdec - decode {{{audio/x-vorbis}}} to {{{audio/x-raw}}} |
| 821 | * a52dec - decode {{{audio/x-ac3}}}, {{{audio/ac3}}}, and {{{audio/x-private1-ac3}}} to {{{audio/x-raw}}} |
| 822 | * mulawdec - decode {{{audio/x-mulaw}}} to {{{audio/x-raw}}} |
| 823 | * alawdec - decode {{{audio/x-alaw}}} to {{{audio/x-raw}}} |
| 824 | * flacdec - decode {{{audio/x-flac}}} to {{{audio/x-raw}}} |
| 825 | |
Note that each decoder has its own limits on the samples it produces at its src pad (S16LE, S24LE, etc). You can find the produced formats via {{{gst-inspect-1.0 <element>}}} and you can use the [#audioconvert {{{audioconvert}}}] and {{{audioresample}}} elements to convert audio format/bit-width and samplerate (via software algorithms) as needed between two elements.
| 827 | |
| 828 | Examples: |
| 829 | * Decode MPEG Layer 3 audio and playback to the default audio device with the following: |
| 830 | {{{#!bash |
gst-launch-1.0 filesrc location=file.mp3 ! mpegaudioparse ! mpg123audiodec ! alsasink
| 832 | }}} |
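
Decoding a FLAC file follows the same parse/decode pattern; a sketch:
{{{#!bash
gst-launch-1.0 filesrc location=file.flac ! flacparse ! flacdec ! audioconvert ! alsasink
}}}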
| 833 | |
| 834 | |
| 835 | [=#audio-format] |
| 836 | [=#audioconvert] |
| 837 | == Audio Formats and Conversion |
Audio samples specified by {{{audio/x-raw}}} vary in bit width and endianness as well as the sampling rate (how many times per second the audio is measured) and the number of channels that are sampled.
| 839 | |
To specify the details of the audio format, you use a [#capsfilters GStreamer {{{capsfilter}}}], and to convert from one format to another you use the GStreamer {{{audioconvert}}} element.
| 841 | |
| 842 | Examples: |
| 843 | * specify the sample-rate (32kHz), channels (2:stereo), and bit depth (16bit) and capture to an AVI file: |
| 844 | {{{#!bash |
| 845 | gst-launch-1.0 alsasrc device="hw:0,0" ! "audio/x-raw,rate=32000,channels=2,depth=16" ! \ |
| 846 | audioconvert ! avimux ! filesink location=./audio.avi |
| 847 | }}} |
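
A minimal sketch of a pure format conversion, generating F32LE samples and converting them to S16LE for a downstream element that only accepts 16-bit audio:
{{{#!bash
gst-launch-1.0 audiotestsrc ! 'audio/x-raw,format=F32LE' ! \
  audioconvert ! 'audio/x-raw,format=S16LE' ! fakesink
}}}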
| 848 | |
| 849 | |
| 850 | [=#audio-loopback] |
| 851 | == Audio loopback (Useful for testing audio input and output) |
A simple audio loopback test takes audio input from an input device and outputs it to an output device.
| 853 | |
Loopback audio from the first audio card:
| 855 | {{{#!bash |
| 856 | gst-launch-1.0 alsasrc device="hw:0,0" ! alsasink device="hw:0,0" |
| 857 | }}} |
| 858 | |
| 859 | To send audio from the sgtl5000 (analog input) to the imxhdmisoc (HDMI output): |
* using card/device numbers (ie from {{{aplay -l}}}):
| 861 | {{{#!bash |
| 862 | gst-launch-1.0 alsasrc device="hw:0,0" ! alsasink device="hw:2,0" |
| 863 | }}} |
| 864 | * or, using names (ie from {{{aplay -L}}}): |
| 865 | {{{#!bash |
| 866 | gst-launch-1.0 alsasrc device="sysdefault:CARD=sgtl5000audio" ! \ |
| 867 | alsasink device="sysdefault:CARD=imxhdmisoc" |
| 868 | }}} |
| 869 | |
| 870 | To send audio from the tda1997x HDMI receiver (digital input) to the imxhdmisoc (digital output): |
| 871 | {{{#!bash |
| 872 | gst-launch-1.0 alsasrc device="sysdefault:CARD=tda1997xaudio" ! \ |
| 873 | "audio/x-raw,rate=44100" ! alsasink device="sysdefault:CARD=imxhdmisoc" |
| 874 | }}} |
* Note here we need to specify the audio sample-rate as it can vary per input stream (and GStreamer does not validate the rate). This rate must match the source stream samplerate, which can be found via sysfs {{{/sys/bus/i2c/drivers/tda1997x/2-0048/audmode}}}. If your output device requires a different sample-rate than the source input device, you need to perform a sample-rate conversion, as shown below.
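
A sample-rate conversion can be performed in software with the {{{audioresample}}} element; a sketch converting the 44.1kHz HDMI audio to 48kHz:
{{{#!bash
gst-launch-1.0 alsasrc device="sysdefault:CARD=tda1997xaudio" ! \
  "audio/x-raw,rate=44100" ! audioresample ! "audio/x-raw,rate=48000" ! \
  alsasink device="sysdefault:CARD=imxhdmisoc"
}}}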
| 876 | |
| 877 | |
| 878 | [=#multimedia] |
| 879 | = Multimedia (Video and Audio combined) |
| 880 | Handling media content that has both audio and video stream types requires a slightly more complex pipeline than the standard [#audio audio] and [#video video] pipeline examples. |
| 881 | |
Generally, a mixed-media pipeline which consumes multimedia will consist of a demuxer (to split audio and video), an individual pipeline per video stream and audio stream, and {{{queue}}} elements to provide asynchronous playback of each stream type (which basically amounts to using multiple threads of execution so that one element doesn't block the pipeline waiting for more data).
| 883 | |
Conversely, if producing multimedia content, your pipeline will consist of a muxer to join the audio and video streams, plus {{{queue}}} elements.
| 885 | |
| 886 | |
| 887 | [=#named-elements] |
| 888 | == Named Elements, queues, and Multiple pipelines with gst-launch |
| 889 | When mixing audio and video elements with {{{gst-launch}}} one must make use of multiple pipelines using {{{named elements}}}. |
| 890 | |
The {{{name}}} property can be specified on any element in a pipeline; if not specified, it defaults to the element type followed by a number (ie {{{videotestsrc0}}}).
| 892 | |
Multiple pipeline branches can be provided to {{{gst-launch}}} and connected together by their names, either by sourcing from a name followed by a '.' or by sinking into a name followed by a '.'.
| 894 | |
| 895 | This is best explained with some examples: |
| 896 | * Encoding a stream with audio and video content into an AVI file: |
| 897 | {{{#!bash |
| 898 | gst-launch-1.0 \ |
| 899 | videotestsrc \ |
| 900 | ! $VIDEO_CAPABILITIES \ |
| 901 | ! mux. \ |
| 902 | audiotestsrc \ |
| 903 | ! $AUDIO_CAPABILITIES \ |
| 904 | ! mux. \ |
| 905 | avimux name=mux \ |
| 906 | ! filesink location=test.avi |
| 907 | }}} |
 - The {{{videotestsrc}}} pipeline ends with {{{mux.}}} which means its output is sent to the element whose name is {{{mux}}}
 - The {{{audiotestsrc}}} pipeline ends with {{{mux.}}} which means its output is sent to the element whose name is {{{mux}}}
 - The {{{avimux}}} pipeline specifies {{{name=mux}}} and therefore takes as sources all pipelines that ended with {{{mux.}}}
| 911 | * Decoding a Matroska container file containing H264 video and AC3 audio |
| 912 | {{{#!bash |
| 913 | gst-launch-1.0 \ |
| 914 | filesrc location=file.mkv \ |
| 915 | ! matroskademux name=demux \ |
| 916 | demux. ! queue ! ac3parse ! a52dec ! audioconvert ! alsasink \ |
| 917 | demux. ! queue ! h264parse ! v4l2h264dec ! kmssink |
| 918 | }}} |
 - The {{{filesrc}}} pipeline ends with {{{name=demux}}} which means the output of this pipeline will be sent to all pipelines with a {{{demux.}}} source (whose types have been successfully negotiated)
 - The audio pipeline consisting of the {{{ac3parse}}} element will source buffers that are supported by its sink capabilities (ie audio/x-ac3, audio/x-eac3, audio/ac3)
 - The video pipeline consisting of the {{{h264parse}}} element will source buffers that are supported by its sink capabilities (ie video/x-h264)
| 922 | - Queue elements are used to keep one pipeline or element from blocking another. For example, if the {{{v4l2h264dec}}} element needs more data from the demux element before it can decode a frame and send it down its pipeline it would normally stall the pipeline unless a queue element was in place to allow buffering |
| 923 | * Decoding a MOV file containing H264 video and AAC audio: |
| 924 | {{{#!bash |
| 925 | gst-launch-1.0 \ |
| 926 | filesrc location=file.mov \ |
| 927 | ! qtdemux name=demux \ |
| 928 | demux. ! queue ! aacparse ! avdec_aac ! alsasink \ |
| 929 | demux. ! queue ! h264parse ! v4l2h264dec ! kmssink |
| 930 | }}} |
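| | 
| | The {{{$VIDEO_CAPABILITIES}}} and {{{$AUDIO_CAPABILITIES}}} variables in the muxing example above are placeholders. Below is a minimal sketch of values that work with {{{videotestsrc}}}, {{{audiotestsrc}}}, and {{{avimux}}}; the specific formats chosen here are illustrative assumptions, not the only valid options:
| | {{{#!bash
| | # raw caps accepted by avimux; adjust to suit your source
| | VIDEO_CAPABILITIES="video/x-raw,format=YUY2,width=640,height=480,framerate=30/1"
| | AUDIO_CAPABILITIES="audio/x-raw,format=S16LE,rate=48000,channels=2"
| | }}}
| | With these set, the pipeline writes an uncompressed AVI; in practice you would normally insert encoders before the muxer to keep the file size manageable.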
| 931 | |
| 932 | |
| 933 | [=#mux] |
| 934 | == Muxing Mixed Content |
| 935 | Often a multimedia stream will consist of mixed audio and video streams that are multiplexed (aka 'muxed') together into a single bitstream. The GStreamer elements that perform this combining or multiplexing on the stream-creation side are called 'muxers'.
| 936 | |
| 937 | You can use {{{gst-inspect}}} to see a list of most of these using grep: |
| 938 | {{{#!bash |
| 939 | gst-inspect-1.0 | grep -i muxer | grep -vi demux
| 940 | }}} |
| 941 | |
| 942 | Some common examples: |
| 943 | - mpegtsmux: MPEG Transport Stream Muxer |
| 944 | - mpegpsmux: MPEG Program Stream Muxer |
| 945 | - matroskamux: Matroska muxer |
| 946 | - avimux: Avi muxer |
| 947 | - qtmux: !QuickTime Muxer |
| 948 | - oggmux: Ogg muxer |
| 949 | |
| 950 | To mux mixed content together, feed the audio and video pipelines into one of these elements.
| 951 | |
| 952 | Examples: |
| 953 | * Encoding a stream with audio and video content into an AVI file: |
| 954 | {{{#!bash |
| 955 | gst-launch-1.0 \ |
| 956 | videotestsrc \ |
| 957 | ! $VIDEO_CAPABILITIES \ |
| 958 | ! mux. \ |
| 959 | audiotestsrc \ |
| 960 | ! $AUDIO_CAPABILITIES \ |
| 961 | ! mux. \ |
| 962 | avimux name=mux \ |
| 963 | ! filesink location=test.avi |
| 964 | }}} |
| 965 | - the {{{videotestsrc}}} pipeline ends with {{{mux.}}} which means its output is sent to the element whose name is {{{mux}}} and whose format has been successfully negotiated.
| 966 | - the {{{audiotestsrc}}} pipeline ends with {{{mux.}}} which means its output is sent to the element whose name is {{{mux}}} and whose format has been successfully negotiated.
| 967 | - the {{{avimux}}} element specifies {{{name=mux}}}, therefore it takes as input all pipelines that end with {{{mux.}}} and it understands how to multiplex the two types of data together into its output, which is written to the file test.avi (a sketch with real encoders follows below)
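| | 
| | As a more concrete variant, the following sketch encodes both streams before muxing. It assumes the coda {{{v4l2h264enc}}} encoder and the {{{lamemp3enc}}} plugin are available on your board; the {{{-e}}} flag forces an EOS on Ctrl-C so the AVI file is finalized properly:
| | {{{#!bash
| | gst-launch-1.0 -e \
| |   videotestsrc is-live=true \
| |   ! v4l2h264enc ! h264parse \
| |   ! mux. \
| |   audiotestsrc \
| |   ! audioconvert ! lamemp3enc ! mpegaudioparse \
| |   ! mux. \
| |   avimux name=mux \
| |   ! filesink location=test-encoded.avi
| | }}}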
| 968 | |
| 969 | |
| 970 | |
| 971 | [=#demux] |
| 972 | == De-muxing mixed content |
| 973 | Often a multimedia stream will consist of mixed audio and video streams that are multiplexed (aka 'muxed') together into a single bitstream. The GStreamer elements that perform the de-multiplexing on the stream-consumption side are called 'demuxers'.
| 974 | |
| 975 | You can use {{{gst-inspect}}} to see a list of most of these using grep: |
| 976 | {{{#!bash |
| 977 | gst-inspect-1.0 | grep -i demux
| 978 | }}} |
| 979 | |
| 980 | Some common examples: |
| 981 | - tsparse: MPEG transport stream parser |
| 982 | - tsdemux: MPEG transport stream demuxer |
| 983 | - matroskademux: Matroska demuxer |
| 984 | - avidemux: Avi demuxer |
| 985 | - qtdemux: !QuickTime demuxer |
| 986 | - oggdemux: Ogg demuxer |
| 987 | |
| 988 | To demux mixed content, place one of these elements after the source; the demuxer then feeds the individual audio and video pipelines.
| 989 | |
| 990 | Unlike muxing, you also need to use a {{{parser}}} element to parse the bitstream and break it into the discrete buffers (frames) that the downstream decoder expects. In other words, the parser 'frames' the data going to the decoders.
| 991 | |
| 992 | Some common parsers: |
| 993 | - ogmaudioparse: OGM audio stream parser |
| 994 | - ogmvideoparse: OGM video stream parser |
| 995 | - aacparse: AAC audio stream parser |
| 996 | - amrparse: AMR audio stream parser |
| 997 | - ac3parse: AC3 audio stream parser |
| 998 | - flacparse: FLAC audio parser |
| 999 | - mpegaudioparse: MPEG1 Audio Parser |
| 1000 | - h263parse: H.263 parser |
| 1001 | - h264parse: H.264 parser |
| 1002 | - mpegvideoparse: MPEG video elementary stream parser |
| 1003 | - mpeg4videoparse: MPEG 4 video elementary stream parser |
| 1004 | - pngparse: PNG parser |
| 1005 | - vc1parse: VC1 parser |
| 1006 | |
| 1007 | Examples: |
| 1008 | * Demuxing a Matroska container file containing H264 video and AC3 audio into its raw components: |
| 1009 | {{{#!bash |
| 1010 | gst-launch-1.0 \ |
| 1011 | filesrc location=file.mkv \ |
| 1012 | ! matroskademux name=demux \ |
| 1013 | demux. ! queue ! ac3parse ! filesink location=file.ac3 \ |
| 1014 | demux. ! queue ! h264parse ! filesink location=file.h264 |
| 1015 | }}} |
| 1016 | * Demuxing a MOV file containing H264 video and AAC audio into its raw components: |
| 1017 | {{{#!bash |
| 1018 | gst-launch-1.0 \ |
| 1019 | filesrc location=file.mov \ |
| 1020 | ! qtdemux name=demux \ |
| 1021 | demux. ! queue ! aacparse ! filesink location=file.aac \ |
| 1022 | demux. ! queue ! h264parse ! filesink location=file.h264
| 1023 | }}} |
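| | 
| | To see why the parser matters, the elementary streams extracted above can be played back directly. A minimal sketch, assuming the {{{file.h264}}} written by the previous example and the coda {{{v4l2h264dec}}} decoder:
| | {{{#!bash
| | # without h264parse the decoder would receive unframed bytes and fail to negotiate
| | gst-launch-1.0 filesrc location=file.h264 ! h264parse ! v4l2h264dec ! kmssink
| | }}}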
| 1024 | |
| 1025 | |
| 1026 | |
| 1027 | [=#bin] |
| 1028 | == bin elements |
| 1029 | A '''bin''' element refers to a group of elements strung together and referenced as one. Several stand-alone elements build on this concept, providing automatic negotiation of the sub-elements they contain.
| 1030 | |
| 1031 | Sometimes the bin elements are not flexible enough and you need to determine exactly what pipeline to use to decode and play a stream. The {{{gst-launch}}} application provides a couple of useful debugging tools that can help with this:
| 1032 | * using {{{GST_DEBUG_DUMP_DOT_DIR}}} and Graphviz |
| 1033 | * adding the '-v' parameter to gst-launch will provide verbose feedback on the pipeline configuration that can tell you what is going on |
| 1034 | |
| 1035 | See [#troubleshooting] for more details on these methods. |
| 1036 | |
| 1037 | |
| 1038 | [=#playbin] |
| 1039 | === GStreamer playbin |
| 1040 | The GStreamer {{{playbin}}} element attempts to create a pipeline that will play both the audio and video portions of a file. For example:
| 1041 | {{{#!bash |
| 1042 | gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi |
| 1043 | }}} |
| 1044 | |
| 1045 | The above pipeline will attempt to output to the first video and audio devices found. However, you can specify the sinks explicitly:
| 1046 | {{{#!bash |
| 1047 | gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi audio-sink="alsasink device=hw:1,0" |
| 1048 | }}} |
| 1049 | |
| 1050 | Please type {{{gst-inspect-1.0 playbin}}} to see more options. |
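| | 
| | {{{playbin}}} also exposes a {{{video-sink}}} property, so both outputs can be pinned down explicitly. A sketch (the ALSA device string is an assumption; substitute your own from {{{aplay -l}}}):
| | {{{#!bash
| | gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi \
| |   video-sink=kmssink audio-sink="alsasink device=hw:1,0"
| | }}}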
| 1051 | |
| 1052 | |
| 1053 | [=#decodebin] |
| 1054 | === GStreamer decodebin |
| 1055 | The GStreamer {{{decodebin}}} element is very useful if you're unsure of which decoder to use on a stream. For example, we can replace [#ex1 the first example] with the following:
| 1056 | {{{#!bash |
| 1057 | gst-launch-1.0 \ |
| 1058 | filesrc location=/media/open-media/tears_of_steel_1080p.webm do-timestamp=true typefind=true ! \ |
| 1059 | matroskademux name=d \ |
| 1060 | d. ! queue ! ivorbisdec ! queue ! alsasink device=hw:1,0 \ |
| 1061 | d. ! queue ! decodebin ! queue ! kmssink |
| 1062 | }}} |
| 1063 | |
| 1064 | Note that {{{decodebin}}} doesn't always choose the correct decoder, so be wary of this. It is similar to {{{playbin}}} in that it aids in creating a dynamic pipeline. |
| 1065 | |
| 1066 | |
| 1067 | [=#gst-play] |
| 1068 | === GStreamer gst-play |
| 1069 | The stand-alone application {{{gst-play}}} is a program that utilizes the {{{playbin}}} element and thus can be used for playback of many file types. The above example {{{gst-launch-1.0 playbin uri=file:///media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi}}} can be replaced with:
| 1070 | {{{#!bash |
| 1071 | gst-play-1.0 /media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi |
| 1072 | }}} |
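| | 
| | {{{gst-play}}} accepts sink overrides as well; a sketch (check {{{gst-play-1.0 --help}}} to confirm the option names on your build):
| | {{{#!bash
| | gst-play-1.0 --videosink=kmssink --audiosink="alsasink device=hw:1,0" \
| |   /media/open-media/big_buck_bunny_1080p_mp4v_ac3_5.1.avi
| | }}}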
| 1073 | |
| 1074 | |
| 1075 | [=#streaming] |
| 1076 | = Streaming (send multimedia to or receive from Network) |
| 1077 | GStreamer provides elements that allow multimedia to be streamed over a network.
| 1078 | |
| 1079 | There are several ways to accomplish networked streaming over Internet Protocol (IP): |
| 1080 | * [#udp Raw UDP/IP] |
| 1081 | * [#tcp Raw TCP/IP] |
| 1082 | * [#rtp Real-time Transport Protocol (RTP)] |
| 1083 | * [#rtsp Real Time Streaming Protocol (RTSP)] ('''recommended''') |
| 1084 | * [#abs Adaptive Bitrate Streaming] |
| 1085 | |
| 1086 | |
| 1087 | [=#udp] |
| 1088 | == Raw UDP |
| 1089 | Using UDP/IP is the simplest mechanism for streaming and utilizes the least amount of bandwidth. Because UDP does not provide packet ordering, retransmission, or error correction, the bitrate is deterministic and is simply the bitrate of the media you are streaming.
| 1090 | |
| 1091 | The limitations of raw UDP are:
| 1092 | * requires codec that can handle missing/corrupt data (most do these days) |
| 1093 | * does not use headers containing payload type or timestamp info on stream (making it suitable for only a single type of media, or a pre-muxed type of media) |
| 1094 | * does not fragment packets - will try to send a raw udp packet for whatever size buffer the udpsink is passed (which can lead to pipeline errors). To fragment packets use RTP |
| 1095 | |
| 1096 | The only benefit of using raw UDP is that it is the simplest pipeline you can create for streaming and requires the least amount of dependencies (though you may run into one or more of the problems above).
| 1097 | |
| 1098 | '''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above''' |
| 1099 | |
| 1100 | The {{{udpsrc}}} element can be used to render/save a stream originated from a {{{udpsink}}} pipeline. |
| 1101 | |
| 1102 | Examples: |
| 1103 | * encode and send H264 video from Ventana: |
| 1104 | 1. Start decoder first: |
| 1105 | {{{#!bash
| 1107 | ifconfig eth0 192.168.1.1 |
| 1108 | gst-launch-1.0 udpsrc port=9001 ! h264parse ! v4l2h264dec ! kmssink sync=false |
| 1109 | }}} |
| 1110 | 2. Start encoder second: |
| 1111 | {{{#!bash
| 1113 | ifconfig eth0 192.168.1.2 |
| 1114 | gst-launch-1.0 videotestsrc is-live=true ! v4l2h264enc ! udpsink host=192.168.1.1 port=9001 |
| 1115 | }}} |
| 1116 | |
| 1117 | Notes: |
| 1118 | * On the client (stream receiver and renderer) you must use the {{{sync=false}}} property to render frames as they are received, otherwise the stream will stall because there are no headers containing timestamps
| 1119 | * the decoder (udpsrc) should be started first; UDP is connectionless, so anything the encoder sends before the receiver is listening is simply lost
| 1120 | |
| 1121 | |
| 1122 | [=#tcp] |
| 1123 | == TCP |
| 1124 | Using TCP/IP brings error detection, packet re-ordering, and error correction to the network stream. This however causes the bitrate to be non-deterministic because as the error rate increases so does the bitrate and latency. |
| 1125 | |
| 1126 | The limitations of using TCP: |
| 1127 | * non-deterministic bitrate |
| 1128 | * added latency |
| 1129 | * does not use headers containing payload type or timestamp info on stream (making it suitable for only a single type of media, or a pre-muxed type of media) |
| 1130 | |
| 1131 | '''Note that it is recommended that you use [#rtp RTP] or [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above''' |
| 1132 | |
| 1133 | TCP/IP introduces the concept of a socket connection, therefore there must exist a server and a client, and the server must be started first to listen for a connection. You can use a server sink or a server source: the {{{tcpserversrc}}} source can be used to create a TCP server that waits for a connection from a {{{tcpclientsink}}} to render/save. Alternatively, the {{{tcpserversink}}} sink can be used to create a TCP server that waits for a connection from a {{{tcpclientsrc}}} that will send data.
| 1134 | |
| 1135 | Examples: |
| 1136 | * encode and send H264 video from Ventana with '''decoder as server''': |
| 1137 | 1. Start decoder (server) first: |
| 1138 | {{{#!bash
| 1140 | ifconfig eth0 192.168.1.1 |
| 1141 | gst-launch-1.0 tcpserversrc host=0.0.0.0 port=9001 ! decodebin ! autovideosink sync=false |
| 1142 | }}} |
| 1143 | 2. Start encoder (client) second: |
| 1144 | {{{#!bash
| 1146 | ifconfig eth0 192.168.1.2 |
| 1148 | gst-launch-1.0 videotestsrc is-live=true ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=1000;" ! tcpclientsink host=192.168.1.1 port=9001
| 1149 | }}} |
| 1150 | |
| 1151 | Notes: |
| 1152 | * TCP is connection oriented, therefore the TCP 'server' must be started first. You can choose your elements such that either the stream originator or the stream renderer is the server; note however that this can be problematic for certain codecs, because a client decoding the stream may pick it up somewhere in the middle and not know how to parse it.
| 1153 | * the {{{host=0.0.0.0}}} property means listen to all network interfaces |
| 1154 | |
| 1155 | |
| 1156 | [=#rtp] |
| 1157 | == RTP (raw/session-less) |
| 1158 | The [https://en.wikipedia.org/wiki/Real-time_Transport_Protocol Real-time Transport Protocol (RTP)] is a network protocol for delivering audio and video over IP networks. RTP is used extensively in communication and entertainment systems that involve streaming media, such as telephony, video teleconference applications, television services and web-based push-to-talk features. |
| 1159 | |
| 1160 | The RTP packet type encapsulates multimedia data with a payload type and timestamp and therefore can be used to compensate for jitter, out-of-sequence packets, and time synchronization between streams of different types (ie audio/video lip-sync).
| 1161 | |
| 1162 | RTP is typically used in conjunction with other protocols such as RTP Control Protocol (RTCP) and [#rtsp Real Time Streaming Protocol (RTSP)] to manage stream sessions; however, it can be used on its own in a raw session-less fashion using {{{udpsink}}} and {{{udpsrc}}} elements.
| 1163 | |
| 1164 | The limitations of using raw/session-less RTP: |
| 1165 | * session management needs to be handled manually (capsfilter is needed to specify stream format) |
| 1166 | |
| 1167 | '''Note that it is recommended that you use [#rtsp RTSP] unless you know exactly what you are doing to overcome the limitations listed above''' |
| 1168 | |
| 1169 | === Example: Capture, encode and stream H264 via RTP with GStreamer playback
| 1170 | * Encode and send H264 video from Ventana: |
| 1171 | 1. Start decoder first: |
| 1172 | {{{#!bash
| 1174 | ifconfig eth0 192.168.1.1 |
| 1175 | gst-launch-1.0 udpsrc port=9001 \ |
| 1176 | caps="application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264" \ |
| 1177 | ! decodebin ! autovideosink |
| 1178 | }}} |
| 1179 | 2. Start encoder second: |
| 1180 | {{{#!bash
| 1182 | gst-launch-1.0 videotestsrc is-live=true \ |
| 1183 | ! v4l2h264enc extra-controls="controls,h264_profile=4,video_bitrate=1000;" ! rtph264pay ! udpsink host=192.168.1.1 port=9001 |
| 1184 | }}} |
| 1185 | |
| 1186 | Notes: |
| 1187 | * when using RTP a capsfilter must be used to specify the payload as application/x-rtp as above. You can determine the capsfilter required by starting the encoder with a verbose flag {{{-v}}} and looking for {{{caps = "application/x-rtp"}}} |
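| | 
| | For instance, you can run the encoder briefly on its own and grep the negotiated caps. A sketch (the exact caps string printed will vary with your encoder settings):
| | {{{#!bash
| | gst-launch-1.0 -v videotestsrc is-live=true num-buffers=30 \
| |   ! v4l2h264enc ! rtph264pay ! fakesink 2>&1 | grep 'application/x-rtp'
| | }}}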
| 1188 | |
| 1189 | |
| 1190 | === Example: Capture, encode and stream H264 via RTP with VLC playback
| 1191 | * Encode and send H264 video from Ventana to a PC with VLC: |
| 1192 | 1. Start decoder first: |
| 1193 | a. Create SDP file like below (IP address in example is that of the Ventana board) |
| 1194 | {{{ |
| 1195 | v=0 |
| 1196 | m=video 5000 RTP/AVP 96 |
| 1197 | c=IN IP4 172.24.20.207 |
| 1198 | a=rtpmap:96 H264/90000 |
| 1199 | }}} |
| 1200 | b. Open SDP file in VLC |
| 1201 | 2. Start encoder (Ventana) second: (IP address in below example is IP of the PC) |
| 1202 | {{{#!bash
| 1203 | gst-launch-1.0 videotestsrc ! v4l2h264enc ! rtph264pay config-interval=3 ! udpsink host=172.24.20.26 port=5000 |
| 1204 | }}} |
| 1205 | |
| 1206 | |
| 1207 | [=#rtsp] |
| 1208 | == RTSP (Real Time Streaming Protocol) '''(recommended)''' |
| 1209 | The [https://en.wikipedia.org/wiki/Real_Time_Streaming_Protocol Real Time Streaming Protocol (RTSP)] is a network control protocol designed for use in entertainment and communications systems to control streaming media servers. The protocol is used for establishing and controlling media sessions between end points. Clients of media servers issue VCR-style commands, such as play and pause, to facilitate real-time control of playback of media files from the server. This protocol uses the Real-time Transport Protocol (RTP) in conjunction with Real-time Control Protocol (RTCP) for media stream delivery. |
| 1210 | |
| 1211 | The limitations of using RTSP are: |
| 1212 | * {{{gst-launch}}} has no way of creating an RTSP server from a simple pipeline - you must create or use an existing GStreamer based application (keep reading below).
| 1213 | |
| 1214 | Creating an RTSP server is not possible via a simple gst-launch pipeline; however, GStreamer libraries do exist that make writing an RTSP server trivial. The source for gst-rtsp-server contains an example application [http://cgit.freedesktop.org/gstreamer/gst-rtsp-server/tree/examples/test-launch.c test-launch.c] which provides a simple example that can take a GStreamer 'bin' element consisting of everything but the sink element and serve it via RTSP.
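| | 
| | As a sketch of how test-launch is used once built: it serves the given pipeline description at {{{rtsp://<board-ip>:8554/test}}} and expects the payloader element to be named {{{pay0}}} ({{{<board-ip>}}} is a placeholder for the board's address):
| | {{{#!bash
| | # on the Ventana board (server)
| | ./test-launch "videotestsrc is-live=true ! v4l2h264enc ! rtph264pay name=pay0 pt=96"
| | # on any client on the network
| | gst-play-1.0 rtsp://<board-ip>:8554/test
| | }}}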
| 1215 | |
| 1216 | Gateworks has extended this with [#gst-variable-rtsp-server gst-variable-rtsp-server], which demonstrates how to auto-adjust properties such as the encoding bitrate depending on the number of clients connected, in addition to serving as a fairly simple example of how to write a GStreamer application.
| 1217 | |
| 1218 | |
| 1219 | [=#troubleshooting] |
| 1220 | = Troubleshooting |
| 1221 | If something doesn't work right, make sure all of the defaults being used in your pipeline element properties are correct. Many of the examples above or found online may omit the 'device' property from the source or sink elements, which makes them default to the first appropriate device.
| 1222 | |
| 1223 | You can see all the properties of a particular element with the 'gst-inspect <elementname>' command. You can see a list of all elements with 'gst-inspect' without arguments. |
| 1224 | |
| 1225 | You can enable verbose output for GStreamer by adding the '-v' flag which will show you the negotiated pipeline details and the state machine details. |
| 1226 | |
| 1227 | You can enable debug output for GStreamer by using the [http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/section-checklist-debug.html '--gst-debug'] parameter to gst-launch or by setting the [http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gst-running.html GST_DEBUG] environment variable. |
| 1228 | |
| 1229 | For example, if you wanted to see DEBUG level messages for the {{{videotestsrc}}} element you can set {{{GST_DEBUG=videotestsrc:4}}}
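| | 
| | A quick way to try this ({{{fakesink}}} simply discards the buffers, so only the debug output matters here):
| | {{{#!bash
| | GST_DEBUG=videotestsrc:4 gst-launch-1.0 videotestsrc num-buffers=10 ! fakesink
| | }}}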
| 1230 | |
| 1231 | |
| 1232 | [=#filter-graph] |
| 1233 | == GST_DEBUG_DUMP_DOT_DIR |
| 1234 | You can set the GST_DEBUG_DUMP_DOT_DIR env variable to a directory, which will cause {{{gst-launch}}} to output a {{{.dot}}} file for each phase of the pipeline; you can then use a tool such as {{{Graphviz}}} to visualize the {{{.dot}}} files.
| 1235 | |
| 1236 | Example: |
| 1237 | * use playbin to playback a file: |
| 1238 | {{{#!bash |
| 1239 | root@ventana:~# GST_DEBUG_DUMP_DOT_DIR=/tmp/dot gst-launch-1.0 playbin uri=file:///mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi |
| 1240 | }}} |
| 1241 | - hit Ctrl-C after decoding starts to exit early
| 1242 | * see the dot files created: |
| 1243 | {{{#!bash |
| 1244 | root@ventana:~# ls /tmp/dot |
| 1245 | 0.00.00.108710334-gst-launch.NULL_READY.dot |
| 1246 | 0.00.00.490807334-gst-launch.READY_PAUSED.dot |
| 1247 | 0.00.00.506736000-gst-launch.PAUSED_PLAYING.dot |
| 1248 | 0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot |
| 1249 | 0.00.03.254000001-gst-launch.PAUSED_READY.dot |
| 1250 | }}} |
| 1251 | * transfer to a PC and use something like {{{xdot}}} to view: |
| 1252 | {{{#!bash |
| 1253 | xdot 0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot |
| 1254 | }}} |
| 1255 | - zoom in along the graph and you can see that: |
| 1256 | - {{{GstFileSrc}}} is the source, |
| 1257 | - {{{GstAviDemux}}} is used to demux to audio/x-ac3, |
| 1258 | - {{{GstAc3Parse}}} is used to parse the audio into audio frames, |
| 1259 | - {{{GstMpeg4VParse}}} is used to parse the video into video frames, |
| 1260 | - {{{GstV4l2VideoDec}}} is used to decode the video from video/mpeg to video/x-raw, |
| 1261 | - {{{GstA52Dec}}} is used to decode the audio from audio/x-ac3 to audio/x-raw,
| 1262 | - etc |
| 1263 | - Note that some hunting with {{{gst-inspect}}} must be done to determine what elements correspond to the above class names
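| | * if {{{xdot}}} is not available, Graphviz can render the graph to an image instead:
| | {{{#!bash
| | dot -Tpng 0.00.03.135202001-gst-launch.PLAYING_PAUSED.dot -o pipeline.png
| | }}}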
| 1264 | |
| 1265 | Reference: |
| 1266 | - http://docs.gstreamer.com/display/GstSDK/Basic+tutorial+11%3A+Debugging+tools |
| 1267 | |
| 1268 | == gst-launch -v |
| 1269 | The verbose debugging from {{{gst-launch -v}}} can show you the negotiation that takes place as a pipeline moves through its stages. |
| 1270 | |
| 1271 | Example: |
| 1272 | {{{#!bash |
| 1273 | gst-launch-1.0 -v playbin uri=file:///mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi |
| 1274 | }}} |
| 1275 | |
| 1276 | Examining the verbose output can show you the following:
| 1277 | * container: AVI: avidemux |
| 1278 | * video: MPEG-4 4481kbps min, 6668kbps max: mpeg4videoparse ! v4l2mpeg4dec |
| 1279 | * audio: AC3 48khz 5channels: ac3parse ! a52dec |
| 1280 | |
| 1281 | Therefore you can use these pipelines to decode and play: |
| 1282 | * video only (output to kmssink) |
| 1283 | {{{#!bash |
| 1284 | gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux ! mpeg4videoparse ! v4l2mpeg4dec ! kmssink |
| 1285 | }}} |
| 1286 | * audio only (output to hdmi audio sink) |
| 1287 | {{{#!bash |
| 1288 | gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux ! ac3parse ! a52dec ! audioconvert ! alsasink device="sysdefault:CARD=imxhdmisoc" |
| 1289 | }}} |
| 1290 | * both audio and video |
| 1291 | {{{#!bash |
| 1292 | gst-launch-1.0 -v filesrc location=/mnt/big_buck_bunny_1080p_ac3-5.1_mp4.avi ! avidemux name=d \ |
| 1293 | d. ! queue ! mpeg4videoparse ! v4l2mpeg4dec ! kmssink \ |
| 1294 | d. ! queue ! ac3parse ! a52dec ! audioconvert ! alsasink device="sysdefault:CARD=imxhdmisoc" |
| 1295 | }}} |
| 1296 | |
| 1297 | |
| 1298 | == Streaming
| 1299 | If you're having issues with network streaming: |
| 1300 | * Verify that both sides can ping one another |
| 1301 | * If the message {{{There may be a timestamping problem, or this computer is too slow}}} appears and the video display appears choppy, try the following: |
| 1302 | * Lower the bitrate from the server |
| 1303 | * Place a {{{sync=false}}} on the sink side of the server and client. |
| 1304 | * If video appears choppy, try using UDP instead of TCP.
| 1305 | * Verify that the network is not congested. |
| 1306 | * Verify your gstreamer pipeline is correct. The best way to find the element that causes a negotiation failure is to end your pipeline in a fakesink and one-by-one eliminate elements leading up to it until it negotiates successfully. |
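| | 
| | For example, to isolate a negotiation failure in the Matroska playback pipeline from earlier, start from the demuxer alone and add elements back one at a time (a sketch using the same {{{file.mkv}}}):
| | {{{#!bash
| | # step 1: does the demuxer alone negotiate?
| | gst-launch-1.0 -v filesrc location=file.mkv ! matroskademux ! fakesink
| | # step 2: add the parser back in, and so on down the pipeline
| | gst-launch-1.0 -v filesrc location=file.mkv ! matroskademux ! h264parse ! fakesink
| | }}}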