[[PageOutline]]

= Video =
This page will show several example pipelines for getting video through our boards using the gstreamer-imx set of plugins. This plugin set has several elements that can be used to output a frame to a display. Please see [wiki:Yocto/gstreamer#gstreamer-imx] for element specifics.


[=#output]
== Output ==
Generally, a 'sink' plugin is one that will take a video stream and output it to a display. Please refer to the [wiki:Yocto/Video_Out] page for details on the video output devices on the Ventana platform.

A complete list of output sinks on the imx6:
* {{{gstreamer-imx}}} specific sinks
 * imxg2dvideosink
 * imxipuvideosink
 * imxpxpvideosink
 * imxeglvivsink
* Other GStreamer sinks
 * autovideosink
 * fbdevsink
 * fdsink
 * ximagesink/xvimagesink
 * fakesink
 * v4l2sink

Plus many more! Execute {{{gst-inspect-1.0 | grep sink}}} to see a complete list of available video sinks.


[=#imxg2dvideosink]
=== imxg2dvideosink ===
This video sink is very versatile in that it can output any image size. It can also transform images (changing size, rotation, etc.), place images in specified locations, and can accept the following video formats: RGBx, BGRx, RGBA, BGRA, RGB16, NV12, NV21, I420, YV12, YUY2, UYVY

For drawing to a display, this is our recommended GStreamer video sink.

The {{{imxg2dvideosink}}} also supports vertical sync to eliminate [wiki:Yocto/gstreamer/video#tearing screen tearing]. To enable this, set the {{{use-vsync}}} property to true.
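For example (a minimal sketch; any video source can replace {{{videotestsrc}}}):
{{{#!bash
# display a test pattern with vertical sync enabled to avoid tearing
gst-launch-1.0 videotestsrc ! imxg2dvideosink use-vsync=true
}}}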


[=#imxipuvideosink]
=== imxipuvideosink ===
This video sink is not nearly as versatile in output sizes. In many cases, it will refuse a format and bail out. However, one advantage it has over the {{{imxg2dvideosink}}} is that it includes a deinterlacer and can sink more video formats: RGB16, BGR, RGB, BGRx, BGRA, RGBx, RGBA, ABGR, UYVY, v308, NV12, YV12, I420, Y42B, Y444

This sink is only recommended if you require a deinterlacer to eliminate [wiki:Yocto/gstreamer/video#interlaced-video interlaced video artifacts] or require a video format that only this sink can provide. To enable the deinterlacer, set the {{{deinterlace}}} property to true.

The {{{imxipuvideosink}}} also supports vertical sync to eliminate [wiki:Yocto/gstreamer/video#tearing screen tearing]. To enable this, set the {{{use-vsync}}} property to true.
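For example (a sketch assuming an interlaced capture source on /dev/video0):
{{{#!bash
# display interlaced capture with the IPU deinterlacer enabled
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideosink deinterlace=true
}}}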


[=#imxpxpvideosink]
=== imxpxpvideosink ===
This sink is only available on the i.MX6 Solo and i.MX6 DualLite processors. It can do the same as the above (minus having a built-in deinterlacer), and has support for the following video formats: BGRx, RGB16, I420, YV12, Y42B, NV12, YUY2, UYVY, YVYU

This sink is recommended if resources are limited and you require offloading some processing to the PXP engine.

Enabling the {{{use-vsync}}} property is useful to prevent [wiki:Yocto/gstreamer/video#tearing screen tearing] in the video.


[=#imxeglvivsink]
=== imxeglvivsink ===
This sink is very useful when you have an X11 display; it uses Vivante direct textures to output to the primary display. Like the {{{imxipuvideosink}}}, it has a flexible output video format list: I420, YV12, NV12, NV21, UYVY, RGB16, RGBA, BGRA, RGBx, BGRx, BGR, ARGB, ABGR, xRGB, xBGR

This sink is recommended when running a pipeline which will output to a display with X11/Wayland.

To use double-buffering (to eliminate tearing), set the {{{FB_MULTI_BUFFER}}} environment variable to 2.
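For example (a minimal sketch):
{{{#!bash
# double-buffer the framebuffer so output is synchronized to the display refresh
FB_MULTI_BUFFER=2 gst-launch-1.0 videotestsrc ! imxeglvivsink
}}}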


[=#autovideosink]
=== autovideosink ===
This GStreamer sink is not really a 'video' sink in the traditional sense. Similar to {{{playbin}}} and {{{decodebin}}}, this element selects what it thinks is the best available video sink and uses it. This will typically be {{{imxg2dvideosink}}} unless format choices require one of the other sinks. Generally this is not recommended, as it avoids understanding the specific pipeline that is in use.

You can add a verbose flag ({{{gst-launch-1.0 -v}}}) to see details about the elements and caps chosen when using any type of 'auto' or 'bin' element.


[=#fbdevsink]
=== fbdevsink ===
This sink allows you to directly output to the framebuffer.
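For example (a sketch; the {{{device}}} property selects which framebuffer to write to):
{{{#!bash
# draw a test pattern straight onto the first framebuffer
gst-launch-1.0 videotestsrc ! fbdevsink device=/dev/fb0
}}}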


[=#fdsink]
=== fdsink ===
This sink allows you to write to an open file descriptor.
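For example (a sketch; fd 1 is stdout, so the stream can be redirected to a file):
{{{#!bash
# write one raw frame to stdout and redirect it to a file
gst-launch-1.0 videotestsrc num-buffers=1 ! fdsink fd=1 > /tmp/frame.raw
}}}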


[=#xvimagesink]
=== ximagesink/xvimagesink ===
These sinks output to the X11 display using standard Xlib API calls. The {{{xvimagesink}}} uses the XFree86 XVideo extension for video output.


[=#fakesink]
=== fakesink ===
This is a very useful video sink. It takes whatever frames are given to it and drops them. This can help when debugging pipelines if problems ever arise.

{{{#!bash
gst-launch-1.0 videotestsrc pattern=0 ! fakesink
}}}


[=#v4l2sink]
=== v4l2sink ===
This sink is useful when displaying frames on a video4linux2 device. It is generally not used on imx6 based products unless all other sinks fail.

=== Examples ===
Using the above, an example video output pipeline might look like the following:

{{{#!bash
gst-launch-1.0 videotestsrc pattern=18 ! imxg2dvideosink framebuffer=/dev/fb0
}}}


[=#input]
== Input ==
An input source is anything coming from an input device on the board, e.g. an HDMI receiver or a USB web camera, whose frames can be captured and displayed. Please refer to the [wiki:Yocto/Video_In] page for details on the video input devices on the Ventana platform.

A complete list of input sources on the imx6:
* gstreamer-imx specific sources
 * imxv4l2videosrc
* Other GStreamer sources
 * autovideosrc
 * videotestsrc
 * v4l2src

Plus many more! Execute {{{gst-inspect-1.0 | grep src}}} to see a complete list of available video sources.

If the {{{is-live}}} property is set to true, buffers will be discarded while the pipeline is in the paused state, and the source will not participate in the PREROLL phase of the pipeline.


[=#v4l2-ctl]
=== video4linux2 devices and v4l2-ctl ===
The {{{imxv4l2videosrc}}} and {{{v4l2src}}} elements capture from a video4linux2 device. You can use the {{{v4l2-ctl}}} application to interact with the device to get/set various capabilities and controls.

For example:
 * display all details about /dev/video1:
{{{#!bash
v4l2-ctl -d /dev/video1 --all
}}}
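 * list the supported capture formats of a device (a sketch; the device node may vary):
{{{#!bash
v4l2-ctl -d /dev/video0 --list-formats
}}}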

Note that the IMX6 capture driver uses the v4l-int-dev API, which creates a 'master' and 'slave' relationship between the CPU's IPU capture driver (mxc_capture) and the driver for the actual image sensor or video decoder (e.g. the adv7180 analog video decoder or the tda1997x HDMI receiver). As such, the V4L2 API and {{{v4l2-ctl}}} may not give you access to all the knobs that may exist on the 'slave' or 'sensor' driver. Please see the [wiki:Yocto/Video_In] page for more details.


[=#imxv4l2videosrc]
=== imxv4l2videosrc ===
This is the recommended video capture source element for the least amount of CPU overhead if you are going to be using any of the IMX6 IPU/VPU/GPU capabilities such as displaying on an IMX6 output, [#encoding encoding]/[#decoding decoding]/[#transcoding transcoding] video, [wiki:Yocto/gstreamer/compositing video composition], or using any of the other transforms such as [#colorspace colorspace conversion], [#scaling scaling], or [#interlaced-video de-interlacing].

This is because the imxv4l2videosrc element is necessary to achieve [https://github.com/Freescale/gstreamer-imx/blob/master/docs/zerocopy.md zero-copy], where DMA-able buffers can be shared among GStreamer elements, eliminating CPU-intensive memory copies.

The v4l CSI drivers in the Gateworks downstream vendor kernel have some extra calls that allow one to retrieve the physical address that corresponds to a v4l buffer, and since the IPU, G2D, VPU, and PxP driver APIs all use physical addresses to access data via DMA, this allows for zero-copy.

See also [#v4l2-ctl v4l2-ctl] above.


[=#autovideosrc]
=== autovideosrc ===
Like the other {{{auto*}}} GStreamer plugins, this one attempts to pick a video source and use it.


[=#videotestsrc]
=== videotestsrc ===
This is a very useful plugin for testing. It can output a huge number of raw video formats: I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, NV16, NV24, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10LE, I420_10BE, I422_10LE, I422_10BE, Y444_10LE, Y444_10BE, GBR, GBR_10LE, GBR_10BE, NV12_64Z32

In addition, it can output several bayer video formats: bggr, rggb, grbg, gbrg

By selecting a test pattern in the range 0 - 22, you can verify colors, movement, and other characteristics.
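For example (a sketch; pattern 18 is a moving ball, useful for verifying motion):
{{{#!bash
# render the moving-ball test pattern to the display
gst-launch-1.0 videotestsrc pattern=18 ! imxg2dvideosink
}}}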


[=#v4l2src]
=== v4l2src ===
This plugin is similar to the {{{imxv4l2videosrc}}} plugin in that it uses the v4l2 api to capture video from input sources; however, it does not have access to the physical memory addresses necessary to achieve [https://github.com/Freescale/gstreamer-imx/blob/master/docs/zerocopy.md zero-copy] and is therefore typically more CPU intensive depending on your pipeline. If you are going to be using any IMX6 IPU/GPU/VPU capabilities, use [wiki:Yocto/gstreamer/video#imxv4l2videosrc imxv4l2videosrc] instead.

Note that {{{v4l2src}}} is always live regardless of the {{{is-live}}} property.

See also [#v4l2-ctl v4l2-ctl] above.


=== Examples ===
Here are some capture examples:
 * Use a videotestsrc as the source and output to a fakesink:
{{{#!bash
gst-launch-1.0 videotestsrc pattern=0 ! fakesink
}}}
 * Use imxv4l2videosrc to capture video from a camera source (/dev/video0 in this case) and output to the first video device via imxg2dvideosink:
{{{#!bash
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxg2dvideosink
}}}


[=#colorspace]
[=#scaling]
== Colorspace Converting and/or Video Scaling ==
Oftentimes, a colorspace conversion or scaling is required in order to link GStreamer elements together. This is often because not all elements can accept every available format.

A list of available video transforms:
* gstreamer-imx specific hardware-accelerated converters:
 * imxipuvideotransform (uses IMX6 IPU)
 * imxg2dvideotransform (uses IMX6 GPU)
 * imxpxpvideotransform (uses IMX6 PXP)
  * Note again that the PXP is only available on the i.MX6 Solo and i.MX6 DualLite processors.
* Other GStreamer colorspace converters (software based):
 * autovideoconvert
 * videoconvert
 * rgb2bayer


[=#transform]
[=#imxipuvideotransform]
=== imxipuvideotransform ===
This plugin can convert from input formats RGB16, BGR, RGB, BGRx, BGRA, RGBx, RGBA, ABGR, UYVY, v308, NV12, YV12, I420, Y42B, Y444 to output formats RGB16, BGR, RGB, BGRx, BGRA, RGBx, RGBA, ABGR, UYVY, v308, NV12, YV12, I420, Y42B, Y444. Further, this element can deinterlace video before sending a frame on, which can be very useful depending on your video source.

This is the recommended video transform; however, please note that the IPU cannot accept non-standard video resolutions.
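For example (a sketch forcing a specific conversion with caps filters; see [#capsfilters Caps Filters] below):
{{{#!bash
# convert I420 frames from videotestsrc to RGB16 using the IPU
gst-launch-1.0 videotestsrc ! video/x-raw,format=I420 ! imxipuvideotransform ! video/x-raw,format=RGB16 ! fakesink
}}}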

[=#imxg2dvideotransform]
=== imxg2dvideotransform ===
This plugin can convert from input formats RGBx, BGRx, RGBA, BGRA, RGB16, NV12, NV21, I420, YV12, YUY2, UYVY to output formats RGBx, BGRx, RGBA, BGRA, RGB16. This transform doesn't support as many video formats as the imxipuvideotransform, which is why it is not recommended.

[=#imxpxpvideotransform]
=== imxpxpvideotransform ===
This plugin can convert from input formats BGRx, RGB16, I420, YV12, Y42B, NV12, YUY2, UYVY, YVYU to output formats BGRx, BGRA, RGB16, GRAY8. Like the imxg2dvideotransform, it cannot handle as many video formats as the IPU transform, which is why it is not recommended.

[=#autovideoconvert]
=== autovideoconvert ===
Like the other auto* plugins, this one chooses the best plugin it thinks can convert one video format to another. It is generally not recommended.

[=#videoconvert]
=== videoconvert ===
This is the GStreamer software video colorspace converter. Because it is software based, it can handle a whole slew of video formats:

It converts from input formats I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, NV16, NV24, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10LE, I420_10BE, I422_10LE, I422_10BE, Y444_10LE, Y444_10BE, GBR, GBR_10LE, GBR_10BE, NV12_64Z32 to the same set of output formats.

This converter is only recommended when the above cannot be used. Because it is software based, its performance cost is very high.
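For example (a minimal sketch):
{{{#!bash
# software-convert test frames to 8-bit grayscale
gst-launch-1.0 videotestsrc ! videoconvert ! video/x-raw,format=GRAY8 ! fakesink
}}}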

=== rgb2bayer ===
This plugin converts ARGB input to the following bayer output formats: bggr, gbrg, grbg, rggb.

=== Examples ===
Examples using colorspace conversion:
 * Loopback video from a non-IMX capture source to an IMX output:
{{{#!bash
gst-launch-1.0 v4l2src device=/dev/video0 ! imxipuvideotransform ! imxg2dvideosink
}}}
 * Take camera input /dev/video0 and output it to /dev/fb1 using the IPU to both colorspace convert and display:
{{{#!bash
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxipuvideosink framebuffer=/dev/fb1
}}}


[=#tearing]
== Screen Tearing ==
Screen tearing occurs when the video output is not in sync with the display's refresh rate. The process of synchronizing the output to the refresh rate is also referred to as "vsync".

If the video frames are displayed directly on the framebuffer, this is easy to fix. The blitter-based video sinks (i.e. the IPU, G2D, and PxP sinks) have a "use-vsync" property, which is set to false by default. If set to true, the sink reconfigures the framebuffer, enlarging its virtual height, and then performs page flipping during playback. The page flipping is synchronized to the display's refresh rate, eliminating the tearing effects. If {{{imxeglvivsink}}} is used, the {{{FB_MULTI_BUFFER}}} environment variable needs to be set to 2. This instructs the Vivante EGL libraries to set up the framebuffer in a way that is similar to what the blitter-based sinks do.

In X11, vsync is not doable from the gstreamer-imx side. It would require changes to the existing i.MX6 X11 driver. So far, no such change has been made, meaning that as of now, tearing-free video playback in X11 is not possible.

In Wayland, vsync is possible when using Weston as the Wayland compositor. Weston can use OpenGL ES for rendering and also Vivante's G2D. With OpenGL ES, the {{{FB_MULTI_BUFFER}}} approach mentioned above enables vsync for Weston output. This means that the {{{export FB_MULTI_BUFFER=2}}} line needs to be added to the Weston init script. {{{imxeglvivsink}}} can then be used to display video in Wayland, and it will automatically be in sync with the display's refresh rate.

References:
* [https://github.com/Freescale/gstreamer-imx/blob/master/docs/faq.md gstreamer-imx FAQ]
* [https://en.wikipedia.org/wiki/Screen_tearing Wikipedia - includes a simulated image showing tearing]


[=#interlaced-video]
== Interlaced Video and Deinterlacing ==
Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth. The interlaced signal contains two fields of a video frame captured at two different times. The alternative to interlaced video is called progressive video.

While reducing bandwidth, this can cause a perceived flicker effect as well as a very apparent artifact seen during motion. For example, a car moving horizontally across a scene will show every other vertical line differently: the second field interlaced with the first will be a full frame period ahead in time of the other. The visual effect can be seen in [https://en.wikipedia.org/wiki/File:Interlaced_video_frame_(car_wheel).jpg this image from Wikipedia].

Television signals are typically interlaced, or at least were until recently. For example, analog television standards such as NTSC used in North America as well as the PAL and SECAM formats used abroad use interlaced video, and therefore any analog video decoder such as the ADV7180 found on many Gateworks Ventana boards will capture interlaced video and be subject to interlacing artifacts. Interlaced video is still used in High Definition signals as well, and the letter at the end of the format tells you if it is interlaced (e.g. 480i, 1080i) or progressive (e.g. 480p, 720p, 1080p).

References:
* [https://en.wikipedia.org/wiki/Interlaced_video Wikipedia interlaced video - includes several images demonstrating interlacing artifacts]
* [https://en.wikipedia.org/wiki/Deinterlacing Wikipedia deinterlacing]


[=#capsfilters]
== Caps Filters ==
GStreamer has a concept called [http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/section-caps-api.html caps filters]. 'Caps' (capabilities) describe the type of data that links two pads (two plugins). For example, adding a -v flag to a pipeline will output the caps negotiated between plugins:

{{{#!bash
# gst-launch-1.0 -v videotestsrc ! fakesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
}}}

In the output, the {{{caps = "video/x-raw\,\ format\=\(string\)I420\,\ width\=\(int\)320\,\ height\=\(int\)240\,\ framerate\=\(fraction\)30/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"}}} lines show the video caps negotiated between the two. You can also force a caps filter between two elements:
{{{#!bash
~# gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=UYVY, width=1920, height=1080, framerate=10/1' ! fakesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
/GstPipeline:pipeline0/GstVideoTestSrc:videotestsrc0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstFakeSink:fakesink0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ framerate\=\(fraction\)10/1\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)progressive"
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
}}}
As you can see, the caps filter introduced here changed the video format, resolution, and framerate of the stream coming out of the {{{videotestsrc}}} plugin. Caps filters are useful when you want to capture at a specific resolution/format, change the audio sample rate, etc.


[=#loopback]
== Loopback Test ==
Looping video (i.e. going from an input source back out to an output sink) can be useful when testing colorspace conversions, video composition, resizing, etc.

The easiest method of confirming video in to video out is something like the following:

{{{#!bash
# Take camera input /dev/video0 and output it to /dev/fb0 using the GPU
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxg2dvideosink
}}}

However, there are some input devices whose format the GPU sink cannot accept, and therefore you'll need to do a colorspace conversion:
{{{#!bash
# Take camera input /dev/video0, colorspace convert it using the IPU, and finally output it to /dev/fb0 using the GPU
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxg2dvideosink
}}}

Some other examples:
 * Take camera input /dev/video0 and output it to /dev/fb0 using the IPU to both colorspace convert and display:
{{{#!bash
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxipuvideosink
}}}
 * Take two inputs and place them in separate sections of the screen using both the IPU and GPU:
{{{#!bash
gst-launch-1.0 \
  imxv4l2videosrc device=/dev/video0 ! imxipuvideosink window-width=480 window-height=272 \
  videotestsrc ! imxg2dvideosink window-width=480 window-height=272 window-x-coord=900
}}}


[=#encoding]
== Encoding ==

Encoding takes a raw video stream and compresses it with a codec; the result can then be saved to a file or muxed into a container format (e.g. AVI/MP4). The hardware accelerated encoder elements on the i.MX6 are: imxvpuenc_h263, imxvpuenc_h264, imxvpuenc_mpeg4, and imxvpuenc_mjpeg. See below for some examples.

=== VBR/CBR ===
Variable Bitrate (VBR) and Constant Bitrate (CBR) are rate-control modes you can select on an encoder. For example, the imxvpuenc_h263 encoder can set its bitrate property for CBR, or can change quant-param for VBR. Setting the bitrate guarantees a maximum bitrate for the stream: when filming an action scene the stream is guaranteed not to exceed the configured rate, while calm scenes may produce less. The quant-param, on the other hand, fixes the quantization level: every frame is quantized by the same amount, so quality stays roughly constant while the bitrate varies dynamically with scene complexity.
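For example (a sketch contrasting the two modes with the h264 encoder, using the properties described above):
{{{#!bash
# CBR: cap the stream at 2mbit/s
gst-launch-1.0 videotestsrc num-buffers=300 ! imxvpuenc_h264 bitrate=2000 ! filesink location=/tmp/cbr.h264
# VBR: fix the quantization level and let the bitrate vary
gst-launch-1.0 videotestsrc num-buffers=300 ! imxvpuenc_h264 quant-param=25 ! filesink location=/tmp/vbr.h264
}}}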

=== h264 ===
h264 is a very popular encoding technology. It is most often used for HD content, such as Blu-rays and HDTV. See below for an example using the hardware accelerated h264 encoder:
{{{#!bash
# Take camera input /dev/video0, encode it to h264 at a bitrate of 10mbit/s (CBR) and save to a file.
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxvpuenc_h264 bitrate=10000 ! filesink location=/tmp/file.mp4
}}}

Some cameras provide output that the vpu encoder can't handle, so colorspace convert it first:
{{{#!bash
# Take camera input /dev/video0, colorspace convert it, encode it to h264 at a quant-param level of 25 (VBR) and save to a file
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxipuvideotransform ! imxvpuenc_h264 quant-param=25 ! filesink location=/tmp/file.mp4
}}}
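Note that the above writes a raw h264 bitstream; to produce a file with a proper container, add a parser and muxer (a sketch assuming the {{{matroskamux}}} element is present in your image):
{{{#!bash
# encode camera input and mux the h264 stream into a Matroska container
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 num-buffers=300 ! imxvpuenc_h264 bitrate=10000 ! h264parse ! matroskamux ! filesink location=/tmp/file.mkv
}}}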

=== mpeg4 ===
mpeg4 is another very popular encoding technology. It is an older codec than h264, but is still prevalent. See below for an example using the hardware accelerated mpeg4 encoder:

{{{#!bash
# Take camera input /dev/video0, encode it to mpeg4 at a bitrate of 10mbit/s (CBR) and save to a file.
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 ! imxvpuenc_mpeg4 bitrate=10000 ! filesink location=/tmp/file.mp4
}}}

=== mjpeg ===
mjpeg applies JPEG compression to a stream of frames and is the format often used when saving .jpg files. Please note that mjpeg's bitrate/quant-param parameters don't work; currently, there is no way to vary the compression rate of the mjpeg encoder. See below for example usage:
 * Take 1 frame, encode it to mjpeg, and save to a file:
{{{#!bash
gst-launch-1.0 videotestsrc pattern=0 num-buffers=1 ! imxvpuenc_mjpeg ! filesink location=/tmp/file.mjpg
}}}
 * Take 50 frames, encode them to mjpeg, and save to a file:
{{{#!bash
gst-launch-1.0 videotestsrc pattern=0 num-buffers=50 ! imxvpuenc_mjpeg ! filesink location=/tmp/file.mjpg
}}}

=== jpeg ===
jpeg is an image format, usually used for single frame captures. For example, a user can capture a single frame from a camera or other source via the jpegenc element and display it in their favorite image viewer:
 * Capture a frame and save it to a .jpg file:
{{{#!bash
gst-launch-1.0 imxv4l2videosrc device=/dev/video0 num-buffers=1 ! jpegenc ! filesink location=/tmp/file.jpg
}}}


[=#decoding]
== Decoding ==
Like encoding, you can go the other way (in order to display video on a monitor).

Instead of having a different plugin element for each supported codec, the {{{imxvpudec}}} plugin element will automatically use the appropriate VPU decoder. However, it is necessary to place a bitstream parser in front of it to detect the codec in use and split the stream into frames.

Note that the following examples assume you are using raw video encoded files, not container formats used for mixed multimedia types (audio + video) such as ogg, avi, or mov (Quicktime). For information on de-muxing container formats see [wiki:Yocto/gstreamer/multimedia].

=== H.264 ===
Decoding an h264 file might look like this:
 * Take an input, decode, and display:
{{{#!bash
gst-launch-1.0 filesrc location=/tmp/file.h264 ! h264parse ! imxvpudec ! imxipuvideotransform ! imxipuvideosink
}}}

=== MPEG4 ===
Decoding an MPEG4 file might look like this:
 * Take an input, decode, and display:
{{{#!bash
gst-launch-1.0 filesrc location=/tmp/file.mp4 ! mpeg4videoparse ! imxvpudec ! imxipuvideotransform ! imxipuvideosink
}}}


[=#transcoding]
== Transcoding ==
Transcoding converts a file from one format to another. For example:
 * Take a file in the AVI format, de-mux and decode it, and re-encode it to h264 at a bitrate of 5mbit/s:
{{{#!bash
gst-launch-1.0 filesrc location=/tmp/file.avi ! avidemux ! imxvpudec ! queue2 ! imxvpuenc_h264 bitrate=5000 ! filesink location=/tmp/file.h264
}}}