Changes between Initial Version and Version 1 of TPU

06/18/2020 10:45:27 PM
Cale Collins

first draft


= Coral — What is it?
The Coral Edge TPU provides a means to perform advanced machine learning tasks in a low-power, small-form-factor package. The hardware is based on an application-specific IC (ASIC) tailored for hardware-accelerated AI calculations.
This TPU is appropriate for applications where identifying an object or pattern is required. This could be, but is not limited to:
* Object detection
* Pose or gesture estimation
* Image segmentation
* Key phrase detection
Some practical applications are:
* Autonomous vehicles
* Robots
* Voice control/language processing
* Monitoring devices
Nearly all industries can benefit from this technology. To name a few more specifically:
* Health Care
* Agriculture
* Manufacturing
* Oil and Gas
* !Security/Defence
* Automated Kiosks
For more information about the hardware, see these references:
= Getting started with the Coral Edge TPU
To begin you will need:
* A workstation with Linux natively installed.
* A Newport SBC (Coral requires AARCH64).
* A Coral Edge TPU; our testing was done with the mPCIe form factor model.
* !Network/Internet connection.
* Optional: USB webcam
== Compiling the kernel
The Gateworks kernel defconfig for Newport does not include support for video devices. For the sake of convenience, a pre-built image is available for download. If you would like to create a similar image manually:
* Acquire the [wiki:/newport/bsp Newport BSP]; we will call the directory this repo has been synced to the <BSP> directory.
* Follow the steps [wiki:/newport/bsp#Modifyingthestand-aloneLinuxKernelieforUbuntu here] to modify the kernel and create an Ubuntu image.
 * In menuconfig, enable the module "USB_VIDEO_CLASS"; this will allow you to use a USB webcam with v4l-utils.
 * Complete the procedure detailed in the aforementioned section, build your Bionic image, and [wiki:/newport/firmware#UpdateFirmwareviaSerialConsoleandEthernetfromBootloader flash it to your SBC].
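Before building, the menuconfig selection can be double-checked against the generated kernel config. The sketch below assumes the kernel tree's `.config` path; `check_uvc` is a hypothetical helper, not part of the BSP:

```shell
# check_uvc: report whether a kernel .config enables the UVC webcam driver
# (=y built-in or =m module); takes the path to the .config file.
check_uvc() {
    if grep -Eq '^CONFIG_USB_VIDEO_CLASS=(y|m)$' "$1"; then
        echo "UVC enabled"
    else
        echo "UVC disabled"
    fi
}

# usage (from the <BSP> directory): check_uvc linux/.config
```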
== Building and installing the Gasket and Apex modules
The source code for the modules can be downloaded here as a .tar.gz file:
 * Extract this tar into a directory; this directory will be referred to as <the_module_directory>.
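The unpack step might look like the following sketch; the archive name `gasket-driver.tar.gz` and the target directory are placeholders for whatever you downloaded:

```shell
# extract_modules: unpack a module source tarball into a target directory.
# $1 = path to the .tar.gz archive, $2 = directory to create and extract into
extract_modules() {
    mkdir -p "$2"
    tar -xzf "$1" -C "$2"
}

# usage: extract_modules gasket-driver.tar.gz the_module_directory
```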
Build the source using the same method you would for an out-of-tree module (out of tree = the module source is not included in the kernel source).
* cd to the <BSP> directory.
{{{
source setup-environment
}}}
Doing this will configure the toolchain for cross-building. Keep in mind some lines in this file use $PWD, so it should not be sourced from any location other than the <BSP> folder.
* You can verify the variables have been exported by executing:
{{{
echo $ARCH
}}}
This will return "arm64".
* cd to <the_module_directory>.
* cat the "Makefile"; you'll see two variables which need to be set for the modules to build correctly.
* Execute the following command:
{{{
make -C <BSP>/linux M=$PWD
}}}
This procedure will result in two modules being created: "apex.ko" and "gasket.ko". Copy the .ko files to the "/lib/modules/<kernel_version>/extra/" folder on your target board. Using SCP may be the simplest way to go about this.
With the modules copied to the board, execute the following commands:
{{{
depmod -a
insmod gasket.ko
}}}
Remove power from the board and reboot.
On reboot, verify that the "/dev/apex_0" device is present.
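The post-reboot check can be scripted; `verify_dev` below is a hypothetical helper, shown as a minimal sketch:

```shell
# verify_dev: confirm a device node exists after the modules load.
# $1 = path to the expected device node
verify_dev() {
    if [ -e "$1" ]; then
        echo "$1 present"
    else
        echo "$1 missing"
    fi
}

# after reboot (as root):
#   verify_dev /dev/apex_0
```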
== Installing and configuring Python
Python 3.7 is required to run the TensorFlow examples. Other versions can be used, though at the time of writing this wiki, 3.7 is the best option when using the Bionic Ubuntu for Newport BSP.
{{{
apt-get update
apt install python3.7 -y
}}}
Set Python 3.7 to have priority over 3.6.
{{{
update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.6 1
update-alternatives --install /usr/bin/python3 python3 /usr/bin/python3.7 2
update-alternatives --config python3
# on this menu, enter the number 2
}}}
You can verify you have been successful in changing the version with the following command:
{{{
python3 --version
}}}
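The version check can be automated so a script fails early on an interpreter older than 3.7. A minimal sketch; `py_at_least_37` is a hypothetical helper that parses the `Python X.Y.Z` string printed above:

```shell
# py_at_least_37: given "python3 --version" output like "Python 3.7.5",
# report whether the interpreter meets the 3.7 minimum used on this page.
py_at_least_37() {
    ver=$(echo "$1" | awk '{print $2}')
    major=$(echo "$ver" | cut -d. -f1)
    minor=$(echo "$ver" | cut -d. -f2)
    if [ "$major" -gt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -ge 7 ]; }; then
        echo "ok ($ver)"
    else
        echo "too old ($ver)"
    fi
}

# usage: py_at_least_37 "$(python3 --version 2>&1)"
```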
== Installing the TPU runtime
Install curl:
{{{
apt-get install curl -y
}}}
Add the Debian package repository to your system:
{{{
echo "deb coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list
curl | sudo apt-key add -
sudo apt update
}}}
Install pip and required libraries:
{{{
apt-get install python3-pip libedgetpu1-std -y
pip3 install --upgrade pip setuptools wheel
}}}
Download the TFlite runtime .whl.
***Note:*** If you have chosen not to use Python 3.7, you can find the .whl appropriate for your version here.
Acquire the libraries necessary to build the TFlite runtime:
{{{
pip3 install cython
pip3 install numpy
apt-get install python3-pil
}}}
Install the TFlite runtime (the cp37 wheel matches the Python 3.7 installed above):
{{{
pip3 install tflite_runtime-2.1.0.post1-cp37-cp37m-linux_aarch64.whl
}}}
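The wheel file name encodes the CPython version it targets, so picking the right file reduces to mapping your Python version to a tag. A sketch; `wheel_tag` is a hypothetical helper, and the `m` ABI suffix it appends only applies through CPython 3.7 (it was dropped in 3.8):

```shell
# wheel_tag: map a CPython "X.Y" version string to the python/ABI tag pair
# that appears in tflite_runtime wheel file names (valid for 3.5-3.7).
wheel_tag() {
    v=$(echo "$1" | tr -d .)
    echo "cp${v}-cp${v}m"
}

# usage: wheel_tag 3.7
```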
== Download the classification example, test an inferencing operation
Create a place for the Coral examples to reside:
{{{
mkdir coral && cd coral
}}}
Clone the examples, then "cd" to the examples directory:
{{{
git clone
cd tflite/python/examples/classification
}}}
Install prerequisite programs using the supplied script.
Run the classification demo:
{{{
python3 \
--model models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
--labels models/inat_bird_labels.txt \
--input images/parrot.jpg
}}}
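A small pre-flight check can catch missing model, label, or image files before launching the demo. A sketch assuming the same file layout as above; `check_inputs` is a hypothetical helper, not part of the Coral examples:

```shell
# check_inputs: verify every file passed as an argument exists before
# launching the demo; prints the first missing path, or "inputs ok".
check_inputs() {
    for f in "$@"; do
        if [ ! -f "$f" ]; then
            echo "missing: $f"
            return 1
        fi
    done
    echo "inputs ok"
}

# usage:
#   check_inputs models/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite \
#       models/inat_bird_labels.txt images/parrot.jpg
```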
== Gstreamer example
This example will use a USB webcam and the TPU to identify objects presented to the webcam. The video output and overlay will be streamed to a location on the network for viewing.
Install Gstreamer; you will need this program both on the SBC and on the workstation where you will be viewing the output:
{{{
apt-get install gstreamer1.0-x gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-alsa -y
}}}
Clone the Gstreamer example:
{{{
mkdir google-coral && cd google-coral
git clone --depth 1
}}}
Download the models. Models are the data that will be fed to the TPU for it to reference when identifying an object:
{{{
cd examples-camera
}}}
Configure the Gstreamer Python script:
{{{
cd gstreamer
}}}
Adaptations to the existing script will be required for this example to work. A modified script is available to download from [ here].
Edit line 231 of this file with the IP address of the desktop workstation you will be streaming to.
This is what the edited line will look like:
{{{
  ! rsvgoverlay name=overlay ! videoconvert ! jpegenc ! tcpclientsink host= port=9001
}}}
* The workstation being used in this example has an IP address of, the SBC is using an IP which is on the same subnet.
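Rather than editing line 231 by hand, the viewer IP can be patched in with `sed`. A sketch assuming the pipeline line matches the `tcpclientsink host=... port=9001` form shown above; `set_stream_host` and the example IP are placeholders:

```shell
# set_stream_host: rewrite the tcpclientsink host in a gstreamer example
# script so the overlay stream is sent to the given viewer IP.
# $1 = path to the script, $2 = workstation IP address
set_stream_host() {
    sed -i "s/tcpclientsink host=[0-9.]*/tcpclientsink host=$2/" "$1"
}

# usage: set_stream_host detect.py 192.168.1.100
```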
Launch Gstreamer on your workstation:
{{{
gst-launch-1.0 tcpserversrc host= port=9001 ! jpegdec ! videoconvert ! autovideosink sync=false
}}}
On the Gateworks SBC, run:
Here's a video demonstration of the output you can expect if everything is working: