= Neural Processing Unit

The Gateworks Venice SBCs that use the i.MX8M Plus processor have a built-in Neural Processing Unit (NPU) for machine learning:
* All GW74xx boards use the i.MX8M Plus processor
* Any GW71xx, GW72xx, or GW73xx board fitted with a GW702x SOM uses the i.MX8M Plus processor

The NPU operates at up to 2.25 TOPS.

NXP uses the term eIQ, short for 'edge intelligence'. NXP provides the eIQ ML software environment for neural networks (NN). eIQ includes four inference engines:
1. OpenCV
1. Arm® NN
1. Arm CMSIS-NN
1. !TensorFlow Lite

The default NXP Yocto software includes !TensorFlow Lite examples in the directory /usr/bin/tensorflow-lite-2.4.0/examples (see the example run at the bottom of this page).

PyeIQ is demo software written on top of the eIQ Machine Learning software environment. It provides Python classes that give a simple and efficient baseline for getting started (see the demo launcher sketch at the bottom of this page).
* [https://pypi.org/project/pyeiq/]
* {{{
apt install pip
pip3 install pyeiq
}}}
* PyeIQ examples are shown here: [https://community.nxp.com/t5/Blog/PyeIQ-3-x-Release-User-Guide/ba-p/1305998]

More information can also be found on the NXP eIQ community page:
* eIQ Edge Intelligence Starter PDF: [https://www.nxp.com/docs/en/fact-sheet/EIQ-FS.pdf]
* eIQ Edge Intelligence Community Page: https://community.nxp.com/t5/eIQ-Machine-Learning-Software/bd-p/eiq

Other links:
* [https://www.nxp.com/video/using-i-mx-8m-plus-applications-processors-to-enable-ai-in-factory:USING-I.MX-8M-PLUS-APPS-PROCESSORS-TO-ENABLE-AI Using i.MX 8M Plus Applications Processors to Enable AI in Factory]
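
To try the !TensorFlow Lite engine, the prebuilt label_image demo in the examples directory above can be run from the command line. The commands below are a minimal sketch: the model, image, and label file names are the ones typically shipped with the NXP demos, and the acceleration flag behavior can differ between BSP releases, so adjust to the files and options actually present on your image.
{{{
# Go to the TensorFlow Lite examples shipped with the NXP Yocto image
cd /usr/bin/tensorflow-lite-2.4.0/examples

# Image classification on the CPU (file names are the typical NXP demo files)
./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt

# Same run with hardware acceleration requested (-a 1); on this BSP generation
# this asks for the quantized model to be offloaded to the NPU - compare the
# reported inference times to confirm
./label_image -m mobilenet_v1_1.0_224_quant.tflite -i grace_hopper.bmp -l labels.txt -a 1
}}}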
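
After installing PyeIQ with pip as shown above, its bundled demos can be launched from the shell. This is a sketch assuming the PyeIQ 3.x command-line launcher; the exact launcher options and demo names vary by release, so consult the PyeIQ 3.x user guide linked above for what is available on your image.
{{{
# Launch one of the bundled demos (assumed PyeIQ 3.x launcher syntax; replace
# <demo_name> with a demo listed in the PyeIQ 3.x user guide linked above)
pyeiq --run <demo_name>
}}}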