= Neural Processing Unit

The Gateworks Venice SBCs that use the i.MX8M Plus processor have a built-in Neural Processing Unit (NPU) for machine learning:
 * All GW74xx boards use the i.MX8M Plus processor
 * Any GW71xx, GW72xx, or GW73xx using a GW702x SOM module will use the i.MX8M Plus processor

The NPU operates at up to 2.25 TOPS.

NXP uses the term eIQ, short for 'edge intelligence', and provides an eIQ ML software environment for neural networks (NN). With eIQ, there are 4 inference engines: OpenCV, Arm® NN, Arm CMSIS-NN, and !TensorFlow Lite.

Some of the default NXP Yocto software includes examples in the directory /usr/bin/tensorflow-lite-2.4.0/examples; a minimal Python sketch for running a model on the NPU is shown at the end of this section.

More information can also be found on the eIQ community pages:
 * eIQ Edge Intelligence Starter PDF: [https://www.nxp.com/docs/en/fact-sheet/EIQ-FS.pdf]
 * eIQ Edge Intelligence Community Page: https://community.nxp.com/t5/eIQ-Machine-Learning-Software/bd-p/eiq

Other links:
 * [https://www.nxp.com/video/using-i-mx-8m-plus-applications-processors-to-enable-ai-in-factory:USING-I.MX-8M-PLUS-APPS-PROCESSORS-TO-ENABLE-AI Using i.MX 8M Plus Applications Processors to Enable AI in Factory]
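
The sketch below illustrates one common way to exercise the NPU from the !TensorFlow Lite Python API. It assumes the Yocto image provides the tflite_runtime Python package, a quantized MobileNet model in the examples directory above, and an NPU delegate library at /usr/lib/libvx_delegate.so; those paths and filenames are assumptions and may differ between BSP releases (older BSPs use the NNAPI delegate instead), so adjust them to match your rootfs.

{{{#!python
#!/usr/bin/env python3
# Minimal sketch: run a quantized TFLite model on the i.MX8M Plus NPU via an
# external delegate. Model, label, and delegate paths are ASSUMPTIONS; verify
# them against the files actually present on the target before running.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

MODEL = "/usr/bin/tensorflow-lite-2.4.0/examples/mobilenet_v1_1.0_224_quant.tflite"  # assumed filename
DELEGATE = "/usr/lib/libvx_delegate.so"  # assumed NPU delegate path

# Offload supported ops to the NPU; drop experimental_delegates to fall back to the CPU.
interpreter = Interpreter(model_path=MODEL,
                          experimental_delegates=[load_delegate(DELEGATE)])
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed random uint8 data just to exercise the NPU; replace with a real preprocessed image.
data = np.random.randint(0, 256, size=inp["shape"], dtype=np.uint8)
interpreter.set_tensor(inp["index"], data)
interpreter.invoke()

scores = interpreter.get_tensor(out["index"])[0]
print("top class index:", int(np.argmax(scores)))
}}}

The first inference after loading a model is typically much slower than subsequent ones because the delegate compiles the graph for the NPU, so benchmark with a warm-up run before timing.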