Neural Processing Unit
The Gateworks Venice SBCs based on the i.MX8M Plus processor have a built-in Neural Processing Unit (NPU) for machine learning.
- All GW74xx use the i.MX8M Plus processor
- Any GW71xx, GW72xx and GW73xx using a GW702x SOM module will use the i.MX8M Plus processor
The NPU operates at up to 2.25 TOPS.
NXP uses the term eIQ, short for 'edge intelligence', and provides an eIQ ML software environment for neural networks (NN).
With eIQ, there are four inference engines: OpenCV, Arm NN, Arm CMSIS-NN, and TensorFlow Lite.
Some of the default NXP Yocto software includes examples in the directory /usr/bin/tensorflow-lite-2.4.0/examples.
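As a rough illustration of how such a model can be run from Python, the sketch below loads a quantized TensorFlow Lite model and hands inference to a hardware delegate. The model path, the delegate library path, and the presence of the tflite_runtime Python package are assumptions that vary by BSP release: newer eIQ releases offload to the NPU through the VX delegate (libvx_delegate.so), while the older 2.4.0-era images used the NNAPI delegate instead.

    import numpy as np
    # tflite_runtime ships in the NXP eIQ Yocto images; package name may differ by release
    from tflite_runtime.interpreter import Interpreter, load_delegate

    # Assumed paths, for illustration only: a quantized MobileNet from the BSP example
    # directory and the VX delegate library used by newer eIQ releases for NPU offload.
    MODEL = "/usr/bin/tensorflow-lite-2.4.0/examples/mobilenet_v1_1.0_224_quant.tflite"
    DELEGATE = "/usr/lib/libvx_delegate.so"

    interpreter = Interpreter(model_path=MODEL,
                              experimental_delegates=[load_delegate(DELEGATE)])
    interpreter.allocate_tensors()

    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    # Feed a random uint8 image just to exercise the NPU inference path
    data = np.random.randint(0, 256, size=inp["shape"], dtype=np.uint8)
    interpreter.set_tensor(inp["index"], data)
    interpreter.invoke()

    print("top class:", int(np.argmax(interpreter.get_tensor(out["index"]))))

Note that the first invocation typically includes a one-time warm-up cost while the graph is compiled for the NPU, so timing measurements usually discard the first run.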
More information can be found in the following eIQ resources:
- eIQ Edge Intelligence Starter PDF: https://www.nxp.com/docs/en/fact-sheet/EIQ-FS.pdf
- eIQ Edge Intelligence Community Page: https://community.nxp.com/t5/eIQ-Machine-Learning-Software/bd-p/eiq