The NXP® eIQ Machine Learning Software Development Environment enables the use of ML algorithms on NXP MCUs, i.MX RT crossover MCUs, and i.MX family SoCs. eIQ software includes inference engines, neural network compilers and optimized libraries. This software leverages open-source technologies and is fully integrated into our MCUXpresso SDK and Yocto development environments, allowing you to develop complete system-level applications with ease.

The NXP® eIQ (“edge intelligence”) ML software environment provides the key building blocks for running inference with neural network (NN) models on embedded systems and deploying ML algorithms on NXP microprocessors and microcontrollers at the edge. It includes inference engines, NN compilers, libraries, and hardware abstraction layers that support Google TensorFlow Lite, Glow, Arm® NN, Arm CMSIS-NN, and OpenCV.
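As an illustration of how one of these inference engines is typically used, the minimal sketch below runs a quantized model with the TensorFlow Lite runtime from Python. The model file name is a placeholder, and the tflite_runtime package is assumed to be available (for example, on an eIQ-enabled Yocto image); it is not a definitive eIQ API, just the standard TensorFlow Lite interpreter flow.

import numpy as np
import tflite_runtime.interpreter as tflite

# Load a quantized classification model (hypothetical file name).
interpreter = tflite.Interpreter(model_path="mobilenet_v1_quant.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed a dummy input tensor with the shape and dtype the model expects.
shape = input_details[0]["shape"]
dummy = np.zeros(shape, dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)

# Run inference and read back the output tensor.
interpreter.invoke()
scores = interpreter.get_tensor(output_details[0]["index"])
print("Top class:", int(np.argmax(scores)))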

With NXP’s i.MX applications processors and i.MX RT crossover processors, based on Arm Cortex®-A and Cortex-M cores respectively, embedded designs can now support deep learning applications that require high-performance data analytics and fast inference.

eIQ software includes a variety of application examples that demonstrate how to integrate neural networks into voice, vision, and sensor applications. Developers can choose to deploy their ML applications on Arm Cortex-A cores, Cortex-M cores, or GPUs, or, for high-end acceleration, on the neural processing unit (NPU) of the i.MX 8M Plus.
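For the NPU path in particular, a common approach with TensorFlow Lite is to hand the graph to a hardware delegate. The sketch below assumes a delegate library path of /usr/lib/libvx_delegate.so and a placeholder model file; the exact library name and location can differ between BSP releases, so treat this as an illustration and check the eIQ documentation for your image.

import numpy as np
import tflite_runtime.interpreter as tflite

# Load the NPU delegate (path is an assumption; verify it on your BSP).
npu_delegate = tflite.load_delegate("/usr/lib/libvx_delegate.so")

interpreter = tflite.Interpreter(
    model_path="mobilenet_v1_quant.tflite",   # hypothetical model file
    experimental_delegates=[npu_delegate],
)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Run one inference on dummy data; supported ops execute on the NPU.
interpreter.set_tensor(inp["index"], np.zeros(inp["shape"], dtype=inp["dtype"]))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]).shape)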

eIQ-enabled Devices

  • i.MX RT1050
  • i.MX RT1060
  • i.MX RT1170
  • i.MX RT600
  • i.MX 8M Plus
  • i.MX 8M
  • i.MX 8M Nano
  • i.MX 8M Mini
  • i.MX 8
  • i.MX 8X

