§ Robotics
See, learn, move — on one SoM.
ROS 2 perception on the NPU, real-time motion control on the R-cores, deterministic IPC across both.
§ Why edge AI
The case for running it on the asset.
Robots have always lived in two minds: a real-time controller for joints + motion, and a Linux brain for perception. The E1M-X V2H+M1 puts both on the same module — Yocto Linux on the A55 cluster, Zephyr on the Cortex-R8 real-time cores, deterministic shared-memory IPC between them. One firmware project. One pinout. One SDK.
§ The constraint
Latency-sensitive perception (collision avoidance, manipulation) can't afford a round trip to the cloud. And determinism matters: motors don't care about your 99th percentile, only your worst case.
§ Recommended modules
Pick the AI compute that fits.
§ Reference examples from the Alp SDK
Code you can fork and ship.
Every example is real C/C++ in the open-source alp-sdk repo. Clone it, change the SKU in board.yaml, and build.
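The exact board.yaml schema isn't reproduced here; as a rough illustration of the "one file retargets the build" idea, a fragment might look like this (field names and the SKU string are hypothetical, not the documented alp-sdk schema):

```yaml
# Hypothetical board.yaml fragment -- keys and values are illustrative.
board:
  sku: E1M-X-V2H-M1      # swap this line to retarget another module
  linux:
    distro: yocto
    cores: a55
  rtos:
    os: zephyr
    cores: r8
```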
v2n-m1-ros-perception
ROS 2 perception pipeline
ROS 2 node with camera + sensor topics, AI perception on the DX-M1 NPU.
mproc-mailbox
Multi-processor mailbox IPC
Deterministic shared-memory IPC between A55 cluster and Cortex-R8 real-time cores.
mproc-dual-os-yocto-zephyr
Yocto + Zephyr dual-OS
Linux on the A55 cluster, Zephyr on the M33 — both shipped from one board.yaml.
§ Built for this vertical
Why E1M wins here.
- Deterministic dual-OS architecture (Yocto + Zephyr) on a single module
- Up to 8 cameras via 2× 4-lane MIPI CSI-2
- CAN-FD for industrial actuator buses, EtherCAT-capable PHYs
- DRP-AI3 + DX-M1 = 29–33 TOPS dense for ROS 2 perception