§ Drones & UAVs
Autonomous flight, on-device intelligence.
Real-time AI processing for autonomous flight, object detection, and mission planning — no cloud round-trip.
§ Why edge AI
The case for running it on the asset.
Flight controllers and AI cameras shouldn't be separate boxes. The E1M-X family runs perception, mission planning, and motor control on the same module — with hard real-time guarantees from the Cortex-R8 cores and 29–33 TOPS of NPU compute for live vision. The drone industry is moving to AI-native autopilots. We make that the default.
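A minimal sketch of what that split looks like in code: a fixed-rate control loop that would be pinned to the real-time cores, and a frame-rate perception loop feeding the NPU. Plain POSIX threads stand in for the module's RTOS here, and the stub functions are placeholders, not Alp SDK or NPU runtime APIs.

```c
/* Sketch of the on-module task split. Function names are placeholders, not real SDK calls. */
#include <pthread.h>
#include <stdbool.h>
#include <unistd.h>

/* Stubs standing in for real drivers and the NPU runtime. */
static void read_imu_and_fuse(void)    { /* IMU read + attitude estimate */ }
static void update_motor_outputs(void) { /* control law -> ESC commands */ }
static bool grab_camera_frame(void)    { return true; }
static void run_npu_detector(void)     { /* e.g. object detection on the NPU */ }

/* Hard real-time flight loop: would run on the Cortex-R8 cores at ~1 kHz. */
static void *control_loop(void *arg)
{
    (void)arg;
    for (;;) {
        read_imu_and_fuse();
        update_motor_outputs();
        usleep(1000);            /* 1 ms period; an RTOS timer in real firmware */
    }
    return NULL;
}

/* Best-effort perception loop: camera frames in, NPU inferences out (~30 FPS). */
static void *perception_loop(void *arg)
{
    (void)arg;
    for (;;) {
        if (grab_camera_frame())
            run_npu_detector();
    }
    return NULL;
}

int main(void)
{
    pthread_t ctrl, percep;
    pthread_create(&ctrl, NULL, control_loop, NULL);
    pthread_create(&percep, NULL, perception_loop, NULL);
    pthread_join(ctrl, NULL);    /* both loops run until the firmware shuts down */
    return 0;
}
```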
§ The constraint
No cellular coverage at 400 m AGL. No cloud round-trip budget when the drone is 200 ms from a power line. AI runs on the airframe or it doesn't run.
§ Recommended modules
Pick the AI compute that fits.
§ Reference examples from the Alp SDK
Code you can fork and ship.
Every example is real C/C++ in the open-source alp-sdk repo. Clone it, change the SKU in board.yaml, and build.
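For orientation, a sketch of what that board.yaml change might look like. The key names below are assumptions made for illustration; the actual schema is defined in the alp-sdk repo.

```yaml
# board.yaml — illustrative only; key names are assumptions, check the repo for the real schema
board:
  family: E1M-X            # module family covered on this page
  sku: <your-module-sku>   # swap in the SKU of the module you picked above
```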
drone-autopilot
Drone autopilot
Autonomous flight control — sensor fusion + path planning on the M33, perception on the NPU.
View on GitHub ↗
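The fusion step in an autopilot like this can be as small as a complementary filter that blends the integrated gyro rate with the accelerometer's absolute angle. A generic single-axis sketch of that update, not code taken from the drone-autopilot repo:

```c
/* One-axis complementary filter: alpha near 1 trusts the gyro short-term,
 * while the accelerometer term pulls the estimate back over time. */
typedef struct {
    float angle;   /* fused angle estimate, radians */
    float alpha;   /* gyro/accel blend factor, e.g. 0.98f */
} comp_filter_t;

static void comp_filter_update(comp_filter_t *f,
                               float gyro_rate,   /* rad/s from the gyro */
                               float accel_angle, /* rad from the accelerometer */
                               float dt)          /* loop period, s */
{
    float gyro_angle = f->angle + gyro_rate * dt;          /* integrate the rate */
    f->angle = f->alpha * gyro_angle
             + (1.0f - f->alpha) * accel_angle;            /* blend in the absolute angle */
}
```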
drone-hud
Drone HUD
Heads-up display for the operator: live telemetry + AI inference overlays.
View on GitHub ↗
ai-object-detection-realtime
Real-time object detection
YOLOv8-tiny on the NPU — bounding boxes + live FPS counter at 30 FPS.
View on GitHub ↗
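The live FPS counter in that example boils down to timestamping each inference and reporting once per second. A sketch of that loop, with stubs in place of the camera capture and NPU detector calls (the real API lives in the alp-sdk repo):

```c
#include <stdio.h>
#include <stdbool.h>
#include <time.h>

static bool get_frame(void)      { return true; }  /* stub: camera capture */
static void detect_objects(void) { }               /* stub: NPU inference, e.g. the YOLO detector */

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void)
{
    int frames = 0;
    double window_start = now_seconds();

    while (get_frame()) {
        detect_objects();
        frames++;

        double t = now_seconds();
        if (t - window_start >= 1.0) {             /* report once per second */
            printf("%.1f FPS\n", frames / (t - window_start));
            frames = 0;
            window_start = t;
        }
    }
    return 0;
}
```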
§ Built for this vertical
Why E1M wins here.
- Cortex-R8 hard-real-time cores for the flight control loop (V2H)
- 2× 4-lane MIPI CSI-2 — up to 8 cameras for stereo and 360° vision
- Wi-Fi 6 + Bluetooth 5 telemetry, USB 3.2 to the ground station
- IP67-ready when paired with a sealed carrier board