ModelNova Introduces Native ExecuTorch Support for Arm Ethos

From News Desk

embedUR systems, a Silicon Valley–based embedded software and Edge AI engineering company, has announced a significant expansion of Arm ecosystem support within ModelNova Fusion Studio, its desktop application for end-to-end Edge AI development. The announcement, made at Embedded World 2026 (Hall 4, Booth 600), introduces Fusion Studio’s deep integration with Arm’s most widely adopted Edge AI technologies.

Edge AI development spans a diverse landscape of silicon platforms, software frameworks and toolchains. Developers building on these platforms have historically faced a fragmented workflow — assembling separate model conversion utilities, platform-specific compilers, quantisation scripts and deployment tools before they can bring an AI workload to production.
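As a rough illustration, the fragmented pipeline described above might look something like the following. All script names, file names and flag values here are hypothetical stand-ins for vendor-specific tooling; only the Vela invocation pattern follows the real `ethos-u-vela` command-line compiler.

```shell
# Illustrative sketch of a manual Edge AI pipeline; scripts and paths
# are hypothetical placeholders, not Fusion Studio internals.

# 1. Convert the trained model with a framework-specific exporter
python export_model.py --checkpoint trained.ckpt --out model.tflite

# 2. Apply post-training quantisation with a separate utility
python quantise.py --input model.tflite --out model_int8.tflite

# 3. Compile for the target NPU with Arm's Vela compiler
vela model_int8.tflite --accelerator-config ethos-u55-128 --output-dir build/

# 4. Link the compiled command stream into firmware and flash the board
make -C firmware flash MODEL=build/model_int8_vela.tflite
```

Each stage typically comes from a different vendor or open-source project, with its own dependencies and versioning, which is the integration burden the article describes.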

Fusion Studio is said to address this fragmentation directly. The platform integrates support for Arm Ethos NPU architectures, toolchains and frameworks throughout the development lifecycle, from model training and optimisation through to on-device deployment.

Arm-Native Capabilities in Fusion Studio

The latest release of Fusion Studio introduces several capabilities purpose-built for the Arm Edge AI ecosystem:

  • ExecuTorch Support for Arm Ethos-U85 and Ethos-U55: Native support for ExecuTorch-optimised models targeting Arm’s Ethos-U85 and Ethos-U55 NPUs, compiled through the Vela toolchain. Developers can train, quantise and deploy NPU-optimised models directly from the desktop application without managing command-line conversion tools or framework dependencies.
  • Ensemble Series Platform Integration: Full agent and model integration with the Alif Semiconductor Ensemble Series development platforms and sensors. Developers can deploy directly to Ensemble Development Kits from within Fusion Studio, with live inference playback and real-time camera capture on target hardware.
  • Arm SDS Framework Integration: Integration with Arm’s Synchronous Data Stream (SDS) Framework enables dataset capture, playback and inference workflows directly from Arm-based hardware targets, streamlining the data pipeline from sensor to trained model.
  • Arm Keil MDK Integration: Support for Arm’s Keil MDK toolchain, including Fixed Virtual Platform (FVP) simulation and direct Ensemble DevKit deployment, enabling developers to build, test and iterate without leaving the Fusion Studio environment.
  • In-Tool MLOps End-to-End Flow: A complete MLOps pipeline, from dataset capture and annotation, through model training with integrated ExecuTorch and Arm KleidiAI libraries, to firmware build and direct deployment on Arm-based targets, all executed locally with no cloud compute costs.

Expanding on Fusion Studio

The new Arm-native capabilities build on a platform that has been available in beta since 2025. Fusion Studio currently offers a library of over 160 pre-trained and optimised AI models spanning computer vision, audio and text generation, with 21 industry-specific starter packs designed to accelerate proof-of-concept development. The platform supports deployment to Raspberry Pi 4 and 5 with integrated camera support and real-time inference, and runs on PC (CPU/CUDA) and Apple Silicon macOS host machines.

Recent beta releases introduced LLM-powered AI Assistance that provides contextual guidance across model selection, dataset evaluation, training configuration, platform optimisation and troubleshooting, along with training checkpoint and continuation support for interrupted workflows.