On October 15, 2025, Google Research announced Coral NPU, an open-source neural processing unit platform for edge AI. The announcement includes initial documentation and tools for developers.
Key details
- Full-stack platform for low-power, on-device inference co-designed by Google Research and Google DeepMind. Documentation and SDKs are available on the developer portal.
- Built on RISC-V ISA-compliant architectural IP for energy-efficient, always-on inference.
- Base design targets about 512 giga operations per second (GOPS) while consuming only a few milliwatts, according to Google.
- Core components include a C-programmable RISC-V scalar core, an RVV v1.0 vector unit, and a matrix execution unit.
- The matrix execution unit is under development and will be released on GitHub later this year.
- Toolchain integrates IREE, TFLite Micro (TFLM), MLIR, a C compiler, custom kernels, and a simulator.
- Models from TensorFlow, JAX, and PyTorch can be compiled via StableHLO and IREE workflows; see the sketch after this list.
- Co-design priorities include encoder-based vision and audio models, plus collaboration with the Gemma team on support for small transformer models.
- Target devices include wearables, hearables, AR glasses, smartwatches, and IoT systems focused on ambient sensing.
- Security roadmap includes planned support for CHERI-based memory safety and compartmentalization.
- Synaptics announced its Astra SL2610 AI-Native IoT processors, which feature the Torq NPU, the first production implementation of the Coral NPU architecture.
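The following is a minimal sketch of the StableHLO-to-IREE path referenced above, assuming the stock IREE Python bindings (iree.compiler, iree.runtime), JAX's standard lowering API, and the generic "llvm-cpu" backend as a stand-in target; the announcement does not specify the Coral NPU backend name or compiler flags, and the toy model is purely illustrative.

```python
# Minimal sketch of the StableHLO -> IREE compile-and-run path, using the
# generic IREE Python bindings. The "llvm-cpu" backend and the toy model
# below are placeholders; the actual Coral NPU target name and flags are
# not described in the announcement.
import jax
import jax.numpy as jnp
import numpy as np
from iree import compiler as ireec
from iree import runtime as ireert


def model(x):
    # Toy stand-in for an encoder-style workload.
    return jnp.tanh(x @ x.T)


x = jnp.ones((4, 4), dtype=jnp.float32)

# 1. Lower the JAX function to StableHLO MLIR text.
stablehlo_text = jax.jit(model).lower(x).as_text()

# 2. Compile with IREE, declaring the input dialect as StableHLO.
#    "llvm-cpu" is a generic CPU backend used here as a stand-in target.
vmfb = ireec.compile_str(
    stablehlo_text,
    input_type="stablehlo",
    target_backends=["llvm-cpu"],
)

# 3. Load the compiled module into the IREE runtime and invoke it.
config = ireert.Config("local-task")
ctx = ireert.SystemContext(config=config)
vm_module = ireert.VmModule.copy_buffer(ctx.instance, vmfb)
ctx.add_vm_module(vm_module)
result = ctx.modules.module["main"](np.asarray(x))
print(np.asarray(result))
```

A PyTorch or TensorFlow model would enter the same pipeline once exported to StableHLO (or via the TFLite/TFLM path), and a Coral NPU build would substitute its own target backend for the placeholder used here.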
Background
Coral NPU builds on Google's earlier Coral project for edge AI. Google positions Coral NPU as an AI-first architecture that prioritizes the matrix engine over scalar compute and simplifies model deployment across diverse hardware through a unified software stack.
Google has published an architecture guide detailing the scalar, vector, and matrix units. The vector unit conforms to the RISC-V Vector v1.0 specification, and the matrix execution unit is slated for release on the Coral NPU GitHub repository later this year.
First production silicon
At its Tech Day, Synaptics introduced the Astra SL2610 product line and identified Torq NPU as the first production implementation of the Coral NPU architecture. According to Synaptics, the Torq Edge AI platform uses an open-source compiler and runtime built on IREE and MLIR.