Raspberry Pi Launches AI HAT+ 2 With Hailo‑10H and 8GB RAM for On‑Device GenAI

Early reviews highlight working on‑device models, memory limits, and power‑capped NPU performance.

Overview

  • The $130 add‑on is available now for Raspberry Pi 5 and launches with Llama 3.2, DeepSeek‑R1‑Distill, and Qwen 2/2.5 variants at roughly 1.5 billion parameters.
  • The board delivers a Hailo‑10H accelerator rated at 40 TOPS (INT4) with 8GB onboard RAM, connects over PCIe, integrates with Raspberry Pi’s camera stack, and ships with a recommended heatsink.
  • Independent testing found the Pi 5’s CPU generally outperformed the Hailo‑10H on supported LLMs, with the NPU constrained to about 3W versus up to 10W for the Pi SoC.
  • Reviewers reported smooth single‑model demos using hailo‑ollama (see the sketch after this list), yet noted early software rough edges and instability when attempting simultaneous vision and LLM workloads.
  • Coverage positions the HAT+ 2 for niche or development use where private, low‑latency inference is needed, as it cannot match larger cloud models and may be less flexible than a 16GB Pi for bigger local LLMs.
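
For readers curious what the hailo‑ollama demos mentioned above look like in practice, here is a minimal sketch of a single‑model text query, assuming hailo‑ollama exposes an Ollama‑compatible /api/generate endpoint on the default local port; the endpoint URL and model tag are placeholders, not details confirmed by the coverage.

```python
import json
import urllib.request

# Assumption: hailo-ollama serves an Ollama-compatible HTTP API on the
# default Ollama port. The endpoint URL and model tag below are
# illustrative placeholders, not confirmed values.
ENDPOINT = "http://localhost:11434/api/generate"

payload = {
    "model": "llama3.2",  # hypothetical tag for one of the ~1.5B on-device models
    "prompt": "Summarize what an on-device NPU add-on does in one sentence.",
    "stream": False,      # ask for a single JSON reply instead of a token stream
}

request = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Local inference on a power-capped NPU can be slow, so allow a generous timeout.
with urllib.request.urlopen(request, timeout=120) as response:
    body = json.load(response)

print(body.get("response", ""))
```

In Ollama's standard API the reply arrives as a JSON object whose "response" field holds the generated text; whether hailo‑ollama mirrors that behavior exactly is an assumption here.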