Want some AI with your Pi? Raspberry Pi 5 gets an LLM upgrade with new AI
HAT+ 2 - but will it be powerful enough?
Date:
Tue, 20 Jan 2026 19:35:00 +0000
Description:
Raspberry Pi AI HAT+ 2 enables local LLM and VLM workloads with 40 TOPS, 8 GB memory, and PCIe connectivity.
FULL STORY
Raspberry Pi has expanded its edge computing ambitions with the release of
the AI HAT+ 2, an add-on board designed to bring generative AI workloads
onto the Raspberry Pi 5.
Earlier AI HAT hardware focused almost entirely on computer vision acceleration, handling tasks such as object detection and scene segmentation.
The new board broadens that scope by supporting large language models (LLMs) and vision language models (VLMs) that run locally, without relying on cloud infrastructure or persistent network access.
Hardware changes that enable local language models
At the center of the upgrade is the Hailo-10H neural network accelerator,
which delivers 40 TOPS of INT4 inference performance.
Unlike its predecessor, the AI HAT+ 2 features 8 GB of dedicated onboard
memory, enabling larger models to run without consuming system RAM on the
host Raspberry Pi.
This change allows direct execution of LLMs and VLMs on the device while keeping latency low and data local, both key requirements for many
edge deployments.
Using a standard Raspberry Pi distro, users can install supported models and access them through familiar interfaces such as browser-based chat tools.
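Those browser-based chat tools typically sit in front of a local HTTP inference server. As a minimal sketch of driving such a backend directly, assuming it exposes an OpenAI-compatible /v1/chat/completions endpoint on the Pi (the port, path, and model name below are illustrative assumptions, not details confirmed by Raspberry Pi or Hailo):

  import requests

  # Hypothetical local endpoint; many local inference servers expose an
  # OpenAI-compatible HTTP API, but the real port and path depend on the
  # backend packaged for the AI HAT+ 2.
  ENDPOINT = "http://localhost:8000/v1/chat/completions"

  payload = {
      "model": "qwen2-1.5b-instruct",  # illustrative; the article cites Qwen2-class models
      "messages": [
          {"role": "user", "content": "Summarise the Raspberry Pi 5 in one sentence."}
      ],
      "max_tokens": 128,
  }

  # Inference runs on the Pi itself; this is only a local HTTP round trip.
  response = requests.post(ENDPOINT, json=payload, timeout=120)
  response.raise_for_status()
  print(response.json()["choices"][0]["message"]["content"])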
The AI HAT+ 2 connects to the Raspberry Pi 5 through the GPIO header and
relies on the system's PCIe interface for data transfer, which rules out compatibility with the Raspberry Pi 4.
This connection supports high-bandwidth data transfer between the accelerator and the host, which is essential for moving model inputs, outputs, and camera data efficiently.
Demonstrations include text-based question answering with Qwen2, code generation using Qwen2.5-Coder, basic translation tasks, and visual scene descriptions from live camera feeds.
These workloads rely on AI tools packaged to work within the Pi software
stack, including containerized backends and local inference servers.
All processing occurs on the device, without external compute resources.
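The live-camera demonstration follows the same pattern with an image attached to the request. The sketch below is speculative on two points: it assumes the packaged backend accepts OpenAI-style multimodal messages with a base64 data URI, and it uses the picamera2 library to grab a frame; neither detail is confirmed by the article.

  import base64
  import requests
  from picamera2 import Picamera2

  ENDPOINT = "http://localhost:8000/v1/chat/completions"  # hypothetical local server

  # Capture a single frame from the Pi camera to a JPEG file.
  camera = Picamera2()
  camera.start()
  camera.capture_file("frame.jpg")
  camera.stop()

  # Encode the frame as a base64 data URI so it can travel inside the JSON payload.
  with open("frame.jpg", "rb") as f:
      image_uri = "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()

  payload = {
      "model": "qwen2-vl-2b-instruct",  # illustrative VLM name, not confirmed
      "messages": [{
          "role": "user",
          "content": [
              {"type": "text", "text": "Describe what the camera is looking at."},
              {"type": "image_url", "image_url": {"url": image_uri}},
          ],
      }],
      "max_tokens": 128,
  }

  response = requests.post(ENDPOINT, json=payload, timeout=120)
  response.raise_for_status()
  print(response.json()["choices"][0]["message"]["content"])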
The supported models range from one to one and a half billion parameters,
which is modest compared to cloud-based systems that operate at far larger scales.
These smaller LLMs target limited memory and power envelopes rather than
broad, general-purpose knowledge.
To address this constraint, the AI HAT+ 2 supports fine-tuning methods such
as Low-Rank Adaptation, which allows developers to customize models for
narrow tasks while keeping most parameters unchanged.
Vision models can also be retrained using application-specific datasets
through Hailo's toolchain.
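Low-Rank Adaptation itself is normally run on a development machine rather than on the HAT. As a hedged illustration of the idea, small trainable adapter matrices on top of a frozen base model, the sketch below uses the Hugging Face transformers and peft libraries with a Qwen2-class checkpoint; compiling the adapted model for the Hailo-10H would go through Hailo's own toolchain and is not shown here.

  from transformers import AutoModelForCausalLM, AutoTokenizer
  from peft import LoraConfig, get_peft_model

  # Base checkpoint is illustrative; the article mentions models of roughly 1-1.5B parameters.
  base_id = "Qwen/Qwen2-1.5B-Instruct"
  tokenizer = AutoTokenizer.from_pretrained(base_id)
  model = AutoModelForCausalLM.from_pretrained(base_id)

  # LoRA: freeze the base weights and train small rank-decomposition adapters
  # on selected projection layers, so most parameters stay unchanged.
  lora_config = LoraConfig(
      r=8,                     # adapter rank
      lora_alpha=16,           # scaling factor
      lora_dropout=0.05,
      target_modules=["q_proj", "v_proj"],  # common choice for transformer blocks
      task_type="CAUSAL_LM",
  )

  model = get_peft_model(model, lora_config)
  model.print_trainable_parameters()  # typically well under 1% of the base model

  # Training on a narrow, task-specific dataset would follow (for example with a
  # standard Trainer loop); deployment to the accelerator then goes through
  # Hailo's toolchain, which is outside this sketch.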
The AI HAT+ 2 is available for $130, placing it above earlier vision-focused accessories while offering similar computer vision throughput.
For workloads centered solely on image processing, the upgrade offers limited gains, as its appeal rests largely on local LLM execution and
privacy-sensitive applications.
In practical terms, the board shows that generative AI on Raspberry Pi hardware is now feasible, although limited memory headroom and small model sizes remain constraints.
======================================================================
Link to news story:
https://www.techradar.com/pro/want-some-ai-with-your-pi-raspberry-pi-5-gets-an-llm-upgrade-with-new-ai-hat-2-but-will-it-be-powerful-enough
--- SBBSecho 3.28-Linux
* Origin: Capitol City Online (42:17/1)