Firefly Launches AIBOX-K3 Edge AI Mini PC with SpacemiT K3 RISC-V SoC
Key Takeaways
- The AIBOX-K3 enables practical local LLM inference on edge devices, running models of up to 30 billion parameters at more than 10 tokens per second on 60 TOPS of AI compute
- Built on the open RISC-V ISA, the device provides an open-ISA alternative to proprietary ARM-based edge AI solutions, with ecosystem support spanning multiple Linux distributions and containerization tools
- An industrial-grade design with hardware security features (encryption engines, memory/I/O protection) and a wide operating temperature range positions the AIBOX-K3 for enterprise and industrial edge computing deployments
Summary
Firefly has officially launched the AIBOX-K3, an industrial-grade edge AI mini PC powered by SpacemiT's K3 RISC-V SoC. The device features an octa-core RISC-V X100 processor running at up to 2.4 GHz and an integrated AI engine with eight dedicated RISC-V AI cores, delivering up to 60 TOPS (tera operations per second) of INT4 compute performance. According to Firefly, the AIBOX-K3 can run large language models with up to 30 billion parameters locally, achieving inference speeds exceeding 10 tokens per second, which makes it suitable for edge AI and local LLM inference applications without cloud dependency.
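The quoted figures pass a quick plausibility check. The sketch below is a back-of-envelope estimate, not anything from Firefly's documentation: it assumes roughly 2 operations per model parameter per decoded token and that the NPU sustains its full rated INT4 throughput, giving a compute-bound ceiling on decode speed.

```python
# Back-of-envelope: compute-bound upper limit on LLM decode throughput.
# Assumptions (not from Firefly's spec sheet): autoregressive decoding costs
# ~2 ops per model parameter per token, and the NPU sustains its rated rate.

TOPS = 60e12                 # rated INT4 compute, operations/second
PARAMS = 30e9                # 30-billion-parameter model
OPS_PER_TOKEN = 2 * PARAMS   # ~2 ops per parameter per decoded token

compute_bound_tps = TOPS / OPS_PER_TOKEN
print(f"compute-bound ceiling: {compute_bound_tps:.0f} tokens/s")  # 1000 tokens/s
```

The ceiling comes out around 1,000 tokens per second, far above the roughly 10 tokens per second Firefly quotes, which is consistent with LLM decoding on such devices typically being limited by memory bandwidth rather than raw compute.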
The device comes in multiple configurations with up to 32GB of LPDDR5 memory and 512GB of UFS 2.2 on-board storage, plus an M.2 socket for NVMe SSD expansion. It features dual gigabit Ethernet ports, USB 3.0 connectivity, HDMI 2.0 display output, and an industrial-grade aluminum enclosure designed for reliable operation in temperatures ranging from -20°C to 60°C. Security features include hardware-level memory and I/O protection (PMP/ePMP, IOPMP), plus built-in encryption engines supporting AES, SHA, RSA, and SM2/SM3/SM4 standards.
Firefly provides flexible operating system support, with the K3 platform compatible with multiple open-source operating systems including Ubuntu 26.04 (developed with Canonical), Bianbu OS 3.0, OpenHarmony, and others. Developer tools including Docker, KVM virtualization, and ROS are available, with Linux 7.0 mainline support already integrated. The AIBOX-K3 is priced competitively at $349 for the base 8GB/128GB configuration and $689 for the fully configured 32GB variant.
That pricing range ($349–$689) brings edge AI inference capability within reach of mid-market organizations, addressing the growing market segment between consumer IoT devices and cloud-based AI services.
Editorial Opinion
The AIBOX-K3 represents a compelling intersection of open-source hardware (RISC-V) and practical edge AI capability at an accessible price point. By enabling local inference of models up to 30B parameters without cloud dependency, Firefly addresses critical pain points around latency, privacy, and operational cost for industrial and enterprise deployments. However, success will ultimately depend on software ecosystem maturity; RISC-V's growing data center traction is promising, but the narrower software library compared to ARM remains a consideration for potential customers.