
US deep-tech startup Tiiny AI revealed the Tiiny AI Pocket Lab, which has been officially verified by Guinness World Records as the world’s smallest personal AI supercomputer.
Roughly the size of a power bank, the pocket-sized device can run LLMs of up to 120 billion parameters locally, without relying on cloud connectivity, servers, or high-end GPUs.
With this device, Tiiny AI aims to reduce AI's dependence on the cloud and on GPUs, while putting data-center-level computing power in the hands of everyday users. Launched on December 10, the AI Pocket Lab also positions itself as an alternative to cloud-based AI infrastructure and the sustainability concerns, rising energy costs, and privacy risks that come with it.
“Cloud AI has brought remarkable progress, but it also created dependency, vulnerability, and sustainability challenges,” said Samar Bhoj, GTM Director of Tiiny AI, according to the official press release.
“With Tiiny AI Pocket Lab, we believe intelligence shouldn’t belong to data centers, but to people. This is the first step toward making advanced AI truly accessible, private, and personal, by bringing the power of large models from the cloud to every individual device,” he continued.
A myriad of use cases
The AI Pocket Lab is designed to cover a wide range of personal AI use cases and to serve a variety of users, including creators, developers, researchers, and students.
It supports multi-step reasoning, deep context understanding, agent workflows, content generation, and secure processing of sensitive information, all without relying on the internet.
The device stores user data, preferences, and documents locally using bank-level encryption, giving it long-term memory and stronger privacy than cloud-based AI systems.
The Tiiny AI Pocket Lab targets the most useful range of personal AI, running models of between 10B and 100B parameters, a range the company says covers over 80 percent of real-world tasks.
It can even scale up to 120B-parameter models, offering what the company describes as GPT-4-level intelligence for complex reasoning and multi-step analysis, all while keeping data fully offline and secure on the device.
The empowering tech specs
Equipped with a 12-core ARMv9.2 CPU, the device operates within a 65W power envelope, delivering large-model performance at a fraction of the energy and carbon footprint of traditional GPU-reliant systems.
The Tiiny AI Pocket Lab relies on two key technologies that allow large AI models to run on a small device. The first is TurboSparse, which boosts efficiency by activating only the neurons required for a given input, without reducing model intelligence.
The second is PowerInfer, an open-source inference engine with over 8,000 GitHub stars that spreads AI workloads across the CPU and NPU to improve performance while using far less power.
Together, these advances enable the Pocket Lab to deliver GPU-level AI performance in a compact, low-power form factor.
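To make the sparse-activation idea concrete, here is a minimal Python sketch of the general principle behind approaches like TurboSparse and PowerInfer: only the neurons expected to fire for a given input are computed. It is an illustration only; the layer sizes, the ReLU activation, and the way active neurons are selected are assumptions for the example, not details of Tiiny AI's implementation.

import numpy as np

# Toy feed-forward layer: only neurons predicted to be active are computed.
# Sizes, activation, and selection logic are illustrative assumptions.
rng = np.random.default_rng(0)
d_model, d_ffn = 512, 2048
W_up = rng.standard_normal((d_model, d_ffn)) * 0.02
W_down = rng.standard_normal((d_ffn, d_model)) * 0.02

def ffn_dense(x):
    # Standard path: every one of the d_ffn neurons is computed.
    return np.maximum(x @ W_up, 0.0) @ W_down

def ffn_sparse(x, active_idx):
    # Sparse path: only the weights for active neurons are touched.
    h = np.maximum(x @ W_up[:, active_idx], 0.0)
    return h @ W_down[active_idx, :]

x = rng.standard_normal(d_model)

# A real system trains a small predictor to guess the active set;
# here we simply peek at the dense activations to pick the "hot" neurons.
active_idx = np.nonzero(np.maximum(x @ W_up, 0.0) > 0.0)[0]

print("active neurons:", len(active_idx), "of", d_ffn)
print("max output difference:", np.abs(ffn_dense(x) - ffn_sparse(x, active_idx)).max())

With a ReLU-style activation, skipping the inactive neurons does not change the output at all, which is why this kind of sparsity can cut compute and memory traffic without hurting model quality.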
The open-source ecosystem
Tiiny AI also provides a ready-to-use open-source ecosystem. The Pocket Lab supports one-click installation of popular open-source models such as Llama, Qwen, DeepSeek, Mistral, Phi, and GPT-OSS, as well as easy setup of AI tools and agent frameworks such as OpenManus, ComfyUI, Flowise, and SillyTavern.
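The Pocket Lab's own one-click installer has not been publicly documented, but as a rough illustration of what running one of these open models locally and offline looks like in general, here is a minimal sketch using the open-source llama-cpp-python bindings; the model file, context size, and thread count are hypothetical placeholders, not Tiiny AI's actual tooling.

from llama_cpp import Llama

# Load a quantized open model from a local file and run it fully offline.
# The path and settings below are placeholders for illustration only.
llm = Llama(
    model_path="./models/llama-3-8b-instruct.Q4_K_M.gguf",
    n_ctx=4096,      # context window
    n_threads=12,    # e.g. one thread per CPU core
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize my meeting notes in three bullet points."}],
    max_tokens=256,
)
print(reply["choices"][0]["message"]["content"])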
Users will also get regular updates, including official over-the-air hardware upgrades. These features are set to launch at CES in January 2026.