Tiiny AI Pocket Lab puts a 120-billion-parameter Artificial Intelligence model in a pocket-sized PC

US startup Tiiny AI Inc has unveiled Tiiny AI Pocket Lab, a Guinness World Records-certified mini PC that can locally run up to a full 120-billion-parameter large language model without relying on the cloud. The device targets energy-efficient, private Artificial Intelligence computing for individuals and low-resource environments.

US deep-tech startup Tiiny AI Inc has introduced Tiiny AI Pocket Lab, a pocket-sized personal Artificial Intelligence supercomputer that the company says can run up to a full 120-billion-parameter large language model entirely on-device. The device has been officially verified by Guinness World Records in the category “The Smallest MiniPC (100 LLM Locally)”, highlighting its focus on local model execution instead of cloud-based processing. Tiiny AI positions the Pocket Lab as a way to bring large-scale Artificial Intelligence to the edge, making advanced systems more personal, accessible and integrated into daily life.

Tiiny AI Pocket Lab is described as a compact inference system that can run an LLM with as many as 120 billion parameters. The unit measures about 14.2 x 8 x 2.53 cm, weighs around 300 grams, and operates within a 65W power envelope, aiming to deliver large-model performance at significantly lower energy consumption than conventional GPU-backed Artificial Intelligence setups. The hardware is built around an ARMv9.2 12-core CPU and a dedicated neural processing unit capable of delivering about 190 TOPS of Artificial Intelligence compute, backed by 80GB of LPDDR5X memory and 1TB of storage. According to the company, Tiiny AI Pocket Lab operates in the “golden zone” of personal Artificial Intelligence (10B-100B parameters), which it claims is ideal for more than 80 per cent of real-world needs and reportedly offers intelligence comparable to GPT-4o, with PhD-level reasoning, multi-step analysis and deep contextual understanding.
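Running a 120-billion-parameter model in 80GB of memory implies aggressive weight quantization. A back-of-envelope sketch of the arithmetic (the bytes-per-parameter figures are typical of common quantization schemes, not confirmed specifications of this device):

```python
# Illustrative check: can 120B parameters fit in 80 GB of LPDDR5X?
# Bytes-per-parameter values below are typical quantization levels,
# not confirmed specs of the Pocket Lab.

PARAMS = 120e9   # 120 billion parameters
MEMORY_GB = 80   # advertised memory capacity

def weights_gb(bytes_per_param: float) -> float:
    """Approximate weight footprint in GB (1 GB = 1e9 bytes)."""
    return PARAMS * bytes_per_param / 1e9

for label, bpp in [("FP16", 2.0), ("INT8", 1.0), ("4-bit", 0.5)]:
    gb = weights_gb(bpp)
    verdict = "fits" if gb < MEMORY_GB else "does not fit"
    print(f"{label}: ~{gb:.0f} GB -> {verdict} in {MEMORY_GB} GB")
```

By this estimate only a roughly 4-bit (or lower) quantization leaves the weights, plus activations and KV cache, inside the 80GB envelope.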

The system relies on two core technologies, TurboSparse and PowerInfer, to make large-parameter models viable on such a small device. TurboSparse applies neuron-level sparse activation to improve inference efficiency, while PowerInfer is an open-source inference engine that accelerates heavy LLM workloads by dynamically sharing computation between the CPU and NPU, enabling performance previously associated with professional GPUs costing thousands of dollars. Tiiny AI Pocket Lab supports one-click installation of open-source models including OpenAI GPT-OSS, Qwen, DeepSeek, Llama, Phi and Mistral, and can deploy Artificial Intelligence agents such as OpenManus, ComfyUI, Flowise, Libra, Presenton, Bella and SillyTavern. The company promises continuous updates and official OTA upgrades, and says these features will be released at CES in January 2026. By reducing dependence on cloud servers, Tiiny AI Pocket Lab is framed as a way to cut operational costs, mitigate the latency and sustainability concerns associated with data centres, and deliver advanced Artificial Intelligence capabilities to individuals, especially in constrained environments. The founding team, formed in 2024, includes engineers from institutions such as MIT, Stanford, Intel, Meta, HKUST and SJTU.
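Neuron-level sparse activation, the idea behind TurboSparse described above, can be sketched as follows: a cheap predictor guesses which neurons in a feed-forward layer will fire, and only those rows of the weight matrix are computed. This is a minimal illustration of the concept, not the companies' implementation; all names, sizes and the predictor itself are hypothetical (here the predictor "cheats" by using the true pre-activations, whereas real systems train a small auxiliary network).

```python
import numpy as np

# Minimal sketch of neuron-level sparse activation: compute only the
# rows of W predicted to produce nonzero ReLU outputs. Illustrative
# only; sizes, threshold and predictor are hypothetical.

rng = np.random.default_rng(0)
d_in, d_hidden = 64, 256
W = rng.standard_normal((d_hidden, d_in))
x = rng.standard_normal(d_in)

def dense_layer(x):
    """Full feed-forward layer: every neuron is computed."""
    return np.maximum(W @ x, 0.0)

def sparse_layer(x, top_k=32):
    """Compute only the top_k neurons ranked by a predictor.

    Here the 'predictor' uses the exact pre-activations, so the
    selected neurons match the dense result; a real system would use
    a small learned predictor and accept some approximation error.
    """
    scores = W @ x
    active = np.argsort(scores)[-top_k:]          # likely-active neurons
    out = np.zeros(d_hidden)
    out[active] = np.maximum(W[active] @ x, 0.0)  # only these rows computed
    return out

dense = dense_layer(x)
sparse = sparse_layer(x)
```

The compute saving comes from skipping 224 of the 256 weight rows per token; at LLM scale, where feed-forward layers dominate inference cost, this is what lets modest CPU/NPU hardware stand in for a large GPU.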

Impact Score: 58

Congress weighs Artificial Intelligence transparency rules

Bipartisan lawmakers are pushing a federal transparency standard for the largest Artificial Intelligence models as Congress works on a broader national framework. The proposal aims to increase public trust while avoiding stricter state-by-state requirements and heavier regulation.

Report finds California creative job losses are not driven by Artificial Intelligence

New research from Otis College of Art and Design finds California’s recent creative industry job losses stem from cost pressures and structural shifts, not direct worker displacement by generative Artificial Intelligence. The technology is changing workflows and expectations, but it is largely replacing tasks rather than entire jobs.

U.S. senators propose broader chip tool export ban for Chinese firms

A bipartisan proposal in the U.S. Senate would shift semiconductor equipment controls from specific fabs to targeted Chinese companies and their affiliates. The measure is aimed at cutting off access to advanced lithography and other wafer fabrication tools for firms such as Huawei, SMIC, YMTC, CXMT, and Hua Hong.

Trump executive order targets state Artificial Intelligence laws

Executive Order 14365 lays out a federal strategy to discourage, challenge, and potentially preempt state Artificial Intelligence laws viewed as burdensome. Employers are advised to keep complying with current state and local rules while preparing for regulatory uncertainty in 2026.

Who decides how America uses Artificial Intelligence in war

Stanford experts are divided over how the United States should govern Artificial Intelligence in defense, surveillance, and warfare. Their views converge on one point: decisions with such high stakes cannot be left to companies alone.
