Gigabyte used CES 2026 to showcase its AI Top product line, positioning the company around a shift in artificial intelligence (AI) workloads from cloud infrastructure to local computing. The company argues that the rapid adoption of AI inferencing is making the technology more accessible, responsive, and adaptable for everyday use, which it frames as delivering on the idea of ‘AI for everyone’ in what it calls the AI era. Gigabyte first introduced the AI Top system in 2024 with the goal of enabling local AI development on household-standard power, and it has since expanded the lineup to cover scenarios ranging from personal experimentation to full business deployments.
At the top of the lineup is the AI Top system, described as a fully customizable AI computing solution built from Gigabyte AI Top series hardware and configurable to specific workload requirements. The systems are designed for desktop users who want to develop and run AI models locally instead of relying solely on cloud services. Gigabyte is targeting both individual creators and organizations that need dedicated on-premises compute without building a full data center.
The flagship AI Top 500 system supports AI models of up to 405B parameters and is tailored to medium-sized businesses that require scalable on-site AI computing power. For smaller businesses, startups, and individual users, the AI Top 100 system supports LLM fine-tuning at up to 110B+ parameters. Gigabyte notes that multiple AI Top systems can be clustered over Ethernet and Thunderbolt to expand performance, increasing training speeds and providing scalable power for a broader set of workloads.
