Artificial intelligence coverage focuses on hardware, data centers, and consumer skepticism

Tom's Hardware highlights how rapidly expanding artificial intelligence workloads are reshaping hardware design, data center planning, and consumer products, while also surfacing new reliability and legal concerns.

The artificial intelligence section outlines how rapidly growing workloads are driving demand for specialized hardware, including the latest CPUs, GPUs, ASICs, and FPGAs that power training and inference for large language models. Coverage centers on both the chips themselves and the broader systems that host them, from memory technologies and storage to networking in data centers. The focus extends to different types of large language models and the methods used to train them, as well as how these models are deployed in real-world inference workloads across industries.

Recent reporting highlights that integrating artificial intelligence into critical systems can create unexpected risks. An investigation found that after artificial intelligence was added to a sinus surgery system, reported malfunctions jumped from eight to 100 incidents, underscoring the safety and regulatory questions around automated surgical tools. In parallel, legal and ethical concerns surface in disputes over training data: Nvidia says it did not use pirated books to train its artificial intelligence models and argues that alleged contact with a piracy site does not prove copyright infringement. Reliability issues can also emerge in consumer-facing deployments, illustrated by AI.com's $85 million Super Bowl ad campaign, where a surge of traffic crashed the site's servers.

On the infrastructure side, a warning from the co-CEO of China's top chipmaker SMIC cautions that rushed artificial intelligence data center capacity in the U.S. and other regions could remain idle if projects fail to materialize, even as companies explore ambitious concepts like orbiting data centers, with SpaceX acquiring xAI to pursue a goal of hitting 100 gigawatts of compute in space. Memory and interconnect technologies remain central, with Intel co-developing Z Angle Memory to compete with HBM in artificial intelligence data centers and Samsung, SK Hynix, and Micron teaming up to block memory hoarding.

At the same time, a report from consumer research firm Circana shows that one third of consumers reject artificial intelligence on their devices, with two thirds of those opposed saying it is not useful, revealing a gap between industry investment and user demand. Experiments at the other end of the spectrum, such as a developer creating a conversational artificial intelligence that runs on a 1976 Zilog Z80 CPU with 64 KB of RAM, showcase how minimal models can be adapted to legacy hardware, even if their outputs are necessarily terse.


MIT researchers propose self-distillation fine-tuning to add skills to large language models without forgetting

Researchers at MIT's Improbable AI Lab and ETH Zurich have introduced self-distillation fine-tuning, a method that lets large language models gain new enterprise skills while preserving prior capabilities. The approach uses a model's own in-context learning as a teacher, avoiding explicit reward functions and reducing catastrophic forgetting.
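The core pattern described above can be sketched in a few lines: the model answers new inputs with in-context examples attached (acting as its own teacher), and the resulting bare (input, answer) pairs become the fine-tuning set. This is a toy illustration only, assuming a generic `generate` call; the names `generate` and `build_sdft_dataset` are placeholders, not the researchers' actual API, and a real implementation would invoke an LLM and a training loop.

```python
def generate(model, prompt):
    """Stub standing in for an LLM generate() call.
    Here 'model' is just a dict mapping prompts to completions."""
    return model.get(prompt, "")


def build_sdft_dataset(model, task_examples, new_inputs):
    """Build self-distillation fine-tuning pairs.

    The teacher is the model *with* in-context demonstrations prepended;
    the student is later fine-tuned on the bare (input, answer) pairs,
    so the new skill is internalized without the demonstrations and
    without an explicit reward function.
    """
    demos = "\n".join(f"Q: {q}\nA: {a}" for q, a in task_examples)
    dataset = []
    for x in new_inputs:
        teacher_prompt = f"{demos}\nQ: {x}\nA:"
        answer = generate(model, teacher_prompt)  # teacher = model + context
        dataset.append((x, answer))               # student trains on bare pair
    return dataset
```

Framing distillation this way means the fine-tuning targets stay close to the model's own output distribution, which is the property the researchers credit with reducing catastrophic forgetting.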

Samsung Exynos 2700 tipped for SF2P node and larger role in Galaxy S27

Samsung is preparing the Exynos 2600 for a limited Galaxy S26 rollout while analysts expect the Exynos 2700 on the SF2P node to ramp hard in the Galaxy S27. The foundry push on 2 nm GAA is tied to ambitions for higher Exynos share and new deals with major chip customers.
