Samsung shows 96% power reduction in NAND flash

Samsung researchers report a design that combines ferroelectric materials with oxide semiconductors to cut NAND flash string-level power by up to 96%. The team says the approach supports high density, including up to 5 bits per cell, and could lower power consumption for data centers, mobile devices, and edge-AI hardware.

The Download: fossil fuels and new endometriosis tests

This edition of The Download highlights how this year’s UN climate talks again omitted the phrase “fossil fuels” and why new noninvasive tests could shorten the nearly 10 years it now takes to diagnose endometriosis.

HPC won’t be an x86 monoculture forever

x86 dominance in high-performance computing is receding: its share of the TOP500 has fallen from almost nine in ten machines a decade ago to 57 percent today. The rise of GPUs, Arm, and RISC-V, together with the demands of AI and hyperscale workloads, is reshaping processor choices.

A trillion dollars is a terrible thing to waste

Gary Marcus argues that the machine learning mainstream’s prolonged focus on scaling large language models may have cost roughly a trillion dollars and produced diminishing returns. He urges a pivot toward new ideas such as neurosymbolic techniques and built-in inductive constraints to address persistent problems.

Experts divided over claim that Chinese hackers launched world-first AI-powered cyber attack

Anthropic said in a Nov. 13 statement that its engineers disrupted a "largely autonomous" operation that used its Claude model to automate roughly 80–90% of reconnaissance and exploitation against 30 organizations worldwide. Experts dispute the degree of autonomy but warn that even partial AI-driven orchestration lowers barriers to espionage and increases its scalability.

Seagate HAMR prototype achieves 6.9 TB per platter for 55 TB HDDs

Seagate disclosed a prototype heat-assisted magnetic recording platter that stores roughly 6.9 TB, enabling drives with about 55 TB of capacity. The company says the technology would benefit data center cold-storage tiers and workloads such as AI.