Kioxia America, Inc. has announced the industry's first 245.76 terabyte NVMe SSD, expanding its LC9 Series for enterprise storage. Available in 2.5-inch and EDSFF E3.L form factors, the drive is tailored to the escalating performance and efficiency demands of generative artificial intelligence environments. Building on the earlier 122.88 TB model, the new SSD is designed to handle the massive datasets required for training large language models, managing embeddings, and powering retrieval-augmented generation (RAG), the processes that underpin modern artificial intelligence inference workloads.
The LC9 Series pairs a 32-die stack of 2-terabit BiCS FLASH QLC 3D flash memory with CBA (CMOS directly Bonded to Array) technology, achieving an industry-first 8 TB in a compact 154 BGA package, an advance enabled by refinements in wafer processing and bonding techniques. The series targets the density and power bottlenecks of hard disk drives, packing substantial storage into a minimal physical footprint. By replacing multiple high-power hard disk drives with a single SSD, data centers can improve GPU utilization and performance while cutting power consumption, drive-slot requirements, and cooling load, lowering total cost of ownership for large-scale deployments.
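To put the headline figures in perspective, the short sketch below works through the per-package arithmetic (32 dies x 2 Tbit = 8 TB) and an illustrative consolidation ratio against nearline hard drives. The 24 TB HDD capacity is an assumption chosen purely for illustration, not a figure from Kioxia's announcement.

```python
# Back-of-the-envelope arithmetic for the figures quoted above.
DIES_PER_PACKAGE = 32        # 32-die stack per the announcement
DIE_CAPACITY_TBIT = 2        # 2-terabit QLC BiCS FLASH die
DRIVE_CAPACITY_TB = 245.76   # headline LC9 capacity
ASSUMED_HDD_TB = 24          # hypothetical nearline HDD used for comparison

# 32 dies x 2 Tbit = 64 Tbit; divide by 8 to convert bits to bytes.
package_capacity_tb = DIES_PER_PACKAGE * DIE_CAPACITY_TBIT / 8
print(f"Raw capacity per 154 BGA package: {package_capacity_tb:.0f} TB")

# Rough count of HDDs whose combined capacity one LC9 drive matches.
hdds_replaced = DRIVE_CAPACITY_TB / ASSUMED_HDD_TB
print(f"Capacity of roughly {hdds_replaced:.1f} x {ASSUMED_HDD_TB} TB hard drives")
```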
Key features include compliance with the PCIe 5.0, NVMe 2.0, and NVMe-MI 1.2c standards, support for the Open Compute Project Datacenter NVMe SSD specification v2.5, and Flexible Data Placement (FDP) to extend drive lifespan (a brief support-check sketch follows this paragraph). Security options include SIE (sanitize instant erase), SED (self-encrypting drive), and FIPS SED configurations, along with CNSA 2.0-aligned digital signature algorithms intended to prepare for the quantum computing era. The LC9 Series is now sampling to select customers and is slated for a public showcase at the Future of Memory and Storage 2025 conference. Kioxia's announcement reinforces its leadership in flash storage innovation, which is central to evolving artificial intelligence data center architectures and the continued growth of machine learning workloads.
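For readers curious how a host might verify the FDP capability mentioned above, here is a minimal sketch assuming a Linux host with nvme-cli installed. It also assumes that bit 19 of the Identify Controller CTRATT field is the FDP support flag defined by the NVMe FDP technical proposal; check the specification for your drive generation before relying on it.

```python
# Minimal sketch: check whether an NVMe controller advertises Flexible Data
# Placement (FDP) support. Assumes a Linux host with nvme-cli installed and
# that CTRATT bit 19 is the FDP support flag (an assumption to verify
# against the NVMe specification for your drive).
import json
import subprocess

def supports_fdp(device: str = "/dev/nvme0") -> bool:
    # nvme-cli can emit the Identify Controller data structure as JSON.
    out = subprocess.run(
        ["nvme", "id-ctrl", device, "-o", "json"],
        capture_output=True, check=True, text=True,
    )
    ctratt = json.loads(out.stdout)["ctratt"]
    return bool(ctratt & (1 << 19))  # assumed FDP support bit

if __name__ == "__main__":
    print("FDP supported:", supports_fdp())
```

The script only reads the Identify Controller data and issues no destructive commands; against a drive without FDP it should simply report False.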