Kioxia prototypes 5 TB, 64 GB/s flash memory module for large-scale Artificial Intelligence

Kioxia has developed a prototype flash memory module with 5 terabytes of capacity and 64 gigabytes per second of bandwidth, targeted at large-scale Artificial Intelligence models and built as part of NEDO project JPNP20017.

Kioxia Corporation, described in the article as a world leader in memory solutions, announced a successful prototype of a large-capacity, high-bandwidth flash memory module. The module delivers 5 terabytes of capacity and a sustained bandwidth of 64 gigabytes per second. The development was carried out under the Post-5G Information and Communication Systems Infrastructure Enhancement R&D Project (JPNP20017) commissioned by the New Energy and Industrial Technology Development Organization (NEDO).

To overcome the limits of conventional DRAM-based memory modules, Kioxia developed a new module configuration in which flash memory devices are connected in a daisy chain. The company also developed high-speed transceiver technology capable of 128 gigabits per second and implemented techniques to enhance flash memory performance. According to the article, these technological advances have been applied to both the memory controllers and the memory modules in the prototype.
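
As a rough consistency check, the short Python sketch below relates the quoted figures to one another. The decimal unit conventions, the idea that module bandwidth is the simple sum of parallel 128 Gbps transceiver links, and the neglect of protocol overhead are illustrative assumptions, not details from the announcement.

```python
# Back-of-the-envelope check of the figures quoted in the article.
# Assumptions (not stated by Kioxia): decimal units (1 GB = 1e9 bytes,
# 1 Gbps = 1e9 bits per second) and module bandwidth equal to the sum
# of parallel transceiver links, with no protocol or encoding overhead.

MODULE_CAPACITY_TB = 5        # terabytes of flash capacity
MODULE_BANDWIDTH_GBPS = 64    # gigabytes per second of sustained bandwidth
LINK_SPEED_GBITPS = 128       # gigabits per second per transceiver link

# Module bandwidth expressed in gigabits per second.
module_bandwidth_gbitps = MODULE_BANDWIDTH_GBPS * 8          # 512 Gbit/s

# Minimum number of 128 Gbit/s links needed to sustain 64 GB/s.
links_needed = module_bandwidth_gbitps / LINK_SPEED_GBITPS   # 4.0

# Time to stream the full 5 TB once at the quoted bandwidth.
seconds_to_read_module = (MODULE_CAPACITY_TB * 1000) / MODULE_BANDWIDTH_GBPS

print(f"Aggregate bandwidth: {module_bandwidth_gbitps} Gbit/s")
print(f"128 Gbit/s links needed (no overhead): {links_needed:.0f}")
print(f"Full-module read time at 64 GB/s: {seconds_to_read_module:.0f} s")
```

Under these assumptions, four 128 Gbps links would account for the 64 GB/s figure, and reading the entire 5 TB module once would take roughly 78 seconds.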

The announcement frames the prototype as addressing a long-standing trade-off between capacity and bandwidth in memory design, especially relevant for large-scale Artificial Intelligence models that demand both large capacity and high data throughput. The article does not specify production timelines, costs, target systems, or compatibility details. It limits its claims to the successful prototyping and the technical features listed, including the 5 TB capacity, 64 GB/s bandwidth, daisy-chained flash configuration, the 128 Gbps transceiver capability, and the application of these innovations to controllers and modules.
