Intel details Clearwater Forest Xeon built on 18A with new Darkmont e-cores

Intel unveiled Clearwater Forest, a next-generation E-core Xeon, at Hot Chips, pairing major core architecture changes with the 18A process and advanced 3D packaging.

Intel used Hot Chips to present Clearwater Forest, its next-generation E-core Xeon, positioning the design as a showcase for its foundry and packaging capabilities. The chip replaces Sierra Forest and is built around the new Darkmont efficiency cores. Intel said Clearwater Forest will ship in a large chiplet package that uses the 18A node for compute dies and relies on Foveros Direct 3D stacking combined with EMIB for advanced multi-die integration.

At the core level, Darkmont emphasizes a wider, more parallel front end and a strengthened out-of-order engine. Intel disclosed a 64 KB instruction cache per core and an expanded decoder that can handle more instructions per cycle than the previous Crestmont E-core design. The company also detailed enlarged reorder and allocation structures, additional allocation units, and a larger out-of-order window to keep more work in flight.

Execution resources also grew substantially, with more integer and vector execution ports to sustain higher parallel throughput. Intel framed the combination of these architectural upgrades, the 18A process, and Foveros Direct 3D stacking paired with EMIB as a demonstration of what its foundry and packaging teams can now deliver. Specific shipping timelines, power and performance figures, and final system-level configurations were not stated.
