TSMC's Role in the Silicon Photonics Era

Explore how TSMC is leading the charge in the Silicon Photonics era with advancements essential to Artificial Intelligence.

TSMC's strategic position in the Silicon Photonics era addresses the growing demands of Artificial Intelligence models, high-performance computing, and data centers. As the leading pure-play foundry, TSMC is establishing a comprehensive ecosystem involving photonics process platforms, heterogeneous packaging, IP libraries, and system-level integration support. This article highlights TSMC's recent advancements and key offerings shaping its role in the photonics landscape.

The evolution of Silicon Photonics (SiPh) sets the stage for next-generation interconnects and system architectures. As Artificial Intelligence applications increasingly depend on photonic interconnects for bandwidth and power efficiency, TSMC is expanding its photonics capabilities. The company's efforts are focused on building a robust infrastructure to support global needs, enhancing its Silicon Photonics process capabilities, and fostering system innovations.

Moreover, TSMC is transitioning from being merely a foundry to a photonic system enabler. This shift aims to unlock new design opportunities and foster system-level innovation, cementing its role as a leader in global Silicon Photonics platforms. Topics such as process capability, platform technologies, and new technology opportunities are central to TSMC's ongoing research and development efforts.


