Apertus: Swiss teams release fully open multilingual large language model

EPFL, ETH Zurich and the Swiss National Supercomputing Centre have released Apertus, a fully open multilingual large language model with its architecture, weights and training recipes published. The model is intended to support research, commercial adoption and public oversight of Artificial Intelligence.

EPFL, ETH Zurich and the Swiss National Supercomputing Centre (CSCS) have released Apertus, a large-scale, fully open, multilingual large language model. The name Apertus is Latin for "open", and the team says the entire development process is openly accessible: model architecture, model weights, intermediate checkpoints, training data and training recipes are all published, accompanied by comprehensive documentation and source code.
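
Because the weights and recipes are published openly, the model can in principle be used with standard open-source tooling. The sketch below assumes the checkpoints are hosted on a hub compatible with the Hugging Face transformers library; the repository name and generation settings are illustrative placeholders, not details from the announcement.

```python
# Minimal sketch: loading openly published weights with Hugging Face
# transformers. "swiss-ai/apertus-8b" is a hypothetical repository name
# used for illustration; check the official release for the real one.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/apertus-8b"  # assumption, not confirmed by the article

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Apertus is a fully open multilingual language model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```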

The release is positioned as a tool for research, industry and public examination rather than a conventional technology transfer to a single product. Thomas Schulthess, director of CSCS and professor at ETH Zurich, frames the project as a driver of innovation and a way to strengthen expertise across research, society and industry. Imanol Schlag, technical lead of the LLM project and research scientist at ETH Zurich, says Apertus is built for the public good and designed around multilingualism, transparency and compliance.

Access will be provided through several routes. Swisscom will offer a dedicated interface for participants in the upcoming Swiss Artificial Intelligence Weeks hackathons and will make the model available to Swisscom business customers via its sovereign Swiss artificial intelligence platform. For users outside Switzerland, access will be offered through the Public Artificial Intelligence Inference Utility. The research team has released the model under a permissive open-source license that also allows commercial use. It says future work will expand the model family, improve efficiency, explore domain-specific adaptations for law, climate, health and education, and add further capabilities while maintaining reproducibility and transparency.
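
For hosted access, a common pattern is an OpenAI-compatible chat endpoint. The sketch below assumes such an interface is exposed by one of the hosting services; the base URL, model name and credential are placeholders, since the announcement does not specify how the hosted interfaces work.

```python
# Minimal sketch of querying a hosted Apertus instance through an
# OpenAI-compatible chat endpoint. Every identifier here (endpoint,
# model name, API key) is an illustrative assumption.
from openai import OpenAI

client = OpenAI(
    base_url="https://example-inference-host/v1",  # hypothetical endpoint
    api_key="YOUR_API_KEY",                        # placeholder credential
)

response = client.chat.completions.create(
    model="apertus",  # hypothetical model identifier
    messages=[
        {"role": "user", "content": "Summarise the Apertus release in one sentence."}
    ],
)
print(response.choices[0].message.content)
```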

Marvell extends CXL ecosystem leadership with Structera interoperability across major memory and CPU platforms

Marvell announced that its Structera Compute Express Link (CXL) memory-expansion controllers and near-memory compute accelerators have passed interoperability testing with DDR4 and DDR5 memory from Micron Technology, Samsung Electronics, and SK hynix. The company says this makes Structera the only CXL 2.0 product family validated across both major CPU architectures and all three memory suppliers.

32 GB of RAM could become the new standard for gamers

Steam's hardware survey shows that 32 GB of RAM rose to 36.46% of surveyed systems in August, up 1.31 percentage points from July and closing in on 16 GB at 41.88%. Cheaper DDR5, broader OEM memory options, and local Artificial Intelligence and streaming workloads are cited as drivers.

Artificial intelligence sharpens humidity maps to improve forecasts

Researchers at Wrocław University of Environmental and Life Sciences used a super-resolution approach powered by Artificial Intelligence and NVIDIA GPUs to turn low-resolution GNSS snapshots into high-resolution 3D humidity maps, cutting retrieval errors in test regions.
