Data storage strategies for data intensive scientific research

Scientific computing teams are confronting surging data volumes from Artificial Intelligence, simulation and genomics, forcing a rethinking of storage architectures across on premises and cloud environments. An expert roundtable report profiles how leading research institutions are balancing performance, scalability, security and cost while planning for future workloads.

Scientific research infrastructures are facing a rapid escalation in data volumes generated by Artificial Intelligence and machine learning, large scale simulations and genomics. As a result, research teams must manage petabytes of data while maintaining performance, security and cost efficiency. Storage strategy decisions are becoming more complex as organisations weigh high performance NVMe, scalable tiered storage and cloud based options, and consider how Artificial Intelligence workloads will reshape infrastructure requirements. Research computing leaders are increasingly focused on aligning storage capabilities with high performance computing environments so that data intensive workflows are not constrained by bottlenecks.

An expert roundtable report brings together high performance computing leaders from organisations including the University of Cambridge, Lund University and Simula Research Laboratory to describe practical approaches to modern storage design. The report details how leading research institutions design tiered storage architectures that can match different performance and capacity needs, and it highlights the key factors influencing infrastructure choices, such as workload type, budget constraints and long term retention requirements. Contributors explain how Artificial Intelligence and GPU workloads are changing storage requirements, prompting demand for faster access to active data as well as scalable systems that can accommodate rapid growth without sacrificing reliability.

The roundtable also explores why many research organisations remain cautious about cloud storage despite its flexibility, citing concerns over cost predictability, data sovereignty and performance for large scale scientific workloads. The report is positioned as a white paper for high performance computing and research computing leaders, data centre architects, research IT and infrastructure teams, Artificial Intelligence and data platform specialists, and university and laboratory technology managers who support data intensive environments. Readers are promised peer insights into how research organisations are preparing infrastructure for the next wave of data driven discovery, including perspectives on what experts believe the future of research data storage will look like.

What businesses need to know about the EU Cyber Resilience Act

The EU Cyber Resilience Act is turning product cybersecurity into a legal requirement for companies that sell digital products into the European Union. A key compliance milestone arrives in September 2026, well before the full regulation takes effect in 2027.

Claude Mythos and cyber insurance’s next inflection point

Claude Mythos is being treated by governments and regulators as a potential systemic cyber risk with implications for financial stability and insurance markets. Its emergence is intensifying pressure on insurers to clarify whether Artificial Intelligence-enabled cyber losses are covered, excluded, or require new stand-alone products.

OpenAI expands ChatGPT ads with self-serve manager

OpenAI is widening its ChatGPT ads pilot with a beta self-serve Ads Manager, new bidding options and broader measurement tools. The push signals a deeper move into advertising as the company expands the program into several international markets.

OpenAI launches Artificial Intelligence deployment consulting unit

OpenAI has created a new consulting and deployment business aimed at helping enterprises build and roll out Artificial Intelligence systems. The move mirrors a similar push by Anthropic and signals a broader effort by model providers to capture more of the enterprise services market.

SK Group warns DRAM shortages could curb memory use

SK Group chairman Chey Tae-won warned that customers may reduce memory consumption through infrastructure and software optimization if DRAM suppliers fail to raise output. Demand from Artificial Intelligence data centers is keeping the market tight as memory makers weigh expansion against the long timelines for new fabs.
