OpenAI is pivoting more directly into scientific research, three years after ChatGPT’s explosive debut reshaped everyday tasks at home, work, and in schools. The company has created a new group called OpenAI for Science, which is focused on how its large language models can assist researchers and on adapting its tools for scientific workflows. Vice president Kevin Weil leads the team, and in an interview he addresses why OpenAI is moving into science now, how this strategy fits into the company’s broader mission, and what concrete outcomes it hopes to deliver for scientists.
Alongside this push into research, the newsletter highlights a growing safety front: how to keep children safe when interacting with Artificial Intelligence chatbots. The long-standing practice of simply asking users for their birthdate, which minors can fabricate to sidestep child privacy rules, is increasingly seen as inadequate. New developments in the US within the last week show how rapidly expectations are shifting, as age verification and content moderation around minors become a fresh battleground among policymakers, parents, and child safety advocates. A separate report finds that the Grok chatbot is not safe for children or teens, and European Union regulators are examining whether it spreads illegal content.
The issue of safety and automation also appears in US transport policy, where the Department of Transportation plans to use Artificial Intelligence to help write new safety regulations, sparking criticism that undetected errors could have lethal consequences. Political tensions continue around immigration enforcement technology, as hundreds of tech workers pressure their employers to condemn ICE and question TikTok's handling of "Epstein" messages and anti-ICE videos, while California governor Gavin Newsom seeks to probe whether TikTok censors content critical of Trump. Law enforcement and civil liberties collide in an FBI investigation into Minnesota Signal chats that tracked federal agents, which some free-speech advocates argue involved legally obtained information.
In space, the newsletter spotlights commercial stations as one of the year’s 10 breakthrough technologies. After two decades of human occupation on the International Space Station, that platform is aging and is expected to be deorbited into the ocean in 2031. To fill the gap and expand access to orbit, NASA has awarded more than ? million to multiple firms building private space stations, while additional companies are financing their own designs. The vision is that private outposts will eventually succeed the ISS and open new opportunities for research, manufacturing, and tourism. This comes as Saudi Arabia’s futuristic city project The Line, once proposed to house 9 million people, faces uncertainty and may end up focused more on data centers than residents.
Other stories track domestic infrastructure and energy shifts. Georgia joins Maryland and Oklahoma in considering bans on new data centers, even as data centers become central to computing and cloud services. A feature follows developer Michael Skelly, who has spent about 15 years pushing high-voltage transmission lines to connect US regional grids and move wind power from the Great Plains, Midwest, and Southwest to population centers. His earlier company folded in 2019 after canceling two projects and selling stakes in three others, but he argues that he was early rather than wrong, and notes that markets and policymakers are gradually embracing his long-held view that better grid connections are key to cutting coal and natural gas pollution.
The newsletter closes with a broader reflection on the trajectory of Artificial Intelligence from Anthropic chief executive Dario Amodei, who warns that humanity is on the verge of receiving almost unimaginable power from advanced systems without clear evidence that social, political, and technological institutions are ready. It also surfaces research into Earth's lighter elements possibly hiding deep in the core, the erosion of the US measles-free status amid outbreaks, and the rise of increasingly surreal Artificial Intelligence-generated influencers, including virtual conjoined twins and triple-breasted characters. A lighter section offers cultural diversions, from cats on magazine covers to music pairings and an orphaned baby seal, as a reminder that technology coverage can coexist with small moments of comfort.
