Artificial intelligence in the dock: should machines have legal rights?

Recent multi-billion-pound investment in the UK's Artificial Intelligence infrastructure has refocused attention on regulation. The Law Commission's discussion paper, "AI and the Law", asks whether existing frameworks can address liability and accountability, and whether some form of legal personality for machines should be considered.

Published on December 3, 2025, the article reviews the UK position on Artificial Intelligence regulation following recent multi-billion-pound investment in the country’s infrastructure and scientific research. It notes that, unlike the European Union’s AI Act, the UK has no dedicated Artificial Intelligence Act and has preferred a decentralised, principles-based approach that relies on existing regulators such as the CMA, FCA, ICO and Ofcom. The Law Commission has issued a discussion paper titled “AI and the Law” and has launched a project limited to public sector uses of Artificial Intelligence and automated decision-making. The Commission cannot make law, but its recommendations often inform government policy.

The Law Commission paper aims to raise awareness of legal risks and to prompt wider discussion rather than to propose detailed reforms. It revisits long-standing questions about whether granting some form of legal personality to Artificial Intelligence systems could close so-called liability gaps. The paper highlights core challenges: the autonomy and adaptiveness of systems that learn and change post-deployment, the difficulty of establishing factual and legal causation and mens rea when outputs are unpredictable, and the opacity of models protected by proprietary rights or technical complexity. It also addresses oversight and over-reliance concerns in regulated professions and public decision-making, and training and data issues, including copyright and personal data protection.

The article concludes that the Law Commission’s paper is a measured starting point that organises key issues and identifies areas for further policy work. It flags that the suggestion of legal personality raises complex definitional questions, such as how to set autonomy thresholds and design accountability mechanisms. Without follow-up work that converts the discussion into clear priorities and concrete proposals, uncertainty will persist. Developments in the EU and elsewhere will continue to be watched closely as domestic and global regulatory responses to Artificial Intelligence evolve.

Microsoft previews Shader Model 6.10 for GPU Artificial Intelligence engines

Microsoft has introduced Shader Model 6.10 in Agility SDK 1.720-preview with a new matrix API designed to unify access to dedicated GPU Artificial Intelligence hardware from AMD, Intel, and NVIDIA. The change is aimed at making neural rendering features easier to deploy across multiple vendors with a single programming model.

Europe’s Artificial Intelligence challenge is structural dependence

Europe has talent, research strength, and rising investment in Artificial Intelligence, but startups remain reliant on American infrastructure, platforms, and late-stage capital. The argument centers on digital sovereignty, interoperability, and ownership as the conditions for building durable European champions.

Community backlash slows Artificial Intelligence data center expansion

Political resistance, regulatory scrutiny, and rising energy and water concerns are complicating the build-out of large Artificial Intelligence data centers across the United States. The pressure is increasing costs, delaying projects, and adding fresh risks to the economics behind generative Artificial Intelligence infrastructure.

House panel advances export controls after China report

The House Foreign Affairs Committee moved export control legislation after a House Select Committee report detailed China’s use of illegal means to build its Artificial Intelligence and semiconductor sectors. The measure is aimed at chip smuggling and Artificial Intelligence model theft.
