Proposals to regulate artificial intelligence have been delayed by at least a year as UK ministers plan a bumper bill to regulate the technology and its use of copyrighted material.
Peter Kyle, the technology secretary, intends to introduce a “comprehensive” AI bill in the next parliamentary session to address concerns about issues including safety and copyright.
This will not be ready before the next king’s speech, and is likely to trigger concerns about delays to regulating the technology. The date for the next king’s speech has not been set but several sources said it could take place in May 2026.
Labour had originally planned to introduce a short, narrowly drafted AI bill within months of entering office that would have been focused on large language models, such as ChatGPT.
The legislation would have required companies to hand over their models for testing by the UK’s AI Security Institute. It was intended to address concerns that AI models could become so advanced that they posed a risk to humanity.
This bill was delayed, with ministers choosing to wait and align with Donald Trump’s administration in the US because of concerns that any regulation might weaken the UK’s attractiveness to AI companies.
Ministers now want to include copyright rules for AI companies as part of the AI bill.
“We feel we can use that vehicle to find a solution on copyright,” a government source said. “We’ve been having meetings with both creators and tech people and there are interesting ideas on moving forward. That work will begin in earnest once the data bill passes.”
The government is already locked in a standoff with the House of Lords over copyright rules in a separate data bill. That bill would allow AI companies to train their models using copyrighted material unless the rights holder opts out.
It has caused a fierce backlash from the creative sector, with artists including Elton John, Paul McCartney and Kate Bush throwing their weight behind a campaign to oppose the changes.
This week, peers backed an amendment to the data bill that would require AI companies to disclose if they were using copyrighted material to train their models, in an attempt to enforce current copyright law.
Ministers have refused to back down, however, even though Kyle has expressed regret about the way the government has gone about the changes. The government insists the data bill is not the right vehicle for the copyright issue and has promised to publish an economic impact assessment and a series of technical reports on copyright and AI issues.
Beeban Kidron, the film director and crossbench peer who has been campaigning on behalf of the creative sector, said on Friday that ministers “have shafted the creative industries, and they have proved willing to decimate the UK’s second-biggest industrial sector”.
Kyle told the Commons last month that AI and copyright should be dealt with as part of a separate “comprehensive” bill.
Most of the UK public (88%) believe the government should have the power to stop the use of an AI product if it is deemed to pose a serious risk, according to a survey published by the Ada Lovelace Institute and the Alan Turing Institute in March. More than 75% said the government or regulators should oversee AI safety rather than private companies alone.