Musicians and other celebrities should be protected by a law that would ban artificial intelligence-generated deepfakes in the UK, a group of MPs has suggested. The use of AI to impersonate the voices and images of well-known singers and rappers has been growing.

In early April, Jess Glynne, Mumford and Sons, Sam Smith and Zayn Malik were among the British artists who signed an international open letter calling for more protection against “the predatory use of AI to steal artists' voices and likenesses”.

The All-Party Parliamentary Group on Music has now called on the government to regulate the use of AI in music. It should include “a specific personality right to protect creators and artists from misappropriation and false endorsement,” the MPs said in a report published on Wednesday.

Such a law should also include other measures to protect musicians from the risk of AI becoming “a destroyer of creators' livelihoods”, it said. Politicians must “confront the danger that unfettered developments in AI could pose to the UK’s musicians and music businesses”, said the cross-party group's chair, Labour MP Kevin Brennan.

But the power of AI can also be a force for good to “help musicians to innovate and to inspire new human creativity”, he added.

‘Servant not master’

“We ignore the necessity to sow policies, which will harvest the benefits of AI, and help stave off the threats it poses, at our peril,” he said.

“Our central insight must always be that AI can be a great servant but would be a terrible master.”

The report highlighted new powers in Tennessee, which recently passed a law – dubbed the Elvis Act – to prohibit the use of AI to mimic an artist's voice without their permission.

The MPs said this example “showed the case that the UK should introduce a personality right to protect the individuality of creators in the UK and not fall behind our international competitors”.

The UK already has some protections, including “passing off” rights that stop one person from misrepresenting another when offering goods or services. However, the MPs said it was yet “to be seen if this would be effective against deepfakes”.

They added: “Unambiguous legislation that protects creators and artists from misappropriation and false endorsement would provide clarity and certainty for all involved, including tech providers.”

They said a “pro-creative industries AI Bill” should also include a right for musicians to prevent their work being used by AI, clear labelling of music made with AI, the creation of an international taskforce, and copyright reforms.

A government spokesperson said: “We are committed to helping artists and the creative industries work with the AI sector to harness the opportunities this technology provides, and ensure our music can continue to be enjoyed around the world.

“Trust and transparency are vital to this shared approach. We are working closely with stakeholders and will provide a further update in due course.”

The government has already pledged that the creation of sexually explicit “deepfake” images will be made a criminal offence in England and Wales.

US hearing

On the other side of the Atlantic, UK artist FKA Twigs issued a statement to a US Senate Judiciary subcommittee looking at deepfakes and intellectual property. She wrote: “Our careers and livelihoods are in jeopardy, and so potentially are the wider image-related rights of others in society. You have the power to change this and safeguard the future.

“That the very essence of our being at its most human level can be violated by the unscrupulous use of AI to create a digital facsimile that purports to be us, and our work, is inherently wrong.”

She added: “It is therefore vital that as an industry and as legislators we work together to ensure we do all we can to protect our creative and intellectual rights as well as the very basis of who we are.”
