UK Music's Jamie Njoku-Goodwin: Government must act to prevent AI-based 'music laundering'
Even amid the current economic challenges, there is no more pressing issue for the music industry than the impact of AI.

So far, labels and DSPs have been working to take down fake tracks based on major artists, while detection systems are being created to tackle generative AI music that infringes copyright.

UK Music has been at the forefront of the debate, and has recently been in talks with members on the issue and on how to address it with ministers at policy level.

Here, UK Music CEO Jamie Njoku-Goodwin shares an urgent call for the government to protect the music industry from rapidly developing AI technology…

The recent advances in artificial intelligence have been as rapid as they are transformative. They will have profound implications for all areas of modern society. 

Just as we have seen with the advance of the internet, these technological developments bring huge opportunities, but also massive risks and challenges unless the right guardrails are in place for their development.

The music industry is a highly innovative industry which embraces and uses new technologies, both in creating music and in consuming it. That is why we endorse the forward-looking principles of the Human Artistry Campaign, which put human creativity at the heart of copyright protection.

AI is already playing a key role as an assistive tool in our industry where it is used for tasks like detecting copyright infringement, as well as analysing and predicting consumer trends.  

However, the developments and implications of generative AI technologies (i.e. AI which generates “music”) raise many concerning and difficult questions.

We have seen recent examples like the use of AI to clone the voices of Drake and The Weeknd for a song that went viral after the software was trained on the musicians’ voices.  

It might seem novel and groundbreaking, but it’s important to remember that these are not “creations” in the genuine sense; they impact the creators and performers involved, as well as all those who support, invest in, and work with them. AI generates, never creates. It relies totally on what is ingested. It does not have original thought. There is no imagination. It is never more than the sum of its parts.

AI cannot produce music on its own. It needs to be trained, which happens through a process of “ingesting” a myriad of existing pieces of music, copying and analysing the various patterns and structures, and then generating a “new” piece of music based on that computation.

This might be acceptable, but only if those pieces had been used with the original creators’ consent.

However, too often people are illicitly using music to train AI technologies without any regard to copyright and without the consent of those who made the music or whose personal identity is being used.

In fact, proposals from government last year sought to remove the need for any consent of the original creator at all, with a proposed copyright exception covering all copies made by the machine to learn. This process is legally referred to as text and data mining, covering the process of feeding an AI by taking text (such as lyrics) and data (such as digitised music) from existing music.

The proposals would have been hugely damaging, as well as morally and ethically wrong, for two key reasons.

Firstly, they undermine the mechanism by which people who create and invest in music can earn a living from it: the ability to choose how and where their music is used.

Legally, this contravenes the basic principle of property rights upon which liberal democracies are founded. If you have created something, then you as the owner permit or forbid someone from using it. It is morally wrong for someone else to use your talent against your will, especially if they are then extracting value and making money from it. 

Secondly, it creates a potential avenue for writers’ work and artists’ performances to be ripped off by bad actors. 

Take as an example John Williams, one of our greatest and most popular living composers. Want John Williams to score your new film? Well, you could just take his music, feed it into an AI and use it to generate a film score.

You get a new “John Williams soundtrack” without paying him a penny. A win for you perhaps, but certainly not for John Williams. His intellectual property has been exploited, he gets nothing, and the consumer doesn’t get the real deal. The same would apply for any creator, whether it’s David Arnold, Debbie Wiseman, John Powell or Andrew Lloyd Webber.

We’ve dubbed this “music laundering” – a process where you could steal someone’s work, feed it into an AI, and then generate clean, “new” music, just as a money laundering operation might do with stolen money.

Particularly perversely, creators could be having their work stolen and used to train an AI without even being aware. There is a complete lack of transparency around the ingestion process for AIs and without this, it will be difficult, if not impossible in some cases, to hold bad actors to account.  

That has to change. We need far greater transparency and detailed record keeping about that process as a first step towards working together to create a system which safeguards human creativity and human connection through music, while fostering rewarding innovation. 

We welcome that the government listened to concerns on its original proposal regarding text and data mining, yet the threat of new legislation remains.

The next time a new piece of AI music goes viral, remember that we won’t know what content the AI that produced it was trained on, and we won’t know whether or not consent was given to use that content to train the AI. 

Indeed, in some cases we don’t even know if a piece of music is human-created or artificially generated. This poses existential challenges to musicians and all creators; it undermines the concept of copyright and the intellectual property (IP) framework that underpins our industry.

However, the ramifications are deeper and broader than the issues around copyright and illuminate some of the deficiencies in our current legal framework.

Image and personality rights are another hugely contentious area when it comes to AI, which can create an almost exact likeness of an individual creator. While countries like America have rules on image rights, there are no such specific protections in UK law, and that is another key area that needs to be addressed urgently.

We must ensure that AI technologies are developed in a way that supports human culture and artistry

Jamie Njoku-Goodwin

It’s clear that government needs to take action and ensure that the regulatory frameworks that govern the use of AI are fit for purpose, and don’t just create the conditions for AI to thrive, but also ensure necessary protections for creators are in place.

Misregulate AI and you could have songs and videos of artists seemingly endorsing campaigns they don’t agree with, or products with which they would not want to be connected.  

Going beyond music, it could get even more sinister. You might see a Martin Lewis lookalike and soundalike endorsing a questionable business investment for life savings, or a bogus Chris Whitty suggesting drinking bleach protects against Covid.

There are a number of specific actions government could and should take to support our world-leading music industry and the talent pipeline on which it depends - measures that would also benefit many other sectors. 

Firstly, government must put copyright and IP protection at the heart of its approach to AI, and commit categorically to there being no new copyright exceptions. The industry has a track record of developing effective licensing solutions for innovative new technologies, from the advent of radio and TV to video gaming and, currently, a host of apps and online platforms. AI is no different, and we are eager to work with the AI sector to find those solutions.

It’s crucial the training of AI respects the creativity of writers and artists and is based on consent and appropriate permissions. Some artists will be happy to agree to an AI being trained on their work. Some, however, will not want an AI ingesting their work at all – as is and should be their right in line with robust principles enshrined in law for many years. Government must recognise and protect that basic right.

Secondly, there must be an obligation for adequate record keeping. We need to know exactly what content an AI has been trained on. The same applies across the generative AI field. While it’s exciting to play around with large language models like ChatGPT, don’t forget that we have no idea what the inputs for these technologies were. We don’t know what trained them, whether the creators consented, or what biases may have been introduced unwittingly.

Thirdly, labelling. It’s important to know whether something has been generated by a computer, or if it is a real human product. This is true not just for music, but for all manner of content: advertisements, political campaign materials, reports, advice. This not only helps the creator, but also protects the consumer.

Fourthly, we must rapidly look at the issues around the protection of personality rights, and ensure that there are adequate protections in order to keep pace with the rapid development of AI in the creative space.

Other countries are ahead of us. Proposals have been put forward in the European Parliament and recently the Chinese regulator also set out new draft measures. Many of these were welcome – the need to respect IP, an obligation to exclude content that infringes copyright, a requirement to keep records of inputted data sets and clear labelling rules.

The UK government is currently drafting its own code of practice to give guidance on how AI firms should access copyrighted work. It’s imperative that this code of practice promotes the highest ethical standards of copyright and IP protection, and we protect the property rights of creators. 

It would be quite something if we were to end up in a situation where the UK’s regulatory framework is less transparent, and offers weaker protections for copyright and property rights, than that of the People’s Republic of China. If our standards are set at a low bar, this would undermine both the AI sector and the creative industries in the long term. The AI industry can only benefit from high standards, as it needs a successful creative sector to generate “new” products.

Ultimately, it’s vital that we follow key principles and maintain core values throughout this debate. 

We must ensure that AI technologies are developed in a way that supports human culture and artistry rather than eroding our creative endeavours and threatening the deeply personal creations we all cherish - whether that’s music, literature or other great works of art. 

The brilliant global Human Artistry Campaign has set out clear principles that must underpin this debate.  As the collective voice for the UK music industry, UK Music has already signed up to those principles. 

There is much to be excited about, and just as much to be rightly concerned about, given the ferocious pace of AI development. It’s absolutely critical we enshrine and uphold these key principles as we work our way through these issues.
