How the BPI action against Jammable highlights the conflict between rights-holders and generative AI

The UK’s recorded music trade association, the BPI, has publicly threatened legal action against AI voice cloning platform Jammable, previously known as Voicify (the company did not respond to a request for comment from Music Week).

It comes as the music industry continues to grapple with the rapid growth of generative AI technology. Together with other recent AI copyright infringement cases, this latest action might signal the beginning of the end of the ‘Wild West’ era of unlicensed AI music generation.

Here, Nick Eziefula (partner), Andrew Wilson-Bushell (associate) and Catherine Clover (trainee solicitor), music, AI and copyright lawyers at specialist media and entertainment law firm Simkins, look at the implications of the BPI action…

Jammable uses voice models to allow its customers to strip out vocals from recorded music and insert AI-generated vocals mimicking the sound of an artist’s voice. It has built up an extensive library of thousands of voice models, including models of Rihanna, Drake and Taylor Swift. The BPI argues that building those models involves using recordings owned by its members as training data and that, without permission from those owners, Jammable is infringing copyright.

The BPI’s case is persuasive. For AI models to generate output, they must be trained on extensive data sets, and it is hard to see how AI cloning of well-known artists’ voices can be achieved without using recordings of those voices. In the UK, the sound of a voice cannot be copyright protected, but a recording of that voice might well be and, in the case of well-known artists, the recordings are highly likely to be copyright works owned by the artist or their label. Broadly speaking, use of a copyright work without the owner’s permission will be copyright infringement unless one of the relevant exceptions or defences applies. The question is whether any of those might be relevant to the use of recordings as training data in this way.

Although UK law does include an exception for text and data mining, it is currently limited in scope and is unlikely to permit broad use of this kind. Exceptions for parody, caricature and pastiche may be more relevant to this scenario, but these, like the other “fair dealing” exceptions (such as those for criticism, review or reporting current events), only apply to limited use for particular purposes. Whilst we do not yet have a definitive case applying these principles to the AI training scenario, semi-analogous cases in other contexts indicate that courts do not apply these exceptions lightly, or often.

The legal position is complicated by the fact that international laws in this area are not fully aligned. In the EU, similar copyright exceptions apply, although the EU approach to text and data mining, whilst potentially broader, involves an opt-out mechanism, enabling copyright owners to determine whether the exception is even available in relation to their works.

The US law concept of “fair use” is typically more permissive than the UK’s “fair dealing”. As yet there is no concrete guidance or case law applying it to the use of AI training data, but that may soon change: last year, the New York Times sued OpenAI and Microsoft in New York for copyright infringement, alleging that the newspaper’s content was used to train their AI tools. The case is ongoing and the outcome will be significant.

Similarly, in the UK, Getty Images took legal action against Stability AI for allegedly using millions of copyright images to train its AI model, Stable Diffusion. Getty has accused Stability AI not just of copyright infringement but also breach of trademark and database rights and passing-off (a form of claim based on misappropriation of a brand identity). 

While we await decisions in these cases, some direction has been given in Europe by the passing of the EU AI Act. The new rules will require “general purpose AI models” to meet transparency requirements and to comply with EU copyright law. For example, generative AI platforms will need to publish a summary of information about the content used to train their models, which is likely to deter unauthorised use and to facilitate copyright owners’ efforts to pursue infringers.

As far as we are aware, no formal legal proceedings had been issued by the BPI as at the date of this article but, by issuing press releases about its intended action, the BPI has effectively taken the matter straight to the court of public opinion. This confident approach seems to have had an impact: Voicify has rebranded as Jammable, and big-name voice models such as the one relating to Rihanna appear to have been removed from the platform.

The music industry has long had to adapt to changes in technology. Generative AI is the latest step-change, but lessons learned from the last one are pertinent. The widespread adoption of file-sharing technology was disruptive almost to the point of destruction. The recorded music business was eventually restored through a multi-faceted approach: clamping down on unauthorised use through legal action (against platforms such as the infamous Pirate Bay); legislative change; and the development and licensing of lawful services that offer consumers a quality customer experience. In summary: enforce, enact, embrace, evolve.  

We are beginning to see examples of such enforcement and enactment in relation to generative AI. We can expect soon to see licensing regimes that embrace the change, as business models evolve.
