
Are Those Jay-Z and Kendrick Lamar AI-Generated Verses Legal? An IP Lawyer Weighs In

Published Fri, April 14, 2023 at 4:32 PM EDT

By now you've probably heard the deepfake AI-generated verses mimicking rappers like Drake, Kendrick Lamar, Nas, and Jay-Z.

While the technology was shaky at first, these days it's so eerily good it's almost impossible to even tell you're listening to a fake — especially when the creator has the cadence and delivery of your favorite rapper down pat.

As the world continues to immerse itself in AI — from sophisticated language models like ChatGPT and Bing (or Sydney, if you ask it the right way) to AI image generators and more — the lightning-fast pace of the technology seems to be outrunning society's ability to adapt. Deepfakes can be entertaining — we all loved Kendrick Lamar's "The Heart Part 5" video — but they can also be scary, even dangerous. They also raise important questions about copyright, IP, and monetization.

As Axios pointed out in February, right now generative AI is a legal minefield. "New generative AI systems like ChatGPT and Dall-E raise a host of novel questions for a legal system that always imagined people, rather than machines, as the creators of content," the outlet contends.

Never mind dystopian AI-centered fiction like 2001: A Space Odyssey, Terminator, or Black Mirror. In real time, the question is how far AI can go, and how quickly laws can be put in place to address the issues it raises.

AI-generated music has been a source of worry for Jay-Z's longtime engineer and producer, Young Guru. His apprehension was further fueled when an eerily authentic-sounding Jay-Z verse, created by a group called AllttA, emerged and was nearly impossible to discern as fake.

"I’ve been trying to tell everyone that this is where we are now with AI," Guru wrote on IG on March 31, earning responses from Hip-Hop artists and producers like Royce Da 5'9, 9th Wonder, and Rah Digga. "For some reason, this one got everyone’s attention. So what do we do?"

In April, Tom Graham, CEO of generative AI pioneer Metaphysic, made history as the first person to submit his AI likeness for copyright registration with the U.S. Copyright Office. Even with that kind of move being made, the law still seems to be catching up to AI developments, and until it does, what does it mean for artists who have their voices used for deepfake verses? What does it mean for consumers who think they're listening to a new Black Thought verse only to find out it's actually an unofficial, unsanctioned creation by "XYZ Rando" making deepfakes from his home computer in Colby, Kansas?

"On one hand, I’m well aware that you can’t stop technology," Guru reasoned in his post. "Once the genie is out of the box you can put him back in. On the other hand, we have to protect the rights of the artist. Not only artists but everyone in society. People should not be able to take your name, image, and likeness without permission. We have to add a voice to this law."

In a recent turn of events, on April 12, The Financial Times reported that Universal Music Group, which controls about a third of the global music market, had demanded that Spotify, Apple Music, Tidal, and other streaming platforms block AI versions of its artists and its copyrighted songs. “We will not hesitate to take steps to protect our rights and those of our artists,” UMG wrote to online platforms in March, in emails viewed by the FT. However, even that development leaves open questions about deepfake verses with original lyrics and production.

As Hip-Hop, a genre that's always been at the forefront of cutting-edge technology, deals with the new wild, wild west of AI and the current legal landscape, we talked to an IP lawyer to get some answers. Antoine Wade, an IP and entertainment attorney at The Wade Firm who specializes in helping clients monetize, protect, and maximize their brand recognition in the marketplace, admits that as of now, things are tricky for artists, especially those who are up-and-coming.

You've heard a lot of the deepfake verses from artists like Jay-Z, and how astoundingly accurate they are. So first things first — are they legal?

As of now, they’re not illegal, but because of the chaos and manipulation they can cause, there may be remedies available to the person whose rights are being threatened.

Right now, as it stands, is there anything in place to protect artists in terms of copyright?

From a legal perspective, I just want to give an overview of the three different types of intellectual property. You have trademarks, which cover and protect slogans, phrases, logos, and things like that. Then you have copyrights, which protect the artistic side of things — literature and, in the case we're speaking about, music. And then there's patent law, which is for scientific inventions. But even though copyright law is in place to protect literature, music, and things of that nature, there's nothing under it that protects an actual voice. As it stands, there's no way to protect a vocal style or someone's voice. In this case, Jay-Z wouldn't be able to protect his voice under trademark law, copyright law, or patent law.

Wow. That's obviously deeply concerning if you're an artist.

That's generally speaking. And I don't want to get too complex, but I think the conversation calls for it. With Jay-Z, we don't have to see his name listed on a track, we don't have to see a video of Jay-Z. Once we hear the voice of Jay-Z, we automatically know. And it's not just Jay-Z, it's people like Michael Jackson — when we hear Michael Jackson, we know it's Mike. And there are so many other artists whose voices we can recognize. There's another branch of intellectual property that falls outside of the three main areas — trademark, copyright, and patent law — called "name, image and likeness." And under "name, image and likeness," there's a concept called "rights of publicity" which protects a person's image. So, that's video, photography, a signature... And surprisingly, one of the things that falls under the "rights of publicity" is the right to protect your voice. But there's a caveat: it only extends to people whose voices are recognizable by the public. So in this situation, Jay-Z could actually file a claim under "rights of publicity," but that only extends to people who have what I call a secondary meaning throughout the entire world. Now, you ask the question, what does that mean for artists who don't have recognition around the world like Jay-Z or Rihanna or Beyoncé?

So you're saying technically artists who have recognizable voices do have some legal recourse under "rights of publicity"?

Some but not all, because everything is on a case-by-case basis. Let's just say you're an up-and-coming underground artist, and I come across your YouTube channel, and I'm like, 'Wow, I love [her] voice. I'm gonna take her voice and code it into some lyrics that I made up.' So I'm actually creating my own lyrics, but I'm using your voice. You're not known to the world, you're still up-and-coming, and your voice is not recognizable to the world. Where does the protection come in for you? It's kind of an unfair situation because it doesn't protect every artist, only certain artists. And even with an artist like Jay-Z, the argument still has to be strong. If someone disguises it as creative expression, or uses a disclaimer that says "what you're listening to is not the real thing," that can be a loophole.

As an attorney, what would you like to see happen to protect artists?

It's really two things, right? You put something in place where, if you're going to use artificial intelligence, there have to be some types of restrictions. They have to be required to put disclaimers on their videos that say what you're listening to is not Jay-Z, or is not Kanye West, this is only for entertainment purposes. And then, there should be something in place that says those artificial intelligence videos, or the songs they create, cannot be monetized. And in the event that you don't provide a disclaimer, or you attempt to monetize those types of videos, you should be subject to a hefty fine.

We've focused really heavily on the creation of songs and verses using AI, but what about AI voice deepfakes used to make statements that are falsely attributed to artists?

Yes, there are legal consequences for such acts. The penalties vary from state to state, but such acts are considered a violation of human rights, rights of privacy, and personal data. I expect that states will enact new legislation to prevent the public from being manipulated by AI-generated deepfakes. I also expect Congress to get involved as the dangers associated with AI and deepfakes continue to evolve.
