In recent weeks, a certain type of music has been inescapable: clips of famous artists covering songs by other artists, in combinations that sound like someone hit the randomizer. Drake covers singer-songwriter Colbie Caillat, Michael Jackson covers The Weeknd, and Pop Smoke covers Ice Spice's "In Ha Mood." The songs are generated by artificial intelligence, and the resulting videos have been viewed tens of millions of times.
Jered Chavez is a Florida college student who made the leap from playing with AI tools to having a viral hit in late March, when he posted a YouTube video of Drake, Kendrick Lamar, and Ye singing "Fukashigi no Karte," the theme song for an anime series. The video has been viewed more than 12 million times in the last month.
Since then, Chavez has churned out new clips at a constant rate, racking up millions of views across dozens of videos. His method: run a cappella versions of songs through AI models trained to sound just like the world's most famous musicians. The clips are cheap, quick, and easy to make, and TikTok loves them.
"I was surprised at how simple it was. It sounds good right out of the AI," Chavez says. "It's real. It's actually kind of frightening how easy it is to do these things."
Platforms have not yet removed Chavez's videos, but if major artists and labels figure out a way to stop him, the threat could come soon.
Music industry leaders are invoking copyright violations to get AI-generated songs pulled from streaming services, but legal experts warn the argument is not straightforward: it's not clear that the real Drake could stop the robot Drake on copyright grounds. Yet copyright remains the most effective way to get something you don't like removed from the internet.
Because of the strength of the copyright system, it's easy to use copyright to go after new creative content you think crosses a line, even when there's no strong legal basis for the claim.
The song "Heart on My Sleeve," posted by an anonymous TikToker going by the name Ghostwriter, went viral in the first week of April, amassing millions of streams before Spotify and YouTube took it down. YouTube removed it after a seemingly unforced error: the otherwise original song opened with a Metro Boomin producer tag, and Universal Music Group successfully had it removed by claiming the tag was an unauthorized sample. A copyright claim worked in that case, but only just. An AI Drake track called "Winter's Cold," along with other original songs, has also been removed from streaming platforms based on claims of copyright violations.
Original songs created using voice cloning don't actually copy anything the law protects.
UMG is pushing streaming platforms to crack down on the grounds that "Heart on My Sleeve," which imitates Drake and The Weeknd, violates copyright law. (Both artists are signed to UMG labels.) It's the same argument being made in other creative industries: Getty Images is suing the makers of Stable Diffusion, claiming Stability AI "unlawfully processed millions of protected images" while training its AI system, and other rights holders are moving in the same direction, saying they should be paid when their content is used to train bots.
The problem with removing songs like "Heart on My Sleeve" and "Winter's Cold" on copyright grounds is that they don't actually copy anything. The compositions of both songs are original; they appear to have been written by someone other than Drake and then fed into voice-cloning software. And an artist's voice, style, or flow is not protected under copyright law (in most cases). If an up-and-coming artist writes their own lyrics, records the vocals, and runs them through The Weeknd machine, there's no copy. Promoting the track as a real Weeknd song would be risky, but that would more likely be a trademark issue than a copyright one.
Current AI voice technology makes even the sampling question stickier. Unlike older tools that chopped up and rearranged existing recordings, many AI systems generate entirely new sounds that resemble the target voice. Garcia says that even if tiny bits of the original recording survived in the new song, it likely wouldn't be enough to constitute a copyright violation.
Meredith Rose, a senior policy adviser at Public Knowledge, says copyright law is a powerful tool, a hammer that can hit many nails. Proving that an AI Drake song infringes the real Drake's copyrights is not easy, but copyright is still how most labels, and much of the public, think about the problems associated with AI songs.
"Copyright may be a concern, but it is not as important as some of the larger, more existential issues, such as economic displacement, business model disruptions, or deepfakes," Rose says.
What if AI-generated Drake tracks flooded the market and diverted revenue from the actual Drake? What if AI-generated songs were simply bad, and people assumed Drake had lost his magic touch? What if a creator made AI Drake sing a white nationalist anthem? The issue quickly extends beyond copyright to Drake's identity and personhood. And unlike many of AI's unknowns, there are precedents for how someone's likeness can be used.
The right of publicity gives a person the ability to decide how their name or likeness is used for profit. Before getting into the AI implications, it's important to understand how individuals' options differ. Modern copyright law operates at the federal level, and DMCA takedowns offer a relatively quick and easy way to get material removed without hiring a lawyer or filing a suit. The right of publicity, sometimes confused with the right to privacy, is messier and exists only at the state level.
Only some states have such laws, though notably California and New York, home to the country's biggest entertainment industries, both do. People love to mock celebrities, and celebrities hate having their image exploited. The real Drake could very well sue robot Drake under the same right the real Vanna White used in 1992 to sue robot Vanna, in a case brought over a Samsung advertisement. Robot Vanna was a metallic robot wearing a colorful gown, a blonde mid-length wig, and jewelry, standing next to a letter board on a game show set. The Wheel of Fortune host's name never appeared in the ad.
Garcia says that even though the AI clone tracks are being promoted as "Drake" or "Kanye West" AI songs, publicity rights could still extend to clips that leave the artist's name off the promotional material: listeners will recognize the voice singing "Paparazzi" whether or not they're shown a picture of Ariana. It's only natural that the first voice-cloning test subjects are musicians known for their voices, which raises many questions about fair use, celebrity, and pop culture. As complicated as those questions are, versions of this debate have existed for decades. What about future AI voice clones of people who aren't known for their vocal talent at all?
"[Let's say] there's a Ron DeSantis rap track blaring out there. God help us. Would the arguments be the same? Maybe not," Rose says. "We're already in murky waters. That's when we get into the swamp."
Rose believes the fight over AI tools will involve more than just copyright; she expects it to prompt a reexamination of laws like the right of publicity within the next five to ten years. Right now, a victim of a deepfake or a fabricated recording may have a difficult time getting it removed, depending on where they live. The rapid accessibility of powerful AI could force legal systems to fill in those existing gaps, whether or not voice cloning is involved.
"Do we make things like the right of publicity a federal statute so that everyone in the United States has access to whatever tools we decide to build into this?" Rose asks. Right now, whether you're protected is a matter of where you live, for better or worse.
One problem with AI voice clone songs, unfortunately for their subjects, is that they're funny. Nobody asked to hear AI Joe Biden sing, "He say that I'm good enough, grabbin' my duh-duh-duh / Thinkin' 'bout shit that I shouldn't've," but it's become part of internet culture: the audio has been added to TikToks of people cleaning their bathrooms, making salads, and dancing.
Many listeners have also noticed that most of the viral AI songs and covers are created without their subjects' consent. Comment sections fill with variations of "This must be illegal" and "Waiting for this to get taken down." Even as the content grows increasingly absurd, there's an underlying grossness to voice clone songs nobody agreed to.
Experts who have dealt with other forms of nonconsensual online sharing worry that AI voice clones could become a major problem sooner rather than later. The current focus is on AI tools spoofing wealthy, famous individuals, but the technology could become a nightmare for average people, including those experiencing things like domestic violence.
"I anticipate we'll start seeing voice cloning being used to trick schools in order to gain access to kids, or to pretend that an ex-boyfriend has tried to contact you," says Adam Massey, a partner at C.A. Goldberg who specializes in technology-facilitated abuse like the nonconsensual distribution of intimate images (often called "revenge porn"). Massey suggests victims can send a cease-and-desist letter to the person or entity distributing the fabricated material, citing violations of publicity rights, fraud, or impersonation, though success depends on how responsive that person is. The subject may not even hold a copyright in the unauthorized content, since it's an artificial product created by an AI tool.
Massey says that, like the right of publicity, laws prohibiting the sharing of intimate images vary from state to state; only a few states have laws addressing deepfake or otherwise fabricated porn. NBC News reported last month that websites hosting nonconsensual explicit deepfakes operate openly. And although there is no federal law prohibiting nonconsensual intimate imagery, deepfake images are common enough that Google offers a DMCA-like system for victims to request takedowns.
Unsigned independent musicians, who lack the legal firepower of big artists and labels, will be left to deal with voice clones on their own. Massey says there's no standard way to report AI-generated content, and artists would need to show it was hurting their ability to profit from their own publicity rights.
Chavez is not only an AI-assisted mashup artist; he's a musician himself, recording and sharing songs in the "aesthetic" rap genre. He's not particularly concerned about people cloning deceased artists' voices; what worries him more is labels releasing posthumous albums without any input from the deceased artist.
Since his TikToks went viral, streams of his own music have exploded, and his YouTube subscriber count has doubled.
"So far, I'm very pleased with the positive responses," he says.
People keep asking him for new songs, even though he says he isn't passionate about "AI stuff." Chavez will keep making some here and there, but he's getting bored of it.