When you realize your favorite new song was written and performed by ... AI
Music fans responded with disbelief this week when the viral song "Heart on My Sleeve" was released on streaming and social media platforms.
The hosts of the popular music-related YouTube channel LawTWINZ were among the many who weighed in, discussing whether the track, which uses artificial intelligence to simulate the music of pop stars Drake and The Weeknd, even surpasses the real artists' talents.
Advances in AI have gotten to the point where the technology can quickly create new songs like "Heart on My Sleeve" that sound like they're the work of real artists.
Recent examples, which include a faux song that sounds a lot like something the British alt-rock band Oasis would put out, hint at AI's bold, creative possibilities and its ethical and legal limitations.
Now, artists, lawyers and other industry players are trying to figure out how the technology can be used responsibly.
'The cat is not going back in the bag'
The popularity and revenue-earning potential of AI-generated songs have understandably put music industry gatekeepers on guard.
Drake and The Weeknd label owner Universal Music Group invoked copyright violation to get the platforms to take "Heart on My Sleeve" down this week.
"The training of generative AI using our artists' music (which represents both a breach of our agreements and a violation of copyright law) as well as the availability of infringing content created with generative AI on DSPs [digital service providers], begs the question as to which side of history all stakeholders in the music ecosystem want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation," said the company in a statement shared with NPR.
This wasn't the first time the music corporation flexed its legal muscles, and it won't be the last; earlier this month, it had an AI track based on the music of Eminem taken down. The song featured lyrics like, "Cats, cats, cats, always on the prowl / They're sneaky and sly with their eyes on the goal."
"The cat is not going back in the bag," said Stanford University associate professor Ge Wang, of the growing popularity of AI-generated music online. Wang, who teaches a class on AI and music, said as the technology becomes more widespread, people can no longer afford to think of it as the stuff of science fiction.
"There's something that we couldn't do now that we can," Wang said. "And along with it is a ton of legal, ethical and artistic considerations that we didn't have to think about before in a practical sense. But now we do."
Music/AI litigation in its infancy
Now, the music industry is trying to play catch-up.
From a legal standpoint, music and AI litigation is in its infancy. "It's an emerging area," said entertainment lawyer Craig Averill. "The courts have not weighed in."
The U.S. Copyright Office has, however, issued decisions on the copyrightability of AI-generated works.
"The author has to be a human as the law stands," Averill said. "It can't be completely computer-generated."
But Averill said dizzying questions remain about the amount of human intervention needed to make AI-generated musical works copyrightable. And if the face of the work isn't a human, then who's the copyright holder?
"If you come up with a composition and then you have an animated character that's front-facing for it, and you don't have to really pay that entity any royalties, what does that look like?" Averill said. "We're not there yet."
Ethical and aesthetic issues abound
Some artists are skeptical that the law will ever catch up with the technology, given the speed at which it is developing.
"It's completely broken logic that legislation or litigation is going to protect the arts," said Grammy-nominated electronic musician and software developer BT. "It's not gonna happen. It [the technology] is evolving too quickly."
BT said artists — rather than lawyers — should create guardrails around how AI is used for music production and sharing. Like all of the musicians interviewed for this story, BT said he sees great potential in AI as a resource — as long as artists are paid properly.
And he also said there are enormous ethical issues to contend with.
One example is when an AI tool generates lyrics in an artist's style that the actual artist would never sing. The unlikely Eminem song about cats is a case in point — albeit a relatively harmless one. The technology could create lyrics that are much more controversial and potentially damaging to a singer's reputation.
"Where we're talking about the creation of vocals, it could be used to say something that is polar opposite to that person's belief system," BT said.
Then there's the question of aesthetics.
"One danger is the lowering of artistic standards to a point where fake becomes real and mediocrity rules," said singer-songwriter and voice actor Dan Navarro. "Then commercial music becomes like brown-food-product: able to sustain life, but never truly satisfy."
To keep up with the technological advances, dozens of entertainment industry representatives recently joined forces to create the Human Artistry Campaign. Navarro is part of this new advocacy group.
"The Human Artistry Campaign's stated goal is to underscore the unique value of human artistry and human creation, especially as technology and opportunism create a culture for conflict and misuse and even abuse," Navarro said. "I'd like to see a set of agreed principles with legal teeth so that artists, the music industry, streaming services, and audiences can understand what is — and is not — allowed."
Audio and digital stories edited by Ciera Crawford.
Copyright 2023 NPR. To see more, visit https://www.npr.org.