Heart on My Sleeve: Dilemmas with AI-Generated Music
In April 2023, an apparent collaboration between the highly acclaimed artists Drake and The Weeknd titled “Heart on My Sleeve” gained millions of views on various social media platforms and over half a million streams on Spotify. The song quickly became a hit as rap and R&B fans streamed it repeatedly and eagerly added it to their playlists. But this collaboration was unorthodox in more ways than one: the song was created by an anonymous TikTok user named Ghostwriter using generative AI, and the vocals were unauthorized generated replicas, or “deepfakes,” of the artists’ voices.
This is just one of many examples of AI-generated songs that mimic the vocals of real artists. With AI technology evolving faster than current copyright law, there are no established safeguards to protect artists from “deepfake music.” Recognizing this gap, the U.S. Copyright Office recently released a report addressing the urgent challenges posed by deepfake music. As this technology continues to advance, legal frameworks must adapt to ensure copyright laws continue to serve their intended purpose of protecting creators’ rights.
Begin Again: Current Outdated Copyright Legal Frameworks
Currently, copyright law in the US is primarily governed by federal statute. The Copyright Act of 1976 provides protection for “works of authorship fixed in any tangible medium of expression.” The Act gives copyright owners an array of exclusive rights; however, “digital replicas that are produced by ingesting copies of preexisting copyrighted works, or by altering them—such as… simulating their voice singing the lyrics of a musical work—may implicate those exclusive rights.” In 1998, copyright law was amended through the Digital Millennium Copyright Act (DMCA) to “address important parts of the relationship between copyright and the internet.” The DMCA established protections for online service providers whose users engaged in copyright infringement, and it encouraged copyright owners to give greater access to their works in digital formats by providing legal protections against unauthorized access. There, a gap in the law was recognized, and US copyright law was amended to address a technological advancement. The current gap between copyright law and AI should be addressed in the same way.
Past court cases have also acknowledged the potential harm posed by quickly evolving technology. In Arista Records LLC v. Lime Group LLC, a group of record companies sued Lime Group LLC, operator of a peer-to-peer file-sharing system that allowed users to download and share copyrighted digital music files without authorization from the copyright owners, for copyright infringement. The court granted summary judgment to the plaintiff record companies. Streaming platforms like Spotify and Apple Music soon adopted similar digital distribution models, but through subscriptions that legally grant listeners access to the music. Much like the response to peer-to-peer sharing, legal frameworks and solutions must be developed to meet the technological advancements of AI-generated music.
Beyond federal copyright statutes, there are also state-level legal frameworks that could play a role in addressing AI-generated deepfakes. For example, the right of publicity, defined as “an intellectual property right that protects against the misappropriation of a person’s name, likeness, or other indicia of personal identity—such as nickname, pseudonym, voice, signature, likeness, or photograph—for commercial benefit,” is largely protected by state common and statutory law. Currently, over 30 states recognize the right of publicity in some form. This right may offer the best defense against AI-generated deepfakes because of its intent to protect aspects of an individual’s identity. However, because these laws vary significantly from state to state, they do not yet provide a consistent or comprehensive legal solution to the issue of AI-generated voice imitations.
Formation: Limited Options for Removing Deepfake Music
The lack of federal legislation regarding AI-generated deepfake music leaves artists with few effective options to combat the production and spread of these unauthorized imitations. Three main strategies are currently employed to address the issue: music-industry stakeholders directly requesting that digital streaming platforms (DSPs) like Spotify and Apple Music remove songs; artists personally asking their fans to stop streaming deepfake music; and Digital Millennium Copyright Act (DMCA) takedowns.
A DMCA takedown is “a tool for copyright holders to get user-uploaded material that infringes their copyrights taken down off of websites and other internet sites.” However, because deepfakes do not constitute copyright infringement under current legal frameworks, these takedowns are often unsuccessful. In the case of the viral deepfake track “Heart on My Sleeve,” a DMCA takedown succeeded only because the song contained a distinctive producer tag: a short audio clip at the beginning of a song, usually containing a producer’s name or catchphrase, that identifies the producer’s work and claims ownership over the track. Had the song not included this tag, the takedown would likely have failed. These limited and often unreliable methods for removing deepfake music from digital platforms underscore the need to update federal legislation.
Broken Clocks: New Proposals for Federalized Legislation
One widely proposed solution is a federal right of publicity to protect against the misappropriation of a person’s name, image, and likeness. A federalized right of publicity would resolve the inconsistencies among state right of publicity laws. Although plausible, this solution may be less closely tailored to AI-generated deepfakes than a current congressional proposal.
In January 2024, the No Artificial Intelligence Fake Replicas And Unauthorized Duplications Act (“No AI FRAUD Act”) was introduced. The Act would establish voice and likeness as an intellectual property right and offer protection against “the use of unauthorized digital voice replicas and digital depictions that readily identify an individual.” The Act defines a digital depiction as “a replica, imitation, or approximation of the likeness of an individual that is created or altered in whole or in part using digital technology.” It defines a digital voice replica as “an audio rendering that is created or altered in whole or in part using digital technology and is fixed in a sound recording or audiovisual work which includes replications, imitations, or approximations of an individual that the individual did not actually perform.”
The Act would provide that an agreement authorizing the use of a digital depiction or digital voice replica is valid only if “the applicable individual was represented by counsel in the transaction and the agreement was in writing.” If the individual is a minor, the agreement must be approved by a court in accordance with state law. Alternatively, authorization could be governed by a collective bargaining agreement. This proposal would give artists legal standing to combat unauthorized voice imitations in AI-generated music; they would no longer have to rely on direct appeals to digital streaming platforms or fans to stop the music from being streamed. The Act also includes a First Amendment defense, providing several factors for courts to weigh the public interest against the intellectual property interest in the voice or likeness.
Smooth Criminal: Preventing Widespread Deepfakes
Although advancements in technology may promote and encourage creative expression, legal action may be necessary when that expression involves directly replicating an artist’s voice and likeness for commercial use. Rapidly advancing technology, coupled with outdated federal regulations and inconsistent state laws, currently gives artists very little legal standing to combat AI-generated deepfake works. Without legal standing to remove songs like “Heart on My Sleeve” from streaming platforms, fake songs allegedly created by our favorite artists may become widely popular and ultimately be used for commercial gain or, even worse, to damage the name, image, or likeness of the artist. A federal law like the No AI FRAUD Act must be adopted before the music industry suffers at the hands of generative AI technology.