The No AI Fraud Act: Preventing an Artificial Dystopia?
Your Voice Matters

We resurrected an artist’s mom from the dead so he could record a duet with her.

He gave us her phone, and we cloned a voice model from her voice memo files. One file contained a half-finished song she was writing into her phone. I am sure she never considered that it would be heard by anybody, let alone finished and fully re-recorded in her voice after her death. Yes, we had all kinds of ethical concerns while sifting through her (very personal) voice memos.

Does it feel like we are living in a Black Mirror episode? Yeah… we know about the Drake song. We talked with an irritated music professor who said virtually all of his students chose to write papers on the AI Drake song this year. The unsettling reality is that this isn't just a plot from a dystopian TV show.

So, What Exactly Is This Act?

Introduced on January 10, 2024, by a pair of Representatives (one Republican, one Democrat – talk about a duet), the No AI FRAUD Act is designed to shield individuals from AI tech that can craft realistic digital replicas of their voices and likenesses without consent. This move is a response to the surge in AI impersonations affecting everyone from musicians and actors to sports stars and even high school students.

Key Provisions of the Act

The No AI FRAUD Act includes several crucial measures to protect against unauthorized AI-generated replicas:

  • Likeness and Voice Protection: The bill affirms that individuals have the right to control the use of their likeness and voice.
  • Empowerment to Enforce Rights: It empowers individuals to take legal action against those who create or distribute AI fakes without permission, including seeking damages and profits from unauthorized use.
  • Balancing Rights and Free Speech: The Act ensures that while protecting rights, it doesn't hinder free speech and innovation.

Protecting Artists

Congratulations, Ghostwriter, you made it into the act proposal!

SEC. 2. Findings.
(1) On or around April 4, 2023, AI technology was used to create the song titled “Heart on My Sleeve,” emulating the voices of recording artists Drake and The Weeknd. It reportedly received more than 11 million views.

In an era where AI can convincingly create deepfakes of artists performing songs they never recorded, the Act provides a federal right to one's likeness and voice. This empowers artists to take legal action against those who misuse their identity with AI-generated content, addressing the critical issue of artist rights and AI.

Challenges for AI Music Generators

AI music generation platforms may find themselves under increased scrutiny. The Act could impact how these tools are developed and marketed, pushing for proper licenses and permissions when mimicking specific artists' styles or voices. This shift aligns with the principles of ethical AI in music and ethical licensing for AI.

Record Labels Are Doing a Happy Dance

You bet record labels are loving this. It gives them more muscle to protect their artists and recordings from digital doppelgangers. It could apply to some generative music models like Suno, but it's important to note that the Act doesn't directly address training generative AI models on copyrighted music recordings. Look to the bill from Rep. Adam Schiff for that.

Potential for Innovation and Collaboration

While aiming to prevent unauthorized use of identities, the Act could also spark more structured collaborations between AI developers and the music industry. Clear legal boundaries might encourage responsible and creative uses of AI, potentially leading to new forms of artistic expression and innovative music production techniques.

Reducing Red Tape Without Compromising Sound Ethics

Sound Ethics recognizes the delicate balance between regulation and innovation, and proposes the following solutions to address these challenges:

  1. Curated Ethical Data Repositories: Establishing pre-vetted, attribution-ready datasets to streamline compliance and accelerate ethical AI development.
  2. Transparency Reporting Framework: Implementing a standardized system that ensures proper attribution while safeguarding intellectual property, thus supporting both oversight and competitiveness.
  3. Metadata Attribution Standards: Developing industry-wide protocols for certifying data provenance, enhancing traceability without compromising efficiency.
  4. AI Ethics Education Initiative: Our Labs to Legends program equips future data scientists with the knowledge to navigate the intersection of AI, ethics, and regulatory compliance in the music industry.

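To make item 3 concrete, a metadata attribution standard could take the shape of a machine-readable provenance record attached to each training asset. The sketch below is purely illustrative: the field names, license labels, and eligibility rule are our own assumptions, not part of any published standard or pending legislation.

```python
from dataclasses import dataclass

@dataclass
class ProvenanceRecord:
    """Hypothetical per-track provenance metadata (illustrative only)."""
    track_id: str
    rights_holder: str
    license_type: str       # e.g. "direct-license", "public-domain", "unknown"
    consent_on_file: bool   # rights holder approved AI training use
    source_dataset: str

    def is_training_eligible(self) -> bool:
        # A track is usable only with documented consent and a known license.
        return self.consent_on_file and self.license_type != "unknown"

# Usage: filter a candidate dataset down to attribution-ready tracks.
records = [
    ProvenanceRecord("trk-001", "Artist A", "direct-license", True, "set-1"),
    ProvenanceRecord("trk-002", "Artist B", "unknown", False, "set-1"),
]
eligible = [r.track_id for r in records if r.is_training_eligible()]
print(eligible)  # ['trk-001']
```

A record like this is what would make a "curated ethical data repository" auditable: anyone inspecting the dataset can trace each track back to a rights holder and a documented consent decision.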
Cash for Clones: The Price of Deepfake Fraud

The penalties laid out in the Act—starting at $5,000 per violation for deepfakes and $50,000 for creating unauthorized replica tools—serve as a strong deterrent. This could lead to more cautious approaches in AI music development, raising the value of authorized AI-assisted music projects.

Industry Response

The music industry has largely welcomed the No AI FRAUD Act. Organizations like the Recording Industry Association of America (RIAA) and figures such as Universal Music Group CEO Sir Lucian Grainge have shown strong support. This suggests a collective agreement that the Act is a necessary step in adapting copyright and publicity rights for the AI age.

The Need for Federal Legislation

While some states have their own laws addressing AI impersonation (such as Tennessee's ELVIS Act), these laws are often inconsistent. The No AI FRAUD Act aims to provide a uniform federal solution with baseline protections for all Americans, crucial as AI technologies evolve and become more accessible.

Support and Criticism

The No AI FRAUD Act has garnered both support and criticism:

  • Support: Advocates argue the Act is essential to protect individual rights and prevent AI technology from being used for harassment or abuse. It safeguards intellectual property and ensures AI advancements don't undermine artistic expression and liberties.

  • Criticism: Critics worry the Act's broad definitions could lead to unintended consequences, such as increased litigation risks for small businesses and developers. The transferability of likeness rights may also complicate existing challenges with publicity rights, potentially leading to unfair licensing agreements.

Conclusions

The No AI FRAUD Act could be a step forward in protecting us from a Black Mirror dystopia. It establishes that we must protect ourselves from the misuse of AI technologies. While addressing AI-generated impersonations, it highlights the need to protect rights without slowing innovation. The language, however, is broad. As AI shapes our digital landscape, legislation like this will be crucial to ensure technological advancements benefit society without compromising fundamental rights. Ongoing dialogue and refinement will be key to addressing critics' concerns and ensuring the Act effectively balances protection and innovation.

In this evolving landscape, ethical AI in music and music dataset licensing will become even more critical, ensuring that artists' rights are upheld while fostering a collaborative environment where AI technologies can thrive responsibly.

View/download the proposed act:

Download the Act