Lawmakers crafting legislation to address unauthorized digital replicas -- an issue that has grown as artificial intelligence technology becomes more broadly available -- should consider making federal assistance available to artists in legal disputes with entities claiming rights to their likenesses or creative works, according to BSA | The Software Alliance.
“To protect artists from those who might unfairly license away their rights on unfavorable terms, any license or assignment must be in writing, and the artist must be formally notified of his/her right to representation by counsel,” the group says in a document it shared with Inside AI Policy that is to be more broadly released June 12.
“We also urge Congress and the Copyright Office to explore the establishment of a system of pro bono or subsidized legal aid services to represent those artists in these licensing negotiations,” it reads.
The recommendation is a novel one among a collection of principles that BSA says goes further than other tech industry groups' proposals in offering compromises for crafting legislation with goals similar to those of the NO FAKES Act.
Issued as a discussion draft in September by Sens. Chris Coons (D-DE) and Thom Tillis (R-NC) -- the chair and ranking member of the Judiciary Committee’s panel on intellectual property -- the NO FAKES Act would generally give individuals the right to license their name, image and likeness as their intellectual property, and thereby the ability to hold platforms liable for knowingly hosting unauthorized digital replicas.
Sens. Marsha Blackburn (R-TN), Amy Klobuchar (D-MN) and Richard Blumenthal (D-CT) have committed to cosponsoring the effort when it is formally introduced -- Coons and Tillis have indicated plans to introduce the bill in May -- after the subcommittee considers incorporating the “thousands” of suggested revisions it received on the draft.
BSA’s principles steer clear of the words “intellectual property,” an exempted category under Section 230 of the Communications Decency Act, the law that makes many tech platforms -- as providers of “interactive computer services” -- immune from lawsuits over third-party content they publish.
Instead, the group’s principles say, “Beyond protections for an artist’s name, image, likeness, or voice, Congress should consider how to protect the interests of graphic and literary artists from AI-generated forgeries or other digital replicas that are fraudulently passed off as the work of the true artist.”
Organizations like the Association of Research Libraries have previously joined tech industry advocates in arguing that copyright protections should not apply to likenesses under efforts such as the NO FAKES Act and the No AI FRAUD Act in the House.
BSA’s principles also include “creat[ing] incentives to quickly remove unauthorized replicas online” under the Digital Millennium Copyright Act.
“To protect artists and remove unauthorized digital replicas as quickly as possible, service providers should be encouraged to remove unauthorized replicas expeditiously, consistent with the requirements of [the DMCA],” the document reads, noting that law’s “provisions should apply without regard to whether the unauthorized production of an artist’s name, image, likeness, or voice infringes copyright.”
BSA’s senior vice president of global policy, Aaron Cooper, told Inside AI Policy the group “has been working to generate constructive solutions for AI policy, building on a body of work of over half a decade.”
“The issue of digital replication is important to enterprise software companies because there are legitimate concerns being raised by artists,” he said. “This document is meant to highlight the real ways in which Congress can address those concerns. Addressing real concerns posed by AI helps build trust and adoption in technology, and more widely spread the benefits of technology.”