Consumer advocates celebrated Gov. Gavin Newsom’s (D) enactment of a bill that will make it harder for companies to use AI-powered algorithms to collectively set prices, amid what civil society groups viewed as a less-than-ideal legislative haul this year in California.
“AB 325 updates California’s antitrust laws to address algorithmic collusion. It bans pricing algorithms trained on nonpublic competitor data and makes it easier to bring antitrust claims under the Cartwright Act,” the Consumer Federation of America wrote in an Oct. 14 release.
“CFA has supported and testified on proposed bills like this around the country, most recently in Oregon, Colorado, and Maryland,” the group noted.
Leading up to the Oct. 12 deadline for Newsom to sign or veto bills cleared by the legislature, consumer advocates were facing a mixed bag, with some, like the Center for Democracy and Technology, showing little enthusiasm for the signing of SB 53 -- a “light touch” transparency measure. The governor also went on to veto SB 7, the CDT-supported No Robo Bosses Act that was fiercely opposed by the tech industry.
Other tech industry-opposed bills that were signed -- such as AB 853, which would require AI producers with audiences over a certain threshold to provide an AI detection tool -- went unmentioned in the CFA release.
The bill against algorithmic price fixing was arguably the biggest win for consumer advocates, who also highlighted the enactment of AB 489, legislation that would apply existing laws against impersonating medical professionals to developers of AI chatbots that claim to offer therapy.
While AB 489 was ultimately uncontroversial -- the U.S. Chamber of Commerce was among those expressing support -- AB 325 was opposed by a string of tech industry associations, including TechNet, the Software and Information Industry Association and the Chamber of Progress, along with the California Chamber of Commerce and other state business groups.
At the federal level, the Computer and Communications Industry Association has led the charge against policies to limit algorithmic price setting. The group dismissed the idea of any collusion threat in comments to the Biden administration’s Department of Justice, while noting the importance of such use cases to maintaining a marketplace for AI.
Algorithmic price setting more broadly has also gained the attention of lawmakers like Sen. Mark Warner (D-VA) in connection with concerns about the use of personal information leading to potential discrimination, an issue that was recently ruled on by a U.S. judge.
But federal legislation, such as the Preventing Algorithmic Facilitation of Rental Housing Cartels Act offered by Rep. Becca Balint (D-VT), has yet to garner bipartisan support.
CFA underscored the importance of state action as it also cheered California’s enactment of four other tech-related bills.
SB 566 would require “web browsers to allow consumers to send a ‘universal’ opt-out signal, which makes it easy and clear for consumers to exercise their right for their data not to be sold,” according to the release. “This helps actualize privacy rights and make it more workable for the average consumer.”
Similarly, the group said SB 361 would expand “reporting and transparency requirements for data brokers operating in California -- an incremental but important improvement to address the significant danger data brokers are putting people in each day.”
Additionally, AB 621 “[s]trengthens the private right of action for victims of deepfake non-consensual intimate imagery, allowing people to sue anyone who creates, digitally alters, or distributes some sexually explicit representation of them,” CFA said, noting the group “recently led a complaint about Elon Musk’s ‘Grok,’ a program that was built to create this type of content.”
And finally, CFA praised AB 56 as a “first-of-its-kind requirement for warning labels on social media platforms, an idea largely popularized by former Surgeon General Vivek Murthy.”
“This is exactly why states need the freedom to step in and protect people where Congress has repeatedly failed,” said Ben Winters, CFA’s director of AI and privacy. “These laws are examples of straightforward bills that will provide some concrete protections against dangerous and manipulative AI systems.”
The release said, “CFA looks forward to continuing to work with states across the country as well as Congress to expand on these significant improvements for consumers in the coming legislative cycle.”
