Inside AI Policy

July 1, 2024

AI Daily News

Consumer groups push CFPB to issue guidance on less discriminatory algorithms

By Mariam Baksh / June 27, 2024

Leading consumer advocacy groups are escalating their calls for the Consumer Financial Protection Bureau to clarify its regulatory expectation that lenders must use less discriminatory alternatives to algorithms that can result in reduced or unfair access to credit for Black Americans and other underrepresented populations.

Under the Equal Credit Opportunity Act, the CFPB enforces what is referred to as the disparate impact standard: even if an entity is not evidently trying to discriminate -- that would be called disparate treatment -- it can be held liable if the results of its process vary significantly based on protected characteristics such as race or gender.

To avoid that liability, lenders are required to seek out less discriminatory alternatives, or LDAs, that still effectively perform a legitimate business function.
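As an illustration of what a disparate impact review can look like in practice -- a minimal sketch with hypothetical data, not the bureau's actual methodology -- one common approach compares approval rates across groups. The "four-fifths" threshold below is a convention from EEOC employment guidance, used here only as a rough benchmark:

```python
# Illustrative sketch, not CFPB's actual test: disparate impact measured
# as the ratio of approval rates between a protected group ("B") and the
# most favored group ("A"). All outcomes below are hypothetical.
decisions = [
    # (group, approved)
    ("A", True), ("A", True), ("A", True), ("A", False), ("A", True),
    ("B", True), ("B", False), ("B", False), ("B", True), ("B", False),
]

def approval_rate(group):
    outcomes = [ok for g, ok in decisions if g == group]
    return sum(outcomes) / len(outcomes)

# Adverse impact ratio: 0.4 / 0.8 = 0.5
ratio = approval_rate("B") / approval_rate("A")
print(f"adverse impact ratio: {ratio:.2f}")
if ratio < 0.8:  # four-fifths convention, borrowed from EEOC guidance
    print("approval rates differ enough to warrant an LDA search")
```

A ratio this far below parity is the kind of result that, under the disparate impact standard, would trigger the obligation to look for a less discriminatory alternative.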

“The risk of harm to consumers due to prolonged resistance to LDA searches calls for moving beyond the status quo,” reads a June 26 letter to CFPB Director Rohit Chopra from Consumer Reports and the Consumer Federation of America. “As long as financial institutions do not see irrefutable evidence of expectations to perform statistically effective reviews, many will fail to do so.”

For several months, the Consumer Federation of America has been urging the CFPB to codify public verbal statements in official written guidance explicitly laying out the expectation that entities do their due diligence in searching for and implementing LDAs, particularly for algorithmic systems that are considered “black boxes” because how they function is so inscrutable.

The issue of disparate impact has also been at the center of the debate over federal privacy legislation set for markup, as civil rights groups stress the importance of transparency for algorithmic accountability while artificial intelligence incentivizes greater collection of individuals’ personal data.

The consumer advocates’ letter, which follows an encouraging Supreme Court decision upholding the constitutionality of the agency’s funding, cites research on AI technology in making a detailed case for the CFPB to provide guidance on “legitimate questions regarding how best to test for and implement LDAs.”

The letter notes that “due to the unique capacity of ML models to improve through rapid iteration, it is now much less burdensome or time-intensive to search for and implement LDAs.” It describes a phenomenon researchers have identified as “model multiplicity” which “shows that there are multiple possible models that are equally effective at a given task.”

“As a result, when an algorithmic system displays a disparate impact, model multiplicity suggests that other models exist that perform equally well but have less discriminatory effects. In other words, in almost all cases, a less discriminatory algorithm exists,” the groups wrote. “Model multiplicity implies that there no longer needs to be a significant tradeoff between performance and fairness.”
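The model multiplicity argument can be sketched in code: many models reach nearly the same accuracy on a task, so an LDA search looks within that near-equivalent set for the one with the smallest disparity. The dataset, scoring rules, and tolerance below are entirely hypothetical, chosen only to illustrate the search pattern the groups describe:

```python
import random

random.seed(0)

# Synthetic applicants: (income, debt ratio, group, repaid). Purely
# illustrative -- not drawn from any real lending data.
def make_applicant():
    group = random.choice(["A", "B"])
    income = random.gauss(60 if group == "A" else 55, 10)
    debt = random.gauss(0.3, 0.1)
    repaid = income - 80 * debt + random.gauss(0, 5) > 30
    return income, debt, group, repaid

data = [make_applicant() for _ in range(2000)]

def evaluate(w_debt, threshold):
    """Accuracy and per-group approval rates for one linear scoring rule."""
    correct, approved = 0, {"A": [0, 0], "B": [0, 0]}
    for income, debt, group, repaid in data:
        approve = income - w_debt * debt > threshold
        correct += approve == repaid
        approved[group][0] += approve
        approved[group][1] += 1
    rates = {g: n / total for g, (n, total) in approved.items()}
    return correct / len(data), rates

# "Model multiplicity": a grid of candidate rules, many of which score
# nearly equally well but produce different approval-rate gaps.
candidates = []
for w_debt in [40, 60, 80, 100]:
    for threshold in [25, 30, 35]:
        acc, rates = evaluate(w_debt, threshold)
        candidates.append((acc, abs(rates["A"] - rates["B"]), w_debt, threshold))

best_acc = max(c[0] for c in candidates)
# LDA search: among models within one accuracy point of the best,
# prefer the one with the smallest approval-rate gap.
viable = [c for c in candidates if c[0] >= best_acc - 0.01]
lda = min(viable, key=lambda c: c[1])
print(f"best accuracy {best_acc:.3f}; chosen model disparity {lda[1]:.3f}")
```

The key design choice mirrors the groups’ claim: rather than accepting whichever model scores highest, the search fixes an accuracy tolerance first and then minimizes disparity inside it, so fairness improves without a meaningful performance tradeoff.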

Groups like the National Fair Housing Alliance have also been working to correct what they say is a misconception pitting accuracy and performance against fairness, presenting a novel LDA technique in April.

“Particularly in the context of [machine learning] models, it is no longer a question of whether or not a less discriminatory alternative that meets legitimate business needs can be found – this threshold question has already been answered in the affirmative,” the consumer advocates wrote. “Instead, the question now turns to how companies should go about finding and implementing LDAs.”

The consumer groups’ letter quotes the Equal Employment Opportunity Commission in noting “‘one advantage of algorithmic decision-making tools is that the process of developing the tool may itself produce a variety of comparably effective alternative algorithms. Failure to adopt a less discriminatory alternative that was considered during the development process may give rise to liability.’”

Noting that the CFPB need not undertake a controversial rulemaking, the letter explores potential reasons for the agency’s lack of action on LDA guidance, which the groups say could take the form of an advisory opinion, a research paper, a circular or “supervisory highlights.”

“We … reject the idea that guidance and enforcement are mutually exclusive,” the letter reads. “To be certain, providing guidance is not a substitute for using enforcement authority to punish bad actors, but it is an important tool to provide the certainty needed for more well-intentioned actors to move forward during this time of rapid change and uncertainty. We believe clarity and corrective action are complementary.”

The groups added that clarifying guidance would also help to blunt any legal claim following a potential enforcement action.