Inside AI Policy

October 31, 2025

AI Daily News

Consumer group calls for ‘strict testing’ to follow rule on AI-enabled home valuations

By Mariam Baksh / September 3, 2024

Finance-sector agencies were right to include an anti-discrimination standard in a new rule governing artificial intelligence systems that lending institutions use to assess a dwelling's worth as collateral, but the rule's success will ultimately depend on whether stakeholders can establish a sufficiently stringent testing approach, according to a leading consumer advocate.

“By including a fifth quality control standard that requires originators and secondary market issuers to ensure their [Automated Valuation Models] comply with anti-discrimination laws, the rule makes certain that regulators will have the tools to prohibit these systems from becoming a new and perhaps opaque force for more bias,” Adam Rust, director of financial services at the Consumer Federation of America, told Inside AI Policy.

“Going forward, it will be important to develop strict testing procedures. Unless they are held accountable, modelers will design black boxes that become brick walls,” he said.

Rust was commenting on a rule finalized Aug. 7 by the Office of the Comptroller of the Currency, the Federal Reserve System, the Federal Deposit Insurance Corporation, the National Credit Union Administration, the Federal Housing Finance Agency and the Consumer Financial Protection Bureau.

In a broader public comment proceeding the Treasury Department recently held on the use of AI, the American Bankers Association praised the six-agency rule’s flexibility.

But according to the agencies, some commenters on their rule vigorously opposed its incorporation of the anti-discrimination standard, and there was significant disagreement over whether standards exist to test for compliance with the new rule, with some commenters suggesting the responsibility should rest with the providers of the AVMs.

Set to take effect Oct. 1, 2025, the rule builds on quality control standards for AVM use that already included: designing systems that ensure a high level of confidence, protecting against data manipulation, seeking to avoid conflicts of interest, and requiring random testing and reviews. It added a fifth standard, "comply[ing] with applicable nondiscrimination laws," as authorized by section 1125 of the Financial Institutions Reform, Recovery, and Enforcement Act of 1989 as amended by the Dodd-Frank Act, the agencies said.

“Section 1125 provides the agencies with the authority to ‘account for any other such factor’ that the agencies ‘determine to be appropriate,’” the final rule reads. “Based on this authority, the agencies proposed to include [the] fifth quality control factor.”

According to the agencies, “Many commenters stated that discrimination is an issue in valuations, including in AVMs, and that specifying a nondiscrimination factor would be useful for reinforcing the applicability of nondiscrimination laws to AVMs. Several commenters asserted that AVMs risk reproducing bias and perpetuating discrimination if they are not adequately examined and tested. These commenters stated that the information used to develop and train AVMs is often drawn from existing data sets that may reflect human biases and historical prejudices.”

“In contrast,” they said, “several commenters opposed including the fifth factor. Commenters expressed various concerns, including that the factor would impose a significant compliance burden, lender systems are not able to assess whether an AVM discriminates, the factor is not required by statute, and the addition of the factor is unnecessary and duplicates existing law and the other quality control factors.”

They added: “Two commenters suggested that documented instances of bias in AVMs are not prevalent, and one of these commenters stated that it would be a mistake to attempt to eradicate through regulation the speculative possibility of bias in AVMs, which could reduce AVM use, when the use of this technology can remove the type of subjective, personal bias that traditional appraisals bring to the valuation process.”

“In addition, some commenters stated that the agencies should use other tools to address AVM bias concerns and the onus should be on AVM vendors to ensure models comply with nondiscrimination laws,” the agencies said.

According to the agencies, “A number of commenters recommended that the agencies work with the private sector to develop a standard setting organization (SSO) for AVMs and an independent third-party entity responsible for testing AVMs for compliance with the proposed quality control standards.”

“The agencies recognize that one or more SSOs and third-party AVM testing entities could be beneficial to effective compliance with the AVM rule,” the regulators wrote. “As long as financial institutions meet the obligations provided in the final rule, they are free to work with third parties to assist them with their compliance obligations.”

Ultimately, the agencies concluded, "institutions should be able to work with AVM providers to assist them with their compliance obligations under the rule." But they left much up to the industry, which -- along with consumer advocates -- continues to call for more guidance from agencies like the CFPB.