
CFPB details legal requirements for lenders’ use of A.I. when making credit decisions

Chopra: "Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence"

The Consumer Financial Protection Bureau (CFPB) on Tuesday announced a series of legal requirements lenders must adhere to when using artificial intelligence (AI) or “other complex models” when making decisions about the creditworthiness of borrowers.

“The guidance describes how lenders must use specific and accurate reasons when taking adverse actions against consumers,” the CFPB said in the announcement. “This means that creditors cannot simply use CFPB sample adverse action forms and checklists if they do not reflect the actual reason for the denial of credit or a change of credit conditions.”

Requirements like these become “especially important” as the use of advanced algorithms and personal consumer data in consumer credit underwriting accelerates, the Bureau said. Explaining why adverse actions are taken also improves consumers’ future chances of obtaining credit while protecting against illegal discrimination.

“Technology marketed as artificial intelligence is expanding the data used for lending decisions, and also growing the list of potential reasons for why credit is denied,” said CFPB Director Rohit Chopra in a statement. “Creditors must be able to specifically explain their reasons for denial. There is no special exemption for artificial intelligence.”

In the consumer credit marketplace, the use of advanced algorithms, often marketed as “artificial intelligence,” is growing, and these predictive decision-making technologies are increasingly intertwined with underwriting models, the Bureau said.

“Creditors often feed these complex algorithms with large datasets, sometimes including data that may be harvested from consumer surveillance,” the announcement explained. “As a result, a consumer may be denied credit for reasons they may not consider particularly relevant to their finances.”

Certain creditors may also “inappropriately rely on a checklist of reasons provided in CFPB sample forms,” but the Equal Credit Opportunity Act (ECOA) does not permit creditors to “simply conduct check-the-box exercises when delivering notices of adverse action if doing so fails to accurately inform consumers why adverse actions were taken,” the Bureau said.

A circular published in 2022 detailed that ECOA requires creditors to “explain the specific reasons for taking adverse actions,” a requirement that remains in force even if such companies “use complex algorithms and black-box credit models that make it difficult to identify those reasons.”

The guidance handed down on Tuesday expands on the perspective shared in that 2022 circular by explaining that “sample adverse action checklists should not be considered exhaustive, nor do they automatically cover a creditor’s legal requirements.”

Reasons for “adverse actions” such as credit denials must be specific and avoid the generalities that may arise from the sample language previously provided by the CFPB. The prohibition on withholding detail about a particular decision and instead relying on a “broad bucket” of reasons applies to advanced algorithms just as it does to any other decision-making method.
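To make that distinction concrete, the sketch below is a purely illustrative example, not drawn from the CFPB guidance or any lender’s actual system, of how a model-based denial could be tied to specific, applicant-level reasons rather than a generic checklist. The feature names, synthetic data, and simple linear attribution are all assumptions for demonstration.

```python
# Illustrative sketch only: mapping a model-based credit decision to specific
# reason statements. Feature names, thresholds, and the attribution method are
# assumptions for demonstration, not the CFPB's guidance or any lender's system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical application features (columns) for a synthetic training set.
FEATURES = ["debt_to_income", "credit_utilization", "months_since_delinquency",
            "income", "account_age_months"]

X = rng.normal(size=(500, len(FEATURES)))
# Synthetic approval labels (1 = approved) loosely tied to the first two features.
y = (X[:, 0] * -1.2 + X[:, 1] * -0.8 + rng.normal(scale=0.5, size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)

def adverse_action_reasons(applicant: np.ndarray, top_n: int = 2) -> list[str]:
    """Return the features pushing this applicant's score furthest toward denial.

    Contribution here is coefficient * (value - training mean), a simple linear
    attribution; a real system would need a method appropriate to its model.
    """
    contributions = model.coef_[0] * (applicant - X.mean(axis=0))
    worst = np.argsort(contributions)[:top_n]  # most negative contributions first
    return [f"{FEATURES[i]} (contribution {contributions[i]:+.2f})" for i in worst]

# A hypothetical applicant with high debt-to-income and credit utilization.
applicant = np.array([2.0, 1.5, 0.0, -0.5, 0.0])
if model.predict_proba(applicant.reshape(1, -1))[0, 1] < 0.5:
    print("Denied. Specific reasons:", adverse_action_reasons(applicant))
else:
    print("Approved.")
```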

“Creditors must disclose the specific reasons, even if consumers may be surprised, upset, or angered to learn their credit applications were being graded on data that may not intuitively relate to their finances,” the CFPB said.
