
Zillow’s new AI tool aims to promote equality in housing

The open-source tool, which is available for free, addresses bias in large language models

Zillow announced the release of an open-source tool, the Fair Housing Classifier, on Tuesday as part of the company’s efforts to “promote responsible and unbiased behavior in real estate conversations powered by large language model (LLM) technology.”

In a news release, Zillow explained that artificial intelligence (AI) tools often fail to account for the myriad requirements of fair housing laws. These tools, when deployed, “can perpetuate bias and undermine the progress achieved in advocating for fair housing.”

The Fair Housing Classifier is designed to act as a protective measure against steering, or the act of influencing a person’s choice of home based upon protected characteristics. The Fair Housing Act of 1968 — which was expanded in 1974 and 1988 — prohibits discrimination in housing based on race, color, religion, gender, disability, familial status or national origin.

“Since 2006, Zillow has used AI to bring transparency to home shoppers, powering tools like the Zestimate,” Josh Weisberg, Zillow’s senior vice president of artificial intelligence, said in a statement. “We’ve made it our business to increase transparency in real estate — open sourcing this classifier demonstrates that advancements in technology do not need to come at the expense of equity and fairness for consumers.

“We’re offering free and easy access so that others in civil rights, tech and real estate sectors can use it, collaborate and help improve it.”

In a recent survey of more than 12,000 Americans, Zillow found that 57% reported experiencing some type of housing discrimination in their lifetime. That share rose to 79% among LGBTQ+ respondents, 69% among Black respondents, and 64% among Hispanic and Latino respondents.

The Fair Housing Classifier is equipped to detect questions “that could lead to discriminatory responses about legally protected groups in real estate experiences, such as search or chatbots,” Zillow explained. Given either a question or an answer, the technology can identify instances of noncompliance with equal housing laws, allowing system developers to intervene in those cases.
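Zillow’s release does not prescribe a single integration pattern, but a guardrail of this kind typically sits between the user and the LLM: the classifier screens the incoming question (or the model’s draft answer), and the developer intervenes before a potentially noncompliant response is shown. The sketch below illustrates that flow only; `is_fair_housing_violation` and `generate_llm_answer` are hypothetical placeholders, not the classifier’s actual API.

```python
# Minimal sketch of a fair-housing guardrail around an LLM chatbot.
# Both functions below are illustrative stand-ins: a real integration would
# load Zillow's open-source classifier and call an actual LLM.

NON_COMPLIANT_MESSAGE = (
    "I can't help with that request. I can share details about the home, "
    "nearby amenities, schools and market data instead."
)


def is_fair_housing_violation(text: str) -> bool:
    """Placeholder classifier: flag text that could steer a home shopper
    based on protected characteristics."""
    protected_terms = (
        "race", "religion", "national origin", "disability",
        "familial status", "ethnicity",
    )
    return any(term in text.lower() for term in protected_terms)


def generate_llm_answer(question: str) -> str:
    """Placeholder for the underlying LLM call."""
    return f"(model response to: {question})"


def answer_question(question: str) -> str:
    # Screen the incoming question first.
    if is_fair_housing_violation(question):
        return NON_COMPLIANT_MESSAGE
    answer = generate_llm_answer(question)
    # Screen the draft answer too, since the classifier accepts either side.
    if is_fair_housing_violation(answer):
        return NON_COMPLIANT_MESSAGE
    return answer


if __name__ == "__main__":
    print(answer_question("What are homes like in this ZIP code?"))
    print(answer_question("Which neighborhoods have the fewest people of a certain race?"))
```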

“In today’s rapidly evolving AI landscape, promoting safe, secure and trustworthy AI practices in housing and lending is becoming increasingly important to protect consumers against algorithmic harms,“ Michael Akinwumi, chief responsible AI officer for the National Fair Housing Alliance, said in a statement.

“Zillow’s open-source approach sets an admirable precedent for responsible innovation. We encourage other organizations and coalition groups to actively participate, test, and enhance the model and share their findings with the public.”

Companies and individuals that want to use the Fair Housing Classifier can access its code and comprehensive framework on GitHub. Anyone who wants to provide feedback or help improve the tool can reach out via the email alias listed on the GitHub page.
