Zillow announced the release of an open-source tool, the Fair Housing Classifier, on Tuesday as part of the company's efforts to "promote responsible and unbiased behavior in real estate conversations powered by large language model (LLM) technology."
In a news release, Zillow explained that artificial intelligence (AI) tools often fail to account for the myriad requirements of fair housing laws. These tools, when deployed, "can perpetuate bias and undermine the progress achieved in advocating for fair housing."
The Fair Housing Classifier is designed to act as a protective measure against steering, the practice of influencing a person's choice of home based on protected characteristics. The Fair Housing Act of 1968, which was expanded in 1974 and 1988, prohibits discrimination in housing based on race, color, religion, sex, disability, familial status or national origin.
"Since 2006, Zillow has used AI to bring transparency to home shoppers, powering tools like the Zestimate," Josh Weisberg, Zillow's senior vice president of artificial intelligence, said in a statement. "We've made it our business to increase transparency in real estate — open sourcing this classifier demonstrates that advancements in technology do not need to come at the expense of equity and fairness for consumers.
"We're offering free and easy access so that others in civil rights, tech and real estate sectors can use it, collaborate and help improve it."
In a recent survey of more than 12,000 Americans, Zillow found that 57% reported experiencing some form of housing discrimination in their lifetime. That share rose to 79% among LGBTQ+ respondents, 69% among Black respondents, and 64% among Hispanic and Latino respondents.
The Fair Housing Classifier is designed to detect questions "that could lead to discriminatory responses about legally protected groups in real estate experiences, such as search or chatbots," Zillow explained. Given a question or an answer, the technology can flag potential noncompliance with equal housing laws, allowing system developers to intervene in those cases.
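The article does not detail how that intervention works in code, but the general pattern it describes, screening both the user's question and the model's answer before they are shown, can be sketched roughly as below. This is an illustrative Python sketch only: the check_compliance interface and ComplianceResult type are hypothetical placeholders, not the actual API published in Zillow's repository.

    # Hypothetical sketch of a fair-housing guardrail around an LLM chatbot.
    # The classifier call shown here is a stand-in, not Zillow's real API.
    from dataclasses import dataclass

    @dataclass
    class ComplianceResult:
        compliant: bool
        reason: str = ""

    def check_compliance(text: str) -> ComplianceResult:
        """Placeholder for a fair-housing classifier call.

        A real integration would load the open-source model and score the
        text; here a single example phrase is flagged just to illustrate
        the control flow.
        """
        if "family-friendly neighborhood" in text.lower():
            return ComplianceResult(False, "possible familial-status steering")
        return ComplianceResult(True)

    def answer_user(question: str, generate_reply) -> str:
        # Screen the incoming question before it reaches the LLM.
        verdict = check_compliance(question)
        if not verdict.compliant:
            return ("I can't help with that request. Housing guidance can't "
                    "be based on protected characteristics.")

        reply = generate_reply(question)

        # Screen the model's answer before it reaches the user.
        verdict = check_compliance(reply)
        if not verdict.compliant:
            return ("Let me answer that in a way that complies with fair "
                    "housing rules.")
        return reply

In this sketch, the classifier acts as a filter on both sides of the conversation, which mirrors the role Zillow describes for the tool in search and chatbot experiences.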
"In today's rapidly evolving AI landscape, promoting safe, secure and trustworthy AI practices in housing and lending is becoming increasingly important to protect consumers against algorithmic harms," Michael Akinwumi, chief responsible AI officer for the National Fair Housing Alliance, said in a statement.
“Zillow’s open-source approach sets an admirable precedent for responsible innovation. We encourage other organizations and coalition groups to actively participate, test, and enhance the model and share their findings with the public.”
Companies and individuals that want to use the Fair Housing Classifier can access its code and accompanying framework on GitHub. Anyone who wants to provide feedback or help improve the tool can reach out through the email alias listed on the GitHub page.