Inventory
info icon
Single family homes on the market. Updated weekly.Powered by Altos Research
667,466-14,684
30-yr Fixed Rate30-yr Fixed
info icon
30-Yr. Fixed Conforming. Updated hourly during market hours.
6.91%0.02
AgentContributorsReal Estate

Responsible AI in Real Estate: Will Artificial Intelligence (like AI Chatbots) Get Me “Canceled”, Blocked, Fined or Jailed?

Not if we remember these 9 considerations

With each year, technology helps us to work smarter in some ways but not so smart in other ways. One critical area as generative AI becomes increasingly our personal assistant is to not outsource the upholding of fair housing laws. 

Why?

If you did not know, generative AI already has instances of contributing to and exacerbating unfairness like here and here. Significant and lofty penalties have not yet started being doled out so now is as good a time as ever to course-set or even course-correct your team. 

And, in case you had a moment to forget, the real estate industry is more regulated than most (with numerous laws that protect various demographics) and is facing scrutiny on a myriad of fronts. As a friendly reminder, depending on where you are in the U.S., protected classes may include:

  • Race
  • Color
  • Sex 
  • Familial status 
  • National origin 
  • Disability (this has evolved to “a person that uses an assistive device”)
  • Religion 
  • Age 
  • Ancestry 
  • Sexual orientation 
  • Gender identity 
  • Marital status 
  • Military status 
  • Domestic violence victims 
  • Source of income 
  • Genetic information 
  • Pregnancy 
  • HIV/AIDS 
  • Criminal record history
  • And others

In today’s litigious climate, this is the opportune time to wonder, “Will artificial intelligence (like AI chatbots) get me ‘canceled’, blocked, fined or jailed?” 

Not if we remember these 9 considerations for responsible AI in real estate:

  1. How does this app/tool integrate fair housing (which includes fair lending) laws at the federal, state, and local levels? Fair Housing DECODER Tip: I’ve noticed that some of the most popular chatbots and other generative AI include the federal “big seven” (race, color, sex, familial status, national origin, disability, religion) but not fair housing laws at the state or local levels.
  2. How often does this app/tool update to include policy changes? Fair Housing DECODER Tip: Developers should account for legal changes at least monthly as there have been numerous new and updated fair housing laws and case laws within just the last twelve months across the U.S. 
  3. Did the developer consult and do paired testing (e.g. think of mystery shoppers of various protected classes) with a local, regional or national fair housing agency?
  4. How does this app/tool target people (such as a “marketing avatar”)? Fair Housing DECODER Tip: B-schools teach us to have a “customer avatar”, which is basically a brand’s ideal client to target. But, fair housing (and again this includes fair lending) means our ideal client cannot exclude protected classes. The key word here is, “exclude”. Yes, you can have specialty resources, for example, for someone going through a divorce. Yet, we are never excluding (turning away) those who are not.
  5. Are the “targets” based on any fair housing protected class (whether federally, locally or through trade organizations)? Fair Housing DECODER Tip: Use tools that allow you to not focus on the features of people but rather on the features of properties (“a home great for a family of 5” versus “home with five spacious bedrooms to use any way you want”).
  6. How does this app/tool treat various neighborhoods/zip codes? Fair Housing DECODER Tip: Modern-day redlining cases (c.f. one example) show companies not providing the same services to neighboring areas. This is a no-no!
  7. Does it “steer” people with one set of demographics to zip codes that it does not steer others? Fair Housing DECODER Tip: Even if the developer has not done paired testing, your team can do paired testing! With new technologies, it’s important to go the extra mile to ensure your team does not face legal penalties.
  8. How does this app/tool segment into niches? 
  9. Are the niches based on protected classes? Fair Housing DECODER Tip: There are “riches in niches,” but also “faces catch cases.” Niche down as long as they are not based on protected demographics.

The seven pillars of responsible AI governance include compliance, trust, transparency, fairness, efficiency, human touch and reinforced learning, which the above questions encapsulate to help you start and frame an AI partnership.  In a litigious industry, if developers are not willing to be transparent about any of these areas (starting with the eight questions above), it may be worth your sanity to not be an early adopter of a particular platform.

This column does not necessarily reflect the opinion of HousingWire’s editorial department and its owners.

To contact the editor responsible for this piece: [email protected]

Leave a Reply

Your email address will not be published. Required fields are marked *

Most Popular Articles

Latest Articles

loanDepot’s Frank Martell on building lifelong consumer relationships through technology 

In this week’s episode of the Power House podcast, HousingWire President Diego Sanchez sits down for a tantalizing conversation with Frank Martell, the president and CEO of loanDepot, to discuss the company’s profitability in the third quarter of 2024 and its Project North Star growth plan for 2025.

3d rendering of a row of luxury townhouses along a street

Log In

Forgot Password?

Don't have an account? Please