Fact or fantasy? An artificial intelligence (AI) algorithm can do a better job of picking tenants than you can. A recent lawsuit shows that AI screening tools need a lot of improvement. Blindly following their suggestions may not only cause you to miss out on a perfectly good tenant; it could also get you sued. These tools are a black box: you don’t know how they work, and the companies that own them won’t tell you.

The promise of an AI screening tool is that it will do all the work and thinking for you, freeing you up to do other things. All you need to do is pay for the service, and you’ll get tenants who are more likely to pay their rent on time and pose fewer problems for you. What property owner wouldn’t want that?

What are the Upsides of Using AI to Screen Tenants?

DotCom magazine lists many of the supposed benefits of using AI to screen tenants, including the following:

  • A streamlined screening process. These tools are supposed to process large amounts of data effectively and efficiently in an automated workflow
  • By drawing on many types of data (including public records and social media activity), these tools claim to predict tenant behavior better than you can
  • AI screening tools supposedly reduce the chance of explicit or unconscious bias against potential tenants. It’s a standardized screening process with objective criteria, so a person’s color, race, religion, and other legally protected characteristics aren’t taken into account
  • AI’s evaluation of factors like financial stability, behavior trends, and criminal and eviction histories could, in theory, warn you of high-risk tenants
  • These systems supposedly use the most current information, like credit scores, to update their assessments
  • AI might be able to verify documents potential tenants submit
  • Landlords could personalize the process by setting their own criteria

Using AI for complex tasks is like using a vehicle’s self-driving system. It may work great, or it may drive you into a ditch.

Lawsuit Shows Limitations of AI Tenant Screening

Mary Louis, a Massachusetts security guard, was turned down for a rental apartment in 2021 because of an AI-powered tenant screening tool, SafeRent, according to the Guardian. The report it sent her didn’t explain how its suitability score was determined or what it signified. It simply stated her score (443) and informed her it was too low to be recommended. The software’s recommendation to the prospective landlord was to decline her application.

When she toured the apartment, someone from the management company told her she shouldn’t have a problem with her application. Louis wasn’t the perfect applicant. Her credit score was low, and she had some credit card debt. She admits to not paying all her debts on time, but her landlord of 17 years wrote her a letter stating she was never late paying rent. Louis also uses a government housing voucher, which partially pays her rent whether her payment is on time or not. Her son is named on the voucher and would pay the rent if his mother did not.

Louis later got a more expensive apartment and joined more than 400 Black and Hispanic Massachusetts tenants who use housing vouchers in suing SafeRent. They alleged their rental applications were rejected because of their low scores. The class action alleged the following:

  • The algorithm disproportionately scored Hispanic and Black tenants using housing vouchers lower than similarly situated white applicants
  • The software weighed irrelevant information, such as credit scores and debt unrelated to housing
  • It didn’t factor in their housing vouchers

Studies show that Black and Hispanic prospective tenants are more likely to use housing vouchers and have lower credit scores than white applicants.

Illegal discrimination need not be shown through direct evidence of biased language or disparate treatment of prospective tenants. It can also be established through disparate impact, which occurs when facially neutral criteria produce a biased outcome. The plaintiff need not show an intent to discriminate to win the case.

SafeRent settled the case last November. It paid the plaintiffs $2.3 million and agreed not to use a scoring system or make recommendations for prospective tenants using housing vouchers for five years. If it wants to use another scoring system in the future, that system must be validated by an independent fair housing organization.

What the SafeRent Lawsuit Means for Landlords

Even if you use an AI tool to screen tenants, the decision to accept or reject an applicant is still yours. If the tool is as flawed as SafeRent’s, relying on it potentially puts you at risk of being sued. Using a tool you don’t fully understand to do something critical to your business, and that entails legal risk, may not be the best choice.

The SafeRent case shows that these programs, despite the hype, may do the following:

  • Streamline your selection process but cause you to spend time, energy, and resources defending a lawsuit
  • Not predict a prospective tenant’s behavior better than you can
  • Not reduce the risk of biased outcomes
  • Give false warnings of people supposedly at risk for causing problems

Before you use AI to help your business, evaluate its potential costs and benefits and look beyond the sales pitch.

We’re Here to Help

If you have questions about how to avoid legal claims when choosing a tenant, call the AWB Law PC team at (949) 244-4207 or fill out our online contact form today. We can discuss your situation, how California laws may apply, and how we can help.
