Two years ago, Mary Louis applied to rent an apartment in Granada Highlands in Malden, Massachusetts. She liked that the unit had two full bathrooms and that there was a pool on the property. But the landlord refused her the apartment, allegedly because of a score assigned to her by a tenant screening algorithm created by SafeRent.
Louis responded with references attesting to 16 years of punctual rent payments, to no avail. Instead, she took another apartment that cost $200 more per month in a higher-crime area. A class action filed last May by Louis and others argues that SafeRent scores, based in part on information in a credit report, amount to discrimination against Black and Hispanic renters in violation of the Fair Housing Act. That groundbreaking legislation, which prohibits housing discrimination based on race, disability, religion, or national origin, was passed by Congress in 1968, a week after the assassination of Martin Luther King Jr.
That case is still pending, but the U.S. Justice Department last week used a filing with the court to send a warning to landlords and the makers of tenant-screening algorithms. SafeRent had argued that algorithms used to screen tenants are not covered by the Fair Housing Act, because the scores only advise landlords and do not make decisions. The DOJ's filing, submitted jointly with the Department of Housing and Urban Development, rejects that claim, saying the law and associated case law leave no ambiguity.
“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved of liability when their practices disproportionately deny people of color access to fair housing opportunities,” Kristen Clarke, head of the Justice Department’s Civil Rights Division, said in a statement.
As in many areas of business and government, algorithms that assign scores to people have become more common in the housing industry. But while they are marketed as improving efficiency or identifying “better tenants,” as SafeRent materials suggest, tenant screening algorithms could be contributing to historically persistent housing discrimination, despite decades of civil rights legislation. A 2021 study by the U.S. National Bureau of Economic Research, which used bots with names associated with different racial groups to inquire about rentals with more than 8,000 landlords, found significant discrimination against tenants of color, and against African Americans in particular.
“It’s a relief that this is being taken seriously — there’s an understanding that algorithms aren’t inherently neutral or objective and deserve the same level of scrutiny as human decision makers do,” said Michele Gilman, a law professor at the University of Baltimore and former civil attorney at the Justice Department. “Just the fact that the DOJ is working on this is a big step to me.”
A 2020 investigation by The Markup and ProPublica found that tenant screening algorithms are prone to errors such as mistaken identity, especially for people of color with common last names. A ProPublica review last year of algorithms made by the Texas-based company RealPage suggested its software could drive up rents.