SAN FRANCISCO — Meta agreed on Tuesday to change its ad technology and pay a $115,054 fine in a settlement with the Justice Department over claims that the company’s ad systems discriminated against Facebook users by restricting who could see housing advertisements on the platform based on their race, gender and ZIP code.
Under the agreement, Meta, the company formerly known as Facebook, said it would change its technology and use a new computer-assisted method that aims to regularly check whether the people who are targeted and eligible to receive housing advertisements are, in fact, seeing those ads. Called a “variance reduction system,” the new method relies on machine learning to ensure that advertisers deliver housing-related ads to specific protected classes of people.
“Meta will — for the first time — change its ad serving system to address algorithmic discrimination,” Damian Williams, the U.S. attorney for the Southern District of New York, said in a statement. “But if Meta fails to demonstrate that it has modified its delivery system enough to guard against algorithmic bias, this office will proceed with the lawsuit.”
Facebook, which became a business behemoth by collecting its users’ data and letting advertisers target ads based on audience characteristics, has faced complaints for years that some of those practices are biased and discriminatory. The company’s ad systems let marketers choose who saw their ads from thousands of different characteristics, which also allowed those advertisers to exclude people who fall under a number of protected categories, such as race, gender and age.
The Justice Department filed both the lawsuit and the settlement against Meta on Tuesday. In its suit, the agency said it had concluded that “Facebook could achieve its interests in maximizing its revenue and providing relevant ads to users through less discriminatory means.”
While the settlement deals specifically with housing ads, Meta said it also plans to apply its new system to monitor the targeting of employment- and credit-related ads. The company previously faced backlash for allowing bias against women in job postings and excluding certain groups of people from seeing credit card ads.
The issue of biased ad targeting has been especially contentious in housing ads. In 2016, Facebook’s potential for ad discrimination was revealed in an investigation by ProPublica, which found that the company’s technology made it simple for marketers to exclude specific ethnic groups for advertising purposes.
In 2018, Ben Carson, the secretary of the Department of Housing and Urban Development, announced a formal complaint against Facebook, accusing the company of having ad systems that “unlawfully discriminated against” based on categories such as race, religion and disability. In 2019, HUD sued Facebook for housing discrimination and violating the Fair Housing Act. The agency said Facebook’s systems were not delivering ads to “a diverse audience,” even if an advertiser wanted the ad to be widely seen.
“Facebook discriminates against people based on who they are and where they live,” Mr. Carson said at the time. “Using a computer to limit a person’s housing choices can be just as discriminatory as slamming a door in someone’s face.”
The Justice Department’s lawsuit and settlement are based in part on HUD’s 2019 investigation and discrimination charge against Facebook.
In its own testing of the matter, the U.S. Attorney’s Office for the Southern District of New York found that Meta’s ad systems were directing housing ads away from certain categories of people, even when advertisers were not aiming to do so. According to the Justice Department complaint, the ads were “disproportionately directed to white users and away from Black users, and vice versa.”
Many housing ads in neighborhoods where most residents were white were also directed primarily to white users, while housing ads in areas that were largely Black were shown primarily to Black users, the complaint added. As a result, the complaint said, Facebook’s algorithms “actually and predictably reinforce or perpetuate segregated housing patterns because of race.”
In recent years, civil rights groups have also been pushing back against the vast and complex ad systems that underpin some of the largest internet platforms. The groups have argued that those systems have inherent biases, and that tech companies like Meta, Google and others should do more to reduce those biases.
The area of study, known as “algorithmic fairness,” has been a major topic of interest among computer scientists in the field of artificial intelligence. Leading researchers, including former Google scientists such as Timnit Gebru and Margaret Mitchell, have been sounding the alarm about such biases for years.
In the years since, Facebook has narrowed the categories that marketers can choose from when purchasing housing ads, cutting the number down to hundreds and eliminating options to target based on race, age and ZIP code.
Chancela Al-Mansour, executive director of the Housing Rights Center in Los Angeles, said it was “essential” that “fair housing laws are aggressively enforced.”
“Housing ads had become tools for unlawful behavior, including segregation and discrimination in housing, employment and credit,” she said. “Most users had no idea they were either being targeted for or denied housing ads based on their race and other characteristics.”
Meta’s new ad technology, which is still in development, will periodically check who is being served ads for housing, employment and credit, and make sure those audiences match the people marketers want to target. If the ads being served begin to skew heavily toward white men in their 20s, for example, the new system will in theory recognize this and shift the ads to be served more equitably among broader and more diverse audiences.
“We’re going to be occasionally taking a snapshot of marketers’ audiences, seeing who they target, and removing as much variance as possible from that audience,” Roy L. Austin, Meta’s vice president of civil rights and deputy general counsel, said in an interview. He called it “a significant technological advancement for how machine learning is used to deliver personalized ads.”
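Neither the settlement nor Meta’s statements spell out how the system works internally, but the basic idea Mr. Austin describes can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not Meta’s implementation: it compares the demographic makeup of the users who actually saw an ad against the makeup of the eligible audience, and flags campaigns whose delivery has drifted past a tolerance. The function names, the “age_bracket” attribute and the 10 percent threshold are all invented for the example.

```python
from collections import Counter

def share_by_group(users, attribute):
    """Fraction of the user list that falls into each demographic group."""
    counts = Counter(user[attribute] for user in users)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

def delivery_skew(eligible, served, attribute):
    """Largest gap between a group's share of the eligible audience
    and its share of the users who actually saw the ad."""
    expected = share_by_group(eligible, attribute)
    actual = share_by_group(served, attribute)
    groups = set(expected) | set(actual)
    return max(abs(expected.get(g, 0.0) - actual.get(g, 0.0)) for g in groups)

# Hypothetical tolerance; the real system's thresholds are not public.
MAX_SKEW = 0.10

def check_campaign(eligible, served, attribute="age_bracket"):
    """Flag a campaign whose delivery drifts away from its eligible audience."""
    skew = delivery_skew(eligible, served, attribute)
    if skew > MAX_SKEW:
        # A production system would rebalance delivery here,
        # not merely report the drift.
        return f"rebalance: skew {skew:.2f} exceeds {MAX_SKEW:.2f}"
    return f"ok: skew {skew:.2f} within tolerance"

# Toy example: the eligible audience is split evenly between two age groups,
# but the served audience skews heavily toward users in their 20s.
eligible = [{"age_bracket": "20s"}] * 50 + [{"age_bracket": "40s"}] * 50
served = [{"age_bracket": "20s"}] * 45 + [{"age_bracket": "40s"}] * 5
print(check_campaign(eligible, served))  # rebalance: skew 0.40 exceeds 0.10
```

The real system, by Meta’s description, uses machine learning to adjust ad delivery continuously rather than applying a simple threshold check like this one, but the snapshot-and-compare loop is the same in spirit.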
Meta said it would work with HUD in the coming months to incorporate the technology into its ad targeting systems, and agreed to a third-party audit of the new system’s effectiveness.
The company also said it would no longer use a feature called “Special Ad Audiences,” a tool it had developed to help advertisers expand the groups of people their ads would reach. The Justice Department said the tool also engaged in discriminatory practices. Meta said the tool was an early effort to fight bias and that its new methods would be more effective.
The $115,054 fine that Meta would pay in the settlement is the maximum available under the Fair Housing Act, the Justice Department said.
“The public should know that the latest abuse by Facebook was worth as much money as Meta made in about 20 seconds,” said Jason Kint, chief executive of Digital Content Next, an association for premium publishers.
As part of the settlement, Meta did not admit any wrongdoing.