NEW YORK, NY (January 9, 2023) — The United States Justice Department has reached a key milestone in its settlement agreement with Meta Platforms, Inc., formerly known as Facebook, Inc., requiring Meta to change its ad delivery system to prevent discriminatory advertising in violation of the Fair Housing Act (FHA). As required by the settlement entered on June 27, 2022, resolving a lawsuit filed in the U.S. District Court for the Southern District of New York, Meta has now built a new system to address algorithmic discrimination. Today, the parties informed the Court that they have reached agreement on the system’s compliance targets. This development ensures that Meta will be subject to court oversight and regular review of its compliance with the settlement through June 27, 2026.
The United States’ complaint alleged, among other things, that Meta uses algorithms in determining which Meta users receive ads, including housing ads, and that those algorithms rely, in part, on characteristics protected under the FHA. Specifically, the United States alleged that Meta feeds troves of user information into its ad delivery system, including information related to users’ FHA-protected characteristics such as sex and race, and uses that information in its personalization algorithms to predict which ad is most relevant to which user. As the complaint alleged, Meta’s delivery algorithms introduce bias when delivering ads, resulting in a variance along sex and estimated race/ethnicity between the set of users who are eligible to see housing ads based on the advertiser’s targeted audience and the set of users who actually see the ad.
Pursuant to the settlement, Meta has developed a new system — the Variance Reduction System (VRS) — to reduce the variances between the eligible audience and the actual audience. The United States has concluded that the new system will substantially reduce the variances between the eligible and actual audiences along sex and estimated race/ethnicity in the delivery of housing advertisements. The VRS will operate on all housing advertisements across Meta platforms, and the agreement requires Meta to meet certain compliance metrics in stages. For example, by December 31, 2023, for the vast majority of housing ads on Meta platforms, Meta will reduce variances to less than or equal to 10% for 91.7% of those ads for sex and less than or equal to 10% for 81.0% of those ads for estimated race/ethnicity. For more information on the operation of the VRS, read Meta’s technical paper.
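To make the "variance" concept concrete, the sketch below computes a simple gap between the demographic makeup of an ad's eligible audience and the audience that actually saw it. This is an illustration only: the audience figures are invented, and the precise variance metric used by the settlement is defined in Meta's technical paper, not here.

```python
# Hypothetical illustration of an audience "variance" check.
# The data and the simple share-difference formula are assumptions for
# explanation; the settlement's actual metric is defined in Meta's paper.

def demographic_shares(audience):
    """Fraction of the audience in each demographic group."""
    total = sum(audience.values())
    return {group: count / total for group, count in audience.items()}

def max_variance(eligible, actual):
    """Largest absolute share gap, across groups, between the eligible
    audience and the audience that actually saw the ad."""
    e, a = demographic_shares(eligible), demographic_shares(actual)
    return max(abs(e[g] - a[g]) for g in e)

# Invented example: eligible audience vs. actual viewers, broken out by sex.
eligible = {"female": 5000, "male": 5000}   # 50% / 50%
actual = {"female": 4200, "male": 5800}     # 42% / 58%

gap = max_variance(eligible, actual)
print(f"variance: {gap:.1%}")               # prints "variance: 8.0%"
print("within 10% target:", gap <= 0.10)    # prints "within 10% target: True"
```

On these invented numbers, the 8-percentage-point gap would fall within a 10% target; a delivery system like the VRS would adjust delivery when the gap grows too large.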
As further provided in the settlement agreement, the parties have selected an independent, third-party reviewer, Guidehouse, Inc., to investigate and verify on an ongoing basis whether the VRS is meeting the compliance metrics agreed to by the parties. Under the agreement, Meta must provide Guidehouse and the United States with regular compliance reports and make available any information necessary to verify compliance with the agreed-upon metrics. The court will have ultimate authority to resolve any disputes over the information that Meta must provide.
Finally, as also required by the settlement agreement, Meta has ceased delivering housing advertisements using the Special Ad Audience tool (which delivered ads to users who “look like” other users), and Meta will not provide any targeting options for housing advertisers that directly describe or relate to FHA-protected characteristics.
This agreement marks the first time that Meta will be subject to court oversight for its ad targeting and delivery system.
What they are saying:
U.S. Attorney Damian Williams: “This groundbreaking resolution sets a new standard for addressing discrimination through machine learning. We appreciate that Meta agreed to work with us toward a resolution of this matter and applaud Meta for taking the first steps towards addressing algorithmic bias. We hope that other companies will follow Meta’s lead in addressing discrimination in their advertising platforms. We will continue to use all of the tools at our disposal to address violations of the Fair Housing Act.”
Assistant Attorney General Kristen Clarke of the Justice Department’s Civil Rights Division: “This development marks a pivotal step in the Justice Department’s efforts to hold Meta accountable for unlawful algorithmic bias and discriminatory ad delivery on its platforms. The Justice Department will continue to hold Meta accountable by ensuring the Variance Reduction System addresses and eliminates discriminatory delivery of advertisements on its platforms. Federal monitoring of Meta should send a strong signal to other tech companies that they too will be held accountable for failing to address algorithmic discrimination that runs afoul of our civil rights laws.”
The case is being handled by the Office’s Civil Rights Unit in the Civil Division. Assistant U.S. Attorneys Ellen Blain, David J. Kennedy, and Christine S. Poscablo are in charge of the case.