As discussed in our prior post, the Department of Justice (DOJ) entered into a settlement with Meta Platforms Inc., formerly known as Facebook Inc. (Meta), to resolve allegations that Meta engaged in discriminatory advertising in violation of the Fair Housing Act (FHA). A key part of that settlement was Meta's agreement to build a new Variance Reduction System (VRS) to address disparities in race, ethnicity, and sex between housing advertisers' targeted audiences and the Facebook users to whom Meta's internal personalization algorithms actually deliver the ads.

The VRS is intended to address the following three issues with Facebook's ad delivery system raised in the initial DOJ complaint. First, the DOJ alleged that the system engaged in trait-based targeting by encouraging an advertiser to target ads by including or excluding Facebook users based on FHA-protected characteristics that Facebook, through its data collection and analysis, attributed to those users. Second, the DOJ alleged that the system used look-alike targeting, in which an advertiser used a machine-learning algorithm to find Facebook users who "look like" the advertiser's source audience (i.e., the advertiser's identified audience of Facebook users) based on FHA-protected characteristics. Third, the DOJ alleged that the system made delivery determinations, using machine-learning algorithms to help select which subset of an advertiser's target audience would receive the ad based in part on FHA-protected characteristics.

While Meta originally faced a December 31, 2022, deadline for implementing the VRS, Meta and the DOJ jointly sought, and the court approved, an extension to January 9, 2023, to finalize the VRS's compliance metrics. The compliance metrics are the parties' agreed-upon targets for how much the VRS will reduce variances related to race, ethnicity, and sex. Having agreed to these metrics, Meta is now bound to provide compliance reporting every four months confirming that it has met them. Meta is also required to retain an independent third-party reviewer to verify compliance and, importantly, must give that reviewer access to the information needed to do so. Although the civil penalty was $115,054, the maximum under the FHA, the cost of compliance going forward will likely be much higher. While the VRS initially will be used for housing advertisements, over the coming year Meta will extend its use to credit and employment advertisements.
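
To make concrete what a "variance" between an advertiser's targeted audience and the delivered audience might look like, the sketch below compares the demographic composition of a hypothetical eligible audience with that of the users who actually saw an ad. It is purely illustrative: the group labels, numbers, and calculation are our own assumptions and do not reflect Meta's VRS methodology or the settlement's actual compliance metrics.

```python
# Illustrative only: a toy calculation of demographic "variance" between an
# advertiser's eligible (targeted) audience and the audience that actually
# saw the ad. This is NOT Meta's VRS methodology or the settlement's metric;
# the group names and figures below are hypothetical.

from collections import Counter

def demographic_shares(audience):
    """Return each group's share of the audience (e.g., sex or estimated race/ethnicity)."""
    counts = Counter(audience)
    total = sum(counts.values())
    return {group: count / total for group, count in counts.items()}

def max_variance(eligible_audience, delivered_audience):
    """Largest absolute gap, across groups, between eligible and delivered shares."""
    eligible = demographic_shares(eligible_audience)
    delivered = demographic_shares(delivered_audience)
    groups = set(eligible) | set(delivered)
    return max(abs(eligible.get(g, 0.0) - delivered.get(g, 0.0)) for g in groups)

# Hypothetical example: the eligible audience is split 50/50, but delivery skews 65/35.
eligible = ["group_a"] * 50 + ["group_b"] * 50
delivered = ["group_a"] * 65 + ["group_b"] * 35
print(f"Variance: {max_variance(eligible, delivered):.0%}")  # -> Variance: 15%
```

In this toy example, an audience targeted 50/50 but delivered 65/35 shows a 15-percentage-point gap, which is the kind of disparity the agreed compliance metrics are designed to shrink.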

With the VRS compliance metrics finalized along with the other terms of the settlement, other parties now have guidance on how to avoid discriminatory ad targeting. The allowable variances set out in the parties' letter confirming finalization of the compliance metrics offer potential benchmarks going forward, not only in the FHA context but also for targeted advertising of products such as consumer credit, which are likewise covered by anti-discrimination statutes. While the agreement likely provides certainty for advertisers on Facebook going forward, it also signals that regulators are likely to scrutinize similar advertising platforms in the future.

In particular, in announcing the agreement on the VRS, U.S. Attorney Damian Williams stated that “[t]his groundbreaking resolution sets a new standard for addressing discrimination through machine learning. We appreciate that Meta agreed to work with us toward a resolution of this matter and applaud Meta for taking the first steps towards addressing algorithmic bias. We hope that other companies will follow Meta’s lead in addressing discrimination in their advertising platforms. We will continue to use all of the tools at our disposal to address violations of the Fair Housing Act.” The last two sentences appear to be a clear warning to similar advertising platforms.

More information on the VRS is available here.