According to a press release from the Department of Justice (DOJ), the US government and Facebook parent company Meta have reached an agreement to settle a lawsuit accusing the latter of enabling housing discrimination by allowing advertisers to request that their ads not be shown to members of particular protected groups. The full settlement agreement may be read below.
Although complaints about Meta's practices date back years, the government only recently filed a lawsuit against the company for algorithmic housing discrimination. The company had taken some steps to address the problem, but the feds evidently felt they weren't enough. According to the government, this is the first case it has brought over algorithmic violations of the Fair Housing Act.
The settlement, which still requires a judge's approval, stipulates that Meta will stop using a discriminatory algorithm for housing ads and build a system that will "correct racial and other inequities generated by its use of personalisation algorithms in its ad distribution system."
According to Meta, this new system will replace its Special Ad Audiences tool for housing, credit, and employment opportunities. The tool and its algorithms, according to the DOJ, let marketers target users who shared characteristics with a pre-selected group. The DOJ claims that Special Ad Audiences considered factors such as a user's estimated ethnicity, national origin, and sex when deciding who to advertise to, which could cherry-pick who saw housing ads, a violation of the Fair Housing Act. In the settlement, Meta does not admit wrongdoing, and the agreement includes no finding of guilt or liability.
Meta said in a statement on Tuesday that it intends to use machine learning to address the issue, building a system that "ensures the age, gender, and estimated race or ethnicity of a housing ad's overall audience matches the age, gender, and estimated race or ethnicity mix of the population eligible to see that ad." In other words, the system should make sure the people who actually see an ad reflect the people who were eligible to see it. To measure how far the actual audience drifts from the eligible one, Meta will look at age, gender, and race.
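As a purely illustrative sketch (Meta has not published its implementation; the function, group names, and numbers below are all hypothetical), a parity check of this kind can be framed as comparing each group's share of the users who actually saw an ad against that group's share of the eligible audience:

```python
# Hypothetical demographic-parity check: compare each group's share of
# ad viewers with its share of the eligible audience. All names and
# figures are illustrative, not Meta's actual system.

def max_share_gap(eligible_counts, shown_counts):
    """Return the largest absolute gap, across groups, between a group's
    share of the eligible audience and its share of actual ad viewers."""
    total_eligible = sum(eligible_counts.values())
    total_shown = sum(shown_counts.values())
    return max(
        abs(eligible_counts[g] / total_eligible
            - shown_counts.get(g, 0) / total_shown)
        for g in eligible_counts
    )

# Illustrative audience breakdowns by an estimated demographic attribute.
eligible = {"group_a": 5000, "group_b": 3000, "group_c": 2000}
shown = {"group_a": 700, "group_b": 200, "group_c": 100}

gap = max_share_gap(eligible, shown)
print(f"largest share gap: {gap:.2f}")
```

Here group_a makes up 50% of the eligible audience but 70% of actual viewers, a gap of 0.20; a delivery system of the kind Meta describes would nudge ad distribution until such gaps fall below some tolerance.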
The settlement requires the company to demonstrate to the government that the system works as intended and to incorporate it into its platform by the end of December 2022.
As it develops the new system, the company commits to updating customers on its progress. If the government approves the system and it goes into effect, a third party will "examine and verify on an ongoing basis" that it is actually serving ads in a fair and equitable manner.
Meta will also pay a $115,054 fine. While that may seem like nothing to a company making billions a month, the DOJ emphasizes that it is the maximum penalty that can be charged for a Fair Housing Act violation.