Facebook settles civil rights lawsuits over ad discrimination
Facebook has reached settlements in three civil rights cases and two complaints before the Equal Employment Opportunity Commission over ad discrimination on its platform. As part of the settlements, Facebook has promised to make significant changes to its ad tools to curb the ability of advertisers to target users based on their race, gender, age, disability, and other protected characteristics, reports motherjones.com.
Advertisers will no longer be able to exclude users from learning about opportunities for housing, employment, or credit based on gender, age, or other protected characteristics, reports aclu.org.
Facebook has become a peerless source of demographic data for advertisers. The information it gathers on its users allows advertisers to target ads for their products and services with an unprecedented level of precision. If you want to sell baseball gear in Washington, DC, for example, you can target middle-aged mothers with children who live in a wealthy ZIP code. But for years, civil rights groups have been warning Facebook—first in conversations, then in a series of lawsuits—that this hyper-targeting can run afoul of civil rights laws.
On Tuesday, Facebook announced settlements in five different cases over ad discrimination on the platform, reached with the National Fair Housing Alliance, the Communications Workers of America, regional fair housing organizations, and individual consumers and job seekers represented by the ACLU and two civil rights firms. Without admitting wrongdoing, Facebook agreed to major changes to its targeting tools for ads in industries protected by federal civil rights laws: housing, employment, and credit.
While it’s legal to market baseball caps to men, for example, federal civil rights laws prohibit targeting ads for housing, employment, and loans based on protected characteristics like race, gender, age, and disability. On October 31, 2016, ProPublica published a bombshell investigation showing that advertisers could target ads in protected industries by race. A job ad might be shown just to white men, for example. The following month, the first civil rights lawsuit against Facebook was filed in California, a class-action suit on behalf of the millions of people of color who were missing out on opportunities for housing, jobs, and credit due to Facebook’s ad tools.
In February 2017, Facebook announced that it had removed the ability to target groups in these three protected areas. But a subsequent ProPublica investigation in November 2017 showed that Facebook’s fix didn’t work: Reporters had successfully placed rental housing ads that excluded protected groups of people, including African Americans, Jews, Spanish speakers, and people interested in wheelchair ramps. This second story prompted another major suit by fair housing advocates in New York. The following year, the ACLU and the Communications Workers of America filed two complaints with the Equal Employment Opportunity Commission over gender and age discrimination in employment ads. On Tuesday, Facebook settled with all of them.
Most Facebook users were likely not even aware that this type of exclusionary ad targeting was happening. Some 30 years into the digitization of our daily lives, we’re still coming to grips with the fact that the vast trove of data we hand over with each and every ‘like,’ search, post, or click — often without our knowledge or consent — will be used to target advertisements to us.
This kind of data mining is ubiquitous on Facebook, which attracts advertisers by touting its targeting tool’s power to show users only the ads Facebook or advertisers think they’d be interested in, based on how individualized data describes them. But there’s a discriminatory flip side to this practice. Ad-targeting platforms can be used to exclude users on the basis of race, gender, or age as well as interests or groups that can serve as proxies for those categories.
‘There is a long history of discrimination in the areas of housing, employment, and credit, and this harmful behavior should not happen through Facebook ads,’ Facebook COO Sheryl Sandberg wrote in a blog post announcing the changes, adding, ‘We can do better.’
As more people turn to the internet to find jobs, apartments, and loans, there is a real risk that ad targeting will replicate and even exacerbate existing racial and gender biases in society. Imagine if an employer chooses to display ads for engineering jobs only to men — not only will users who aren’t identified as men never see those ads, they’ll also never know what they missed. After all, we seldom have a way to identify the ads we’re not seeing online. That this discrimination is invisible to the excluded user makes it all the more difficult to stop.
Whether you call it weblining, algorithmic discrimination, or automated inequality, it’s now clear that the rise of big data and the highly personalized marketing it enables has led to these new forms of discrimination. This targeting has undermined longstanding civil rights laws, including Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the Fair Housing Act, and the Equal Credit Opportunity Act, which prohibit discrimination on the basis of protected characteristics, such as race, gender, and age, in advertising housing, employment, and credit opportunities. It was only after the passage of Title VII in 1964 that job ads stopped specifying whether employers were seeking male or female applicants. It’s imperative that online platforms act to stop these archaic forms of discrimination from taking on new life in the 21st century.
In the first-of-its-kind settlement announced today, Facebook has agreed to create a separate place on its platform for advertisers to create ads for jobs, housing, and credit. Within the separate space, Facebook will eliminate age- and gender-based targeting as well as options for targeting associated with protected characteristics or groups. Targeting based on ZIP code or a geographic area that is less than a 15-mile radius will not be allowed. And Facebook will stop considering users’ age, gender, ZIP code, or membership in Facebook ‘groups’ when creating ‘Lookalike’ Audiences for advertisers.
Facebook will also require advertisers for employment, housing, and credit to certify compliance with anti-discrimination laws, and it will institute a system of automated and human review to ensure that such ads are properly identified and channeled into the separate flow. Additionally, under the agreement’s three-year monitoring period, we’ll be watching Facebook’s progress closely to ensure that it implements these changes fully.
The ACLU and partner civil rights groups have been advocating for changes like these for close to three years, and Facebook had already agreed to remove some targeting options that could serve as a proxy for race after investigative journalism exposed the practice. Per today’s settlement, Facebook will also remove targeting options based on gender, age, and other protected characteristics while committing itself to ensuring that advertisers using its targeting tools comply with the law.
Facebook’s lawyers initially moved to have the cases against it dismissed. In April 2018, when CEO Mark Zuckerberg testified before Congress, multiple lawmakers pressed him on the ProPublica stories and Facebook’s decision to fight the suits. Sen. Cory Booker (D-N.J.) pressed Zuckerberg to explain how, as Facebook’s court filings claimed, people who were excluded from job advertisements were not being harmed. Zuckerberg acknowledged at the time that protecting people from illegal ad discrimination practices was a work in progress.
Since that hearing, Facebook has agreed to a civil rights audit, a holistic review of possible civil rights issues on its platform. It has hired Laura Murphy, the former director of the ACLU’s Washington Legislative Office, to lead the project. Sandberg cited this audit in Facebook’s decision to settle the cases. ‘Civil rights leaders and experts—including members of the Congressional Black Caucus, the Congressional Hispanic Caucus, the Congressional Asian Pacific American Caucus, and Laura Murphy, the highly respected civil rights leader who is overseeing the Facebook civil rights audit—have also raised valid concerns about this issue,’ she wrote in her blog post. ‘We take those concerns seriously and, as part of our civil rights audit, engaged the noted civil rights law firm Relman, Dane & Colfax to review our ads tools and help us understand what more we could do to guard against misuse.’
But if Facebook has finally decided to take this issue seriously, promising to roll out a series of changes this year to dramatically reduce the ad targeting options in these protected categories, it did so under the threat of litigation. The Justice Department had sided with the fair housing advocates in the New York suit. If Facebook had not settled the cases in California and New York and instead had lost in court, it could have faced greater liability for the content published on its platform.
‘As the internet—and platforms like Facebook—play an increasing role in connecting us all to information related to economic opportunities, it’s crucial that micro-targeting not be used to exclude groups that already face discrimination,’ Galen Sherwin, senior staff attorney at the ACLU, said in a statement Tuesday. ‘We are pleased Facebook has agreed to take meaningful steps to ensure that discriminatory advertising practices are not given new life in the digital era, and we expect other tech companies to follow Facebook’s lead.’ Beyond Facebook, civil rights groups hope that the settlement will create a benchmark for the rest of Silicon Valley when it comes to ad discrimination.
As part of the settlement, Facebook will create a separate portal for ads in the areas of housing, employment, and credit that will limit the number of targeting categories available to advertisers. In addition, the parties who filed the suits and complaints will continue to work with Facebook for three years to monitor the effects of the changes and study the potential for unintended biases in its algorithms. And Facebook will change one of the ways it builds target audiences for ads so that it no longer takes protected classes like race and gender into account when it creates an audience for an advertiser.
Despite these big changes, Aaron Rieke, the managing director at Upturn, a nonprofit that researches the intersection of technology and discrimination, says they don’t go far enough. ‘These are real improvements, but it’s also not a complete answer to the problem,’ Rieke says. ‘The potential for discrimination still exists here, certainly.’ Upturn submitted a brief in the California class-action case highlighting the ways Facebook’s algorithms and tools can inadvertently discriminate. In those areas, Rieke believes the settlements fall short.
But Morgan Williams, general counsel for the National Fair Housing Alliance, which sued in New York, praised the settlement resulting from that suit. ‘The agreement will set a new standard across the tech industry concerning company policies that intersect with civil rights,’ Williams said. ‘We’re pleased with the settlement because it involves Facebook making broad and unprecedented changes to its platform.’
But there’s still much more to do. Figuring out when discrimination is occurring online requires robust independent auditing, including the kind of investigative journalism that exposed some of these practices in the first place. But researchers and journalists who engage in routine testing for online discrimination may face liability for violating website terms of service.
Facebook has allowed some of the parties to the settlement to run tests on its ad platform to ensure that it complies with the agreement. But this kind of testing, along with the monitoring of published ads that journalists and researchers have already done, must be able to take place unimpeded across all platforms, just as audit testing has long been used offline to enforce civil rights laws.
Because Facebook is such a dominant player in online advertising, today’s settlement marks a significant step toward ensuring that we don’t lose our civil rights when we go online to find a house, job, or loan.