National Fair Housing Alliance Settles Lawsuit with Facebook: Transforms Facebook’s Ad Platform Impacting Millions of Users
The National Fair Housing Alliance (NFHA), Fair Housing Council of Greater San Antonio (FHCGSA), Fair Housing Justice Center of New York (FHJC), and Housing Opportunities Project for Excellence, Inc. of Miami (HOPE, Inc.), collectively the “Fair Housing Groups,” settled an historic lawsuit with Facebook that will drive unprecedented and sweeping changes across its advertising platform.
In March of 2018, NFHA and three of its member organizations filed a lawsuit against Facebook, Inc. in federal court in New York City, alleging that Facebook’s advertising platform enabled landlords and real estate brokers to exclude people of color, families with children, women, people with disabilities, and other protected groups from receiving housing ads. NFHA and its members were represented by Diane L. Houk, Katherine Rosenfeld, and David Berman of the New York City-based civil rights law firm of Emery Celli Brinckerhoff & Abady LLP.
The lawsuit alleged that Facebook created pre-populated lists that made it possible for its housing advertisers to “exclude” (in Facebook terminology) home seekers from viewing or receiving rental or sales ads because of protected characteristics, including race, family status, and sex. The Fair Housing Groups conducted investigations that confirmed Facebook’s alleged discriminatory practices.
As a result of advocacy and enforcement, the Fair Housing Groups and Facebook have now settled the lawsuit with an agreement that will set new standards across the Tech industry concerning company policies that intersect with civil rights laws. As part of the settlement agreement, NFHA will work with Facebook to develop an in-house fair housing training program for Facebook leadership and staff. The Fair Housing Groups will also monitor Facebook’s advertising platform on a continual basis. Furthermore, Facebook will work with the Fair Housing Groups to support programs that expand fair housing opportunities throughout the country.
“This settlement positively impacts all of Facebook’s 210 million users in the U.S. since everyone is protected by our nation’s fair housing laws,” said Lisa Rice, President and CEO of NFHA. “As the largest digitally-based advertising platform and a leader in Tech, Facebook has an obligation to ensure that the data it collects on millions of people is not used against those same users in a harmful manner,” Rice added. In the fourth quarter of 2018 alone, Facebook took in $8.246 billion in advertising revenue in the U.S. and Canada.
On a platform with such a massive reach, advertisers instantly become powerful players. “Big Tech companies like Facebook must design their platforms in a non-discriminatory manner and have a huge responsibility to ensure advertisers are not enabled to conduct business in a discriminatory fashion,” stated Keenya Robertson, President and CEO of HOPE, Inc.
“Companies must understand that depending on how data is being used, it can harm people and communities. This agreement will help other companies that rely on algorithms and data for a range of services and operations to carefully consider whether their policies, products, and platforms are illegally discriminating against consumers,” added FHJC Executive Director Fred Freiberg.
The federal Fair Housing Act prohibits discrimination against consumers based on race, color, religion, sex, disability, familial status, and national origin. The law also makes it illegal to “make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement” that would limit housing options for protected groups. Housing advertisers must not be allowed or encouraged to exclude these groups from their advertising efforts.
“Facebook’s previous settings allowed advertisers to create ads that excluded people of color or families with children, or limited the specific geographies where people could see ads, which could perpetuate segregation in communities throughout the nation,” explained Sandra Tamez, President and CEO of FHCGSA. According to the Pew Research Center, 74 percent of U.S. Facebook users were not even aware that their personal characteristics were being used by advertisers.
Facebook has now agreed to establish a separate advertising portal, the “HEC portal,” for advertisers seeking to create housing, employment, and credit ads on Facebook, Instagram, and Messenger. The portal will limit advertisers’ targeting abilities to prevent them from illegally discriminating. Housing advertisers will no longer be allowed to target consumers based on race, ethnicity, color, national origin, gender, age, religion, family status, disability, or sexual orientation. Housing advertisers will also be prevented from advertising based on zip code. Instead, they will be permitted to advertise based on a 15-mile radius from a city center or address.
Facebook will restructure its “Lookalike Audience” feature, which formerly allowed advertisers to target ads to Facebook users who were similar to an advertiser’s existing customers. Moving forward, Facebook will restructure and rename this tool so that it will not consider users’ age, relationship status, religious or political views, school, interests, zip code or membership in “Facebook Groups.”
Facebook will also create a page for consumers to view all housing ads placed on its platform, post a self-certification agreement that advertisers must agree to regarding all anti-discrimination laws, provide anti-discrimination and civil rights educational materials to advertisers, and continually work with scholars, organizations, experts, and researchers to examine algorithmic modeling and its potential for discriminatory impact and bias.
The Fair Housing Groups’ settlement agreement with Facebook sets a significant and historic precedent for Big Data and Tech companies throughout the country. As more consumers rely on Big Tech in their daily lives, it is important that companies abide by and enforce civil rights laws across their platforms. Big Tech and Big Data companies must not allow their platforms to become tools for unlawful behavior, including segregation and discrimination in housing and beyond.