Boston Bar Journal

Fair Housing Enforcement in the Age of Digital Advertising: A Closer Look at Facebook’s Marketing Algorithms

February 19, 2020 | Winter 2020 Vol. 64 #1

by Nadiyah Humber and James Matthews

Legal Analysis

Introduction

The increasing use of social media platforms to advertise rental opportunities creates new challenges for fair housing enforcement.  The Fair Housing Act, 42 U.S.C. §§ 3601-19 (“FHA”), makes it unlawful to discriminate in the sale or rental of housing on the basis of race, color, religion, sex, familial status, national origin, or disability (“protected classes”).  The FHA also prohibits discriminatory advertising, including distributing advertisements in a way that denies people information about housing opportunities based on their membership in a protected class.  Accordingly, advertisers and digital platforms that intentionally or unintentionally cause housing advertisements to be delivered to users based on their membership in a protected class may be liable for violating the FHA.

In March 2018, in response to what they perceived to be discriminatory advertising on Facebook, the National Fair Housing Alliance (“NFHA”) and several housing organizations filed suit in federal court in New York City.[1]  The lawsuit alleged that Facebook’s advertising platform enabled landlords and real estate brokers to prevent protected classes from receiving housing ads.  Facebook settled the suit on March 19, 2019.[2]  As part of the settlement, Facebook agreed to make a number of changes to its advertising portal so that housing advertisers can no longer choose to target users based on protected characteristics such as age, sex, race, or zip code.  Facebook also committed to allowing experts to study its advertising platform for algorithmic bias.  It remains to be seen whether this agreement goes far enough in curtailing discriminatory advertising practices, as Facebook confronts further enforcement action from a government agency with respect to similar issues.  Moreover, a recent research study found that Facebook’s digital advertising platform may still lead to discriminatory outcomes despite the changes already made.

On August 13, 2018, the Assistant Secretary for Fair Housing and Equal Opportunity filed a complaint with the Department of Housing and Urban Development (“HUD”) alleging that Facebook was in violation of the FHA.  In March 2019 (at the same time as the settlement agreement with NFHA), the Office of Fair Housing and Equal Opportunity determined that reasonable cause existed and issued an official Charge of Discrimination against Facebook.[3]

Notwithstanding these suits and administrative actions, for fair housing claims against media giants like Facebook to survive in court, HUD and future plaintiffs must first successfully argue that Facebook is not protected by the Communications Decency Act (“CDA”).[4]

Communications Decency Act

Congress enacted the CDA, in part, to prohibit obscene or indecent material from reaching children on the internet, and also to safeguard internet ingenuity.[5]  What was meant as a protective measure for the young, impressionable, and inventive, however, evolved into a powerful defense tool for web applications like Facebook.  Section 230 of the CDA immunizes providers of interactive computer services against liability arising from content created by third parties.  To overcome the CDA hurdle, litigants have to demonstrate that Facebook “materially contributes” to the management of content on its platform.  Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d 1157, 1163 (9th Cir. 2008) (en banc).  While many online service providers have successfully used Section 230 in their defense, the protections offered to internet service providers are not absolute.

The CDA contains requirements that restrict the application of Section 230.[6]  The language of Section 230 prevents a “provider or user of an interactive computer service” from being “treated as the publisher or speaker of any information” that is exclusively “provided by another information content provider.”[7]  The U.S. Court of Appeals for the Ninth Circuit concluded that publication “involves reviewing, editing, and deciding whether to publish or to withdraw from publication third-party content.”  Barnes v. Yahoo!, Inc., 570 F.3d 1096, 1102 (9th Cir. 2009).  The idea is that website operators should not be liable for deciding to edit or remove offending third-party content.

Based on this reading, the law immunizes only “certain internet-based actors from certain kinds of lawsuits.”[8]  The statute, as discussed in Roommates.com, 521 F.3d at 1162, provides no protection for online content that was created by a website operator or developed, in whole or in part, by the website operator.  Courts have reaffirmed the CDA’s limited scope: it protects self-policing service providers acting as “publishers” of third-party content; it does not bar all categories of third-party claims (e.g., the civil rights violations at issue in this article).  Barnes, 570 F.3d at 1105; accord Doe v. Internet Brands, Inc., 824 F.3d 846, 852-53 (9th Cir. 2016).  These limitations are crucial.  If a plaintiff can show that Facebook developed the content on its platform in whole or in part or, content aside, that Facebook produces discriminatory outcomes through mechanisms of its own design, Facebook may fall outside Section 230 immunity.

Optimization Discrimination Study

A recent study by researchers at Northeastern University (the “Study”)[9] provides evidence of Facebook’s control over ad dissemination, demonstrating how Facebook manages the output of information based on headlines, content, and images through a process called “optimization.”[10]  In short, the Study set out to determine how advertising platforms themselves play a role in creating discriminatory outcomes.  The Study highlighted the mechanisms behind, and impact of, ad delivery, a process distinct from ad creation and targeting.  For example, the Study found that ads featuring musical content stereotypically associated with Black individuals were delivered to an audience that was over 85% Black, while ads featuring musical content stereotypically associated with White individuals were delivered to an audience that was over 80% White.  The researchers concluded that the “ad delivery process can significantly alter the audience the ad is delivered to compared to the one intended by the advertiser based on the content of the ad itself.”  The Study also simulated marketing campaigns and found that Facebook’s algorithms “skewed [ad] delivery along racial and gender lines,” categories protected under the FHA.  These results suggest that, even if a housing advertiser can no longer explicitly target ads based on attributes like age, gender, and zip code, it could still use Facebook’s marketing platform to steer ads away from protected segments of users by manipulating the content of the ad itself.  Moreover, the platform may cause such discriminatory outcomes regardless of whether the advertiser intended them.
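To illustrate the mechanism the Study describes, consider the following short simulation (a minimal sketch in Python, not Facebook’s actual system).  It assumes a user base split evenly between two groups, A and B, and an ad creative whose historical engagement happens to correlate with group membership; all numbers are invented for illustration.  Even though the advertiser never selects a demographic attribute, a delivery algorithm that simply maximizes predicted engagement produces a heavily skewed audience:

```python
# Hypothetical sketch: engagement-optimized ad delivery can skew an ad's
# audience along demographic lines without any demographic targeting.
import random

random.seed(42)

# Simulated user base: 50% group A, 50% group B (labels are hidden from
# the advertiser; the platform optimizes only on predicted engagement).
users = [{"id": i, "group": "A" if i < 5000 else "B"} for i in range(10000)]

# Assumed (invented) historical engagement rates for this ad creative,
# which happen to correlate with group membership.
ENGAGEMENT_RATE = {"A": 0.08, "B": 0.02}

def predicted_engagement(user):
    """Stand-in for a relevance model: scores echo past engagement
    patterns, which here correlate with the protected attribute."""
    return ENGAGEMENT_RATE[user["group"]] + random.gauss(0, 0.01)

# Deliver the ad to the 1,000 highest-scoring users.
delivered = sorted(users, key=predicted_engagement, reverse=True)[:1000]

share_a = sum(u["group"] == "A" for u in delivered) / len(delivered)
print(f"Group A share of delivered audience: {share_a:.0%}")
# Although the user base is 50/50, nearly the entire delivered audience
# comes from group A: the optimization objective itself produces the skew.
```

The skew arises from the optimization objective, not from any targeting choice, which is why the Study treats ad delivery as a source of discrimination distinct from ad creation and targeting.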

Case Law Interpreting CDA

The Study’s findings set the foundation for evaluating Facebook’s control over the manipulation of content and ad distribution on its platform.  Two seminal cases, Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997), and Roommates.com, 521 F.3d 1157, outline tests for determining when an online platform acts as a content provider rather than a mere service provider.[11]  The Study makes a strong case that Facebook manages content, eliminating immunity under Section 230.  Litigants can also persuasively distinguish their claims against Facebook from a recent decision interpreting Section 230 liability.  In Herrick v. Grindr, LLC, 306 F. Supp. 3d 579 (S.D.N.Y. 2018), aff’d, 765 F. App’x 586 (2d Cir. 2019), the courts ruled in favor of Grindr (a same-sex dating application) on all but one of the plaintiff’s claims.  The plaintiff had argued that Grindr failed to monitor and remove content created by his ex-partner, and the court concluded that Section 230 barred several of his claims because they were “inextricably related” to Grindr’s role in editing or removing offending content (conduct protected under the CDA).  Herrick, 306 F. Supp. 3d at 588.  The Supreme Court denied Herrick’s petition for certiorari on October 7, 2019.[12]

A major distinguishing feature between the facts in Herrick and the Study’s findings about Facebook is how the two websites handle third-party content.  In Herrick, the claim against Grindr was based on Grindr’s failure to remove content generated by a third party.  The issue with Facebook lies in its use of optimization algorithms.  The point is that discriminatory outcomes ultimately result from Facebook’s manipulation of ad delivery for the purpose of reaching certain groups to the exclusion of others in protected categories.  Facebook’s tools go well beyond the function of “neutral assistance,” because its platform directs advertisements to sectors of people using discriminatory preferences created by Facebook, not by third parties.[13]

Intentional Discrimination

If it can be successfully argued that Facebook is not immune from suit under the CDA, housing advertisers and digital platforms that intentionally or unintentionally target ads to certain groups of users based on their membership in a protected class may be sued for violating the FHA.  As described above, the Study determined that housing advertisers may still be able to use Facebook’s marketing platform to steer housing ads away from protected classes of tenants by manipulating the content of the ad.  In such circumstances, a housing advertiser who uses the ad’s content as a covert method of discriminatory distribution may be violating the FHA.  The digital platform may also be liable, either because it is actively involved in facilitating the selective distribution of ads or because, under an agency theory, it is vicariously liable for the advertiser’s conduct.

Disparate Impact

Even if it cannot be shown that a housing advertiser intended to discriminate, if the ad delivery mechanism has the effect of distributing housing ads in a discriminatory way, the advertiser and platform may still be liable for violating the FHA under a theory of disparate impact.  Disparate impact discrimination occurs when a neutral policy or practice has a discriminatory effect on members of a protected class.  See Texas Dep’t of Hous. & Cmty. Affairs v. Inclusive Communities Project, Inc., 135 S. Ct. 2507, 2523 (2015); see also 24 C.F.R. § 100.500.  A three-part burden-shifting framework is used to evaluate liability.  Id.  Protected class members have the initial burden of establishing that a practice has a disproportionate adverse effect on a protected class.  To meet this initial burden, a plaintiff must “allege facts at the pleading stage or produce statistical evidence demonstrating a causal connection” between the policy and the disparate impact.  Inclusive Communities, 135 S. Ct. at 2523.

If a protected class member makes out a prima facie claim of disparate impact, the burden then shifts to the accused party to show that the practice is necessary to achieve a valid interest.  See Robert G. Schwemm & Calvin Bradford, Proving Disparate Impact in Fair Housing Cases After Inclusive Communities, 19 N.Y.U. J. Legis. & Pub. Pol’y 685, 696-97 (2016).  The protected class members then have an opportunity to show that the interest could be achieved through less discriminatory means.  Id.

In the digital advertising context, protected class members would have the initial burden of showing that they were denied equal access to information about a housing opportunity as a result of a housing advertiser’s marketing campaign.

Statistical Evidence

While the Study was able to demonstrate the potential for “skewed” ad delivery based on protected characteristics, further research is needed to determine how a plaintiff might marshal statistical evidence to support a particular claim.  As the Study notes, without access to a platform’s “data and mechanisms,” it may be difficult to assess whether a particular advertising campaign has led to discriminatory outcomes.[14]  It may therefore be challenging for adversely affected users to develop the data needed at the pleading stage to make out a prima facie claim of disparate impact.  This may explain why HUD is continuing to pursue its legal challenge against Facebook despite the remedial measures Facebook has already agreed to undertake, including allowing research experts to study its advertising platform for algorithmic bias.[15]  In other words, HUD’s intent may be to better understand how Facebook’s ad delivery algorithm works so that its discriminatory impact can be limited.
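As a concrete, hypothetical example of the kind of statistical showing a plaintiff might attempt, the sketch below (in Python) applies a standard two-proportion z-test, comparing a protected group’s share of the users eligible to receive a housing ad with its share of the users who actually received it.  All figures are invented; as the Study notes, real numbers of this kind would generally require access to the platform’s delivery data:

```python
# Hypothetical sketch: testing whether an ad's delivered audience is
# statistically skewed relative to the eligible audience.
from math import sqrt, erfc

def two_proportion_z_test(x1, n1, x2, n2):
    """Return (z, two-sided p-value) for H0: the two proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)                # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))              # two-sided normal tail
    return z, p_value

# Invented figures: the protected group is 40% of the eligible audience
# (20,000 of 50,000) but only 25% of the delivered audience (500 of 2,000).
z, p = two_proportion_z_test(20000, 50000, 500, 2000)
print(f"z = {z:.1f}, two-sided p = {p:.2g}")
# A large |z| and a vanishingly small p-value indicate the disparity is
# very unlikely to be chance -- the sort of causal-connection evidence
# Inclusive Communities contemplates at the pleading stage.
```

Whether such a showing suffices will, of course, depend on the quality of the underlying data and on tying the disparity to the challenged delivery mechanism rather than to other causes.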

Causal Connection

Because digital advertising companies play an active role in the ad delivery process, a discriminatory distribution of ads could be attributed to the platform.  While there are few decisions addressing FHA liability for algorithmic decision-making programs, the court in Connecticut Fair Hous. Ctr. v. CoreLogic Rental Prop. Sols., LLC, 369 F. Supp. 3d 362 (D. Conn. 2019), found that the plaintiffs had pled sufficient facts to establish a causal connection between a tenant screening company’s alleged activity and unlawful housing denials, supporting a claim of disparate impact based on race.  Id. at 378-79.  The court found that the defendant had created and provided the automated screening process, suggested the categories by which the housing provider could screen potential tenants, made eligibility determinations, and sent letters notifying potential tenants of those decisions.  Id.

Digital advertising companies similarly create the marketing platform that housing advertisers use, provide the criteria by which advertisers select users, and design and maintain the algorithms that decide to whom the ads will be delivered.  A sufficient nexus should therefore exist between the advertising platform’s activity and the selective distribution of ads to support a disparate impact claim.

Valid Interest

Housing providers and digital advertising platforms arguably have a “valid interest” in effectively marketing their housing services, and ad delivery algorithms are an efficient way to reach relevant users.  However, given the abundance of print and online advertising options that do not rely solely on ad delivery algorithms, such as Craigslist, Zillow, Trulia, and Apartments.com, less discriminatory means exist by which housing advertisers can successfully market their services.

HUD recently proposed a new disparate impact rule that would raise the bar even higher for plaintiffs bringing disparate impact claims and provide housing advertisers with a defense if a digital advertising platform’s algorithmic model was the cause of a discriminatory outcome.[16]  A number of tenant advocacy groups and other stakeholders, such as Harvard Law School’s Cyberlaw Clinic, have submitted comments opposing the proposed rule, arguing, among other concerns, that it would perpetuate discrimination by “significantly reduc[ing] incentives for algorithm users and vendors to test their tools for bias” contrary to the purpose of the FHA.[17]

Conclusion

The FHA was designed to provide all home-seekers who have the resources with equal access to housing stock and opportunity.  It seems clear that online platforms in the business of designing and maintaining ad delivery algorithms have an impact on large segments of protected populations.  The tension between the need for more information to combat discriminatory algorithms and platforms’ proprietary interests remains.  One important way forward is to balance these interests within the bounds of the FHA, preserving incentives for platforms to evaluate their ad delivery tools for distribution bias and ensuring more inclusive participation in the housing market for all social media users.

Nadiyah J. Humber is Assistant Clinical Professor of Law and Director of the Corporate Counsel, Government, and Prosecution Clinical Externship Programs at Roger Williams University School of Law (“RWU”).  RWU students earn academic credit externing for in-house legal offices of corporations, offices of prosecution, and government agencies in Rhode Island and beyond.  Professor Humber teaches related seminars for each program on the role of counsel in one-client entities and on professional development through practice.

James Matthews is a Clinical Fellow in Suffolk Law School’s Accelerator Practice and Housing Discrimination Testing Program (“HDTP”), where he supervises law students in housing discrimination, landlord-tenant, and other consumer protection matters related to housing.  Attorney Matthews also has significant teaching and presenting experience.  He helps conduct fair housing trainings and presentations as part of HDTP’s community education and outreach, and teaches an upper-level landlord-tenant course he developed, which includes instruction on state and federal fair housing law.

[1] Nat’l Fair Hous. Alliance et al. v. Facebook, Inc., No. 18 Civ. 2689 (S.D.N.Y.), Complaint (detailing allegations), available at https://nationalfairhousing.org/wp-content/uploads/2018/03/NFHA-v.-Facebook.-Complaint-w-Exhibits-March-27-Final-pdf.pdf (last visited Jan. 13, 2020).

[2] National Fair Housing Alliance, Facebook Settlement, available at https://nationalfairhousing.org/facebook-settlement/ (last visited Jan. 20, 2020).

[3] Assistant Sec’y of Fair Hous. & Equal Opportunity v. Facebook, Inc., No. 01-18-0323-8, Charge of Discrimination (detailing procedural history), available at https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf (last visited Dec. 8, 2019).

[4] 47 U.S.C. § 230(c) (2019).

[5] Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 7, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018).  See generally 47 U.S.C. § 230(b) (detailing policy goals for freedom on the internet).

[6] Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 9, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018).

[7] 47 U.S.C. §§ 230(c)(1), (f)(3) (2019).

[8] Brief of Internet, Business, and Local Government Law Professors as Amici Curiae Supporting the Respondents at 4, Homeaway.com & Airbnb, Inc. v. City of Santa Monica, Nos. 2:16-cv-06641-ODW, 2:16-cv-06645-ODW (9th Cir. May 23, 2018).

[9] Muhammad Ali et al., Discrimination through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes, available at https://www.ccs.neu.edu/home/amislove/publications/FacebookDelivery-CSCW.pdf (last visited Dec. 6, 2019).

[10] Id. at 7 (explaining optimization on Facebook).

[11] See Nadiyah J. Humber, In West Philadelphia Born and Raised or Moving to Bel-Air? Racial Steering as a Consequence of Using Race Data on Real Estate Websites, 17 Hastings Race & Poverty L.J. 129, 153-55 (2020) (analyzing pertinent case law precedent for Section 230 immunity).  There is a difference between online services that manage content on their sites (content providers) and those that act more as a storehouse of information (service providers).  Id.


[12] Bloomberg Law, available at https://news.bloomberglaw.com/tech-and-telecom-law/grindr-harassment-case-wont-get-supreme-court-review.

[13] 47 U.S.C. § 230(c) (2019) (citing language from the act); distinguishing O’Kroley v. Fastcase, Inc., No. 3-13-0780, 2014 WL 2881526, at *1-2 (M.D. Tenn. June 25, 2014) (finding that providing search returns based on automated algorithms and user inputs does not constitute creating content).

[14] Ali et al., supra note 9.

[15] See supra note 2.

[16] See HUD’s Implementation of the Fair Housing Act’s Disparate Impact Standard, 84 Fed. Reg. 42853 (proposed Aug. 19, 2019) (for example, providing a defense where a “(2) plaintiff alleges that the cause of a discriminatory effect is a model, such as a risk assessment algorithm, and the defendant . . . (ii) Shows that the challenged model is produced, maintained, or distributed by a recognized third party that determines industry standards, the inputs and methods within the model are not determined by the defendant, and the defendant is using the model as intended by the third party . . .”)

[17] See Cathy O’Neil, Comment Regarding Docket No. FR-6111-P-02, http://clinic.cyber.harvard.edu/files/2019/10/HUD-Rule-Comment-ONEIL-10-18-2019-FINAL.pdf (last visited Jan. 20, 2020).