In 2021, a Facebook user filed a lawsuit because she believed she wasn't getting a fair shot at viewing advertisements. If you're like me, you want ads out of your social media experience at all costs. But for this prospective tenant in the Washington, D.C. area, the issue was more than an unwanted publicity blurb on Facebook; she argued it had grave real-life consequences.

The suit accuses nine companies that manage apartment buildings in the D.C. area of digital housing discrimination against older people. She claims the defendants deliberately excluded people over the age of 50 from viewing their ads, denying her the housing advertisements that younger prospective tenants received.

According to the lawsuit, when creating a targeted Facebook advertisement, advertisers can determine who sees it based on age, gender, location, and preferences. The rental companies, the suit alleged, used Facebook's targeting function to exclude people like her because of their age, directing the ads to younger prospective tenants instead.

The Lawyers' Committee for Civil Rights Under Law, which filed a brief in support of the user, argued in a press release that redlining is discrimination, plain and simple, and that we must not allow corporations to blame technology for harmful decisions made by the people who run them.

The case was dismissed after the judge found that online targeting of advertisements did not cause injury to consumers. A law firm whose practice spans litigation, securities and regulatory enforcement, business and finance, intellectual property, public finance, and real estate matters said the ruling could have a significant impact on how we view discrimination online.

The ruling seems likely to make it more difficult for private parties to bring lawsuits over online ad targeting on social media networks or through methods like paid search, the firm said.

This is one of many lawsuits against Facebook claiming discrimination. We already know how harmful these ads can be, from being used to spy on us to deepening devastating partisan divides. But something else harmful is going on with online ads, particularly on one of the largest ad platforms ever: Facebook, with a total advertising audience of more than two billion people. Digital redlining means any one of those people could be missing out on ads for housing, credit opportunities, and other resources that shape the wealth gap. That is why this matters.

Wait, what is digital redlining?

Redlining is the practice of companies and individuals denying loans and other resources to people who live in certain neighborhoods, deepening racial and financial divides. It can happen online, too.

Digital redlining refers to any use of technology to perpetuate discrimination. The Greenlining Institute, a California-based organization that works to end the practice, describes one form of it: internet companies failing to build out service infrastructure in lower-income communities.

The result is that lower-income people pay more for internet service while dealing with slower speeds than people in wealthier areas. The FCC is forming a task force to fight digital discrimination and promote equal broadband access nationwide.

Unfair ad-targeting practices are also a form of digital redlining. According to the American Civil Liberties Union, online ad targeting can replicate existing inequalities in society, excluding people who belong to historically marginalized groups from opportunities for housing, jobs, and credit.

Digital redlining has become the new frontier of discrimination, as social media platforms like Facebook and online advertisers increasingly use personal data to target ads based on race, gender, and other protected traits. And despite agreements to make sweeping changes to its ad platform, the practice still persists on Facebook.

It isn't that digital redlining is more harmful on Facebook than on other online platforms; it's simply more prevalent there.

Even though Facebook has taken some steps to mitigate those harms, the fact that it has offered tools that not just permit but invite advertisers to exclude users based on certain characteristics is tremendously harmful.

Since ProPublica first reported on the problem in 2016, many activists have agreed that not enough has been done to resolve ad discrimination on Facebook.

How does digital redlining work?

Say a restaurant wants to show ads for upcoming job openings only to specific candidates, or a real estate group wants to advertise its homes only to wealthy people who live in upper-class neighborhoods. When it chooses a platform like Facebook, the company will look for ways to focus its ad coverage on those groups. Companies can choose who can and can't see their ads. On Facebook, advertisers can use specific and broad approaches to create a target audience. Specific targeting yields a smaller potential audience, like parents living in Tucson, Arizona, while broad targeting relies on sweeping categories like gender and age.
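To make the mechanics concrete, here's a minimal sketch in Python of what audience targeting boils down to: filtering a user base on advertiser-chosen attributes. The users, fields, and rules are invented for illustration; this is not Facebook's actual system.

```python
# Hypothetical illustration of ad targeting as attribute filtering.
# Nothing here is Facebook's real code or data model.

users = [
    {"name": "A", "age": 34, "city": "Tucson", "parent": True},
    {"name": "B", "age": 52, "city": "Tucson", "parent": True},
    {"name": "C", "age": 29, "city": "Phoenix", "parent": False},
]

# "Specific" targeting: parents in Tucson under 50 -- a narrow audience.
specific = [u for u in users
            if u["parent"] and u["city"] == "Tucson" and u["age"] < 50]

# "Broad" targeting: everyone under 50. Note what exclusion means here:
# user B never even gets the chance to see the ad.
broad = [u for u in users if u["age"] < 50]

print([u["name"] for u in specific])  # ['A']
print([u["name"] for u in broad])     # ['A', 'C']
```

The point of the sketch is that exclusion is invisible to the excluded: user B has no way of knowing an ad was shown to everyone but them.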

After many court battles, Facebook created special ad categories covering housing, employment, and credit, with restricted targeting options in its Ads Manager. A company can still target an advertisement to a specific audience rather than sending it out widely, but it is not supposed to do so based on protected characteristics such as age, gender, or where potential consumers live. At least, that is the goal.

According to Facebook, these ads do not allow targeting by age, gender, zip code, multicultural affinity, or any detailed options describing or appearing to relate to protected characteristics. Advertisers running ads in these categories also cannot use Lookalike Audiences to reach new people who are similar to their existing customers.

But is that enough?

Morgan Williams, the general counsel of the National Fair Housing Alliance, told Mashable that other aspects of Facebook's ad system draw scrutiny and concern. Williams pointed to research that used public voting records in North Carolina to analyze how Facebook's ad tools built audiences, and found that those audiences skewed along demographic lines.

That was true for both the Lookalike Audience tool and the Special Ad Audience tool, which Facebook designed explicitly not to use sensitive demographic attributes when finding similar users.

If you gave Facebook a list of contacts resembling your client list, it could target ads to similar users. In that targeting, Williams said, certain interest metrics were specifically concerning and, from his organization's perspective, would have skewed who the ads reached.

For housing, employment, or credit ads in the U.S., advertisers can instead use a variant of the lookalike feature, the Special Ad Audience, which finds people based on similarities in online behavior without considering things like age, gender, or zip code. Advertisers can seed it with sources like customer lists, website or app traffic, or engagement on Facebook. Activists argue that even within a special ad audience, targeting can still track protected traits.

Advertisers give Facebook a seed audience, and Facebook then selects other users who look like that audience. Take me: Christianna Silva, a 27-year-old queer person who lives in Brooklyn. Advertisers aren't saying "show this ad to 27-year-old queer people who live in Brooklyn"; they're saying "show this to people like Christianna Silva."

Obviously, if your seed audience reflects a certain demographic, the matching audience will reflect that demographic, too.
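Here's a rough sketch of why that happens, assuming a simple interest-overlap matcher. The names, interests, and similarity measure are all hypothetical stand-ins for whatever Facebook actually uses; the point is that the matching never touches age, yet the output audience inherits the seed's age skew because interests act as a proxy.

```python
# Minimal sketch of how a lookalike-style audience can inherit a seed
# audience's demographics even when matching ignores protected traits.
# All names, interests, and logic are hypothetical.

from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int          # protected trait: deliberately never used for matching
    interests: set

def similarity(a: User, b: User) -> float:
    """Jaccard similarity over interests only; age is never consulted."""
    if not a.interests or not b.interests:
        return 0.0
    return len(a.interests & b.interests) / len(a.interests | b.interests)

def lookalike(seed: list, pool: list, k: int) -> list:
    """Pick the k pool users most similar (by interests) to any seed user."""
    scored = [(max(similarity(u, s) for s in seed), u) for u in pool]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [u for _, u in scored[:k]]

# Interests correlate with age in this toy population, as they do in life.
seed = [User("seed1", 25, {"skateboarding", "streaming", "memes"}),
        User("seed2", 27, {"streaming", "memes", "festivals"})]
pool = [User("young1", 24, {"memes", "streaming"}),
        User("young2", 29, {"skateboarding", "festivals"}),
        User("older1", 58, {"gardening", "cruises"}),
        User("older2", 62, {"gardening", "antiques"})]

audience = lookalike(seed, pool, k=2)
print([(u.name, u.age) for u in audience])
# [('young1', 24), ('young2', 29)] -- the matcher never saw age,
# but the audience skews young anyway, because interests stood in for it.
```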

Beyond targeting, Facebook's ad delivery system decides which users actually see an ad, making predictions based on user data like where people live, what they like or post, and what groups they join. Data about who we are, where we live, what we post, and what groups we join is indicative of our protected traits and can lead to discrimination.
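A toy scorer shows how that plays out. The delivery model below never sees a protected trait, but ranking users by overlap with interest groups that correlate with gender effectively sorts them by it. Everything here is a hypothetical illustration, not Meta's model.

```python
# Hypothetical delivery-style relevance score. Features and data are
# invented; the point is that "neutral" signals like group membership
# can correlate with protected traits and skew who sees an ad.

def relevance(user_groups: set, ad_topic_groups: set) -> float:
    """Score a user for an ad by overlap between their groups and the ad's."""
    return len(user_groups & ad_topic_groups) / max(len(ad_topic_groups), 1)

job_ad_groups = {"woodworking", "power tools"}   # groups that skew male
users = {
    "u1": {"woodworking", "hiking"},
    "u2": {"knitting", "hiking"},
}

# The scorer never sees gender, but whoever joined correlated groups wins.
print({name: relevance(g, job_ad_groups) for name, g in users.items()})
# {'u1': 0.5, 'u2': 0.0}
```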

Is this legal? 

Targeting ads based on protected traits is against the law. A University of Southern California study found that Facebook's ad delivery system showed different employment ads to women and men, even though the jobs required the same qualifications and the ads used the same targeting parameters. That's illegal, but there is confusion about how Section 230 of the Communications Decency Act applies to online ad targeting.

Galen Sherwin, a senior staff attorney with the American Civil Liberties Union, said Facebook has been hiding behind Section 230 in its litigation. The ACLU largely supports Section 230, but its position is that the statute doesn't protect Facebook from this conduct, because Facebook itself was the architect of the targeting tools.

Changes have been made

Facebook has made changes to its ad delivery system.

A spokesman for Meta said that Facebook has made significant investments to help prevent discrimination on its ad platforms, and that advertisers cannot use the platform to engage in wrongful discrimination. That feels like a weak point, since no one ever reads the terms and conditions. But really, it's not so much a question of whether advertisers read the terms as whether Facebook polices the rules in its own terms. The platform is terrible at policing its own rules; just consider the way misinformation continues to spread there.

Advertisers are not allowed to use interests, demographics, or behaviors for exclusion targeting. And because advertisers self-report whether they are posting ads about jobs, housing, and the like, Facebook uses human reviewers and machine learning to catch ads that are misreported. Meta has not disclosed how well this works.
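As a rough illustration of what that review layer might look like at its simplest, here's a keyword-based classifier that flags ads which probably belong in a special category. Meta's actual models are not public; every keyword and function name here is invented.

```python
# Hypothetical keyword check an automated reviewer might run to catch
# housing/employment/credit ads an advertiser failed to self-report.

CATEGORY_KEYWORDS = {
    "housing": {"apartment", "rent", "lease", "mortgage"},
    "employment": {"hiring", "job", "career", "apply now"},
    "credit": {"loan", "credit card", "financing"},
}

def flag_special_categories(ad_text: str) -> list:
    """Return special ad categories whose keywords appear in the ad copy."""
    text = ad_text.lower()
    return [cat for cat, words in CATEGORY_KEYWORDS.items()
            if any(w in text for w in words)]

print(flag_special_categories("Spacious apartment for rent, apply now!"))
# ['housing', 'employment'] -- a human reviewer would then confirm.
```

Real systems presumably use trained models rather than keyword lists, which is exactly why disclosure of their accuracy matters.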

In the U.S., Canada, and the EU, anyone running housing, employment, or credit ads must use special ad categories with restricted targeting options; they cannot target by gender, age, or zip code. Still, according to the American Civil Liberties Union, Facebook lets housing providers target potential renters or homeowners by geographic area, a clear proxy for race in our still-segregated country.

Are those changes enough?

The courts have forced changes at Facebook, but activists argue the steps taken so far have been too small.

In March, Facebook disabled a targeting feature for housing, credit, and job ads, but its system still delivered ads unevenly across demographic groups. In one study, a Domino's pizza ad was shown to more men than women, while an ad for a grocery delivery and pickup service was shown to more women than men. And according to an audit, ads for jewelry sales associate jobs on Facebook went to more women than men.

Prospective tenants claimed in one lawsuit that Facebook excluded them from receiving housing advertisements because of their protected characteristics.

While ad classification will never be perfect, a Meta spokesman said, the company is always working to improve its detection and enforcement over time.

In January 2022, Facebook began removing more targeting options related to topics people may perceive as sensitive, such as options referencing causes, organizations, or public figures tied to health, race or ethnicity, political affiliation, religion, or sexual orientation; after all, you can make assumptions about someone's protected traits based on which political, religious, or sexual-orientation pages they like. This applies to all types of ads. And users in the U.S. and Canada can now search all active housing, employment, and credit opportunity ads by advertiser and by targeted location, regardless of whether they were in the ad's intended audience.

Until Facebook's appetite changes, much of the work lands upon the shoulders of activists and lawmakers.

Making housing and employment opportunity ads searchable was one step forward.

It's an important step, but Sherwin acknowledged that Facebook hasn't shown any appetite to crack open its ad delivery algorithm. In the three months ending in June, the company made $29 billion from ad sales.

Until then, we can always remove our profiles.