Free Republic

Facebook Lets Advertisers Exclude Users by Race - ProPublica (Mega-Liberal Silicon Valley Hypocrisy)
ProPublica ^ | 10/28/2016 | Julia Angwin and Terry Parris Jr.

Posted on 10/28/2016 12:12:28 PM PDT by MarchonDC09122009

Facebook Lets Advertisers Exclude Users by Race

Facebook’s system allows advertisers to exclude black, Hispanic, and other “ethnic affinities” from seeing ads.

by Julia Angwin and Terry Parris Jr., ProPublica, Oct. 28, 2016, 8 a.m.

Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers.

That’s basically what Facebook is doing nowadays.

The ubiquitous social network not only allows advertisers to target users by their interests or background, it also gives advertisers the ability to exclude specific groups it calls “Ethnic Affinities.” Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.

Here is a screenshot of a housing ad that we purchased from Facebook’s self-service advertising portal:

The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American or Hispanic people. (Here’s the ad itself.)

When we showed Facebook’s racial exclusion options to John Relman, a prominent civil rights lawyer, he gasped and said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.”

The Fair Housing Act of 1968 makes it illegal “to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Violators can face tens of thousands of dollars in fines.

The Civil Rights Act of 1964 also prohibits the “printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination” in employment recruitment.

Facebook’s business model is based on allowing advertisers to target specific groups — or, apparently, to exclude specific groups — using huge reams of personal data the company has collected about its users. Facebook’s microtargeting is particularly helpful for advertisers looking to reach niche audiences, such as swing-state voters concerned about climate change. ProPublica recently offered a tool allowing users to see how Facebook is categorizing them. We found nearly 50,000 unique categories in which Facebook places its users.

Facebook says its policies prohibit advertisers from using the targeting options for discrimination, harassment, disparagement or predatory advertising practices.

“We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law,” said Steve Satterfield, privacy and public policy manager at Facebook. “We take prompt enforcement action when we determine that ads violate our policies.”

Satterfield said it’s important for advertisers to have the ability to both include and exclude groups as they test how their marketing performs. For instance, he said, an advertiser “might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry.”

He said Facebook began offering the “Ethnic Affinity” categories within the past two years as part of a “multicultural advertising” effort.

Satterfield added that an “Ethnic Affinity” is not the same as race — which Facebook does not ask its members about. Facebook assigns members an “Ethnic Affinity” based on pages and posts they have liked or engaged with on Facebook.

When we asked why “Ethnic Affinity” was included in the “Demographics” category of its ad-targeting tool if it’s not a representation of demographics, Facebook responded that it plans to move “Ethnic Affinity” to another section.

Facebook declined to answer questions about why our housing ad excluding minority groups was approved 15 minutes after we placed the order.

By comparison, consider the advertising controls that the New York Times has put in place to prevent discriminatory housing ads. After the newspaper was successfully sued under the Fair Housing Act in 1989, it agreed to review ads for potentially discriminatory content before accepting them for publication.

Steph Jespersen, the Times’ director of advertising acceptability, said that the company’s staff runs automated programs to make sure that ads that contain discriminatory phrases such as “whites only” and “no kids” are rejected.

The Times’ automated program also highlights ads that contain potentially discriminatory code words such as “near churches” or “close to a country club.” Humans then review those ads before they can be approved.
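The two-tier screening the Times describes — rejecting outright-discriminatory phrases automatically while routing suspected code words to a human reviewer — can be sketched in a few lines. This is an illustrative example only; the phrase lists and function names here are hypothetical and not the Times’ actual system.

```python
# Minimal sketch of two-tier ad screening: phrases in REJECT_PHRASES
# cause automatic rejection; phrases in FLAG_PHRASES send the ad to
# human review; anything else is accepted.
REJECT_PHRASES = ["whites only", "no kids"]                   # auto-reject
FLAG_PHRASES = ["near churches", "close to a country club"]   # human review

def screen_ad(text):
    """Return 'reject', 'review', or 'accept' for an ad's text."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in REJECT_PHRASES):
        return "reject"
    if any(phrase in lowered for phrase in FLAG_PHRASES):
        return "review"
    return "accept"

print(screen_ad("Sunny 2BR apartment, whites only"))  # reject
print(screen_ad("Charming home near churches"))       # review
print(screen_ad("Spacious loft downtown"))            # accept
```

A real screening pipeline would of course need far larger phrase lists, stemming or fuzzy matching to catch misspellings, and an audit trail for the human-review step, but the basic reject/flag/accept structure is the same.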

Jespersen said the Times also rejects housing ads that contain photographs of too many white people. The people in the ads must represent the diversity of the population of New York, and if they don’t, he said, he will call up the advertiser and ask them to submit an ad with a more diverse lineup of models.

But, Jespersen said, these days most advertisers know not to submit discriminatory ads: “I haven’t seen an ad with ‘whites only’ for a long time.”


TOPICS:
KEYWORDS: censorship; facebook; hypocrisy; racist
This story made international privacy-law news. It is expected to cost Facebook and Zuckerberg HUGELY. The EU and other nations’ governments around the world have been very anti-Facebook, in large part because of privacy issues, citizens’ data security, and its censorship algorithms. This story proves the disturbing truth that Facebook engages in racial profiling and overtly biased, discriminatory practices, in spite of Facebook’s claim that its mission is to unify people across the globe.

They are going to pay for this dearly. Bring the popcorn.

1 posted on 10/28/2016 12:12:28 PM PDT by MarchonDC09122009
[ Post Reply | Private Reply | View Replies]

To: MarchonDC09122009

You don’t understand, it’s perfectly OK for liberals to discriminate, it’s only when a conservative does it that it becomes wrong.


2 posted on 10/28/2016 12:16:40 PM PDT by Gunpowder green
[ Post Reply | Private Reply | To 1 | View Replies]

To: MarchonDC09122009

If I’m advertising for my proverbial bakery then it might be nice to exclude the faggots from receiving my ads.


3 posted on 10/28/2016 12:20:12 PM PDT by MeganC (JE SUIS CHARLES MARTEL!!!)
[ Post Reply | Private Reply | To 1 | View Replies]

To: MarchonDC09122009

Recommended reading about abuses of big data algorithms that make the wrong assumptions:

The Uses And Misuses Of ‘Big Data’: A Conversation

http://www.forbes.com/sites/valleyvoices/2016/05/06/the-uses-and-misuses-of-big-data-a-conversation/#7427de1862e0

“data should work in service of human beings, not the other way around.”

And -

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy

https://www.amazon.com/Weapons-Math-Destruction-Increases-Inequality/dp/0553418815#productDescription_secondary_view_div_1477683482352

“O’Neil is an ideal person to write this book. She is an academic mathematician turned Wall Street quant turned data scientist who has been involved in Occupy Wall Street and recently started an algorithmic auditing company. She is one of the strongest voices speaking out for limiting the ways we allow algorithms to influence our lives… While Weapons of Math Destruction is full of hard truths and grim statistics, it is also accessible and even entertaining. O’Neil’s writing is direct and easy to read—I devoured it in an afternoon.”
— Scientific American

“Cathy O’Neil has seen Big Data from the inside, and the picture isn’t pretty. Weapons of Math Destruction opens the curtain on algorithms that exploit people and distort the truth while posing as neutral mathematical tools. This book is wise, fierce, and desperately necessary.”
— Jordan Ellenberg, University of Wisconsin-Madison, author of How Not To Be Wrong

“O’Neil has become [a whistle-blower] for the world of Big Data… [in] her important new book… Her work makes particularly disturbing points about how being on the wrong side of an algorithmic decision can snowball in incredibly destructive ways.”
—TIME


4 posted on 10/28/2016 12:39:33 PM PDT by MarchonDC09122009 (When is our next march on DC? When have we had enough?)
[ Post Reply | Private Reply | To 1 | View Replies]

To: MarchonDC09122009

I used Facebook for advertising my business.
I did not know that people did not KNOW this.


5 posted on 10/28/2016 1:47:36 PM PDT by GeaugaRepublican (Groups compete. Immigration and Trade to save the country.)
[ Post Reply | Private Reply | To 1 | View Replies]

To: MarchonDC09122009

Pretty racist. Jury should give out big fines.


6 posted on 10/28/2016 3:32:29 PM PDT by minnesota_bound
[ Post Reply | Private Reply | To 1 | View Replies]

To: MarchonDC09122009

Identity Politics = Cultural Marxism.


7 posted on 10/28/2016 6:03:10 PM PDT by YogicCowboy ("I am not entirely on anyone's side, because no one is entirely on mine." - JRRT)
[ Post Reply | Private Reply | To 1 | View Replies]

Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.


FreeRepublic, LLC, PO BOX 9771, FRESNO, CA 93794
FreeRepublic.com is powered by software copyright 2000-2008 John Robinson