Facebook Lets Advertisers Exclude Users by Race

02/11/2016 · By puntoacapo


Imagine if, during the Jim Crow era, a newspaper offered advertisers the option of placing ads only in copies that went to white readers.

That’s basically what Facebook is doing nowadays.

The ubiquitous social network not only allows advertisers to target users by their interests or background but also gives advertisers the ability to exclude specific groups it calls “Ethnic Affinities.” Ads that exclude people based on race, gender and other sensitive factors are prohibited by federal law in housing and employment.

Here is a screenshot of an ad we purchased in Facebook’s housing categories via the company’s advertising portal:

[Screenshot: Facebook’s ad-buying portal, showing the excluded “Ethnic Affinities”]
The ad we purchased was targeted to Facebook members who were house hunting and excluded anyone with an “affinity” for African-American, Asian-American or Hispanic people. (Here’s the ad itself.)

When we showed Facebook’s racial exclusion options to John Relman, a prominent civil rights lawyer, he gasped and said, “This is horrifying. This is massively illegal. This is about as blatant a violation of the federal Fair Housing Act as one can find.”

The Fair Housing Act of 1968 makes it illegal “to make, print, or publish, or cause to be made, printed, or published any notice, statement, or advertisement, with respect to the sale or rental of a dwelling that indicates any preference, limitation, or discrimination based on race, color, religion, sex, handicap, familial status, or national origin.” Violators can face tens of thousands of dollars in fines.

The Civil Rights Act of 1964 also prohibits the “printing or publication of notices or advertisements indicating prohibited preference, limitation, specification or discrimination” in employment recruitment.

Facebook’s business model is based on allowing advertisers to target specific groups — or, apparently, to exclude specific groups — using huge reams of personal data the company has collected about its users. Facebook’s microtargeting is particularly helpful for advertisers looking to reach niche audiences, such as swing-state voters concerned about climate change. ProPublica recently offered a tool allowing users to see how Facebook is categorizing them. We found nearly 50,000 unique categories in which Facebook places its users.

Facebook says its policies prohibit advertisers from using the targeting options for discrimination, harassment, disparagement or predatory advertising practices.

“We take a strong stand against advertisers misusing our platform: Our policies prohibit using our targeting options to discriminate, and they require compliance with the law,” said Steve Satterfield, privacy and public policy manager at Facebook. “We take prompt enforcement action when we determine that ads violate our policies.”

Satterfield said it’s important for advertisers to have the ability to both include and exclude groups as they test how their marketing performs. For instance, he said, an advertiser “might run one campaign in English that excludes the Hispanic affinity group to see how well the campaign performs against running that ad campaign in Spanish. This is a common practice in the industry.”

He said Facebook began offering the “Ethnic Affinity” categories within the past two years as part of a “multicultural advertising” effort.

Satterfield added that “Ethnic Affinity” is not the same as race — which Facebook does not ask its members about. Facebook assigns members an “Ethnic Affinity” based on pages and posts they have liked or engaged with on Facebook.

When we asked why “Ethnic Affinity” was included in the “Demographics” category of its ad-targeting tool if it’s not a representation of demographics, Facebook responded that it plans to move “Ethnic Affinity” to another section.

Facebook declined to answer questions about why our housing-categories ad excluding minority groups was approved 15 minutes after we placed the order.

By comparison, consider the advertising controls that the New York Times has put in place to prevent discriminatory housing ads. After the newspaper was successfully sued under the Fair Housing Act in 1989, it agreed to review ads for potentially discriminatory content before accepting them for publication.

Steph Jespersen, the Times’ director of advertising acceptability, said that the company’s staff runs automated programs to make sure that ads that contain discriminatory phrases such as “whites only” and “no kids” are rejected.

The Times’ automated program also highlights ads that contain potentially discriminatory code words such as “near churches” or “close to a country club.” Humans then review those ads before they can be approved.
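The two-tier workflow Jespersen describes — automatic rejection for outright discriminatory phrases, human review for coded ones — can be sketched in a few lines. This is purely an illustrative assumption, not the Times’ actual system; the phrase lists and the `screen_ad` function are hypothetical, seeded with the examples quoted in the article.

```python
# Illustrative sketch of a two-tier ad-screening filter (hypothetical, not the Times' code).
# Tier 1: phrases that trigger automatic rejection.
# Tier 2: code words that flag the ad for human review.

REJECT_PHRASES = ["whites only", "no kids"]
REVIEW_PHRASES = ["near churches", "close to a country club"]

def screen_ad(text: str) -> str:
    """Return 'reject', 'review', or 'approve' for an ad's text."""
    lowered = text.lower()
    if any(phrase in lowered for phrase in REJECT_PHRASES):
        return "reject"
    if any(phrase in lowered for phrase in REVIEW_PHRASES):
        return "review"  # route to a human reviewer before approval
    return "approve"

print(screen_ad("Sunny 2BR apartment, whites only"))   # reject
print(screen_ad("Charming condo near churches"))       # review
print(screen_ad("Spacious loft, all welcome"))         # approve
```

Real screening systems would of course need far richer matching than literal substrings — misspellings, synonyms and context all matter — but the reject/review/approve split mirrors the workflow described above.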

Jespersen said the Times also rejects housing ads that contain photographs of too many white people. The people in the ads must represent the diversity of New York’s population, and if they don’t, he said, he will call up the advertiser and ask them to submit an ad with a more diverse lineup of models.

But, Jespersen said, these days most advertisers know not to submit discriminatory ads: “I haven’t seen an ad with ‘whites only’ for a long time.”

Clarification, Oct. 28, 2016: We’ve updated the story to explain more clearly that the ad we bought was not for housing itself — it was placed in Facebook’s housing categories.

Source: ProPublica.org