
Facebook Denies Letting Advertisers Target Emotional Teens


Advertising has long appealed to consumers’ emotions, but a new report claims Facebook may have taken it a bit too far.

Facebook is reviewing recent research after a report that claimed the social network helped advertisers target teen users who may be feeling insecure, stressed, or anxious.

A leaked document obtained by The Australian claims the social network's research can help advertisers home in on "moments when young people need a confidence boost."

FILE PHOTO: The Facebook logo is displayed on the company’s website in an illustration photo taken in Bordeaux, France, February 1, 2017. REUTERS/Regis Duvignau/File Photo

Related: Cleveland Shooting Highlights Facebook’s Responsibility in Policing Depraved Videos

A Facebook representative told NBC News the leaked document is legitimate, but called the report “misleading.”

“Facebook does not offer tools to target people based on their emotional state,” Facebook said in a statement. “The analysis done by an Australian researcher was intended to help marketers understand how people express themselves on Facebook. It was never used to target ads and was based on data that was anonymous and aggregated.”

Related: Facebook Backtracks on Racial Profiling in Ads

While Facebook disputes the characterization of the research in The Australian’s report, the company’s statement said the research “did not follow” the existing protocols Facebook has in place “to review the research we perform.”

“We are reviewing the details to correct the oversight,” the statement said.

It's not the first time Facebook has come under fire for allegedly exploiting users' emotions. In 2014, the company published a study in which it varied the mix of positive and negative content shown in the News Feeds of 689,000 users to see whether their emotions could be manipulated.
