Teenagers on Facebook can be targeted by ads endorsing alcohol, drugs, gambling, smoking, and eating disorders, according to a report by a watchdog group. The Tech Transparency Project created six test ads and submitted them to Facebook, saying it wanted to reach users ages 13 to 17. Facebook approved all the ads within hours; one promoting "pill parties" was approved in just 43 minutes.
“This is an easy fix, and Facebook should have had the foresight to make it a long time ago,” said Tech Transparency Project director Katie Paul. “Whether this was an oversight or a money-grab is not important. It’s completely unacceptable.”
As you scroll around Facebook and the wider web, its algorithms keep tabs on your behavior. Eventually, it places you into categories based on what it’s observed about you: your political leanings, your favorite music, your interests and hobbies, and so on. This is what draws advertisers, who want to show ads tailored to these groups.
But many users are unaware that Facebook can infer everything from their race to their sexuality or relationship status just from their online activity. Moreover, several of these categories are inappropriate for minors. The report found that Facebook used teenagers’ behavior to place them in interest categories for “alcoholic beverages,” “extreme weight loss,” and “tobacco,” even noting if the teens were single so they could be targeted by dating site ads.
All Facebook users are placed in interest categories. But minors under 18 aren’t supposed to be placed in certain adult categories. Facebook has gotten in hot water for showing inappropriate ads to children since at least 2014. As recently as 2019, an investigation by The Guardian found that children were still being labeled as interested in tobacco and alcohol.
Reporters have uncovered other issues with the company’s algorithmically created categories. In 2017, a ProPublica report found that the company was permitting advertisers to target users who listed their own occupation as “Jew hunters.” The next year, Facebook apologized for indicating that thousands of users in Russia were “interested in treason.” Then, in 2019, Facebook settled with civil rights groups who alleged the company allowed advertisers to discriminate against certain groups when posting ads for jobs and housing.
Facebook has guardrails in place to stop such ads from being shown to underage users, but TTP’s director says the group’s test ads were approved “in a matter of hours.”
“There’s absolutely no reason why Facebook should have tagged nearly a million teens as potentially interested in ‘alcoholic beverages’ and other categories,” Paul said.
A Facebook spokesperson said the company could not comment without seeing the report.
TTP created six test ads, each designed around a topic users under 18 aren’t supposed to see. These included an ad for “ana tips” (“ana” is a well-known abbreviation for anorexia), which TTP says it targeted at users Facebook classifies as interested in “extreme weight loss” and “diet food.” A fake vaping ad targeted underage users classified as interested in “electronic cigarettes” and “tobacco.” Advertisers aren’t permitted to target users under 18 with dating site ads, yet TTP’s test dating ad was approved in only two hours.
In addition to creating the categories, Facebook shows advertisers an “estimated reach,” the number of users who may see an ad once it’s placed. Facebook estimated as many as 900,000 users would see the alcohol ad, and as many as 5 million would see the dating site ad. Unless the social network immediately fixes how it enforces its own rules around ad placement, the group warns, Facebook is “positioned to profit from harmful messages … aimed at a vulnerable age group.”