- Instagram Reels is linking some adults to inappropriate images of children, the WSJ reports.
- Instagram is now dealing with a public scandal similar to Facebook’s election interference scandal.
- Instagram is also facing a lawsuit from 33 states that say it ignored warnings about teens’ mental health.
For a few weeks in July, a cringeworthy phrase floated through the air, mildly paining anyone who typed it: “Hot Zuck Summer.”
There was that photo of Mark Zuckerberg, shirtless and creepily ripped. The successful launch of Threads. The whole embarrassing saga of Elon Musk challenging Zuckerberg to a physical fight.
Compared to Musk, who was busy destroying both Twitter and his reputation, the Meta CEO seemed like a fairly mature and measured executive.
For a brief summer, things finally seemed rosy for Zuckerberg and Meta: the 2016 election and all the mess that flowed from it were far enough back in the rearview mirror. The explosive 2021 revelations from Frances Haugen and the ‘Facebook Papers’ were no longer on everyone’s minds. Twitter, FTX, and AI were dominating the headlines.
It was almost as if Meta had a moment where it actually got to add a few numbers to a big sign reading “Days Without a Major Scandal.”
Today the counter has just been reset to “0”. And while “Hot Zuck Summer” may have been a light-hearted take on Zuckerberg, the latest scandal is anything but.
On Monday, The Wall Street Journal reported on the disturbing way sexualized content featuring children was being served to adults via Instagram’s Reels.
For its test, the WSJ created new accounts that followed only teen influencers, including young gymnasts and cheerleaders. Those accounts were then served Reels recommendations of overtly sexual adult content and of sexualized content featuring children, the Journal reported. The Canadian Centre for Child Protection conducted a similar test and got similar results.
From the WSJ:
Instagram’s system delivered shocking doses of salacious content to these test accounts, including risqué images of children and overtly sexual videos for adults — and ads for some of America’s biggest brands. The Journal set up the test accounts after noticing that the thousands of followers of such young people’s accounts often include large numbers of adult men, and that many of the accounts following these children had also expressed an interest in sexual content related to both children and adults.
Alarmingly, the sexually suggestive content was also shown alongside ads from major companies.
From the report:
The tests found that following only the young girls prompted Instagram to show videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that followed a video of a woman exposing her crotch.
And perhaps most depressing of all:
An ad for Lean In Girls, the nonprofit organization for empowering young women run by former Meta Chief Operating Officer Sheryl Sandberg, ran just before a promotion for an adult sex content creator who often appears in schoolgirl clothing. Sandberg declined to comment.
A spokesperson for Meta told the WSJ that the company recently launched new brand-safety tools and has a task force dedicated to tracking down suspicious users. Walmart also declined to comment to the Journal.
In a statement to Business Insider, Meta said: “We don’t want this type of content on our platforms and brands don’t want their ads appearing next to it. We continue to invest aggressively to stop this – and report quarterly on the prevalence of such content, which remains very low.” It also said the WSJ’s test was a “manufactured experience” that does not represent what real users see every day.
Earlier this fall, 33 states filed a lawsuit against Meta, accusing the company of ignoring warnings about potential harm to young girls. The suit also alleges that Meta was aware of millions of accounts opened by children under the age of 13 but failed to close them.
A lawsuit in Massachusetts claims Meta ignored efforts to improve teen well-being in its apps.
And a recently unsealed complaint, part of the 33-state lawsuit, appears to show that Instagram executives were well aware of a phenomenon that seems fairly intuitive to anyone who has used Instagram: seeing all your friends having fun living their best lives, alongside tons of photos of extremely hot people, can make you feel bad.
(Meta spokeswoman Liza Crenshaw told BI that the complaint “mischaracterizes our work using selective quotes and cherry-picked documents.” She said: “We want teens to have safe, age-appropriate experiences online, and we have more than 30 tools to support them and their parents.”)
From the complaint, emphasis mine:
Meta’s senior leadership admits that social comparison is a critical issue with serious consequences for users, especially on Instagram. [Adam] Mosseri wrote in an internal email: “I see social comparison as the existential question Instagram faces within the broader question of whether social media is good or bad for people.” Due to Instagram’s ‘focus on youth and visual communication’, its emphasis on beauty and fashion content, and a ‘marketing look and feel that is often too polished’, Mosseri reasoned that “social comparison is to Instagram [what] election interference is to Facebook.”
I think Mosseri was spot on. This is the existential question now being debated about Instagram.
(Exactly what Mosseri meant by the election-interference comparison isn’t entirely clear to me. I think many Meta employees believe the 2016 election interference story was overblown. Perhaps he meant that this would be Instagram’s version of a massive scandal: public opinion quickly turning against it, people screaming on cable news, users deleting their accounts, and someone getting hauled before Congress to answer questions about it.)
Anyway: Mosseri was right. This is a major public scandal. Enough shoes have dropped. Major advertisers like Match and Bumble are pulling ads because of the WSJ report, the publication said.
‘Hot Zuck Summer’ has turned into ‘Instagram Nightmare Fall’.