According to a Reuters report, tweets soliciting child sex abuse material have appeared alongside ads for, or on the profile pages of, at least 30 major advertisers on Twitter. Some of those companies are now pulling their ad campaigns off the social media platform.
Ghost Data, a cybersecurity research group, released a new report, reviewed by Reuters, that allegedly identified Twitter accounts sharing links to exploitative material involving children. The findings have led companies including Dyson, Ecolab, and Mazda to pause or pull their ads and campaigns from the platform.
“There is no place for this type of content online,” a spokesperson for Mazda USA said in a statement to Reuters.
The carmaker now prohibits its ads from appearing on Twitter profile pages.
According to the report, more than 500 accounts openly shared or requested child sex abuse material over a 20-day period in September 2022. Of those accounts, more than 70% remained active during the study.
Celeste Carswell, a spokesperson for Twitter, told Reuters that the company reviewed the complete list of accounts and permanently suspended them for violating its rules, adding that the platform “has zero tolerance for child sexual exploitation.”
David Maddocks, brand president at Cole Haan, told Reuters the company is “horrified” after learning about Twitter’s alleged child porn problem.
“Either Twitter is going to fix this, or we’ll fix it by any means we can, which includes not buying Twitter ads.”
The shoe and accessories brand had a promoted tweet next to another post from a user who said they were “trading teen/child” content.
The Ghost Data report also found tweets containing terms related to “rape” and “teens” displayed next to corporate advertisers’ posts. In another example, a promoted tweet for Texas-based Scottish Rite Children’s Hospital allegedly appeared next to a post from a user searching for “Yung girls ONLY, NO Boys.”
Twitter officials have since invested more resources in children’s safety, creating new positions dedicated to writing policies and developing solutions to prevent the problem.
Eliza Bleu, an advocate for survivors of human trafficking, has been speaking out against Twitter’s alleged child pornography problem since shortly after joining the platform in 2019.
In August 2020, she attended a meeting with Twitter officials that former CEO Jack Dorsey had set up.
“I had one request,” Bleu told The Daily Wire. “I wanted them to remove child sexual abuse material at scale.”
Bleu said that following the meeting, Twitter offered her a position on its Trust and Safety Committee, but she declined because she refused to “work with or for abusers.”
“I couldn’t knowingly profit off child sexual exploitation,” she said.
Bleu advocated for a minor survivor, anonymously dubbed “John Doe,” who filed a lawsuit against Twitter last year. The suit alleges that the child, who was solicited and recruited for sex trafficking, had his own sexual abuse material circulated and promoted on Twitter even after attempts were made to have the content removed.
Bleu said the lawsuit brought by the two minor survivors didn’t send a strong enough message to the platform.
“It’s time to hit them where it hurts,” she said. “My only hope is that by the time all of the advertisers pull out, they have enough money left over to compensate the minor survivors.”
Andrea Stroppa, the founder of Ghost Data, said in a tweet Monday that Twitter “has a severe problem with child pornography” and promised to release his research about the company.
“Since it is a sensitive topic, more time is needed to release the findings publicly (it does not depend on us), but research will come out soon,” Stroppa said.
Stroppa told Reuters he personally funded research into the platform’s child sexual abuse problem after receiving a tip about the severity of the issue.
Elon Musk, who is currently in a court battle with Twitter over his attempt to back out of a $44 billion deal to buy the company, replied to the Reuters article on the platform, calling the report “extremely concerning.”