TikTok is allegedly saving and sharing uncensored, sexually explicit images and videos of children, according to a report from Forbes.
Multiple sources familiar with the inner workings of the company said as much, revealing that such material is used to train content moderators.
Surely, there’s a way to train moderators without continuing the exploitation and distribution of child pornography. Surely, there’s a way to police child porn without re-victimizing the kids involved. If you were a parent or grandparent of one of the victims, wouldn’t you be asking TikTok those very questions?
A number of former TikTok moderators certainly thought as much.
One of them, Nasser, worked for Teleperformance, a third-party company responsible for moderating TikTok content. He left the company in 2020.
“I don’t think they should use something like that for training,” Nasser said.
Another Teleperformance moderator who left the company in 2021, Whitney Turner, confirmed Nasser’s story. When she saw the material, Turner’s mind went to the same place as Nasser’s: What would the parents think?
“I was moderating and thinking: This is someone’s son. This is someone’s daughter. And these parents don’t know that we have this picture, this video, this trauma, this crime saved,” she said. “If parents knew that, I’m pretty sure they would burn TikTok down.”
Forbes went on to detail exactly how the material was distributed to employees.
“Whitney was given access to a shared spreadsheet that she and other former employees told Forbes is filled with material determined to be violative of TikTok’s community guidelines, including hundreds of images of children who were naked or being abused.
“Former moderators said the document, called the ‘DRR,’ short for Daily Required Reading, was widely accessible to employees at Teleperformance and TikTok as recently as this summer,” Forbes reported.
Turner reported the content to the FBI, but the bureau has yet to confirm whether it is investigating the claims.
In total, hundreds of employees had “free access” to the material.
Although Teleperformance employees were responsible for moderation, ByteDance, the controversial Chinese company that owns TikTok, won’t be able to pin all the blame on the third-party company.
The training materials containing the CSAM (child sexual abuse material) were stored in Lark, a “workplace software developed by” ByteDance.
Representatives from both TikTok and Teleperformance denied that such content was being used. However, a representative from TikTok admitted to Forbes the company “works with third-party firms who may have their own processes.”
In the report, Forbes went on to quote various legal experts in order to point out the obvious — saving and distributing CSAM for any purpose is illegal.
The fact that TikTok potentially made a mistake of this magnitude makes one thing very clear: If the allegations are true, the Chinese-owned company does not take its responsibility to protect children from sexual exploitation as seriously as it should.
This article appeared originally on The Western Journal.