A new lawsuit has accused Twitter of turning a blind eye to child pornography on its platform, claiming that it snubbed repeated requests from an underage sex trafficking victim to remove explicit material obtained through blackmail.

The suit, filed by the teenage victim and his mother in the Northern District of California on Wednesday, argues that Twitter refused to pull the sexually graphic videos on the grounds that they did not violate its policies, allowing them to rack up well over 150,000 views.

The plaintiff in the case – identified only as “John Doe” in court records – says he was just 13 when he was manipulated into sharing nude images of himself with a Snapchat user he believed to be a 16-year-old classmate. After he did so, “the correspondence changed to blackmail,” the lawsuit claims, adding that the perpetrators threatened to share the photos with the victim’s “parents, coach, pastor, and others in his community” if he did not send additional material. He complied with the traffickers’ demands, sending sexually explicit videos of himself, some of which included another minor.

At some point in 2019, a “compilation video” featuring the footage extorted from John Doe surfaced on Twitter through at least two accounts, eventually coming to the victim’s attention in January 2020 after “he learned from his classmates that [the] videos of him and another minor were on Twitter and that many students in the school had viewed them.”

Due to the circulation of these videos, he faced teasing, harassment, and vicious bullying, and became suicidal.


The victim – who by this time was 16 years old – immediately informed his parents of the situation, prompting his mother, named as “Jane Doe” in the suit, to take up the issue with school officials, local police, and Twitter directly. That followed at least one previous complaint from a concerned Twitter user in late 2019, who reported one of the accounts that shared footage of the victim. The company took no action and the videos remained live.

On January 21, the plaintiff filed his own complaint with Twitter, telling the platform: “These videos were taken from harassment and being threatened. It is now spreading around school and we need them taken down as we are both minors and we have a police report for the situation.” At Twitter’s request, he provided a photo of his driver’s license to confirm his identity.

Jane Doe also filed two additional complaints with the company one day later, to which Twitter replied with identical automated messages promising to review the content in question.

After a full week without a response from the company, despite repeated attempts by the victim’s mother beyond her initial complaints, Twitter finally replied on January 28, stating that it found no problems with the sexually explicit videos and would do nothing to have them removed.

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” Twitter said, while insisting without a hint of irony that “your safety is the most important thing.”


The victim replied on the same day, outraged over the platform’s inaction, asking “What do you mean you don’t see a problem?”

We both are minors right now and were minors at the time these videos were taken. We both were 13 years of age. We were baited, harassed, and threatened to take these videos that are now being posted without our permission. We did not authorize these videos AT ALL and they need to be taken down.

While the company ignored John Doe’s final plea, within a few days his family “was able to connect with an agent of the US Department of Homeland Security” through a mutual contact, according to the suit. (RT)