Twitter Failed To Remove Child Sex Abuse Videos, Claiming They Didn’t Violate Its Policies

A blistering complaint claims that Twitter declined to remove widely circulated pornographic pictures and videos of a minor sex trafficking victim because an inquiry “didn’t discover a violation” of the company’s “rules.”

The victim and his mother filed a federal lawsuit against Twitter on Wednesday in the Northern District of California, alleging that the company profited from the videos, which showed a 13-year-old engaging in sex acts and constitute child sexual abuse material, or child pornography.

The youngster, John Doe, is now 17 and resides in Florida. According to the lawsuit, when he was between 13 and 14 years old, sex traffickers began communicating with him on Snapchat under the guise of a 16-year-old female classmate.

Doe and the traffickers allegedly sent naked pictures before talking about extortion.

According to the lawsuit, the boy was threatened with sharing the explicit material he had previously sent with his “parents, coach, pastor” and other people if he didn’t send any more sexually explicit pictures and videos.

According to the lawsuit, Doe initially complied under coercion and sent videos of himself engaging in sex acts. He was also instructed to include another minor in the videos, which he allegedly did.

After Doe blocked the traffickers, they eventually stopped contacting him, but at some point in 2019, the videos allegedly surfaced on Twitter under two accounts known for disseminating child sexual abuse material.

The videos were reported to Twitter at least three times over the following month, beginning on December 25, 2019, but the tech giant did nothing until a federal law enforcement officer got involved, the lawsuit claims.

Doe learned about the tweets in January 2020 because his classmates had widely seen them. As a result, he experienced “teasing, harassment, and brutal bullying,” which made him “suicidal,” according to court documents.
The boy’s mother also complained about the same content to Twitter at the same time and received no response from them for a week. On January 28, Twitter told her they would not remove the content, even though it had already received over 167,000 views and 2,223 retweets.

The lawsuit further claims that Twitter ignored the case number it had assigned and did nothing to stop the illegal child sexual abuse material (CSAM) from spreading online. The teen’s mother contacted the Department of Homeland Security (DHS), and only after a federal agent intervened was the video taken down; Twitter merely suspended the user accounts that had been disseminating it.

Twitter said in a statement:

“Any content that depicts or promotes child sexual exploitation is entirely unacceptable on Twitter. We actively combat online child sexual abuse and have invested extensively in technology and methods to enforce our policy. Our committed staff strives to stay ahead of bad-faith actors and to ensure we’re doing everything in our power to remove content, aid investigations, and safeguard kids from harm – online and offline.”

Sure. Sure they do. They will remove the term “groomer” faster.

What does that tell you?