San Francisco: The 9th US Circuit Court of Appeals in San Francisco has revived a key part of a lawsuit against X, formerly Twitter, related to child pornography.
The court ruled that X must face a negligence claim for failing to promptly report a video showing explicit images of minors to the National Center for Missing and Exploited Children (NCMEC).
The lawsuit was originally dismissed in December 2023 but was partially reinstated on August 1, 2025. The case involves two minors, referred to as John Doe 1 and John Doe 2, who were tricked on Snapchat into sharing explicit images with someone posing as a teenage girl.

These images were later compiled into a video that was posted on Twitter, where it reportedly received over 167,000 views before being removed nine days later.
Although Elon Musk is not a defendant in the case, the lawsuit targets X’s handling of the reported content. The court found that Section 230 of the Communications Decency Act, which offers broad protections to online platforms, does not shield X from negligence claims once it had actual knowledge of the abuse material.
Judge Danielle Forrest wrote that the duty to report child pornography is separate from X’s role as a content publisher, making the platform potentially liable for delays in reporting known abuse. The court also allowed a related claim to move forward, one alleging that X’s reporting infrastructure made it unnecessarily difficult for users to flag such content.

However, the appeals court rejected other claims, including allegations that X profited from sex trafficking or intentionally enhanced the reach of exploitative content through its search functions. The ruling narrows the case while preserving the core negligence issue.
The plaintiffs are supported by the National Center on Sexual Exploitation. Dani Pinter, one of their attorneys, said they look forward to continuing the legal process to achieve accountability.
The case, titled Doe 1 et al v Twitter Inc et al, will now proceed through the discovery phase, potentially setting a precedent for how digital platforms respond to reports of child pornography and other harmful content involving minors.