A New York judge has ruled that social media giants Meta (Instagram, Facebook) and ByteDance (TikTok) must face a wrongful‑death lawsuit brought by the mother of a 15‑year‑old killed while "subway surfing." The teen died in February 2023 after attempting to ride atop a train near the Williamsburg Bridge.
The lawsuit alleges that the platforms knowingly exposed her son to addictive and dangerous user‑generated content glorifying subway surfing. The plaintiff claims the companies used algorithmic recommendations to push trending challenge videos to vulnerable youth, encouraging risky behavior. The court found these claims sufficient to proceed under product liability, negligence, and wrongful‑death theories.
Meta and ByteDance argue that Section 230 of the Communications Decency Act and First Amendment protections shield them from liability for third‑party content. However, the judge determined it is plausible the platforms did more than passively host content—they may have actively targeted teens like this boy with challenge‑related videos. The ruling effectively denies the companies' motion to dismiss those key claims.
The ruling follows reports of six subway‑surfing‑related deaths in New York City during 2024. The court dismissed claims against the Metropolitan Transportation Authority, finding the danger of riding atop trains open and obvious. With similar lawsuits gaining traction nationwide, this case adds to mounting scrutiny of social media’s impact on youth behavior.
This case strikes at the heart of debates over platform responsibility. While free‑speech and safe‑harbor defenses remain available, courts appear increasingly open to claims that algorithm‑driven recommendation systems bear responsibility for the content they amplify. The ruling may prompt platforms to adopt stricter safeguards around harmful content, though significant legal and technical challenges remain in distinguishing actionable liability from protected curation.