
Social media and mental health: A California jury is hearing a landmark case that could test whether major social media platforms can be held responsible for harm linked not to user posts, but to how the apps themselves are designed.
In opening statements, lawyers for a 20-year-old woman identified in court as Kaley G.M. argued that Meta Platforms and Google’s YouTube built features they knew could hook young users and keep them scrolling. The plaintiff’s team told jurors internal documents show the companies intentionally engineered engagement tools that “addict the brains of children,” and that Kaley became compulsively attached to the apps at a young age.
Kaley alleges the platforms contributed to worsening depression and suicidal thoughts, and she is seeking damages. Her lawyers plan to argue the companies were negligent in product design and failed to adequately warn the public about risks to young users. If the jury finds liability, it could also consider punitive damages.
The companies deny wrongdoing and are expected to counter that other factors in Kaley’s life played a substantial role. They are also expected to point to safety initiatives for teens and children and to argue that they should not be blamed for harmful material posted by users.
The judge overseeing the case instructed jurors that the companies cannot be held liable for recommending content created by other users, limiting the dispute to the platforms’ own design and operation. That distinction goes to the heart of the industry’s long-standing legal shield against many claims tied to user-generated content.
The trial is expected to run for weeks, potentially into March, and Meta Platforms CEO Mark Zuckerberg is expected to be called as a witness. TikTok and Snap settled with the plaintiff before the trial began.
The case is one of thousands facing major platforms in the US, including a large group of similar claims brought by parents, school districts and state officials in federal court. Observers say a plaintiff’s win in California state court could encourage more cases built around allegations that platforms are harmful by design.
Separately, on the same day in Santa Fe, New Mexico, a jury began hearing a state case accusing Meta Platforms of profiting while failing to adequately protect children and teens from exploitation and mental health harms. Lawyers for the state told jurors the company publicly portrayed its platforms as safer for young users than internal evidence supported, allegations that Meta Platforms disputes.