Meta, TikTok and YouTube are set to face a landmark jury trial in the United States over allegations that their platforms contribute to youth addiction and mental health harm.
The case begins Tuesday in Los Angeles County Superior Court, where jury selection is expected to last several days. Court officials say about 75 potential jurors will be questioned each day through at least Thursday.
Test Case for Thousands of Lawsuits
Legal experts say the trial is being closely watched as a bellwether for thousands of similar lawsuits filed across the country that seek damages over alleged social media harms.
A fourth defendant, Snap Inc., the parent company of Snapchat, settled last week for an undisclosed amount and is no longer part of the case.
The trial itself is expected to last between six and eight weeks, with senior executives — including Meta chief executive Mark Zuckerberg — anticipated to testify.
Claims of Deliberate Addictive Design
The plaintiff, a 19-year-old California woman identified in court filings as KGM, alleges she became addicted to social media at a young age because of design choices intended to maximize engagement among children and teenagers.
According to the lawsuit, those features worsened her depression and led to suicidal thoughts. She is seeking to hold the companies legally responsible, arguing that profit-driven design decisions placed young users at risk.

The case is the first of several “social media addiction” lawsuits involving minors that are expected to go before juries later this year.
Court documents argue that the platforms adopted tactics similar to those once used by the gambling and tobacco industries to keep users, particularly younger audiences, engaged.
Legal Strategy and Industry Response
If successful, the plaintiffs’ arguments could weaken long-standing legal protections often relied upon by tech companies, including First Amendment defenses and Section 230 of the Communications Decency Act, which generally shields platforms from liability for user-generated content.
The companies deny the allegations, saying they do not intentionally harm young users and pointing to safety measures introduced over the years. They argue they cannot be held responsible for content posted by third parties.
Meta says it has sponsored teen safety workshops at schools across the US since at least 2018, while TikTok has promoted parental controls and screen-time limits, including tools that restrict nighttime use.
Growing Global Scrutiny
Advocacy groups remain skeptical. Julie Scelfo, founder of Mothers Against Media Addiction, said parents often struggle to know which sources to trust amid competing messages from the industry.
“It can be deeply confusing for families,” she said, adding that technology companies exert influence across multiple fronts.
The trial comes as governments worldwide move to regulate children’s access to social media. France’s lower house this week approved a proposal to bar children under 15 from social platforms, pending further votes.
Australia last December became the first country to ban social media use for children under 16. Other countries, including the United Kingdom, Denmark, Spain and Greece, are also studying similar restrictions.
As jury selection begins, the outcome of the California case could shape how social media companies are held accountable for their impact on young users — in the US and beyond.