A Los Angeles jury has ordered tech giants Meta and Google to pay $6 million in damages after finding their platforms contributed to a young woman’s mental health struggles. The landmark verdict in this highly anticipated social media addiction trial marks a major turning point for the tech industry, which has long been shielded from legal responsibility for user harm. The jury concluded that the companies acted negligently and deliberately created addictive products to maximize engagement and profit.
The 20-year-old plaintiff, identified as KGM, argued that Instagram and YouTube were designed to be habit-forming. She testified that her early exposure led to severe anxiety, depression, body dysmorphia, and suicidal thoughts. The jury agreed these designs were a substantial factor in her harm, awarding $3 million in compensatory damages and $3 million in punitive damages. It apportioned 70 percent of the liability to Meta and the remaining 30 percent to Google.
Inside the Social Media Addiction Trial
The case is the first of its kind to go to trial, serving as a test for thousands of similar lawsuits from parents, schools, and states. Lawyers argued that tech companies intentionally borrowed psychological tactics from the cigarette industry to keep youth engaged and boost advertising revenue. KGM testified she began using YouTube at age six and Instagram at age nine, eventually developing a compulsion that kept her on Instagram for up to 16 hours a day.
Jurors saw internal communications supporting these claims. An internal Meta study known as “Project Myst” suggested that the children experiencing the worst adverse effects were also the most likely to become addicted, and that parents were largely powerless to intervene. A YouTube memo reportedly described viewer addiction as a goal, while an Instagram employee compared the staff to drug pushers. After more than 40 hours of deliberation, the jury found this internal awareness supported holding the companies liable.
How Meta and Google Responded
Both Meta and Google have announced plans to appeal the decision. Throughout the legal proceedings, the companies maintained that mental health issues are profoundly complex and cannot be blamed on a single application. Meta argued that the plaintiff’s personal and family challenges were the root cause of her struggles, pointing out that none of her therapists had officially diagnosed her with social media addiction.
Meta CEO Mark Zuckerberg testified that he does not try to maximize the time users spend on his platforms. He noted that Meta has spent more than a decade developing safety tools, such as the Teen Accounts introduced in 2024, though he said he wished they had been rolled out sooner. Meta’s lawyers added that features like infinite scroll are inseparable from how content is delivered.
A trial lawyer for YouTube similarly argued that the video platform already offers features to limit screen time and interrupt long viewing sessions, adding that the company cannot simply reach into a user’s phone and switch those tools on for them.
A Tobacco Moment for Big Tech
Legal experts call this a watershed moment that could spark mass litigation, drawing comparisons to battles against the tobacco industry in the 1990s. Back then, states sued tobacco companies to recover medical costs, forcing the industry to pay billions and change marketing practices. Advocates believe this verdict pushes social media companies down a similar path.
The California decision also followed another major legal blow for Meta. Just 24 hours earlier, a New Mexico jury ordered Meta to pay $375 million for exposing minors to harmful content, including sexual exploitation, in violation of consumer protection laws. With TikTok and Snap having already settled their parts of the California lawsuit out of court on confidential terms, Meta and Google are now facing mounting legal pressure alone.
What This Means for the Future
For years, social media platforms have relied on Section 230 of the Communications Decency Act, a legal provision that protects them from being sued over content posted by their users. However, the judge in this case instructed the jury to evaluate the platforms’ design features separately from the user-generated content. This crucial distinction effectively bypassed their traditional legal shield, opening the door for future product liability claims.
If the verdict stands, Meta and Google may be forced to redesign how their applications function. Removing addictive features could significantly reduce the time users spend scrolling and sharing, which directly threatens the highly profitable advertising models of Instagram and YouTube. The outcome is also reigniting calls for government action, with US lawmakers urging Congress to pass the Kids Online Safety Act to mandate protective design changes.
