
LOS ANGELES — The head of Instagram, Adam Mosseri, defended the platform in court Wednesday, arguing that social media platforms are not intentionally engineered to be addictive.
“I think it’s important to differentiate between clinical addiction and problematic use,” he said when he was pressed about social media addiction.
The landmark trial, which began last week with jury selection, could set a precedent for whether social media platforms are responsible for harming children.
It’s the first of a consolidated group of cases — from more than 1,600 plaintiffs, including over 350 families and over 250 school districts — scheduled to be argued before a jury in Los Angeles County Superior Court. Plaintiffs accuse the owners of Instagram, YouTube, TikTok and Snap of knowingly designing addictive products harmful to young users’ mental health.
Historically, social media platforms have been largely shielded by Section 230, a provision added to federal communications law in 1996 that says internet companies are not liable for content users post.
TikTok and Snap reached a settlement with a 20-year-old woman identified in court as K.G.M. ahead of the trial. The companies remain defendants in a series of similar lawsuits expected to go to trial this year.
This first bellwether case centers on K.G.M., who was a minor at the time of the incidents outlined in her lawsuit.
K.G.M. claims her early use of social media led to addiction and worsened her mental health problems. Her lawsuit alleges that social media companies made deliberate design choices to make their platforms more addictive to children for purposes of profit.
Questioned by K.G.M.’s attorney Mark Lanier, Mosseri said use of Instagram can be “problematic” when people spend excessive time on it.

“It’s a personal thing, but yeah, I do think it’s possible to use Instagram more than you think it helps,” he said. He noted several times that he is not a medical professional.
While it is in Instagram’s business interests to attract as many users as possible, Mosseri said, there are different ways to compete in the social media space. Instagram competes on “making platforms more safe,” he said.
“I believe protecting minors in the long run is good for profit and business,” Mosseri said.
Meta, formerly known as Facebook, introduced beauty filters to Instagram stories in 2017. In 2019, it significantly expanded its slate of augmented reality filters, allowing users to make and publish their own.
“We tried to draw the line around only allowing effects you could create with makeup outside of fantasy effects. In practice we had trouble defining that line,” Mosseri testified. “We ended up in a long debate, settling on not allowing any effects promoting plastic surgery.”
Meta tries to design its products “with sensitive people in mind,” Mosseri added, pointing to an Instagram feature that sends teens “nudges” reminding them to log off at night.
He said Instagram introduced a feature in 2018 that allowed users to track time spent on the app, began adding new safety features for minors starting in 2021 and rolled those features into teen accounts in 2024.
Mosseri said K.G.M., who joined the platform before 2018, experienced a version of Instagram that was “very different” from what it is today. Back then, he said, it was primarily “a feed of photos and primarily friends.”
“We have tried to respond as the world changes to make sure the experience is as positive as possible,” he said.
Outside the courtroom Wednesday, grieving parents held up photos of their children as they waited for Mosseri to emerge from the proceedings.

Matt Bergman, founding director of the Social Media Victims Law Center — which is representing about 750 plaintiffs in the California proceeding and about 500 in the federal proceeding — said Mosseri’s testimony indicates that “Instagram’s executives made a conscious decision to put growth over the safety of minors.”
“The evidence shows that Instagram knew the risks its product posed to young users, yet continued to deploy features engineered to keep kids online longer, even when those features exposed them to significant danger,” Bergman wrote in a statement.
Meta continues to deny allegations of harm.
“The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles,” a spokesperson said in an email statement. “The evidence will show she faced many significant, difficult challenges well before she ever used social media.”
Meta CEO Mark Zuckerberg is expected to testify next week.
If the jury’s verdict favors this first plaintiff, the social media companies could face damages to be determined by the jury and could be forced to change the designs of their platforms. The verdict could also set the tone for whether the tech giants choose to fight or settle the oncoming cases.
Meanwhile, a separate trial in New Mexico, addressing allegations that Meta’s platforms have failed to protect young users from sexual exploitation, also began opening statements Monday. And in June, the first bellwether trials for school districts that have sued social media platforms over harms to children will kick off in federal court.

