As Mark Zuckerberg heads to a Los Angeles courtroom Wednesday, a landmark trial accuses Meta and rivals of engineering addiction in kids.
Experts speaking to the Daily Caller News Foundation say the platforms knew the toll they were taking on young users, yet dragged their feet on basic protections until forced by litigation. The revelations began with Frances Haugen’s 2021 leaks, which exposed Meta’s internal research showing Instagram worsened issues like suicidal thoughts and eating disorders among teen girls, and have continued with evidence such as internal data showing predators contacting hundreds of thousands of children daily. That record has driven a surge of multi-district lawsuits and state actions ahead of a bellwether trial.
Dani Pinter, Chief Legal Officer at the National Center on Sexual Exploitation, says the problems are longstanding and deliberately unaddressed: “these platforms have been dangerous for a long time. They know they’ve been dangerous. They’ve even been studying the ways they’ve been dangerous.”
“We focus on exploitation, so I won’t pretend to be an expert about all the addiction properties. But I can say that we’ve been interfacing with Meta and their platforms for close to 10 years, and trying to communicate different aspects of the platform that were really dangerous for a long time,” Pinter told the DCNF.
“We asked Meta to make it impossible for adults, like stranger adults, to connect with children on Instagram, for example. And they refused to do just common sense changes like that, until they actually had to face lawsuits,” Pinter added. “I can’t really speak to the addiction [part] in particular, but these platforms have been dangerous for a long time.”
The National Center for Missing & Exploited Children’s CyberTipline, where tech companies such as Meta and Google are required to report suspected incidents, received 20.5 million reports in 2024, equating to about 29.2 million separate incidents after adjusting for bundled submissions.
Online enticement reports, which cover grooming and solicitation, surged 192% from 2023 to more than 546,000 last year, while reports of financial sextortion and AI-generated child sexual abuse material also rose sharply. Features such as direct messages, image and video sharing and fake profile creation on platforms like Instagram, Facebook and Snapchat have reportedly enabled predators to groom children, blackmail them with explicit images and recruit them for trafficking.
Internal Meta documents unsealed in cases such as New Mexico’s lawsuit showed as many as 500,000 children receiving sexually inappropriate messages daily on Facebook and Instagram alone, yet basic safeguards were often delayed until lawsuits and public pressure forced changes, Pinter said.
Pinter told the DCNF that the National Center on Sexual Exploitation is representing two 13-year-olds who were originally sextorted through Snapchat and are now suing X.
“We represent two 13-year-olds. We’re suing the X platform because their child sexual abuse material is spread on that platform. But they were originally sextorted on Snapchat. We’ve seen this exact play happen over and over and over again,” Pinter said. “So like our children were contacted by what they thought was a 16-year-old girl. Because of the way that these social media platforms link up and the data they provide, this predator was able to find the children’s network.”
“They could see where they live. They found their social media accounts and their friends. So they were able to pretend to be a child in their community to convince them to send nude images and then blackmail them with those images to send more and more explicit content,” Pinter continued.
Social media’s addictive design has also fueled a mental health crisis among youth, according to some research. U.S. teens spend nearly five hours daily on platforms, according to Gallup. Children and adolescents spending more than three hours a day on social media face double the risk of depression and anxiety symptoms, per the U.S. Surgeon General’s advisory.
A 2025 Pew Research survey also found 45% of teens feel they spend too much time online, up from 36% in 2022, and 48% say it has a mostly negative effect on peers their age. Recent studies, including one from UC San Francisco, show depression symptoms rising 35% as daily use increased over time, while the World Health Organization found problematic social media behavior climbed from 7% to 11% among adolescents between 2018 and 2022.
Dr. Don Grant, National Advisor of Healthy Device Management for Newport Healthcare and a media psychologist with a doctorate specializing in social media’s effects, told the DCNF that while it’s not formally classified as addiction in the DSM, platforms are engineered to hook vulnerable young users through features mimicking established addictive mechanisms.
“It’s using things like established psychological flaws in our limbic system. Intermittent rewards, our craving for variable rewards, intermittent rewards,” Grant said. “It’s the same kind of philosophy that was used when slot machines were developed.”
“You pull the handle or now you push a button and you don’t know what you’re going to get. And it keeps you there, keeps you there, keeps you there,” Grant added. “Because we are always going to dig to the bottom of the Cracker Jack box. We’re going to buy the secret things at the store that we don’t know what they are for collectibles to see what we get. It’s a flaw.”
Grant explained that these engineered features exploit young users’ impulsivity and craving for affirmation, turning social media into a feedback loop of comparison and self-doubt. In his years of clinical work with teens, Grant told the DCNF he’s seen a surge in depression, anxiety, self-harm and suicide after rates had held steady for decades.
It wasn’t until 2010 to 2012 that psychologists began to notice what one described as a “crazy spike” in certain behaviors among kids, Grant said. While stressing that “correlation is not causation,” Grant noted he could not rule out social media’s role after working with hundreds of clients showing similar patterns.
“There are many, many factors. Correlation is not causation. There are many factors that could be identified. And we have identified the potential factors that either singularly or together in synergy could be the reason why the kids are not all right, no matter what The Who said in 1979. There could be,” Grant said. “And we’re not saying that it’s any one because there’s lots of factors that could explain why.”
“But you are never personally going to convince me without only my knowledge base and research base, study base, and my doctorate, my doctoral addictions counselor certification and working with addiction for all these years,” Grant added. “You will never convince me that social media is not a significant factor for a majority of these things, of this, of these outcomes that we’re seeing with depression, anxiety, self-harm, and you know what? It doesn’t make sense because we started seeing it right after Instagram and Snapchat. And it’s only gotten more prolific.”
Grant highlighted a stark irony: many tech leaders who built or profited from these platforms have long limited or banned their own children’s access.
In a document he shared with the DCNF, Grant compiled statements from Silicon Valley executives showing they knew about the addictive designs and potential harms from the start, yet prioritized growth over safeguards.
Among the examples: Facebook founding President Sean Parker admitted the platform exploited “a vulnerability in human psychology” with variable rewards to consume users’ time, saying, “The inventors … creators … understood this consciously, and we did it anyway.” Former executive Chamath Palihapitiya expressed “tremendous guilt,” calling the platforms tools that are “ripping apart the fabric of how society works.”
The late Apple CEO Steve Jobs wouldn’t let his kids have an iPad, his own company’s invention. Current Apple CEO Tim Cook restricted his nephew from social networks, and even Zuckerberg reportedly wrote in a 2016 email that notifying parents about teens’ live videos “will probably ruin the product from the start.”
Grant said these admissions underscore that industry insiders recognized the dangers early but continued anyway.
As Zuckerberg testifies in this landmark bellwether case, the first jury trial over claims that social media giants knowingly designed addictive, harmful products for kids, the outcome could establish a new baseline for corporate responsibility. A plaintiff verdict could force industry-wide safeguards that parents have long demanded, holding platforms accountable for the exploitation and mental health toll that victims say they have enabled for years.
All content created by the Daily Caller News Foundation, an independent and nonpartisan newswire service, is available without charge to any legitimate news publisher that can provide a large audience. All republished articles must include our logo, our reporter’s byline and their DCNF affiliation. For any questions about our guidelines or partnering with us, please contact [email protected].