Published 4 hours ago by zaghrah
It was a moment Silicon Valley has long tried to avoid: Mark Zuckerberg seated before a jury, answering questions under oath about children on his platforms.
Inside a Los Angeles courtroom, the 41-year-old tech executive found himself at the centre of a trial that could reshape how social media giants are held accountable for young users. The case, unfolding in California, is the first in what could become a wave of lawsuits brought by American families who believe platforms such as Meta and Google built systems that kept children scrolling and suffering.
This is no ordinary corporate dispute. The trial asks whether companies behind household names like Instagram, Facebook, WhatsApp and YouTube should bear responsibility for the mental health struggles of young users.
At the heart of the case is a now 20-year-old California resident, identified in court as Kaley G.M., who began using YouTube at six, Instagram at nine, and later joined TikTok and Snapchat. Her legal team argues that years of heavy social media use contributed to serious mental health challenges.
Under-13s are barred from Instagram. Yet in court, Zuckerberg was confronted with internal company data showing millions of children under that age were on the app as far back as 2015. One document revealed that roughly 30% of American “tweens” aged 10 to 12 were using Instagram at the time.
That statistic landed heavily in a courtroom already tense with expectation.
Zuckerberg acknowledged that Meta’s age-verification tools were once inadequate. He told jurors he regrets that the company did not move faster to improve them.
“I always wish that we could have gotten there sooner,” he said, referring to efforts to better identify underage users.
Initially reserved on the stand, he reportedly grew more animated as questioning intensified, shaking his head, gesturing with his hands and occasionally appearing frustrated as plaintiff lawyer Mark Lanier pressed him on internal emails and long-standing company goals.
One such email, written years ago by former public policy chief Nick Clegg, described Meta’s inability to properly enforce its under-13 rule as “indefensible.” Other documents showed internal targets focused on increasing user time spent on Instagram.
Zuckerberg admitted that Meta once set goals around time spent on its apps, but insisted the company’s guiding philosophy was to build “useful services” that connect people. Extended time on the platform, he suggested, was a by-product of a good user experience, not the end goal.
A central theme of the trial is who should be responsible for keeping children off platforms meant for older users.
Zuckerberg argued that companies like Apple and Google, which control the operating systems behind most smartphones, should handle age verification at the device level.
“Doing it at the level of the phone is just a lot clearer,” he said, adding that it would be “pretty easy” for them to implement.
It’s a position that shifts part of the burden away from individual apps and onto tech’s gatekeepers, and one likely to spark further debate well beyond this courtroom.
This trial, expected to run until late March, could set the tone for thousands of similar lawsuits across the United States. Plaintiffs argue that social media companies knowingly designed features to encourage compulsive use among young people, fuelling depression, anxiety, eating disorders and even suicide.
TikTok and Snapchat, also named in the complaint, reached settlements before the trial began, leaving Meta and YouTube to face a jury.
Public reaction has been swift and divided. On social media, some users argue parents should bear primary responsibility for monitoring children’s online habits. Others say Big Tech has long hidden behind disclaimers while quietly benefiting from youthful engagement.
What makes this trial different is that it moves the debate from congressional hearings, where executives often face sharp questions but limited consequences, to a jury room. This time, ordinary citizens will decide whether corporate strategy crossed a legal line.
Beyond Kaley’s case lies a broader cultural reckoning. Social media has become embedded in childhood itself. For many families, banning platforms outright feels unrealistic; yet trusting companies to self-regulate has proven controversial.
Zuckerberg told the court that Meta is now “in the right place” regarding age verification. Whether the jury agrees, and whether that will be enough to protect future users, remains to be seen.
What is certain is this: the era of unchecked tech growth is over. The question now is not whether platforms shape young minds, but how much responsibility their creators must carry for the consequences.
Source: IOL