Whistleblowers tell Senate panel Meta hid VR harms
WASHINGTON — Senators on Tuesday accused Meta, the parent company of Facebook and Instagram, of covering up harms such as the sexual exploitation of children on its virtual reality platform.
The Senate Judiciary Committee’s Subcommittee on Privacy, Technology, and the Law heard testimony from former Meta Platforms researchers Jason Sattizahn and Cayce Savage, who allege Meta suppressed research into user safety, including psychological harms for children and women.
Subcommittee Chair Marsha Blackburn, R-Tenn., called the researchers’ allegations “shocking” and thanked them for coming forward.
“They were hired to purportedly make the platform safer for children, and what they found was a company that knew their products were unsafe. And they just did not care,” Blackburn said.
The panel’s ranking member, Minnesota Democratic Sen. Amy Klobuchar, placed the allegations in the context of previous Meta whistleblowers and earlier congressional hearings on the company’s impact on young people.
“For too long, these companies have worked to attract kids to their platforms. They do so knowing that their platforms use algorithms that increase the risk of sexual exploitation, push harmful content, facilitate bullying and provide venues, sadly, for dealers to sell deadly drugs like fentanyl,” she said.
Sattizahn, who worked at Meta from 2018 to 2024 in integrity research, including for Meta’s virtual reality platform, alleged that the company had “no interest” in VR safety unless it could drive profit.
“When our research uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos and other acts that no child should ever be exposed to,” Sattizahn said, “Meta demanded that we erase any evidence of such dangers that we saw.”
He said he continued researching other harms on the VR platform despite what he described as threats to his job and Meta’s concerns that the research would hurt the company.
“My research still revealed emotional and psychological damage, particularly to women who were sexually solicited, molested or worse. In response, Meta demanded I change my research in the future to not gather this data,” Sattizahn said.
Savage worked for Meta from 2019 to 2023 as a user experience researcher. She said the company is aware of many children using its virtual reality platform but does not acknowledge the problem because removing young users to comply with federal law would decrease the active user numbers given to shareholders. She said turning a blind eye is part of a wider culture at Meta.
“While I speak about virtual reality, it is important to understand that the way Meta has approached safety for VR is emblematic of its negligent approach to safety for all of its products,” she said.
Savage, who has a background in psychology, emphasized that children experience virtual reality as reality.
“Because VR is immersive and embodied, negative experiences cause greater psychological harm than similar experiences on an iPad or an Xbox,” she said. “In VR, someone can stand behind your child and whisper in their ear and your child will feel their presence as if it is real.”
She listed harms on the VR platform as including bullying, sexual assault and solicitation for nude photos.
Sen. Josh Hawley, R-Mo., asked Savage what proportion of young users are exposed to sexual content or abuse. She said explicit content is difficult to monitor on VR and that the prevalence is “extremely high.”
“I would estimate that any child that is in a social space in VR will come in contact with or directly [be exposed to] something very inappropriate,” Savage said.
In an emailed statement, Meta called claims put forward in the hearing “nonsense.”
“They’re based on selectively leaked internal documents that were picked specifically to craft a false narrative,” the statement read. “The truth is there was never any blanket prohibition on conducting research with young people and, since the start of 2022, Meta approved nearly 180 Reality Labs-related studies on issues including youth safety and well-being.”
Youth-led coalition Design it for Us said in a statement that the whistleblower allegations confirm young people’s experiences with Meta’s platforms.
“Meta has spent years touting its efficacy in protecting young people online, but all the while, they were deleting evidence and research that confirmed countless instances of harm to young people on their platforms. These allegations are not isolated — they’re part of a well-documented, consistent pattern of negligence and deception from Meta.”
Proposed legislation
Blackburn encouraged senators to pass her bill, known as the Kids’ Online Safety Act, which she said “will hold Big Tech accountable.”
The bill, which has not yet been marked up by the Senate Commerce Committee, would require social media platforms to take design steps to prevent certain harms to kids, including sexual exploitation and the marketing of narcotics.
Klobuchar, a co-sponsor of the bill, also encouraged its passage, although she acknowledged it may be difficult.
“I have worked in this area for a long time myself and have known the frustration,” she said. “No matter what you seem to do, you get lobbied against and millions of dollars against you.”
The bill passed the Senate last year, along with an update to the minimum age below which platforms cannot collect kids’ data without parental consent.
In California, a federal court blocked enforcement of a law known as the California Age-Appropriate Design Code Act that would have similarly required platforms to design their systems in an age-appropriate manner; the case hinged on free speech concerns.
In addition to Blackburn’s platform design measure, Klobuchar encouraged repealing Section 230 of the 1996 telecommunications law, which shields social media platforms from liability for content posted by users on their sites.
“Other industries do not enjoy the similar level of protection. If they have an appliance that blows up or they have a tire that blows up on the roads, there’s accountability; they get sued,” Klobuchar said.
Court reviews
Federal courts have so far been more open to age verification to prevent harms to children than to design requirements. Savage said she proposed a study to improve Facebook’s age data, combining age verification, stated age and age prediction, but the study was shut down. She also said users are hesitant to provide age data to the platform and parents often set up accounts with their own information, rather than a child’s, because they don’t understand the importance of an age-appropriate account.
The Senate Commerce Committee earlier this year voted to advance two other bills. One would raise the age at which platforms no longer need a user’s consent to collect their data from 13 to 17, and would allow children or their parents to request access to such data and to correct or delete it. The other would bar users under 13 from social media platforms and prohibit platforms from using a child’s data to feed personalized content recommendations to users under 17. Neither has been scheduled for floor consideration.
Last week, Senate Judiciary Chair Charles E. Grassley, R-Iowa, as well as Blackburn and Hawley, sent a letter to Meta requesting information on safeguards in place for young people on the company’s platforms, as well as the use of Meta VR by children and protection against sexual exploitation. The letter requests responses by Sept. 16.
©2025 CQ-Roll Call, Inc., All Rights Reserved. Visit cqrollcall.com. Distributed by Tribune Content Agency, LLC.