Frustrated with what they describe as a lack of accountability from social media companies, two California state lawmakers have introduced a bill that would clear a legal pathway for them to face lawsuits in the state for failing to detect or remove child sexual abuse material on their websites and apps.

Assembly members Maggy Krell and Buffy Wicks, both Democrats, told the Guardian they were spurred to act after witnessing how online exploitation inflicts “profound trauma on a staggering number of children”.

The move follows two landmark trial verdicts in California and New Mexico in March, in which Meta and YouTube were found liable for harm inflicted on children. With more lawsuits in the pipeline, states across the country are working to increase legal accountability for tech giants over harms against children committed through their sites and apps.

“This is the most urgent issue of our time when it comes to protecting our most vulnerable children,” said Krell. “I want to see these companies really invest and prioritize protecting kids. The money that they’re spending on defending against lawsuits would be better spent on fixing their platforms so that children do not continue to be harmed on their sites.”

An amended version of the bill, known as AB 1946, was published on 6 April. Under its provisions, companies would be required to perform biannual audits on their platforms for the impact of design choices on child safety risks, and submit them to the attorney general’s office, as well as simplify reporting procedures for users. If the bill passes by the end of the legislative session in August, it would come into force on 1 January 2027.

“Any conversation I’m having with parents right now is about the role of big tech in [the lives of] our kids,” said Wicks. “These companies have so much access to our kids in their day-to-day lives. At a bare minimum, child abuse material should not be on these platforms. It should be low-hanging fruit to prevent this.”

According to Krell and Wicks, the bill is designed to address gaps in existing law by increasing liability for platforms that host content generated by users.

The bill would also empower the attorney general and other public prosecutors to have access to information and to file legal action when appropriate, Krell and Wicks said. Penalties from attorney general enforcement actions would be allocated to a survivor support fund.

“Social media platforms have become superhighways for the proliferation of child sexual abuse material. And we’re sick of it,” said Wicks.

Under current California law, companies have up to 30 days to act in certain cases involving harmful material. The bill would reduce that window to 48 hours in many situations and also require any new child sexual abuse material detected by social media companies to be reviewed by a human moderator.

US federal law shields social media sites and apps from civil liability for what their users post, which allows companies to remove harmful or objectionable content without being deemed liable for all content on their sites. Where their criminal liability begins when users commit crimes, and how far they are obliged to cooperate with law enforcement, remains a longstanding matter of debate, though federal law specifies that sites are not shielded from liability for sex trafficking.

Social media companies had faced few lawsuits over child safety issues until earlier this year. The California and New Mexico lawsuits were among the first of their kind, targeting design features of social media platforms that enabled harms to children as opposed to attempting to hold the platforms responsible for user-generated content. In New Mexico, a jury found that Meta misled consumers about safety and enabled harm to young users. In California, Meta and YouTube were found to have deliberately designed addictive products to hook young users, and to have failed to adequately warn of the risks.

Krell has years of experience of combating online sexual exploitation. Before entering politics, she was a prosecutor who played a key role in the takedown of Backpage, a platform that authorities alleged was knowingly used to facilitate sex trafficking, including of minors.

“I want to see this bill and these lawsuits preventing future harm,” said Krell. “So we’re not winning by filing a lot of lawsuits. Behind each of those lawsuits, it’s a bunch of kids who have been sexually abused.”

Krell said lobbyists for tech companies frequently visit her office to outline the steps their clients take to remove child sexual abuse material from their platforms. Yet any industry-initiated efforts that may be under way are not sufficient to tackle the problem alone, she said.

“I have no faith in the honor system when it comes to big tech removing harmful child sexual abuse material from their websites,” said Krell. She added that her wariness of tech companies policing their own platforms dates back to the Backpage case, which came “after two years of them assuring the National Center for Missing & Exploited Children that no children were being sold on their site, which was a lie”.