SB 53, the AI safety law California Governor Gavin Newsom signed this week, is evidence that state regulation doesn’t need to hinder AI progress.
That’s according to Adam Billen, vice president of public policy at the youth-led advocacy group Encode AI, speaking on today’s episode.
“The reality is that policymakers themselves know we have to do something, and they know from working on a million other issues that there is a way to pass legislation that genuinely protects the innovation I care about,” he said.
SB 53 is the first law in the country to require large AI labs to be transparent about their safety and security protocols, specifically how they prevent their models from being used for catastrophic risks such as cyberattacks on critical infrastructure or the creation of bioweapons. The law also requires companies to stick to those protocols, with enforcement by California’s Office of Emergency Services.
“Companies are already doing the things that we ask them to do in this bill,” Billen told TechCrunch. “They do safety testing on their models. They release model cards. Are some companies starting to skimp in some areas? Yes.”
Billen also noted that some AI companies have policies that allow them to relax their safety standards under competitive pressure. OpenAI, for example, has publicly stated that it could “adjust” its safety requirements if a rival AI lab releases a high-risk system without similar safeguards. Billen argues that SB 53 holds companies to the safety promises they have already made and keeps them from cutting corners under competitive or financial pressure.
Public opposition to SB 53 was muted compared with the backlash against its predecessor, SB 1047, which Newsom vetoed last year. Still, much of the rhetoric from Silicon Valley and most AI labs holds that almost any AI regulation will slow progress and ultimately cost the US the race against China.
This is why companies like Meta, VCs like Andreessen Horowitz, and powerful individuals like OpenAI president Greg Brockman are pouring hundreds of millions of dollars into super PACs that back pro-AI politicians in state elections. And earlier this year, those same forces pushed for an AI moratorium that would have banned states from regulating AI for 10 years.
Encode AI helped run a coalition of more than 200 organizations to defeat that proposal, but Billen says the fight is not over. Senator Ted Cruz, who championed the moratorium, is pursuing new strategies to achieve federal preemption of state AI laws. In September, Cruz introduced the SANDBOX Act, which would let AI companies apply for waivers to temporarily bypass certain federal regulations for up to 10 years. Billen also anticipates future legislation that will be pitched as a compromise federal AI standard but would in practice override state laws.
He warned that a narrowly scoped federal AI law could “delete federalism for the most important technology of our time.”
“If you say SB 53 is the bill that should replace all state bills related to AI, that’s probably not a very good deal; this is a bill designed to deal with one specific set of issues,” Billen said.

He agrees that the AI race with China matters and that policymakers need to enact regulation that supports American progress, but he says killing state bills that focus largely on deepfakes, transparency, algorithmic discrimination, child safety, and government use of AI is not the way to do it.
“Do bills like SB 53 stop us from beating China? No,” he said. “I think it’s really intellectually dishonest to say that that’s what’s going to stop us in this race.”
He added: “If what you care about is beating China in the race on AI, and I do care about that, then the thing you would be pushing for is export controls in Congress, making sure American companies have the chips. But that is not what the industry is pushing for.”
Legislative proposals like the Chip Security Act aim to prevent advanced AI chips from being diverted to China through export controls and tracking requirements, while the existing CHIPS and Science Act seeks to boost domestic chip production. However, some major tech companies, including OpenAI and Nvidia, have been lukewarm on or opposed to certain aspects of these efforts, citing concerns about effectiveness, competitiveness, and security vulnerabilities.
Nvidia has an obvious reason: the company has a strong financial incentive to keep selling chips to China, which has historically accounted for a significant share of its global revenue. Billen speculated that OpenAI may be holding back on advocating for chip export controls in order to stay in the good graces of key suppliers like Nvidia.
The Trump administration has also sent mixed messages. After expanding the ban on exports of advanced AI chips to China in April 2025, the administration reversed course three months later, allowing Nvidia and AMD to sell certain chips to China in exchange for 15% of the revenue from those sales.
“We’re seeing people on the Hill move toward bills like the Chip Security Act, which would put export controls on chips going to China,” Billen said. “And at the same time, they keep pushing the narrative to kill state bills that are actually quite light-touch.”
Billen added that SB 53 is an example of democracy at work, with industry and policymakers collaborating to arrive at a version of the bill that everyone could agree on. The process is “very ugly and messy,” he said, but “the democratic, federalist process is the whole foundation of our country and our economic system, and I hope we keep doing it well.”
“I think SB 53 is one of the best proof points that it can still work,” he said.
This article was originally published on October 1st.