Newsom Considering California’s AI Regulation Bill 

The AI industry could soon be required to follow new guidelines meant to improve safety and security.
Gov. Newsom speaks at a press conference at the Capitol in Sacramento, Calif., on May 10, 2024. Travis Gillmore/The Epoch Times
Travis Gillmore

California’s legislature passed a bill Aug. 29 that seeks to regulate artificial intelligence (AI) models and establish guidelines for developing the most powerful systems. The measure now awaits Gov. Gavin Newsom’s signature.

Senate Bill 1047, introduced by Sen. Scott Wiener, would regulate AI models and create a framework for the development of a public cloud-computing system—known as CalCompute—that allows for open participation in the industry.

The proposal is supported by dozens of tech companies, labor groups, and other organizations, including the Los Angeles Area Chamber of Commerce; the Center for AI Safety Action Fund, a San Francisco-based, research-oriented nonprofit; and the Economic Security Project Action, a nonprofit with staff nationwide that advocates for economic power for all Americans.

However, the bill is hotly contested by more than 100 organizations, including numerous technology firms, the California Chamber of Commerce, and the California Manufacturers and Technology Association, which represents 400 businesses.

Opponents generally object to what they believe are overly stringent regulations that would limit the industry’s ability to innovate.

The bill’s author said that while AI can help advance medicine, wildfire forecasting, and other emerging fields, the measure is needed to protect Californians from unintended consequences and the potential malicious use of powerful computing models.

“[AI] also gives us an opportunity to apply hard lessons learned over the last decade, as we’ve seen the consequences of allowing the unchecked growth of new technology without evaluating, understanding, or mitigating the risks,” Wiener said in a legislative analysis. “SB 1047 does just that, by developing responsible, appropriate guardrails around development of the largest, most powerful AI systems, to ensure they are used to improve Californians’ lives, without compromising safety or security.”

Billionaire tech entrepreneur Elon Musk voiced his support for the measure in recent days.

“This is a tough call and will make some people upset, but all things considered, I think California should probably pass the SB 1047 AI safety bill,” Musk posted Aug. 26 on X. “For over 20 years, I have been an advocate for AI regulation, just as we regulate any product/technology that is a potential risk to the public.”

Uncertainty abounds, according to consultants for the Assembly’s Judiciary Committee, who wrote in an analysis published in July that the industry presents “unprecedented opportunities and significant challenges,” as highly trained AI models can behave unpredictably and can attract malicious actors intent on misusing the technology.

“This unpredictability, coupled with the high stakes involved, underscores the necessity for stringent oversight and safety protocols,” committee staff wrote in the analysis.

Supporters of the bill suggest criminals and potentially terrorists could use the technology to coordinate attacks, highlighting concerns about cyberattacks, espionage, and misinformation campaigns as just a few examples of why regulations are needed.

Lawmakers from both chambers and both sides of the aisle spoke in support of the bill as it made its way through the legislature this year.

“Artificial intelligence has an enormous potential to benefit our state, our nation, and the world,” Democratic Assemblyman Steve Bennett said Aug. 28 during an Assembly hearing. “Artificial intelligence also has an enormous potential to be misused and cause serious problems that are beyond our ability to even imagine.”

He said the proposal is a necessary first step toward safeguarding the industry.

“This bill is, after my examination of it, I believe, a light touch, the lightest touch you could possibly come up with, which is the companies themselves need to do their own due diligence to make sure this is safe,” Bennett said. “That’s all the bill does: require them to ... do their own due diligence.”

While critics said the bill could stifle innovation, another Assembly member pushed back on that notion.

“It’s time that big tech plays by some kind of a rule. And I’m kind of frankly sick of hearing all this different stuff of, ‘oh, we’re going to stop the growth of tech,’” Republican Assemblyman Devon Mathis told fellow Assembly members before voting on the bill. “No, we’re not. But you have to put guide rails. We have to make sure that they’re going to be responsible players.”

One co-sponsor of the bill said the regulations for the most powerful systems, those costing at least $100 million to develop, are a significant matter of national security and have public safety implications.

“SB 1047 introduces essential safeguards for the creation of highly capable AI models,” Encode Justice—a California-based advocacy group seeking AI regulations—said in a legislative analysis.

Opponents said the bill would unnecessarily hinder AI developers and the companies that use the technology by forcing them to guard against all types of potential harm, including those that are highly implausible.

“Unfortunately, SB 1047 forces model developers to engage in speculative fiction about imagined threats of machines run amok, computer models spun out of control, and other nightmare scenarios for which there is no basis in reality,” the Chamber of Progress—a tech industry coalition headquartered in Virginia—said in a legislative analysis.

Another critic said the bill is too vague and could create obstacles for open-source AI development and for smaller companies that lack the extensive legal teams of larger, well-financed tech firms.

“The bill is well intentioned but ultimately misguided and a genuine threat to innovation,” Ash Rust, managing partner of venture capital fund Sterling Road, posted Aug. 30 on X. “We are at the dawn of an exciting new era with technology offering both incredible new benefits and risks. However, if we are overzealous and try to regulate a core technology versus the potential negative applications, we risk stifling innovation.”

If the bill is ultimately signed into law, the regulations would cost the state between $5 million and $10 million annually in government operations, in addition to between $4 million and $6 million to implement and then operate the CalCompute system each year, according to the legislature’s appropriations committees.

Managing violations could also cost the state up to the low millions of dollars, depending on their number and the workload needed to address them, according to the appropriations committees.

The governor has until Sept. 30 to sign or veto the bill.
