Andrew Caballero-Reynolds/AFP via Getty Images
It’s been eight months since Sam Altman, CEO of ChatGPT maker OpenAI, called on U.S. senators to pass a law forcing accountability from major companies like Amazon, Google, and OpenAI investor Microsoft.
“There’s going to be fewer companies because of the resources they need, so I think there’s going to be an incredible amount of scrutiny on us and our competitors,” Altman said in May 2023.
The federal government is studying the issue, but the scrutiny and regulation Altman proposed have yet to materialize.
That’s even as large AI models continue to advance, doing exciting things like discovering new antibiotics and helping humans communicate with whales. At the same time, concerns are growing about election-season fraud and automated employment discrimination.
In 2023, many of the world’s leading AI experts signed a statement on AI risk, alerting policymakers to the possibility of disaster.
“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war,” the statement reads.
Now, Democratic state Sen. Scott Wiener of San Francisco is introducing legislation that he says could serve as a model for other states to follow, if not the federal government.
“I would love to have one unified federal law that effectively addresses AI safety, but Congress has not passed such a law,” Wiener said. “Congress has not come close to passing such a law.”
Wiener argues his proposal is the most ambitious of its kind in the country. And as the new chair of the state Senate Budget Committee, he says he wants to use that position to pass aggressive legislation.
California’s measure, Senate Bill 1047, would require companies building the largest and most powerful AI models to test those models for safety before releasing them to the public.
AI companies would have to report to the state about their testing protocols and guardrails, and California’s attorney general could sue if their technology caused “substantial harm.”
Wiener says his bill draws heavily on the Biden administration’s 2023 executive order on AI.
More than 400 AI-related bills are pending in 44 states, according to BSA, a software industry group. But with many of the major companies developing generative AI based in the San Francisco Bay Area, a bill moving through the Capitol in Sacramento could become a legal landmark if passed.
More than 60% of generative AI job postings in the year ending in July 2023 were concentrated in 10 U.S. metropolitan areas, led far and away by the Bay Area, according to the Brookings Institution think tank.
In the absence of federal oversight, there are industry efforts underway to allay concerns about AI, including a recent joint commitment to combat the deceptive use of AI in this year’s elections around the world. But these are voluntary initiatives, raising the question of who will hold companies accountable, especially as the technology grows more and more powerful. OpenAI recently introduced Sora, a text-to-video model with stunning capabilities far beyond those of models released just a year ago.
Meanwhile, the FTC and other regulators are applying existing laws to AI, but many experts say that is not enough to rein in AI developers and the nefarious individuals and organizations that use the technology to break the law.
Federal Trade Commission Chair Lina Khan raised this question at an FTC summit on AI last month: “Will a few powerful companies centralize control of these critical tools, locking us into a future of their choosing?”
Hany Farid, a professor at the University of California, Berkeley School of Information, who specializes in digital forensics, misinformation, and human cognition, questioned how effective the patchwork of state regulations will be in reining in the industry.
“I don’t think it makes sense for individual states to try to regulate in this area, but if any state should do so, it should be California,” Farid said. “A benefit of state regulation is that it pressures the federal government to act, in order to avoid a confusing patchwork of state-by-state technology regulations.”
Grace Gedye, an AI policy analyst at Consumer Reports, added that in the current political climate, states may have to take the lead on the issue. “We definitely can’t hold our breath [for Congress to act],” she said, “because we could be waiting 10 or 20 years.”