
Congress Abandons Efforts to Block State AI Regulations: Implications for Consumers


After months of debate, a plan in Congress to block states from regulating artificial intelligence was pulled from the big federal budget bill this week. The proposed 10-year moratorium would have prevented states from enforcing rules and laws on AI if the state accepted federal funding for broadband access.

The issue exposed divides among technology experts and politicians, with some Senate Republicans joining Democrats in opposing the move. The Senate eventually voted 99-1 to remove the proposal from the bill, which also includes the extension of the 2017 federal tax cuts and cuts to services like Medicaid and SNAP. Congressional Republican leaders have said they want the bill on President Donald Trump’s desk by July 4.


Tech companies and many Congressional Republicans supported the moratorium, saying it would prevent a “patchwork” of rules and regulations across states and local governments that could hinder the development of AI — especially in the context of competition with China. Critics, including consumer advocates, said states should have a free hand to protect people from potential issues with the fast-growing technology. 

“The Senate came together tonight to say that we can’t just run over good state consumer protection laws,” Sen. Maria Cantwell, a Washington Democrat, said in a statement. “States can fight robocalls, deepfakes and provide safe autonomous vehicle laws. This also allows us to work together nationally to provide a new federal framework on artificial intelligence that accelerates US leadership in AI while still protecting consumers.”

Despite the moratorium being pulled from this bill, the debate over how the government can appropriately balance consumer protection and supporting technology innovation will likely continue. “There have been a lot of discussions at the state level, and I would think that it’s important for us to approach this problem at multiple levels,” said Anjana Susarla, a professor at Michigan State University who studies AI. “We could approach it at the national level. We can approach it at the state level, too. I think we need both.”

Several states have already started regulating AI

The proposed moratorium would have barred states from enforcing any AI regulation, including rules already on the books. The exceptions would have been rules and laws that make things easier for AI development and those that apply the same standards to non-AI models and systems that do similar things. These kinds of regulations are already starting to pop up. The biggest push so far has come not in the US but in Europe, where the European Union has already implemented standards for AI. But states are starting to get in on the action.

Colorado passed a set of consumer protections last year, set to go into effect in 2026. California adopted more than a dozen AI-related laws last year. Other states have laws and regulations that often deal with specific issues such as deepfakes or require AI developers to publish information about their training data. At the local level, some regulations also address potential employment discrimination if AI systems are used in hiring.

“States are all over the map when it comes to what they want to regulate in AI,” said Arsen Kourinian, a partner at the law firm Mayer Brown. So far in 2025, state lawmakers have introduced at least 550 proposals around AI, according to the National Conference of State Legislatures. At a House committee hearing last month, Rep. Jay Obernolte, a Republican from California, signaled a desire to get ahead of more state-level regulation. “We have a limited amount of legislative runway to be able to get that problem solved before the states get too far ahead,” he said.


While some states have laws on the books, not all of them have gone into effect or seen any enforcement. That limits the potential short-term impact of a moratorium, said Cobun Zweifel-Keegan, managing director in Washington for IAPP, a professional association for privacy and AI governance. “There isn’t really any enforcement yet.”

A moratorium would likely deter state legislators and policymakers from developing and proposing new regulations, Zweifel-Keegan said. “The federal government would become the primary and potentially sole regulator around AI systems,” he said.

What a moratorium on state AI regulation means

AI developers have asked for any guardrails placed on their work to be consistent and streamlined. 

“We need, as an industry and as a country, one clear federal standard, whatever it may be,” Alexandr Wang, founder and CEO of the data company Scale AI, told lawmakers during an April hearing. “But we need one, we need clarity as to one federal standard and have preemption to prevent this outcome where you have 50 different standards.”

During a Senate Commerce Committee hearing in May, OpenAI CEO Sam Altman told Sen. Ted Cruz, a Republican from Texas, that an EU-style regulatory system “would be disastrous” for the industry. Altman suggested instead that the industry develop its own standards.

Asked by Sen. Brian Schatz, a Democrat from Hawaii, if industry self-regulation is enough at the moment, Altman said he thought some guardrails would be good, but, “It’s easy for it to go too far. As I have learned more about how the world works, I am more afraid that it could go too far and have really bad consequences.” (Disclosure: Ziff Davis, parent company of CNET, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

Not all AI companies are backing a moratorium, however. In a New York Times op-ed, Anthropic CEO Dario Amodei called it “far too blunt an instrument,” saying the federal government should create transparency standards for AI companies instead. “Having this national transparency standard would help not only the public but also Congress understand how the technology is developing, so that lawmakers can decide whether further government action is needed.”

Sen. Ted Cruz and Sen. Maria Cantwell at a hearing of the US Senate Committee on Commerce, Science and Transportation, which has held hearings on artificial intelligence. (Nathan Howard/Bloomberg via Getty Images)

Concerns from companies, both the developers that create AI systems and the “deployers” who use them in interactions with consumers, often stem from fears that states will mandate significant work, such as impact assessments or transparency notices, before a product is released, Kourinian said. Consumer advocates have countered that more regulation is needed, and that hampering states’ ability to act could hurt the privacy and safety of users.

A moratorium on specific state rules and laws could result in more consumer protection issues being dealt with in court or by state attorneys general, Kourinian said. Existing laws around unfair and deceptive practices that are not specific to AI would still apply. “Time will tell how judges will interpret those issues,” he said.

Susarla said the pervasiveness of AI across industries means states might be able to regulate issues such as privacy and transparency more broadly, without focusing on the technology. But a moratorium on AI regulation could lead to such policies being tied up in lawsuits. “It has to be some kind of balance between ‘we don’t want to stop innovation,’ but on the other hand, we also need to recognize that there can be real consequences,” she said.

Much policy around the governance of AI systems does happen because of those so-called technology-agnostic rules and laws, Zweifel-Keegan said. “It’s worth also remembering that there are a lot of existing laws and there is a potential to make new laws that don’t trigger the moratorium but do apply to AI systems as long as they apply to other systems,” he said.




