President Donald Trump signed an executive order Thursday evening that directs federal agencies to take aim at state AI laws, arguing startups need relief from a “patchwork” of rules. But legal experts and startups say the order could prolong uncertainty, sparking court battles that leave young companies navigating shifting state requirements while waiting to see if Congress can agree on a single national framework.
The order, titled “Ensuring a National Policy Framework for Artificial Intelligence,” directs the Department of Justice to set up a task force within 30 days to challenge certain state laws on the grounds that AI is interstate commerce and should be regulated federally. It gives the Commerce Department 90 days to compile a list of “onerous” state AI laws, an assessment that could affect states’ eligibility for federal funds, including broadband grants.
It also asks the Federal Trade Commission and Federal Communications Commission to explore federal standards that could preempt state rules and instructs the administration to work with Congress on a uniform AI law.
The order lands amid a broader push to rein in state-by-state AI rules after efforts in Congress to pause state regulation stalled. Lawmakers in both parties have argued that without a federal standard, blocking states from acting could leave consumers exposed and companies largely unchecked.
“This David Sacks-led executive order is a gift for Silicon Valley oligarchs who are using their influence in Washington to shield themselves and their companies from accountability,” said Michael Kleinman, Head of U.S. Policy at the Future of Life Institute, which focuses on reducing extreme risks from transformative technologies, in a statement.
Sacks, Trump’s AI and crypto policy czar, has been a leading voice behind the administration’s AI preemption push.
Even supporters of a national framework concede the order doesn’t create one. With state laws still enforceable unless courts block them or states pause enforcement, startups could face an extended transition period.
Sean Fitzpatrick, CEO of LexisNexis North America, U.K., and Ireland, told TechCrunch that states will defend their consumer protection authority in court, with cases likely escalating to the Supreme Court.
While supporters argue the order could reduce uncertainty by centralizing the fight over AI regulation in Washington, critics say the resulting legal battles will create immediate headwinds for startups navigating conflicting state and federal demands.
“Because startups are prioritizing innovation, they typically do not have…robust regulatory governance programs until they reach a scale that requires a program,” Hart Brown, principal author of Oklahoma Gov. Kevin Stitt’s Task Force on AI and Emerging Technology recommendations, told TechCrunch. “These programs can be expensive and time-consuming to meet a very dynamic regulatory environment.”
Arul Nigam, co-founder at Circuit Breaker Labs, a startup that performs red-teaming for conversational and mental health AI chatbots, echoed those concerns.
“There’s uncertainty in terms of do [AI companion and chatbot companies] have to self-regulate?” Nigam told TechCrunch, noting the patchwork of state AI laws does hurt smaller startups in his field. “Are there open-source standards they should adhere to? Should they continue building?”
He added that he is hopeful Congress will now move more quickly to pass a better federal framework.
Andrew Gamino-Cheong, CTO and co-founder of AI governance company Trustible, told TechCrunch the EO will backfire on AI innovation and pro-AI goals: “Big Tech and the big AI startups have the funds to hire lawyers to help them figure out what to do, or they can simply hedge their bets. The uncertainty does hurt startups the most, especially those that can’t get billions of funding almost at will,” he said.
He added that legal ambiguity makes it harder to sell to risk-sensitive customers like legal teams, financial firms, and healthcare organizations, lengthening sales cycles and driving up systems work and insurance costs. “Even the perception that AI is unregulated will reduce trust in AI,” which is already low and threatens adoption, Gamino-Cheong said.
Gary Kibel, a partner at Davis + Gilbert, said businesses would welcome a single national standard, but “an executive order is not necessarily the right vehicle to override laws that states have duly enacted.” He warned that the current uncertainty leaves open two extremes, highly restrictive rules or no action at all, either of which could produce a “wild west” that favors big tech’s ability to absorb risk and wait things out.
Morgan Reed, president of The App Association, meanwhile, urged Congress to quickly enact a “comprehensive, targeted, and risk-based national AI framework. We can’t have a patchwork of state AI laws, and a lengthy court fight over the constitutionality of an Executive Order isn’t any better.”