Trump’s executive order on AI promises ‘one rule book’ — startups may get in legal hot water instead


President Donald Trump signed an executive order Thursday evening directing federal agencies to challenge state AI laws, arguing that startups need relief from a “patchwork” of rules. At the same time, legal experts and startups say the order could prolong uncertainty, sparking court battles that leave startups navigating changing state requirements while waiting to see if Congress can agree on a single national framework.

The order, titled “Ensuring a National Policy Framework for AI,” directs the Department of Justice to convene a task force within 30 days to challenge certain state laws on the grounds that AI is interstate commerce and should be regulated at the federal level. It gives the Commerce Department 90 days to compile a list of states’ “onerous” AI laws, an assessment that could affect states’ eligibility for federal funds, including broadband grants.

The order also asks the FTC and FCC to explore federal standards that could preempt state rules and directs the administration to work with Congress on a uniform AI law.

The order comes amid a broader push to rein in state-by-state AI rules after congressional efforts to block state regulation stalled. Lawmakers in both parties have argued that without a federal standard in place, preventing states from acting could leave consumers and businesses largely unprotected.

“This executive order led by David Sacks is a gift to the Silicon Valley minority who are using their influence in Washington to shield themselves and their companies from accountability,” Michael Kleinman, head of US policy at the Future of Life Institute, which focuses on reducing extreme risks from transformative technologies, said in a statement.

Sacks, the Trump administration’s AI and cryptocurrency policy official, has been a leading voice behind the administration’s preemption push on AI.

Even supporters of a national framework acknowledge that the order does not itself create one. Because state laws remain enforceable unless courts block them or states voluntarily pause enforcement, startups may face an extended transition period.


Sean Fitzpatrick, CEO of LexisNexis North America, UK and Ireland, tells TechCrunch that states will defend their consumer protection authority in court, with cases potentially escalating to the Supreme Court.

While supporters argue the order could reduce uncertainty by focusing the battle over AI regulation in Washington, critics say the legal battles would create immediate headwinds for startups navigating conflicting federal and state demands.

“Because startups prioritize innovation, they typically don’t have robust organizational governance programs until they reach a scale that requires one,” Hart Brown, who led Oklahoma Gov. Kevin Stitt’s Task Force on Artificial Intelligence and Emerging Technology, told TechCrunch. “These programs can be expensive and time-consuming to build in a very dynamic regulatory environment.”

Arul Nigam, co-founder of Circuit Breaker Labs, a startup that red-teams AI-powered chatbots related to mental health, echoed these concerns.

“There’s uncertainty about, do [AI and chatbot companies] have to self-regulate?” Nigam told TechCrunch, noting that the patchwork of state AI laws hurts small startups in his field. “Are there open source standards they should adhere to? Should they keep building?”

He added that he hopes Congress can move more quickly now to pass a stronger federal framework.

Andrew Gamino-Cheong, CTO and co-founder of AI governance startup Trustible, told TechCrunch that the executive order could be counterproductive to its pro-AI-innovation goals: “Big tech companies and big AI startups have the money to hire lawyers to help them figure out what to do, or they can simply hedge their bets. Uncertainty hurts startups the most, especially those that can’t raise billions in funding.”

He added that legal ambiguity makes it difficult to sell to risk-sensitive clients such as legal teams, financial firms and healthcare organizations, lengthening sales cycles and increasing compliance work and insurance costs. “Even the perception that AI is unregulated will reduce trust in AI,” which is already low and threatens adoption, Gamino-Cheong said.

Businesses would welcome a single national standard, but “an executive order is not necessarily the appropriate vehicle to override laws duly enacted by states,” said Gary Kibel, a partner at Davis+Gilbert. He warned that the current uncertainty leaves the door open to two extremes: imposing highly restrictive rules or taking no action at all, either of which could create a “Wild West” that favors Big Tech’s ability to absorb risk and wait things out.

Meanwhile, Morgan Reed, president of the App Association, urged Congress to enact “a comprehensive, targeted, risk-based national framework for AI. We cannot have a patchwork of state AI laws, and a protracted court battle over the constitutionality of an executive order is no better.”
