The Trump administration released its long-awaited national AI policy framework on Friday, March 20, and if Congress follows through, it could change how artificial intelligence is governed across the entire country.
The framework urges Congress to adopt a federally unified, innovation-oriented approach centered on preempting state AI laws and taking a "light-touch" regulatory stance. Think of it as Washington trying to be the one voice in the room before 50 states all start talking at once.
The blueprint outlines six guiding principles: protecting children and empowering parents, safeguarding American communities, respecting intellectual property rights, preventing censorship and protecting free speech, enabling innovation and ensuring American AI dominance, and educating Americans and developing an AI-ready workforce.
On kids' safety, the framework calls on Congress to require AI platforms likely to be accessed by minors to implement features that reduce risks of sexual exploitation and self-harm, and to give parents tools to manage children's privacy, screen time, and content exposure.
On energy, a topic that has been simmering for months, the White House is pushing to codify a pledge signed by companies including Amazon, Google, and OpenAI that would require tech firms to supply or pay for the electricity their data centers consume. The framework also calls on Congress to streamline permitting so data centers can generate power on site.
For creators and publishers, the framework tries to thread the needle. It recommends that Congress protect creators' voices and likenesses, while also maintaining that AI scraping the internet for copyrighted material does not violate U.S. copyright law, leaving the courts to make the final call on fair use.
Notably, parts of this are already starting to take shape. The Take It Down Act, signed into law in 2025, created the first federal requirement for platforms to remove non-consensual AI-generated content. Meanwhile, proposals like the NO FAKES Act, the DEFIANCE Act, and the CLEAR Act signal growing bipartisan momentum to address voice, likeness, accountability, and transparency in the age of AI.
This is where the real divide starts to show. The administration says states should not be permitted to regulate AI development, should not penalize AI developers for a third party's unlawful conduct using their product, and should not unduly burden Americans' use of AI for activity that would otherwise be lawful.
The framework also argues that Congress should not create any new federal rulemaking body to regulate AI and should instead maintain a sector-specific approach using existing regulatory bodies.
The Political Reality
Don't hold your breath for a quick vote. House Republican leaders endorsed the framework and said they're ready to work across the aisle, but passing legislation would be a heavy lift requiring agreement with Democrats in the Senate as divisions over AI run deep.
Some Democrats aren't buying it. Rep. Josh Gottheimer of New Jersey said the proposal fails to deliver strong accountability for AI companies, and that Americans need real protection, not a framework that lets the AI industry operate like the Wild West.
On the other side, pro-innovation voices are largely cheering. NetChoice's Patrick Hedger argued that a light-touch regulatory environment, not 50 conflicting regulatory regimes, enabled the internet revolution, and that AI will require the same approach to win globally.
Whether or not this specific framework becomes law, it signals the direction Washington wants to move: federal control, minimal friction for companies and a bet that innovation should lead policy rather than the other way around. For creators, the IP question is one to watch closely. For parents and educators, the child safety provisions are meaningful, even if the broader accountability gaps make some nervous. And for anyone building in the AI space right now, the regulatory environment just got a little clearer and a lot more federal.
My hot take: A federal standard isn't just cleaner; it's necessary. A patchwork of state laws might check Big Tech, but it hurts small businesses and could quietly shut out the next generation of builders before they even get started.
But that only works if we trust the system enforcing it. Without real oversight, a light-touch approach can quickly turn into no accountability at all.
Friendly Reminder
You don't have to have it all figured out. You just have to start.
Remember, I'm Bullish on you! With gratitude,