As we head into 2026, it already feels clear that 2025 will be remembered as a turning point: the year artificial intelligence became a regular part of daily life. ChatGPT is now as ubiquitous as Google, Waymo cars are a familiar sight on big city streets, and distinguishing between real and AI-generated images or videos is no longer straightforward.
Every sector, from entertainment and finance to health care and beyond, is grappling with the implications. Education is no exception.
It can be tempting to bury our heads in the sand, wait for AI to take shape, and hope that it’s not as big a deal as those who stand to benefit the most are saying it is. But whether we welcome it or not, AI is here, it is accelerating, and its trajectory will be shaped by those willing to engage. That includes educators, families and policymakers alike.
Too often, people are labeled as either “AI optimists” or “AI pessimists.” The truth is more complex. I’m both.
I see incredible potential — for educators to personalize learning, for students to access support that once felt out of reach, and for schools to operate more equitably and efficiently.
But I also see real risk, particularly when innovation moves faster than the guardrails meant to protect the public. In their book “Governing the Machine,” three leading AI policy experts argue that broad adoption of AI depends on public trust, and that trust is built through thoughtful regulation. After all, would we board a plane or drive a car without confidence in the safety standards behind them?
In that context, California lawmakers have taken meaningful steps toward leadership in AI policy. While far from perfect, the 2025 legislative session produced a growing slate of laws aimed at establishing public-interest safeguards as AI use expands.
Here’s a snapshot of key AI bills from 2025:
• SB 53 (Wiener) — Transparency in Frontier AI Act
Requires developers of large-scale “frontier” AI models to publish safety protocols, report major incidents and protect whistleblowers. Signed into law.
• SB 243 (Padilla) — AI Chatbot Safeguards for Minors
Applies to AI “companion” tools used by minors, mandating disclosures that users are interacting with AI and requiring safeguards against harmful content. Signed into law.
• SB 11 (Ashby) — AI & Digital Replicas
Requires warnings when AI tools can generate realistic fake images, audio or video, and directs courts to examine authentication standards for AI-generated evidence. Became law without the governor’s signature.
Additionally, two bills passed the Legislature but were vetoed by the governor.
The first, the LEAD for Kids Act (AB 1064), would have restricted AI companion chatbots likely to promote self-harm, violence or sexual content. While it was intended to protect children, Gov. Gavin Newsom vetoed the bill over concerns that its language was overly broad. This decision highlights the complexity of legislating fast-moving technology: how to safeguard young people without creating sweeping rules that may have unintended consequences. As lawmakers return this session, there’s an opportunity to revisit this issue with greater precision, informed by experts across education, child safety and technology.
Another, the No Robo Bosses Act (SB 7), would have required disclosure when AI is used in hiring or disciplinary decisions and prohibited sole reliance on automated systems. The governor vetoed this bill, describing the proposed regulations as “unfocused” and failing to offer targeted solutions for the risks posed by AI in the workplace. It’s expected that lawmakers will revise the bill this year to address his stated concerns.
As lawmakers reconvene for the 2026 session, we should urge them to prioritize student-centered AI policies that emphasize safety, transparency and educator support. The concerns facing schools go well beyond plagiarism or cheating; they include digital safety, the erosion of critical thinking skills (cognitive offloading), and how we prepare young people for a future shaped by AI.
AI is not waiting for us to catch up, and neither should our policies. With limited federal action, the spotlight is on California. In the world’s fourth-largest economy, leadership isn’t optional; it’s a responsibility, and our youth can’t afford for education leaders to remain on the sidelines.
Andrew Simmerman is a government affairs professional with a background in public education, advocacy and strategic policy engagement. He’s currently focused on the intersection of AI, K-12 education and regulation — tracking how emerging technologies shape public policy, workforce development and society. He wrote this for EdSource, an independent nonprofit organization founded in 1977, dedicated to providing analysis on key education issues facing the state and nation. Go to edsource.org to learn more.