The Price of Unregulated AI
Apr 10, 2026 by FACT
“If we can build machines smart enough to think, then we can build them smart enough to protect our kids.” – Mandi Furniss
The share of children using artificial intelligence is rising sharply each year. A recent RAND Corporation study found that the share of middle school students (ages 11 to 14) using AI to assist with homework and other tasks grew from 48% in May 2025 to 62% in December 2025, a 14-point increase in just seven months. Even more telling, 67% of students agreed with the statement that “The more students use AI for their schoolwork, the more it will harm their critical thinking skills.” A national survey conducted by Common Sense Media similarly found that three out of four American teenagers have used AI chatbots.
Impaired critical thinking is only one part of the problem. In some ways, that concern pales in comparison to other risks. Growing evidence points to broader dangers, including emotionally manipulative chatbot relationships, exposure to sexual content, deepening isolation, and self-harm-related interactions. What begins as a seemingly harmless tool can quickly become something far more dangerous when children are allowed to engage with AI systems that lack meaningful safeguards.
To be clear: AI is not inherently bad. Most Americans already interact with artificial intelligence on a daily basis, whether through search engines, navigation apps, spam filters, streaming recommendations, facial recognition, fraud alerts, or smart devices in their homes and cars. Society is moving quickly to adopt and rely on new technologies that make our lives easier and safer. But as AI progresses at an unprecedented pace, parents, schools, and the government must implement safeguards to protect children from the very real threats posed by unregulated AI.
Tennessee legislators are already taking action.
Senator Ken Yager and Deputy Speaker Jason Zachary joined together to propose the Artificial Intelligence Public Safety and Child Protection Transparency Act (HB 1898 / SB 2171), which would establish transparency requirements for AI developers, including the adoption of risk mitigation strategies for AI models.
“I started out as a schoolteacher, and I’ve spent close to 50 years in public service — from the county courthouse to the state capitol,” Sen. Yager shared. “In all that time, I’ve learned that the best thing you can do is listen to the people you represent. And right now, Tennessee families are telling us loud and clear that they’re concerned about what AI is doing to their kids. When nine out of 10 voters say they want action, that’s not something I need to think twice about.”
Sen. Yager is right: Tennesseans want meaningful change. A recent survey found that 88% of Tennessee voters support legislation requiring safety and security protocols for AI to protect children. Another study showed 90% believe the state has an obligation to protect children from AI, and 67% believe the state should act now rather than wait on Congress to pass federal protections.
“As a father and as Deputy Speaker, protecting Tennessee’s children is one of my highest priorities,” Rep. Zachary explained. “We’ve already seen tragic cases where AI chatbots have contributed to the harm and death of children across the country. Tennessee families shouldn’t have to wonder whether the AI systems their children use have basic safety measures in place. This legislation is common sense.”
Thankfully, Tennessee isn’t alone in this fight. Lawmakers at the federal level are similarly pursuing legislation to keep children safe online. The KIDS Act, currently awaiting consideration by the full House, is a sweeping proposal that would:
- Create public resources for parents and educators on AI risks.
- Require chatbots to disclose to minors that they are not interacting with a real person.
- Prevent AI from presenting itself as a licensed professional, such as a doctor or therapist.
- Require chatbots to provide crisis hotline information when a minor mentions suicide or self-harm.
- Prompt minors to take breaks during extended sessions.
- Restrict minors from accessing sexual content, gambling, and other regulated activities.
Alliance for a Better Future, a new pro-family technology coalition, is planning to spend at least eight figures in 2026 supporting legislation aimed at advancing stronger safeguards for children.
“It’s time for America to choose: will the most powerful technology in history be used to harm children, enrich the powerful, and risk American lives? Or will we build it with the values that make America great?”
Watch their chilling video here:
