A new national survey has found that nearly one in five high school students has either had a romantic relationship with an artificial intelligence system or knows someone who has. The research, conducted by the Center for Democracy and Technology (CDT), also revealed that 42% of students have used AI for companionship or know peers who have, highlighting the rapidly growing role of AI in the social and emotional lives of teenagers.
Key Takeaways
- Nearly 20% of high school students report personal or peer involvement in romantic AI relationships.
- 42% of students say they or someone they know has used AI for companionship.
- Increased AI use in schools is correlated with a higher likelihood of students forming personal bonds with AI.
- The study links higher school-related AI use to increased risks of data breaches, AI-generated deepfakes, and digital harassment.
- A significant gap exists in AI literacy, with only 11% of teachers trained to address the potential negative impacts of AI on student wellbeing.
 
Widespread AI Adoption in Educational Settings
The study from the Center for Democracy and Technology, a nonprofit focused on technology and civil liberties, surveyed approximately 800 teachers, 1,000 high school students, and 1,000 parents. The findings show that AI is already deeply integrated into the education system.
According to the data, a vast majority of respondents reported using AI during the last academic year. This includes 86% of students, 85% of educators, and 75% of parents. This high level of adoption sets the stage for more complex interactions between students and AI systems, extending beyond academic tasks.
The Link Between School Use and Personal AI Bonds
A key finding from the report establishes a strong correlation between the extent of AI use in schools and students' tendencies to form personal relationships with the technology. Students attending schools with high levels of AI integration for educational purposes were more likely to engage with AI on a personal level.
Defining High AI Use
The CDT report categorized "high levels of AI use" as schools where teachers employed AI for seven to ten different tasks and students used it for four to six school-related purposes. This frequent, normalized interaction appears to lower the barrier for students to see AI as more than just a tool.
Elizabeth Laird, a co-author of the report, explained the connection observed in the survey data. "The more ways that a student reports that their school uses AI, the more likely they are to report things like 'I know someone who considers AI to be a friend,' [or] 'I know someone who considers AI to be a romantic partner,'" Laird stated.
Increased Risks from Pervasive AI Integration
While educators often see benefits in AI, such as saving time and personalizing learning, the report uncovers significant risks associated with its heavy use in schools. These risks affect data security, student safety, and the school community's trust.
Data Breaches and System Failures
The survey found a notable difference in data security incidents based on AI usage levels. Among teachers who use AI frequently for school tasks, 28% reported that their school had experienced a large-scale data breach. This is significantly higher than the 18% reported by teachers who use AI for fewer tasks or not at all.
"AI systems take a lot of data, they also spit out a lot of information too," Laird commented, suggesting that the sheer volume of data processed by these systems increases the vulnerability of school networks.
Teachers in high-use environments were also more likely to report instances where an AI system failed to work as intended, potentially disrupting educational activities. Furthermore, these educators noted that the use of AI had, in some cases, damaged the trust between the school and the community.
Monitoring Software and Student Privacy
The report highlights the use of AI-powered monitoring software on school-issued devices. This software has led to false alarms and, in some cases, student arrests. This issue disproportionately impacts students from lower-income families who cannot afford personal devices and must rely on school-provided laptops, effectively trading their privacy for access to education.
A New Avenue for Harassment
The research also identifies AI as a new and powerful tool for digital harassment. The rise of AI-generated deepfakes—manipulated videos or images—has created a new vector for bullying and sexual harassment among students. "This technology is a new vector for sexual harassment and bullying, which were long-standing issues," Laird said, noting that AI can "exacerbate" existing problems.
Concerns for Student Wellbeing and Training Gaps
The study raises serious questions about the impact of these human-AI interactions on student mental and emotional health, especially given the lack of sufficient guidance and training for both students and educators.
When students engaged in personal conversations with AI for mental health support or companionship, 31% reported using a device or software provided by their school. This blurs the line between personal and school life and exposes private conversations to potential monitoring.
Laird emphasized the importance of understanding the technology's limitations. "I think students should know that they are not actually talking to a person. They are talking to a tool, and those tools have known limitations," she said. The research indicates that current AI literacy programs are often too basic to address these complex social and emotional challenges.
Lack of Educator Preparedness
The survey revealed a critical training gap for educators. Only 11% of teachers reported receiving training on how to respond if they suspect a student's use of AI is becoming detrimental to their wellbeing. This leaves teachers ill-equipped to handle the novel challenges presented by students forming deep attachments to AI systems.
While AI offers potential educational benefits, students in schools with prevalent AI use reported feeling less connected to their teachers. This suggests that over-reliance on technology could come at the cost of human connection in the classroom.
"What we hear from students is that while there may be value in this, there's also some negative consequences that are coming with it, too," Laird concluded. "And if we're going to realize the benefits of AI, we really need to pay attention to what students are telling us."
