It’s not necessarily the guys you might expect, Apollo Knapp told me.
These are 6-foot-tall high-school athletes, guys who are social and popular. “They’re the type of people that are friends with everybody, who get dapped up in the hallway every two feet,” said Knapp, an 18-year-old high school senior in Ohio and a board member at sexual violence prevention nonprofit SafeBAE.
But at his school, these are the guys using AI to help them talk to girls. They’ll paste their texts into ChatGPT for feedback before sending, he said. Or, they’ll send their own photos to ChatGPT and ask, “am I cute?” Or, they’ll simply ask for moral support when they’re “too scared, maybe, to confront women.”
Girls and non-binary teens don’t need to lean on ChatGPT as much, Knapp said; they’re more likely to have a circle of friends ready and willing to workshop their texts. But guys are more isolated, socialized to believe it’s weak to talk about their feelings.
Worse, they’ve grown up on a steady diet of media telling them that “if you say the wrong thing” to a girl, “she’s going to accuse you of something,” Knapp said. Even if those messages aren’t accurate, they get inside teen boys’ heads, making them feel like they have to screen everything through ChatGPT to make sure it’s okay.
The drift of boys and young men away from everyone else in American society has been an enduring theme of the last few years. The fear is that guys, especially straight guys, are getting sucked into manosphere podcasts and becoming more and more alienated from the girls and women they, in theory, want to date. This is an oversimplified narrative, and there’s reason to hope that boys and men are more connected, and more interested in connection, than their most unpleasant listening material might suggest.
But in talking to teens and experts about AI and relationships, I did get the sense that boys need better outlets for their feelings than we’re giving them. And while ChatGPT might help some kids in some circumstances, teens of all genders need a more reliable support system — one that doesn’t require an electricity-guzzling data center to answer a question.
After all, Knapp said, “what’s going to happen if you don’t have power, and you have a girlfriend?”
Teens are using AI for dating. The question is how.
It’s hard to know exactly how many young people are talking to ChatGPT about relationship problems, since research on youth and AI is in its infancy. In one recent Pew survey, 57 percent of teens said they had used AI “to search for information,” while 12 percent said they’d used the tools “to get emotional support or advice.” It’s possible to imagine dating inquiries falling in either category.
Anecdotally, experts and teens alike say young people are turning to ChatGPT with everything from low-stakes questions about texting to serious concerns about what might constitute sexual assault.
Val Odiembo, 19, mentors their fellow college students about healthy relationships. As a peer educator, they’re used to getting questions like, “what do I do when my girlfriend says this?” or “is this consent?”
But recently, those questions have been tapering off. Odiembo, a nursing student and SafeBAE board member, thinks students are now asking ChatGPT instead.
“I’ve had my students say to me, ‘I asked Chat what I should say to this boy,’” Odiembo told me. When that happens, “I die a little bit inside.”
Some young people are using chatbots “to test out being flirty or being romantic or being a little bit sexy and seeing how the chatbot responds to that,” Megan Moreno, a professor of pediatrics at the University of Wisconsin-Madison who studies technology and adolescent health, told me.
That kind of experimentation may be more common among boys, who generally engage in more risky behavior online than girls, Moreno said.
Using technology to experiment with flirting and romance isn’t new. Millennial teens turned to chat rooms and AOL Instant Messenger for this purpose. This could be risky — my classmates spent a lot of time catfishing each other before the term existed — or outright dangerous if teens ended up chatting with adults.
But, as Moreno points out, at least the people you were chatting with online were real humans who could tell you to go away if you said something too gross.
Chatbots, by contrast, “are programmed to be incredibly receptive and sycophantic,” Moreno said. “Even if you say something incredibly inappropriate, the chatbot is going to respond in a way that reinforces that.”
That’s even more problematic when the subject is sexual violence. Young people are increasingly turning to chatbots after sexual encounters to ask if they might have committed assault, Drew Davis, director of strategic initiatives at SafeBAE, told me. The responses he’s seen have sometimes been unhelpful, he said, emphasizing legal defenses or providing reassurances instead of discussing accountability.
SafeBAE is developing an interactive tool that helps young people think about sexual situations that may have been confusing for them, such as those in which both parties were drinking, and connects them with resources to help them take responsibility and apologize if needed.
The goal is “giving them language, giving them tools to be able to do this, that’s not coming from AI,” Davis said. “It’s connecting them with other people.”
Why teens are going to AI in the first place
It’s possible to imagine AI pushing young people even further apart than they already are. The big question is whether kids are using AI to practice having human relationships or to replace those relationships, Moreno said. In one recent survey, one in five high-school students said they or someone they knew had been in a romantic relationship with an AI.
It’s not hard to see why teenagers (or adults, for that matter) might be drawn to a voice that always has answers but never criticizes. When talking about thorny issues like sex and consent, “I think there’s a lot of shame,” Odiembo said. Teens “feel comfortable going to AI, because AI won’t judge them.”
But some teens also see value in the inevitable challenge and friction of human relationships.
“You need to be called out occasionally,” Knapp, the Ohio senior, said. “That’s how humans evolve.”
Some experts believe that with better guardrails — like a willingness to say, “hey, don’t talk to me like that!” — AI could still be a helpful partner for teens learning to talk to each other. For example, a chatbot could be trained to help kids with social skills. Part of me wonders how much less awkward my adolescence might have been if I’d been able to workshop my jokes with a bot before taking them to the crucible of middle-school homeroom.
It’s also worth noting that AI models are constantly changing and, in some ways, improving. After I talked to the SafeBAE team, I tested ChatGPT and Google Gemini by pretending to be a teenage boy concerned he’d crossed a line with a girl. Both models did a decent job, at least in their first responses, posing follow-up questions about the situation and encouraging me to take responsibility.
But the young people I spoke with for this story don’t want better chatbots; they want to see humans get better instead. They want teachers who are better trained to discuss difficult issues like consent and assault. They want coaches and other adults who can model healthy masculinity for boys, rather than reinforcing stereotypes. And for all teens, they want supportive places to open up about feelings and relationships, some of the messiest and most important aspects of human life.
“I wish people were a little more comfortable having uncomfortable conversations,” Odiembo said.
Families continue to report disturbing conditions at the Texas immigration center where 5-year-old Liam Conejo Ramos was held, including a worm in a child’s food, water that causes rashes and stomachaches, and staff withholding medical care.
Teens and tweens want to see more depictions of “fathers enjoying parenting” and “fathers showing love to kids” in movies and TV, according to a recent UCLA survey. In this, as in all things, the answer is Bluey.
The New York Times did a deep dive into AI slop videos aimed at kids. It is unclear as yet whether endless clips of adult mammals hatching out of eggs are harmful for children, but they are certainly bizarre.
My older kid is currently obsessed with the Ham Helsing series, graphic novels about a pig who hunts vampires.
After I wrote about kids’ recent obsession with the phrase “chicken banana,” one reader wrote in to let me know about a much earlier coinage. “Perhaps it’s my age (almost 80), but as teenagers, my age group regularly heard a jingle for Chiquita Bananas,” he wrote. “We naturally corrupted Chiquita banana into ‘chicken banana.’”
“Sorry to crush the illusion of today’s uniqueness of Chicken Banana, but we ancient folks were using the term ‘chicken banana’ a l-o-n-g time ago,” he added.
As always, if you have a question or want to share a story about kids today or in the past, you can reach me at anna.north@vox.com.