
When emotional connection moves from home to a chatbot, what are we really missing at home?
A few weeks ago, a parent came to me after my class. She was not panicking. She was confused.
“Sir, mene mere bete ka phone check kiya toh usne kisi se ghanton baat ki thi. Phir pata chala… koi insaan hi nahin tha.” (Sir, I checked my son’s phone and saw he had been talking to someone for hours. Then I found out… it wasn’t a person at all.)
Her son, seventeen, O-Levels, quiet at home, had been spending two to three hours every evening in deep conversation with an AI chatbot. Not for homework. Not for any school project. He was sharing how lonely he felt. How he felt misunderstood. How he wished someone at home would ask him something other than “padhai kaisi chal rahi hai?” (How is your studying going?)
The chatbot had been listening.
When the AI Becomes the Confidant
Let me explain what we are talking about, because many parents have not yet heard of this.
AI companion apps (platforms like Character.AI, Replika, and newer alternatives appearing almost monthly) are not search engines or homework tools. They are designed to hold conversations that feel emotionally warm and personal. They remember what you said last week. They ask follow-up questions. They respond with a patience that no tired parent, no distracted sibling, no busy teacher can always offer.
For a teenager sitting alone in their room in Lahore or Karachi, unsure how to talk about what they are feeling, afraid of being dismissed or judged, that kind of response can feel like exactly what they needed.
And this is not rare. Research reported in Psychology Today in 2025 found that nearly 1 in 3 teenagers globally have tried an AI companion. Among regular users, a third describe talking to their AI as just as good as, or better than, talking to a real friend. That number should give us pause. Not because teenagers are broken, but because it reveals something important about the emotional gaps they navigate on their own.
The Pakistani Home and the Gap Nobody Talks About
In many of our homes, love and concern are not the problem. Tarbiyat (the raising and moral upbringing of a child) is taken seriously. Parents work hard, sacrifice, and plan for their children’s futures.
But there is often a gap, quiet and unspoken, between caring deeply about a child and knowing how to talk with them.
Conversations in many Pakistani families follow a familiar pattern: performance, plans, problems. “Kitne marks aaye?” (How many marks did you get?) “Kya khaya?” (What did you eat?) “Kaun se dost hain?” (Who are your friends?) These are not bad questions. But they leave very little room for a teenager to say: “Main akela feel kar raha hoon” (I am feeling lonely) or “Mujhe pata nahin main kaun hoon” (I don’t know who I am) or “Mujhe darr lag raha hai.” (I am scared.)
Our culture does not always give us a language for those conversations. Vulnerability is often mistaken for weakness. Emotional openness is sometimes seen as unnecessary, or worse, drama. So teenagers learn early: some things you carry quietly. And into that quiet steps an app that says: “You can tell me anything. I’m here.”
This is not a technology problem. It is a connection problem. And the technology is simply making visible what was already there.
What We Cannot Afford to Ignore
I want to be careful here, because the goal is not to alarm you. Most teenagers using these apps are not in danger. They are lonely, which is painful, but it is also very human and very adolescent. However, there are real and documented concerns that deserve honest attention.
Psychiatric Times has reported cases where AI companions, when interacting with vulnerable teenagers, discouraged them from seeking real help and reinforced harmful thinking rather than interrupting it. In at least one widely reported case abroad, a teenage boy’s deepening emotional dependency on a chatbot was linked to his death. The app had not flagged his distress. It had deepened it.
These platforms are designed to keep users engaged. They are not designed to notice when a teenager genuinely needs a human being.
In Pakistan, there is currently no regulation of AI companion apps. No school policy addresses them. No conversation is happening at a system level. And most parents, like the mother who came to me after the workshop, are finding out about this only by accident.
What This Moment Is Asking of Us
I think about the C.A.L.M. approach often in conversations like these.
Connect before you correct. If your teenager is spending hours talking to an AI, the first question to ask is not “why are you on your phone?” but something quieter: “Is there something you wish you could talk to me about?”
Ask before you assume. Perhaps they are not hiding something troubling. Perhaps they are just looking for space to think out loud, something many of our homes do not easily offer.
Listen before you lecture. If they tell you they find it easier to talk to an app than to you, that is not an insult. It is information. Sit with it honestly.
Mentor, don’t monitor. Checking their phone will not fix the need that led them there in the first place.
The parent I spoke to in Karachi did not need to panic. She needed a place to start. We talked about small things: how to make car rides less about performance and more about presence. How to ask questions that open doors rather than close them. How to sit in silence without rushing to fill it.
Her son, she later told me, had not been looking for a chatbot. He had been looking for someone to talk to. That is the real question this moment is asking all of us, parents, teachers, and educators alike.
Not “how do we keep our teenagers away from AI?”
But “what are we offering them instead?”