What 70 Years of Research Tells Us AI Can't Replace

I am a therapist who works in tech and AI. I believe in its power for good. Every day I see it make health information more accessible to families, reduce clinician burnout, and extend care to people who would otherwise have none. The first randomized controlled trial of an AI therapy chatbot found significant symptom reductions for depression and anxiety (Heinz et al., 2025). Some AI mental health tools are even showing signs of leading people back toward stronger real-world relationships (Slingshot AI, 2025). These are genuine contributions, and I want more of them.

But I also see writing on the wall that concerns me deeply. And my biggest fear is about what's coming for the developing brains of young children.

Not everyone understands the neuropsychobiology of child development. But a lot of people have taken Psych 101. And there is foundational research, dating back 70 years, that should be sounding alarms about why relational AI for young children is a very bad idea.

The best way I can describe it: Let's not scale the cloth monkey.

According to the 2025 Common Sense Census on media use by children zero to eight, 40 percent of children have their own tablet by age two. By age four, it's 58 percent. One in five parents already uses devices for the emotional regulation of their children — and before we parent-blame, let's not: These platforms are designed to be sticky.

Among teens, 72 percent have used an AI companion, with more than half using them regularly. One in three finds conversations with AI companions as satisfying as, or more satisfying than, those with real-life friends (Common Sense Media, 2025b). And it's moving younger: In 2025, Mattel announced a partnership with OpenAI to bring generative AI into children's toys — dolls capable of personalized, free-flowing conversation with kids. Child advocacy groups raised immediate alarms, and Mattel ultimately delayed the product indefinitely.

But it won't be the last attempt. Social media companies face increasing critique and scrutiny over negligence, exploitation, and harm toward youth. And AI companions are like social media on steroids, offering the simulation of an intimate reciprocal relationship shaped to a user’s desires. Corporations will keep trying to put that simulation in the hands of younger and younger children.

Here's why this terrifies me.

In the 1950s, psychologist Harry Harlow was breeding rhesus monkeys for cognitive research at the University of Wisconsin. To prevent disease, he separated infants from their mothers at birth and raised them in individual cages, hand-fed with bottles. Physically, these nursery-reared babies were healthier than mother-reared monkeys. By every basic medical metric, they were thriving.

But Harlow noticed something. These babies were psychologically different. They were reclusive and socially awkward, and they clung desperately to the cloth diapers lining their cages. When the diapers were removed for cleaning, the infants threw violent tantrums. This puzzled Harlow. The diapers provided no food, no survival function. Why were the babies treating them like a lifeline?

That observation led to one of psychology's most famous, albeit ethically questionable, experiments. Harlow built two surrogate "mothers": one made of bare wire that could hold a milk bottle, and one covered in soft terry cloth. He tested infant monkeys across conditions: cloth mother with food, wire mother with food, both together, each alone. The results were dramatic. Regardless of which mother provided milk, the infants spent 17 to 18 hours a day clinging to the cloth one. They visited the wire mother only to feed, then rushed back to the cloth. When frightened, they ran to the cloth surrogate. Without her, they collapsed in terror (Harlow, 1958).

Harlow had debunked "cupboard love" — the dominant idea that babies bond with whoever feeds them. Attachment was about comfort, not calories.

But let’s not scale the cloth mother. As those cloth-mothered monkeys grew up, they couldn't socialize. They engaged in self-harm and compulsive rocking, and couldn't form relationships with other monkeys. When they became mothers themselves, many were neglectful or abusive. Harlow himself later wrote that "the nourishment and contact comfort provided by the nursing cloth covered mother in infancy does not produce a normal adolescent or adult" (Harlow, 1966). The cloth mother reduced distress; she did not produce healthy development. She couldn't — because development requires a living system that attunes, responds, and co-regulates.

Why Biology Can't Be Simulated

Neuroscientist Ruth Feldman's research shows us what was missing. During face-to-face interaction, mothers and infants literally synchronize their heart rhythms, oxytocin release, and brain oscillations — a process Feldman calls "biobehavioral synchrony." It's the mechanism by which the mature brain externally regulates the infant's brain and tunes it to social living (Feldman, 2017).

And it's not limited to infancy.

Feldman's lab has documented the same synchrony across all human attachment types: parent-child, romantic partners, close friends, and even strangers. A 30-minute adult conversation triggers measurable hormonal coupling between two people (Djalovski et al., 2021). And when that interaction is technologically mediated rather than face-to-face? Inter-brain synchrony is measurably reduced (Schwartz et al., 2022).

We aren't social by preference. We're social by design. And that design runs on biological signals no algorithm can generate.

I believe AI can do enormous good. But any AI that operates in the space of human relationships should ask this question: Does it lead people toward other people, or away from them?

When it comes to young children especially, we should avoid tools that mimic relationships, engage in free-flowing conversation, claim to be human, or are designed to maximize emotional attachment and time spent. Developmentally helpful tools nudge us toward real-life relationships and the kinds of positive action known to help us thrive.

The public's deep discomfort with children forming emotional bonds with machines isn't alarmist or fringe. It's wisdom backed by science.

AI that strengthens human bonds is a tool. AI that substitutes for them — especially for developing children — is a cloth monkey at scale. And 70 years of research warns us what happens next.

We've already run this experiment. We don't need to run it again.

Common Sense Media. (2025). The Common Sense Census: Media use by kids zero to eight. San Francisco, CA: Common Sense Media.

Djalovski, A., Kinreich, S., Zagoory-Sharon, O., & Feldman, R. (2021). Social dialogue triggers biobehavioral synchrony of partners' endocrine response via sex-specific, hormone-specific, attachment-specific mechanisms. Scientific Reports, 11(1), 1–10.

Feldman, R. (2017). The neurobiology of human attachments. Trends in Cognitive Sciences, 21(2), 80–99.

Harlow, H. F. (1958). The nature of love. American Psychologist, 13(12), 673–685.

Harlow, H. F., & Suomi, S. J. (1971). Social recovery by isolation-reared monkeys. Proceedings of the National Academy of Sciences, 68(7), 1534–1538.

Heinz, M. V., Mackin, D. M., Trudeau, B. M., Bhattacharya, S., Wang, Y., Banta, H. A., ... & Jacobson, N. C. (2025). Randomized trial of a generative AI chatbot for mental health treatment. NEJM AI, 2(4), AIoa2400802.

Li, R.-N., Folk, D., Singh, A., Ungar, L., & Dunn, E. (2026). Is a random human peer better than a highly supportive chatbot in reducing loneliness over time? Journal of Experimental Social Psychology, 125, 104911.

Radesky, J. S., Kaciroti, N., Weeks, H. M., Schaller, A., & Miller, A. L. (2023). Longitudinal associations between use of mobile devices for calming and emotional reactivity and executive functioning in children aged 3 to 5 years. JAMA Pediatrics, 177(1), 62–70.

Slingshot AI. (2025). Connection, hope, and real progress: Findings from a real-world study of AI mental health support. talktoash.com/posts/connection-hope-and-real-progress

Waldinger, R., & Schulz, M. (2023). The good life: Lessons from the world's longest scientific study of happiness. Simon & Schuster.



© Psychology Today