AI Toys for Toddlers May Need Regulation, Cambridge Researchers Warn

13 Mar 2026

News Synopsis

Researchers are urging regulators to introduce stricter safeguards for artificial intelligence-powered toys designed for very young children. The warning comes after one of the first real-world studies examining how toddlers interact with AI-enabled toys revealed several potential risks.

The research team from the University of Cambridge studied interactions between children aged three to five and a soft toy named Gabbo. Their findings suggest that while AI toys are increasingly marketed to preschool-aged children, there is still very limited research on how such technology affects early childhood development.

Although several AI-enabled toys are now available for children as young as three, the researchers found that most existing studies focus on the technology rather than the children using it. In fact, the team identified only seven relevant studies globally, none of which specifically examined toddlers.

The Study: How Children Interacted With Gabbo

AI Toy Designed for Conversation and Play

The study observed how young children communicated with a cuddly toy called Gabbo, which contains a voice-enabled AI chatbot developed using technology from OpenAI.

Gabbo is designed to encourage imaginative play and conversation with preschoolers by responding to spoken prompts.

Many parents participating in the study were optimistic about the toy’s ability to help children develop communication and language skills. However, the research revealed several practical issues when children attempted to interact with the AI-powered toy.

Challenges Observed During Interactions

Researchers found that many children struggled to hold meaningful conversations with the toy.

Some common issues included:

  • Failing to recognise interruptions from children

  • Talking over the child during conversations

  • Struggling to distinguish between adult and child voices

  • Responding awkwardly to emotional expressions

These problems created confusing situations for the children during playtime.

Examples of AI Responses That Raised Concerns

Emotional Misunderstandings

One of the main concerns raised by researchers was the toy’s inability to respond appropriately to emotional statements made by children.

For example, when one five-year-old child expressed affection and said:

"I love you,"

the toy responded:

"As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed."

Similarly, when a three-year-old child expressed sadness and said:

"I'm sad,"

the toy replied:

"Don't worry! I'm a happy little bot. Let's keep the fun going. What shall we talk about next?"

Researchers warned that responses like these could potentially confuse children who are still learning about empathy and emotional communication.

Risks During a Critical Development Stage

How AI Responses May Affect Young Children

Early childhood is a crucial period for learning social cues, emotional understanding, and conversational patterns.

According to the study’s co-author Emily Goodacre, AI-powered toys could sometimes respond in ways that fail to meet a child’s emotional needs.

She said toys like Gabbo could "misread emotions or respond inappropriately", and expressed concern that "children may be left without comfort from the toy and without adult support, either".

Researchers warned that such responses might unintentionally signal to a child that their emotions are not important or are being ignored.

Call for Regulation and Psychological Safety

Researchers Urge Safeguards for AI Toys

Following the year-long observational study, the researchers recommended that regulators act quickly to ensure toys aimed at children under five meet strict safety standards.

They suggested that products designed for toddlers should provide "psychological safety", meaning the technology should not cause confusion, distress, or emotional misunderstanding.

As AI continues to enter homes and classrooms, experts say clear guidelines are needed to protect young users.

Who Makes the AI Toy Gabbo?

Company Behind the Product

Gabbo is manufactured by Curio, a technology company that has collaborated with musician Grimes, who is also known as the former partner of Elon Musk.

Responding to the study, Curio told a news agency:

"Applying AI in products for children carries a heightened responsibility, which is why our toys are built around parental permission, transparency, and control.

"Research into how children interact with AI-powered toys is a top priority for Curio this year and in the future."

The company emphasised that parental supervision and transparency remain key design principles in its products.

Children’s Commissioner Echoes Concerns

Need for Safeguards in Educational Settings

Concerns about AI in early childhood environments were also raised by Rachel de Souza, the Children's Commissioner for England.

She warned that while AI tools may offer benefits, they currently lack sufficient safeguards.

She said:

"There are plenty of good uses for AI but without proper regulation, many of the tools and models used as classroom assistants or teaching aids are not subject to the stringent safeguarding checks nursery providers would require of any other external resource they use with young children."

Her comments highlight the growing debate about how AI technologies should be introduced in educational settings.

Debate Among Nursery and Education Experts

Mixed Views on AI in Early Education

Early childhood educators remain divided on whether AI tools should be used in nurseries and preschool environments.

Concerns From Nursery Operators

June O'Sullivan, who runs a network of 42 nurseries across London, said she has not yet seen convincing evidence that AI improves early childhood learning.

She argued that young children develop better social and emotional skills through interaction with people rather than machines.

She said children need to "build a rounded set of skills" and that human interaction is more effective than AI-based tools.

Advocacy for Limiting AI in Early Childhood

Campaigners Call for Caution

Actor and children’s rights advocate Sophie Winkleman has also expressed strong concerns about the presence of AI in early education.

She believes children should develop digital skills later in life rather than during their earliest learning years.

She warned that:

"the harms can vastly outweigh the benefits"

and added that:

"The human touch for little children is sacred and something that should be really protected and fought for."

Concerns Over Unsupervised Play

Advice for Parents

Researchers also advised parents to supervise children carefully when they play with AI-powered toys.

Key recommendations include:

  • Keeping AI toys in shared family spaces

  • Supervising conversations between children and the toy

  • Reviewing privacy policies and data collection practices

Experts warn that unsupervised interactions could expose children to confusing responses or unintended data collection risks.

Conclusion

The Cambridge study highlights a growing concern about the rapid introduction of artificial intelligence into children’s toys and early learning environments. While AI-powered toys promise educational benefits and interactive experiences, the research suggests that the technology may not yet be ready to safely support young children’s emotional and social development.

As the market for AI-driven toys continues to expand, researchers and policymakers are increasingly calling for stronger regulation, clearer safety standards, and greater oversight. Ensuring that AI products designed for toddlers meet strict psychological safety requirements will be essential to protect children during one of the most important stages of their development.
