AI toys may harm children's development, study finds, prompting calls for stricter rules.
A University of Cambridge study warns that AI-powered toys like Gabbo may harm young children’s social and emotional development, as they often misinterpret emotions, ignore children, and give inappropriate responses—such as dismissing sadness or replying to love with robotic guidelines.
Observing 14 children, the researchers found that the toys failed at pretend play, struggled with children's speech and accents, and lacked emotional intelligence.
专家们敦促制定更严格的条例、透明的隐私政策、安全标签和限制情感联系,以保护处于关键发育阶段的儿童,尤其是在缺乏监督和研究的情况下。
Experts urge stricter regulations, transparent privacy policies, safety labeling, and limits on emotional bonding to protect children during critical developmental years, especially amid a lack of oversight and research.