AI voice scams impersonating loved ones are on the rise, with a quarter of people reporting they have encountered one and most victims losing money.
As of October 2025, AI-powered voice-cloning scams are escalating: fraudsters need only a few seconds of audio to mimic a loved one in urgent, late-night calls claiming a crisis such as an accident or an arrest.
A McAfee survey found that 25% of people have encountered such a scam and that 77% of victims lost money, often thousands of dollars, while 70% could not tell the cloned voice from the real one.
Experts urge families to establish a private code word and to use location-sharing tools such as Find My or Life360, emphasizing that preparation and verification are the best defenses against this emotionally manipulative, AI-driven fraud.