Ontario police warn of AI-powered grandparent scams mimicking family voices to steal money.
Police in Ontario warn of a rise in grandparent scams that use AI voice cloning to mimic family members and fabricate urgent emergencies such as arrests or accidents.
In early January, two seniors were targeted; one nearly lost $20,000 before a friend stopped the transfer.
Scammers use brief audio clips taken from social media to clone voices, then pressure victims to send money quickly via wire transfer or cryptocurrency, often demanding secrecy.
Experts say AI has made these scams more realistic and widespread.
Authorities urge seniors to verify urgent calls by contacting the relative independently and to stay alert to red flags such as sudden urgency and requests for secrecy.