Apple faces lawsuit from child abuse survivors over claims it hosted and ignored abuse images on iCloud.
Apple faces a class-action lawsuit brought on behalf of thousands of child sexual abuse survivors, accusing the company of knowingly hosting and failing to remove abuse images and videos on iCloud. The plaintiffs claim Apple had the technology to detect such content but abandoned its 2021 "CSAM Detection" program. The lawsuit seeks measures to ensure child safety, alleging that Apple's inaction has harmed victims.