OpenAI's AI transcription tool, Whisper, frequently "hallucinates" in medical settings, posing potential risks.
Researchers have found that OpenAI's AI-powered transcription tool, Whisper, frequently "hallucinates" by generating false sentences, raising concerns in high-risk industries like healthcare. Despite OpenAI's warnings against its use in sensitive domains, many medical facilities have adopted it to transcribe patient consultations. Experts are calling for federal regulation to address these issues, while OpenAI acknowledges the problem and says it is working on improvements.