Would you trust an AI doctor? New research shows patients are split


2023-05-19 University of Arizona

◆According to a University of Arizona study, more than half of people do not fully trust AI-generated medical advice, yet many are willing to trust AI when human oversight and guidance are in place.
◆AI-driven options for diagnosis and treatment are expanding and could improve diagnostic accuracy, but about 52% of study participants said they would choose a human doctor over AI for diagnosis and treatment. Participants became more receptive to AI when it was endorsed by a human physician.
◆Future research is needed on the best ways to incorporate AI into clinical practice and on how patients make these decisions.

<Related information>

Diverse patients’ attitudes towards Artificial Intelligence (AI) in diagnosis

Christopher Robertson, Andrew Woods, Kelly Bergstrand, Jess Findley, Cayley Balser, Marvin J. Slepian
PLOS Digital Health
Published: May 19, 2023
DOI: https://doi.org/10.1371/journal.pdig.0000237

Abstract

Artificial intelligence (AI) has the potential to improve diagnostic accuracy. Yet people are often reluctant to trust automated systems, and some patient populations may be particularly distrusting. We sought to determine how diverse patient populations feel about the use of AI diagnostic tools, and whether framing and informing the choice affects uptake. To construct and pretest our materials, we conducted structured interviews with a diverse set of actual patients. We then conducted a pre-registered (osf.io/9y26x), randomized, blinded survey experiment in factorial design. A survey firm provided n = 2675 responses, oversampling minoritized populations. Clinical vignettes were randomly manipulated in eight variables with two levels each: disease severity (leukemia versus sleep apnea), whether AI is proven more accurate than human specialists, whether the AI clinic is personalized to the patient through listening and/or tailoring, whether the AI clinic avoids racial and/or financial biases, whether the Primary Care Physician (PCP) promises to explain and incorporate the advice, and whether the PCP nudges the patient towards AI as the established, recommended, and easy choice. Our main outcome measure was selection of AI clinic or human physician specialist clinic (binary, “AI uptake”). We found that with weighting representative to the U.S. population, respondents were almost evenly split (52.9% chose human doctor and 47.1% chose AI clinic). In unweighted experimental contrasts of respondents who met pre-registered criteria for engagement, a PCP’s explanation that AI has proven superior accuracy increased uptake (OR = 1.48, CI 1.24–1.77, p < .001), as did a PCP’s nudge towards AI as the established choice (OR = 1.25, CI: 1.05–1.50, p = .013), as did reassurance that the AI clinic had trained counselors to listen to the patient’s unique perspectives (OR = 1.27, CI: 1.07–1.52, p = .008). Disease severity (leukemia versus sleep apnea) and other manipulations did not affect AI uptake significantly. Compared to White respondents, Black respondents selected AI less often (OR = .73, CI: .55-.96, p = .023) and Native Americans selected it more often (OR: 1.37, CI: 1.01–1.87, p = .041). Older respondents were less likely to choose AI (OR: .99, CI: .987-.999, p = .03), as were those who identified as politically conservative (OR: .65, CI: .52-.81, p < .001) or viewed religion as important (OR: .64, CI: .52-.77, p < .001). For each unit increase in education, the odds are 1.10 greater for selecting an AI provider (OR: 1.10, CI: 1.03–1.18, p = .004). While many patients appear resistant to the use of AI, accuracy information, nudges and a listening patient experience may help increase acceptance. To ensure that the benefits of AI are secured in clinical practice, future research on best methods of physician incorporation and patient decision making is required.
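The abstract reports its effects as odds ratios with 95% confidence intervals, which is consistent with a logistic regression on the binary AI-uptake outcome. The sketch below illustrates how such estimates could be produced from survey data of this shape; the data file, column names, and exact model specification are assumptions for illustration and are not taken from the study's published materials.

```python
# Hypothetical sketch: estimating odds ratios like those reported in the
# abstract. The data file and column names are assumptions, not part of
# the published study.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per respondent, binary outcome chose_ai
# (1 = chose AI clinic, 0 = chose human specialist clinic), plus the
# two-level experimental factors and demographic covariates.
df = pd.read_csv("survey_responses.csv")  # hypothetical file

# Logistic regression of AI uptake on manipulated factors and demographics;
# exponentiated coefficients are odds ratios (OR).
model = smf.logit(
    "chose_ai ~ accuracy_proven + nudge_established + listening"
    " + severity_leukemia + C(race) + age + education",
    data=df,
).fit()

# Odds ratios with 95% confidence intervals, in the form the abstract
# reports (e.g. OR = 1.48 for the proven-accuracy manipulation).
odds_ratios = np.exp(model.params)
conf_int = np.exp(model.conf_int())
print(pd.concat([odds_ratios, conf_int], axis=1))
```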


Medicine & Health