AI Tool Spots Breastfeeding Complications from Phone Photos for Streamlined Lactation Care


2024-07-18 University of California San Diego (UCSD)

Researchers at the University of California San Diego have developed an AI-based tool to provide faster, more accurate care to breastfeeding mothers. From photos taken with a smartphone, the tool can distinguish a healthy breast from six common lactation-related conditions. The research aims to help assess breast health and determine, based on urgency, when a mother should seek professional help. In initial experiments, the system identified healthy breasts with 84.4% accuracy and unhealthy breasts with 95% accuracy. Although further study and more data are needed, the technology could expand access to lactation care and improve clinical outcomes.

<Related Information>

Augmenting Telepostpartum Care With Vision-Based Detection of Breastfeeding-Related Conditions: Algorithm Development and Validation

Jessica De Souza; Varun Kumar Viswanath; Jessica Maria Echterhoff; Kristina Chamberlain; Edward Jay Wang
Journal of Medical Internet Research, Published: June 24, 2024
DOI: https://doi.org/10.2196/54798


Abstract

Background:
Breastfeeding benefits both the mother and infant and is a topic of attention in public health. After childbirth, untreated medical conditions or lack of support lead many mothers to discontinue breastfeeding. For instance, nipple damage and mastitis affect 80% and 20% of US mothers, respectively. Lactation consultants (LCs) help mothers with breastfeeding, providing in-person, remote, and hybrid lactation support. LCs guide, encourage, and find ways for mothers to have a better experience breastfeeding. Current telehealth services help mothers seek LCs for breastfeeding support, where images help them identify and address many issues. Due to the disproportionate ratio of LCs to mothers in need, these professionals are often overloaded and burned out.

Objective:
This study aims to investigate the effectiveness of 5 distinct convolutional neural networks in detecting healthy lactating breasts and 6 breastfeeding-related issues by only using red, green, and blue images. Our goal was to assess the applicability of this algorithm as an auxiliary resource for LCs to identify painful breast conditions quickly, better manage their patients through triage, respond promptly to patient needs, and enhance the overall experience and care for breastfeeding mothers.

Methods:
We evaluated the potential for 5 classification models to detect breastfeeding-related conditions using 1078 breast and nipple images gathered from web-based and physical educational resources. We used the convolutional neural networks Resnet50, Visual Geometry Group model with 16 layers (VGG16), InceptionV3, EfficientNetV2, and DenseNet169 to classify the images across 7 classes: healthy, abscess, mastitis, nipple blebs, dermatosis, engorgement, and nipple damage by improper feeding or misuse of breast pumps. We also evaluated the models’ ability to distinguish between healthy and unhealthy images. We present an analysis of the classification challenges, identifying image traits that may confound the detection model.
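
To make the setup above concrete, here is a minimal sketch of fine-tuning a pretrained convolutional network for the 7-class problem described in the Methods, using PyTorch/torchvision. This is not the authors' code: the dataset path, class folder names, image size, augmentation choices, and training hyperparameters are all assumptions for illustration.

# Minimal sketch (not the authors' implementation): fine-tune a pretrained
# ResNet50 on RGB breast/nipple images across 7 classes, with simple
# data augmentation. Paths, class names, and hyperparameters are assumed.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

CLASSES = ["healthy", "abscess", "mastitis", "nipple_blebs",
           "dermatosis", "engorgement", "nipple_damage"]  # assumed folder names

train_tf = transforms.Compose([
    transforms.RandomResizedCrop(224),                      # augmentation
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

train_ds = datasets.ImageFolder("data/train", transform=train_tf)  # hypothetical path
train_dl = torch.utils.data.DataLoader(train_ds, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
model.fc = nn.Linear(model.fc.in_features, len(CLASSES))    # replace classifier head

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(10):                                     # assumed epoch count
    model.train()
    for images, labels in train_dl:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

The same pattern applies to the other backbones named above (VGG16, InceptionV3, EfficientNetV2, DenseNet169) by swapping the model constructor and its final classification layer.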

Results:
The best model achieves an average area under the receiver operating characteristic curve of 0.93 for all conditions after data augmentation for multiclass classification. For binary classification, we achieved, with the best model, an average area under the curve of 0.96 for all conditions after data augmentation. Several factors contributed to the misclassification of images, including similar visual features in the conditions that precede other conditions (such as the mastitis spectrum disorder), partially covered breasts or nipples, and images depicting multiple conditions in the same breast.
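
For context on how the reported numbers are defined, a macro-averaged multiclass AUC of this kind is typically computed one-vs-rest over the 7 classes, and the binary score over a healthy-vs-unhealthy split. The sketch below shows that calculation with scikit-learn on placeholder data; it is not the authors' evaluation code.

# Sketch of macro-averaged one-vs-rest ROC AUC and a healthy-vs-unhealthy
# binary AUC (placeholder labels and scores, not the authors' evaluation code).
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = np.array([0, 1, 2, 3, 4, 5, 6, 0, 2, 4])        # placeholder labels, all 7 classes
y_score = rng.dirichlet(np.ones(7), size=len(y_true))    # placeholder softmax outputs

# Multiclass: macro-averaged one-vs-rest AUC across the 7 conditions
multiclass_auc = roc_auc_score(y_true, y_score, multi_class="ovr", average="macro")
print(f"macro one-vs-rest AUC: {multiclass_auc:.2f}")

# Binary: healthy (class 0) vs. any unhealthy condition
binary_true = (y_true != 0).astype(int)
binary_score = 1.0 - y_score[:, 0]                        # mass on the unhealthy classes
binary_auc = roc_auc_score(binary_true, binary_score)
print(f"healthy-vs-unhealthy AUC: {binary_auc:.2f}")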

Conclusions:
This vision-based automated detection technique offers an opportunity to enhance postpartum care for mothers and can potentially help alleviate the workload of LCs by expediting decision-making processes.

Medical & Health