Welcome to the ATTD 2023 Interactive Program


PARALLEL SESSION
Date: Thu, 23.02.2023
Room: Hall A1
Session Time: 13:00 - 14:30
Session Icon: Live Q&A

Real-world data on the use of decision support systems (ID 179)

Lecture Time: 13:00 - 13:20

IS006 - Incorporating explainability and interpretability into AI-enabled decision support systems (ID 180)

Lecture Time: 13:20 - 13:40

Abstract

Artificial intelligence (AI) and the sub-field of machine learning (ML) are yielding powerful tools that are beginning to impact the field of diabetes in a number of ways. ML algorithms are being trained to forecast glucose and to predict meal and exercise events, and are being used in decision support systems to make insulin dose recommendations. Larger data sets are now becoming available because of the ubiquity of commercial sensors, and these data sets are being used to train new ML algorithms. A challenge in the use of ML algorithms in healthcare is that the algorithms are often neither interpretable nor explainable. An algorithm with high interpretability is one that clearly indicates the cause-and-effect relationship between its inputs and outputs. An algorithm with high explainability is designed so that it is easy to understand how the algorithm works and therefore why it provides a specific forecast or recommendation. In this talk, I will discuss how we are incorporating interpretability and explainability into an AI-driven, app-based decision support tool called DailyDose that provides insulin dosing and behavioral suggestions to people with type 1 diabetes using multiple-daily-injection therapy. Specifically, I will review several decision support approaches: (1) a rule-based system, (2) a k-nearest-neighbor approach, and (3) a digital twin approach. I will discuss the strengths and weaknesses of each of these approaches as they relate to interpretability and explainability. I will present results from a recent clinical study of DailyDose showing that glucose outcomes could be improved (a 6.3% increase in time in range), but only when participants followed the recommendations provided by the app. A rule-based and an AI-based exercise decision support module within DailyDose will also be described with regard to interpretability and explainability. Finally, I will describe how the recommender engine in DailyDose compares with physician recommendations and how often the two agree.
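The k-nearest-neighbor approach mentioned in this abstract lends itself to a compact illustration of why such methods are considered interpretable: the retrieved neighbors themselves serve as the explanation for a recommendation. The sketch below is a minimal, hypothetical Python example and is not the DailyDose algorithm; the features, data, and dose adjustments are invented for illustration only.

import numpy as np

# Each historical week is summarized by hypothetical features:
# (time in range %, mean glucose mg/dL, % of readings below 70 mg/dL),
# paired with the basal adjustment (units/day) that was applied that week.
history = np.array([
    [62.0, 168.0, 2.1],
    [55.0, 181.0, 1.4],
    [71.0, 152.0, 3.8],
    [48.0, 195.0, 0.9],
    [66.0, 160.0, 5.2],
])
adjustments = np.array([+1.0, +2.0, 0.0, +2.0, -1.0])

def recommend(current_week, k=3):
    """Suggest a dose adjustment and return the neighbors that justify it."""
    distances = np.linalg.norm(history - current_week, axis=1)
    nearest = np.argsort(distances)[:k]
    suggestion = adjustments[nearest].mean()
    # The explanation is simply the most similar past weeks and what was done then.
    evidence = [(history[i].tolist(), float(adjustments[i])) for i in nearest]
    return suggestion, evidence

suggestion, evidence = recommend(np.array([58.0, 175.0, 1.8]))
print(f"Suggested basal change: {suggestion:+.1f} U/day")
for features, past_adjustment in evidence:
    print(f"  similar week {features} -> adjustment {past_adjustment:+.1f} U/day")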

IS007 - Designing an integrated, scalable decision support and coaching platform for multiple daily injection therapy (ID 181)

Lecture Time: 13:40 - 14:00

Abstract

Artificial intelligence (AI)-based decision support tools offer great promise for improving care for people with type 1 diabetes who use multiple daily injections. We designed a smartphone-app decision support tool, DailyDose, and tested it in a clinical study. The system makes insulin dose adjustment recommendations once weekly, driven by an AI-based algorithm. In the pilot clinical study, we found that some participants did not accept recommendations even when they were clinically indicated based on glucose patterns. Interviews conducted with participants at the completion of the study indicated that involving clinical diabetes care and education specialists and behavioral health experts may improve uptake of and interaction with the decision support system. However, this type of care is costly and resource intensive. To ensure scalability of the decision support system, we have designed a follow-up study in which participants who do not achieve glycemic goals with app use alone receive diabetes education and behavioral health support tailored to their needs. This approach may allow for greater scalability and effectiveness. This presentation will discuss (1) qualitative results from post-study interviews with participants, (2) incorporation of these findings into the decision support app to improve usability and explainability, and (3) development of a web portal through which diabetes educators, behavioral health providers, and diabetes providers can interact with app users. These system updates are important to ensure that people with type 1 diabetes can benefit fully from AI-based decision support systems. Lastly, the design of the next-phase multi-site clinical study of DailyDose will be presented.
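The stepped-care design described above (app-only decision support first, with education and behavioral health support added for participants who do not reach glycemic goals) can be sketched as simple escalation logic. The example below is a hypothetical illustration, not the study's actual criteria; the thresholds and summary fields are assumptions.

from dataclasses import dataclass

@dataclass
class WeeklySummary:
    time_in_range_pct: float       # % of CGM readings 70-180 mg/dL (hypothetical field)
    recommendations_followed: int  # app recommendations accepted this week
    recommendations_issued: int

def escalation_plan(summary: WeeklySummary,
                    tir_goal: float = 70.0,
                    min_adherence: float = 0.5) -> str:
    """Decide which layer of support to add next (illustrative thresholds only)."""
    if summary.time_in_range_pct >= tir_goal:
        return "continue app-only decision support"
    adherence = (summary.recommendations_followed /
                 max(summary.recommendations_issued, 1))
    if adherence < min_adherence:
        # Low uptake of recommendations suggests adding behavioral health support.
        return "add behavioral health support via the web portal"
    # Recommendations are being followed but goals are not met: add education.
    return "add diabetes care and education specialist review"

print(escalation_plan(WeeklySummary(time_in_range_pct=62.0,
                                    recommendations_followed=2,
                                    recommendations_issued=7)))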


IS008 - Patient reported outcomes in closed loop studies (ID 182)

Lecture Time: 14:00 - 14:20

Abstract

Closed-loop (CL) automated insulin delivery leads to glycemic improvements, yet findings with regard to patient-reported outcomes (PROs) are mixed. PROs refer to the subjective experience of the person using CL and often include topics such as quality of life, satisfaction, and diabetes distress. Common methods for obtaining PROs are validated surveys and structured interviews or focus groups. This presentation covers the results from CL studies and real-world publications with regard to PROs, why the findings are mixed (e.g., some studies show PRO improvements while others show no change), and how we can improve methods for PRO data collection in clinics and future studies.


Q&A (ID 183)

Lecture Time: 14:20 - 14:30