Description
Submission date: 27 March 2025
Title: Explainable AI in Myopathic Disease Diagnosis
Thesis director:
Nicoleta ROGOVSCHI (LIPADE)
Supervisor:
Patricia CONDE-CESPEDES (LISITE)
Thesis co-director:
Maria TROCAN (LISITE)
Scientific field: Information and communication sciences and technologies
CNRS theme: Artificial intelligence
Abstract: Myopathic diseases are typically diagnosed through clinical observation and manual analysis of electromyography (EMG) signals by rheumatologists and neurologists. However, this manual diagnostic process is time-consuming, prone to human error, and highly dependent on the expertise of healthcare professionals. As healthcare systems face increasing demand for more accurate and efficient diagnostic tools, artificial intelligence (AI) offers a promising solution.
Recent research has shown that integrating machine learning models into real-time diagnostic systems, particularly ones based on EMG signals, can improve workflow efficiency [1-7]. Techniques such as convolutional neural networks (CNNs) and hybrid models combining CNNs with recurrent neural networks (RNNs) have achieved high performance in classifying EMG data. CNNs are particularly effective for spatial analysis, capturing local patterns within the signal, while RNNs and long short-term memory (LSTM) networks excel at temporal sequence modeling, which is crucial for EMG signals that vary over time [8, 9]. While deep learning approaches eliminate the need for manual feature extraction, they come with challenges related to interpretability and computational complexity [10].
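As a concrete illustration of such a hybrid architecture, the minimal sketch below combines a 1D CNN front end (local spatial patterns) with an LSTM back end (temporal dynamics) for EMG classification. It is written in PyTorch; the layer sizes, the number of classes, and the 4096-sample window length are illustrative assumptions, not design decisions fixed by this project.

    import torch
    import torch.nn as nn

    class EMGHybridNet(nn.Module):
        def __init__(self, n_channels=1, n_classes=3):  # counts are assumed, not fixed
            super().__init__()
            # CNN front end: extracts local patterns from the raw EMG signal
            self.cnn = nn.Sequential(
                nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(4),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.MaxPool1d(4),
            )
            # LSTM back end: models how the extracted features evolve over time
            self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
            self.head = nn.Linear(64, n_classes)

        def forward(self, x):                    # x: (batch, channels, time)
            feats = self.cnn(x).transpose(1, 2)  # -> (batch, seq, features) for the LSTM
            _, (h_n, _) = self.lstm(feats)
            return self.head(h_n[-1])            # class logits

    model = EMGHybridNet()
    logits = model(torch.randn(8, 1, 4096))      # 8 synthetic windows of 4096 samples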
Ethical concerns, particularly regarding explainability and bias, remain unresolved in healthcare applications [1-7]. The adoption of AI in clinical settings requires transparency in how decisions are made, to ensure patient safety and provider trust. Moreover, these models often involve high computational costs and complexity, making them less viable for real-time clinical use [5]. Explainable AI (XAI) methods aim to bridge the gap between the 'black-box' nature of deep learning models and the need for transparent decision-making in clinical settings [11, 12].
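To illustrate how an XAI method can be attached to such a black-box model, the sketch below computes a simple gradient saliency map: the gradient of the winning class score with respect to the raw input indicates which samples most influenced the decision. This is only one possible XAI technique, shown here for illustration; it reuses the hypothetical EMGHybridNet defined above.

    import torch

    def gradient_saliency(model, signal):
        # signal: (1, channels, time) tensor holding one EMG window
        model.eval()
        signal = signal.clone().requires_grad_(True)
        logits = model(signal)
        cls = logits.argmax(dim=1).item()    # predicted class
        logits[0, cls].backward()            # d(class score)/d(input)
        return signal.grad.abs().squeeze(0)  # (channels, time) importance map

    window = torch.randn(1, 1, 4096)         # synthetic stand-in for a real EMG window
    importance = gradient_saliency(model, window)

Plotting the importance map under the raw signal gives clinicians a direct visual cue as to which portions of the recording drove the prediction.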
The purpose of this project is twofold: to develop an efficient, real-time diagnostic system capable of processing EMG signals and delivering immediate feedback to clinicians without compromising diagnostic accuracy, and to integrate XAI so as to enhance the interpretability of the diagnostic models and ensure that ethical standards, such as fairness and accountability, are upheld. By offering clear visual or textual explanations of the diagnostic outcomes, the system will provide clinicians with a more understandable and reliable tool for aiding medical decision-making. This transparency is key to addressing regulatory concerns and fostering broader adoption of AI technologies in healthcare.
Challenges:
• Develop an AI-based platform capable of diagnosing various myopathic diseases by analyzing EMG signals collected through sensors placed on patients' arms, thereby reducing the burden on healthcare professionals, improving diagnostic accuracy, and ensuring that patients receive timely and appropriate treatment.
• Integrate explainability mechanisms into the deep learning models, providing visual explanations for the classification of myopathic diseases (see the sketch after this list). Interpretable decisions offer clear insight into the diagnostic process and meet the clinical need for transparency and trust in AI systems, which is critical for ethical adoption in medical practice and for maintaining trust between healthcare providers and patients.
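As a sketch of the visual explanations mentioned above, the example below uses occlusion sensitivity: short segments of the EMG window are masked in turn, and the drop in the predicted class score measures each segment's relevance. The 256-sample segment length and zero masking are illustrative assumptions, and the resulting relevance curve could be displayed alongside the raw signal for clinicians. It again reuses the hypothetical EMGHybridNet from the abstract.

    import torch

    @torch.no_grad()
    def occlusion_relevance(model, signal, seg=256):
        # signal: (1, channels, time); returns one relevance score per time segment
        model.eval()
        base = model(signal)
        cls = base.argmax(dim=1).item()
        base_score = base[0, cls].item()
        drops = []
        for start in range(0, signal.shape[-1], seg):
            masked = signal.clone()
            masked[..., start:start + seg] = 0.0  # occlude one time segment
            drops.append(base_score - model(masked)[0, cls].item())
        return torch.tensor(drops)  # larger drop = more important segment

    relevance = occlusion_relevance(model, torch.randn(1, 1, 4096))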