Otmane Amel

E-Origin, INFRABEL, BARISK

Biography
Passionate about using Data Science to address real-world challenges, I'm currently pursuing my Ph.D. to delve deeper into multimodal learning techniques that mirror human perception. Always eager to enhance my expertise, I actively seek opportunities to master cutting-edge tools and acquire new skills.
Thesis

Multimodal Learning: Learn from Different Modalities in an Explainable Way

ABSTRACT:

This thesis explores multimodal learning algorithms, inspired by the human brain's remarkable capacity to combine a myriad of sensory inputs. We delve into three application areas: customs fraud detection, "InfraSecure" (a railway construction safety project), and computer-assisted diagnosis of osteoporosis. Our aim is to develop efficient multimodal learning algorithms that improve on existing unimodal solutions and advance explainability in artificial intelligence. Our research highlights the importance of robust fusion methods and effective modality encoders, and identifies key challenges in the field, including the integration of diverse data sources and the need for model transparency. The study concludes with the implementation of our proposed fusion method for predicting customs classification, demonstrating the practical applicability of multimodal learning. Additionally, we identify areas that require further research, paving the way for future advances in the field.

KEYWORDS: Multimodal fusion, Representation learning, Deep learning, XAI

Advisors

Sidi Ahmed Mahmoudi

Embedded and Explainable AI

Xavier Siebert

Publications
A review and comparative study of explainable deep learning models applied on action recognition in real time

Electronics 12 (9), 2027, 2023

Multimodal Approach for Harmonized System Code Prediction

arXiv preprint arXiv:2406.04349, 2024

FuDensityNet: Fusion-Based Density-Enhanced Network for Occlusion Handling

Proceedings, 632-639, 2024
