Multi-Modal Remote Sensing

Multi-modal remote sensing integrates data from diverse sources, such as optical, radar, and infrared sensors, to improve the accuracy and scope of Earth observation. Current research emphasizes advanced deep learning models, including large language models and federated learning approaches, that fuse these heterogeneous data types while addressing challenges such as missing modalities and the computational constraints of on-board satellite processing. The field underpins applications ranging from sustainable development monitoring (e.g., poverty estimation) to precision agriculture and disaster response by providing more comprehensive and reliable information about our planet. Building large multi-modal datasets to train and evaluate these increasingly sophisticated models is another key focus.
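
As an illustration of the fusion problem described above, the sketch below shows one common pattern: a separate encoder per modality whose embeddings are pooled, so a prediction can still be produced when one sensor's data is missing. This is a minimal sketch assuming PyTorch; the class names, channel counts, and dimensions are illustrative assumptions, not taken from any particular paper in this collection.

```python
# Minimal late-fusion sketch for heterogeneous remote sensing modalities
# (optical, SAR, thermal). All names and sizes are illustrative assumptions.
import torch
import torch.nn as nn


def make_encoder(in_channels: int, embed_dim: int) -> nn.Module:
    """Small CNN that maps one modality to a fixed-length embedding."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1),
        nn.ReLU(),
        nn.AdaptiveAvgPool2d(1),   # global average pooling -> (B, 64, 1, 1)
        nn.Flatten(),
        nn.Linear(64, embed_dim),
    )


class LateFusionClassifier(nn.Module):
    """Encodes each supplied modality separately, then averages the embeddings
    of the modalities that are present, so missing sensors are simply skipped."""

    def __init__(self, embed_dim: int = 128, num_classes: int = 10):
        super().__init__()
        # Assumed channel counts: optical RGB = 3, SAR (VV/VH) = 2, thermal = 1.
        self.encoders = nn.ModuleDict({
            "optical": make_encoder(3, embed_dim),
            "sar": make_encoder(2, embed_dim),
            "thermal": make_encoder(1, embed_dim),
        })
        self.head = nn.Linear(embed_dim, num_classes)

    def forward(self, inputs: dict) -> torch.Tensor:
        # Encode whichever modalities were provided for this batch.
        embeddings = [self.encoders[name](x) for name, x in inputs.items()]
        fused = torch.stack(embeddings, dim=0).mean(dim=0)  # average fusion
        return self.head(fused)


if __name__ == "__main__":
    model = LateFusionClassifier()
    batch = {
        "optical": torch.randn(4, 3, 64, 64),
        "sar": torch.randn(4, 2, 64, 64),
        # "thermal" omitted to mimic a missing sensor
    }
    print(model(batch).shape)  # torch.Size([4, 10])
```

Averaging only the embeddings that are present is the simplest way to tolerate missing modalities; the papers surveyed here typically replace this step with learned attention or cross-modal transformer fusion.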

Papers