Remote Sensing
Remote sensing uses satellite and aerial imagery to analyze Earth's surface, extracting information for applications such as environmental monitoring and urban planning. Current research emphasizes deep learning techniques, particularly transformer-based architectures and masked autoencoders, to improve the accuracy and efficiency of tasks such as semantic segmentation, object detection, and image-text retrieval. These advances are crucial for deepening our understanding of Earth's systems and informing decision-making in areas ranging from climate change mitigation to resource management. The field also shows growing interest in multimodal fusion, few-shot learning, and explainable AI to address challenges such as data scarcity and limited model interpretability.
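As a rough illustration of the masked-autoencoder pretraining idea mentioned above, the sketch below (PyTorch; hypothetical names such as `MAESketch` and `patchify`, not taken from any of the listed papers) masks most image patches, encodes only the visible ones with a small transformer, and reconstructs the pixels of the masked patches. Positional embeddings and other details are omitted for brevity, so this is a minimal sketch rather than a faithful implementation of any specific method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def patchify(imgs, patch=16):
    """(B, C, H, W) -> (B, N, C*patch*patch) flattened non-overlapping patches."""
    B, C, H, W = imgs.shape
    h, w = H // patch, W // patch
    x = imgs.reshape(B, C, h, patch, w, patch)
    return x.permute(0, 2, 4, 1, 3, 5).reshape(B, h * w, C * patch * patch)

class MAESketch(nn.Module):
    """Toy masked-autoencoder: encode visible patches, reconstruct masked ones."""
    def __init__(self, patch_dim, dim=256, mask_ratio=0.75):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(patch_dim, dim)
        self.mask_token = nn.Parameter(torch.zeros(1, 1, dim))
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        dec_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=8, batch_first=True)
        self.decoder = nn.TransformerEncoder(dec_layer, num_layers=2)
        self.head = nn.Linear(dim, patch_dim)  # predict raw pixel values per patch

    def forward(self, patches):
        B, N, D = patches.shape
        n_keep = int(N * (1 - self.mask_ratio))
        # Random per-sample shuffle; the first n_keep indices stay visible.
        shuffle = torch.rand(B, N, device=patches.device).argsort(dim=1)
        keep, masked = shuffle[:, :n_keep], shuffle[:, n_keep:]
        visible = torch.gather(patches, 1, keep.unsqueeze(-1).expand(-1, -1, D))
        # Encode only the visible patches (the efficiency win of MAE-style pretraining).
        latent = self.encoder(self.embed(visible))
        # Decoder sees encoded visible tokens plus learned mask tokens
        # (positional embeddings omitted in this sketch).
        mask_tokens = self.mask_token.expand(B, N - n_keep, -1)
        pred = self.head(self.decoder(torch.cat([latent, mask_tokens], dim=1)))[:, n_keep:]
        # Reconstruction loss is computed on the masked patches only.
        target = torch.gather(patches, 1, masked.unsqueeze(-1).expand(-1, -1, D))
        return F.mse_loss(pred, target)

# Usage on a dummy 4-band tile (e.g., RGB + near-infrared):
imgs = torch.randn(2, 4, 128, 128)
patches = patchify(imgs)                        # (2, 64, 1024)
loss = MAESketch(patch_dim=patches.shape[-1])(patches)
```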
Papers
STARS: Sensor-agnostic Transformer Architecture for Remote Sensing
Ethan King, Jaime Rodriguez, Diego Llanes, Timothy Doster, Tegan Emerson, James Koch
Joint-Optimized Unsupervised Adversarial Domain Adaptation in Remote Sensing Segmentation with Prompted Foundation Model
Shuchang Lyu, Qi Zhao, Guangliang Cheng, Yiwei He, Zheng Zhou, Guangbiao Wang, Zhenwei Shi