Multi-Scale
Multi-scale analysis processes and interprets data across different resolutions, aiming to capture both fine detail and broader contextual information. Current research emphasizes novel architectures, such as transformers and state-space models (e.g., Mamba), that often combine multi-scale convolutional layers, attention mechanisms, and hierarchical structures to improve feature extraction and representation learning. These methods have improved performance across diverse tasks, from medical image segmentation and time series forecasting to object detection and image super-resolution.
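To make the core idea concrete, here is a minimal, illustrative sketch of multi-scale feature extraction on a 1D signal. It stands in for the learned parallel convolution branches mentioned above by using fixed moving-average kernels of several widths; the function name, kernel sizes, and test signal are all assumptions for illustration, not taken from any of the papers listed below.

```python
import numpy as np

def multi_scale_features(signal, kernel_sizes=(3, 5, 7)):
    """Stack views of the signal at several receptive-field sizes.

    Illustrative only: real multi-scale models learn the kernels of
    each parallel branch; here we fix them to simple averages so the
    effect of kernel width is easy to see.
    """
    branches = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k  # fixed averaging kernel of width k
        branches.append(np.convolve(signal, kernel, mode="same"))
    return np.stack(branches)    # shape: (num_scales, len(signal))

# A sharp spike on a slow oscillation: narrow kernels preserve the
# spike (fine detail), wide kernels emphasize the trend (context).
t = np.linspace(0, 1, 100)
x = np.sin(2 * np.pi * t)
x[50] += 5.0
feats = multi_scale_features(x)
print(feats.shape)  # (3, 100)
```

Comparing `feats[0, 50]` with `feats[2, 50]` shows the trade-off directly: the width-3 branch retains most of the spike while the width-7 branch attenuates it, which is the kind of complementary information multi-scale architectures fuse.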
Papers
LoFi: Scalable Local Image Reconstruction with Implicit Neural Representation
AmirEhsan Khorashadizadeh, Tobías I. Liaudat, Tianlin Liu, Jason D. McEwen, Ivan Dokmanić
From Electrode to Global Brain: Integrating Multi- and Cross-Scale Brain Connections and Interactions Under Cross-Subject and Within-Subject Scenarios
Chen Zhige, Qin Chengxuan
Evolving Multi-Scale Normalization for Time Series Forecasting under Distribution Shifts
Dalin Qin, Yehui Li, Weiqi Chen, Zhaoyang Zhu, Qingsong Wen, Liang Sun, Pierre Pinson, Yi Wang
Dual-Attention Frequency Fusion at Multi-Scale for Joint Segmentation and Deformable Medical Image Registration
Hongchao Zhou, Shunbo Hu