Multi-Scale
Multi-scale analysis processes and interprets data across multiple resolutions, capturing both fine-grained detail and broader contextual information. Current research emphasizes novel architectures such as transformers and state-space models (e.g., Mamba), often combined with multi-scale convolutional layers, attention mechanisms, and hierarchical structures to improve feature extraction and representation learning. The approach has improved accuracy and robustness across diverse tasks, including medical image segmentation, time series forecasting, object detection, and image super-resolution.
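To make the repeated reference to multi-scale convolutional layers concrete, the sketch below shows one common pattern: parallel convolution branches with different receptive fields whose outputs are concatenated and fused. This is a generic, hypothetical PyTorch example, not code from any of the papers listed here; the class name, channel split, and kernel sizes are illustrative assumptions.

```python
# Minimal sketch (hypothetical, not from a specific paper) of a multi-scale
# convolutional block: parallel branches with different kernel sizes capture
# fine detail and wider context, and are fused by concatenation + 1x1 conv.
import torch
import torch.nn as nn


class MultiScaleConvBlock(nn.Module):
    """Extracts features at several scales in parallel and fuses them."""

    def __init__(self, in_channels: int, out_channels: int,
                 kernel_sizes=(3, 5, 7)):
        super().__init__()
        branch_channels = out_channels // len(kernel_sizes)
        self.branches = nn.ModuleList([
            nn.Sequential(
                # "same" padding keeps spatial size identical across branches
                nn.Conv2d(in_channels, branch_channels, k, padding=k // 2),
                nn.BatchNorm2d(branch_channels),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])
        # 1x1 projection mixes the concatenated branches back to out_channels
        self.project = nn.Conv2d(branch_channels * len(kernel_sizes),
                                 out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi_scale = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.project(multi_scale)


if __name__ == "__main__":
    block = MultiScaleConvBlock(in_channels=3, out_channels=48)
    features = block(torch.randn(1, 3, 64, 64))
    print(features.shape)  # torch.Size([1, 48, 64, 64])
```

In practice such blocks are stacked hierarchically or paired with attention so that later layers can weight the scale-specific features; the papers below explore variations of this idea in different domains.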
Papers
Mamba as Decision Maker: Exploring Multi-scale Sequence Modeling in Offline Reinforcement Learning
Jiahang Cao, Qiang Zhang, Ziqing Wang, Jingkai Sun, Jiaxu Wang, Hao Cheng, Yecheng Shao, Wen Zhao, Gang Han, Yijie Guo, Renjing Xu
Nutrition Estimation for Dietary Management: A Transformer Approach with Depth Sensing
Zhengyi Kwan, Wei Zhang, Zhengkui Wang, Aik Beng Ng, Simon See