Skip Connection
Skip connections are pathways that bypass one or more intermediate layers in a neural network, and they are a crucial element in improving the performance and trainability of deep learning models: by giving features and gradients a shortcut around intermediate layers, they mitigate information loss and vanishing gradients. Current research focuses on optimizing skip-connection designs within various architectures, including U-Net and its variants, transformers, and GANs, to enhance feature propagation, address information loss, and reduce computational cost. This involves exploring different connection strategies, such as dense, multi-scale, and bidirectional skip connections, and combining them with other techniques like attention mechanisms. Improved skip-connection design leads to more accurate and efficient models across diverse applications, including medical image segmentation, image synthesis, and vision-language tasks.
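The two most common flavors of skip connection can be sketched in a few lines: the additive (ResNet-style) skip, which adds the input back to a layer's output, and the concatenation (U-Net-style) skip, which stacks encoder features alongside decoder features. The sketch below is a minimal NumPy illustration under assumed toy names (`layer`, `residual_block`, `concat_skip`); it is not drawn from any of the papers listed here.

```python
import numpy as np

def layer(x, w):
    # A toy "layer": linear map followed by ReLU.
    return np.maximum(w @ x, 0.0)

def residual_block(x, w):
    # Additive skip (ResNet-style): the input bypasses the
    # transformation and is added back to its output, so the
    # identity signal always survives the block.
    return layer(x, w) + x

def concat_skip(encoder_feat, decoder_feat):
    # Concatenation skip (U-Net-style): encoder features bypass the
    # bottleneck and are stacked with decoder features, doubling the
    # feature dimension instead of summing.
    return np.concatenate([encoder_feat, decoder_feat], axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
w = rng.standard_normal((4, 4))

y = residual_block(x, w)   # same shape as x: the skip preserves the input
z = concat_skip(x, y)      # feature dimension doubles from 4 to 8
```

Note the design trade-off the two variants embody: addition keeps the feature dimension fixed (cheap, but encoder and decoder features are merged destructively), while concatenation preserves both feature sets intact at the cost of wider subsequent layers.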
Papers
mPLUG: Effective and Efficient Vision-Language Learning by Cross-modal Skip-connections
Chenliang Li, Haiyang Xu, Junfeng Tian, Wei Wang, Ming Yan, Bin Bi, Jiabo Ye, Hehong Chen, Guohai Xu, Zheng Cao, Ji Zhang, Songfang Huang, Fei Huang, Jingren Zhou, Luo Si
UNet#: A UNet-like Redesigning Skip Connections for Medical Image Segmentation
Ledan Qian, Xiao Zhou, Yi Li, Zhongyi Hu