Multi-Layer
Multi-layer architectures are a central theme in contemporary machine learning: by organizing computational units into stacked layers, models can improve both efficiency and accuracy. Current research focuses on optimizing these architectures, exploring alternatives to traditional multilayer perceptrons (MLPs) such as Kolmogorov-Arnold Networks (KANs) and Fourier Analysis Networks (FANs), and investigating techniques like layer distillation and frequency shifting to improve performance and reduce computational cost. These advances have significant implications for diverse applications, including music generation, image processing, natural language processing, and scientific computing, where they enable faster and more accurate models.
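The idea of organizing computation into stacked layers can be made concrete with a minimal MLP forward pass. The sketch below is illustrative only; the layer sizes, ReLU nonlinearity, and initialization scheme are assumptions for the example, not details from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_layer(n_in, n_out):
    """He-style initialization for one fully connected layer (illustrative choice)."""
    w = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
    b = np.zeros(n_out)
    return w, b

def relu(x):
    return np.maximum(x, 0.0)

def mlp_forward(x, layers):
    """Apply each (weight, bias) pair in sequence, with ReLU between layers."""
    for i, (w, b) in enumerate(layers):
        x = x @ w + b
        if i < len(layers) - 1:  # no activation after the final layer
            x = relu(x)
    return x

# Three stacked layers: 8 -> 16 -> 16 -> 4
layers = [init_layer(8, 16), init_layer(16, 16), init_layer(16, 4)]
x = rng.normal(size=(5, 8))   # batch of 5 input vectors
y = mlp_forward(x, layers)
print(y.shape)  # (5, 4)
```

Alternatives such as KANs replace the fixed per-layer nonlinearity with learnable univariate functions on the edges, but retain this same layered organization of computation.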
Papers
Machine Learning Optimal Ordering in Global Routing Problems in Semiconductors
Heejin Choi, Minji Lee, Chang Hyeong Lee, Jaeho Yang, Rak-Kyeong Seong
Improving Location-based Thermal Emission Side-Channel Analysis Using Iterative Transfer Learning
Tun-Chieh Lou, Chung-Che Wang, Jyh-Shing Roger Jang, Henian Li, Lang Lin, Norman Chang
Stable Diffusion is a Natural Cross-Modal Decoder for Layered AI-generated Image Compression
Ruijie Chen, Qi Mao, Zhengxue Cheng
Two Layer Walk: A Community-Aware Graph Embedding
He Yu, Jing Liu
Task-Agnostic Language Model Watermarking via High Entropy Passthrough Layers
Vaden Masrani, Mohammad Akbari, David Ming Xuan Yue, Ahmad Rezaei, Yong Zhang
Detecting LLM Hallucination Through Layer-wise Information Deficiency: Analysis of Unanswerable Questions and Ambiguous Prompts
Hazel Kim, Adel Bibi, Philip Torr, Yarin Gal
ASLoRA: Adaptive Sharing Low-Rank Adaptation Across Layers
Junyan Hu, Xue Xiao, Mengqi Zhang, Xiao Chen, Zhaochun Ren, Zhumin Chen, Pengjie Ren
ScaleOT: Privacy-utility-scalable Offsite-tuning with Dynamic LayerReplace and Selective Rank Compression
Kai Yao, Zhaorui Tan, Tiandi Ye, Lichun Li, Yuan Zhao, Wenyan Liu, Wei Wang, Jianke Zhu