Paper ID: 2503.15420 • Published Mar 19, 2025
LIFT: Latent Implicit Functions for Task- and Data-Agnostic Encoding
Amirhossein Kazerouni, Soroush Mehraban, Michael Brudno, Babak Taati
University of Toronto, Vector Institute, University Health Network
Conventional deep learning models are typically modality-dependent, often requiring custom architectures and objectives for different types of signals. Implicit Neural Representations (INRs) are proving to be a powerful paradigm for unifying task modeling across diverse data domains, offering key advantages such as memory efficiency and resolution independence. However, existing INR frameworks frequently rely on global latent vectors or exhibit computational inefficiencies that limit their broader applicability. We introduce LIFT, a novel, high-performance framework that addresses these challenges by capturing multiscale information through meta-learning. LIFT leverages multiple parallel localized implicit functions alongside a hierarchical latent generator to produce unified latent representations that span local, intermediate, and global features. This architecture facilitates smooth transitions across local regions, enhancing expressivity while maintaining inference efficiency. We also introduce ReLIFT, an enhanced variant of LIFT that incorporates residual connections and expressive frequency encodings. With this straightforward approach, ReLIFT addresses the convergence-capacity gap observed in comparable methods, improving capacity and speeding up convergence. Empirical results show that LIFT achieves state-of-the-art (SOTA) performance in generative modeling and classification tasks, with notable reductions in computational cost. Moreover, in single-task settings, the streamlined ReLIFT architecture proves effective for signal representation and inverse-problem tasks.
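The architecture described above (parallel localized implicit functions conditioned by a hierarchical latent generator) can be pictured with a minimal sketch. Everything below is an illustrative assumption rather than the authors' implementation: the class names (LIFTSketch, LocalImplicitFunction), the tiling of coordinate space along one axis, and all layer sizes are invented for exposition, and the meta-learning loop is omitted.

```python
# Minimal sketch of the LIFT idea from the abstract (assumed details, not the authors' code).
import torch
import torch.nn as nn


class LocalImplicitFunction(nn.Module):
    """A small coordinate MLP responsible for one local region (tile) of the signal."""

    def __init__(self, coord_dim=2, latent_dim=64, hidden=128, out_dim=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim + latent_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim),
        )

    def forward(self, coords, latent):
        # coords: (N, coord_dim); latent: (latent_dim,) shared by all coords in the tile.
        latent = latent.unsqueeze(0).expand(coords.shape[0], -1)
        return self.net(torch.cat([coords, latent], dim=-1))


class LIFTSketch(nn.Module):
    """Parallel localized implicit functions driven by a hierarchical latent generator."""

    def __init__(self, num_tiles=4, coord_dim=2, latent_dim=64, out_dim=3):
        super().__init__()
        self.num_tiles = num_tiles
        self.out_dim = out_dim
        # Global latent for one signal; in a meta-learning setup it would be adapted per instance.
        self.global_latent = nn.Parameter(torch.zeros(latent_dim))
        # Hierarchical latent generator: global -> intermediate -> per-tile local latents.
        self.to_intermediate = nn.Linear(latent_dim, latent_dim)
        self.to_local = nn.Linear(latent_dim, num_tiles * latent_dim)
        self.local_fns = nn.ModuleList(
            [LocalImplicitFunction(coord_dim, latent_dim, out_dim=out_dim)
             for _ in range(num_tiles)]
        )

    def forward(self, coords):
        # coords lie in [0, 1)^coord_dim; tiles partition the first coordinate axis.
        inter = torch.relu(self.to_intermediate(self.global_latent))
        local_latents = self.to_local(inter).view(self.num_tiles, -1)
        tile_idx = (coords[:, 0] * self.num_tiles).long().clamp_(0, self.num_tiles - 1)
        out = coords.new_zeros(coords.shape[0], self.out_dim)
        for t in range(self.num_tiles):
            mask = tile_idx == t
            if mask.any():
                out[mask] = self.local_fns[t](coords[mask], local_latents[t])
        return out


model = LIFTSketch()
coords = torch.rand(1024, 2)   # query coordinates in [0, 1)^2
values = model(coords)         # predicted signal values, shape (1024, 3)
```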
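Likewise, the two ReLIFT ingredients named in the abstract, expressive frequency encodings and residual connections, can be sketched under assumed details: a Fourier-style positional encoding and simple residual MLP blocks. The encoding, widths, and depth below are illustrative guesses, not the paper's configuration.

```python
# Rough sketch of the ReLIFT additions (frequency encoding + residual connections), assumed details.
import math
import torch
import torch.nn as nn


def frequency_encode(coords, num_freqs=6):
    """Map coordinates to [x, sin(2^k * pi * x), cos(2^k * pi * x)] features."""
    feats = [coords]
    for k in range(num_freqs):
        feats.append(torch.sin((2.0 ** k) * math.pi * coords))
        feats.append(torch.cos((2.0 ** k) * math.pi * coords))
    return torch.cat(feats, dim=-1)


class ResidualBlock(nn.Module):
    def __init__(self, width):
        super().__init__()
        self.fc1 = nn.Linear(width, width)
        self.fc2 = nn.Linear(width, width)

    def forward(self, x):
        # Skip connection: intended to ease optimization and narrow the convergence-capacity gap.
        return x + self.fc2(torch.relu(self.fc1(torch.relu(x))))


class ReLIFTSketch(nn.Module):
    def __init__(self, coord_dim=2, num_freqs=6, width=128, out_dim=3, depth=3):
        super().__init__()
        in_dim = coord_dim * (2 * num_freqs + 1)
        self.num_freqs = num_freqs
        self.stem = nn.Linear(in_dim, width)
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(depth)])
        self.head = nn.Linear(width, out_dim)

    def forward(self, coords):
        x = torch.relu(self.stem(frequency_encode(coords, self.num_freqs)))
        return self.head(self.blocks(x))
```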