NR-IQA
No-Reference Image Quality Assessment (NR-IQA) aims to automatically evaluate the perceived quality of an image without needing a pristine reference image, a crucial task for various applications like photo curation and mobile image processing. Current research focuses on developing lightweight, efficient deep learning models—often employing transformer, convolutional neural network, or state-space model architectures—that accurately predict human judgments of image quality, even on high-resolution images and mobile devices. This field is vital for improving image processing algorithms, optimizing image compression techniques, and enhancing user experience in applications dealing with large numbers of images.
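While current research centers on learned models, the core idea of scoring quality without a reference can be illustrated with a classical hand-crafted metric. The sketch below (illustrative only, not any of the methods listed here) uses the variance of the Laplacian response as a simple no-reference sharpness score: a blurred copy of an image should score lower than the original. Function names and the synthetic test image are assumptions for the example.

```python
import numpy as np

def laplacian_variance(img: np.ndarray) -> float:
    """Classic no-reference sharpness score: variance of the Laplacian.
    Higher values indicate a sharper image (often higher perceived quality)."""
    # 3x3 Laplacian computed on interior pixels via shifted-array arithmetic.
    lap = (img[:-2, 1:-1] + img[2:, 1:-1] + img[1:-1, :-2] + img[1:-1, 2:]
           - 4.0 * img[1:-1, 1:-1])
    return float(lap.var())

def box_blur(img: np.ndarray, k: int = 5) -> np.ndarray:
    """Naive k x k box blur, used here only to create a degraded test image."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

rng = np.random.default_rng(0)
sharp = rng.random((64, 64))   # synthetic high-frequency image
blurred = box_blur(sharp)      # degraded copy

# The degraded image receives a lower no-reference quality score.
print(laplacian_variance(sharp) > laplacian_variance(blurred))  # True
```

Deep NR-IQA models replace such hand-crafted statistics with learned features regressed against human opinion scores, but the interface is the same: a single image in, a scalar quality estimate out.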
Papers
MSLIQA: Enhancing Learning Representations for Image Quality Assessment through Multi-Scale Learning
Nasim Jamshidi Avanaki, Abhijay Ghildiyal, Nabajeet Barman, Saman Zadtootaghaj
A Deep-Learning-Based Label-free No-Reference Image Quality Assessment Metric: Application in Sodium MRI Denoising
Shuaiyu Yuan, Tristan Whitmarsh, Dimitri A Kessler, Otso Arponen, Mary A McLean, Gabrielle Baxter, Frank Riemer, Aneurin J Kennerley, William J Brackenbury, Fiona J Gilbert, Joshua D Kaggie
DSL-FIQA: Assessing Facial Image Quality via Dual-Set Degradation Learning and Landmark-Guided Transformer
Wei-Ting Chen, Gurunandan Krishnan, Qiang Gao, Sy-Yen Kuo, Sizhuo Ma, Jian Wang
Q-Mamba: On First Exploration of Vision Mamba for Image Quality Assessment
Fengbin Guan, Xin Li, Zihao Yu, Yiting Lu, Zhibo Chen