Urban Scene
Urban scene analysis focuses on computationally representing and understanding the complex visual and geometric properties of city environments, with the aim of improving applications such as autonomous driving and urban planning. Current research relies heavily on neural rendering techniques, such as 3D Gaussian splatting and Neural Radiance Fields (NeRFs), often incorporating LiDAR data and addressing challenges like reflection-noise removal, moving-object detection, and view extrapolation. These advances enable more accurate 3D reconstruction, improved object detection, and the generation of realistic synthetic data for training perception models, ultimately contributing to safer and more efficient urban systems.
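To make the core idea behind Gaussian splatting concrete, here is a minimal 2D sketch of the rendering step it is named for: anisotropic Gaussians with colors and opacities are sorted by depth and alpha-composited front-to-back into an image. This is an illustrative toy (the function name, toy scene, and 2D simplification are ours), not the projection-and-tiling pipeline used in the papers below.

```python
import numpy as np

def splat_gaussians(means, covs, colors, opacities, depths, H=32, W=32):
    """Render 2D Gaussians to an image via depth-sorted alpha compositing."""
    ys, xs = np.mgrid[0:H, 0:W]
    pix = np.stack([xs, ys], axis=-1).astype(float)   # pixel centers, (H, W, 2)
    image = np.zeros((H, W, 3))
    transmittance = np.ones((H, W))                   # how much light still passes
    for i in np.argsort(depths):                      # composite front-to-back
        d = pix - means[i]                            # offset from Gaussian center
        inv = np.linalg.inv(covs[i])
        # Gaussian falloff: exp(-0.5 * d^T Sigma^{-1} d)
        power = -0.5 * np.einsum('hwi,ij,hwj->hw', d, inv, d)
        alpha = opacities[i] * np.exp(power)
        image += (transmittance * alpha)[..., None] * colors[i]
        transmittance *= 1.0 - alpha
    return image

# Toy scene: a red Gaussian in front of a blue one.
means = np.array([[10.0, 16.0], [22.0, 16.0]])        # (x, y) centers
covs = np.array([np.eye(2) * 9.0, np.eye(2) * 9.0])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
opacities = np.array([0.8, 0.8])
depths = np.array([1.0, 2.0])
img = splat_gaussians(means, covs, colors, opacities, depths)
```

The real method additionally projects 3D Gaussians through a camera, tiles the screen for parallelism, and optimizes all Gaussian parameters by backpropagating through this compositing.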
Papers
VEGS: View Extrapolation of Urban Scenes in 3D Gaussian Splatting using Learned Priors
Sungwon Hwang, Min-Jung Kim, Taewoong Kang, Jayeon Kang, Jaegul Choo
A Radiometric Correction based Optical Modeling Approach to Removing Reflection Noise in TLS Point Clouds of Urban Scenes
Li Fang, Tianyu Li, Yanghong Lin, Shudong Zhou, Wei Yao
HGS-Mapping: Online Dense Mapping Using Hybrid Gaussian Representation in Urban Scenes
Ke Wu, Kaizhao Zhang, Zhiwei Zhang, Shanshuai Yuan, Muer Tie, Julong Wei, Zijun Xu, Jieru Zhao, Zhongxue Gan, Wenchao Ding
HO-Gaussian: Hybrid Optimization of 3D Gaussian Splatting for Urban Scenes
Zhuopeng Li, Yilin Zhang, Chenming Wu, Jianke Zhu, Liangjun Zhang