Paper ID: 2112.06433

Generate Point Clouds with Multiscale Details from Graph-Represented Structures

Ximing Yang, Zhibo Zhang, Zhengfu He, Cheng Jin

Because most structure representations omit fine details, limited control over those details is a major weakness of structure-based controllable point cloud generation. We observe that the distinction between details and structures is subjective: details can be treated as structures at small scales. To represent structures at multiple scales simultaneously, we present a graph-based representation of structures called the Multiscale Structure Graph (MSG). Across multiple scales, similar local structure patterns recur at different scales, positions, and angles, so knowledge learned from one regional structure pattern should transfer to other similar patterns. We therefore propose an encoding and generation mechanism, the Multiscale Structure-based Point Cloud Generator (MSPCG), which simultaneously learns point cloud generation from local patterns with diverse spatial properties. The proposed method supports multiscale editing of point clouds by editing the MSG. By generating point clouds from local structures and learning across multiple scales at once, the MSPCG achieves better generalization and scalability. Trained on ShapeNet, the MSPCG can generate point clouds from a given structure for unseen categories and indoor scenes. Experimental results show that our method significantly outperforms baseline methods.
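The abstract describes the MSG only at a high level. The sketch below is a hypothetical illustration of how a multiscale structure graph could be represented as a data structure, with coarse nodes holding finer-scale children; all names and fields (MSGNode, center, scale, children, collect_nodes_at_scale) are assumptions for illustration and not the paper's actual definition.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of a Multiscale Structure Graph (MSG) node.
# Field names and layout are illustrative assumptions, not the paper's definition.
@dataclass
class MSGNode:
    center: tuple          # 3D position of this local structure (x, y, z)
    scale: float           # spatial extent of the structure at this node
    children: List["MSGNode"] = field(default_factory=list)  # finer-scale structures


def collect_nodes_at_scale(root: MSGNode, min_scale: float) -> List[MSGNode]:
    """Gather all nodes whose scale is at least min_scale.

    Editing only coarse nodes would correspond to a large-scale edit of the shape,
    while editing deeper (smaller-scale) nodes would adjust local details.
    """
    nodes = []
    stack = [root]
    while stack:
        node = stack.pop()
        if node.scale >= min_scale:
            nodes.append(node)
        stack.extend(node.children)
    return nodes


# Example: a coarse "chair seat" node with two finer-scale leg structures.
if __name__ == "__main__":
    seat = MSGNode(center=(0.0, 0.5, 0.0), scale=1.0)
    seat.children.append(MSGNode(center=(-0.4, 0.25, -0.4), scale=0.3))
    seat.children.append(MSGNode(center=(0.4, 0.25, 0.4), scale=0.3))
    print(len(collect_nodes_at_scale(seat, min_scale=0.5)))  # -> 1 (only the seat)
```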

Submitted: Dec 13, 2021