Molecule Generation
Molecule generation, the computational design of novel molecules with desired properties, aims to accelerate drug discovery and materials science. Current research draws on large language models (LLMs), graph neural networks (GNNs), diffusion models, and reinforcement learning (RL), often in hybrid approaches, to generate molecules conditioned on textual descriptions, target properties, or 3D protein structures. These methods improve the efficiency and diversity of generation while addressing challenges such as synthesizability and chemical validity, giving the resulting tools significant potential for the design of new drugs and materials.
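To make the chemical-validity requirement concrete, here is a minimal sketch, not taken from any of the papers below, of how generated molecules are commonly screened in practice. It assumes the open-source RDKit toolkit (an assumption, not mentioned in this overview), and the SMILES strings are illustrative placeholders standing in for model outputs.

# Minimal validity/property screen for generated molecules (illustrative sketch).
# Assumes RDKit is installed; the SMILES below are placeholders for model outputs.
from rdkit import Chem
from rdkit.Chem import Descriptors
from rdkit.Chem.QED import qed

generated_smiles = [
    "CC(=O)Oc1ccccc1C(=O)O",   # aspirin: a chemically valid molecule
    "C1=CC=CC=C1C(",           # malformed SMILES: should be rejected
]

for smi in generated_smiles:
    mol = Chem.MolFromSmiles(smi)  # returns None if the SMILES cannot be parsed
    if mol is None:
        print(f"invalid: {smi}")
        continue
    # Simple property readouts often used as conditioning targets or post-hoc filters
    print(f"valid:   {smi}  logP={Descriptors.MolLogP(mol):.2f}  QED={qed(mol):.2f}")

Filters like this are typically applied after generation to report validity rates, or folded into an RL reward or diffusion guidance signal when properties such as logP or QED are the optimization target.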
Papers
Improving Molecular Graph Generation with Flow Matching and Optimal Transport
Xiaoyang Hou, Tian Zhu, Milong Ren, Dongbo Bu, Xin Gao, Chunming Zhang, Shiwei Sun
Bridging the Gap between Learning and Inference for Diffusion-Based Molecule Generation
Peidong Liu, Wenbo Zhang, Xue Zhe, Jiancheng Lv, Xianggen Liu
Geometric Representation Condition Improves Equivariant Molecule Generation
Zian Li, Cai Zhou, Xiyuan Wang, Xingang Peng, Muhan Zhang
Conditional Enzyme Generation Using Protein Language Models with Adapters
Jason Yang, Aadyot Bhatnagar, Jeffrey A. Ruffolo, Ali Madani
Text-guided Diffusion Model for 3D Molecule Generation
Yanchen Luo, Junfeng Fang, Sihang Li, Zhiyuan Liu, Jiancan Wu, An Zhang, Wenjie Du, Xiang Wang