New Machine
Research on "new machines" broadly covers the development and application of machine learning across fields, with the aim of improving efficiency, accuracy, and decision-making. Current efforts focus on refining model architectures such as convolutional neural networks, gradient boosting machines, and transformers for tasks ranging from image and signal processing to complex prediction and control problems. This work matters because it drives advances in sectors such as healthcare, energy, manufacturing, and transportation by enabling automated processes, improved diagnostics, and more efficient resource allocation.
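As a rough illustration of the kind of model architecture mentioned above, the Python sketch below defines a small convolutional neural network for image classification. It is not drawn from any of the listed papers; the layer sizes, ten-class output, and 32x32 input resolution are illustrative assumptions.

# Minimal sketch (assumptions, not from any listed paper): a small CNN
# for 10-class image classification, illustrating the kind of architecture
# the overview refers to.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # RGB input -> 16 feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),                             # halve spatial resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8, num_classes),          # assumes 32x32 input images
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = SmallCNN()
    logits = model(torch.randn(1, 3, 32, 32))  # one random 32x32 RGB image
    print(logits.shape)                        # torch.Size([1, 10])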
Papers
On the design space between molecular mechanics and machine learning force fields
Yuanqing Wang, Kenichiro Takaba, Michael S. Chen, Marcus Wieder, Yuzhi Xu, John Z. H. Zhang, Kuang Yu, Xinyan Wang, Linfeng Zhang, Daniel J. Cole, Joshua A. Rackers, Joe G. Greener, Peter Eastman, Stefano Martiniani, Mark E. Tuckerman
The Role of Large Language Models in Musicology: Are We Ready to Trust the Machines?
Pedro Ramoneda, Emilia Parada-Cabaleiro, Benno Weck, Xavier Serra
On the Undecidability of Artificial Intelligence Alignment: Machines that Halt
Gabriel Adriano de Melo, Marcos Ricardo Omena De Albuquerque Maximo, Nei Yoshihiro Soma, Paulo Andre Lima de Castro
Tell Codec What Worth Compressing: Semantically Disentangled Image Coding for Machine with LMMs
Jinming Liu, Yuntao Wei, Junyan Lin, Shengyang Zhao, Heming Sun, Zhibo Chen, Wenjun Zeng, Xin Jin