Large Scale
Large-scale data processing and analysis are central to many scientific and engineering challenges, which demand efficient handling of massive datasets and complex systems. Current research emphasizes novel algorithms and model architectures, such as graph neural networks, deep learning models, and physics-guided machine learning, to improve efficiency, accuracy, and scalability across diverse applications. These advances are crucial for problems ranging from traffic optimization and robot navigation to astronomical surveys and the development of more energy-efficient AI systems. The resulting insights and tools have significant implications across many fields, enabling more effective data-driven decision-making and scientific discovery.
Papers
TnT-LLM: Text Mining at Scale with Large Language Models
Mengting Wan, Tara Safavi, Sujay Kumar Jauhar, Yujin Kim, Scott Counts, Jennifer Neville, Siddharth Suri, Chirag Shah, Ryen W. White, Longqi Yang, Reid Andersen, Georg Buscher, Dhruv Joshi, Nagu Rangan
CICLe: Conformal In-Context Learning for Large-scale Multi-Class Food Risk Classification
Korbinian Randl, John Pavlopoulos, Aron Henriksson, Tony Lindgren
Counting-Stars: A Multi-evidence, Position-aware, and Scalable Benchmark for Evaluating Long-Context Large Language Models
Mingyang Song, Mao Zheng, Xuan Luo
Sim-to-Real Grasp Detection with Global-to-Local RGB-D Adaptation
Haoxiang Ma, Ran Qin, Modi Shi, Boyang Gao, Di Huang
Large-scale Benchmarking of Metaphor-based Optimization Heuristics
Diederick Vermetten, Carola Doerr, Hao Wang, Anna V. Kononova, Thomas Bäck
Seed Optimization with Frozen Generator for Superior Zero-shot Low-light Enhancement
Yuxuan Gu, Yi Jin, Ben Wang, Zhixiang Wei, Xiaoxiao Ma, Pengyang Ling, Haoxuan Wang, Huaian Chen, Enhong Chen