Cross-Device
Cross-device research focuses on leveraging data and computation across multiple devices while addressing challenges in efficiency, accuracy, and scalability. Current efforts concentrate on three fronts: robust federated learning algorithms (e.g., momentum-based variance reduction to handle non-convex losses and data heterogeneity), efficient latency prediction models for diverse hardware (using compact AST representations and domain adaptation techniques), and improved graph neural network architectures for tasks such as user matching (employing hierarchical structures and cross-attention mechanisms). These advances are central to the performance and applicability of machine learning across domains, from personalized recommendation to the characterization of quantum computing devices.
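To make the federated-learning direction concrete, here is a minimal sketch of server-side momentum variance reduction in the STORM style, applied to simulated clients with heterogeneous least-squares objectives. The function name `fed_mvr`, the client model, and all hyperparameters are illustrative assumptions, not an implementation of any specific paper's algorithm.

```python
import numpy as np

def client_grad(w, A, b):
    # Gradient of the local objective 0.5 * ||A w - b||^2 on one client.
    return A.T @ (A @ w - b)

def fed_mvr(clients, w0, rounds=200, lr=0.05, beta=0.1):
    """Hedged sketch of momentum variance reduction (STORM-style):
    d_t = g(w_t) + (1 - beta) * (d_{t-1} - g(w_{t-1})),
    where g(.) averages client gradients. `clients` is a list of (A, b) pairs.
    """
    w = w0.copy()
    # Initialize the direction estimate with the full averaged gradient.
    d = np.mean([client_grad(w, A, b) for A, b in clients], axis=0)
    for _ in range(rounds):
        w_new = w - lr * d
        g_new = np.mean([client_grad(w_new, A, b) for A, b in clients], axis=0)
        g_old = np.mean([client_grad(w, A, b) for A, b in clients], axis=0)
        # Variance-reduced update: new gradient plus a decaying correction term.
        d = g_new + (1 - beta) * (d - g_old)
        w = w_new
    return w
```

In this deterministic, full-participation toy setting the correction term stays zero and the method reduces to gradient descent; the variance-reduced estimator matters when client sampling or stochastic minibatch gradients make the averaged gradient noisy.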