Artificial Intelligence Native
"AI-native" systems aim to integrate artificial intelligence directly into the core architecture of various technologies, moving beyond simply using AI as a tool. Current research focuses on applying large language models (LLMs), large multi-modal models (LMMs), and other generative AI models to tasks like network orchestration, autonomous programming, and resource management, often within a cloud-edge-end collaborative framework. This approach promises significant improvements in efficiency, adaptability, and personalization across diverse applications, particularly in 6G networks and beyond, impacting both the design of future systems and the development of novel AI algorithms.