Data Standardization
Data standardization aims to produce consistent, comparable datasets from diverse sources, improving the reliability and efficiency of data analysis and machine learning. Current research relies heavily on large language models (LLMs) to automate this process, with applications such as standardizing clinical data for AI, sensor data for IoT positioning, and text data for tasks including scientific paper review and ad generation. This work underpins more robust AI systems, sounder data analysis, and more efficient workflows across scientific and industrial domains.
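The core operation, whether performed by an LLM or by hand-written rules, is mapping heterogeneous sources onto one shared schema with consistent names, units, and formats. The sketch below illustrates this with pandas on two hypothetical clinical-style tables; the table names (site_a, site_b), column names, and conversion factor are illustrative assumptions, not drawn from any of the papers referenced above.

```python
import pandas as pd

# Two hypothetical source tables with inconsistent column names, units,
# and date formats (illustrative data only).
site_a = pd.DataFrame({
    "patient_id": [1, 2],
    "weight_kg": [70.0, 82.5],
    "visit_date": ["2024-03-18", "2024-05-03"],
})
site_b = pd.DataFrame({
    "PatientID": [3, 4],
    "weight_lb": [154.0, 199.0],
    "VisitDate": ["03/27/2024", "08/22/2024"],
})

def standardize_site_a(df: pd.DataFrame) -> pd.DataFrame:
    """Site A already matches the target schema; only the dates need parsing."""
    out = df.copy()
    out["visit_date"] = pd.to_datetime(out["visit_date"], format="%Y-%m-%d")
    return out[["patient_id", "weight_kg", "visit_date"]]

def standardize_site_b(df: pd.DataFrame) -> pd.DataFrame:
    """Site B needs column renames, a pounds-to-kilograms conversion, and date parsing."""
    out = df.rename(columns={"PatientID": "patient_id", "VisitDate": "visit_date"})
    out["weight_kg"] = out.pop("weight_lb") * 0.453592  # lb -> kg
    out["visit_date"] = pd.to_datetime(out["visit_date"], format="%m/%d/%Y")
    return out[["patient_id", "weight_kg", "visit_date"]]

# One consistent, comparable table suitable for downstream analysis or model training.
combined = pd.concat([standardize_site_a(site_a), standardize_site_b(site_b)],
                     ignore_index=True)
print(combined)
```

In the LLM-driven approaches described above, the model's role would be to propose such per-source mappings (renames, unit conversions, format normalizations) automatically, rather than having a developer write them by hand as in this sketch.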