Lexical Representation
Lexical representation concerns how words and their meanings are encoded and processed, with the aim of building effective, interpretable models for natural language processing tasks. Current research emphasizes hybrid models that combine sparse lexical representations (efficient for retrieval) with dense semantic representations (capturing nuanced meaning), often built on large language models such as BERT and Llama 2 and made explainable with techniques like integrated gradients. These advances improve information retrieval, cross-modal alignment (e.g., image-text matching), and downstream applications such as emotion detection in speech.
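The sparse-dense fusion idea described above can be sketched in a few lines. The snippet below is a minimal illustration, not any particular paper's method: it scores a document with a simple term-frequency overlap (standing in for a sparse lexical signal such as BM25), a cosine similarity over embedding vectors (standing in for a dense semantic signal), and a weighted sum of the two. All function names and the `alpha` fusion weight are illustrative assumptions.

```python
from collections import Counter
import math

def sparse_score(query: str, doc: str) -> float:
    # Sparse lexical signal: dot product of bag-of-words term counts.
    # Real systems would use BM25 or a learned sparse model instead.
    q, d = Counter(query.split()), Counter(doc.split())
    return float(sum(q[t] * d[t] for t in q))

def cosine(u: list[float], v: list[float]) -> float:
    # Dense semantic signal: cosine similarity between embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_score(query, doc, q_emb, d_emb, alpha=0.5):
    # Weighted fusion: alpha balances lexical match vs. semantic match.
    return alpha * sparse_score(query, doc) + (1 - alpha) * cosine(q_emb, d_emb)
```

In practice the embeddings would come from an encoder such as BERT, and `alpha` would be tuned on a retrieval validation set; the fusion itself is often this simple a linear interpolation of the two score types.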