Dimensional Embeddings
Dimensional embeddings reduce the dimensionality of high-dimensional data while preserving its essential structure, with the primary goals of improving computational efficiency and making the data easier to analyze and visualize. Current research focuses on developing novel algorithms and model architectures, such as those based on transformers, variational autoencoders, and contrastive learning, to create embeddings that accurately capture both local and global data structure, often preserving hierarchical relationships or specific geometric properties such as angles or densities. This work is significant because efficient, informative embeddings are essential for handling large datasets in many fields, enabling improved performance in tasks such as recommendation systems, knowledge graph analysis, and image processing.
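To make the idea concrete, here is a minimal sketch of dimensionality reduction using PCA implemented with NumPy's SVD. This is an illustrative toy example only; the papers below study more sophisticated embedding and evaluation techniques, and the data here is randomly generated.

```python
import numpy as np

# Toy example: embed 100 five-dimensional points into 2 dimensions with PCA,
# one of the simplest dimensionality-reduction methods.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # 100 points in 5 dimensions

X_centered = X - X.mean(axis=0)    # PCA requires mean-centered data
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

# Project onto the top-2 principal axes (directions of greatest variance).
embedding = X_centered @ Vt[:2].T

print(embedding.shape)             # (100, 2)
```

Methods like those surveyed above go beyond this linear projection, but the goal is the same: a low-dimensional representation that retains as much of the original structure as possible.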
Papers
ZADU: A Python Library for Evaluating the Reliability of Dimensionality Reduction Embeddings
Hyeon Jeon, Aeri Cho, Jinhwa Jang, Soohyun Lee, Jake Hyun, Hyung-Kwon Ko, Jaemin Jo, Jinwook Seo
Classes are not Clusters: Improving Label-based Evaluation of Dimensionality Reduction
Hyeon Jeon, Yun-Hsin Kuo, Michaël Aupetit, Kwan-Liu Ma, Jinwook Seo