Skip-Gram
Skip-gram is a widely used word embedding technique that learns vector representations of words capturing semantic relationships by predicting the surrounding words within a defined context window. Current research focuses on improving skip-gram's efficiency, particularly through dimension regularization and optimized negative sampling strategies, and on enhancing its performance by incorporating distance weighting and adaptive window sizes. These advances have improved results on various natural language processing tasks, including language identification, and have also found applications in graph embedding for tasks such as node classification and link prediction.
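As a concrete illustration of the prediction objective described above, the following is a minimal sketch of skip-gram with negative sampling (SGNS) in NumPy. All names (`skipgram_pairs`, `train_sgns`) and hyperparameter values are illustrative assumptions, not from any specific paper; a production implementation would add subsampling, a unigram-based negative-sampling distribution, and minibatching.

```python
import numpy as np

def skipgram_pairs(tokens, window=2):
    """Generate (center, context) pairs within a symmetric window."""
    pairs = []
    for i in range(len(tokens)):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((tokens[i], tokens[j]))
    return pairs

def train_sgns(tokens, dim=16, window=2, neg=5, lr=0.05, epochs=100, seed=0):
    """Tiny skip-gram trained with negative sampling and plain SGD.

    Negatives are drawn uniformly here for simplicity; word2vec uses a
    smoothed unigram distribution instead.
    """
    rng = np.random.default_rng(seed)
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    W_in = rng.normal(0, 0.1, (V, dim))   # center-word ("input") vectors
    W_out = rng.normal(0, 0.1, (V, dim))  # context-word ("output") vectors
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    pairs = [(idx[c], idx[o]) for c, o in skipgram_pairs(tokens, window)]
    for _ in range(epochs):
        for c, o in pairs:
            # One observed context word (label 1) plus `neg` random negatives (label 0).
            targets = [(o, 1.0)] + [(int(rng.integers(V)), 0.0) for _ in range(neg)]
            for t, label in targets:
                score = sigmoid(W_in[c] @ W_out[t])
                g = lr * (score - label)
                d_in = g * W_out[t]       # gradient w.r.t. the center vector
                W_out[t] -= g * W_in[c]   # update context vector first...
                W_in[c] -= d_in           # ...using the pre-update gradient
    return {w: W_in[i] for w, i in idx.items()}

corpus = "the cat sat on the mat the dog sat on the rug".split()
emb = train_sgns(corpus, dim=8, window=2)
```

Each word ends up with an 8-dimensional vector; in this toy corpus, words sharing contexts (e.g. "cat" and "dog") tend to drift toward similar vectors as training proceeds.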