Skip-Gram

Skip-gram is a widely used word embedding technique that learns vector representations of words capturing semantic relationships by training each word to predict its surrounding words within a defined context window. Current research focuses on improving skip-gram's efficiency, particularly through dimension regularization and optimized negative sampling strategies, as well as on enhancing its performance by incorporating distance weighting and adaptive window sizes. These advances have improved results on various natural language processing tasks, including language identification, and have also found applications in graph embedding for tasks such as node classification and link prediction.
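To make the core idea concrete, the following is a minimal NumPy sketch of skip-gram with negative sampling on a toy token list. All names and hyperparameters here (`skipgram_pairs`, `train_sgns`, the window size, learning rate, and negative-sample count) are illustrative choices, not taken from any specific paper or library.

```python
import numpy as np

def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs within a symmetric window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

def train_sgns(tokens, dim=16, window=2, neg=5, lr=0.05, epochs=50, seed=0):
    """Minimal skip-gram with negative sampling (SGNS) via SGD.

    Negatives are drawn uniformly for simplicity; real implementations
    typically sample from a smoothed unigram distribution.
    """
    rng = np.random.default_rng(seed)
    vocab = sorted(set(tokens))
    idx = {w: k for k, w in enumerate(vocab)}
    V = len(vocab)
    W_in = rng.normal(0.0, 0.1, (V, dim))   # center-word vectors
    W_out = rng.normal(0.0, 0.1, (V, dim))  # context-word vectors
    pairs = [(idx[c], idx[o]) for c, o in skipgram_pairs(tokens, window)]
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    for _ in range(epochs):
        for c, o in pairs:
            # one positive context word plus `neg` random negatives
            targets = [o] + list(rng.integers(0, V, neg))
            labels = [1.0] + [0.0] * neg
            for t, y in zip(targets, labels):
                grad = sigmoid(W_in[c] @ W_out[t]) - y
                old_out = W_out[t].copy()       # use pre-update value for both grads
                W_out[t] -= lr * grad * W_in[c]
                W_in[c] -= lr * grad * old_out
    return vocab, W_in
```

The input vectors `W_in` serve as the learned embeddings; the output matrix `W_out` is usually discarded after training. Negative sampling replaces the full softmax over the vocabulary with a handful of binary classifications per pair, which is the main source of skip-gram's training efficiency.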

Papers