Graph Self-Training
Graph self-training (GST) is a semi-supervised learning technique for graph neural networks (GNNs) that addresses the scarcity of labeled data by iteratively assigning pseudo-labels to unlabeled nodes. Current research focuses on mitigating distribution shifts between labeled and unlabeled data, which can arise from biased pseudo-label selection and degrade generalization. Proposed remedies include algorithms that account for node homophily and techniques that keep the distributions of labeled and pseudo-labeled nodes consistent. Improved GST methods promise more robust and accurate GNN performance in applications where labeled data is limited, such as social network analysis and recommendation systems.
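The core loop described above can be sketched in a few lines. This is a minimal illustration, not any specific published GST method: the `self_train` function, the toy two-community graph, and the nearest-centroid "classifier" (with one round of neighborhood averaging standing in for a GNN layer) are all assumptions made for the example. Real systems would train an actual GNN at each round and use calibrated confidence thresholds.

```python
import numpy as np

def self_train(adj, feats, labels, n_classes=2, rounds=3, tau=0.8):
    """Iteratively pseudo-label unlabeled nodes (marked with -1).

    A single round of neighborhood feature averaging stands in for a
    GNN layer; a nearest-centroid rule stands in for the classifier.
    """
    labels = labels.copy()
    deg = adj.sum(1, keepdims=True)
    h = adj @ feats / np.maximum(deg, 1)  # aggregate neighbor features
    for _ in range(rounds):
        # "Train": class centroids from currently labeled nodes.
        cents = np.stack([h[labels == c].mean(0) for c in range(n_classes)])
        # "Predict": softmax over negative squared distances as confidence.
        d = ((h[:, None, :] - cents[None]) ** 2).sum(-1)
        p = np.exp(-d) / np.exp(-d).sum(1, keepdims=True)
        conf, pred = p.max(1), p.argmax(1)
        # Pseudo-label unlabeled nodes whose confidence exceeds tau;
        # biased selection here is exactly what causes the distribution
        # shift that current GST research tries to mitigate.
        take = (labels == -1) & (conf > tau)
        labels[take] = pred[take]
    return labels

# Toy two-community graph: nodes 0-2 and 3-5, fully connected within
# each community (self-loops included), no edges across communities.
A = np.ones((3, 3))
adj = np.block([[A, np.zeros((3, 3))], [np.zeros((3, 3)), A]])
feats = np.array([[0.0, 0.1], [0.1, 0.0], [0.0, 0.0],
                  [1.0, 0.9], [0.9, 1.0], [1.0, 1.0]])
labels = np.array([0, -1, -1, 1, -1, -1])  # one seed label per community
out = self_train(adj, feats, labels)
```

On this toy graph the four unlabeled nodes receive the label of their community's seed node, since neighborhood averaging pulls each node toward its community centroid.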