Attention-Based Random Forest

Attention-based random forests enhance the predictive power of traditional random forests by incorporating attention mechanisms, which assign weights to individual trees or leaf nodes based on their relevance to a given input. Current research focuses on developing more sophisticated attention architectures, including two-level attention (leaf- and tree-level) and self-attention mechanisms, to refine predictions and mitigate noise. These advancements aim to improve the accuracy and robustness of random forest models, particularly on complex datasets, with applications across fields that rely on predictive modeling.
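The core idea above can be sketched in a few lines: instead of averaging all tree predictions uniformly, assign each tree a softmax weight reflecting its relevance to the query point. The weighting scheme below (scoring each tree by how far its prediction deviates from the ensemble mean, so noisy trees are down-weighted) is a simplified, hypothetical illustration, not the exact formulation of any one published method.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

def softmax(scores, tau=1.0):
    z = np.asarray(scores, dtype=float) / tau
    z -= z.max()                      # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy regression data and a plain random forest.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
rf = RandomForestRegressor(n_estimators=20, random_state=0).fit(X, y)

def attention_predict(rf, x, tau=100.0):
    """Aggregate per-tree predictions with attention weights.

    Each tree's relevance score is the negative squared deviation of its
    prediction from the ensemble mean, so outlying trees receive lower
    weight. This is an illustrative weighting scheme, chosen for
    simplicity; published methods learn the attention parameters.
    """
    preds = np.array([t.predict(x.reshape(1, -1))[0] for t in rf.estimators_])
    weights = softmax(-(preds - preds.mean()) ** 2, tau=tau)
    return float(weights @ preds), weights

pred, w = attention_predict(rf, X[0])
```

The temperature `tau` controls how sharply the attention concentrates: a small `tau` trusts only the most consistent trees, while a large `tau` recovers the uniform average of a standard random forest.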

Papers