Forward Compatible Training
Forward compatible training (FCT) aims to develop machine learning models whose learned representations remain usable after future model updates, avoiding the significant computational cost and potential performance degradation of retraining on past data. Current research adapts FCT to a range of architectures, including graph neural networks and optical neural networks, often using techniques such as auxiliary feature learning or decoupled training modules to preserve compatibility while still improving accuracy. The approach is particularly promising for the efficiency and scalability of large-scale systems such as image retrieval and class-incremental learning, where models must be updated continuously.
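The core compatibility idea can be sketched in a toy setting: a frozen "old" encoder has already embedded a gallery, a "new" encoder is introduced later, and a learned transformation maps stored old embeddings into the new space so the gallery never has to be re-encoded. This is a minimal illustrative sketch, not any specific FCT method; all encoder shapes, variable names, and the use of a plain linear least-squares map are assumptions made for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d_in, d_old, d_new = 256, 16, 8, 12   # illustrative sizes, not from any paper

X_gallery = rng.normal(size=(n, d_in))   # stand-in for gallery inputs (e.g. images)

# Frozen "old" encoder; its gallery embeddings are already computed and stored.
W_old = rng.normal(size=(d_in, d_old))
Z_old = X_gallery @ W_old

# A later, higher-dimensional "new" encoder from a model update.
W_new = rng.normal(size=(d_in, d_new))
Z_new = X_gallery @ W_new

# Compatibility transfer: fit a linear map from the old embedding space into
# the new one, so the stored gallery can be reused without re-encoding inputs.
M, *_ = np.linalg.lstsq(Z_old, Z_new, rcond=None)
Z_mapped = Z_old @ M

mse = ((Z_mapped - Z_new) ** 2).mean()
baseline = (Z_new ** 2).mean()           # error of a trivial all-zeros map
print(f"mapped-vs-new MSE: {mse:.3f}  (trivial baseline: {baseline:.3f})")
```

At query time, new inputs would be embedded with the new encoder and matched against the mapped gallery; real FCT methods go further by training the old model (e.g. with auxiliary features) so that such a future transfer is accurate by construction.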