Dot Product Kernel

Dot product kernels are kernels that depend on their inputs only through the inner product, k(x, y) = f(⟨x, y⟩), making them a natural measure of similarity between data points represented as vectors and a fundamental building block of kernel methods. Current research focuses on improving their scalability and expressiveness through techniques such as scalable neural network kernels (SNNKs) and generalized zonal kernels (GZKs), often employing random feature approximations to reduce computational cost. These advances address the difficulty of applying kernel methods to high-dimensional data, with impact on fields such as self-supervised learning and large-scale machine learning applications. The theoretical understanding of their learnability and generalization properties under various norms and data scaling regimes is also a significant area of ongoing investigation.
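
As a concrete illustration of the two ideas above (a kernel defined through the inner product, and a random feature approximation of it), here is a minimal sketch in NumPy. It is not the SNNK or GZK construction from the papers below; it uses the degree-2 polynomial kernel k(x, y) = ⟨x, y⟩² and an assumed Rademacher random-feature estimator whose expected inner product recovers the exact kernel. Function names (`dot_product_kernel`, `random_features_deg2`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)


def dot_product_kernel(X, Y, f=lambda s: s**2):
    """Exact Gram matrix of a dot-product kernel k(x, y) = f(<x, y>)."""
    return f(X @ Y.T)


def random_features_deg2(X, num_features, rng):
    """Random features z(x) with E[z(x) . z(y)] = <x, y>^2.

    Each feature is (w^T x)(v^T x) for independent Rademacher vectors w and v,
    so the expectation over w and v factorizes into <x, y> * <x, y>.
    (Illustrative construction, not the methods cited in this section.)
    """
    d = X.shape[1]
    W = rng.choice([-1.0, 1.0], size=(num_features, d))
    V = rng.choice([-1.0, 1.0], size=(num_features, d))
    return (X @ W.T) * (X @ V.T) / np.sqrt(num_features)


# Compare the exact Gram matrix with its random-feature estimate.
X = rng.standard_normal((200, 16))
K_exact = dot_product_kernel(X, X)                    # O(n^2 d) exact Gram matrix
Z = random_features_deg2(X, num_features=4096, rng=rng)
K_approx = Z @ Z.T                                    # explicit features: linear methods apply

rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
print(f"relative Frobenius error: {rel_err:.3f}")
```

The point of the approximation is the usual one for random features: once the kernel is replaced by explicit finite-dimensional features z(x), downstream models can be trained with linear-time methods in the number of samples rather than forming the full n × n Gram matrix.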

Papers