# Semidefinite Relaxation

Semidefinite relaxation (SDR) is a technique for approximating solutions to difficult non-convex optimization problems by reformulating them as convex semidefinite programs (SDPs), which can be solved efficiently. The core idea is to lift a vector variable x into a matrix X = xxᵀ and drop the non-convex rank-one constraint, keeping only the convex conditions that X be positive semidefinite and consistent with the original constraints. Current research focuses on improving the speed and scalability of SDR methods, particularly for large-scale problems in robotics, machine learning (e.g., support vector machines, neural network verification), and signal processing, often via chordal decomposition or tailored algorithms such as Burer-Monteiro factorization. The effectiveness of SDR hinges on the tightness of the relaxation: recent work explores methods to automatically tighten relaxations and analyzes conditions that guarantee tightness, yielding more accurate and reliable solutions across a wide range of applications.
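As a concrete illustration of the Burer-Monteiro idea mentioned above, the sketch below applies it to the classic max-cut SDR. This is a minimal NumPy example, not a production implementation: the function name `maxcut_sdr`, the step count, and the learning rate are all illustrative choices, and the projected-gradient loop is a simple stand-in for the specialized solvers used in practice. The SDP relaxes the max-cut objective maximize ¼ Σᵢⱼ Wᵢⱼ(1 − XᵢⱼXⱼ) over X ⪰ 0 with diag(X) = 1; factoring X = VVᵀ with unit-norm rows replaces the PSD constraint by a per-row normalization, at the cost of non-convexity.

```python
import numpy as np

# Max-cut SDR via Burer-Monteiro factorization X = V V^T (illustrative sketch).
# SDP: maximize (1/4) * sum_ij W_ij * (1 - X_ij)  s.t.  X PSD, diag(X) = 1.
# With X = V V^T and unit-norm rows of V, PSD-ness and the diagonal
# constraint hold by construction; we minimize tr(W V V^T) by projected
# gradient descent, then round with a random hyperplane (Goemans-Williamson).

def maxcut_sdr(W, rank=3, steps=500, lr=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    V = rng.standard_normal((n, rank))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(steps):
        # Gradient of (1/2) tr(W V V^T) w.r.t. V is W V (W symmetric);
        # descending on tr(W X) maximizes the relaxed cut objective.
        V -= lr * (W @ V) / 2
        V /= np.linalg.norm(V, axis=1, keepdims=True)  # project rows back to the sphere
    # Random-hyperplane rounding: sign of each row's projection onto r.
    r = rng.standard_normal(rank)
    x = np.sign(V @ r)
    x[x == 0] = 1.0
    cut = 0.25 * np.sum(W * (1 - np.outer(x, x)))
    return x, cut

# 4-cycle graph: the optimal cut value is 4 (alternate the vertices).
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x, cut = maxcut_sdr(W)
```

For max-cut the relaxation is famously near-tight: hyperplane rounding of the SDP solution attains at least ~0.878 of the optimal cut in expectation, and on the small 4-cycle above the factorized iterate converges to an exactly rank-one (hence tight) solution, so the rounding recovers the optimal bipartition.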