DNN Verifier
DNN verifiers are tools for rigorously assessing the correctness and safety of deep neural networks (DNNs), with the goal of guaranteeing reliable behavior in critical applications. Current research focuses on improving the efficiency and scalability of these verifiers, employing techniques such as bound propagation, parallel computing, and statistical methods to cope with the complexity of large DNNs, and on developing methods for producing checkable proofs of correctness. This work is crucial for building trust in DNN-based systems, particularly in safety-critical domains such as autonomous driving and healthcare, because it provides mathematically sound guarantees about their behavior.
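To make the bound-propagation idea mentioned above concrete, the following is a minimal sketch of interval bound propagation (IBP) through a small feed-forward ReLU network, written in Python with NumPy. All names, the network shape, and the layer convention are illustrative assumptions, not taken from any particular verifier.

```python
import numpy as np

def interval_affine(lo, hi, W, b):
    """Propagate the box [lo, hi] through x -> W @ x + b.

    Splitting W into its positive and negative parts gives the tightest
    elementwise bounds of an affine map over a box.
    """
    W_pos = np.maximum(W, 0.0)
    W_neg = np.minimum(W, 0.0)
    return W_pos @ lo + W_neg @ hi + b, W_pos @ hi + W_neg @ lo + b

def interval_relu(lo, hi):
    """ReLU is monotone, so it maps interval endpoints directly."""
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

def ibp_forward(layers, lo, hi):
    """Propagate input bounds through affine layers with ReLU between
    them (ReLU after every layer except the last, a common convention
    assumed here). `layers` is a list of (W, b) tuples.
    """
    for i, (W, b) in enumerate(layers):
        lo, hi = interval_affine(lo, hi, W, b)
        if i < len(layers) - 1:
            lo, hi = interval_relu(lo, hi)
    return lo, hi

# Toy example: certify output bounds for every input in an
# L-infinity ball of radius eps around a center point.
rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 3)), rng.normal(size=4)),
          (rng.normal(size=(2, 4)), rng.normal(size=2))]
center = np.array([0.5, -0.2, 0.1])
eps = 0.05
out_lo, out_hi = ibp_forward(layers, center - eps, center + eps)
print("certified output bounds:", out_lo, out_hi)
```

Production verifiers typically tighten these bounds with linear relaxations or branch-and-bound, but the soundness argument is the same: every input in the box is guaranteed to map to an output inside the returned interval.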