Non-Robust

Non-robustness in machine learning and robotics refers to the vulnerability of systems to unexpected inputs or perturbations; research in this area seeks to understand and mitigate that vulnerability. Current work spans a range of models, including deep neural networks and robotic control systems, and examines contributing factors such as noise sensitivity, adversarial examples, and the influence of model parameters and training regimes. Because robustness is paramount in critical applications such as autonomous vehicles and medical diagnosis, addressing non-robustness is essential for building reliable, safe, and trustworthy AI systems.
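To make the adversarial-example phenomenon mentioned above concrete, the sketch below implements the well-known Fast Gradient Sign Method (FGSM) in PyTorch: each input feature is nudged by ±epsilon in the direction that most increases the loss, which is often enough to flip a non-robust model's prediction. The model, data, and epsilon value here are toy placeholders chosen only to make the example self-contained, not a reference to any specific paper in this collection.

```python
import torch
import torch.nn as nn

def fgsm_perturb(model, x, y, epsilon=0.03):
    """Craft an adversarial example with the Fast Gradient Sign Method.

    Moves each input component by +/- epsilon in the direction that
    increases the classification loss, probing non-robust behaviour.
    """
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    with torch.no_grad():
        # Step in the sign of the gradient, then clamp to the valid range.
        x_adv = x_adv + epsilon * x_adv.grad.sign()
        x_adv = x_adv.clamp(0.0, 1.0)
    return x_adv.detach()

if __name__ == "__main__":
    # Toy classifier on random data, purely to make the sketch runnable.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
    x = torch.rand(4, 1, 28, 28)        # batch of fake "images" in [0, 1]
    y = torch.randint(0, 10, (4,))      # fake labels
    x_adv = fgsm_perturb(model, x, y)
    print((x_adv - x).abs().max())      # perturbation bounded by epsilon
```

In practice, the gap between a model's accuracy on clean inputs and on such epsilon-bounded perturbations is a common empirical measure of the non-robustness this literature studies.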

Papers