Interval Analysis
Interval analysis is a computational technique that represents uncertain quantities as intervals and propagates them through a calculation so that the result is a guaranteed enclosure of every possible outcome. Current research focuses on improving the efficiency and scalability of interval methods, particularly for neural networks and dynamical systems, using techniques such as parallelization and novel inclusion functions that yield tighter enclosures of complex interactions. This work matters because it strengthens the reliability and robustness of machine learning models and control systems and enables formal verification of their behavior under uncertainty in domains such as time series classification and individualized decision-making. Efficient toolboxes and algorithms are driving progress across these areas.
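To make the core idea concrete, here is a minimal sketch of interval arithmetic and a "natural" inclusion function in Python. The `Interval` class and `f_natural` are illustrative names, not from any particular toolbox, and production implementations additionally control the floating-point rounding direction so that the bounds remain guaranteed under round-off; that detail is omitted here for brevity.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other: "Interval") -> "Interval":
        # [a, b] + [c, d] = [a + c, b + d]
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other: "Interval") -> "Interval":
        # [a, b] - [c, d] = [a - d, b - c]
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other: "Interval") -> "Interval":
        # The product range is spanned by the four endpoint products.
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

def f_natural(x: Interval) -> Interval:
    # "Natural" inclusion function for f(x) = x*x - x: replace each real
    # operation with its interval counterpart. The result encloses the true
    # range, but overestimates it because the two occurrences of x are
    # treated as independent (the "dependency problem").
    return x * x - x

x = Interval(0.0, 1.0)
print(f_natural(x))  # Interval(lo=-1.0, hi=1.0): a guaranteed enclosure
```

The overestimation visible here ([-1, 1] versus the true range [-0.25, 0] of x² - x on [0, 1]) is the dependency problem, and reducing it is precisely what tighter inclusion functions, such as mean-value or centered forms, aim to do.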
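For the neural-network setting, a common building block is interval bound propagation: pushing an input box through each layer to obtain guaranteed elementwise output bounds. The sketch below, with illustrative weights and an illustrative input box rather than values from any particular paper, shows the standard propagation rule for an affine layer followed by a ReLU.

```python
import numpy as np

def ibp_dense(lo: np.ndarray, hi: np.ndarray,
              W: np.ndarray, b: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Propagate the box [lo, hi] through y = W @ x + b.

    Splitting W into its positive and negative parts gives the tightest
    componentwise bounds for an affine map of a box.
    """
    W_pos, W_neg = np.maximum(W, 0.0), np.minimum(W, 0.0)
    y_lo = W_pos @ lo + W_neg @ hi + b
    y_hi = W_pos @ hi + W_neg @ lo + b
    return y_lo, y_hi

def ibp_relu(lo: np.ndarray, hi: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    # ReLU is monotone, so it maps bounds to bounds exactly.
    return np.maximum(lo, 0.0), np.maximum(hi, 0.0)

rng = np.random.default_rng(0)
W, b = rng.normal(size=(3, 2)), rng.normal(size=3)
lo, hi = np.array([-0.1, 0.2]), np.array([0.1, 0.4])  # input box

y_lo, y_hi = ibp_relu(*ibp_dense(lo, hi, W, b))
print(y_lo, y_hi)  # guaranteed elementwise bounds on the layer's output
```

Composing this rule layer by layer yields sound, if conservative, bounds on a network's output over an entire input region, which is the basic mechanism behind interval-based robustness verification.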