Influence Indicator
Influence indicators aim to identify the factors driving outcomes in complex systems, whether those factors are latent causal processes in time series data, biases affecting collective decisions, or the sources of inspiration behind generative models. Current research focuses on methods for recovering these influences, often employing sparse influence constraints, variational inference, and adaptive aggregation algorithms, while also addressing challenges such as instantaneous dependencies and bias mitigation. This work has significant implications for diverse fields: it can improve the accuracy and fairness of machine learning models, deepen understanding of social dynamics, and support the responsible development of generative AI.
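To make the idea of sparse influence constraints concrete, the sketch below illustrates one common way such constraints are used: lasso-penalized, Granger-style regression that recovers which lagged series plausibly influence a target series. The simulated data, the choice of scikit-learn's `Lasso`, and the threshold for reporting influences are illustrative assumptions, not the method of any particular paper.

```python
# Minimal sketch: identify sparse influences among time series with an
# L1-penalized (Granger-style) regression. All settings are illustrative.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
T, n_series = 500, 5

# Simulate data in which series 0 is driven by lagged series 1 and 3;
# the remaining series are pure noise.
X = rng.normal(size=(T, n_series))
X[1:, 0] += 0.8 * X[:-1, 1] - 0.6 * X[:-1, 3]

# Regress series 0 on the one-step-lagged values of every series.
# The L1 penalty pushes most coefficients to zero, so the surviving
# nonzero weights mark candidate influences.
lagged, target = X[:-1], X[1:, 0]
model = Lasso(alpha=0.05).fit(lagged, target)

for j, coef in enumerate(model.coef_):
    if abs(coef) > 1e-3:
        print(f"series {j} -> series 0, weight {coef:+.2f}")
```

In practice the penalty strength (here `alpha=0.05`) controls how sparse the recovered influence set is and is typically chosen by cross-validation.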