State of the Art JAX
JAX, a Python library for high-performance numerical computation, is rapidly becoming a cornerstone for accelerating scientific computing across diverse fields. Current research leverages JAX's automatic differentiation and just-in-time (JIT) compilation to build efficient, scalable implementations of models such as spiking neural networks, cellular automata, agent-based models, and reinforcement learning environments. These capabilities let researchers tackle previously intractable problems, such as large-scale simulation and high-dimensional Bayesian inference, shortening experiment cycles in areas ranging from neuroscience and materials science to cosmology and finance.
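The two capabilities highlighted above compose directly: `jax.grad` differentiates a Python function, and `jax.jit` compiles the result with XLA. A minimal sketch, using a hypothetical mean-squared-error loss (the function and data here are illustrative, not from any paper listed on this page):

```python
import jax
import jax.numpy as jnp

# An ordinary Python function: mean-squared error of a linear model.
def loss(w, x, y):
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# grad builds the gradient function; jit compiles it with XLA.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.array([1.0, 2.0])
x = jnp.array([[1.0, 0.0],
               [0.0, 1.0]])
y = jnp.array([1.5, 1.5])

g = grad_loss(w, x, y)  # gradient of the loss with respect to w
```

The same transformed function can then be called repeatedly inside a training loop, with compilation cost paid only on the first call for a given input shape.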