Automatic differentiation (AD) is a powerful technique in numerical computing. In this talk we will explore the tradeoffs of implementing AD at different levels of the software stack, and discuss automatic differentiation for parallel programs and high-level programming languages. The particular focus will be on Enzyme, an AD framework that operates at the compiler level and supports both reverse- and forward-mode AD in a variety of languages, including Julia, C/C++, and Fortran. This talk is based on the paper "Scalable Automatic Differentiation of Multiple Parallel Paradigms through Compiler Augmentation" by Moses et al., winner of the Best Student Paper Award at SC22.
To obtain the Zoom link for this online talk, please get in touch with Gregor Gassner or Michael Schlottke-Lakemper.