NAG Numerical Services for Algorithmic Differentiation (AD)
NAG offers a range of AD services and solutions that enable organizations to use this technique. NAG continuously develops solutions and tools for Algorithmic Differentiation, with an increasing focus on addressing users' specific needs. Significant interest has come from Vehicle Engineering and Financial Services, as well as Ocean and Climate Modelling.
The NAG Numerical Services team is engaged by clients to advise on, evaluate, specify, write and support custom AD solutions.
What is Algorithmic Differentiation?
Algorithmic Differentiation, also sometimes called Automatic Differentiation or Computational Differentiation, is a technique for augmenting numerical simulation programs with the ability to compute first- and higher-order mathematical derivatives. In sharp contrast to classical numerical differentiation by finite differences, AD delivers gradients, Jacobians and Hessians to machine accuracy: it avoids truncation error by analytically differentiating the individual statements within an arbitrarily complex simulation.
Algorithmic Differentiation has been applied in particular to optimization, parameter identification, nonlinear equation solving, the numerical integration of differential equations, and combinations of these.
Algorithmic Differentiation exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, and accurate to working precision.
The adjoint (also reverse) mode of AD is of particular interest in the context of large-scale sensitivity analysis and nonlinear optimization.
Calculations that would otherwise take years can be done in hours: for problems where adjoint AD techniques are applicable, they can dramatically speed up gradient calculations.
What does NAG provide?
NAG provides expert help, support, training and consulting services that deliver substantial insight into this powerful technique in order to achieve robustness and efficiency for your specific AD application.
You can rely on The Numerical Algorithms Group to advise, develop, implement and maintain the best Algorithmic Differentiation solution for your problem type, software environment and hardware platform.
NAG collaborates with RWTH Aachen to deliver AD solutions to clients. AD projects are usually a combination of services covering training, applying AD tools to clients' own codes (C, C++ or Fortran), and implementing AD versions of numerical functions.
NAG Software Tools
The dco (derivative code by overloading) Library is a run-time AD library that uses the operator- and function-overloading techniques* offered by both C++ and Fortran to transform a given simulation program into first- and higher-derivative code. The library is particularly efficient and has a very flexible interface.
NAG Fortran Compiler for AD
For simulation programs written in Fortran, a version of the NAG Fortran Compiler has been extended to serve as a pre-processor to dco. This facilitates seamless integration of AD into complex build systems, so the modifications the user must make to the original source code can be minimized or even eliminated entirely.
* Operator overloading: the operator-overloading features of a programming language allow mathematical operators (+, -, *, /) to be redefined so that they compute derivative values alongside their usual elementary computations.
- Exact First- and Second-Order Greeks by Algorithmic Differentiation
- Adjoint Algorithmic Differentiation of a GPU Accelerated Application
Fig: Illustration from GPU Accelerated Application paper
- The Art of Differentiating Computer Programs. An Introduction to Algorithmic Differentiation.
- Adjoint Parameter Estimation in Computational Finance
Differentiation Enabled Fortran Compiler Technology
The CompAD (Compiler for Automatic Differentiation) research project is investigating the integration of Algorithmic Differentiation (AD) capabilities into the NAG Fortran Compiler. This collaboration with computer scientists at the University of Hertfordshire in Hatfield and at RWTH Aachen University in Germany is funded by EPSRC.
Investigators and Research Associates at the University of Hertfordshire (UH) and RWTH Aachen University (RWTH):
Professor Bruce Christianson (Principal Investigator, UH)
Professor Uwe Naumann (Co-Investigator, RWTH and UH)
Jan Riehme (Research Associate, UH)
Dmitrij Gendler (Research Associate, UH)