On the Limits of Current Implementations of Algorithmic Differentiation

The computation of derivatives is a crucial part of many computational techniques used in science and engineering. In applications such as parameter identification, design optimization, and data assimilation, a variety of optimization tasks have to be performed. Since most numerical optimization algorithms require either gradient or Jacobian information, the accurate evaluation of these derivatives is essential. The technique of automatic differentiation provides an efficient way of computing derivatives without truncation error. In the present note we investigate various issues that arise in the GRADIENT, TAMC, and Tapenade implementations of algorithmic differentiation for programs written in Maple and Fortran.
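The claim that automatic differentiation avoids truncation error can be illustrated with a minimal forward-mode sketch using dual numbers. This is an illustration only, written in Python for brevity; the tools discussed in this note (GRADIENT, TAMC, Tapenade) operate on Maple and Fortran programs, and the function `f` below is an arbitrary example, not one from the text:

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 = 0; the b component carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (u*v)' = u'*v + u*v'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)
    __rmul__ = __mul__

def sin(x):
    """sin overloaded for Dual arguments via the chain rule."""
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)
    return math.sin(x)

def f(x):
    # Example function: f(x) = x^2 * sin(x)
    return x * x * sin(x)

# Forward-mode AD: seed dx/dx = 1, read the derivative off the dot component.
d_ad = f(Dual(1.0, 1.0)).dot

# Analytic derivative f'(x) = 2x*sin(x) + x^2*cos(x), evaluated at x = 1.
d_exact = 2.0 * math.sin(1.0) + math.cos(1.0)

# A one-sided finite difference, by contrast, carries O(h) truncation error.
h = 1e-3
d_fd = (f(1.0 + h) - f(1.0)) / h

err_ad = abs(d_ad - d_exact)   # at the level of floating-point rounding
err_fd = abs(d_fd - d_exact)   # dominated by the O(h) truncation term
```

The AD result agrees with the analytic derivative to machine precision, while the finite-difference result is limited by the step size `h`; shrinking `h` further eventually trades truncation error for cancellation error, a trade-off AD does not face.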