
NAG: A hybrid approach to Adjoint Algorithmic Differentiation

Online



The Algorithmic Differentiation (AD) world is broadly split in two. Source transformation can produce extremely efficient adjoints, but it is restricted to "simple" languages such as subsets of C. Operator overloading, on the other hand, has successfully handled large production codes, but is in general less efficient than source transformation. Nothing, however, prevents us from combining the two ideas. Doing so turns out to be somewhat tricky, but by no means impossible.
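To make the operator-overloading side concrete, here is a minimal, illustrative sketch of a tape-based adjoint in C++. The AdjDouble and TapeEntry types are hypothetical teaching constructs, not NAG's dco/c++ API: each overloaded operator records its local partial derivatives on a global tape, and a reverse sweep then propagates adjoints from the output back to the inputs.

    // Minimal tape-based adjoint AD via operator overloading (illustrative only).
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct TapeEntry {
        int    arg[2];     // tape indices of the operands (-1 if unused)
        double partial[2]; // local partial derivatives w.r.t. the operands
    };

    static std::vector<TapeEntry> tape;

    struct AdjDouble {
        double value;
        int    index;      // this variable's slot on the tape

        AdjDouble(double v = 0.0) : value(v), index(record(-1, 0.0, -1, 0.0)) {}
        AdjDouble(double v, int i) : value(v), index(i) {}

        static int record(int a0, double p0, int a1, double p1) {
            tape.push_back({{a0, a1}, {p0, p1}});
            return static_cast<int>(tape.size()) - 1;
        }
    };

    AdjDouble operator*(const AdjDouble& x, const AdjDouble& y) {
        // d(x*y)/dx = y, d(x*y)/dy = x
        return AdjDouble(x.value * y.value,
                         AdjDouble::record(x.index, y.value, y.index, x.value));
    }

    AdjDouble operator+(const AdjDouble& x, const AdjDouble& y) {
        return AdjDouble(x.value + y.value,
                         AdjDouble::record(x.index, 1.0, y.index, 1.0));
    }

    AdjDouble sin(const AdjDouble& x) {
        return AdjDouble(std::sin(x.value),
                         AdjDouble::record(x.index, std::cos(x.value), -1, 0.0));
    }

    // Reverse sweep: seed the output adjoint and walk the tape backwards.
    std::vector<double> adjoints(int output_index) {
        std::vector<double> adj(tape.size(), 0.0);
        adj[output_index] = 1.0;
        for (int i = static_cast<int>(tape.size()) - 1; i >= 0; --i)
            for (int k = 0; k < 2; ++k)
                if (tape[i].arg[k] >= 0)
                    adj[tape[i].arg[k]] += tape[i].partial[k] * adj[i];
        return adj;
    }

    int main() {
        AdjDouble x(2.0), y(3.0);
        AdjDouble f = x * y + sin(x);          // f(x, y) = x*y + sin(x)
        std::vector<double> adj = adjoints(f.index);
        // Expected: df/dx = y + cos(x), df/dy = x
        std::printf("df/dx = %f (expected %f)\n", adj[x.index], y.value + std::cos(x.value));
        std::printf("df/dy = %f (expected %f)\n", adj[y.index], x.value);
        return 0;
    }

A source-transformation tool would instead generate dedicated adjoint code at compile time, avoiding the run-time tape; the hybrid approach discussed in the webinar aims to keep the ease of use of the taping style above while approaching that generated-code performance.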

In this webinar, Senior Technical Consultant Viktor Mosenkis talks about this hybrid approach to Adjoint AD, which provides source-transformation-like performance while being as easy to apply as operator overloading.

This webinar will provide:

  • An overview of AD
  • Results from the front lines: what NAG's customers have achieved
  • The general idea behind the hybrid approach to AD
  • Results on in-house code and QuantLib
