# Lecture slides: Introduction to Adjoint Differentiation and Back-Propagation in Machine Learning and Finance

To access the document, go to https://github.com/asavine/CompFinance/blob/master/Intro2AADinMachi...

This is a work in progress; feedback is highly appreciated.

Contents

Introduction

• Application: model fitting
• Calibrating a financial pricing model
• Training a deep learning model
• Application: market risk
• Differentiation
• A brief history of AD
• Overview
• Demonstration: Dupire’s model (1992)
• Demonstration: Results
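Several items above (calibrating a pricing model, training a deep learning model, market risk) reduce to computing gradients, and the natural baseline is bump-and-revalue finite differences. As a minimal sketch, with illustrative names not taken from the slides: for n inputs this needs n + 1 evaluations of the function, which is exactly the cost that adjoint differentiation removes.

```python
def finite_difference_gradient(f, x, h=1e-6):
    """Bump-and-revalue: one base evaluation plus one bumped
    evaluation per input, i.e. n + 1 calls to f for n inputs."""
    base = f(x)
    grad = []
    for i in range(len(x)):
        bumped = list(x)
        bumped[i] += h          # bump input i
        grad.append((f(bumped) - base) / h)
    return grad
```

For f(x) = x0² + 3·x1 at (2, 1) this returns roughly (4, 3), at the cost of three evaluations of f.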

Deep Learning

• Neural networks and deep learning
• Linear regression: prediction
• Linear regression and classification (1)
• Linear regression and classification (2)
• Linear regression: training
• Linear regression only captures linear functions
• Basis function regression
• Basis function regression: performance
• Curse of dimensionality
• Overfitting and the rule of ten
• Basis function regression vs ANN
• ANN basis functions
• ANN: prediction
• Choice of activation function
• ANN: computation graph
• ANN: training
• Universal representation theorem
• More on regression, basis functions and ANNs
• Deep learning: composing basis functions
• Why deep learning?
• Deep feed-forward networks
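The deep-learning items above describe prediction in a one-hidden-layer network as linear regression on learned basis functions. A minimal sketch of that view, in plain Python with hypothetical names rather than code from the slides:

```python
def relu(z):
    """Rectified linear unit, one common choice of activation."""
    return max(0.0, z)

def ann_predict(x, W1, b1, w2, b2):
    """One hidden layer: each unit relu(w . x + b) acts as a learned
    basis function; the output is a linear regression on those bases."""
    hidden = [relu(sum(w * v for w, v in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2
```

Unlike fixed basis functions, the hidden weights W1 and b1 are themselves trained, which is what lets the network adapt its bases to the data instead of enumerating them in advance.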

Back-Propagation in deep neural nets

• Differentials of the cost function
• Differentials by finite differences
• FD and automatic differentiation
• Computing cost differentials
• Differentials by forward Jacobian propagation
• Jacobian propagation: performance
• Back-Propagation

• Evaluation graphs and adjoint propagation
• Example: Black & Scholes
• Black & Scholes: evaluation graph
• Feed-forward equations
• Differentiation equations
• Forward and backward differentiation
• Evaluation graphs: conclusion
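The Black & Scholes example above walks an evaluation graph forward (the feed-forward equations) and then backward (the differentiation equations). A hedged sketch of those two sweeps, hand-written rather than taped and with all names illustrative: the forward pass computes the call price while keeping intermediates, and the backward pass seeds the price adjoint with 1 and accumulates adjoints of the inputs in reverse order.

```python
import math

def n_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def n_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def bs_call_with_adjoints(S, K, r, sigma, T):
    # feed-forward sweep: evaluate the graph, keeping intermediates
    sqT = math.sqrt(T)
    d1 = (math.log(S / K) + (r + 0.5 * sigma * sigma) * T) / (sigma * sqT)
    d2 = d1 - sigma * sqT
    df = math.exp(-r * T)
    price = S * n_cdf(d1) - K * df * n_cdf(d2)

    # backward (adjoint) sweep: seed the result adjoint with 1
    # and apply the chain rule node by node, in reverse order
    price_bar = 1.0
    S_bar = n_cdf(d1) * price_bar
    d1_bar = S * n_pdf(d1) * price_bar
    d2_bar = -K * df * n_pdf(d2) * price_bar
    d1_bar += d2_bar                      # d2 = d1 - sigma * sqT
    sigma_bar = -sqT * d2_bar
    S_bar += d1_bar / (S * sigma * sqT)   # d1 depends on S ...
    sigma_bar += (0.5 * sqT - (math.log(S / K) + r * T)
                  / (sigma * sigma * sqT)) * d1_bar  # ... and on sigma
    return price, S_bar, sigma_bar        # price, delta, vega
```

The adjoints of S and sigma reproduce the analytic delta N(d1) and vega S·φ(d1)·√T, which gives a convenient correctness check for the backward sweep.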

Recording calculations on tape

• Automatic differentiation
• Building evaluation graphs
• Recording operations
• Lazy evaluation
• Conventional implementation
• Simplistic implementation
• Record and tape data structures
• Custom real number
• Avoiding code duplication
• Applying the recording framework
• Instrumenting computation code
• State of the tape after recording
• Complexity
• Conclusion
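The tape items above (recording operations, custom real number, state of the tape) can be illustrated with a deliberately simplistic sketch in the spirit of the "simplistic implementation" slide, not the slides' actual code: a custom number type overloads arithmetic, records each operation's parents and partial derivatives on a global tape, and a single backward traversal propagates adjoints.

```python
import math

tape = []  # global tape: one record per node, listing (parent_index, partial)

class Number:
    """Custom real number: every operation appends a record to the tape."""
    def __init__(self, value, parents=()):
        self.value = value
        self.index = len(tape)
        tape.append(list(parents))

    def _wrap(self, other):
        return other if isinstance(other, Number) else Number(float(other))

    def __add__(self, other):
        other = self._wrap(other)
        return Number(self.value + other.value,
                      [(self.index, 1.0), (other.index, 1.0)])
    __radd__ = __add__

    def __mul__(self, other):
        other = self._wrap(other)
        return Number(self.value * other.value,
                      [(self.index, other.value), (other.index, self.value)])
    __rmul__ = __mul__

def nlog(x):
    """Recorded logarithm, with partial 1/x against its argument."""
    return Number(math.log(x.value), [(x.index, 1.0 / x.value)])

def backprop(result):
    """Adjoint propagation: walk the tape backwards, seeding the
    result with adjoint 1 and accumulating into each node's parents."""
    adjoint = [0.0] * len(tape)
    adjoint[result.index] = 1.0
    for i in range(result.index, -1, -1):
        for parent, partial in tape[i]:
            adjoint[parent] += adjoint[i] * partial
    return adjoint
```

For z = x·y + log(x) at x = 2, y = 3, the adjoints come out as dz/dx = y + 1/x = 3.5 and dz/dy = x = 2, all from one backward sweep regardless of the number of inputs.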

• Simple simulation code
• Simulation code, version 1
• A simplistic code
• An inefficient code
• Smoothing barrier options
• Smooth barrier
• Simulation code with smooth barrier
• Simulation code
• Differentiation steps
• Differentiation code
• Initialization
• Pick and return results
• Testing the code
• Solution in principle
• Solution in code
• Performance
• Conclusion
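The smoothing items above address the discontinuous barrier indicator, whose pathwise derivative vanishes almost everywhere and therefore breaks differentiation of simulated payoffs. A minimal sketch of call-spread smoothing over a half-width eps, with hypothetical names:

```python
def smooth_indicator(x, barrier, eps):
    """Call-spread smoothing of the discontinuous indicator 1{x > barrier}:
    0 below barrier - eps, 1 above barrier + eps, linear ramp in between,
    so the pathwise derivative no longer vanishes almost everywhere."""
    if x <= barrier - eps:
        return 0.0
    if x >= barrier + eps:
        return 1.0
    return (x - (barrier - eps)) / (2.0 * eps)
```

As eps shrinks, the smoothed payoff converges to the true barrier payoff while its differential stays finite; the choice of eps trades a small bias in the price against the stability of the risk estimate.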


Comments

Comment by Antoine Savine on March 14, 2019 at 7:33am

Comment by Antoine Savine on December 7, 2018 at 3:54am

Looks like it is the same Brownian motion we are talking about :) For an introduction in finance, we usually recommend Björk's book:

https://www.amazon.com/Arbitrage-Theory-Continuous-Oxford-Finance/d...

Comment by Vincent Granville on November 22, 2018 at 5:06pm

Wondering whether my presentation of Brownian motions (BM), including differential and integrated BM, is compatible with the traditional definition found in quant books, taught at Princeton, or used by physicists. See my presentation of this topic in the first two chapters of my book Applied Stochastic Processes, Chaos Modeling, and Probabilistic Pro...