Added by Antoine Savine on May 25, 2020 at 11:30am
Recorded in Bloomberg's London offices in November 2019:
Added by Antoine Savine on April 17, 2020 at 7:53am
We first provide a mini-tutorial on Adjoint Algorithmic Differentiation (AAD), also known as back-propagation in machine learning. We then illustrate how neural networks may be used to compute dynamic values and risks of trading books, with applications to the risk management of derivatives, valuation adjustments (XVA), counterparty credit risk, FRTB, and SIMM margin valuation adjustments (MVA). We also describe new techniques to substantially improve deep learning on simulated data, and…
Added by Antoine Savine on December 10, 2019 at 1:30am
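To give a flavour of the "risks from simulated paths" idea mentioned above, here is a minimal sketch (not the lecture's actual code) of pathwise differentiation: the sensitivity of a European call to the initial spot is obtained by differentiating the discounted payoff along each Monte-Carlo path and averaging, and it matches the closed-form Black-Scholes delta. All parameter values below are illustrative.

```python
import math
import numpy as np

# Illustrative Black-Scholes parameters (not from the lecture)
spot, strike, vol, rate, mat = 100.0, 100.0, 0.2, 0.0, 1.0

# Simulate terminal spots under geometric Brownian motion
rng = np.random.default_rng(42)
z = rng.standard_normal(400_000)
s_T = spot * np.exp((rate - 0.5 * vol**2) * mat + vol * math.sqrt(mat) * z)

# Pathwise derivative of the discounted call payoff with respect to spot:
# d/dS0 [ exp(-rT) * max(S_T - K, 0) ] = exp(-rT) * 1{S_T > K} * S_T / S_0
pathwise_delta = math.exp(-rate * mat) * (s_T > strike) * s_T / spot
mc_delta = pathwise_delta.mean()

# Closed-form check: delta = N(d1)
d1 = (math.log(spot / strike) + (rate + 0.5 * vol**2) * mat) / (vol * math.sqrt(mat))
analytic_delta = 0.5 * (1.0 + math.erf(d1 / math.sqrt(2.0)))
print(mc_delta, analytic_delta)
```

The same pathwise quantities are exactly what AAD computes automatically for arbitrary payoffs, which is why it scales to whole trading books.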
Automatic Adjoint Differentiation (AAD) and back-propagation are key technologies in modern machine learning and finance. It is back-prop that enables deep neural networks to learn to identify faces on photographs in reasonable time. It is AAD that allows financial institutions to compute the risks of complex derivatives books in real time. The two technologies share common roots.
See the AAD book here:…
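The common root of AAD and back-prop is the tape: record every elementary operation on the way forward, then sweep the tape backwards propagating adjoints, so one pass yields all partial derivatives at once. A minimal toy sketch (my own illustration, not the book's implementation):

```python
import math

tape = []  # global tape: nodes recorded in the order they are created

class Var:
    """A value plus its parents and the local partial derivatives to them."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs (parent Var, local derivative)
        self.adjoint = 0.0
        tape.append(self)

    def __mul__(self, other):
        return Var(self.value * other.value,
                   ((self, other.value), (other, self.value)))

    def __add__(self, other):
        return Var(self.value + other.value, ((self, 1.0), (other, 1.0)))

def sin(x):
    return Var(math.sin(x.value), ((x, math.cos(x.value)),))

def backward(result):
    """Backward sweep: seed the result with adjoint 1, then walk the tape
    in reverse, accumulating each node's adjoint into its parents."""
    result.adjoint = 1.0
    for node in reversed(tape):
        for parent, local in node.parents:
            parent.adjoint += node.adjoint * local

# f(x, y) = x*y + sin(x); analytically df/dx = y + cos(x), df/dy = x
x, y = Var(2.0), Var(3.0)
f = x * y + sin(x)
backward(f)
print(x.adjoint, y.adjoint)  # 3 + cos(2) and 2
```

The cost of the backward sweep is a small constant multiple of the forward evaluation, independent of the number of inputs: that constant-cost property is what makes both real-time book risks and deep-network training feasible.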
Deep Learning is picking up momentum in Quantitative Finance, spreading beyond the obvious application of predicting asset prices (where, to my knowledge, it is not particularly effective) into the more serious application area of option pricing and risk management.
These two recent papers clearly demonstrate the benefits of DL as a pricing technology, an alternative to classical FDM and Monte-Carlo in certain contexts:…
Added by Antoine Savine on January 11, 2019 at 5:30am
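The pattern these pricing papers exploit is an offline/online split: sample an expensive pricer on simulated data once, fit a fast approximator, then evaluate the fit in microseconds. A toy sketch of that split (my own illustration, with a polynomial standing in for the papers' neural networks, and all parameter values invented):

```python
import math
import numpy as np

def bs_call(spot, strike=100.0, vol=0.2, rate=0.0, mat=1.0):
    """Closed-form Black-Scholes call, playing the role of the 'slow' pricer."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol**2) * mat) / (vol * math.sqrt(mat))
    d2 = d1 - vol * math.sqrt(mat)
    n = lambda u: 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))
    return spot * n(d1) - strike * math.exp(-rate * mat) * n(d2)

# Offline phase: sample the pricer on a grid and fit a cheap approximator.
spots = np.linspace(80.0, 120.0, 81)
prices = np.array([bs_call(s) for s in spots])
m = (spots - 100.0) / 20.0            # rescale to [-1, 1] for a well-conditioned fit
coeffs = np.polyfit(m, prices, deg=8)

# Online phase: evaluating the fit is just a handful of multiply-adds.
approx = np.polyval(coeffs, m)
print("max abs fit error:", np.max(np.abs(approx - prices)))
```

The point is the workflow, not the model: replace the polynomial with a trained network and the one-dimensional spot grid with a high-dimensional parameter space, and you have the pricing technology discussed in the papers.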
To access the document, go to https://github.com/asavine/CompFinance/blob/master/Intro2AADinMachineLearningAndFinance.pdf
This is a work in progress; feedback is highly appreciated.…