finmath lib automatic differentiation extensions

Enabling finmath-lib to perform automatic differentiation (e.g. backward automatic differentiation, aka AAD).

GroupId

net.finmath

ArtifactId

finmath-lib-automaticdifferentiation-extensions

Last Version

1.1.0

Type

jar

Description

finmath lib automatic differentiation extensions
Enabling finmath-lib to perform automatic differentiation (e.g. backward automatic differentiation, aka AAD).

Source Code Management

https://github.com/finmath/finmath-lib-automaticdifferentiation-extensions

Download finmath-lib-automaticdifferentiation-extensions

How to add to project

Maven

<!-- https://jarcasting.com/artifacts/net.finmath/finmath-lib-automaticdifferentiation-extensions/ -->
<dependency>
    <groupId>net.finmath</groupId>
    <artifactId>finmath-lib-automaticdifferentiation-extensions</artifactId>
    <version>1.1.0</version>
</dependency>

Gradle

// https://jarcasting.com/artifacts/net.finmath/finmath-lib-automaticdifferentiation-extensions/
implementation 'net.finmath:finmath-lib-automaticdifferentiation-extensions:1.1.0'

Gradle (Kotlin DSL)

// https://jarcasting.com/artifacts/net.finmath/finmath-lib-automaticdifferentiation-extensions/
implementation("net.finmath:finmath-lib-automaticdifferentiation-extensions:1.1.0")

Buildr

'net.finmath:finmath-lib-automaticdifferentiation-extensions:jar:1.1.0'

Ivy

<dependency org="net.finmath" name="finmath-lib-automaticdifferentiation-extensions" rev="1.1.0">
  <artifact name="finmath-lib-automaticdifferentiation-extensions" type="jar" />
</dependency>

Grape

@Grapes(
@Grab(group='net.finmath', module='finmath-lib-automaticdifferentiation-extensions', version='1.1.0')
)

SBT

libraryDependencies += "net.finmath" % "finmath-lib-automaticdifferentiation-extensions" % "1.1.0"

Leiningen

[net.finmath/finmath-lib-automaticdifferentiation-extensions "1.1.0"]

Dependencies

compile (1)

Group / Artifact Type Version
net.finmath : finmath-lib jar 3.2.17

test (1)

Group / Artifact Type Version
junit : junit jar 4.12

Project Modules

There are no modules declared in this project.

finmath-lib automatic differentiation extensions


Enabling finmath lib to utilize automatic differentiation algorithms (e.g. AAD).


This project implements stochastic automatic differentiation.

The implementation is fast, memory-efficient and thread-safe. It handles automatic differentiation of the conditional expectation (American Monte-Carlo), see http://ssrn.com/abstract=3000822.

The project provides an interface RandomVariableDifferentiableInterface for random variables that support automatic differentiation. The interface extends RandomVariableInterface and hence allows using auto-diff in all Monte-Carlo contexts (by replacing the corresponding parameters / factories).

The project also provides implementations of this interface, e.g. utilizing the backward (a.k.a. adjoint) method via RandomVariableDifferentiableAADFactory. This factory creates a random variable RandomVariableDifferentiableAAD which implements RandomVariableDifferentiableInterface.

All the backward automatic differentiation code is contained in RandomVariableDifferentiableAAD.
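
To illustrate the idea behind such an implementation: each arithmetic operation records its arguments and partial derivatives in an operator tree during valuation, and getGradient then sweeps this tree backwards, accumulating adjoints from the result down to the leaf nodes. The following is a minimal, self-contained sketch of this scheme on scalar doubles, for illustration only; it is not finmath's implementation, which applies the same idea to whole vectors of Monte-Carlo samples.

import java.util.ArrayList;
import java.util.HashMap;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

/*
 * Minimal scalar sketch of the backward (adjoint) method. Each operation
 * creates a new node holding its value, its arguments and the partial
 * derivatives with respect to those arguments.
 */
class AADDouble {
	private static long idCounter = 0;    // not thread-safe; a sketch only

	final long id = idCounter++;
	final double value;
	private final List<AADDouble> arguments;  // children in the operator tree (empty for leaf nodes)
	private final List<Double> partials;      // d(this)/d(argument_i), evaluated in the forward sweep

	AADDouble(double value) { this(value, List.of(), List.of()); }

	private AADDouble(double value, List<AADDouble> arguments, List<Double> partials) {
		this.value = value;
		this.arguments = arguments;
		this.partials = partials;
	}

	AADDouble add(AADDouble x)  { return new AADDouble(value + x.value, List.of(this, x), List.of(1.0, 1.0)); }
	AADDouble mult(AADDouble x) { return new AADDouble(value * x.value, List.of(this, x), List.of(x.value, value)); }
	AADDouble exp()             { double e = Math.exp(value); return new AADDouble(e, List.of(this), List.of(e)); }

	/* Backward sweep: propagate adjoints from this node down to all leaf nodes. */
	Map<Long, Double> getGradient() {
		// Collect the operator tree in topological order (arguments before results),
		// so every node is processed only after its adjoint is fully accumulated.
		List<AADDouble> order = new ArrayList<>();
		collect(this, order, new HashSet<>());

		Map<Long, Double> adjoint = new HashMap<>();
		adjoint.put(id, 1.0);    // d(this)/d(this) = 1
		for (int k = order.size() - 1; k >= 0; k--) {
			AADDouble node = order.get(k);
			double nodeAdjoint = adjoint.getOrDefault(node.id, 0.0);
			for (int i = 0; i < node.arguments.size(); i++) {
				adjoint.merge(node.arguments.get(i).id, nodeAdjoint * node.partials.get(i), Double::sum);
			}
		}
		return adjoint;
	}

	private static void collect(AADDouble node, List<AADDouble> order, Set<Long> visited) {
		if (!visited.add(node.id)) return;
		for (AADDouble argument : node.arguments) collect(argument, order, visited);
		order.add(node);
	}
}

For example, for y = x^2 + exp(x) at x = 2, the backward sweep yields dy/dx = 2x + exp(x):

AADDouble x = new AADDouble(2.0);
AADDouble y = x.mult(x).add(x.exp());
double dydx = y.getGradient().get(x.id);    // 2*2 + exp(2), approximately 11.389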

The interface RandomVariableInterface is provided by finmath-lib and specifies the arithmetic operations which may be performed on random variables, e.g.

RandomVariableInterface add(RandomVariableInterface randomVariable);
RandomVariableInterface mult(RandomVariableInterface randomVariable);
RandomVariableInterface exp();

// ...

The interface RandomVariableDifferentiableInterface introduces two additional methods:

Long getID();
Map<Long, RandomVariableInterface> getGradient();

The method getGradient will return a map providing the first-order differentiation of the given random variable (this) with respect to all its input RandomVariableDifferentiableInterfaces (leaf nodes). To get the derivative with respect to a specific object, use

/* Get the gradient of X with respect to all its leaf nodes: */
Map<Long, RandomVariableInterface> gradientOfX = X.getGradient();

/* Get the derivative of X with respect to Y: */
RandomVariableInterface derivative = gradientOfX.get(Y.getID());
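
For instance (a minimal sketch; the factory and the cast mirror the valuation example below), for Y = X·X the gradient entry for X should equal 2X:

RandomVariableDifferentiableAADFactory factory = new RandomVariableDifferentiableAADFactory();

RandomVariableDifferentiableInterface X = factory.createRandomVariable(2.0);
RandomVariableInterface Y = X.mult(X);    // Y = X^2, built from the arithmetic operations above

Map<Long, RandomVariableInterface> gradientOfY = ((RandomVariableDifferentiableInterface) Y).getGradient();
RandomVariableInterface dYdX = gradientOfY.get(X.getID());    // expected: 2 X, here the constant 4.0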

AAD on Cuda GPUs

It is possible to combine the automatic-differentiation-extensions with the cuda-extensions.

Using

AbstractRandomVariableFactory randomVariableFactory = new RandomVariableDifferentiableAADFactory();

will create a standard (CPU) random variable with automatic differentiation. Instead, using

AbstractRandomVariableFactory randomVariableFactory = new RandomVariableDifferentiableAADFactory(new RandomVariableCudaFactory());

will create a Cuda GPU random variable with automatic differentiation.

Example

The following sample code calculates valuation, delta, vega and rho for an almost arbitrary product (here a EuropeanOption) using AAD on the Monte-Carlo valuation:

RandomVariableDifferentiableAADFactory randomVariableFactory = new RandomVariableDifferentiableAADFactory();

// Generate independent variables (quantities w.r.t. which we would like to differentiate)
RandomVariableDifferentiableInterface initialValue	= randomVariableFactory.createRandomVariable(modelInitialValue);
RandomVariableDifferentiableInterface riskFreeRate	= randomVariableFactory.createRandomVariable(modelRiskFreeRate);
RandomVariableDifferentiableInterface volatility	= randomVariableFactory.createRandomVariable(modelVolatility);

// Create a model
AbstractModel model = new BlackScholesModel(initialValue, riskFreeRate, volatility);

// Create a time discretization
TimeDiscretizationInterface timeDiscretization = new TimeDiscretization(0.0 /* initial */, numberOfTimeSteps, deltaT);

// Create a corresponding MC process
AbstractProcess process = new ProcessEulerScheme(new BrownianMotion(timeDiscretization, 1 /* numberOfFactors */, numberOfPaths, seed));

// Using the process (Euler scheme), create an MC simulation of a Black-Scholes model
AssetModelMonteCarloSimulationInterface monteCarloBlackScholesModel = new MonteCarloAssetModel(model, process);

/*
 * Value a call option (using the product implementation)
 */
EuropeanOption europeanOption = new EuropeanOption(optionMaturity, optionStrike);
RandomVariableInterface value = europeanOption.getValue(0.0, monteCarloBlackScholesModel);

/*
 * Calculate sensitivities using AAD
 */
Map<Long, RandomVariableInterface> derivative = ((RandomVariableDifferentiableInterface)value).getGradient();
	
double valueMonteCarlo = value.getAverage();
double deltaAAD = derivative.get(initialValue.getID()).getAverage();
double rhoAAD = derivative.get(riskFreeRate.getID()).getAverage();
double vegaAAD = derivative.get(volatility.getID()).getAverage();
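
A common sanity check is to compare the AAD delta against a central finite difference, revaluing the option under bumped initial values. The following is a sketch reusing the variables from the example above; the bump size shift is an illustrative choice, and reusing the same seed gives common random numbers, which reduces the Monte-Carlo noise in the difference quotient.

double shift = 1E-5 * modelInitialValue;    // illustrative bump size

// Revalue under bumped initial values (a fresh Euler scheme per model, same Brownian seed)
AbstractModel modelUp = new BlackScholesModel(
		randomVariableFactory.createRandomVariable(modelInitialValue + shift), riskFreeRate, volatility);
AbstractModel modelDown = new BlackScholesModel(
		randomVariableFactory.createRandomVariable(modelInitialValue - shift), riskFreeRate, volatility);

AssetModelMonteCarloSimulationInterface simulationUp = new MonteCarloAssetModel(modelUp,
		new ProcessEulerScheme(new BrownianMotion(timeDiscretization, 1 /* numberOfFactors */, numberOfPaths, seed)));
AssetModelMonteCarloSimulationInterface simulationDown = new MonteCarloAssetModel(modelDown,
		new ProcessEulerScheme(new BrownianMotion(timeDiscretization, 1 /* numberOfFactors */, numberOfPaths, seed)));

double valueUp = europeanOption.getValue(0.0, simulationUp).getAverage();
double valueDown = europeanOption.getValue(0.0, simulationDown).getAverage();

double deltaFiniteDifference = (valueUp - valueDown) / (2 * shift);
// deltaFiniteDifference should agree with deltaAAD up to Monte-Carlo and bump error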

Versions

1.1.0
1.0.1
0.9.0
0.8.2
0.8.0
0.7.2
0.6.9
0.6.4
0.5.0
0.0.1