# Discrete Time Martingales

## Conditional expectation

Definition A Borel set is any set in a topological space that can be formed from open sets through the operations of:

• complement
• countable union
• countable intersection

Definition Let $Y$ be a random vector and $X$ be an integrable random variable with $\mathbb{E}|X|<\infty$. The conditional expectation of $X$ given $Y$ is the unique measurable function $f(Y)$ such that for every Borel set $\mathcal{B}$:

$$\mathbb{E}[X\,\textbf{1}_{\{Y \in \mathcal{B}\}}] = \mathbb{E}[f(Y)\,\textbf{1}_{\{Y \in \mathcal{B}\}}]$$

We denote $f(Y)$ as $\mathbb{E}(X|Y)$

Example 1 Suppose random variables $X$ and $Y$ are discrete. Then:

$$\mathbb{E}[X|Y=y] = \sum_x x\, P(X=x \mid Y=y)$$

Example 2 Suppose random variables $X$ and $Y$ are continuous, with joint probability density function $f_{X, Y}(x, y)$ and marginal densities $f_X(x)$ and $f_Y(y)$. Then:

$$\mathbb{E}[X|Y=y] = \int_{-\infty}^{\infty} x\, \frac{f_{X,Y}(x,y)}{f_Y(y)}\, dx$$
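A minimal discrete illustration (a hypothetical example of my choosing, not from the notes): with two fair dice, let $X$ be the first die and $Y$ their sum. $\mathbb{E}[X|Y=y]$ can be computed directly from the conditional pmf, and averaging it over $Y$ recovers $\mathbb{E}X$, illustrating the tower property below.

```python
# Hypothetical example: two fair dice, X = first die, Y = their sum.
# E[X | Y = y] is the average of X over equally likely outcomes with sum y.
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]  # uniform joint law

def cond_exp_X_given_Y(y):
    support = [a for (a, b) in outcomes if a + b == y]
    return sum(support) / len(support)

print(cond_exp_X_given_Y(7))  # 3.5, since X | Y=7 is uniform on {1,...,6}

# Tower property check: E[ E[X|Y] ] = E[X] = 3.5
tower = sum(cond_exp_X_given_Y(a + b) for (a, b) in outcomes) / 36
print(tower)  # 3.5
```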

Here are some basic properties of conditional expectation:

• Linearity: $\mathbb{E}[aX + bY|\mathcal{F}] = a\mathbb{E}[X|\mathcal{F}] + b\mathbb{E}[Y|\mathcal{F}]$
• Constant: if $X = a$, then $\mathbb{E}[X|\mathcal{F}] = a$
• Independence: if $X$ is independent of $\mathcal{F}$, then $\mathbb{E}[X|\mathcal{F}] = \mathbb{E}X$
• Tower Property: if $\mathcal{G} \subset \mathcal{F}$ then $\mathbb{E}[ \mathbb{E}[X | \mathcal{F}]|\mathcal{G}] = \mathbb{E}[X | \mathcal{G}]$
• Factorization Property: if Z is $\mathcal{F}$-measurable then $\mathbb{E}[ZX|\mathcal{F}] = Z\mathbb{E}[X|\mathcal{F}]$
• Monotonicity: if $X \leq Y$, then $\mathbb{E}[X|\mathcal{F}] \leq \mathbb{E}[Y|\mathcal{F}]$ a.s.

## $L^2$ Theory

Definition A $\boldsymbol{\sigma}$-algebra is a collection $\Sigma$ of subsets of a set $\Omega$ (with $\Omega \in \Sigma$) that is closed under:

• complement, i.e. if $A \in \Sigma$, then $\Omega\backslash A \in \Sigma$
• countable unions, i.e. if $A_n \in \Sigma$, then $\cup A_n \in \Sigma$

Definition $L^2(\Omega, \mathcal{F}, \mathbb{P})$ is the set of all $\mathcal{F}$-measurable square-integrable random variables $X$, i.e. those with finite second moment $\mathbb{E}X^2 < \infty$.

Definition A real Hilbert space is a real vector space $\mathcal{H}$ with an inner product $\langle\cdot,\cdot\rangle$, such that $\mathcal{H}$ is a complete metric space w.r.t. the metric $d$, where:

$$d(x, y) = \sqrt{\langle x-y,\, x-y\rangle}$$

Hilbert space examples: $\mathbb{R}^n$, with inner product $\langle\textbf{x}, \textbf{y}\rangle = \sum x_iy_i$; or $L^2(\Omega, \mathcal{F}, \mathbb{P})$, with inner product $\langle X_1, X_2\rangle = \mathbb{E}[X_1X_2]$. The reason we are interested in $L^2$ rather than $L^p$ for other $p$ is that the inner product $\mathbb{E}[X_1X_2]$ gives rise to a notion of orthogonality.

Proposition If $X \in L^2(\Omega, \mathcal{F}, \mathbb{P})$, then for any $\sigma$-algebra $\mathcal{G} \subset \mathcal{F}$, the conditional expectation $\mathbb{E}[X|\mathcal{G}]$ is the orthogonal projection of $X$ onto $L^2(\Omega, \mathcal{G}, \mathbb{P})$, such that:

$$\mathbb{E}\left[(X - \mathbb{E}[X|\mathcal{G}])\,Z\right] = 0, \quad \forall\, Z \in L^2(\Omega, \mathcal{G}, \mathbb{P})$$

Also, $\mathbb{E}[X|\mathcal{G}]$ can be interpreted as the $\mathcal{G}$-measurable random variable $Z$ that minimizes the mean square error $\mathbb{E}[(X - Z)^2]$.

## Martingales

Definition A filtration is an increasing sequence of $\sigma$-algebras $\mathcal{F}_n \subset \mathcal{F}$, where $\mathcal{F}$ is the $\sigma$-algebra of all events.

Definition A martingale (relative to a filtration $\mathcal{F}_n$) is a sequence of adapted, integrable random variables $X_{n}$ such that:

$$\mathbb{E}[X_{n+1}|\mathcal{F}_n] = X_n$$

The tower property implies that $\mathbb{E}X_n = \mathbb{E}X_0$.

Example 1 Given i.i.d. random variables $X_n \in L^2$ with $\mathbb{E}X_n = 0$ and variance $\sigma^2$,

• the sequence $S_n = \sum_{i=1}^n X_i$, and
• the sequence $T_n = (\sum_{i=1}^n X_i)^2 - n\sigma^2$

are both martingales.
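A Monte Carlo sanity check of the two examples above (a sketch; the step distribution $X_i = \pm 1$, so $\sigma^2 = 1$, and the sample sizes are my choices): since $\mathbb{E}S_n = \mathbb{E}S_0 = 0$ and $\mathbb{E}T_n = \mathbb{E}T_0 = 0$, both empirical means should be near 0.

```python
import random

# Sketch: with X_i = +/-1 (sigma^2 = 1), both S_n and T_n = S_n^2 - n
# should have constant expectation 0.
random.seed(42)

def sample_means(n_steps, n_paths):
    s_total = t_total = 0.0
    for _ in range(n_paths):
        s = sum(random.choice((-1, 1)) for _ in range(n_steps))
        s_total += s
        t_total += s * s - n_steps
    return s_total / n_paths, t_total / n_paths

mean_S, mean_T = sample_means(n_steps=50, n_paths=20000)
print(mean_S, mean_T)  # both near 0 up to Monte Carlo error
```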

Example 2 Let $X$ be any $L^1$ random variable and $\mathcal{F}_n$ be any filtration. Then the sequence $X_n := \mathbb{E}[X|\mathcal{F}_n]$ is a closed martingale.

Note that the St. Petersburg martingale $X_n$ is not closed, where $X_0 \in \mathbb{R}$ and $P(X_{n}=2X_{n-1}) = 1/2$ and $P(X_{n}=0) = 1/2$. This is because there is no $L^{1}$ random variable $X$ with $X_n = \mathbb{E}[X|\mathcal{F}_n]$: indeed $X_n \rightarrow 0$ a.s., while $\mathbb{E}X_n = X_0$ for all $n$.

Example 3 Given i.i.d. random variables $X_n$ with moment generating function $M_{X}(\theta) = \mathbb{E}e^{\theta X}$ and partial sums $S_n = \sum_{i=1}^n X_i$, the exponential martingale $Z_{n}$ is a positive martingale with definition:

$$Z_n := \frac{e^{\theta S_n}}{M_X(\theta)^n}$$

## Doob’s Identity

Definition A sequence $Z_n$ of random variables is predictable with respect to filtration $\mathcal{F}_n$ if $Z_n$ is measurable with respect to $\mathcal{F}_{n-1}$

Definition A sequence $Z_n$ of random variables is adapted to filtration $\mathcal{F}_n$ if $Z_n$ is measurable with respect to $\mathcal{F}_{n}$

Proposition If $X_n$ is a martingale with $X_0 = 0$ and $Z_n$ is a predictable sequence of bounded random variables, then the martingale transform $\{Z \cdot X\}_n$ is a martingale:

$$\{Z \cdot X\}_n := \sum_{i=1}^n Z_i(X_i - X_{i-1})$$

Definition A stopping time with respect to filtration $\mathcal{F}$ is a random variable $T \in \mathbb{N} \cup \{\infty\}$ such that $\{T = n\} \in \mathcal{F}_{n} \;\forall\; n \geq 0$

Lemma Let $T$ be a stopping time, then the sequence $Z_n := \textbf{1}_{T \geq n}$ is predictable.

Theorem Let $X_{n}$ be a martingale and $T$ be a stopping time. For all $m \in \mathbb{N}$, Doob’s Identity states that $\mathbb{E}X_{T\wedge m} = \mathbb{E}X_{0}$. Note that if $|X_{T\wedge m}|$ is bounded for all $m$, DCT shows that $\mathbb{E}X_{T} = \mathbb{E}X_{0}$.

Proof. $X_{T\wedge n}$ is a martingale, since with the predictable sequence $Z_n := \textbf{1}_{T \geq n}$:

$$X_{T\wedge n} = X_0 + \{Z \cdot X\}_n = X_0 + \sum_{i=1}^n \textbf{1}_{T \geq i}(X_i - X_{i-1})$$

Theorem Let $f_n$ be a sequence of functions on a measure space $(\mathcal{S}, \Sigma, \mu)$ that converge point-wise to a function $f$. For the conclusion $\lim_{n \rightarrow \infty} \int_{\mathcal{S}} f_{n}d\mu = \int_{\mathcal{S}} f d\mu$ to hold:

• The Dominated Convergence Theorem (DCT) requires $f_{n}$ to be dominated by an integrable function $g$: $|f_{n}(x)| \leq g(x)$

• The Monotone Convergence Theorem (MCT) requires $f_{n}$ to be monotone (increasing or decreasing): $f_{1} \leq f_{2} \leq f_{3} \leq ...$ or $f_{1} \geq f_{2} \geq f_{3} \geq ...$

Example 1 Let $S_{n} = \sum X_{i}$ be a simple random walk with $X_{i} = \pm 1$. Let stopping time $T := \min\{n: S_{n} = A \;\text{or}\; S_n = -B\}$, where $A, B>0$.

We know that $S_{n}$ is a martingale and $|S_{T\wedge n}| \leq \max(A, B)$. Applying Doob’s Identity and DCT we have:

$$\mathbb{E}S_{T} = A\,P(S_T = A) - B\,P(S_T = -B) = 0$$

We know that $S_{n}^2 - n$ is a martingale. Applying Doob’s Identity we have $\mathbb{E}S_{T\wedge n}^2 = \mathbb{E}(T\wedge n)$. Since $S_{T\wedge n}^2$ is bounded by $\max(A^2, B^2)$ and $T\wedge n$ is monotone, applying DCT on the LHS and MCT on the RHS we get:

$$\mathbb{E}S_{T}^2 = \mathbb{E}T$$

Combining both results we get some interesting results for the Gambler’s Ruin problem:

$$P(S_T = A) = \frac{B}{A+B}, \qquad \mathbb{E}T = AB$$
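The Gambler's Ruin conclusions can be checked by simulation (a sketch; $A = 3$, $B = 5$ and the trial count are my choices): the walk should exit at $+A$ with probability $B/(A+B) = 0.625$ and take $AB = 15$ steps on average.

```python
import random

# Sketch: simple random walk stopped on exiting (-B, A); check
# P(S_T = A) = B/(A+B) and E[T] = A*B by Monte Carlo.
random.seed(1)
A, B, n_trials = 3, 5, 20000
wins = 0
total_T = 0
for _ in range(n_trials):
    s = t = 0
    while -B < s < A:
        s += random.choice((-1, 1))
        t += 1
    wins += (s == A)
    total_T += t
print(wins / n_trials, total_T / n_trials)  # near 0.625 and 15
```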

Example 2 Let $S_{n}$ be a simple random walk. Let stopping time $T := \min\{n: S_{n} = A\}$, where $A>0$. Note that now DCT fails as $S_{T\wedge n}$ is not bounded below. Hence $\mathbb{E}S_{T} \neq \mathbb{E}S_0 = 0$.

In fact, $\mathbb{E}S_{T} = A$ because $T < \infty$ a.s. and $S_{T} \equiv A$.

## Doob’s Maximal Inequality

Definition An adapted sequence of random variables $X_n$ is a:

• sub-martingale if $\mathbb{E}[X_n | \mathcal{F}_{n-1}] \geq X_{n-1}$
• super-martingale if $\mathbb{E}[X_n | \mathcal{F}_{n-1}] \leq X_{n-1}$

Proposition If $\varphi :\mathbb{R} \rightarrow \mathbb{R}$ is a convex function and $X_n$ is a martingale, then:

• Jensen’s Inequality holds: $\varphi(\mathbb{E}X) \leq \mathbb{E}\varphi(X)$
• the sequence $\varphi(X_n)$ is a sub-martingale.

Proposition If $X_n$ is a sub-martingale with $X_0 = 0$ and $Z_n$ is a predictable sequence of bounded, non-negative random variables, then the martingale transform $\{Z \cdot X\}_n := \sum_{i=1}^n Z_i(X_i - X_{i-1})$ is a sub-martingale.

Proposition If $X_n$ is a sub-martingale with $X_0 = 0$ and $Z_n$ is a predictable sequence of random variables such that $Z_n \in [0, 1]$, then $\mathbb{E}\{Z \cdot X\}_n \leq \mathbb{E}X_n$

Corollary If $X_n$ is a non-negative sub-martingale with initial term $X_0 = 0$, then Doob’s Maximal Inequality claims that for any $\alpha > 0$:

$$P\left[\max_{k \leq n} X_k \geq \alpha\right] \leq \frac{\mathbb{E}X_n}{\alpha}$$

and that:

$$\mathbb{E}\left[\left(\max_{k \leq n} X_k\right)^2\right] \leq 4\,\mathbb{E}X_n^2$$

Note that this is a big improvement on the Chebyshev Inequality, which claims that given an $L^2$-bounded random variable $X$ and any $k\in\mathbb{R}^+$:

$$P\left[|X - \mathbb{E}X| \geq k\right] \leq \frac{\text{Var}(X)}{k^2}$$
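As a numeric illustration (a sketch, with parameters of my choosing): $|S_n|$ for a simple random walk $S_n$ is a non-negative sub-martingale by Jensen's inequality, so the probability that its running maximum reaches a level $\alpha$ should be bounded by $\mathbb{E}|S_n|/\alpha$.

```python
import random

# Sketch: |S_n| is a non-negative sub-martingale (Jensen with phi = |.|);
# check P[max_{k<=n} |S_k| >= alpha] <= E|S_n| / alpha by Monte Carlo.
random.seed(7)
n, alpha, n_paths = 100, 15, 5000
exceed = 0
abs_end = 0.0
for _ in range(n_paths):
    s = running_max = 0
    for _ in range(n):
        s += random.choice((-1, 1))
        running_max = max(running_max, abs(s))
    exceed += (running_max >= alpha)
    abs_end += abs(s)
lhs = exceed / n_paths
rhs = (abs_end / n_paths) / alpha
print(lhs, rhs, lhs <= rhs)
```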

## Martingale Convergence Theorem

Definition A sequence $x_i$ of real numbers is called a Cauchy sequence if for every positive real number $\epsilon$, there is a positive integer $N$ such that for all natural numbers $m, n > N$, $|x_m - x_n| \leq \epsilon$

Proposition $L^2$ martingales have orthogonal increments. Given an $L^2$ martingale $X_n$ with increments $\xi_n := X_n - X_{n-1}$ and $X_0 = 0$:

• $\mathbb{E}\xi_n\xi_{m} = 0$, $\forall \;n < m$, and
• $\mathbb{E}X_n^2 = \sum_{i=1}^n \mathbb{E}\xi_i^2$

Theorem Suppose $X_n$ is an $L^1$-bounded martingale; then there exists an $L^1$-bounded random variable $X_{\infty}$ such that:

$$\lim_{n \rightarrow \infty} X_n = X_{\infty} \;\text{a.s.}$$

Theorem Suppose $X_n$ is an $L^2$-bounded martingale; then there exists an $L^2$-bounded random variable $X_{\infty}$ such that:

(1) $\lim_{n \rightarrow \infty} X_n = X_{\infty} \;\text{a.s.}$
(2) $\lim_{n \rightarrow \infty} \mathbb{E}|X_n - X_{\infty}|^2 = 0$, and $\lim_{n \rightarrow \infty} \mathbb{E}X_n^2 = \mathbb{E}X_{\infty}^2$

## Change Of Measure

Proposition Given a probability measure $P$ and a non-negative random variable $Z$ satisfying $\mathbb{E}_{P}Z = 1$, there exists a probability measure $Q$ such that for any bounded or non-negative random variable $Y$, $\mathbb{E}_QY = \mathbb{E}_P[YZ]$. $Z$ is called the likelihood ratio of probability measure $Q$ w.r.t. $P$, written as $Z = dQ/dP$, and:

$$Q(A) = \mathbb{E}_P[Z\,\textbf{1}_A] \quad \text{for every event } A$$

Proposition If the outcome space $\Omega$ is finite, then for each outcome $\omega \in \Omega$, $Q(\omega) = P(\omega)Z(\omega)$

Example 1 Consider an $N$-period market with a finite set of outcomes and tradable assets. Let $P, Q$ denote the risk-neutral measures for USD and EUR investors. Let $S_t^i, \tilde{S}_t^i$ denote the USD and EUR prices of the risk-less (w.r.t. its own measure) asset $B^i$ at time $t$. Then $dP/dQ = S^1_0/S^1_N$

Proof. By the fundamental theorem, $\tilde{S}_t^i = \mathbb{E}_Q[\tilde{S}_N^i|\mathcal{F}_t]$, and $\tilde{S}_t^i=S_t^i/S_t^1$, so:

Theorem Let $P$ and $Q$ be two probability measures on the same measurable space, and let $\mathcal{F}_n$ be a filtration such that for all $n$, $Q$ is absolutely continuous w.r.t. $P$ on $\mathcal{F}_n$. Then the sequence of likelihood ratios $L_n := (dQ/dP)|_{\mathcal{F}_n}$ is a martingale under $P$.

# Brownian Motion

## Standard Brownian Motion

Definition A standard Brownian motion (SBM) is a continuous-time random process $B_t$ such that $B_0 = 0$ and:
(a) $B_t$ has stationary increments.
(b) $B_t$ has independent increments.
(c) The sample paths $t \rightarrow B_t$ are continuous.

Note that (a), (b), and (c) imply that for some constant $\sigma^2>0$ the distribution of $B_{t+s}-B_s$ is $\mathcal{N}(0, \sigma^2t)$; for a standard Brownian motion, $\sigma^2 = 1$.

Definition Given a SBM $B_t$, $W_t = \mu t + \sigma B_t$ is a Brownian motion with drift $\mu$ and variance $\sigma^2$.

Proposition Given a SBM $B_t$, its reflection $-B_t$ is also a SBM.

Proposition Given a SBM $B_t$, then for any $\alpha \in\mathbb{R}^+$, $\tilde{B}_t := B_{\alpha t}/\sqrt{\alpha}$ is a SBM

Definition The $n$th level quadratic variation of a function $f: [0,t] \rightarrow \mathbb{R}$ is the sum of squares of the increments across intervals of length $2^{-n}$:

$$QV^{(n)}(f) := \sum_{k=1}^{\lceil 2^n t\rceil} \left(f(k2^{-n}) - f((k-1)2^{-n})\right)^2$$

Theorem Given a Brownian motion $W_t$ with drift $\mu$ and variance $\sigma^2 > 0$, then for all $t>0$, with probability $1$:

$$\lim_{n \rightarrow \infty} QV^{(n)}(W) = \sigma^2 t$$

## Strong Markov Property

Definition Given a SBM $B_t$, a stopping time is a non-negative random variable $T$ such that for every fixed $t \geq 0$, the event $\{T \leq t\}$ depends only on the path $\{B_s\}_{s\leq t}$

Theorem If $B_t$ is a Brownian motion and $T$ is a stopping time then the strong Markov property holds:
(a) the process $\{B_{t+T} - B_T\}_{t\geq 0}$ is a Brownian motion, and
(b) the process $\{B_{t+T} - B_T\}_{t\geq 0}$ is independent of the path $\{B_s\}_{s\leq T}$

Theorem Run a Brownian motion $W_t$; at the first time $\tau$ that $W_{\tau} = a > 0$, reflect the path in the line $y=a$. By the reflection principle, the new process $W^{\ast}_t$ is another Brownian motion:

• for $t \leq \tau$, $W^{\ast}_t = W_t$
• for $t > \tau$, $W^{\ast}_t = 2a - W_t$

Corollary $P[\tau \leq s] = 2P[W_s > a]$

Corollary $M_t := \max_{s \leq t} W_s$ has the same distribution as $|W_t|$

Corollary $-M_t^- := -\min_{s \leq t} W_s$ has the same distribution as $M_t$. Hence $P[M_t>a]=P[M_t^-<-a]=2P[W_t>a]>0$. Consequently, for every $t>0$, with probability 1, $M_t>0$ and $M_t^-<0$. Therefore for every $\epsilon>0$, the Brownian path crosses the $t$-axis infinitely many times by time $\epsilon$

## Martingales In Continuous Time

Definition A filtration is a nested family of $\sigma$-algebras indexed by time $t$.

Definition The natural filtration for a Brownian motion $W_t$ is the filtration with $\mathcal{F}_t$ the collection of all events determined by the Brownian path up to time $t$.

Definition A continuous-time stochastic process $X_t$ is a martingale relative to a filtration $\{\mathcal{F}_t\}_{t\geq 0}$ if:
(a) each random variable $X_t$ is measurable w.r.t. $\mathcal{F}_t$, and
(b) for any $s, t\geq 0$, $\mathbb{E}(X_{t+s}|\mathcal{F}_t)=X_t$

Proposition Given a SBM $B_t$ then each of these is a martingale relative to the natural filtration:
(a) $B_t$
(b) $B_t^2 - t$
(c) $e^{\theta B_t - \theta^2t/2}$

Theorem Define $P_{\theta}$ to be the probability measure with likelihood ratio $Z_t^{\theta} = dP_{\theta}/dP_0= e^{\theta B_t - \theta^2t/2}$. The Cameron-Martin theorem states that the SBM $B_t$ under $P_0$ is a Brownian motion with drift $\theta$ and variance $\sigma^2=1$ under $P_{\theta}$.

Corollary For any real value $\theta, \eta$ and $t <\infty$

Corollary For any stopping time $\tau$ and $T <\infty$,

# Ito Calculus

## Ito Integral

Definition If $X_t$ is a uniformly bounded process with continuous paths $t \rightarrow X_t$ adapted to $\mathcal{F}_t$ then we can define an Ito Integral $I_t(X)$, where $X^{(n)}$ is $X$ truncated at $\pm n$:

Property The Ito Integral satisfies the following properties:
(1) Linearity: $\int (aX_s +bY_s)dW_s = a\int X_sdW_s + b\int Y_sdW_s$.
(2) Continuity: the paths $t \rightarrow \int_0^t X_sdW_s$ are continuous.
(3) Mean Zero: $\mathbb{E} \int_0^t X_sdW_s = 0$
(4) Variance, a.k.a. Ito Isometry:

$$\mathbb{E}\left(\int_0^t X_sdW_s\right)^2 = \int_0^t \mathbb{E}X_s^2\, ds$$

Definition Define the quadratic variation of the Ito Integral:

$$[I_t(X), I_t(X)] := \int_0^t X_s^2\, ds$$

Proposition
(a) The process $I_t(X)$ is a martingale
(b) The process $I_t(X)^2 - [I_t(X), I_t(X)]$ is a martingale

Example $\int_0^T W_sdW_s = (W_T^2 - T)/2$
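This identity can be checked numerically (a sketch; the horizon $T = 1$ and step count are my choices): the left-endpoint Riemann sums that define the Ito integral converge path-by-path to $(W_T^2 - T)/2$.

```python
import random, math

# Sketch: left-endpoint Riemann sums for the Ito integral of W against itself
# agree with (W_T^2 - T)/2 up to discretization error.
random.seed(3)
T, n = 1.0, 100_000
dt = T / n
w = ito_sum = 0.0
for _ in range(n):
    dw = random.gauss(0.0, math.sqrt(dt))
    ito_sum += w * dw   # left endpoint: the integrand is the current value of W
    w += dw
print(ito_sum, (w * w - T) / 2)  # the two values nearly coincide
```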

Example For any stopping time $\tau$ and any $t < \infty$:

Theorem Let $W_t$ be a SBM and let $\mathcal{F}_t$ be the $\sigma$-algebra of all events determined by the path $\{W_s\}_{s\leq t}$. If $Y_t$ is any random variable with mean 0 and finite variance that is measurable with respect to $\mathcal{F}_t$, for some $t > 0$, then the Ito representation theorem claims that there exists an adapted process $A_s$ such that:

$$Y_t = \int_0^t A_s\, dW_s$$

This theorem is of importance in finance because it implies that in the Black-Scholes setting, every contingent claim can be hedged.

## Ito Formula

Theorem Let $W_t$ be a SBM, and let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a twice-continuously differentiable function such that $f, f', f''$ are all bounded (or at most have exponential growth). Then for any $t > 0$:

$$f(W_t) = f(W_0) + \int_0^t f'(W_s)\,dW_s + \frac{1}{2}\int_0^t f''(W_s)\,ds$$

Theorem Let $W_t$ be a SBM, and let $U: [0, \infty) \times \mathbb{R} \rightarrow \mathbb{R}$ be a twice-continuously differentiable function whose partial derivatives are all bounded. Then for any $t > 0$:

$$U(t, W_t) = U(0, W_0) + \int_0^t \partial_x U(s, W_s)\,dW_s + \int_0^t \left(\partial_t U(s, W_s) + \frac{1}{2}\partial_{xx} U(s, W_s)\right)ds$$

Proposition Assume $f(t)$ is nonrandom and continuously differentiable. Then:

$$\int_0^t f(s)\,dW_s = f(t)W_t - \int_0^t f'(s)W_s\,ds$$

## Ito Process

Definition An Ito process is a stochastic process $X_t$ that satisfies a stochastic differential equation of the form:

$$dX_t = A_t\,dt + B_t\,dW_t$$

Equivalently, $X_t$ satisfies the stochastic integral equation:

$$X_t = X_0 + \int_0^t A_s\,ds + \int_0^t B_s\,dW_s$$

Definition For any adapted process $U_t$ define:

$$\int_0^t U_s\,dX_s := \int_0^t U_s A_s\,ds + \int_0^t U_s B_s\,dW_s$$

Theorem Let $X_t$ be an Ito process, and let $U$ be a twice-continuously differentiable function whose partial derivatives are all bounded. Then:

$$U(t, X_t) = U(0, X_0) + \int_0^t \partial_x U(s, X_s)\,dX_s + \int_0^t \partial_t U(s, X_s)\,ds + \frac{1}{2}\int_0^t \partial_{xx} U(s, X_s)\,d[X, X]_s$$

### The Ornstein-Uhlenbeck Process

Definition The Ornstein-Uhlenbeck SDE: $dX_t = -\alpha X_t dt + dW_t$
(a) This SDE describes a process $X_t$ that has a proportional tendency to return to an “equilibrium” position 0.
(b) In finance, the OU process is often called the Vasicek model.
(c) Solving the SDE: $X_t = e^{-\alpha t}X_0 + e^{-\alpha t} \int_0^t e^{\alpha s}dW_s$
(d) The Ornstein-Uhlenbeck process is Gaussian.
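A small Euler-Maruyama simulation (a sketch; $\alpha$, $X_0$, the horizon, and the discretization are my choices): from the explicit solution and the Ito isometry, $X_t$ is Gaussian with mean $e^{-\alpha t}X_0$ and variance $(1 - e^{-2\alpha t})/(2\alpha)$, which the simulated endpoints should match.

```python
import random, math

# Sketch: Euler-Maruyama for dX = -alpha*X dt + dW; compare the simulated
# endpoint mean/variance with the Gaussian law implied by the solution.
random.seed(5)
alpha, x0, T = 1.0, 2.0, 1.0
n_steps, n_paths = 200, 10000
dt = T / n_steps
endpoints = []
for _ in range(n_paths):
    x = x0
    for _ in range(n_steps):
        x += -alpha * x * dt + random.gauss(0.0, math.sqrt(dt))
    endpoints.append(x)
mean = sum(endpoints) / n_paths
var = sum((x - mean) ** 2 for x in endpoints) / n_paths
print(mean, math.exp(-alpha * T) * x0)                     # means should match
print(var, (1 - math.exp(-2 * alpha * T)) / (2 * alpha))   # variances should match
```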

### The Exponential Martingale

Definition The Exponential Martingale SDE: $dX_t = \theta X_t dW_t$
(a) Solving the SDE: $X_t = Ce^{\theta W_t - \theta^2t/2}$

### The Diffusion Process

Definition The Diffusion SDE: $dX_t = \mu(X_t)dt+ \sigma (X_t)dW_t$

Definition A Harmonic Function (for the diffusion) is a function $f(x)$ that satisfies the ODE:

$$\mu(x)f'(x) + \frac{1}{2}\sigma^2(x)f''(x) = 0$$

Example Let $X_t$ be a solution of the diffusion SDE with initial value $X_0 = x_0$, and for any real numbers $A \leq x_0 \leq B$ let $\tau := \min\{t: X_t \notin (A, B)\}$. Find $P(X_{\tau} = B)$.

We first apply the Ito Formula to $df(X_t)$ and observe that a harmonic function $f$ will force the $dt$ term to vanish. Therefore $f(X_t)$ is a martingale and $\mathbb{E}f(X_{\tau}) = f(x_0)$:

$$P(X_{\tau} = B) = \frac{f(x_0) - f(A)}{f(B) - f(A)}$$

We can solve for $f(x)$:

$$f(x) = \int_{A}^{x} \exp\left(-\int_{A}^{y} \frac{2\mu(z)}{\sigma^2(z)}\,dz\right)dy$$

### The Diffusion Process - Bessel Process

Definition The Diffusion SDE: $dX_t = (a/X_t)\,dt+ dW_t$

Example Similar problem as above:

Note that if $x_0 > 0$ and $a \geq 1/2$ then $X_t$ will never reach $0$.

## Ito Formula - Multi-Variable

Theorem Let $\textbf{W}_t =(W_t^1,W_t^2,...,W_t^K)$ be a K-dimensional SBM, and let $u: \mathbb{R}^K \rightarrow \mathbb{R}$ be a $C^2$ function with bounded first and second partial derivatives. Then the Ito Formula states:

$$u(\textbf{W}_t) = u(\textbf{W}_0) + \int_0^t \nabla u(\textbf{W}_s)\cdot d\textbf{W}_s + \frac{1}{2}\int_0^t \triangle u(\textbf{W}_s)\,ds$$

Where:

$$\triangle u := \sum_{k=1}^K \frac{\partial^2 u}{\partial x_k^2}$$

Corollary If $\tau$ is a stopping time for the SBM $\textbf{W}_t$ then Dynkin’s Formula shows that for any fixed time $t$:

$$\mathbb{E}u(\textbf{W}_{t\wedge\tau}) = u(\textbf{W}_0) + \frac{1}{2}\,\mathbb{E}\int_0^{t\wedge\tau} \triangle u(\textbf{W}_s)\,ds$$

And that $u(\textbf{W}_t) - \dfrac{1}{2} \int_0^{t} \triangle u(\textbf{W}_s)ds$ is a martingale

Definition A $C^2$ function $u: \mathbb{R}^K \rightarrow \mathbb{R}$ is said to be a Harmonic Function in a region $\mathcal{U}$ if $\triangle u(x)=0, \; \forall x \in \mathcal{U}$

(a) 2D Harmonic Function Example: $u(x,y)=\log(x^2 +y^2)=2\log r$
(b) 3D Harmonic Function Example: $u(x,y,z)=1/\sqrt{x^2 +y^2 +z^2} =1/r$

Corollary Let $u$ be harmonic in a bounded open region $\mathcal{U}$, and assume that $u$ and its partials extend continuously to the boundary $\partial\mathcal{U}$. Define $\tau$ to be the first exit time of Brownian motion from $\mathcal{U}$, then:

(a) the process $u(W_{t\wedge\tau})$ is a martingale, and
(b) for every $x \in \mathcal{U}$, $\;\mathbb{E}^xu(W_{\tau}) = u(x)$

Example If a 2D SBM starts at a point on the circle $C_1$ of radius 1, find the probability $p$ that it hits the concentric circle $C_2$ of radius 2 before the circle $C_{1/2}$ of radius 1/2.

Let $u(x, y) = 2\log r$ be harmonic. Then $u(W_{t\wedge\tau})$ is a martingale and $\mathbb{E} u(W_{\tau}) = u(W_0) = 2\log(1) = 0$. Hence $p\cdot 2\log 2 + (1-p)\cdot 2\log(1/2) = 0$, giving $p = 1/2$.
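A Monte Carlo check of this 2D example (a sketch; step size and trial count are my choices, and the discrete walk slightly overshoots the circles): starting on the unit circle, the SBM should hit radius 2 before radius 1/2 with probability close to $1/2$, the value implied by optional stopping for $u = 2\log r$.

```python
import random, math

# Sketch: 2D SBM from (1, 0); stop on exiting the annulus 1/2 < r < 2 and
# estimate the probability of exiting through the outer circle.
random.seed(11)
dt = 1e-3
sd = math.sqrt(dt)
n_trials = 2000
hits_outer = 0
for _ in range(n_trials):
    x, y = 1.0, 0.0                      # start on the circle of radius 1
    while True:
        x += random.gauss(0.0, sd)
        y += random.gauss(0.0, sd)
        r2 = x * x + y * y
        if r2 >= 4.0:                    # hit C_2 first
            hits_outer += 1
            break
        if r2 <= 0.25:                   # hit C_{1/2} first
            break
print(hits_outer / n_trials)  # near 1/2
```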

Example If a 3D SBM starts at a point on the sphere $C_1$ of radius 1, find the probability $p$ that it hits the concentric sphere $C_2$ of radius 2 before the sphere $C_{1/2}$ of radius 1/2.

Let $u(x, y, z) = 1/r$ be harmonic. Then $u(W_{t\wedge\tau})$ is a martingale and $\mathbb{E} u(W_{\tau}) = u(W_0) = 1/1 = 1$. Hence $p\cdot(1/2) + (1-p)\cdot 2 = 1$, giving $p = 2/3$.

## Ito Process - Multi-Variable

Definition An Ito process is a continuous-time stochastic process $X_t$ of the form:

$$dX_t = A_t\,dt + \textbf{N}_t \cdot d\textbf{W}_t$$

Where the quadratic variation $d[X_t, X_t] = \textbf{N}_t \cdot \textbf{N}_t\, dt$

Let $\textbf{X}_t = (X^1_t,X^2_t,...,X^m_t)$ be a vector of Ito processes. For any $C^2$ function $u:\mathbb{R}^m \rightarrow \mathbb{R}$ with bounded first and second partial derivatives:

$$u(\textbf{X}_t) = u(\textbf{X}_0) + \sum_{i=1}^m \int_0^t \frac{\partial u}{\partial x_i}(\textbf{X}_s)\,dX^i_s + \frac{1}{2}\sum_{i=1}^m\sum_{j=1}^m \int_0^t \frac{\partial^2 u}{\partial x_i \partial x_j}(\textbf{X}_s)\,d[X^i, X^j]_s$$

Theorem Let $\textbf{W}_t$ be a K-dimensional SBM, and let $\textbf{U}_t$ be an adapted, K-dimensional process satisfying $|\textbf{U}_t|=1, \;\;\forall t \geq 0$. Then Knight’s Theorem states that the 1-dimensional Ito process $X_t$ is a SBM:

$$X_t := \int_0^t \textbf{U}_s \cdot d\textbf{W}_s$$

Proposition Let $\textbf{W}_t$ be a K-dimensional SBM. Define $R_t := |\textbf{W}_t|$ to be the radial part of $\textbf{W}_t$. Then $R_t$ is a Bessel process with parameter $a = (K-1)/2$:

$$dR_t = \frac{K-1}{2R_t}\,dt + dB_t$$

for some SBM $B_t$.

# Barrier Option

## Pricing

Definition A barrier option at time $T$ pays:
(a) $\$1$ if $\max_{0 \leq t \leq T}\;S_t \geq AS_0$,
(b) $\$0$ otherwise.

Assume that $S_t$ follows a GBM:

$$dS_t = rS_t\,dt + \sigma S_t\,dW_t$$

The no-arbitrage price $V_0$ of the barrier option at $t=0$ is the discounted expected payoff:

$$V_0 = e^{-rT}\,P\left[\max_{0 \leq t \leq T} S_t \geq AS_0\right]$$

At time $t$, there are two possibilities:
(a) if $\max_{0 \leq r \leq t}\;S_r \geq AS_0$, then $V_t = e^{-r(T-t)}$
(b) if $\max_{0 \leq r \leq t}\;S_r < AS_0$, then $V_t$ is the same as the time-$0$ value $V_0$ of a barrier option with time-to-maturity $T-t$ and barrier multiple $A'=AS_0/S_t$

## Hedging

Let $v(t, S_t)$ be the value of the barrier option at time $t$. The Fundamental Theorem and Ito Formula show that $v(t, S_t)$ satisfies the Black-Scholes PDE:

$$\partial_t v + rS\,\partial_S v + \frac{1}{2}\sigma^2S^2\,\partial_{SS} v = rv$$

A replicating portfolio for the barrier option holds
(a) $v_S$ shares of stock, and
(b) $e^{-rt}(v - v_SS)$ shares of the money-market account,

provided that $S_t < AS_0$. Once $S_t\geq AS_0$, the portfolio converts all holdings to cash and holds them until maturity.

# The Black-Scholes

## The Black-Scholes Formula

Theorem Under a risk-neutral measure $P$, the Fundamental Theorem asserts that the discounted share price $S_t/M_t$ is a martingale, where:

$$dS_t = \mu_t S_t\,dt + \sigma S_t\,dW_t, \qquad dM_t = r_t M_t\,dt$$

Therefore $\mu_t \equiv r_t$:

$$dS_t = r_t S_t\,dt + \sigma S_t\,dW_t$$

Definition A European contingent claim with expiration date $T > 0$ and payoff function $f: \mathbb{R}\rightarrow\mathbb{R}$ is a tradeable asset with:
(a) share price at time $T$: $f(S_T)$
(b) discounted share price at time $t < T$: $\mathbb{E}[f(S_T)/M_T | \mathcal{F}_t]$

Proposition Let $W_t$ be a standard Brownian motion and $g:\mathbb{R}\rightarrow\mathbb{R}$ a function such that $\mathbb{E}|g(W_T)| < \infty$. Then for every $0 \leq t \leq T$:

$$\mathbb{E}[g(W_T) \,|\, \mathcal{F}_t] = h(t, W_t), \qquad h(t,x) := \int_{-\infty}^{\infty} g\left(x + y\sqrt{T-t}\right)\frac{e^{-y^2/2}}{\sqrt{2\pi}}\,dy$$

Corollary Given $dS_t = r_t S_tdt + \sigma S_tdW_t$, the Black-Scholes Formula shows that the time-$t$ price of the claim $f(S_T)$ is:

$$u(t, S_t) = \mathbb{E}\left[e^{-\int_t^T r_s\,ds} f(S_T)\,\Big|\,\mathcal{F}_t\right]$$

Under the risk-neutral $P$, the discounted option price $u(t,S_t)/M_t$ is a martingale. With the Ito Formula we can set the drift of $d(u/M)$ to zero and therefore derive the Black-Scholes PDE:

$$\partial_t u + rS\,\partial_S u + \frac{1}{2}\sigma^2S^2\,\partial_{SS} u = ru$$

## Hedging In Continuous Times

Definition A portfolio $V_t = \alpha_t M_t + \beta_t S_t$ is self-financing if $dV_t = \alpha_t dM_t + \beta_t dS_t$ for all $t \leq T$

Proposition A portfolio $V_t$ is self-financing if and only if its discounted value $V_t/M_t$ is a martingale and satisfies:

$$d(V_t/M_t) = \beta_t\, d(S_t/M_t)$$

Definition A replicating portfolio $V_t$ for a payoff function $f(S_T)$ is a self-financing portfolio such that $V_T = f(S_T)$

Theorem A replicating portfolio for contingent claims $f(S_T)$ is given by:
(a) $\alpha_t = (u - u_SS_t)/M_t$ cash, and
(b) $\beta_t = u_S$ shares of stock

where $u$ is the solution of the Black-Scholes PDE satisfying $u(T, S_T) = f(S_T)$
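As a sanity check of the pricing theory (a sketch; the parameters $S_0 = K = 100$, $r = 0.05$, $\sigma = 0.2$, $T = 1$ are arbitrary choices): price a European call both with the closed-form Black-Scholes formula and as the discounted risk-neutral expectation $\mathbb{E}[e^{-rT}(S_T - K)^+]$ by Monte Carlo; the two should agree.

```python
import random, math

# Sketch: European call priced two ways under the risk-neutral measure.
random.seed(2)
S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# Closed-form Black-Scholes call price
d1 = (math.log(S0 / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
d2 = d1 - sigma * math.sqrt(T)
bs_price = S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

# Monte Carlo: S_T = S0 * exp((r - sigma^2/2)T + sigma*sqrt(T)*Z)
n_paths = 200000
payoff = 0.0
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)
    s_T = S0 * math.exp((r - 0.5 * sigma ** 2) * T + sigma * math.sqrt(T) * z)
    payoff += max(s_T - K, 0.0)
mc_price = math.exp(-r * T) * payoff / n_paths
print(bs_price, mc_price)  # both close to 10.45 for these parameters
```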

# The Girsanov Theorem

Proposition The exponential process $Z_t$ is a positive martingale, where for an adapted process $Y_t$:

$$Z_t := \exp\left(\int_0^t Y_s\,dW_s - \frac{1}{2}\int_0^t Y_s^2\,ds\right)$$

Applying the Ito Formula, $Z_t = 1 + \int_0^t Z_sY_sdW_s$, and therefore $\mathbb{E}Z_t = 1$

Theorem Given $W_t$ a SBM under the $P$-measure and the likelihood ratio $Z_t$, define the $Q$-measure by $dQ/dP = Z_t$. Then Girsanov’s Theorem states that under the $Q$-measure:
(a) $\tilde{W}_t = W_t - \int_0^t Y_sds$ is a SBM
(b) $W_t$ is a BM with time-dependent drift $\int_0^t Y_sds$

Example 1 Given $W_t$ a Brownian motion with $W_0 \in (0, A)$, define measure $Q$ to be the conditional probability measure on the event $\{W_T = A\}$. Then under $Q$, $W_t$ is a BM with drift term $W_t^{-1}dt$.

Proof. We know that $\mathbb{P}[W_T = A] = W_0/A$, therefore by change of measure:

Therefore Girsanov’s Theorem implies that under $Q$, $\tilde{W}_t = W_t - \int_0^{T\wedge t} W_s^{-1}ds$ is a SBM.

Example 2 Given currencies $A, B$ and their respective bank accounts $dA_t = r^A_tA_tdt$ and $dB_t = r^B_tB_tdt$. Define the exchange rate (# of B per A) $Y_t$ such that $dY_t = \mu_t Y_tdt + \sigma Y_tdW_t$

Theorem If $W_t$ is a SBM under measure $Q^B$ then $\mu_t = r^B_t - r^A_t$.

Proof. $Y_t(A_t/B_t)$ is a martingale only if $\mu_t = r^B_t - r^A_t$


# Levy Process

## Poisson Process

Definition A Levy process is a continuous-time random process $\{X_t\}_{t\geq 0}$ such that $X_0 = 0$ and:
(a) $X_t$ has stationary increments;
(b) $X_t$ has independent increments;
(c) the sample paths $t \rightarrow X_t$ are right-continuous.

Note that Brownian motion and Poisson process are both Levy processes and the basic building blocks of Levy processes. Brownian motion is the only Levy process with continuous paths.

Example Let $W_t$ be a SBM and for $a \geq 0$ let $\tau_a := \min\{t: W_t = a\}$ be the first passage time to level $a$. Then $\{\tau_a\}_{a \geq 0}$ is a Levy process (in the index $a$).

Note that:
(a) $\tau_a$ has stationary, independent increments
(b) $\tau_{ab}$ has the same distribution as $b^2\tau_a$

Definition A Poisson process with rate $\lambda > 0$ is a Levy process $N_t$ such that for all $t \geq 0$ the random variable $N_t$ follows a Poisson distribution with mean $\lambda t$:

$$P(N_t = k) = \frac{e^{-\lambda t}(\lambda t)^k}{k!}, \quad k = 0, 1, 2, ...$$

Proposition If $X, Y$ are independent Poisson random variables with means $\lambda, \mu$, then $X+Y \sim Poisson(\lambda + \mu)$.

Proof. $P(X+Y=n) = \sum_{m=0}^nP(X=m \;\text{and}\; Y=n-m) = \sum_{m=0}^n \frac{e^{-\lambda}\lambda^m}{m!}\cdot\frac{e^{-\mu}\mu^{n-m}}{(n-m)!} = \frac{e^{-(\lambda+\mu)}(\lambda+\mu)^n}{n!}$, by the binomial theorem.

Corollary If $N_t, M_t$ are independent Poisson processes with rates $\lambda, \mu$ then the superposition $N_t + M_t$ is a Poisson process with rate $\lambda + \mu$

Proposition Every discontinuity of a Poisson process is of size $1$

Proposition Let $N_t$ be a Poisson process of rate $\lambda > 0$, and let $\xi_i$ be an independent sequence of i.i.d. Bernoulli-$p$ random variables. Then the Thinning Theorem states that $N^S_t, N^F_t$ are independent Poisson processes with rates $\lambda p, \lambda (1-p)$:

$$N^S_t := \sum_{i=1}^{N_t} \xi_i, \qquad N^F_t := N_t - N^S_t$$
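The Thinning Theorem can be illustrated by simulation (a sketch; the rate $\lambda$, success probability $p$, horizon $t$, and trial count are my choices): the success and failure counts should have means $\lambda p t$ and $\lambda(1-p)t$, and their covariance should be near zero, consistent with independence.

```python
import random

# Sketch: thin a Poisson(lambda*t) count with Bernoulli-p marks and check the
# means and covariance of the success/failure counts.
random.seed(9)
lam, p, t, n_trials = 3.0, 0.25, 2.0, 20000

def poisson(mean):
    # count rate-1 exponential gaps falling before `mean` -> Poisson(mean)
    n, acc = 0, random.expovariate(1.0)
    while acc < mean:
        n += 1
        acc += random.expovariate(1.0)
    return n

succ_sum = fail_sum = cross_sum = 0.0
for _ in range(n_trials):
    n = poisson(lam * t)
    s = sum(random.random() < p for _ in range(n))
    f = n - s
    succ_sum += s
    fail_sum += f
    cross_sum += s * f
succ_mean = succ_sum / n_trials          # target: lam * p * t = 1.5
fail_mean = fail_sum / n_trials          # target: lam * (1-p) * t = 4.5
cov = cross_sum / n_trials - succ_mean * fail_mean  # target: 0
print(succ_mean, fail_mean, cov)
```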

Theorem If $n \rightarrow \infty$ and $p_n \rightarrow 0$ in such a way that $np_n \rightarrow \lambda > 0$, then the Law of Small Numbers states that the $Binomial(n, p_n)$ distribution converges to the $Poisson(\lambda)$ distribution.

Proposition If $N_t$ is a rate-$\lambda$ Poisson process, then for any real number $\theta$ the process $Z_t :=e^{\theta N_t + (\lambda - \lambda e^{\theta})t}$ is a martingale.

Theorem Define $Q$ with likelihood ratio $Z_t$ such that $dQ/dP | \mathcal{F}_t = Z_t$. Then under $Q$ the process $N_t$ is a rate-$\lambda e^{\theta}$ Poisson process.

## Compound Poisson Process

Definition A compound Poisson process $X_t$ is a Levy process of the form:

$$X_t = \sum_{i=1}^{N_t} Y_i$$

Where $N_t$ is a rate-$\lambda$ Poisson process and the $Y_i$ are i.i.d. random variables independent of $N_t$. The distribution $F_{Y}$ is the compounding distribution and the measure $\lambda \times F_{Y}$ is the Levy measure.

At each jump time $T_i$ of $N_t$, a random $Y_i$ is drawn from $F_{Y}$; $X_t$ is the sum of all draws made by time $t$.

Proposition If $\psi(\theta) = \mathbb{E}e^{\theta Y_i} < \infty$, then $\mathbb{E} e^{\theta X_t} = e^{-t\lambda (1- \psi(\theta))}$, and for each $\theta \in \mathbb{R}$, $Z_t^{\theta} = e^{\theta X_t - \lambda t(\psi(\theta)-1)}$ is an exponential martingale.
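A Monte Carlo check of the moment generating function formula (a sketch; standard normal jumps, so $\psi(\theta) = e^{\theta^2/2}$, with $\lambda$, $t$, $\theta$ and the trial count chosen arbitrarily):

```python
import random, math

# Sketch: compound Poisson with N(0,1) jumps; compare the empirical
# E[e^{theta X_t}] with exp(lambda * t * (psi(theta) - 1)).
random.seed(13)
lam, t, theta, n_trials = 2.0, 1.0, 0.5, 50000

def poisson(mean):
    # count rate-1 exponential gaps falling before `mean` -> Poisson(mean)
    n, acc = 0, random.expovariate(1.0)
    while acc < mean:
        n += 1
        acc += random.expovariate(1.0)
    return n

acc = 0.0
for _ in range(n_trials):
    x = sum(random.gauss(0.0, 1.0) for _ in range(poisson(lam * t)))
    acc += math.exp(theta * x)
mc = acc / n_trials
exact = math.exp(lam * t * (math.exp(theta ** 2 / 2) - 1))
print(mc, exact)  # the two agree up to Monte Carlo error
```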

## Poisson Point Process

Definition Let $\mu$ be a $\sigma$−finite Borel measure on $\mathbb{R}^n$. A Poisson point process $\mathcal{P}$ with intensity measure $\mu$ is a collection $\{N_B\}_{B\in\mathcal{B}}$ of extended nonnegative integer-valued random variables such that
(A) If $\mu(B) = \infty$ then $N_B = \infty$ a.s.
(B) If $\mu(B) < \infty$ then $N_B \sim Poisson(\mu(B))$
(C) If $\{B_i\}_{i\in\mathbb{N}}$ are pairwise disjoint, then the r.v.s $N_{B_i}$ are independent, and $N_{\cup_i B_i} = \sum_{i=1}^{\infty} N_{B_i}$

Proposition The point process $(T_n, Y_n)$ associated with a CPP is a Poisson point process with intensity measure $Lebesgue \times v$, where $v=\lambda F_Y$ is the Levy measure for the CPP.

Theorem Let $X_t$ be any Levy process, and let $J$ be the random set of points $(t,y) \in [0,\infty) \times \mathbb{R}$ such that the Levy process $X$ has a jump discontinuity of size $y$ at time $t$, i.e.,

$$J := \{(t, y) : X_t - X_{t-} = y \neq 0\}$$

Then $J$ is a Poisson point process with intensity measure $Lebesgue \times v$ where $v$ is a $\sigma$−finite measure called the Levy measure of the process.

References:

• Stochastic Calculus: An Introduction with Applications, Gregory F. Lawler
• FINM 34500 Lecture Notes, Steve Lalley, the University of Chicago