Statistical modeling with multivariate generalized Pareto distributions
Jenny WadsworthLancaster University, UK
Joint work with: Anna Kiriliouk, Holger Rootzén, Johan Segers
NCAR Extremes Workshop, 28 April 2016
Multivariate threshold exceedances
- Univariate right tail: X | X > u = X | X ≰ u
- Goal: model the multivariate tail
- First task: decide what is the multivariate tail!
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
● ●
●
●
●
●●
●●
●
●●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
● ●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●● ●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
● ●
●
●
●●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●●
●
● ●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●●
●●
● ●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●
●
●
●
●
●●
●
● ●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
● ●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
● ●
●
●
●
●●
●
●
●
●
●
● ●
●
●●
●● ●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●● ●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
● ●
●
●
●
●
●●
●
●
●●
●
●● ●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●●
●●
●
●
●●
●
●
●
●
●●
●
●
●
● ●●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
● ●●
●
●●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
● ●
●
●
●
●●
●●
●
●●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
● ●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●●
●
●
●●
●
●●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●● ●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
● ●
●
●
●●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●●
●●
●
● ●●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●●
●●
● ●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●●
●
●
●
●
●●
●
● ●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●●
● ●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
● ●
●
●
●
●●
●
●
●
●
●
● ●
●
●●
●● ●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●●
●
●
●
●
●
●● ●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
● ●
●
●
●
●
●●
●
●
●●
●
●● ●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
●
●
●●
●
●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●●
●●
●
●
●●
●
●
●
●
●●
●
●
●
● ●●
●●
●
●
●
●●
●
●●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
● ●●
●
●●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
● ●
●
●●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
● ●
●
●
●
●●
●●
●
●●
●
●
●
●
●
●
● ●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●●
● ●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●●
●
●
●
●
●
●
●
●
●
●
●
●●
●●
●
●
●
●●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●
●●
●
●
●
I Focus of this talk: X | X 6≤ u
Limit model
Let
I X i ∼ FI un ∈ Rd s.t. F (un)→ 1 as n→∞
If there exists σn > 0 s.t.
P(X i − un
σn≤ x
∣∣∣∣ X i 6≤ un)→ H`(x;σ,γ, τ )for non-degenerate H` then this is the multivariate generalized Pareto or MGPdistribution (Rootzén and Tajvidi, 2006; Beirlant et al., 2004, Ch. 8).
H` =`(τ(1 + γ min(x, 0)/σ
)−1/γ+
)− `(τ(1 + γx/σ
)−1/γ+
)` (τ )
I ` : (0,∞)d → (0,∞) stable tail dependence function capturing extremaldependence
Utility of the model
- Limit H_ℓ exists under quite general conditions
- Provides the most useful statistical model if all variables have a non-zero probability of being simultaneously extreme:

χ_{1:d} = lim_{q→1} P(F_1(X_1) > q, ..., F_d(X_d) > q) / (1 − q)
- χ_{1:d} > 0: d-dimensional asymptotic dependence
- Otherwise mass on certain hyperplanes – can be handled with censoring, but...
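The coefficient χ_{1:d}(q) above can be estimated empirically at a sub-asymptotic level q. A minimal sketch (not from the talk; the rank-based margin estimates and the name `chi_hat` are illustrative choices):

```python
import numpy as np

def chi_hat(x, q):
    """Empirical chi_{1:d}(q): P(all margins above their q-quantile) / (1 - q).
    Margins are estimated by ranks, a proxy for F_j(X_j)."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1   # 1..n, columnwise
    u = ranks / (n + 1.0)                                   # approx uniform margins
    return np.mean(np.all(u > q, axis=1)) / (1.0 - q)

rng = np.random.default_rng(1)
z = rng.exponential(size=5000)
x_dep = np.column_stack([z, z])              # perfectly dependent pair
x_ind = rng.exponential(size=(5000, 2))      # independent pair
print(chi_hat(x_dep, 0.9), chi_hat(x_ind, 0.9))   # near 1 vs near 0.1
```

Comonotone data give χ̂ ≈ 1 (asymptotic dependence); independent data give χ̂(q) ≈ 1 − q, which vanishes as q → 1.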
Assumption and marginal consequence
Modeling assumption

X − u | X ≰ u ∼ H_ℓ(x; σ, γ, τ)

Under this assumption

P(X_j − u_j > x_j | X_j > u_j) = (1 + γ_j x_j/σ_j)_+^(−1/γ_j)

i.e. X_j − u_j | X_j > u_j ∼ GP(σ_j, γ_j)
Probabilistic treatment of MGP distributions
Various probabilistic results on MGPDs are contained in:
- Tajvidi (1996), Beirlant et al. (2004, Ch. 8), Rootzén and Tajvidi (2006), Falk and Guillou (2008), Falk and Michel (2009), Ferreira and de Haan (2014), Rootzén et al. (2016)
- ...

Less has been done on the statistical side (Huser et al. (2015) – special case; Thibaud and Opitz (2015); others...?):
- Theoretically equivalent point process models (Coles and Tawn, 1991)
- Conceptual advantage of MGPD: proper multivariate distribution
- Easy to simulate from (with the right representation): model checking tools

This talk: construction device, density formulae, interesting properties, application
Construction recipe (Ferreira & de Haan, 2014)
Let
- T ∼ Exp(1), independent of
- S_0 ∈ R^d with max_j S_{0,j} = 0 a.s.

then

Z = T + S_0
[Figure: simulated scatter of (S_{01} + T, S_{02} + T), decomposed as the Exp(1) variable T plus the points (S_{01}, S_{02}), which satisfy max(S_{01}, S_{02}) = 0.]
is MGP with
- σ = 1, γ = 0, τ = E(e^{S_0}), ℓ(x) = E( max_j e^{S_{0,j}} x_j / E(e^{S_{0,j}}) )
- Support: {z ∈ R^d : z ≰ 0}
Construction recipe (Ferreira & de Haan, 2014)
Standardized form (σ = 1, γ = 0):

Z = T + S_0

Generalized form:

X = σ(e^{γZ} − 1)/γ

Densities? Not straightforward, as S_0 has no Lebesgue density on R^d.
Densities and simulation
Solution: rewrite

S_0 = S − max_{1≤j≤d} S_j

for S ∈ R^d with Lebesgue density f(s).

Standardized form

Z = T + S − max_{1≤j≤d} S_j

with density

h(z) = e^{−max_j z_j} ∫_{−∞}^{∞} f(z + t) dt

- Densities f ⇒ densities h with (at most) a 1-dim numerical integral
- Censored likelihood contributions: simple if f has tractable partial integrals
- Simulation from f ⇒ simulation from MGP
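The last bullet can be sketched directly: draw S from f, shift by the componentwise maximum, and add an independent Exp(1) variable. A minimal simulation (taking f to be independent Gumbel components, one of the tractable choices in the talk; all names are illustrative), checking the exact marginal consequence Z_j | Z_j > 0 ∼ Exp(1):

```python
import numpy as np

rng = np.random.default_rng(42)
n, d, alpha = 200_000, 3, 2.0

# S: independent Gumbel components, F_j(s) = exp{-exp{-alpha * s}}
s = rng.gumbel(loc=0.0, scale=1.0 / alpha, size=(n, d))

# standardized MGP construction: Z = T + S - max_j S_j, with T ~ Exp(1)
z = rng.exponential(size=(n, 1)) + s - s.max(axis=1, keepdims=True)

# support: no point has all coordinates <= 0 (max_j Z_j = T > 0 exactly)
assert (z.max(axis=1) > 0).all()

# marginal consequence: Z_j | Z_j > 0 is exactly Exp(1)
exc = z[:, 0][z[:, 0] > 0]
print(round(exc.mean(), 2), round(np.mean(exc > 1.0), 2))  # ~1.0 and ~exp(-1) = 0.37
```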
Example tractable models
- Independent components: f(s) = ∏_{j=1}^d f_j(s_j)
  - Gumbel: F_j(s_j) = exp{−exp{−α(s_j − µ_j)}}, α > 0, µ_j ∈ R
  - Reverse exponential: F_j(s_j) = e^{(s_j − µ_j)/α_j}, s_j ∈ (−∞, µ_j), α_j > 0, µ_j ∈ R
- Multivariate Gaussian: f(s) = (2π)^{−d/2} |Σ|^{−1/2} exp{−(s − µ)^T Σ^{−1} (s − µ)/2}
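For any of these f, the density h(z) = e^{−max_j z_j} ∫ f(z + t) dt needs only a single numerical integral. A sketch for independent Gumbel components using plain trapezoidal quadrature (function names are illustrative; in d = 1 the formula must collapse to the Exp(1) density e^{−z}, which serves as a check):

```python
import numpy as np

alpha = 2.0

def f_gumbel(s):
    """Independent Gumbel components: prod_j alpha e^{-alpha s_j} exp{-e^{-alpha s_j}}."""
    e = np.exp(-alpha * s)
    return np.prod(alpha * e * np.exp(-e), axis=-1)

def h(z, lo=-30.0, hi=30.0, m=120_001):
    """MGP density h(z) = exp(-max_j z_j) * int f(z + t) dt, by trapezoidal rule."""
    t = np.linspace(lo, hi, m)
    vals = f_gumbel(z[None, :] + t[:, None])          # f evaluated along the line z + t*1
    dt = t[1] - t[0]
    integral = dt * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
    return np.exp(-z.max()) * integral

print(h(np.array([0.7])), np.exp(-0.7))   # d = 1: the two values should agree
print(h(np.array([0.5, -0.3])))           # d = 2: a positive density value
```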
Censored likelihood inference
- Generalized form likelihood via transformation:

h_{σ,γ}(x) = h( (1/γ) log(1 + γx/σ) ) ∏_{j=1}^d (1/σ_j) (1 + γ_j x_j/σ_j)^{−1}

- Censored likelihood estimation:
  - marginals have lower endpoint −σ_j/γ_j for γ_j > 0
  - censoring avoids bias in dependence parameter estimation (e.g. Huser et al. (2015))
Some useful properties of MGPDs
If

X − u | X ≰ u ∼ H_ℓ(x; σ, γ, τ)

then

1. χ_{1:d}(q) = P(F_1(X_1) > q, ..., F_d(X_d) > q) / (1 − q) is constant when X > u (i.e. q > max_j F_j(u_j)), and equal to its limit

χ_{1:d} = E( min_{1≤j≤d} e^{S_{0,j}} / E(e^{S_{0,j}}) )

2. If γ = γ1, and a_j > 0, j = 1, ..., d,

∑_{j=1}^d a_j (X_j − u_j) | ∑_{j=1}^d a_j (X_j − u_j) > 0 ∼ GP( ∑_{j=1}^d a_j σ_j, γ )

This result doesn't depend on the underlying dependence structure (the probability of the conditioning event does).
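Property 2 can be checked by Monte Carlo in the standardized case (u = 0, σ = 1, γ = 0), where the weighted sum of exceedances should be exactly exponential with mean ∑ a_j. A sketch using the Z = T + S − max_j S_j construction with Gumbel S (weights and distributional choices are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
n, d = 400_000, 3
a = np.array([0.5, 0.3, 0.2])   # positive weights, summing to 1

# standardized MGP (u = 0, sigma = 1, gamma = 0)
s = rng.gumbel(scale=0.5, size=(n, d))
z = rng.exponential(size=(n, 1)) + s - s.max(axis=1, keepdims=True)

# property 2 with gamma = 0: sum_j a_j Z_j | sum > 0 ~ Exp with mean sum_j a_j = 1
w = z @ a
exc = w[w > 0]
print(round(exc.mean(), 2), round(np.mean(exc > 1.0), 2))  # ~1.0 and ~exp(-1) = 0.37
```

The exponential limit here is exact, not asymptotic: conditioning on the weighted sum being positive removes the (nonpositive) contribution of S_0 by memorylessness of T.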
Example (sorry about the context!)
- Data: weekly negative returns of the four largest UK banks, October 2007 – April 2016 (444 observations)
[Figure: pairwise scatter plots of the negative returns X_H, X_L, X_R, X_B, and empirical χ̂ plotted against threshold q ∈ (0.5, 1).]
Asymptotic dependence seems reasonable; χ_{1:4}(q) looks constant above q ≈ 0.83
Model selection / fitting strategy

1. standardize the data to common GP margins using the rank transformation
2. fit the most complicated dependence model within each class to the standardized data
3. select as the dependence model class that which produces the closest fit to the data, in the sense of largest maximized log likelihood (or AIC)
4. use likelihood ratio tests to test for simplification of models within the selected dependence class, and select a final dependence model
5. fit GP margins simultaneously with this dependence model (i.e. fit the full MGPD)
6. test for simplifications in the marginal parameterization
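Step 1 can be sketched as follows. The slide does not spell out the target margins, so this sketch assumes standard exponential margins as the common scale (a conventional choice for γ = 0 standardized forms); the function name is illustrative:

```python
import numpy as np

def rank_transform(x):
    """Map each column of x to approximate standard exponential margins:
    u = rank / (n + 1), then -log(1 - u)."""
    n = x.shape[0]
    ranks = np.argsort(np.argsort(x, axis=0), axis=0) + 1   # 1..n, columnwise
    return -np.log(1.0 - ranks / (n + 1.0))

rng = np.random.default_rng(0)
x = rng.standard_t(df=3, size=(1000, 4))   # heavy-tailed stand-in for the returns
z = rank_transform(x)
print(z.shape, z.min() > 0)                # positive values, ordering preserved per column
```

The transform is monotone within each column, so the dependence structure (the ranks) is untouched while the margins become comparable across series.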
Dependence model
- The above strategy leads to a Gumbel model with a single dependence parameter(!)
[Figure: six panels of pairwise empirical χ̂ against threshold q ∈ (0.5, 0.9).]
- Tripletwise and quadruple χ all look fine too
Full MGPD + Diagnostics
- Likelihood ratio tests support a common shape (γ̂ = 0.46) but separate scale parameters (standard errors reduced by ≈ 30% by use of the joint fit)
[Figure: model vs empirical quantile plots for the four fitted GP margins.]

- GPD fit to ∑_{j=1}^d (X_j − u_j) | ∑_{j=1}^d (X_j − u_j) > 0

[Figure: model vs empirical quantile plots for the fitted GP distribution of the summed exceedances.]
Comments, challenges and ongoing work
- Bigger applications: likelihoods are "easy"; success all depends on the number of parameters needed
- Ongoing application to structured data (e.g. ordered data such as cumulative rainfall over 1, 2, 3, ... days – interest in landslide risk)
- Did not cover: expressing MGPDs derived from a Poisson process motivation via the representation

Z = T + S − max_{1≤j≤d} S_j

this can be done, meaning exact simulation without Poisson processes.

Thanks for your attention!

Some results in:
Rootzén, H., Segers, J. and Wadsworth, J. (2016) Multivariate Peaks over Thresholds Models. arXiv:1603.06619.

Statistical work in progress... suggestions for interesting environmental applications welcome!
Shameless plug!
(Photo: http://www.lancaster.ac.uk/maths/easter-probability-meeting/)

STOR-i workshop on
Multivariate and spatial extremes with environmental applications
Lancaster, 4-6 July 2016
www.stor-i.lancs.ac.uk/research/Workshops/Extremes-Workshop
Linking representations
Recall the standardized form

Z = T + S − max_{1≤j≤d} S_j

If S ∼ f then

h(z) = e^{−max_j z_j} ∫_{−∞}^{∞} f(z + t) dt.

Another way to think about MGPDs is as the distribution of points of a Poisson process

∑_{i≥1} δ_{T_i + S_i}

where ∑_{i≥1} δ_{(T_i, S_i)} has intensity e^{−t} dt f(s) ds on R^{d+1}, restricted to the set {z ∈ R^d : z ≰ 0}. This gives density

h(z) = ∫_{−∞}^{∞} f(z + t) e^t dt / E_f(e^{max S_j})

What is the connection?
Linking representations
Z = T + S − max_{1≤j≤d} S_j   (∗)

- In place of S ∼ f, suppose S ∼ g, where

g(s) = f(s) e^{max s_j} / E_f(e^{max S_j}) †

Then

h(z) = e^{−max_j z_j} ∫_{−∞}^{∞} g(z + t) dt = ∫_{−∞}^{∞} f(z + t) e^t dt / E_f(e^{max S_j})

- Two different sets of models from one f
- Representation (∗) means we can simulate from either, provided we can simulate from f and g
  - rejection / MCMC

† Related to the max-stable simulation idea in Oesting et al. (2013)
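The link rests on the identity max_j(z_j + t) = max_j z_j + t: the tilt e^{max s_j} in g exactly reproduces the e^t weight of the Poisson-process density. This can be checked numerically; a sketch for d = 2 independent Gumbel components with rate α = 4 (α > 1 keeps E_f(e^{max S_j}) finite; grid bounds and names are illustrative):

```python
import numpy as np

alpha = 4.0

def f1(s):
    """Univariate Gumbel density with rate alpha."""
    e = np.exp(-alpha * s)
    return alpha * e * np.exp(-e)

# normalizing constant C = E_f(e^{max S_j}) for d = 2, by 2-D grid quadrature
g = np.linspace(-4.0, 12.0, 1601)
dg = g[1] - g[0]
S1, S2 = np.meshgrid(g, g, indexing="ij")
C = np.sum(f1(S1) * f1(S2) * np.exp(np.maximum(S1, S2))) * dg * dg

z = np.array([0.8, -0.4])
t = np.linspace(-20.0, 20.0, 80_001)
dt = t[1] - t[0]
fz = f1(z[0] + t) * f1(z[1] + t)   # f evaluated along the line z + t*1

# Poisson-process form: h(z) = int f(z + t) e^t dt / C
h_pp = np.sum(fz * np.exp(t)) * dt / C

# tilted form: S ~ g with g(s) = f(s) e^{max s_j} / C
h_tilt = np.exp(-z.max()) * np.sum(fz * np.exp(np.maximum(z[0] + t, z[1] + t)) / C) * dt

print(h_pp, h_tilt)   # the two representations give the same density value
```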