
Signal Processing with Lévy Information

Lane P. Hughston

Department of Mathematics, Brunel University London

Uxbridge UB8 3PH, United Kingdom

14 May 2015

Barrett Memorial Lectures, University of Tennessee, Knoxville

Collaborators: D. C. Brody (Brunel University London), E. Mackie (J. P. Morgan, London), and X. Yang (Shell International).


This presentation is based on the following papers:

1. Brody, D. C., Hughston, L. P. & Mackie, E. "General theory of geometric Lévy models for dynamic asset pricing", Proceedings of the Royal Society London A468, 1778–1798 (2012)

2. Brody, D. C., Hughston, L. P. & Yang, X. "Signal processing with Lévy information", Proceedings of the Royal Society London A469, 20120433 (2013)

3. Brody, D. C. & Hughston, L. P. "Lévy information and the aggregation of risk aversion", Proceedings of the Royal Society London A469, 20130024 (2013)


Theory of Noise Type

We assume familiarity with the theory of Lévy processes (Sato 1999, Schoutens 2003, Applebaum 2004, Bertoin 2004, Protter 2005, Kyprianou 2006).

A real-valued process {ξt}t≥0 on a probability space (Ω, F, P) is a Lévy process if: (i) P(ξ0 = 0) = 1, (ii) ξt has stationary and independent increments, (iii) lim_{t→s} P(|ξt − ξs| > ε) = 0 for every ε > 0, and (iv) ξt is almost surely càdlàg.

For a Lévy process ξt to give rise to a class of information processes, we require that it possess exponential moments.

Let us consider the set defined for some (equivalently, for all) t > 0 by

$$ A = \{ w \in \mathbb{R} : \mathbb{E}^{\mathbb{P}}[\exp(w\xi_t)] < \infty \}. \qquad (1) $$

If A contains points other than w = 0, then we say that ξt possesses exponential moments.

We define a function ψ : A → R, called the Lévy exponent (or cumulant function), such that for α ∈ A we have

$$ \mathbb{E}^{\mathbb{P}}[\exp(\alpha\xi_t)] = \exp(\psi(\alpha)\,t). \qquad (2) $$
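As a quick numerical illustration, the following sketch checks relation (2) by Monte Carlo in the simplest case, taking the Lévy process to be a standard Brownian motion, for which ψ(α) = α²/2. The setup and parameter values are illustrative assumptions, not part of the lecture material.

```python
import numpy as np

# Monte Carlo check of E[exp(alpha * xi_t)] = exp(psi(alpha) * t) for a
# standard Brownian motion, for which psi(alpha) = alpha**2 / 2.
rng = np.random.default_rng(1)
t, alpha, n_paths = 2.0, 0.7, 200_000
xi_t = np.sqrt(t) * rng.standard_normal(n_paths)   # samples of B_t ~ N(0, t)

empirical = np.exp(alpha * xi_t).mean()
theoretical = np.exp(0.5 * alpha**2 * t)
print(empirical, theoretical)   # the two values should agree closely
```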


If a Lévy process possesses exponential moments, then an exercise shows that:

(i) ψ(α) is convex on A,

(ii) the mean and variance of ξt are given respectively by ψ′(0) t and ψ′′(0) t, and

(iii) as a consequence of the convexity of ψ(α), the marginal exponent ψ′(α) possesses a unique inverse I(y) such that I(ψ′(α)) = α for α ∈ A.

The Lévy exponent extends to a function ψ : A_C → C, where A_C = {w ∈ C : Re w ∈ A}.

It can be shown (Sato 1999, Theorem 25.17) that ψ(α) admits a Lévy–Khintchine representation of the form

$$ \psi(\alpha) = p\alpha + \tfrac{1}{2}q\alpha^2 + \int_{\mathbb{R}\backslash\{0\}} \big( \mathrm{e}^{\alpha z} - 1 - \alpha z\,\mathbf{1}\{|z| < 1\} \big)\, \nu(\mathrm{d}z), \qquad (3) $$

with the property that (2) holds for all α ∈ A_C.

Here 1{·} is the indicator function, and p ∈ R and q ≥ 0 are constants. The so-called Lévy measure ν(dz) is a positive measure defined on R\{0} satisfying

$$ \int_{\mathbb{R}\backslash\{0\}} (1 \wedge z^2)\, \nu(\mathrm{d}z) < \infty. \qquad (4) $$


If the Lévy process possesses exponential moments, then for α ∈ A we have

$$ \int_{\mathbb{R}\backslash\{0\}} \mathrm{e}^{\alpha z}\,\mathbf{1}\{|z| \geq 1\}\, \nu(\mathrm{d}z) < \infty. \qquad (5) $$

The Lévy measure has the following interpretation: if B is a measurable subset of R\{0}, then ν(B) is the rate at which jumps arrive for which the jump size lies in B.

If ν(R\{0}) < ∞, one says that ξt has finite activity.

Thus we can classify a Lévy process abstractly by the specification of its exponent ψ(α).

This means one can speak of a "type" of Lévy noise by reference to the associated exponent.

In this way we obtain a kind of general classification of the different types of "pure noise".


Families of Noise Types

Suppose we fix a measure P0 on a measurable space (Ω, F), and let the process ξt be P0-Lévy, with exponent ψ0(α).

There exists a parametric family of probability measures {Pλ}λ∈A on (Ω, F) such that for each choice of λ the process ξt is Pλ-Lévy.

The changes of measure arising in this way are called Esscher transformations (Esscher 1932, Gerber & Shiu 1994, Chan 1999, Kallsen & Shiryaev 2002, Hubalek & Sgarra 2006).

Under an Esscher transformation the characteristics of a Lévy process are transformed from one type to another, and thus one can speak of a "family" of Lévy processes interrelated by Esscher transformations.

The relevant change of measure can be specified by use of the process {ρ_t^λ} defined for λ ∈ A by

$$ \rho_t^{\lambda} := \frac{\mathrm{d}\mathbb{P}_{\lambda}}{\mathrm{d}\mathbb{P}_0}\bigg|_{\mathcal{F}_t} = \exp\big( \lambda\xi_t - \psi_0(\lambda)\,t \big), \qquad (6) $$

where Ft = σ[{ξs}0≤s≤t].


One can check that ρ_t^λ is an (Ft, P0)-martingale: we find that

$$ \mathbb{E}^{\mathbb{P}_0}_s[\rho_t^{\lambda}] = \rho_s^{\lambda} \qquad (7) $$

for s ≤ t, where E^{P0}_t[·] denotes conditional expectation under P0 with respect to Ft.

One can show that ξt has Pλ stationary and independent increments, and that the Pλ exponent of ξt, defined on the set

$$ A_{\mathbb{C}}^{\lambda} := \{ w \in \mathbb{C} : \mathrm{Re}\,w + \lambda \in A \}, \qquad (8) $$

is given by

$$ \psi_{\lambda}(\alpha) := t^{-1} \ln \mathbb{E}^{\mathbb{P}_{\lambda}}[\exp(\alpha\xi_t)] = \psi_0(\alpha + \lambda) - \psi_0(\lambda). \qquad (9) $$
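Relation (9) lends itself to a direct Monte Carlo check: reweighting P0 samples by the Esscher density (6) should reproduce the shifted exponent. The sketch below does this for the Brownian noise type, ψ0(α) = α²/2; the parameter values are illustrative assumptions.

```python
import numpy as np

# Check psi_lambda(alpha) = psi0(alpha + lambda) - psi0(lambda) by Esscher
# reweighting of Brownian samples, with psi0(a) = a**2 / 2.
rng = np.random.default_rng(8)
t, lam, alpha, n = 1.0, 0.4, 0.3, 500_000
xi = np.sqrt(t) * rng.standard_normal(n)              # xi_t under P0

rho = np.exp(lam * xi - 0.5 * lam**2 * t)             # Esscher density (6)
lhs = np.log(np.mean(rho * np.exp(alpha * xi))) / t   # P_lambda exponent of xi_t
rhs = 0.5 * (alpha + lam)**2 - 0.5 * lam**2           # psi0(alpha+lam) - psi0(lam)
print(lhs, rhs)                                       # should agree closely
```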

In what follows we write C_I = {w ∈ C : Re w = 0}. For any random variable Z on (Ω, F, P) we write F^Z = σ[Z], and we write E^P[· | Z] for E^P[· | F^Z].

For processes we use both of the notations Zt and Z(t), depending on the context.


Lévy information

With these background remarks in mind, we are in a position to define a Lévy information process.

We confine the discussion to the case of a "simple" message, represented by a single random variable X.

In the situation when the noise is Brownian motion, the information admits a linear decomposition into signal and noise.

In the general situation the relation between signal and noise is more subtle, and has something of the character of a fibre space, where one thinks of the points of the base space as representing the different noise types, and the points of the fibres as corresponding to the different information processes that one can construct in association with a given noise type.

We fix a probability space (Ω, F, P), and an Esscher family of Lévy exponents ψλ(α), α ∈ A_C^λ.

We refer to ψ0(α) as the fiducial exponent.


The intuition here is that the abstract Lévy process of exponent ψ0(α), which we call the "fiducial" process, represents the noise type of the associated information process.

Thus we can use ψ0(α) to represent the noise type.

Definition 1. By a Lévy information process carrying a message X, we mean a random process ξt, together with a random variable X, such that ξt is conditionally ψX-Lévy given F^X.

Thus, given F^X we require ξt to have conditionally independent and stationary increments under P, and to possess a conditional exponent of the form

$$ \psi_X(\alpha) := t^{-1} \ln \mathbb{E}^{\mathbb{P}}[\exp(\alpha\xi_t) \,|\, \mathcal{F}^X] = \psi_0(\alpha + X) - \psi_0(X) \qquad (10) $$

for α ∈ C_I, where ψ0(α) is the fiducial exponent of the specified noise type.

It is implicit in the statement of Definition 1 that a certain compatibility condition holds between the message and the noise type.

For any random variable X we define its support S_X to be the smallest closed set F with the property that P(X ∈ F) = 1.

Then we say that X is compatible with the fiducial exponent ψ0(α) if S_X ⊂ A.


Intuitively speaking, this condition ensures that we can use X to make a random Esscher transformation.

We are thus able to state the Lévy noise-filtering problem as follows: given observations of the Lévy information process up to time t, what is the best estimate for X?

To gain a better understanding of the sense in which the information process ξt actually "carries" the message X, it will be useful to investigate its asymptotic behaviour.

We write I0(y) for the inverse marginal fiducial exponent.

Proposition 1. Let ξt be a Lévy information process with fiducial exponent ψ0(α) and message X. Then for every ε > 0 we have

$$ \lim_{t \to \infty} \mathbb{P}\big[\, |I_0(t^{-1}\xi_t) - X| \geq \varepsilon \,\big] = 0. \qquad (11) $$

It follows that the information process does indeed carry information about the message, and in the long run "reveals" it.

The intuition here is that as more information is gained, we improve our estimate of X to the point that X eventually becomes known with near certainty.


Properties of Lévy information

It will be useful if we present a construction that ensures the existence of Lévy information processes.

First we select a noise type by specification of a fiducial exponent.

Next we introduce a probability space (Ω, F, P0) that supports the existence of a P0-Lévy process ξt with the given fiducial exponent, together with an independent random variable X that is compatible with it.

Write {Ft} for the filtration generated by ξt, and {Gt} for the filtration generated by ξt and X jointly: Gt = σ[{ξs}0≤s≤t, X].

Let ψ0(α) be the fiducial exponent.

One can check that the process {ρ_t^X} defined by

$$ \rho_t^X = \exp\big( X\xi_t - \psi_0(X)\,t \big) \qquad (12) $$

is a (Gt, P0)-martingale.


We are thus able to introduce a change of measure P0 → P_X on (Ω, F, P0) by setting

$$ \frac{\mathrm{d}\mathbb{P}_X}{\mathrm{d}\mathbb{P}_0}\bigg|_{\mathcal{G}_t} = \rho_t^X. \qquad (13) $$

It should be evident that ξt is conditionally P_X-Lévy given F^X, since for fixed X the measure change is an Esscher transformation.

In particular, a calculation shows that the conditional exponent of ξt under P_X is given by

$$ t^{-1} \ln \mathbb{E}^{\mathbb{P}_X}\big[ \exp(\alpha\xi_t) \,|\, \mathcal{F}^X \big] = \psi_0(\alpha + X) - \psi_0(X) \qquad (14) $$

for α ∈ C_I, which shows that the conditions of Definition 1 are satisfied, allowing us to conclude the following:

Proposition 2. The P0-Lévy process ξt is a Lévy information process under P_X with noise type ψ0(α) and message X.

In fact, the converse also holds: if we are given a Lévy information process, then by a change of measure we obtain a Lévy process and an independent "message" variable.


Going forward, we adopt the convention that P always denotes the "physical" measure in relation to which an information process with message X is defined, and that P0 denotes the transformed measure with respect to which the information process and the message decouple.

The idea is to work out properties of information processes by referring the calculations back to P0.

We consider as an example the problem of working out the Ft-conditional expectation under P of a Gt-measurable integrable random variable Z.

The P expectation can be written in terms of P0 expectations, and is given by a "generalised Bayes formula" (Kallianpur & Striebel 1968) of the form

$$ \mathbb{E}^{\mathbb{P}}[Z \,|\, \mathcal{F}_t] = \frac{\mathbb{E}^{\mathbb{P}_0}[\rho_t^X Z \,|\, \mathcal{F}_t]}{\mathbb{E}^{\mathbb{P}_0}[\rho_t^X \,|\, \mathcal{F}_t]}. \qquad (15) $$

This formula can be used to obtain the Ft-conditional distribution of X.

We have the following.


Proposition 3. Let ξt be a Lévy information process under P with noise type ψ0(α), and let the a priori distribution of the associated message X be π(dx). Then the Ft-conditional a posteriori distribution of X is

$$ \pi_t(\mathrm{d}x) = \frac{\exp\big( x\xi_t - \psi_0(x)t \big)}{\int \exp\big( x\xi_t - \psi_0(x)t \big)\, \pi(\mathrm{d}x)}\, \pi(\mathrm{d}x). \qquad (16) $$

In particular, when X is a continuous random variable with a density function p(x), one can write πt(dx) = pt(x) dx, where pt(x) is the conditional density.

It is straightforward to establish by use of a variational argument that the best estimate for the message X conditional on the information Ft is given by

$$ \hat{X}_t := \mathbb{E}^{\mathbb{P}}[X \,|\, \mathcal{F}_t] = \int x\, \pi_t(\mathrm{d}x). \qquad (17) $$

By "best estimate" we mean the Ft-measurable random variable X̂t that minimises the quadratic error E^P[(X − X̂t)² | Ft].
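For a message taking finitely many values, formulas (16) and (17) reduce to a reweighting of the prior that can be coded in a few lines. The sketch below is an illustrative implementation under assumed inputs (a discrete prior, a user-supplied fiducial exponent psi0, and Brownian noise for the demonstration); it is not taken from the lecture material.

```python
import numpy as np

def levy_filter(xi_t, t, x_vals, prior, psi0):
    """Posterior weights (16) and best estimate (17) for a discrete prior."""
    log_w = x_vals * xi_t - psi0(x_vals) * t + np.log(prior)
    log_w -= log_w.max()                    # stabilise before exponentiating
    w = np.exp(log_w)
    post = w / w.sum()
    return post, float(post @ x_vals)

# Demonstration with the Brownian noise type: psi0(a) = a^2 / 2.
psi0 = lambda a: 0.5 * a**2
x_vals = np.array([-1.0, 0.0, 1.0])
prior = np.array([0.25, 0.5, 0.25])

rng = np.random.default_rng(0)
X = rng.choice(x_vals, p=prior)             # draw the hidden message
t = 5.0
xi_t = X * t + np.sqrt(t) * rng.standard_normal()   # xi_t = X t + B_t
post, X_hat = levy_filter(xi_t, t, x_vals, prior, psi0)
print(X, post, X_hat)
```

Note that the filter depends on the observations only through the current value ξt and the time t, in line with the Markov property discussed below.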


It will be observed that at any given time t the best estimate can be expressed as a function of ξt and t, and does not involve values of the information process at times earlier than t.

That this should be the case can be seen as a consequence of the following:

Proposition 4. The Lévy information process ξt has the Markov property under P.

It should be apparent that simulation of the dynamics of the filter is readily approachable on account of this property.


Examples of Lévy information processes

In a number of situations one can construct explicit examples of information processes, categorised by noise type.

The Brownian and Poisson constructions, which are familiar in other contexts, can be seen as belonging to a unified scheme that brings out their differences and similarities.

We then proceed to construct information processes of the gamma, the variance gamma, and the negative binomial type.

It is interesting to take note of the diverse nature of noise, and to observe the many different ways in which messages can be conveyed in a noisy environment.

Example 1: Brownian information. On a probability space (Ω, F, P), let {Bt} be a Brownian motion, let X be an independent random variable, and set

$$ \xi_t = Xt + B_t. \qquad (18) $$

The random process ξt, which we call the Brownian information process, is F^X-conditionally Lévy, with conditional exponent ψX(α) = Xα + ½α².


The fiducial exponent is ψ0(α) = ½α², and the associated fiducial process or "noise type" is standard Brownian motion.

In the case of Brownian information, there is a linear separation of the process into signal and noise.

This model, considered by Wonham (1965), is perhaps the simplest continuous-time generalisation of the example originally described by Wiener (1948).

The message is given by the value of X, but X can only be observed indirectly, through ξt. The observations of X are obscured by the noise represented by the Brownian motion Bt.

Since the signal term grows linearly in time, whereas |Bt| ∼ √t, it is intuitively plausible that observations of ξt will asymptotically reveal the value of X; and a direct calculation using properties of the normal distribution function confirms that t⁻¹ξt converges to X, which is consistent with the fact that ψ0′(α) = α and I0(y) = y in the standard Brownian case.
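The convergence of t⁻¹ξt to X is easy to see numerically; the following short sketch (with an illustrative value for the message and a few sample horizons) shows the estimate tightening as t grows.

```python
import numpy as np

# t^{-1} xi_t -> X for Brownian information, since I_0(y) = y here.
rng = np.random.default_rng(2)
X = 0.3                                  # the hidden message (illustrative)
for t in [1.0, 10.0, 100.0, 1000.0]:
    xi_t = X * t + np.sqrt(t) * rng.standard_normal()
    print(t, xi_t / t)                   # approaches 0.3 as t increases
```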


We conclude our discussion of Brownian information with the following remark.

In problems involving prediction and valuation, it is not uncommon that the message is revealed after the passage of a finite amount of time.

This is often the case in applications to finance, where the message takes the form of a random cash flow at some future date, or, more generally, a random factor that affects such a cash flow.

There are numerous examples coming from the physical sciences, economics and operations research where the goal of an agent is to form a view concerning the outcome of a future event by monitoring the flow of information relating to it.

One way of modelling such situations in the present context is by use of a time change.

If ξt is a Lévy information process with message X and a specified fiducial exponent, then a generalisation of Proposition 1 shows that the process {ξtT} defined over the time interval 0 ≤ t < T by

$$ \xi_{tT} = \frac{T - t}{T}\, \xi\!\left( \frac{tT}{T - t} \right) \qquad (19) $$

reveals the value of X in the limit as t → T.


One can check for 0 ≤ s ≤ t < T that

$$ \mathrm{Cov}\big[ \xi_{sT}, \xi_{tT} \,|\, \mathcal{F}^X \big] = \frac{s(T - t)}{T}\, \psi_0''(X). \qquad (20) $$

In the case where ξt is a Brownian information process represented as above in the form ξt = Xt + Bt, the time-changed process takes the form

$$ \xi_{tT} = Xt + \beta_{tT}, \qquad (21) $$

where {βtT} is a Brownian bridge over the interval [0, T].
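A path of the bridge-based information process (21) can be simulated directly, for instance using the construction βtT = Bt − (t/T)B_T; this particular construction and the parameter values below are illustrative assumptions. Note that ξ_TT = XT, so the message is revealed exactly at time T.

```python
import numpy as np

# Brownian bridge information xi_{tT} = X t + beta_{tT} on [0, T].
rng = np.random.default_rng(3)
T, n, X = 1.0, 1000, 0.8
dt = T / n
t = np.linspace(0.0, T, n + 1)
B = np.concatenate([[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n))])
beta = B - (t / T) * B[-1]               # bridge: pinned to zero at 0 and T
xi = X * t + beta
print(xi[-1] / T)                        # equals X exactly at t = T
```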

Such processes have had applications in physics (Brody & Hughston 2005, 2006; see also Adler et al. 2001, Brody & Hughston 2002) and in finance (Brody et al. 2007, 2008a, Rutkowski & Yu 2007, Brody et al. 2009, Filipović et al. 2012).

It seems reasonable to conjecture that time-changed Lévy information processes of the more general type proposed above may be similarly applicable.

Example 2: Poisson information. Consider a situation in which an agent observes a series of events taking place at a random rate, and the agent wishes to determine this unknown rate as best as possible, since its value conveys an important piece of information.


One can model the information flow in this example by a modulated Poisson process for which the jump rate is itself an independent random variable.

Such a scenario arises in many real-world situations, and has been investigated in the literature (Segall & Kailath 1975, Segall et al. 1975, Brémaud 1981, Di Masi & Runggaldier 1983, Kailath & Poor 1998).

The Segall–Kailath scheme can be seen to emerge rather naturally as an example of our general model for Lévy information.

As in the Brownian case, one can construct the relevant information process directly. The setup is as follows.

On a probability space (Ω, F, P), let {N(t)}t≥0 be a standard Poisson process with jump rate m > 0, let X be an independent random variable, and set

$$ \xi_t = N(\mathrm{e}^X t). \qquad (22) $$

Thus ξt is a time-changed Poisson process, and the effect of the signal is to randomly modulate the rate at which the process jumps.


It is evident that ξt is F^X-conditionally Lévy and satisfies the conditions of Definition 1.

In particular, we have

$$ \mathbb{E}\big[ \exp\big( \alpha N(\mathrm{e}^X t) \big) \,|\, \mathcal{F}^X \big] = \exp\big( m\mathrm{e}^X(\mathrm{e}^{\alpha} - 1)\, t \big), \qquad (23) $$

and for fixed X one obtains a Poisson process with rate m e^X.

It follows that (22) is an information process.

The fiducial exponent is ψ0(α) = m(e^α − 1).

A calculation using (10) shows that ψX(α) = m e^X (e^α − 1).

The relation between signal and noise in the case of Poisson information is rather subtle.

The noise is associated with the random fluctuations of the inter-arrival times of the jumps, whereas the message determines the average rate at which the jumps occur.


In the case of Poisson information the conditional distribution of X is

$$ \pi_t(\mathrm{d}x) = \frac{\exp\big( x\xi_t - m(\mathrm{e}^x - 1)t \big)}{\int \exp\big( x\xi_t - m(\mathrm{e}^x - 1)t \big)\, \pi(\mathrm{d}x)}\, \pi(\mathrm{d}x). \qquad (24) $$

Example 3: Gamma information.

Let m and κ be positive numbers. By a gamma process with rate m and scale κ on a probability space (Ω, F, P) we mean a Lévy process {γt}t≥0 with exponent

$$ t^{-1} \ln \mathbb{E}^{\mathbb{P}}[\exp(\alpha\gamma_t)] = -m \ln(1 - \kappa\alpha) \qquad (25) $$

for α ∈ A_C = {w ∈ C : Re w < κ⁻¹}.

The probability density for γt is

$$ \mathbb{P}(\gamma_t \in \mathrm{d}x) = \mathbf{1}\{x > 0\}\, \frac{\kappa^{-mt}\, x^{mt-1} \exp(-x/\kappa)}{\Gamma[mt]}\, \mathrm{d}x, \qquad (26) $$

where Γ[a] is the gamma function.

A short calculation making use of the functional equation Γ[a + 1] = aΓ[a] shows that E^P[γt] = mκt and Var^P[γt] = mκ²t.

Clearly, the mean and variance determine the rate and scale.


The Lévy measure in this example is given by

$$ \nu(\mathrm{d}z) = \mathbf{1}\{z > 0\}\, m z^{-1} \exp(-z/\kappa)\, \mathrm{d}z. \qquad (27) $$

One can check that ν(R\{0}) = ∞, and thus that the gamma process has infinite activity.

If κ = 1 we say that γt is a standard gamma process with rate m, and in that case one finds that κγt is a scaled gamma process with rate m and scale κ.

Now let γt be a standard gamma process with rate m on a probability space (Ω, F, P0), and let λ ∈ R satisfy λ < 1.

Then the process {ρ_t^λ} defined by

$$ \rho_t^{\lambda} = (1 - \lambda)^{mt}\, \mathrm{e}^{\lambda\gamma_t} \qquad (28) $$

is an (Ft, P0)-martingale.

If we let ρ_t^λ act as a change-of-measure density for the transformation P0 → Pλ, then we find that γt is a scaled gamma process under Pλ, with rate m and scale 1/(1 − λ). Thus the effect of an Esscher transformation on a gamma process is to alter its scale.

With these facts in mind, one can establish the following result.


Proposition 5. Let γt be a standard gamma process with rate m on a probability space (Ω, F, P), and let the independent random variable X satisfy X < 1 almost surely. Then the process ξt defined by

$$ \xi_t = \frac{1}{1 - X}\, \gamma_t \qquad (29) $$

is a Lévy information process with message X and gamma noise, with fiducial exponent ψ0(α) = −m ln(1 − α) for α ∈ {w ∈ C : Re w < 1}.

Proof. It is evident that ξt is F^X-conditionally a scaled gamma process. As a consequence of (25) we have

$$ t^{-1} \ln \mathbb{E}^{\mathbb{P}}[\exp(\alpha\xi_t) \,|\, X] = t^{-1} \ln \mathbb{E}^{\mathbb{P}}\bigg[ \exp\bigg( \frac{\alpha}{1 - X}\, \gamma_t \bigg) \,\bigg|\, X \bigg] = -m \ln\bigg( 1 - \frac{\alpha}{1 - X} \bigg) $$

for α ∈ C_I. Then we note that

$$ -m \ln\bigg( 1 - \frac{\alpha}{1 - X} \bigg) = -m \ln\big( 1 - (X + \alpha) \big) + m \ln\big( 1 - X \big), \qquad (30) $$

from which it follows that the F^X-conditional P exponent of ξt is ψ0(X + α) − ψ0(X).

The gamma filter arises as follows.


An agent observes a process of accumulation: typically there are many small increments, but now and then there are large increments.

The unknown rate at which the process is growing on average is an important figure that the agent wishes to determine as accurately as possible.

The accumulation process can be modelled by gamma information, and the associated filter can be utilised to estimate the growth rate.

It has long been recognised that the gamma process is useful in characterising phenomena such as the water level of a dam or the totality of the claims made in a large portfolio of insurance contracts (Gani 1957, Kendall 1957, Gani & Pyke 1960). Use of the gamma information process and related bridge processes, with applications in finance and insurance, is pursued in Brody et al. (2008b), Hoyle (2010), and Hoyle et al. (2011).

For the conditional distribution we deduce that

$$ \pi_t(\mathrm{d}x) = \frac{(1 - x)^{mt} \exp(x\xi_t)}{\int_{-\infty}^{1} (1 - x)^{mt} \exp(x\xi_t)\, \pi(\mathrm{d}x)}\, \pi(\mathrm{d}x), \qquad (31) $$

and this gives us the optimal filter for the case of gamma information.
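As with the Poisson case, the gamma filter (31) is straightforward to exercise numerically; the sketch below simulates ξt = γt/(1 − X) as in Proposition 5 and applies (31) to a discrete prior, all parameter values being illustrative assumptions.

```python
import numpy as np

# Gamma information (Proposition 5) and the gamma filter (31).
rng = np.random.default_rng(5)
m, t = 2.0, 5.0
x_vals = np.array([-0.5, 0.0, 0.5])           # support must lie in x < 1
prior = np.array([1/3, 1/3, 1/3])
X = rng.choice(x_vals, p=prior)               # hidden message

gamma_t = rng.gamma(shape=m * t, scale=1.0)   # standard gamma process at time t
xi_t = gamma_t / (1.0 - X)

log_w = m * t * np.log(1.0 - x_vals) + x_vals * xi_t + np.log(prior)
w = np.exp(log_w - log_w.max())
post = w / w.sum()
print(X, post, post @ x_vals)                 # posterior and best estimate
```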


Example 4: Variance-gamma information. The so-called variance-gamma or VG process (Madan & Seneta 1990, Madan & Milne 1991, Madan et al. 1998) was introduced in the theory of finance and has been extensively investigated.

The relevant definitions and conventions are as follows.

By a VG process with drift µ ∈ R, volatility σ ≥ 0, and rate m > 0, we mean a Lévy process with exponent

$$ \psi(\alpha) = -m \ln\bigg( 1 - \frac{\mu}{m}\,\alpha - \frac{\sigma^2}{2m}\,\alpha^2 \bigg). \qquad (32) $$

The VG process admits representations in terms of simpler Lévy processes.

Let γt be a standard gamma process on (Ω, F, P) with rate m, and let Bt be a standard Brownian motion, independent of γt. We call the scaled process Γt defined by Γt = m⁻¹γt a gamma subordinator with rate m. Note that Γt has dimensions of time and that E^P[Γt] = t.

A calculation shows that the Lévy process Vt defined by

$$ V_t = \mu\Gamma_t + \sigma B_{\Gamma_t} \qquad (33) $$

has the exponent (32).


If µ = 0 and σ = 1, we say that Vt is a "standard" VG process, with rate parameter m.

If µ ≠ 0, we say that Vt is a "drifted" VG process.

One can always choose units of time such that m = 1, but for applications it is better to choose conventional units of time (e.g., seconds for physical applications, years for economic applications), and treat m as a model parameter.

In the limiting case σ → 0 we obtain a gamma process with rate parameter m and scale parameter µ/m.

In the limiting case m → ∞ we obtain a Brownian motion with drift µ and volatility σ.

An important alternative representation of the VG process results if we let γ^1_t and γ^2_t be a pair of independent standard gamma processes on (Ω, F, P), each with rate m, and set

$$ V_t = \kappa_1\gamma^1_t - \kappa_2\gamma^2_t, \qquad (34) $$

where κ1 and κ2 are nonnegative constants.


A calculation shows that the exponent is of the form (32).

In particular, we have

$$ \psi(\alpha) = -m \ln\big( 1 - (\kappa_1 - \kappa_2)\alpha - \kappa_1\kappa_2\,\alpha^2 \big), \qquad (35) $$

where µ = m(κ1 − κ2) and σ² = 2mκ1κ2, or equivalently

$$ \kappa_1 = \frac{1}{2m}\Big( \mu + \sqrt{\mu^2 + 2m\sigma^2} \Big) \quad \text{and} \quad \kappa_2 = \frac{1}{2m}\Big( -\mu + \sqrt{\mu^2 + 2m\sigma^2} \Big), \qquad (36) $$

where α ∈ {w ∈ C : −1/κ2 < Re w < 1/κ1}.

Now let ξt be a standard VG process on a probability space (Ω, F, P0), with exponent ψ0(α) = −m ln(1 − (2m)⁻¹α²) for α ∈ {w ∈ C : |Re w| < √(2m)}.

Under the transformed measure Pλ defined by the change-of-measure martingale (6), one finds that ξt is a drifted VG process, with

$$ \mu = \lambda\bigg( 1 - \frac{1}{2m}\lambda^2 \bigg)^{-1} \quad \text{and} \quad \sigma = \bigg( 1 - \frac{1}{2m}\lambda^2 \bigg)^{-\frac{1}{2}} \qquad (37) $$

for |λ| < √(2m).

Thus in the case of the VG process an Esscher transformation affects both the drift and the volatility.


Note that for large m the effect on the volatility is insignificant, whereas the effect on the drift reduces to that of an ordinary Girsanov transformation.

With these facts in hand, we are now in a position to construct the VG information process.

We fix a probability space (Ω, F, P) and a number m > 0.

Proposition 6. Let Γt be a standard gamma subordinator with rate m, let Bt be an independent standard Brownian motion, and let the independent random variable X satisfy |X| < √(2m) almost surely. Then the process ξt defined by

$$ \xi_t = X\bigg( 1 - \frac{1}{2m}X^2 \bigg)^{-1}\Gamma_t + \bigg( 1 - \frac{1}{2m}X^2 \bigg)^{-\frac{1}{2}} B(\Gamma_t) \qquad (38) $$

is a Lévy information process with message X and VG noise, with fiducial exponent

$$ \psi_0(\alpha) = -m \ln\bigg( 1 - \frac{1}{2m}\alpha^2 \bigg) \qquad (39) $$

for α ∈ {w ∈ C : |Re w| < √(2m)}.


An alternative representation for the VG information process can be established if one randomly rescales the gamma subordinator appearing in the time-changed Brownian motion.

The result is as follows.

Proposition 7. Let Γt be a gamma subordinator with rate m, let Bt be an independent standard Brownian motion, and let the independent random variable X satisfy |X| < √(2m) almost surely. Write Γ^X_t for the subordinator defined by

$$ \Gamma^X_t = \bigg( 1 - \frac{1}{2m}X^2 \bigg)^{-1}\Gamma_t. \qquad (40) $$

Then the process ξt defined by ξt = XΓ^X_t + B(Γ^X_t) is a VG information process with message X.

A further representation of the VG information process arises as a consequence of the representation of the VG process as the asymmetric difference between two independent standard gamma processes.

In particular, we have the following:


Proposition 8. Let γ^1_t and γ^2_t be independent standard gamma processes, each with rate m, and let the independent random variable X satisfy |X| < √(2m) almost surely. Then the process ξt defined by

$$ \xi_t = \frac{1}{\sqrt{2m} - X}\, \gamma^1_t - \frac{1}{\sqrt{2m} + X}\, \gamma^2_t \qquad (41) $$

is a VG information process with message X.
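The representation (41) makes simulation of VG information particularly easy. The sketch below (illustrative parameters) also checks that the empirical growth rate t⁻¹ξt approaches the inverse-filter target ψ0′(X) = X(1 − X²/2m)⁻¹, in keeping with Proposition 1.

```python
import numpy as np

# VG information via Proposition 8: a weighted difference of gamma processes.
rng = np.random.default_rng(6)
m, t, X = 2.0, 500.0, 0.4                 # requires |X| < sqrt(2m) = 2
g1 = rng.gamma(shape=m * t, scale=1.0)    # gamma^1_t
g2 = rng.gamma(shape=m * t, scale=1.0)    # gamma^2_t
xi_t = g1 / (np.sqrt(2 * m) - X) - g2 / (np.sqrt(2 * m) + X)
print(xi_t / t, X / (1 - X**2 / (2 * m)))  # empirical rate vs psi0'(X)
```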

Example 5: Negative-binomial information. By a negative binomial process with rate parameter m and probability parameter q, where m > 0 and 0 < q < 1, we mean a Lévy process with exponent

$$ \psi_0(\alpha) = m \ln\bigg( \frac{1 - q}{1 - q\mathrm{e}^{\alpha}} \bigg), \qquad \alpha \in \{w \in \mathbb{C} : \mathrm{Re}\,w < -\ln q\}. \qquad (42) $$

There are two representations for the negative binomial process. The first is a compound Poisson process for which the jump size J ∈ N has a logarithmic distribution

$$ \mathbb{P}_0(J = n) = -\frac{1}{\ln(1 - q)}\, \frac{1}{n}\, q^n, \qquad (43) $$

and the intensity determining the timing of the jumps is −m ln(1 − q).


One finds that the characteristic function of J is

$$ \phi_0(\alpha) := \mathbb{E}^{\mathbb{P}_0}[\exp(\alpha J)] = \frac{\ln(1 - q\mathrm{e}^{\alpha})}{\ln(1 - q)} \qquad (44) $$

for α ∈ {w ∈ C : Re w < −ln q}.

Then if we set

$$ n_t = \sum_{k=1}^{\infty} \mathbf{1}\{k \leq N_t\}\, J_k, \qquad (45) $$

where {Nt} is a Poisson process with rate −m ln(1 − q), and {Jk}k∈N denotes a collection of independent identical copies of J, representing the jumps, a calculation shows that

$$ \mathbb{P}_0(n_t = k) = \frac{\Gamma(k + mt)}{\Gamma(mt)\,\Gamma(k + 1)}\, q^k (1 - q)^{mt}, \qquad (46) $$

and that the resulting exponent is given by (42).

The second representation of the negative binomial process makes use of the method of subordination.

We take a Poisson process with rate Λ = mq/(1 − q), and time-change it using a gamma subordinator Γt with rate parameter m.


The moment generating function thus obtained, in agreement with (42), is

$$ \mathbb{E}^{\mathbb{P}_0}\big[ \exp\big( \alpha N(\Gamma_t) \big) \big] = \mathbb{E}^{\mathbb{P}_0}\big[ \exp\big( \Lambda(\mathrm{e}^{\alpha} - 1)\Gamma_t \big) \big] = \bigg( \frac{1 - q}{1 - q\mathrm{e}^{\alpha}} \bigg)^{mt}. \qquad (47) $$

With these results in mind, we fix a probability space (Ω, F, P) and find the following:

Proposition 9. Let Γt be a gamma subordinator with rate m, let Nt be an independent Poisson process with rate m, let the independent random variable X satisfy X < −ln q almost surely, and set

$$ \Gamma^X_t = \bigg( \frac{q\mathrm{e}^X}{1 - q\mathrm{e}^X} \bigg)\Gamma_t. \qquad (48) $$

Then the process ξt defined by

$$ \xi_t = N(\Gamma^X_t) \qquad (49) $$

is a Lévy information process with message X and negative binomial noise, with fiducial exponent (42).
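A sample of the subordinated representation (48)-(49) can be generated in two steps: draw Γt from the gamma law, then draw a Poisson count with the corresponding conditional mean. The sketch below does this under illustrative parameter choices.

```python
import numpy as np

# Negative-binomial information via subordination (Proposition 9).
rng = np.random.default_rng(7)
m, q, t, X = 2.0, 0.3, 20.0, 0.5                   # requires X < -ln q
Gamma_t = rng.gamma(shape=m * t, scale=1.0 / m)    # gamma subordinator, mean t
Gamma_X = (q * np.exp(X) / (1.0 - q * np.exp(X))) * Gamma_t
xi_t = rng.poisson(m * Gamma_X)                    # Poisson of rate m at time Gamma_X
print(xi_t, m * t * q * np.exp(X) / (1.0 - q * np.exp(X)))  # sample vs mean
```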


There is also a representation for negative binomial information based on the compound Poisson process.

This can be obtained by an application of the result that shows how the Lévy measure transforms under a random Esscher transformation.

In the case of a negative binomial process with parameters m and q, the Lévy measure is given by

$$ \nu(\mathrm{d}z) = m \sum_{n=1}^{\infty} \frac{1}{n}\, q^n\, \delta_n(\mathrm{d}z), \qquad (50) $$

where δn(dz) denotes the Dirac measure with unit mass at the point z = n.

The Lévy measure is finite in this case, and we have ν(R) = −m ln(1 − q), which is the overall rate at which the compound Poisson process jumps.

If one normalises the Lévy measure with the overall jump rate, one obtains the probability measure (43) for the jump size.

With these facts in mind, we fix a probability space (Ω, F, P) and specify the constants m and q, where m > 0 and 0 < q < 1. Then, as a consequence of the transformation result mentioned above, we have the following:


Proposition 10. Let the random variable X satisfy X < −ln q almost surely, let the random variable J^X have the conditional distribution

$$ \mathbb{P}(J^X = n \,|\, X) = -\frac{1}{\ln(1 - q\mathrm{e}^X)}\, \frac{1}{n}\, (q\mathrm{e}^X)^n, \qquad (51) $$

let {J^X_k}k∈N be a collection of conditionally independent identical copies of J^X, and let Nt be an independent Poisson process with rate m. Then the process ξt defined by

$$ \xi_t = \sum_{k=1}^{\infty} \mathbf{1}\big\{ k \leq N\big( -\ln(1 - q\mathrm{e}^X)\,t \big) \big\}\, J^X_k \qquad (52) $$

is a Lévy information process with message X and negative binomial noise, with fiducial exponent (42).
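The compound-Poisson form (52) can likewise be sampled directly, drawing the jump count from a Poisson law and the jump sizes from the logarithmic distribution (51); NumPy's logseries sampler provides the latter. The parameter values below are illustrative assumptions; for the same (m, q, t, X) the conditional mean matches that of the subordinated representation above.

```python
import numpy as np

# Negative-binomial information via the compound-Poisson form (52).
rng = np.random.default_rng(9)
m, q, t, X = 2.0, 0.3, 20.0, 0.5          # requires X < -ln q
p = q * np.exp(X)                         # parameter of the conditional law (51)
n_jumps = rng.poisson(m * t * (-np.log(1.0 - p)))  # N(-ln(1 - q e^X) t), rate m
jumps = rng.logseries(p, size=n_jumps)    # logarithmic jump sizes J^X_k
xi_t = jumps.sum()
print(xi_t, m * t * p / (1.0 - p))        # sample vs conditional mean given X
```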


Conclusions

We wrap up this study of Lévy information with the following remarks.

Applications to physical (Brody & Hughston 2006) and economic (Brody et al. 2008a) time series bring out the following points:

(i) Signal processing techniques have far-reaching applications to the identification, characterisation and categorisation of phenomena, both in the natural and in the social sciences.

(ii) Beyond the conventional remits of prediction, filtering, and smoothing there is a fourth and important new domain of applicability: the description of phenomena, both in the physical and social sciences.

In particular, there is scope for the modelling of asset price processes.

In some financial models the "message" is the state variable that determines a cash flow, and the associated information process generates the filtration representing market information.

In another class of financial models the hidden variable represents the excess rate of return per unit of risk demanded by investors in exchange for investment in a risky asset.


References

[1] Adler, S. L., Brody, D. C., Brun, T. A. & Hughston, L. P. 2001 Martingale models for quantum state reduction. J. Phys. A34, 8795–8820. (doi:10.1088/0305-4470/34/42/306)

[2] Ahn, H. & Feldman, R. E. 1999 Optimal filtering of a Gaussian signal in the presence of Lévy noise. SIAM J. Appl. Math. 60, 359–369. (doi:10.1137/S0036139996312004)

[3] Applebaum, D. 2004 Lévy Processes and Stochastic Calculus (Cambridge: Cambridge University Press).

[4] Bain, A. & Crisan, D. 2010 Fundamentals of Stochastic Filtering (New York: Springer).

[5] Barndorff-Nielsen, O. E. 1977 Exponentially decreasing distributions for the logarithm of particle size. Proc. Roy. Soc. London A353, 401–419. (doi:10.1098/rspa.1977.0041)

[6] Barndorff-Nielsen, O. E. 1998 Processes of normal inverse Gaussian type. Finance Stochast. 2, 41–68. (doi:10.1007/s007800050032)

[7] Bellman, R. E. 1961 Adaptive Control Processes (New Jersey: Princeton University Press).


[8] Bertoin, J. 2004 Lévy Processes (Cambridge: Cambridge University Press).

[9] Bingham, N. H. 1975 Fluctuation theory in continuous time. Adv. Appl. Prob. 7, 705–766. (www.jstor.org/stable/10.2307/1426397)

[10] Brémaud, P. 1981 Point Processes and Queues: Martingale Dynamics (New York: Springer-Verlag).

[11] Brody, D. C. & Hughston, L. P. 2002 Efficient simulation of quantum state reduction. J. Math. Phys. 43, 5254–5261. (doi:10.1063/1.1512975)

[12] Brody, D. C. & Hughston, L. P. 2005 Finite-time stochastic reduction models. J. Math. Phys. 46, 082101. (doi:10.1063/1.1990108)

[13] Brody, D. C. & Hughston, L. P. 2006 Quantum noise and stochastic reduction. J. Phys. A39, 833–876. (doi:10.1088/0305-4470/39/4/008)

[14] Brody, D. C., Hughston, L. P. & Macrina, A. 2007 Beyond hazard rates: a new framework for credit-risk modelling. In Advances in Mathematical Finance: Festschrift Volume in Honour of Dilip Madan (Basel: Birkhäuser). (doi:10.1007/978-0-8176-4545-8_13)


[15] Brody, D. C., Hughston, L. P. & Macrina, A. 2008a Information-based asset pricing. Int. J. Theoretical Appl. Finance 11, 107–142. (doi:10.1142/S0219024908004749) Reprinted as Chapter 5 in Finance at Fields, Grasselli, M. R. & Hughston, L. P., eds (Singapore: World Scientific Publishing, 2012).

[16] Brody, D. C., Hughston, L. P. & Macrina, A. 2008b Dam rain and cumulative gain. Proc. Roy. Soc. London A464, 1801–1822. (doi:10.1098/rspa.2007.0273)

[17] Brody, D. C., Davis, M. H. A., Friedman, R. L. & Hughston, L. P. 2009 Informed traders. Proc. Roy. Soc. London A465, 1103–1122. (doi:10.1098/rspa.2008.0465)

[18] Brody, D. C., Hughston, L. P. & Mackie, E. 2012 General theory of geometric Lévy models for dynamic asset pricing. Proc. Roy. Soc. London A468, 1778–1798. (doi:10.1098/rspa.2011.0670)

[19] Chan, T. 1999 Pricing contingent claims on stocks driven by Lévy processes. Ann. Appl. Prob. 9, 504–528. (www.jstor.org/stable/2667343)

[20] Davis, M. H. A. 1977 Linear Estimation and Stochastic Control (London: Chapman & Hall).


[21] Di Masi, G. B. & Runggaldier, W. J. 1983 Non-linear filtering with discontinuous observations and applications to life sciences. Bull. Math. Bio. 45, 571–577. (doi:10.1007/BF02459588)

[22] Esscher, F. 1932 On the probability function in the collective theory of risk. Skandinavisk Aktuarietidskrift 15, 175–195. (doi:10.1080/03461238.1932.10405883)

[23] Filipović, D., Hughston, L. P. & Macrina, A. 2012 Conditional density models for asset pricing. Int. J. Theoretical Appl. Finance 15, 1250002. (doi:10.1142/S0219024912500021) Reprinted as Chapter 9 in Finance at Fields, Grasselli, M. R. & Hughston, L. P., eds (Singapore: World Scientific Publishing, 2012).

[24] Gani, J. 1957 Problems in the probability theory of storage systems. J. Roy. Statist. Soc. B19, 181–206. (www.jstor.org/stable/2983814)

[25] Gani, J. & Pyke, R. 1960 The content of a dam as the supremum of an infinitely divisible process. J. Math. Mech. (Indiana Univ. Math. J.) 9, 639–651. (doi:10.1512/iumj.1960.9.09038)

[26] Gerber, H. U. & Shiu, E. S. W. 1994 Option pricing by Esscher transforms (with discussion). Trans. Soc. Actuaries 46, 99–191.


[27] Grigelionis, B. & Mikulevičius, R. 2011 Nonlinear filtering equations for stochastic processes with jumps. In The Oxford Handbook of Nonlinear Filtering, Crisan, D. & Rozovskii, B., eds (Oxford: Oxford University Press).

[28] Heunis, A. J. 2011 The innovation problem. In The Oxford Handbook of Nonlinear Filtering, Crisan, D. & Rozovskii, B., eds (Oxford: Oxford University Press).

[29] Hoyle, E. 2010 Information-Based Models for Finance and Insurance. PhD thesis, Department of Mathematics, Imperial College London. (arXiv:1010.0829)

[30] Hoyle, E., Hughston, L. P. & Macrina, A. 2011 Lévy random bridges and the modelling of financial information. Stochast. Process. Appl. 121, 856–884. (doi:10.1016/j.spa.2010.12.003)

[31] Hubalek, F. & Sgarra, C. 2006 Esscher transform and the minimal entropy martingale measure for exponential Lévy models. Quant. Finance 6, 125–145. (doi:10.1080/14697680600573099)

[32] Kailath, T. 1974 A view of three decades of linear filtering theory. IEEE Trans. Info. Theo. 20, 146–181. (doi:10.1109/TIT.1974.1055174)


[33] Kailath, T. & Poor, H. V. 1998 Detection of stochastic processes. IEEE Trans. Info. Theo. 44, 2230–2259. (doi:10.1109/18.720538)

[34] Kallianpur, G. & Striebel, C. 1968 Estimation of stochastic systems: arbitrary system process with additive white noise observation errors. Ann. Math. Stat. 39, 785–801. (doi:10.1214/aoms/1177698311)

[35] Kalman, R. E. 1994 Randomness reexamined. Modelling, Ident. Cont. 15, 141–151. (doi:10.4173/mic.1994.3.3)

[36] Kallsen, J. & Shiryaev, A. N. 2002 The cumulant process and Esscher's change of measure. Fin. Stochastics 6, 397–428. (doi:10.1007/s007800200069)

[37] Kendall, D. G. 1957 Some problems in the theory of dams. J. Roy. Statist. Soc. B19, 207–212. (www.jstor.org/stable/2983815)

[38] Kozubowski, T. J. & Podgórski, K. 2009 Distributional properties of the negative binomial Lévy process. Probability and Mathematical Statistics 29, 43–71. (www.math.uni.wroc.pl/~pms/files/29.1/Article/29.1.3.pdf)

[39] Kyprianou, A. E. 2006 Introductory Lectures on Fluctuations of Lévy Processes with Applications (Berlin: Springer).


[40] Liptser, R. S. & Shiryaev, A. N. 2000 Statistics of Random Processes, Vols. I and II, 2nd ed. (Berlin: Springer).

[41] Madan, D. B., Carr, P. & Chang, E. C. 1998 The variance gamma process and option pricing. European Fin. Rev. 2, 79–105. (doi:10.1023/A:1009703431535)

[42] Madan, D. B. & Milne, F. 1991 Option pricing with V.G. martingale components. Mathematical Finance 1, 39–55. (doi:10.1111/j.1467-9965.1991.tb00018.x)

[43] Madan, D. B. & Seneta, E. 1990 The variance gamma (V.G.) model for share market returns. Journal of Business 63, 511–524. (doi:10.1086/296519)

[44] Meyer-Brandis, T. & Proske, F. 2004 Explicit solution of a nonlinear filtering problem for Lévy processes with application to finance. Appl. Math. Optim. 50, 119–134. (doi:10.1007/s00245-004-0798-6)

[45] Poklukar, D. R. 2006 Nonlinear filtering for jump-diffusions. J. Comput. Appl. Math. 197, 558–567. (doi:10.1016/j.cam.2005.11.014)

[46] Pontryagin, L. S., Boltyanskii, V. G., Gamkrelidze, R. V. & Mishchenko, E. F. 1962 The Mathematical Theory of Optimal Processes (New York: John Wiley & Sons).


[47] Popa, S. & Sritharan, S. S. 2009 Nonlinear filtering of Itô–Lévy stochastic differential equations with continuous observations. Commun. Stoch. Anal. 3, 313–330. (www.math.lsu.edu/cosa/3-3-01[195].pdf)

[48] Protter, P. E. 2005 Stochastic Integration and Differential Equations, 2nd ed. (New York: Springer).

[49] Rutkowski, M. 1994 Optimal linear filtering and smoothing for a discrete time stable linear model. J. Multivariate Anal. 50, 68–92. (doi:10.1006/jmva.1994.1035)

[50] Rutkowski, M. & Yu, N. 2007 On the Brody–Hughston–Macrina approach to modelling of defaultable term structure. Int. J. Theoretical Appl. Finance 10, 557–589. (doi:10.1142/S0219024907004263)

[51] Rydberg, T. H. 1997 The normal inverse Gaussian Lévy process: simulation and approximation. Commun. Statist. Stochastic Models 13, 887–910. (doi:10.1080/15326349708807456)

[52] Sato, K. 1999 Lévy Processes and Infinitely Divisible Distributions (Cambridge: Cambridge University Press).

[53] Schoutens, W. 2003 Lévy Processes in Finance: Pricing Financial Derivatives (New York: Wiley).


[54] Schrödinger, E. 1915 Zur Theorie der Fall- und Steigversuche an Teilchen mit Brownscher Bewegung. Physikalische Zeitschrift 16, 289–295.

[55] Segall, A. & Kailath, T. 1975 The modelling of randomly modulated jump processes. IEEE Trans. Information Theory 21, 135–143. (doi:10.1109/TIT.1975.1055359)

[56] Segall, A., Davis, M. H. A. & Kailath, T. 1975 Nonlinear filtering with counting observations. IEEE Trans. Information Theory 21, 143–149. (doi:10.1109/TIT.1975.1055360)

[57] Tweedie, M. C. K. 1945 Inverse statistical variates. Nature 155, 453. (doi:10.1038/155453a0)

[58] Wasan, M. T. 1968 On an inverse Gaussian process. Scand. Actuarial J. 1968, 69–96. (doi:10.1080/03461238.1968.10413264)

[59] Wiener, N. 1948 Cybernetics: or Control and Communication in the Animal and the Machine (New York: John Wiley & Sons).

[60] Wiener, N. 1949 The Extrapolation, Interpolation, and Smoothing of Stationary Time Series (New York: John Wiley & Sons).

[61] Wiener, N. 1954 The Human Use of Human Beings (Boston: Houghton Mifflin Co.).


[62] Wonham, W. M. 1965 Some applications of stochastic differential equations to optimal nonlinear filtering. J. Soc. Indus. Appl. Math. A2, 347–369. (doi:10.1137/0302028)

[63] Xiong, J. 2008 An Introduction to Stochastic Filtering Theory (Oxford: Oxford University Press).

[64] Yor, M. 2007 Some remarkable properties of gamma processes. In Advances in Mathematical Finance: Festschrift Volume in Honour of Dilip Madan, R. Elliott, M. Fu, R. Jarrow & J.-Y. Yen, eds (Basel: Birkhäuser). (doi:10.1007/978-0-8176-4545-8_3)
