The Stochastic Equation Y_{n+1} = A_n Y_n + B_n with Stationary Coefficients
Author(s): Andreas Brandt
Source: Advances in Applied Probability, Vol. 18, No. 1 (Mar., 1986), pp. 211-220
Published by: Applied Probability Trust
Stable URL: http://www.jstor.org/stable/1427243
Accessed: 13/06/2014 09:15
This content downloaded from 185.2.32.109 on Fri, 13 Jun 2014 09:15:49 AM. All use subject to JSTOR Terms and Conditions.
Adv. Appl. Prob. 18, 211-220 (1986)
Printed in N. Ireland
© Applied Probability Trust 1986
THE STOCHASTIC EQUATION Y_{n+1} = A_n Y_n + B_n WITH STATIONARY COEFFICIENTS

ANDREAS BRANDT,* Humboldt-Universität zu Berlin
Abstract
In this note we deal with the stochastic difference equation of the form Y_{n+1} = A_n Y_n + B_n, n ∈ Z, where the sequence Ψ = {(A_n, B_n)}_{n=-∞}^{∞} is assumed to be strictly stationary and ergodic. By means of simple arguments a unique stationary solution {y_n(Ψ)}_{n=-∞}^{∞} of this equation is constructed. The stability of this stationary solution is the second subject of investigation: it is shown that under some additional assumptions

Ψ^r ⇒ Ψ implies {y_n(Ψ^r)}_{n=-∞}^{∞} ⇒ {y_n(Ψ)}_{n=-∞}^{∞}.
STOCHASTIC DIFFERENCE EQUATION; STATIONARY SOLUTION; ERGODICITY; MODEL STABILITY; UNIFORM STRONG LAW OF LARGE NUMBERS
0. Introduction and main results
An extensive discussion of the stochastic difference equation
(0.1)  Y_{n+1} = A_n Y_n + B_n,  n ≥ 0,

is given in Vervaat [7]. In that paper, and also in numerous references given therein, the pairs (A_n, B_n) are assumed to be i.i.d. R^2-valued random variables. However, from the point of view of the applications mentioned in [7] this assumption seems restrictive.
Recently, stochastic equations of the form
Y_{n+1} = f(Y_n, (A_n, B_n)),  n ≥ 0,
have been investigated for several functions f, particularly in the framework of queueing theory; cf. for example Loynes [6], Borovkov [2], [3], Lisek [5], Brandt et al. [4]. In these papers the sequence {(A_n, B_n)}_{n=0}^{∞} was assumed to be stationary and ergodic, and in general no independence assumptions were made. Thus, the first aim of this note is to find conditions that ensure the existence of a uniquely determined stationary solution of (0.1) if the
Received 10 April 1984; revision received 20 November 1984.
* Postal address: Sektion Mathematik, Humboldt-Universität zu Berlin, PSF 1297, 1086 Berlin, German Democratic Republic.
sequence {(A_n, B_n)}_{n=0}^{∞} is stationary and ergodic. For solving this problem it is more convenient to deal with the equation

(0.2)  Y_{n+1} = A_n Y_n + B_n,  n ∈ Z,

where {(A_n, B_n)}_{n=-∞}^{∞} is a stationary and ergodic sequence of R^2-valued random variables (Z denotes the set of integers). For a sequence {x_n}_{n=-∞}^{∞} we often write merely {x_n}, for convenience. Then we also discuss a related problem, namely the convergence of
(0.3)  y_n(Y, Ψ) = Σ_{j=0}^{n-1} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1} + (Π_{i=0}^{n-1} A_i) Y
as n tends to ∞, where Y is an arbitrary R-valued random variable which is defined on the same probability space as Ψ and is not necessarily independent of Ψ. Obviously, y_n(Y, Ψ) can be interpreted as the state at time n of a system governed by (0.2) and by the 'input' Ψ if it starts at the origin with the initial random state Y. In Section 1 we prove the following result.
Theorem 1. If the sequence Ψ = {(A_n, B_n)} is stationary and ergodic and one of the conditions

(0.4)  -∞ ≤ E log|A_0| < 0 and E(log|B_0|)^+ < ∞, where x^+ = max(0, x) for x ∈ R,

or

(0.5)  P(A_0 = 0) > 0
is satisfied, then
(0.6)  y_n(Ψ) = Σ_{j=0}^{∞} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1},  n ∈ Z,

is the only proper stationary solution of (0.2) for the given Ψ (we set Π_{i=n}^{n-1} A_i = 1). The sum on the right-hand side of (0.6) converges absolutely almost surely. Furthermore,
(0.7)  P(lim_{n→∞} (y_n(Y, Ψ) - y_n(Ψ)) = 0) = 1,
for arbitrary random variables Y defined on the same basic probability space as Ψ; in particular,

(0.8)  y_n(Y, Ψ) →_d y_0(Ψ).

Here →_d denotes convergence in distribution.
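As an aside not in the original paper, the forgetting property (0.7) is easy to observe numerically. The sketch below uses arbitrary coefficient laws satisfying (0.4) (A_n uniform on (0.2, 0.8), so E log|A_0| < 0, and B_n standard normal): two trajectories of (0.2) driven by the same 'input' but started from very different initial states coincide numerically after a moderate number of steps, since their difference equals Π_{i=0}^{n-1} A_i times the initial difference.

```python
import random

# Illustration only (not from the paper): iterate Y_{n+1} = A_n Y_n + B_n
# with the same coefficient sequence from two different initial states.
# Coefficient laws are arbitrary choices satisfying condition (0.4):
# A_n ~ Uniform(0.2, 0.8), so E log|A_0| < 0, and B_n ~ N(0, 1).
random.seed(1)

y1, y2 = 100.0, -100.0                # two very different initial states
for _ in range(200):
    a = random.uniform(0.2, 0.8)
    b = random.gauss(0.0, 1.0)
    y1, y2 = a * y1 + b, a * y2 + b   # same 'input' (A_n, B_n) for both

# |y1 - y2| equals 200 * (product of the A_n), which is numerically 0 here:
print(abs(y1 - y2) < 1e-12)  # True
```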
Remark. The proof of Theorem 1 given in Section 1 shows that under appropriate conditions the results (0.6), (0.7) and (0.8) can also be extended to non-ergodic Ψ.
The form of the stationary solution (0.6) immediately yields the following corollary, cf. also Lemma 1.1 and Theorem 1.6 in [7].
Corollary. Let Ψ = {(A_n, B_n)} satisfy the assumptions of Theorem 1 and assume that the pairs (A_n, B_n) are i.i.d. Then y_n(Ψ) and (A_n, B_n) are independent and

y_0(Ψ) =_d A_0 y_0(Ψ) + B_0,

where =_d denotes equality in distribution.
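This distributional fixed-point property lends itself to a quick Monte Carlo sanity check (an illustration only, with an arbitrary i.i.d. choice A_n ~ Uniform(0, 1/2) and B_n ~ Uniform(0, 1), which satisfies (0.4)). Approximating y_0(Ψ) by truncating the backward series (0.6), the means of y_0(Ψ) and of A_0 y_0(Ψ) + B_0 with a fresh independent pair (A_0, B_0) should both be close to E B_0 / (1 - E A_0) = 2/3.

```python
import random

# Monte Carlo sanity check (illustration only) of y_0 =_d A_0 y_0 + B_0
# for i.i.d. coefficients: A_n ~ Uniform(0, 0.5), B_n ~ Uniform(0, 1),
# an arbitrary choice satisfying condition (0.4).
random.seed(0)

def sample_y0(depth=60):
    """One approximate draw of y_0 via the truncated backward series (0.6)."""
    total, prod = 0.0, 1.0
    for _ in range(depth):
        total += prod * random.uniform(0.0, 1.0)  # term (prod A_i) * B_{-j-1}
        prod *= random.uniform(0.0, 0.5)          # extend the product of A_i
    return total

n = 20000
ys = [sample_y0() for _ in range(n)]
zs = [random.uniform(0.0, 0.5) * sample_y0() + random.uniform(0.0, 1.0)
      for _ in range(n)]                          # fresh (A_0, B_0), independent y_0

mean_y = sum(ys) / n
mean_z = sum(zs) / n
print(abs(mean_y - mean_z) < 0.05)   # the two means agree (both near 2/3)
```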
Remark. We point out that the assumptions of Theorem 1 are the same as those of Theorem 1.6 in Vervaat [7].
Investigations of the queueing models mentioned above also suggest the problem of model continuity, that is, the continuous dependence of the stationary solution {y_n(Ψ)} on Ψ. As far as we know, this problem has not been discussed for the equation (0.2), but we consider it to be of practical importance and interest in terms of approximations and statistical inference.
Theorem 2. Consider stationary and ergodic sequences Ψ, Ψ^1, Ψ^2, ... satisfying the following conditions:

(0.9)  Ψ^r ⇒ Ψ,

(0.10)  a^r → a,  E(log|A_0^r|)^+ → E(log|A_0|)^+,

(0.11)  b^r → b,  E(log|B_0^r|)^+ → E(log|B_0|)^+,

where the moments a = E log|A_0|, a^r = E log|A_0^r|, b = E log|B_0| and b^r = E log|B_0^r| are finite and -∞ < a < 0, -∞ < a^r < 0, r ≥ 1. Let {y_n(Ψ)}, {y_n(Ψ^1)}, {y_n(Ψ^2)}, ... be the stationary solutions of (0.2) for Ψ, Ψ^1, Ψ^2, .... Then

(0.12)  {(A_n^r, B_n^r, y_n(Ψ^r))}_{n=-∞}^{∞} ⇒ {(A_n, B_n, y_n(Ψ))}_{n=-∞}^{∞}

in (R^3)^Z with the product topology. In particular,

(0.13)  {y_n(Ψ^r)} ⇒ {y_n(Ψ)}.
The proof of Theorem 2, given in Section 2, is quite simple but uses a profound result by Borovkov [3].
If condition (0.10) or (0.11) in Theorem 2 is not satisfied, then assertion (0.12) does not hold in general, as the following examples show.
Example 1. Consider Ψ = {(A_n, B_n)} defined by P(A_n = 1/2) = P(B_n = 1) = 1 and the sequences Ψ^k = {(A_n^k, B_n^k)} of i.i.d. random pairs with P(A_n^k = exp(k/2)) = 1 - P(A_n^k = 1/2) = 1/k and P(B_n^k = 1) = 1. Then the assumptions of Theorem 2 are satisfied except for (0.10). By means of easy calculations we find

P(y_0(Ψ^k) ≥ 1 + (e/2)^{k/2}) ≥ 1 - (1 - 1/k)^{k/2} → 1 - e^{-1/2}

and P(y_0(Ψ) = 2) = 1, i.e. (0.12) does not hold.
Example 2. Let Ψ = {(A_n, B_n)} be as in Example 1 and let the sequences Ψ^k = {(A_n^k, B_n^k)} of i.i.d. random pairs satisfy P(A_n^k = 1/2) = 1 and P(B_n^k = exp(k)) = 1 - P(B_n^k = 1) = 1/k. Then the assumptions of Theorem 2 are satisfied except for (0.11). As

P(y_0(Ψ^k) ≥ 1 + (e/2)^k) ≥ 1 - (1 - 1/k)^k → 1 - e^{-1},

(0.12) does not hold.
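The mechanism of Example 1 can also be seen by simulation. The sketch below (an illustration only; the parameters are those of Example 1, i.e. A_n = 1/2 and B_n = 1 under Ψ, while Ψ^k replaces A_n by exp(k/2) with probability 1/k, and the backward series (0.6) is truncated at a fixed depth) estimates P(y_0(Ψ^k) > 10): it stays bounded away from 0 as k grows, even though Ψ^k ⇒ Ψ and y_0(Ψ) = 2 a.s.

```python
import math
import random

# Simulation sketch for Example 1 (illustration only; the backward series
# (0.6) is truncated at a fixed depth). Under Psi, A_n = 1/2 and B_n = 1,
# so y_0(Psi) = 2 a.s.; under Psi^k, A_n^k = exp(k/2) with probability 1/k.
random.seed(2)

def sample_y0(k, depth=200):
    """Approximate draw of y_0(Psi^k) via the truncated series (0.6)."""
    total, prod = 0.0, 1.0
    for _ in range(depth):
        total += prod * 1.0                    # B = 1 a.s.
        big = random.random() < 1.0 / k
        prod *= math.exp(k / 2.0) if big else 0.5
    return total

fracs = []
for k in (5, 20, 80):
    frac = sum(sample_y0(k) > 10.0 for _ in range(2000)) / 2000
    fracs.append(frac)
    print(k, frac)

# Each estimated probability stays well above 0, while P(y_0(Psi) > 10) = 0:
print(min(fracs) > 0.2)
```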
1. Proof of Theorem 1
Before proving Theorem 1, we introduce some notation and establish a lemma. For the sequence Ψ = {(A_n, B_n)} of R^2-valued random pairs (A_n, B_n) and the random variable Y we denote

y_n^k(Y, Ψ) = Σ_{j=0}^{k-1} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1} + (Π_{i=n-k}^{n-1} A_i) Y,  n ∈ Z, k ≥ 0.
The random variable y_n^k(Y, Ψ) can be interpreted as the state of a system governed by the equation (0.2) and the 'input' Ψ at time n if it was started at time n - k with the initial random state Y. Obviously, y_n^n(Y, Ψ) = y_n(Y, Ψ), n ≥ 0, and

y_n(Ψ) = lim_{k→∞} y_n^k(0, Ψ) = Σ_{j=0}^{∞} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1}

if the limit exists (pointwise). Thus, y_n(Ψ) can be interpreted as the state at time n of a system governed by (0.2) and the 'input' Ψ if it starts at -∞ with the initial state 0.
Lemma 1.1. If Ψ satisfies the assumptions of Theorem 1, then the series on the right-hand side of (0.6) converges absolutely almost surely.
Proof. 1. Assumption (0.4) and the strong law of large numbers yield

limsup_{k→∞} (1/k)(Σ_{i=1}^{k} log|A_{n-i}| + log|B_{n-k-1}|) < 0 a.s.,

i.e.

limsup_{k→∞} log(|A_{n-1}| ··· |A_{n-k}| |B_{n-k-1}|)^{1/k} < 0 a.s.

and thus

limsup_{k→∞} (|A_{n-1}| ··· |A_{n-k}| |B_{n-k-1}|)^{1/k} < 1 a.s.
From this the lemma follows by Cauchy's root criterion.
2. When assumption (0.5) is satisfied, we obtain from the stationarity and ergodicity of Ψ

(1.1)  P(#{k < 0 : A_k = 0} = #{k ≥ 0 : A_k = 0} = ∞) = 1.

This implies that the sum y_n(Ψ) contains only finitely many non-zero summands. Hence the assertion of the lemma is true.
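The geometric decay behind the root criterion is easy to observe numerically (an aside, not from the paper; |A_i| ~ Uniform(0.1, 0.9) is an arbitrary i.i.d. choice with E log|A_0| < 0, as in (0.4)): by the strong law of large numbers the k-th root of the product Π_{i=1}^{k} |A_i| settles near exp(E log|A_0|), which is strictly below 1.

```python
import math
import random

# Illustration only: by the SLLN, (|A_1| ... |A_k|)^{1/k} -> exp(E log|A_0|)
# a.s. Here |A_i| ~ Uniform(0.1, 0.9) is an arbitrary choice with
# E log|A_0| < 0, so the root limit lies strictly below 1 and the series
# (0.6) converges absolutely by Cauchy's root criterion.
random.seed(3)

k = 100_000
log_sum = sum(math.log(random.uniform(0.1, 0.9)) for _ in range(k))
root = math.exp(log_sum / k)     # k-th root of the product of the |A_i|

print(0 < root < 1)  # True: strictly inside (0, 1)
```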
Remark 1.2. The proof of Lemma 1.1 is similar to that of Lemma 1.7 in [7]. The difference is that the y_n(Ψ) are given by means of a backward construction, whereas in [7] a corresponding forward construction is considered.
Proof of Theorem 1. First we shall prove assertion (0.7). It follows from (0.3) and (0.6) that

|y_n(Y, Ψ) - y_n(Ψ)| = |-Σ_{j=n}^{∞} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1} + (Π_{i=0}^{n-1} A_i) Y|

(1.2)  ≤ Π_{i=0}^{n-1} |A_i| (|y_0(Ψ)| + |Y|).

By the strong law of large numbers we get from (0.4)

Π_{i=0}^{n-1} |A_i| = exp(Σ_{i=0}^{n-1} log|A_i|) → 0 a.s. as n → ∞.

When (0.5) is satisfied, it follows from (1.1) that Π_{i=0}^{n-1} |A_i| = 0 a.s. for all sufficiently large n. This together with (1.2) implies (0.7). The shift applied to Ψ on the right-hand side of (0.6) produces the shift of {y_n(Ψ)} on the left-hand side, so {y_n(Ψ)} is stationary. From (0.7) we obtain y_n(Y, Ψ) - y_n(Ψ) → 0 (convergence in probability). Together with y_n(Ψ) =_d y_0(Ψ) we thus obtain the
assertion (0.8) (cf. Theorem 4.1 in [1]). Moreover,

A_n y_n(Ψ) + B_n = A_n Σ_{j=0}^{∞} (Π_{i=n-j}^{n-1} A_i) B_{n-j-1} + B_n = Σ_{j=1}^{∞} (Π_{i=(n+1)-j}^{n} A_i) B_{(n+1)-j-1} + B_n = y_{n+1}(Ψ) a.s.,

i.e. the difference equation (0.2) is almost surely satisfied. Thus {y_n(Ψ)} is a proper stationary solution of the difference equation (0.2) for Ψ. We complete the proof by showing that {y_n(Ψ)} is the only stationary solution. Assume that
{Y_n} is another stationary solution of (0.2) for Ψ, i.e. Y_{n+1} = A_n Y_n + B_n a.s. and {Y_n} is stationary. Then

|Y_n - y_n(Ψ)| = |A_{n-1}| |Y_{n-1} - y_{n-1}(Ψ)| = ··· = Π_{i=1}^{k} |A_{n-i}| |Y_{n-k} - y_{n-k}(Ψ)|

(1.3)  ≤ Π_{i=1}^{k} |A_{n-i}| |Y_{n-k}| + Π_{i=1}^{k} |A_{n-i}| |y_{n-k}(Ψ)|.
If (0.4) is satisfied, then, according to the strong law of large numbers, we obtain

Π_{i=1}^{k} |A_{n-i}| = exp(Σ_{i=1}^{k} log|A_{n-i}|) → 0 a.s. as k → ∞,

and if (0.5) is satisfied, then by (1.1) lim_{k→∞} Π_{i=1}^{k} |A_{n-i}| = 0 a.s. Due to this and because {Y_n} and {y_n(Ψ)} are stationary sequences, we have

Π_{i=1}^{k} |A_{n-i}| |Y_{n-k}| →_P 0 and Π_{i=1}^{k} |A_{n-i}| |y_{n-k}(Ψ)| →_P 0.

From this and (1.3) we find Y_n - y_n(Ψ) = 0, n ∈ Z, a.s., i.e. {y_n(Ψ)} is the only stationary solution of (0.2) for Ψ.
Remark 1.3. The uniqueness result obtained in the proof of Theorem 1 applies to sequences as a whole. If {Y_n} is a stationary solution of (0.2) for Ψ, then Y_n = y_n(Ψ), n ∈ Z, a.s., which implies

(1.4)  {(A_n, B_n, Y_n)} =_d {(A_n, B_n, y_n(Ψ))},

in particular

{Y_n}_{n=-∞}^{∞} =_d {y_n(Ψ)}_{n=-∞}^{∞}.
2. Proof of Theorem 2
Before proving Theorem 2 we establish three lemmas.
Lemma 2.1 (Borovkov). If the pairs (X, Y), (X^1, Y^1), (X^2, Y^2), ... of R^2-valued random variables satisfy the conditions (X^r, Y^r) ⇒ (X, Y), E(X^r)^+ → E X^+ and E(Y^r)^+ → E Y^+, then E(X^r + Y^r)^+ → E(X + Y)^+.
A proof of Lemma 2.1 can be found in Borovkov [3], p. 277.
Lemma 2.2 (Borovkov). Let the stationary sequences of real-valued random variables X = {X_n}, X^1 = {X_n^1}, X^2 = {X_n^2}, ... satisfy the following conditions: X is ergodic and E X_0 < 0; X^r ⇒ X; E(X_0^r)^+ → E(X_0)^+. Then

(2.1)  lim_{N→∞} limsup_{r→∞} P(sup_{j>N} Σ_{i=n-j}^{n-1} X_i^r ≥ 0) = 0,  n ∈ Z.
A proof of Lemma 2.2 can be found in Borovkov [3], p. 278. By means of Lemma 2.1 and Lemma 2.2 it is easy to prove the following result.
Lemma 2.3. Let X = {X_n}, X^1 = {X_n^1}, X^2 = {X_n^2}, ... be stationary and ergodic sequences of real-valued random variables and assume that the expectations a = E X_0, a^r = E X_0^r, r ≥ 1, exist and are finite. If

(2.2)  X^r ⇒ X,

(2.3)  a^r → a,  E(X_0^r)^+ → E(X_0)^+,

then

(2.4)  sup_r P(sup_{j>N} |(1/j) Σ_{i=n-j}^{n-1} X_i^r - a^r| ≥ δ) → 0 as N → ∞

and

(2.5)  sup_r P(sup_{j>N} |(1/j) Σ_{i=n-j}^{n-1} X_i^r - a| ≥ δ) → 0 as N → ∞,

where δ is an arbitrary positive real number.
Proof. First we shall prove assertion (2.4). From (2.2) and (2.3) we obtain

((X_0^r)^+ + (a^r)^-, -((X_0^r)^- + (a^r)^+ + δ)) ⇒ ((X_0)^+ + a^-, -((X_0)^- + a^+ + δ))

and

E((X_0^r)^+ + (a^r)^-)^+ → E((X_0)^+ + a^-)^+,

E(-((X_0^r)^- + (a^r)^+ + δ))^+ → E(-((X_0)^- + a^+ + δ))^+,

where x^- = -min(0, x) for x ∈ R. Applying Lemma 2.1 we get E(X_0^r - a^r - δ)^+ → E(X_0 - a - δ)^+. Now, applying Lemma 2.2 we find that
lim_{N→∞} limsup_{r→∞} P(sup_{j>N} Σ_{i=n-j}^{n-1} (X_i^r - a^r - δ) ≥ 0) = 0

and thus

(2.6)  lim_{N→∞} limsup_{r→∞} P(sup_{j>N} ((1/j) Σ_{i=n-j}^{n-1} X_i^r - a^r - δ) ≥ 0) = 0.
For the sequences {-X_n}, {-X_n^1}, {-X_n^2}, ... the assumptions of Lemma 2.3 are also satisfied. Thus, for these sequences (2.6) leads to

(2.7)  lim_{N→∞} limsup_{r→∞} P(sup_{j>N} ((1/j) Σ_{i=n-j}^{n-1} (-X_i^r) - (-a^r) - δ) ≥ 0) = 0.

Combining (2.6) and (2.7) we get

(2.8)  lim_{N→∞} limsup_{r→∞} P(sup_{j>N} |(1/j) Σ_{i=n-j}^{n-1} X_i^r - a^r| ≥ δ) = 0.

By the strong law of large numbers (2.8) also holds without lim sup for each r separately, and (2.4) follows.
Now we prove (2.5). Set Y_{n,j}^r = (1/j) Σ_{i=n-j}^{n-1} X_i^r - a^r. Then

(1/j) Σ_{i=n-j}^{n-1} X_i^r - a = Y_{n,j}^r + (a^r - a)

and (2.5) follows easily from (2.4) and a^r → a.
Remark. Statement (2.4) of Lemma 2.3, which is an easy consequence of Lemma 2.2, can be interpreted as follows: under natural convergence assumptions the strong law of large numbers holds uniformly. This is the key to the proof of Theorem 2.
Proof of Theorem 2. It is sufficient to show that

(2.9)  (Ψ^r, y_0(Ψ^r)) ⇒ (Ψ, y_0(Ψ)) in (R^2)^Z × R.

Since y_1(Ψ^r) = A_0^r y_0(Ψ^r) + B_0^r (also without r) it follows by the continuous mapping theorem that

(Ψ^r, {y_k(Ψ^r)}_{k=0}^{n}) ⇒ (Ψ, {y_k(Ψ)}_{k=0}^{n}) in (R^2)^Z × R^{n+1}

for n = 1, and subsequently for all n ≥ 0 by iteration. By stationarity, the lower
bound k = 0 may be replaced by any negative bound, therefore by -∞, which is (0.12). We now proceed to prove (2.9). By the continuous mapping theorem, (2.9) is seen to be true with y_0(Ψ^r) replaced by Σ_{j=0}^{N} B_{-j-1}^r Π_{i=-j}^{-1} A_i^r (also without r). The difference of the two, Σ_{j=N+1}^{∞} B_{-j-1}^r Π_{i=-j}^{-1} A_i^r, converges to 0 almost surely (also without r), so by Theorem 4.2 of Billingsley [1] it is sufficient for (2.9) to prove that for all ε > 0

(2.10)  lim_{N→∞} limsup_{r→∞} P(|Σ_{j=N+1}^{∞} B_{-j-1}^r Π_{i=-j}^{-1} A_i^r| > ε) = 0.
From a^r → a and Lemma 2.3 we obtain for all δ > 0

lim_{N→∞} limsup_{r→∞} P(Σ_{j=N+1}^{∞} |B_{-j-1}^r| Π_{i=-j}^{-1} |A_i^r| > Σ_{j=N+1}^{∞} exp(j(a + 3δ)))

≤ lim_{N→∞} limsup_{r→∞} P(sup_{j>N} ((1/j) Σ_{i=-j}^{-1} log|A_i^r| - a) ≥ δ)

+ lim_{N→∞} limsup_{r→∞} P(sup_{j>N} (1/j)(log|B_{-j-1}^r|)^+ ≥ δ)

= 0.

For a + 3δ < 0 the sum Σ_{j=N+1}^{∞} exp(j(a + 3δ)) is finite and tends to 0 as N → ∞; hence for sufficiently large N the probability above majorizes the left-hand side of (2.10), and (2.10) follows.
Acknowledgment
The author is very grateful to the referee for suggesting a shorter version of the proof of Theorem 2 and several helpful comments, which led to an improved presentation, especially to a shorter proof of Lemma 2.3.
References
[1] BILLINGSLEY, P. (1968) Convergence of Probability Measures. Wiley, New York.
[2] BOROVKOV, A. A. (1978) Ergodic theorems and stability theorems for a class of stochastic equations and their applications. Theory Prob. Appl. 23, 241-262.
[3] BOROVKOV, A. A. (1980) Asymptotic Methods in Queueing Theory (in Russian). Nauka, Moscow.
[4] BRANDT, A., FRANKEN, P. AND LISEK, B. (1984) Ergodicity and steady state existence. Continuity of stationary distributions of queueing characteristics. Lecture Notes in Control and Information Sciences 60, Springer-Verlag, Berlin, 275-296.
[5] LISEK, B. (1982) A method for solving a class of recursive stochastic equations. Z. Wahrscheinlichkeitsth. 60, 151-161.
[6] LOYNES, R. M. (1962) The stability of a queue with non-independent inter-arrival and service times. Proc. Camb. Phil. Soc. 58, 497-520.
[7] VERVAAT, W. (1979) On a stochastic difference equation and a representation of non-negative infinitely divisible random variables. Adv. Appl. Prob. 11, 750-783.