Ergodic Theorems and Related Problems [Reprint 2018 ed.] 9783110942064, 9783110460735


English Pages 103 [108] Year 1998



Ergodic Theorems and Related Problems


V.M. Shurenkov

VSP, Utrecht, The Netherlands, 1998

VSP BV, P.O. Box 346, 3700 AH Zeist, The Netherlands

© VSP BV 1998. First published in 1998. ISBN 90-6764-282-7

All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner.

Printed in The Netherlands by Ridderprint bv, Ridderkerk.

Managing Editor: O.A. Cherkashina (Adamenko), Institute of Mathematics, National Academy of Sciences, Kyiv, Ukraine

Language Editor: David Montgomery, Oxford University

Valentin Mikhailovich Shurenkov (1947–1994)

Contents

Preface vii
Introduction 1
1 Stationary processes 5
1.1 Convergence and invariance 5
1.2 Translation operators 11
1.3 Examples 21
2 Markovian processes 33
2.1 Recurrent Markovian chains 33
2.2 Sometimes Markovian processes 43
3 A random walk on the semiaxis 53
3.1 Semicontinuous processes with independent increments in a random medium 53
3.2 Boundary-value conditions 68
3.3 The ergodic theorem 83
Short Bibliographical Indications 95

Preface

It is difficult to find, in the theory of random processes, a term with as many different meanings as 'ergodicity'. In the theory of stationary processes, for example, ergodicity is often identified with metric transitivity. In the theory of Markov processes, the word 'ergodic' is applied to theorems both on the existence of limits of transition probabilities and on the convergence of ratios of mean values of these transition probabilities. The word is also used for theorems on the convergence of ratios of mean values of additive functionals, and for theorems on the convergence of ratios of 'time' means in random fields. 'Convergence' can be understood as convergence in probability or as mean square convergence. In addition, there are also 'ergodic theorems' on the convergence of distributions of shifted random processes.

In this monograph, the term 'ergodic' is understood in its original sense, i.e. the one it had when it was first adopted by the theory of random processes from statistical mechanics and Boltzmann's theory of gases. By an ergodic theorem we mean any statement about the existence of a mean value, taken with respect to time, along the trajectories of a random process. Here the author shares the point of view that the problems of the existence of time means, and of their equality to the phase means, are interesting without any assumptions about the distribution of the random process.

The book is not intended to be light reading, even though a general acquaintance with the theory of random processes is sufficient to understand the material of the first chapter. To read the second chapter, the reader is assumed to be able to use conditional probabilities freely and to be acquainted with martingale theory (although this term does not explicitly appear in the book). In the third chapter, the reader is assumed to know the theory of Markov processes and the theory of processes with independent increments as given in the second and fourth chapters of the second volume of the book "Theory of Random Processes" by I. I. Gikhman and A. V. Skorokhod (Moscow, "Nauka", 1973).

The author

Introduction

The following two problems could serve as an introduction to the subject of this book.

The first problem is related to the question of how adequate a mathematical model is for the description of a physical process. Let x_0, x_1, ..., x_n, ... be a sequence of states of a certain physical (e.g. mechanical, chemical, biological) system S, which evolves in discrete time t = 0, 1, 2, .... To provide a mathematical description of the system S, a transition operator T is used, which acts on the set X of all possible states of the system (called the phase space), such that x_n = T x_{n−1} and, consequently, x_n = T^n x_0. When answering the question of whether this operator T, the choice of which, of course, is substantiated by sufficiently convincing considerations, is a good mathematical description of the system S, one encounters at least two difficulties. Firstly, it is impossible to determine the states of the system S exactly (because every measurement inevitably contains an uncertainty) and, secondly, it is difficult mathematically to calculate T^n x, even if the phase space X is relatively simple.

If the uncertainty of the measurement of a phase function f(x) is not systematic, it can be minimized by applying the arithmetic mean (1/n)[f(x_0) + f(x_1) + ... + f(x_{n−1})] for sufficiently large n. However, if n is large, then the expression (1/n)[f(x) + f(Tx) + ... + f(T^{n−1}x)] should not differ much from its limit value

lim_{n→∞} (1/n) Σ_{k=0}^{n−1} f(T^k x),   (1)

if, of course, a limit value exists. The situation is further simplified if this limit does not depend on x. In this case the limit (1) can be written, under fairly general assumptions, as ∫ f(x) μ(dx), which may lead to its determination and hence makes it possible to decide on the adequacy of the system S and its mathematical description T.

The other problem mentioned above is related to estimating the reliability of a newly designed system if the system is described by a random process ξ_t (t ≥ 0) (the more elements the system has, the larger is the random factor in the system's evolution). One of the characteristics determining the reliability of a newly designed system is


the fraction of time the process ξ_t (t ≥ 0) spends in the set of states A, i.e.

lim_{t→∞} (1/t) ∫_0^t 1_A(ξ_u) du.   (2)

A related characteristic is the limit

lim_{t→∞} (1/t) ∫_0^t 1_{{T_u τ > c}} du;   (3)

if τ is the time at which the set of the states of failure is attained for the first time, then the limit expression (3) is equal to the proportion of those moments after which the system functioned properly for at least c units of time. If the limit expression (3) exists with unit probability and is not random, then its value is equal to a mean calculated with respect to some probability measure P*, which, in general, is different from the initial probability. In particular, the limit expression (2) is equal to P*(ξ_0 ∈ A).

Hence, the two problems are reduced to conditions of existence and degeneracy, and to the calculation, of the time mean values in (1) for the first problem and of the time mean values in (3) for the second. In spite of formal differences, each of them can be written in terms of the other, provided that the phase space X has the structure of a measurable space with a finite measure, the function f given in (1) is measurable, and the existence of the limit (1) is assumed for most x ∈ X. The first chapter of this book is devoted to the existence conditions for the limits (1) and (3) in precisely such a setting. In the second chapter the results from the first chapter are employed in the case of random processes with a Markov property. The final chapter deals entirely with the proof of the ergodic theorem for a very special class of Markov processes, known as a random walk on the semiaxis.

In this book we shall use the following notations, conventions, abbreviations, and definitions. The probability space (Ω, M, P) is assumed to contain a sufficient number of subsets in order to define on it all the random processes and variables used in this book. If ξ and η are random variables and Γ is an event, then, by definition,

Pξ = ∫ ξ dP,   P(ξ, Γ) = ∫_Γ ξ dP,   P(ξ|Γ) = P(ξ, Γ)/P(Γ).

The measure which is generated by a random process on the set of functions taking values in the phase space of this process and defined on the set of time parameters of the process is called the distribution of this process.
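The relation between the time mean (1) and the phase mean ∫ f(x) μ(dx) can be illustrated numerically. The following sketch is not from the book: the transformation T x = x + α (mod 1) on X = [0, 1) with irrational α, and the particular function f, are assumptions chosen for the demonstration; for this map the time mean approaches the phase mean, here 1/2.

```python
# A minimal sketch of the time mean (1) for the rotation T x = x + alpha (mod 1).
# The map and the phase function f are illustrative assumptions, not from the book.
import math

def time_mean(f, x0, alpha, n):
    """Average f along the orbit x, Tx, ..., T^{n-1}x of the rotation."""
    total, x = 0.0, x0
    for _ in range(n):
        total += f(x)
        x = (x + alpha) % 1.0
    return total / n

f = lambda x: math.sin(2 * math.pi * x) ** 2   # phase mean over [0,1) is 1/2
avg = time_mean(f, x0=0.1, alpha=math.sqrt(2) % 1.0, n=200_000)
print(abs(avg - 0.5) < 1e-3)
```

For an irrational rotation the limit does not depend on the starting point x, which is the simplification discussed above.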


To each subset D of a space X there corresponds a function 1_D on X, called the indicator of the set D, which is equal to unity on D and to zero outside of D. The complement of D in X is denoted by X∖D or, if it is clear what the set X is, simply by ∖D. All measures on the real line R_1 encountered in this book are considered to be defined on the Borel σ-algebra B. By a measure we understand a nonnegative countably additive function of sets. The symbol ∫_a^b always denotes integration over the interval [a, b]. To express the minimum of two numbers a and b, two notations are used: min(a, b) and a ∧ b. The left and right limits of a function f at a point a are denoted by f(a−) and f(a+), and the upper and lower limits are denoted by lim sup and lim inf, respectively. For a matrix A with elements

∫_{{s_n > 0}} h(x) μ(dx) = ∫_{{s_n > 0}} s_n(x) μ(dx) − ∫_{{s_n > 0}} s⁺_{n−1}(Tx) μ(dx).

The first integral on the right-hand side of this relation is equal to ∫ s⁺_n(x) μ(dx), and the second integral is not greater than ∫ s⁺_n(Tx) μ(dx), which in its turn is equal to ∫ s⁺_n(x) μ(dx) because the measure μ is invariant (the integrability of s_n is implied by the invariance of the measure μ and the integrability of h). Hence

∫_{{s_n > 0}} h(x) μ(dx) ≥ 0.

The limit as n → ∞ is taken, and (13) is thus obtained.

Further, let f*(x) and f_*(x) be the upper and lower limits, respectively, of the expression Σ_{k=0}^{n−1} f(T^k x) / Σ_{k=0}^{n−1} g(T^k x) as n → ∞. Then, according to (12), f*(Tx) = f*(x). Indeed,

f*(Tx) = lim_{m→∞} sup_{n≥m} Σ_{k=0}^{n−1} f(T^{k+1} x) / Σ_{k=0}^{n−1} g(T^{k+1} x) = lim_{m→∞} sup_{n≥m} (Σ_{k=0}^{n} f(T^k x) − f(x)) / (Σ_{k=0}^{n} g(T^k x) − g(x)),

and, consequently, because Σ_{k=0}^{∞} g(T^k x) = ∞ almost everywhere with respect to μ, the terms f(x) and g(x) do not affect the upper limit, so that

f*(Tx) = lim_{m→∞} sup_{n≥m} Σ_{k=0}^{n} f(T^k x) / Σ_{k=0}^{n} g(T^k x) = f*(x)

almost everywhere with respect to μ. Hence, if χ is the indicator function of the set {a_1 < f* < a_2}, then χ(T^k x) = χ(x), and

χ(x)[f(T^k x) − a_1 g(T^k x)] = h_1(T^k x)   and   χ(x)[a_2 g(T^k x) − f(T^k x)] = h_2(T^k x),

where h_1 = χ(f − a_1 g) and h_2 = χ(a_2 g − f). Also, if χ(x) = 1, then sup_{n≥1} [Σ_{k=0}^{n−1} f(T^k x) / Σ_{k=0}^{n−1} g(T^k x)] > a_1 and inf_{n≥1} [Σ_{k=0}^{n−1} f(T^k x) / Σ_{k=0}^{n−1} g(T^k x)] < a_2. Consequently, sup_{n≥1} Σ_{k=0}^{n−1} h_1(T^k x) > 0 and sup_{n≥1} Σ_{k=0}^{n−1} h_2(T^k x) > 0 almost everywhere with respect to μ. The integrability of h_1 and h_2 is implied by the integrability of f and g. By Lemma 5,

∫ h_1(x) μ(dx) ≥ 0   and   ∫ h_2(x) μ(dx) ≥ 0,   (14)

or

a_1 ∫_{{a_1 < f* < a_2}} g(x) μ(dx) ≤ ∫_{{a_1 < f* < a_2}} f(x) μ(dx) ≤ a_2 ∫_{{a_1 < f* < a_2}} g(x) μ(dx).

1.2 Translation operators

8. Translation operators T_t are defined for a process ξ_t (t ≥ 0) in an arbitrary phase space (E, A) whose set of trajectories is stable with respect to the translations. The latter implies that for an arbitrary ω ∈ Ω and any t ≥ 0 there exists an ω' ∈ Ω such that ξ_{s+t}(ω) = ξ_s(ω') for all s ≥ 0. Every such mapping T_t is automatically measurable with respect to the σ-algebra M generated by the process ξ_t (t ≥ 0). Indeed, the σ-algebra M is also generated by the events Γ = {ω : ξ_s(ω) ∈ B} for all s ≥ 0 and B ∈ A. With respect to the mapping T_t, the inverse image T_t^{−1}Γ of such an event Γ is equal to {ω : ξ_s(T_t ω) ∈ B} = {ω : ξ_{s+t}(ω) ∈ B}, and consequently, if Γ ∈ M, then the set T_t^{−1}Γ belongs to M and does not depend on a particular choice of the mapping T_t. Similarly, if a random variable ξ is measurable with respect to M, then the equation T_t ξ(ω) = ξ(T_t ω) uniquely defines a variable T_t ξ measurable with respect to M. If ξ is the indicator function of an event Γ from M, then T_t ξ is also the indicator function of the event T_t^{−1}Γ, measurable with respect to M.


The operators T_t of translation of random variables, defined in just such a way as above, commute with taking Borel functions of several variables and with the operation of taking limits, i.e. if φ is a Borel function of n real variables and ξ_1, ξ_2, ... is a sequence of variables measurable with respect to M, then T_t φ(ξ_1, ..., ξ_n) = φ(T_t ξ_1, ..., T_t ξ_n), and if ξ_1, ξ_2, ... converges everywhere on Ω, then T_t lim_{n→∞} ξ_n = lim_{n→∞} T_t ξ_n. Moreover, T_t T_s = T_{s+t}.

The translation operators determine the σ-algebra T of invariant events from M; i.e. Γ ∈ T if and only if Γ ∈ M and T_t^{−1}Γ = Γ for all t ≥ 0. Similarly, a random variable that is measurable with respect to M will be measurable with respect to T if and only if T_t ξ = ξ for all t ≥ 0.

9. Any well-posed problem about the existence and degeneracy of the time means given in (3) necessarily involves the measurability of the mapping (t, ω) → T_t ξ(ω) with respect to the product B_+ × M of the σ-algebra B_+ of Borel subsets of the real half-axis R_+ = [0, ∞) and the σ-algebra M. The measurability of this mapping is equivalent to the measurability of the mapping (t, ω) → ξ_t(ω) of the measurable space (R_+ × Ω, B_+ × M) into the measurable space (E, A). If such measurability holds, then the random process ξ_t (t ≥ 0) is called measurable. A measurable random process ξ_t (t ≥ 0) is called ergodic if the limit

lim_{t→∞} (1/t) ∫_0^t T_u ξ du   (17)

exists and is almost certainly degenerate for any bounded variable ξ measurable with respect to M. The limit (17), if it exists, can almost certainly be chosen so that it is a variable measurable with respect to T. Moreover, for a variable ξ measurable with respect to T, the limit is equal to ξ. Hence, the ergodicity of a process implies that the limits of the time means given in (17) almost certainly exist and that the σ-algebra T of invariant events is trivial.

10. A process ξ_t (t ≥ 0) is called stationary if its distribution coincides, for any s > 0, with the distribution of the translated process ξ_{s+t} (t ≥ 0). This is equivalent to the invariance of the initial probability measure P on M with respect to the translation operators T_s; i.e. P(T_s^{−1}Γ) = P(Γ), where Γ ∈ M. If the limits given in (17) exist and are almost certainly degenerate, then their values P*ξ define, on M, a probability P*, which is invariant with respect to the translation operators T_s (s ≥ 0), i.e. P*T_s ξ = P*ξ, and which coincides with the initial probability P on events from T. Because the existence of the limit (17) is an event measurable with respect to T, and the replacement of the initial probability space (Ω, M, P) with (Ω, M, P*) makes the process ξ_t (t ≥ 0) stationary, stationary random processes are a natural object for a preliminary study of the convergence of the time mean values.

11. Theorem. If a process ξ_t (t ≥ 0) is stationary and a variable ξ, measurable with respect to M, is integrable, then

lim_{t→∞} (1/t) ∫_0^t T_u ξ du = P(ξ|T)   (18)

is almost certainly valid.

Proof. The limit on the left-hand side exists if and only if the limit of the sequence

(1/n) ∫_0^n T_u ξ du,   n = 1, 2, ...,   (19)

exists. Indeed, let ξ ≥ 0 (the general case can be reduced to this one) and n ≤ t < n + 1; then

(n/(n+1)) (1/n) ∫_0^n T_u ξ du ≤ (1/t) ∫_0^t T_u ξ du ≤ ((n+1)/n) (1/(n+1)) ∫_0^{n+1} T_u ξ du.

The measurability of the mapping (t, ω) → T_t ξ(ω) (t ≥ 0) and the Fubini theorem imply that the event that the limit on the left-hand side of (18) exists is measurable with respect to M. The sequence ξ_n = ∫_n^{n+1} T_u ξ du, n = 0, 1, ..., is stationary and P|ξ_0| ≤ ∫_0^1 P T_u|ξ| du = P|ξ|. According to Corollary 7, the limit

lim_{n→∞} (1/n) Σ_{k=0}^{n−1} ξ_k = lim_{n→∞} (1/n) ∫_0^n T_u ξ du = ξ*

almost certainly exists and Pξ* = Pξ_0 = Pξ. The random variable ξ* is defined and invariant on a set, measurable with respect to T, on which the limit given in (18) exists. Consequently, if the variable ξ* is defined to be zero on the complement of this set, then it becomes measurable with respect to T. Let a variable η, measurable with respect to T, be bounded. Then, as has been shown,

lim_{t→∞} (1/t) ∫_0^t T_u η du = η,   lim_{t→∞} (1/t) ∫_0^t T_u(ηξ) du = ηξ*,   and   P(ηξ*) = P(ηξ).

This implies that ξ* = P(ξ|T).
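The conditional expectation on the right-hand side of (18) can be seen numerically in a stationary sequence that is not ergodic. The following sketch is not from the book: a fair coin chooses once and for all one of two iid regimes (mean 0 or mean 1); the resulting sequence is stationary, and its time mean converges to the mean of the chosen regime, a non-degenerate limit measurable with respect to the invariant σ-algebra. The distributions are assumptions for the demonstration.

```python
# A minimal sketch (not from the book) of a stationary, non-ergodic sequence:
# the time mean converges to P(xi | T), i.e. to the regime mean, not to the
# overall mean 0.5.
import random

def time_mean_of_mixture(seed, n=100_000):
    rng = random.Random(seed)
    regime_mean = rng.choice([0.0, 1.0])      # invariant choice of regime
    s = sum(rng.gauss(regime_mean, 1.0) for _ in range(n))
    return regime_mean, s / n

for seed in range(5):
    mu, avg = time_mean_of_mixture(seed)
    print(abs(avg - mu) < 0.02)   # limit equals the regime mean
```

Replacing P with the measure P* built from these limits, as in item 10, makes the sequence stationary on every regime.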

12. The situation when the measure P on M is preserved by random translation transformations T_τ,

(T_τ ξ)(ω) = ξ(T_{τ(ω)} ω) if τ(ω) < ∞, and (T_τ ξ)(ω) = 0 otherwise,

is also of interest. If the process ξ_t (t ≥ 0) is measurable, then the variable T_τ ξ is measurable with respect to M for any variable ξ measurable with respect to M and any nonnegative measurable (with respect to M) time τ, so that the variables T_τ ξ, T_τ² ξ, ... are defined and measurable with respect to M. The k-th iteration of the operator T_τ can be replaced by the operator T_{τ_k}, i.e. T_τ^k = T_{τ_k}, where τ_0 = 0 and τ_k = τ + T_τ τ_{k−1}, k = 1, 2, ....

Theorem. Let a transformation T_τ preserve the probability P on M and

P(τ > 0) = 1,   Pτ < ∞.   (20)

If P ∫_0^τ T_u|ξ| du < ∞, then

lim_{t→∞} (1/t) ∫_0^t T_u ξ du = P'(ξ|T)   (21)

is almost certainly valid, where

P'ξ = (1/Pτ) P ∫_0^τ T_u ξ du.   (22)

Proof. Equation (22) determines, on M, a probability measure P', which is invariant with respect to the translation operators T_t (t ≥ 0). Indeed,

∫_0^τ T_u T_t ξ du = ∫_0^τ T_{u+t} ξ du = ∫_t^{τ+t} T_u ξ du = ∫_0^τ T_u ξ du + ∫_τ^{τ+t} T_u ξ du − ∫_0^t T_u ξ du = ∫_0^τ T_u ξ du + T_τ ∫_0^t T_u ξ du − ∫_0^t T_u ξ du,

and the probability P' is invariant with respect to T_t because the probability P is invariant with respect to T_τ. Moreover, if Γ ∈ T, then P'(Γ) = (1/Pτ) P(τ, Γ) and, as condition (20) shows, the contractions of the probabilities P and P' to the σ-algebra T are absolutely continuous with respect to each other. Because the set on which the limit exists and the left-hand side of equation (21) is equal to the expression on the right-hand side of equation (21) is a measurable event with respect to T, the statement of Theorem 12 follows from Theorem 11.
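Formula (22) says that the long-run time mean is a mean over one cycle [0, τ) normalized by Pτ. The following sketch is not from the book: a process repeats independent cycles of random integer length τ and contributes f = 1 exactly once per cycle, so the long-run time mean should equal 1/Pτ. The cycle-length distribution is an assumption for the demonstration.

```python
# A minimal sketch (not from the book) of the cycle formula (22):
# long-run time mean = P(reward over one cycle) / P tau.
import random

rng = random.Random(0)
taus = [rng.randint(1, 5) for _ in range(200_000)]   # iid cycle lengths, mean 3

total_time = sum(taus)
total_reward = len(taus)            # f contributes 1 once per cycle
long_run_mean = total_reward / total_time

print(abs(long_run_mean - 1/3) < 0.01)   # 1 / P tau with P tau = 3
```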

13. Theorem. Suppose that the conditions of Theorem 12 are valid, except for condition (20), and that 0 < τ < ∞ is almost certainly valid. Then

lim_{t→∞} ∫_0^t T_u ξ du / ∫_0^t T_u η du = P(∫_0^τ T_u ξ du | T) / P(∫_0^τ T_u η du | T)   (23)

is almost certainly valid if P ∫_0^τ T_u(|ξ| + |η|) du < ∞ and P(∫_0^τ T_u η du | T) ≠ 0 is almost certainly valid also.


One could prove this theorem in the same way as the previous one by making use of Theorem 4. However, to obtain a better understanding, it might be useful to carry out the following simple argument. The sequence ξ_k = T_{τ_k} ∫_0^τ T_u ξ du, k = 0, 1, 2, ..., is stationary and P|ξ_0| ≤ P ∫_0^τ T_u|ξ| du.


0) be closed with respect to any continuously increasing unbounded change of time. This implies that, for any w £ fi and any continuously increasing unbounded function u(t) exists an

u>' €

3t(u>')

fl such that 3 u (t)(w) =

for all

t>

(t > 0 and u(0) = 0) there 0.

Then, if a measurable (with respect to M ) random process 5 ( i ) (t > 0) continuously increases from zero to infinity, then the space of trajectories of the process = 3t(t) (t > 0), where 5 ( t ( i ) ) = t ( £ ( i ) ) = t, is stable with respect to the translations. Consequently, the process fjt (t > 0) is associated with a family of translation operators Su (u > 0) and with the cr-algebra J

of events, invariant with respect to

S u (u > 0), from the cr-algebra 0 generated by the process f)( (t > 0). 18.

18. Theorem. Let the translation operator T_τ preserve the probability P on M and P(0 < τ < ∞) = 1. Then, if

S(τ(ω) + t) = S(τ(ω), ω) + S(t, T_τ ω)   (28)

for all t ≥ 0 and ω ∈ Ω, then

lim_{t→∞} ∫_0^t S_u ξ du / ∫_0^t S_u η du = P(∫_0^{S(τ)} S_u ξ du | I) / P(∫_0^{S(τ)} S_u η du | I)   (29)

is almost certainly true, where P ∫_0^{S(τ)} S_u(|ξ| + |η|) du < ∞ is almost certainly valid.
19. If S(t) is given by formula (30) below, then condition (28) holds, and the right-hand side of (29) becomes P(∫_0^τ T_u(ξζ) du | I) / P(∫_0^τ T_u(ηζ) du | I). Indeed, if ξ = φ(η_{s_1}, ..., η_{s_n}), then S_u ξ = φ(η_{u+s_1}, ..., η_{u+s_n}) = φ(ξ_{t(u+s_1)}, ..., ξ_{t(u+s_n)}), and, after the change of variable u → S(u),

∫_0^{S(τ)} S_u ξ du = ∫_0^τ T_u(ξζ) du.

20.

If the process ξ_t (t ≥ 0) is stationary, further simplifications can be made by setting τ = 1. Then (31) becomes equal to P(ξζ|I)/P(ηζ|I). Consequently, if the process ξ_t (t ≥ 0) is stationary and η_t = ξ_{S^{−1}(t)}, where S(t) is defined by formula (30), in which Pζ < ∞, then (in the case where P|ξζ| < ∞)

lim_{t→∞} (1/t) ∫_0^t S_u ξ du = P(ξζ|I)/P(ζ|I)   (32)

is almost certainly true. The right-hand side is equal to P*(ξ|I). The probability P* is obtained from the probability P by using the absolutely continuous transformation with the density ζ/P(ζ|I), i.e. P*η = P(ηζ/P(ζ|I)). The right-hand side does not change if ξ is replaced with S_t ξ. This means that the probability P* is invariant with respect to S_t. In other words, the process η_t (t ≥ 0) is stationary on the space (Ω, M, P*). This fact is important and is stated in a separate proposition.

21.

Proposition. The change of time t → S^{−1}(t), where S(t) = ∫_0^t T_u ζ du, ζ > 0 and Pζ < ∞, together with an absolutely continuous change of the probability measure P (with density ζ/P(ζ|I)),

transforms a stationary process into another stationary

process.

22. The random change of time is a particular example of a general transformation of the space Ω of elementary events. Let η_t(ω) = ξ_t(Uω), where ω → Uω is a measurable mapping of the measurable space (Ω, M) into itself. Then, if the process ξ_t (t ≥ 0) satisfies the conditions of Theorem 4 and

T_τ Uω = U T_τ ω   (33)

for all ω ∈ Ω, then the transformation S_σ of translation of a trajectory of the process η_t (t ≥ 0) by the value σ(ω) = τ(Uω) preserves the probability P on the σ-algebra G generated by the process η_t (t ≥ 0). Indeed, S_σ η_t(ω) = ξ_{τ(Uω)+t}(Uω) = ξ_t(T_τ Uω) and T_τ η_t(ω) = ξ_t(U T_τ ω). Consequently, the operators S_σ and T_τ coincide on measurable (with respect to G) variables, and so the probability P on G is invariant with respect to S_σ.

If the random change of time t → t(t, ω) is considered as a transformation U of the space of elementary events, i.e. ξ_t(Uω) = ξ_{t(t,ω)}(ω), then, for condition (33) to be valid, it is necessary, together with condition (22), to impose the following additional condition:

τ(Uω) = S(τ(ω), ω).   (34)

Indeed, ξ_t(U T_τ ω) = ξ_{t(τ(Uω)+t,ω)}(ω) and ξ_t(T_τ Uω) = ξ_{τ(Uω)+t}(Uω). This implies that, as long as the structure of the trajectories of the process ξ_t (t ≥ 0) is not considered, it is necessary to postulate either of the following conditions:

t(τ(Uω) + t, ω) = t(τ(ω)) + t(t, T_τ ω),   or   τ(Uω) + t = S(τ(ω) + t(t, T_τ ω), ω),   (35)

which becomes condition (34) if t = 0. If condition (34) is valid, then relation (35) becomes S(τ(ω) + u, ω) = S(τ(ω), ω) + S(u, T_τ ω) if u = t(t, T_τ ω) and t = S(u, T_τ ω). In this case the value of t in condition (35) is arbitrary because u is arbitrary. So, to make condition (33) valid, it is required that conditions (34) and (28) are valid.

The condition τ(Uω) = S(τ(ω), ω) is valid as an identity if, for example, τ is the time at which the process ξ_t (t ≥ 0) reaches the set of states A ∈ A for the first time, i.e. τ = τ_A = inf{t ≥ 0 : ξ_t ∈ A}. The same is true if τ is the time at which the process returns to A through B, i.e. τ = τ_B + T_{τ_B} τ_A, etc. At the same time, this condition will not be valid if the time τ is not random and S(t) is given by (30) with a nontrivial variable ζ.

1.3 Examples

23. A sequence of independent identically distributed variables ξ_0, ξ_1, ξ_2, ..., which take values in a phase space (E, A), gives the simplest example of a stationary ergodic sequence. It is clear that it is stationary, and Kolmogorov's zero-one law implies that the σ-algebra T of invariant events is trivial. This situation is so simple that a direct proof can be given.

Let a measurable (with respect to T) variable η be bounded. Then, by Levy's theorem,

η = lim_{n→∞} P(η | ξ_0, ξ_1, ..., ξ_n) = lim_{n→∞} P(T^{n+1}η | ξ_0, ξ_1, ..., ξ_n)

is almost certainly true, where T is the translation operator of the sequence ξ_n, n = 0, 1, 2, ..., by unity. The variable T^{n+1}η is measurable with respect to ξ_{n+1}, ξ_{n+2}, ... and, consequently, it does not depend on the variables ξ_0, ξ_1, ..., ξ_n. Hence

η = lim_{n→∞} P(T^{n+1}η) = Pη

is almost certainly valid.
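The triviality of T for an iid sequence means the time mean is almost certainly degenerate: every realization produces the same limit. The following sketch is not from the book; the exponential distribution is an assumption chosen for the demonstration.

```python
# A minimal sketch (not from the book): for an iid sequence the time mean is
# degenerate -- realizations from different seeds all converge to the same
# constant, here the expectation 1/2.
import random

def iid_time_mean(seed, n=200_000):
    rng = random.Random(seed)
    return sum(rng.expovariate(2.0) for _ in range(n)) / n   # mean 1/2

means = [iid_time_mean(seed) for seed in range(4)]
print(all(abs(m - 0.5) < 0.01 for m in means))
```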

24. Permutable variables give a more interesting example. A sequence ξ_0, ξ_1, ... with values in (E, A) is called permutable if, for any natural n, all n! vectors (ξ_{i_1}, ξ_{i_2}, ..., ξ_{i_n}), where (i_1, i_2, ..., i_n) runs over the permutations of (0, 1, ..., n−1), have an identical distribution. A permutable sequence is automatically stationary. Indeed,

P(ξ_0 ∈ A_0, ξ_1 ∈ A_1, ..., ξ_{n−1} ∈ A_{n−1}) = P(ξ_0 ∈ A_0, ξ_1 ∈ A_1, ..., ξ_{n−1} ∈ A_{n−1}, ξ_n ∈ E) = P(ξ_1 ∈ A_0, ξ_2 ∈ A_1, ..., ξ_n ∈ A_{n−1}, ξ_0 ∈ E) = P(ξ_1 ∈ A_0, ξ_2 ∈ A_1, ..., ξ_n ∈ A_{n−1}).

A permutable sequence is ergodic (implying that its σ-algebra T of invariant events is trivial) if and only if it degenerates into a sequence of independent variables. This follows from the conditional independence of ξ_0, ξ_1, ..., ξ_n under the condition T, i.e.

P(ξ_0 ∈ A_0, ξ_1 ∈ A_1, ..., ξ_m ∈ A_m | T) = Π_{k=0}^{m} P(ξ_k ∈ A_k | T).

25. ... |ψ(x + ξ_1) − ψ(x)| = 0 with unit probability. This means that there exists a set I ⊂ [0, 1) with F(I) = 1 such that every element of I is a period of the function ψ. But, because ψ already has a period equal to unity, and the distribution F is not concentrated on the set {0, h, 2h, ...} for any rational h, this can occur only if ψ = const. Thus

f(x + y_n) = f(x),   n = 0, 1, 2, ....

This is possible if and only if f(x) = const for almost all x ∈ [0, 1) with respect to the Lebesgue measure. Since ξ_n has a uniform distribution with probability P*, this, together with (39), implies that η = P*η is almost certainly true with respect to P*. This implies that the σ-algebra I of invariant events is trivial with probability P*. It also implies, together with the stationary property, that the sequence ξ_n, n = 0, 1, 2, ..., is ergodic with probability P*. Moreover, P*(Γ) = 1 if and only if P_x(Γ) = 1 for almost all x with respect to the Lebesgue measure. Hence, if P*|η| < ∞, then

lim_{n→∞} (1/n) Σ_{k=0}^{n−1} T^k η = P*η

is almost certainly true with respect to P x for almost all x 6 [0,1) with respect to the Lebesgue measure. 26.
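A permutable sequence that is not independent can be built by mixing: draw a random success probability p once, then toss iid coins with that p. The sequence is exchangeable, hence stationary, but its time mean converges to the random variable p rather than to a constant, so it is not ergodic. The following sketch is not from the book; the two-point prior for p is an assumption for the demonstration.

```python
# A minimal sketch (not from the book) of a permutable, non-ergodic sequence:
# the time mean converges to the randomly chosen p, not to the overall mean 0.5.
import random

def exchangeable_time_mean(seed, n=100_000):
    rng = random.Random(seed)
    p = rng.choice([0.2, 0.8])                 # mixing variable, drawn once
    s = sum(1 for _ in range(n) if rng.random() < p)
    return p, s / n

for seed in range(5):
    p, avg = exchangeable_time_mean(seed)
    print(abs(avg - p) < 0.01)
```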

Regenerating

processes give an example of a random translation transforma-

tion, which is ergodic and preserves the probability. A measurable random process ¡ t (< > 0) is called regenerating, with a regenerating t i m e r , if the processes ( t (t < r ) and (T+t (t > 0) are independent and the distribution of t h e latter process coincides with t h e distribution of the process (t (t > 0). It is assumed that 0 < r < oo with unit probability. Thus, if the time, r , is measurable with respect to the cr-algebra M., which is generated by the process £[ (t > 0), t h e n the definition of a regenerating process already implies that the measure P on M is invariant with respect to TT. Moreover, if r is measurable with respect to M , then t^ = T T Zk~\, k = 2 , 3 , . . . , T\ = t , is a regenerating time for the process Q {t > 0). T h e proof of this is based on the following auxiliary statement. Let the mapping (i,w) —> ft(w) be bounded and measurable with respect to B t x M , and let i]t = T t £ t , then: P(Ct e A, t 0) satisfies the equation; = /(*)+

/ N{duMt-u), Jo

(44)

Examples

29

where; /(i),

P * 7 t f = PT ( 6 fc ,

Po7tf = P r t i „ .

(47)

The assumptions made imply that the trajectories of the first and the second additional processes coincide after a time tk- Consequently; Ttik = Ttio

for

t > tk.

Stationary

30

processes

Therefore; p i t a - p r t & = P[(rt(i* - 6 ) , t < u]. This, together with (47), gives (46). The distribution of the random variable t* is equal to the distribution of the maximum of k independent random variables, each of which has the same distribution as ti. So; Pfc(t> t) > fcPi(t> t) and the absolute value of the second term in (45) does not exceed:

J

ft

°°

' N(du) ^T^ qk{u)kg(t — u), 0 where g{t) = P i ( t > t). To prove the k=i integrability of the function / it will suffice to demonstrate that the mean P i t is finite (which is the same as the integrability of the function g). By using the formula for the average probability; Pi(tpP(Tc > ^> ^ O » which, according to (50) and the measurability (with respect to Afk) of the event {tq > fc}, coincides with; Tc — l

OO

p átc

> k, p¡ko

= p P Y1

k=0

=

k=0
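The long-run fraction of time a regenerating process spends in a set of states, the time mean (2), can be computed over a single regeneration cycle. The following sketch is not from the book: a process is "up" for a random number of steps and then "down" for a random number of steps, each up-down pair forming one independent cycle; the step distributions are assumptions for the demonstration.

```python
# A minimal sketch (not from the book): for a regenerating process the
# fraction of time spent "up" converges to (mean up time)/(mean cycle time).
import random

rng = random.Random(1)
up_time = down_time = 0
for _ in range(100_000):                 # independent regeneration cycles
    up_time += rng.randint(1, 3)         # up phase, mean 2
    down_time += rng.randint(1, 2)       # down phase, mean 1.5
fraction_up = up_time / (up_time + down_time)

print(abs(fraction_up - 2 / 3.5) < 0.01)   # 2 / (2 + 1.5)
```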

The nonnegative random variable m is a stopping time with respect to the flow of σ-algebras M_t (t ≥ 0), which is generated by the process ξ_t (t ≥ 0) (i.e. {m < t} ∈ M_t for all t ≥ 0). The σ-algebra M_m contains those and only those events Γ from M for which Γ ∩ {m < t} ∈ M_t for all t ≥ 0. If the Markovian property (62) holds at a certain stopping time m, not given at the beginning, then the process ξ_t (t ≥ 0) is called sometimes Markovian.

37. The general theory of sometimes Markovian processes can be noticeably simplified (as can be observed from the following lemma) if it is assumed that the set of trajectories of the process ξ_t (t ≥ 0) is closed with respect to the stopping times. This implies that, for every one of its elements ω, the space of elementary events Ω contains, for every s ≥ 0, an element ω' such that ξ_t(ω') = ξ_{t∧s}(ω) for all t ≥ 0.

Lemma. If the set of trajectories of a process ξ_t (t ≥ 0), which is Markovian at time m, is closed with respect to the stopping times, then equation (62) is preserved if m is replaced with any of the times m_1 = m, m_k = m + T_m m_{k−1}, k = 2, 3, ....

Proof. The proof is based on the following form of the definition given in (62), which is apparently more general: let η_t = η(t, ω) be a bounded measurable (with respect to B_+ × M) function and consider ξ = T_m η_m.   (63)

For functions of the special form

η(t, ω) = 1_{{t < u}} 1_Γ(ω),   u > 0,  Γ ∈ M,   (64)

relation (63) becomes P_x(m < u, T_m 1_Γ) = P_x(m < u, P_{ξ_m}(Γ)), which is almost certainly true with respect to P_x on {m < u}. However, this evidently follows from (62) because the variable m is measurable with respect to M_m. The general case is proved by approximating an arbitrary measurable (with respect to B_+ × M) function by limits of linear combinations of functions of the type given in (64).

If the set of trajectories of the process ξ_t (t ≥ 0) is closed with respect to the stopping times, then the σ-algebra M_m is generated by the variables ξ_{t∧m}. Indeed, by the axiom of choice, for every u ≥ 0 there exists a mapping S_u : Ω → Ω such that ξ_t(S_u ω) = ξ_{t∧u}(ω) for all t ≥ 0. Any such mapping is a measurable mapping of the measurable space (Ω, M_u) into itself, because the inverse image of the set {ω : ξ_t(ω) ∈ B} under the mapping S_u, where B ∈ A, is the set {ω : ξ_{t∧u}(ω) ∈ B}, which belongs to M_u. Hence S_u^{−1}(M) ⊂ M_u and, if Γ ∈ M_u, then S_u^{−1}Γ = Γ.

Hence, S^(M)

C Mu and, if T G Mu, then S ^ r = T.

Consequently, all sets from M. u and only these are invariant under the mapping Su. Hence, for a measurable, with respect to M, variable £ to be measurable with respect to M u , it is necessary and sufficient that £(w) = £(5 u w), or f(w) = £(o/) if 3((u;) = 3 0. Then, if 3t(w) = ¿¡(a/) for i < u and mt(w) < u, then mi_i(w) = m * - ^ « ' ) < u. Thus, m*(w') = mt-iiw') + m(r mjt _ l(u ,/)w') = t? + m(T v w'), where v = mjt-^w) = t n ^ i i w ' ) . This implies that m(Tvu>') = m(T„w) and, consequently, mjt(cj) = m*(u;'). Hence, the inequality mjt(w) < u implies that mi(w') < u. By symmetry, it follows that < u if and only if mfc(w') < u, i.e. the event {m^ < u} is measurable with respect to the u-algebra M u , or mt is a stopping time with respect to M u (u > 0). Let Q be a cr-algebra generated by the variables 3iAm (t > 0). Because the process it (t > 0) is measurable, the mapping Smu> =

is measurable as a mapping of

Sometimes

(il,M)

45

Ma.rkovia.ii processes

into itself, and Q is a cr-algebra of invariant, with respect to Sm, events from

M. Thus a measurable (with respect to M) variable, to Q, if and only if £(w)£(u/) if ¡t(u)

is measurable with respect

= £(Smu>), which in its t u r n is equivalent to the condition

= } t ( u ' ) for t < m(w) and so Q — Mm.
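The role of the stopping maps S_u can be illustrated on discrete-time, finitely supported paths. This is a hypothetical toy model, not a construction from the text (the names `stop_map`, `value_at_3`, `value_at_7` are illustrative): a functional of the path is invariant under S_u exactly when it depends on the trajectory only up to time u.

```python
# Toy illustration of the stopping map S_u on discrete-time paths.
# S_u freezes a path after time u: (S_u path)(t) = path(min(t, u)).

def stop_map(path, u):
    """Return the path stopped at time u."""
    return [path[min(t, u)] for t in range(len(path))]

# Two functionals of a path of length 10.
value_at_3 = lambda path: path[3]   # depends only on the path up to time 3
value_at_7 = lambda path: path[7]   # needs information beyond time 5

path = [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
stopped = stop_map(path, 5)

# value_at_3 is invariant under S_5; value_at_7 generally is not.
print(value_at_3(path) == value_at_3(stopped))  # True
print(value_at_7(path) == value_at_7(stopped))  # False: 13 vs 5
```

The invariance test mirrors the criterion in the text: a functional measurable with respect to M_u is unchanged by S_u, while one that looks past time u is altered.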

If induction is used with (63), then the Markovian property given in (62) is preserved when m is replaced by m_k. Let the lemma be proved for the times m_1, …, m_{k−1}. To complete the proof, the following equality should be verified:

P_x(ξ_{t_1} ∈ B_1, …, ξ_{t_n} ∈ B_n, t_n < m_k, ξ_{m_k} ∈ A, T_{m_k}^{-1}Γ)
  = P_x(ξ_{t_1} ∈ B_1, …, ξ_{t_n} ∈ B_n, t_n < m_k, ξ_{m_k} ∈ A, P_{ξ_{m_k}}(Γ)),   (66)

where x ∈ E, B_1, …, B_n ∈ A, A ∈ A, Γ ∈ M and 0 ≤ t_1 < … < t_n. The event {ξ_{t_1} ∈ B_1, …, ξ_{t_n} ∈ B_n, t_n < m_k, ξ_{m_k} ∈ A} …

Theorem. Let the process ξ_t (t ≥ 0), with a phase space (E, A), be closed with respect to the stopping times, and let the σ-algebra A be countably generated. Then, if the imbedded Markovian chain ξ_{m_k} (k = 1, 2, …) is recurrent with an invariant measure π and

  P_π(m = 0) = 0,   (69)

then

  lim_{t→∞} ( ∫₀^t T_u ξ du ) / ( ∫₀^t T_u η du ) = ( P_π ∫₀^m T_t ξ dt ) / ( P_π ∫₀^m T_t η dt )   (70)

is almost certainly true with respect to P_x, where x ∈ E, P_π ∫₀^m T_t(|ξ| + |η|) dt < ∞, and P_π ∫₀^m T_t η dt ≠ 0.

Proof. The Markovian property at the time m and the invariance of the σ-finite measure π imply the invariance, with respect to T_m, of the σ-finite measure P_π on M. Indeed, P_π T_m ξ = P_π P_{ξ_m} ξ = P_π g(ξ_m), where g(x) = P_x ξ. But P_π g(ξ_m) = ∫_E π(dx) g(x) = P_π ξ.
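The statement (70) can be checked numerically in a simple discretized analogue that is not the construction of the text: a recurrent two-state Markov chain, with the time integrals replaced by sums along the trajectory. The transition matrix P and the functions f, g below are arbitrary illustrative choices; the ratio of path sums should approach π(f)/π(g).

```python
# Monte Carlo check of a discrete ratio ergodic theorem:
# lim_t sum_{u<t} f(X_u) / sum_{u<t} g(X_u) = pi(f)/pi(g).
import numpy as np

rng = np.random.default_rng(0)

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])       # recurrent two-state chain
pi = np.array([5/6, 1/6])        # stationary law: pi @ P == pi
f = np.array([1.0, 3.0])
g = np.array([2.0, 1.0])

n = 200_000
x = 0
sum_f = sum_g = 0.0
for _ in range(n):
    sum_f += f[x]
    sum_g += g[x]
    x = 1 if rng.random() < P[x, 1] else 0   # one transition step

ratio = sum_f / sum_g
target = pi @ f / (pi @ g)       # = (4/3)/(11/6) = 8/11
print(ratio, target)
```

The phase mean on the right-hand side plays the role of the ratio of the P_π-mean values in (70); the simulated time mean converges to it regardless of the starting state.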

41. Lemma. Suppose that the conditions of the theorem are valid and let a measurable, with respect to M_m, variable ζ be positive almost everywhere with respect to P_π. Then

  Σ_{k=1}^∞ T_{m_k} ζ = ∞   (71)
is true almost everywhere with respect to P_π.

Proof. Without loss of generality, it can be assumed that 0 ≤ ζ ≤ c is true almost everywhere with respect to P_π and π{x : P_x ζ > 1/c} > 0. Suppose that (71) is not valid. Then, there exists a time t > 0 such that

  P_π( lim_{k→∞} σ_k < t ) > 0,   (72)

where σ_k = Σ_{j=1}^k T_{m_j} ζ. Since 0 ≤ ζ ≤ c, the sum Σ_{k=1}^∞ P_π(ξ_{m_k} ∈ A, lim_{r→∞} σ_r < t), with A = {x : P_x ζ > 1/c}, does not exceed c(t + c) and, consequently, is finite; but, by the recurrence of the chain ξ_{m_k} and π(A) > 0, it must diverge on the event in (72). This contradiction proves (71).

The set of those x, for which the relation

  lim_{k→∞} P_{ξ_{m_k}} η = η

is true almost everywhere with respect to P_x (for a bounded variable η invariant with respect to T_m), has the full measure π. Because the Markovian chain ξ_{m_k} (k = 1, 2, …) is recurrent, the limit lim_{k→∞} P_{ξ_{m_k}} η is almost certainly degenerate with respect to P_x, for almost all x with respect to π, and this proves that the σ-algebra of invariant, with respect to T_m, events is trivial with respect to the measure P_π. Hence

  lim_{k→∞} ( Σ_{j=0}^{k} T_{m_j} ∫₀^m T_u ξ du ) / ( Σ_{j=0}^{k} T_{m_j} ζ ) = ( P_π ∫₀^m T_u ξ du ) / P_π ζ,

and the analogous relation with η in place of ξ, is true almost everywhere with respect to P_π and, consequently,

  lim_{k→∞} ( ∫₀^{m_k} T_u ξ du ) / ( ∫₀^{m_k} T_u η du ) = ( P_π ∫₀^m T_u ξ du ) / ( P_π ∫₀^m T_u η du )

is true almost everywhere with respect to P_π. The following inequalities, which are valid for ξ ≥ 0 and m_k ≤ t < m_{k+1},

  ( Σ_{j=0}^{k−1} T_{m_j} ∫₀^m T_u ξ du ) / ( Σ_{j=0}^{k} T_{m_j} ∫₀^m T_u η du )
    ≤ ( ∫₀^t T_u ξ du ) / ( ∫₀^t T_u η du )
    ≤ ( Σ_{j=0}^{k} T_{m_j} ∫₀^m T_u ξ du ) / ( Σ_{j=0}^{k−1} T_{m_j} ∫₀^m T_u η du ),

together with the relation

  lim_{k→∞} ( Σ_{j=0}^{k} T_{m_j} ζ ) / ( Σ_{j=0}^{k−1} T_{m_j} ζ ) = 1

being true almost everywhere with respect to P_π, yields the existence, almost everywhere with respect to P_π, of the limit in the left-hand side of (70). Thus (70) is almost certainly valid with respect to P_π.

Let Γ denote the event such that the limit in the left-hand side of (70) exists and equals the right-hand side of (70). Let B be the set of those x from E for which P_x(Γ) = 1. Then, π(E \ B) = 0 and, consequently, P_x(τ_B < ∞) = 1 for all x ∈ E, where τ_B = min{k : ξ_{m_k} ∈ B}. So, by the invariance of Γ, the almost certain (with respect to P_x) inclusion ξ_{m_{τ_B}} ∈ B, and the Markovian property of the process at the time m_{τ_B},

  P_x(Γ) = P_x(T_{m_{τ_B}}^{-1} Γ) = P_x P_{ξ_{m_{τ_B}}}(Γ) = 1,

which proves the statement.

42. A typical example of a sometimes Markovian process is the semi-Markovian process. Such a process, which shall be denoted by ξ_t (t ≥ 0), is Markovian at the time m = inf{t : ξ_t ≠ ξ_0} of the first jump and has step trajectories, i.e. the trajectories are right-continuous and have left limits in the discrete topology of the phase space (E, A). The σ-algebra A contains all one-point sets and is generated by a countable number of its elements. Then the σ-algebra M, which is generated by the process ξ_t (t ≥ 0), is also countably generated because it coincides with the minimal …

3. A random walk on the semiaxis

3.1. Semicontinuous processes with independent increments in a random medium

A process (ν_t, ζ_t) (t ≥ 0), in the phase space {1, …, d} × R¹, is called a (homogeneous) process with independent increments in a random medium if its conditional distribution, under the conditions ν₀ = i and ζ₀ = x, coincides with the distribution of the process (ν_t, x + ζ_t) (t ≥ 0) subject to the conditions ν₀ = i and ζ₀ = 0.
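A minimal simulation sketch of such a pair (ν_t, ζ_t), under the simplest assumed structure (not a construction from the text): ν_t is a two-state Markov chain in continuous time and ζ_t moves with a state-dependent linear drift between switches. The rates and drifts below are arbitrary illustrative values; homogeneity in the second coordinate holds by construction, since a start at (i, x) just adds x to a path started at (i, 0). Over a long horizon the time average ζ_T/T should approach the stationary mean drift Σ_i π_i a_i.

```python
# Sketch: a Markov-modulated linear drift as a process with independent
# increments in a random medium (illustrative parameters, no jump part).
import numpy as np

rng = np.random.default_rng(1)

rate = np.array([1.0, 2.0])    # exponential switching rates of nu_t
a = np.array([1.0, -0.5])      # drift of zeta_t in each state

def simulate(T, i0=0, x0=0.0):
    """Simulate (nu_t, zeta_t) on [0, T] starting from (i0, x0)."""
    t, state, zeta = 0.0, i0, x0
    while t < T:
        hold = rng.exponential(1.0 / rate[state])
        dt = min(hold, T - t)
        zeta += a[state] * dt   # linear drift while the medium is frozen
        t += dt
        if t < T:
            state = 1 - state   # switch of the medium
    return state, zeta

pi = np.array([2/3, 1/3])       # stationary law of nu_t: pi0*r0 = pi1*r1
_, zT = simulate(T=2000.0)
print(zT / 2000.0, pi @ a)      # time-average drift vs. pi-average drift
```

Here π·a = 2/3·1 − 1/3·0.5 = 0.5, so the long-run slope of ζ_t is positive even though the drift is negative in one of the states of the medium.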

44. Theorem. If almost all trajectories of a process with independent increments in a random medium, (ν_t, ζ_t) (t ≥ 0), are right-continuous, have left limits, and P_{i,x}(ζ_t − ζ_{t−} ≥ 0) = 1, then

  || P_{i,0}(e^{sζ_t}, ν_t = j) ||_{i,j=1}^d = exp{tA(s)},

where ℜs ≤ 0, P_{i,x} are the conditional probabilities for ν₀ = i and ζ₀ = x, and

  A(s) = A₀ + ∫₀^∞ L₀(dy) e^{sy} + A₁ s + A₂ s² + ∫₀^∞ L₁(dy)[ e^{sy} − 1 − s(y ∧ 1) ].   (76)

Here A₀, A₁, A₂ and L₁(dy) are diagonal matrices with the corresponding elements a_i⁰, a_i¹, a_i², l_i(dy);

  a_i² ≥ 0,  l_i(dy) ≥ 0,  l_i({0}) = 0  and  ∫₀^∞ (y² ∧ 1) l_i(dy) < ∞.   (77)

The matrix L₀(dy) has all the diagonal elements equal to zero, its off-diagonal elements l_ij(dy) are nonnegative, and

  a_i⁰ + Σ_{j≠i} l_ij([0, ∞)) = 0.   (78)
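For the simplest special case the matrix identity can be checked numerically: take A₂ = 0, L₁ ≡ 0, and L₀ concentrated at y = 0 (switches of the medium without jumps of ζ), so that A(s) reduces to Q + s·diag(a), where Q is the switching generator with off-diagonal entries l_ij({0}) and diagonal entries a_i⁰ fixed by (78). This is an illustrative sketch with arbitrary two-state parameters, not the general proof; the Monte Carlo averages of (e^{sζ_t}, ν_t = j) should match a row of exp{tA(s)}.

```python
# Check exp{t A(s)} against Monte Carlo in the drift-plus-switching case:
# A(s) = Q + s*diag(a), Q = [[-1, 1], [2, -2]], a = (1.0, -0.5), s <= 0.
import numpy as np

rng = np.random.default_rng(42)

Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])    # l_01({0})=1, l_10({0})=2, diagonal by (78)
a = np.array([1.0, -0.5])      # drift matrix A_1 = diag(a)
s, t = -0.3, 1.0

def expm(M, squarings=10, terms=20):
    """Matrix exponential by scaling and squaring of a Taylor sum."""
    X = M / 2.0**squarings
    S = np.eye(len(M))
    T = np.eye(len(M))
    for k in range(1, terms):
        T = T @ X / k
        S = S + T
    for _ in range(squarings):
        S = S @ S
    return S

exact = expm(t * (Q + s * np.diag(a)))

# Monte Carlo estimate of P_{0,0}(e^{s zeta_t}, nu_t = j), j = 0, 1.
N = 20000
est = np.zeros(2)
for _ in range(N):
    u, state, zeta = 0.0, 0, 0.0
    while True:
        hold = rng.exponential(1.0 / (-Q[state, state]))
        if u + hold >= t:
            zeta += a[state] * (t - u)
            break
        zeta += a[state] * hold
        u += hold
        state = 1 - state
    est[state] += np.exp(s * zeta)
est /= N
print(est, exact[0])           # the two rows should agree to MC accuracy
```

The hand-rolled `expm` avoids an external dependency; in practice `scipy.linalg.expm` does the same job.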

Proof. It is sufficient to establish the existence of the mean values in the left hand side of (76) for s < 0 in the case where t > 0 because, if P,fle'i' < 00 when 0 < t < to {1 — 1 , . . . , d) and where to can depend on s, then; P,,oe5Ct = P,,0P„to,(toe'c' = Pi,o[eli'oPi/to>oeii'-o] < 00 for 0 < t -10 or 0 < t < 210. Let pt(x) = max, Ptio(info