
Stochastic Processes: Syllabus and Exercise

Narn-Rueih Shieh **Copyright Reserved**

§ This course is suitable for those who have taken Basic Probability; some knowledge of Measure Theory is better (for Chapters 4, 5).

§ The content and exercises are adapted from the following:

S Karlin and H Taylor: A First Course in Stochastic Processes, 2nd Ed (1975)

J Lamperti: Probability (1966); Stochastic Processes (1977)

R Durrett: Essentials of Stochastic Processes (1998) [This one is recommended for your own reading]

§ Grading:

HmWk and OnBb: 40 percent; Exam(s): 60 percent. Subject to adjustment.


Chapter 1: Basic Theory of Stochastic Processes

§ mathematical framework, finite–dim'l distributions

§ Kolmogorov Existence Theorem

§ discrete time, continuous time, multiparameter time (random field)

§ state space

§ sample path continuity (Kolmogorov Continuity Theorem)

§ sample path jump property

§ equivalence of processes

§ convergence of processes

§ exercise of Chapter 1

1. BPR (Basic Probability Review): Prove the following monotone continuity of "probability measure": if A_{n+1} ⊂ A_n and ∩_n A_n = ∅, then lim_{n→∞} P(A_n) = 0.

2. BPR: Prove the right continuity and the existence of left limits of a distribution function F(x) in x. How about the multivariate case x ∈ R^n?

3. BPR: Prove that, (i) if {X_n} is an uncorrelated random sequence, all with mean 0 and variance 1, then the WLLN holds; (ii) if {X_n} is iid with fourth moments, then the SLLN holds.

4. BPR: Let {X_n} be iid B(p), let X̄_n be the sample mean, and let f be a continuous function on [0, 1]. Prove that B_n(p) := E_p[f(X̄_n)] is an n-th order polynomial in p and that B_n(x) converges to f(x), uniformly for x ∈ [0, 1].
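
The following Python sketch is a numerical companion to Ex 4; the test function f(x) = |x − 1/2| and all names are my own choices, not from the text. It evaluates B_n(p) exactly from the Binomial(n, p) distribution and prints the sup-norm error, which should shrink as n grows.

```python
# Bernstein polynomial B_n(p) = E_p[f(Xbar_n)] for Xbar_n the mean of n iid B(p) rv's.
import numpy as np
from math import comb

def bernstein(f, n, p):
    """Evaluate B_n(p) exactly via the Binomial(n, p) distribution."""
    k = np.arange(n + 1)
    weights = np.array([comb(n, int(j)) for j in k]) * p**k * (1 - p)**(n - k)
    return np.sum(f(k / n) * weights)

f = lambda x: np.abs(x - 0.5)          # any continuous f on [0, 1] will do
xs = np.linspace(0, 1, 201)
for n in (10, 50, 200):
    err = max(abs(bernstein(f, n, x) - f(x)) for x in xs)
    print(f"n = {n:4d}, sup-norm error = {err:.4f}")   # decreases as n grows
```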


5. Refer to the K Existence Theorem: prove that the class F of all sets of the form {ω : a_i < ω(t_i) ≤ b_i, i = 1, · · · , k}, where t_i, a_i, b_i, k vary freely, forms a field, i.e. Ω ∈ F, C_1 \ C_2 ∈ F and ∪_{l=1}^m C_l ∈ F whenever the assumed members are in F. Prove that the P defined in the K Theorem is an "additive set function" on F.

6. Let F_n be a sequence (finite or infinite) of distribution functions on R. Refer to the K Theorem; prove that there is a probability space (Ω, F, P) and a sequence of rv's X_n defined on Ω such that the X_n are independent and each X_n has distribution F_n.

7. Let F_n in Ex 6 be of the form F_n(x) = 0, x < 0; = 1/2, 0 ≤ x < 1; = 1, x ≥ 1. Show how the probability space in Ex 6 corresponds to the unit interval [0, 1] with P corresponding to Lebesgue–Borel measure.

8. Let X_t be a continuous time process such that the increments X_t − X_s are Gaussian with mean 0 and variance t − s. Prove that X has continuous sample paths, by applying the K Continuity Theorem and the moments of a mean-0 Gaussian rv given in BP.
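
For Ex 8, the needed moment estimate is short enough to record here as a sketch (the Continuity Theorem is quoted in the common form E|X_t − X_s|^α ≤ C|t − s|^{1+β}):

```latex
% Fourth moment of a mean-zero Gaussian increment:
X_t - X_s \sim N(0,\, t - s)
\;\Longrightarrow\;
\mathbb{E}\,|X_t - X_s|^{4} = 3\,(t - s)^{2},
% so the criterion holds with \alpha = 4, \beta = 1, C = 3.
```

so X admits a modification with continuous (indeed Hölder continuous of any order < β/α = 1/4) sample paths.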

9. How do you figure out the K Existence Theorem and the K Continuity Theorem for the multiparameter time (random field) case?

10. BPR: A sequence of distributions F_n on R is said to converge to a distribution F if F_n(x) → F(x) at each continuity point x of F. It is then also said that the corresponding X_n converge to X in distribution. Prove that, if X_n converges to X in probability, then it converges in distribution.


Chapter 2: Discrete Time Markov Chains

§ definition, transition probabilities

§ finite–dim’l distributions

§ random walks

§ examples of finite MC: on–off, weather, gambling, Ehrenfest, Wright–Fisher

§ examples of denumerable MC: queue, branching, birth and death

§ n–step transition probabilities, Chapman–Kolmogorov equation

§ i → j, i ↔ j, irreducibility

§ periodicity

§ recurrent states and transient states

§ usage of probability generating function (pgf)

§ recurrence theorem

§ recurrence and transience of SRW

§ discrete renewal equation

§ basic limit theorem

§ ergodicity

§ stationary equation and stationary distribution

§ examples of stationary MC

§ Galton–Watson branching process (GWBP), usage of pgf

§ GWBP, extinction probability


§ examples of GWBP

§ exercise of Chapter 2

1. BPR: conditional probability, multiplicative formula and formula of total probabilities. P(·|E) is also a probability measure.

2. Assume the weather of any day depends on the weather conditions of the two previous days. Set up a weather chain under this assumption by introducing the product space of the original states S, C, R.

3. An inventory chain is a model for the continuing demand of some goods, say a video game, in one (say, PC) store. If at the end of one business day the number of units they have on hand is 1 or 0, then they order new units so that their total on hand is, say, S = 5. Assume the new units arrive before the next business day. Assume the number of customers buying one unit on a given day is 0, 1, 2, or 3 with probabilities 0.3, 0.4, 0.2 and 0.1. Describe the transition matrix with state space 0, · · · , 5. In general, when the stock on hand is ≤ s we order new units to bring the stock back to S. Let d_{n+1} be the demand on day n + 1; use the positive part function x⁺ = max{x, 0} to write X_{n+1} in terms of X_n, S, d_{n+1}.

4. In the above video game store, assume that it makes USD 12 profit on each unit sold but it costs USD 2 per day to store one unit. What is the long-run profit per day of the S = 5, s = 1 inventory policy? How should S, s be chosen to maximize the profit?
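
A minimal numerical sketch for Ex 3–4, assuming restocking happens before the next day's demand and charging the USD 2 holding cost on the start-of-day stock (an accounting convention I am supplying, not stated in the text). It builds the transition matrix, finds the stationary distribution, and estimates the long-run profit per day.

```python
import numpy as np

demand = {0: 0.3, 1: 0.4, 2: 0.2, 3: 0.1}   # daily demand distribution

def inventory_chain(S=5, s=1):
    P = np.zeros((S + 1, S + 1))
    for x in range(S + 1):
        start = x if x > s else S            # stock at the start of the next day
        for d, q in demand.items():
            P[x, max(start - d, 0)] += q     # X_{n+1} = (start - d)^+
    return P

def long_run_profit(S=5, s=1):
    P = inventory_chain(S, s)
    # stationary distribution: left eigenvector of P for eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()
    profit = 0.0
    for x in range(S + 1):
        start = x if x > s else S
        for d, q in demand.items():
            profit += pi[x] * q * (12 * min(start, d) - 2 * start)
    return profit

print(inventory_chain())
print("long-run profit/day (S=5, s=1):", round(long_run_profit(), 3))
```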

5. In the gambling chain, let h(x) denote the probability that A wins all the money, i.e. B is ruined, when A starts with x dollars. Calculate h(x) by solving a certain difference equation for h(x), with boundary conditions h(0) = 0, h(N) = 1. Note that the answer is of a different type for the case p = q = 1/2 and the case p ≠ q.

6. In Ex 5, calculate the expected duration of the game in the fair case p = q = 1/2: H(x) = E_x[T], T := min{n : X(n) ∈ {0, N}}. Again, solve a certain difference equation for H(x), with boundary conditions H(0) = H(N) = 0.
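
Rather than giving the closed forms away, here is a numerical cross-check for Ex 5–6: both boundary-value difference equations become small linear systems over the interior states (a sketch; N = 10 and p are arbitrary choices).

```python
import numpy as np

def ruin_and_duration(N=10, p=0.5):
    q = 1.0 - p
    n = N - 1                                    # interior states 1..N-1
    A = np.zeros((n, n)); bh = np.zeros(n); bH = np.ones(n)
    for i, x in enumerate(range(1, N)):
        A[i, i] = 1.0                            # h(x) - p h(x+1) - q h(x-1) = 0
        if x + 1 <= N - 1:
            A[i, i + 1] = -p
        else:
            bh[i] += p * 1.0                     # boundary value h(N) = 1
        if x - 1 >= 1:
            A[i, i - 1] = -q                     # boundary value h(0) = 0 adds nothing
    h = np.linalg.solve(A, bh)                   # win probabilities h(1..N-1)
    H = np.linalg.solve(A, bH)                   # expected durations H(1..N-1)
    return h, H

h, H = ruin_and_duration(N=10, p=0.5)
print(h)     # compare with the closed form you derive
print(H)     # in the fair case this should be x(N - x)
```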

7. Discuss the barriers 0, N for the WF model with mutation.

8. Write down the transition matrix for the birth and death chain.

9. Use the CK Equation to prove that the relation i → j is transitive.

10. Determine the classes and the periodicity of all states in the following Markov matrix:

     0     1     0     0
     0     0     0     1
     0     1     0     0
    1/3    0    2/3    0
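
A computational cross-check for Ex 10 (a sketch, not a proof): communicating classes via mutual reachability, and the period of each state as the gcd of the lengths of positive-probability return loops.

```python
from math import gcd
import numpy as np

P = np.array([[0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 1, 0, 0],
              [1/3, 0, 2/3, 0]])
n = len(P)

reach = (P > 0)
for k in range(n):                               # Boolean transitive closure
    reach = reach | (reach[:, [k]] & reach[[k], :])

classes, seen = [], set()
for i in range(n):
    if i in seen:
        continue
    cls = {j for j in range(n) if (reach[i, j] and reach[j, i]) or j == i}
    classes.append(sorted(cls))
    seen |= cls

def period(i, max_len=50):
    """gcd of all step counts <= max_len with p_ii^(n) > 0."""
    Pk, g = np.eye(n), 0
    for step in range(1, max_len + 1):
        Pk = Pk @ P
        if Pk[i, i] > 0:
            g = gcd(g, step)
    return g

print("classes:", classes)
print("periods:", [period(i) for i in range(n)])
```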

11. Use the CK Eq to prove that i ↔ j ⇒ d(i) = d(j).

12. Use the σ-additivity of probability to prove that i is recurrent iff Σ_n f_ii^(n) = 1.

13. Use power series and the Cauchy product to prove that P_ii(s) − 1 = F_ii(s)P_ii(s), and that P_ij(s) = F_ij(s)P_jj(s), i ≠ j.

14. Use CK to prove that if i ↔ j then i, j are either both recurrent or both transient.

15. Try to use a three-state MC to show that the expected number of tosses of a fair coin to get two consecutive Tails is 6. Durrett p70.
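
One compact way to see the answer in Ex 15 is first-step analysis on the three states "no tail yet / one tail / two tails": with m_0, m_1 the expected number of further tosses from the first two states,

```latex
m_0 = 1 + \tfrac12\, m_1 + \tfrac12\, m_0, \qquad
m_1 = 1 + \tfrac12 \cdot 0 + \tfrac12\, m_0
\;\Longrightarrow\; m_1 = 4,\ m_0 = 6 .
```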

16. Let x = {x_i} satisfy the stationary equation x = xP. Prove that, for each n, x_j = Σ_i x_i p_ij^(n).


17. A Markov matrix is called doubly stochastic if Σ_i p_ij = 1 for each j. Prove that the corresponding MC then has at most one stationary prob distribution.

18. Prove that for the merry–go–round model, the stationary distribution is π_i = 1/l for all i.

19. Find the stationary distribution for Ehrenfest chain.

20. Write down and try to solve, exactly or numerically, the extinction probability equations for the binary and the binomial GWBPs.
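
A small numerical sketch for Ex 20, with my own (hypothetical) parametrizations: "binary" offspring law P(0 children) = 1 − p, P(2 children) = p, and "binomial" offspring Bin(N, p); adjust to the laws used in class. The extinction probability is the smallest root of f(s) = s in [0, 1], found here by fixed-point iteration from s = 0.

```python
def extinction(f, tol=1e-12, max_iter=100_000):
    """Iterate s <- f(s) from s = 0; converges to the smallest fixed point in [0, 1]."""
    s = 0.0
    for _ in range(max_iter):
        s_new = f(s)
        if abs(s_new - s) < tol:
            return s_new
        s = s_new
    return s

p, N = 0.6, 3
binary   = lambda s: (1 - p) + p * s**2          # pgf of the binary offspring law
binomial = lambda s: (1 - p + p * s)**N          # pgf of Bin(N, p) offspring

print("binary GWBP extinction prob   :", extinction(binary))    # equals (1-p)/p = 2/3 here
print("binomial GWBP extinction prob :", extinction(binomial))
```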

21. Calculate f_n(s) = f(f_{n−1}(s)) explicitly for f(s) := (α + βs)/(γ + δs), αδ − βγ ≠ 0.

22. Apply Ex 21 to get P{X_n = 0}, and then verify the extinction formula in this case.


Chapter 3: Continuous Time Markov Chains

§ continuous time Markov transition probabilities and CK Equation

§ construction from discrete time to continuous time MC

§ Poisson process

§ Yule process

§ birth and death process

§ infinitesimal generating matrix

§ renewal process and its SLLN, CLT

§ renewal equation and renewal theorem

§ continuous time Markov branching process, usage of pgf

§ continuous time Markov branching process, extinction probability

§ exercise of Chapter 3

1. BPR: show that an exponential distribution T with parameter λ has the following "lack of memory" property: P(T > t + s | T > t) = P(T > s).
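
The computation behind Ex 1 is one line:

```latex
P(T > t + s \mid T > t)
= \frac{P(T > t + s)}{P(T > t)}
= \frac{e^{-\lambda(t+s)}}{e^{-\lambda t}}
= e^{-\lambda s}
= P(T > s).
```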

2. Try the following Matlab graphic: let U_i be U(0, 1) distributed and let τ_i = −ln U_i. Define, as you will, the transitions of Y_n by some random numbers. Graph X_t based on Y_n and τ_i.
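
A Python analogue of the Matlab graphic in Ex 2 (a sketch; the jump chain Y_n below is an arbitrary uniform choice on five states, exactly in the "as you will" spirit):

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_jumps = 30
tau = -np.log(rng.uniform(size=n_jumps))        # Exp(1) holding times tau_i = -ln U_i
jump_times = np.concatenate(([0.0], np.cumsum(tau)))
Y = rng.integers(0, 5, size=n_jumps + 1)        # "as you will": arbitrary jump chain Y_n

# X_t equals Y_n on [jump_times[n], jump_times[n+1]); draw it as a step function
plt.step(jump_times, Y, where="post")
plt.xlabel("t"); plt.ylabel("X_t")
plt.title("Piecewise-constant path built from Y_n and tau_i")
plt.show()
```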

3. BPR: for the waiting times (inter-arrival times) T_i, i = 1, 2, · · ·, of a Poisson process, show that S_n = T_1 + · · · + T_n has a Γ distribution by computing the convolutions.

4. Compute the covariance of X(t) and X(t + s) of a Poisson process.


5. Let Y_1, Y_2, · · · be an iid random sequence, and let N(t) be a Poisson process, independent of Y_1, Y_2, · · ·. Define X(t, ω) = Y_1 + · · · + Y_{N(t,ω)}, with X(t, ω) = 0 when N(t, ω) = 0. X(t) is called a compound Poisson process. Calculate E(X(t)), Var(X(t)) by the following Wald identities. Writing X, N for X(1), N(1), show that E(X) = EN · EY_1 and Var(X) = EN · Var(Y_1) + Var(N)(EY_1)^2. Key: look at the formulae when N is nonrandom, then calculate E(X) using the total probability formula of conditional probability based on (N = n); how will the independence assumption be used? Durrett p138
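
The conditioning argument hinted at in the "Key" can be written out as follows; the independence of N and the Y_i is exactly what lets each conditional expectation be computed with N frozen at n:

```latex
\begin{align*}
E(X)   &= \sum_n E[\,Y_1+\cdots+Y_n \mid N=n\,]\,P(N=n)
        = \sum_n n\,E(Y_1)\,P(N=n) = E(N)\,E(Y_1),\\
E(X^2) &= \sum_n \bigl( n\,\mathrm{Var}(Y_1) + n^2 (E Y_1)^2 \bigr) P(N=n)
        = E(N)\,\mathrm{Var}(Y_1) + E(N^2)\,(E Y_1)^2,\\
\mathrm{Var}(X) &= E(X^2) - (EX)^2
        = E(N)\,\mathrm{Var}(Y_1) + \mathrm{Var}(N)\,(E Y_1)^2 .
\end{align*}
```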

6. Let N_1(t), N_2(t) be two independent Poisson processes, with parameters λ_1, λ_2; prove that the sum process X(t) = N_1(t) + N_2(t) is also a Poisson process, with parameter λ_1 + λ_2.

7. Verify that the Yule process, with parameter β, has p_k(t) = e^{−βt}(1 − e^{−βt})^{k−1}, k = 1, 2, · · ·. Then verify that the pgf of X(t) is f_t(s) = s e^{−βt} / (1 − (1 − e^{−βt})s), −1 ≤ s ≤ 1.

8. Consider a Yule process with X(0) = N, N > 1. Regard the process as the sum of N independent Yule processes, each starting with 1. Then obtain the pgf of X(t).

9. In Ex 7, obtain p_k(t) by using the Taylor expansion of (1/(1 − x))^N.

10. In the Yule process, with parameter β, let T_n be the time, starting with 1 particle, to get n + 1 particles. Prove that E T_n ≈ (ln n)/β for large n.

11. In the birth and death process, assume that Σ_k π_k < ∞ and that Σ_k p_k = 1; prove that p_j = π_j / Σ_k π_k, j = 1, 2, · · ·.


12. In the birth and death process with λ_k = λk + a, μ_k = μk, let M(t) = EX(t); use the 2nd Kolmogorov equation to derive that M′(t) = a + (λ − μ)M(t). Try to solve this d.e. (the cases λ = μ, λ > μ, λ < μ separately).
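
For reference, the integrating-factor solution of the d.e. in Ex 12, writing M_0 = M(0) (the case split is exactly the one asked for):

```latex
M(t) =
\begin{cases}
\left(M_0 + \dfrac{a}{\lambda-\mu}\right) e^{(\lambda-\mu)t} - \dfrac{a}{\lambda-\mu},
  & \lambda \neq \mu,\\[1.2em]
M_0 + a\,t, & \lambda = \mu .
\end{cases}
```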

13. How can you understand the matrix-valued o.d.e. P′(t) = P(t)A? How do you consider the convergence?

14. Carry out the proof of the CLT for the renewal process.

15. In the renewal process, assume that at the i-th renewal you earn a reward (or, say, you need to pay a cost) r_i. Write R(t) = Σ_{i=1}^{N(t)} r_i. Try to explain the limit R(t)/t → E r_i / E X_i.

17. In the continuous time Markov branching process, calculate Var(X(t)) in terms of u′(1), u″(1) in a way similar to the GWBP in Chapter 2. There are two cases, u′(1) ≠ 0 and u′(1) = 0. KT p438

18. In the continuous time Markov branching process, let u(s) = s^k − s, say k = 2; solve ∂φ(t, s)/∂t = u(φ(t, s)), φ(0, s) = s, to get φ(t, s). KT p439

19. Repeat Ex 18 with u(s) = 1 − s − √(1 − s). KT p439

20. If the lifetime of the particle has a distribution with density (1/2)t e^{−t}, then the corresponding continuous time branching process is not Markovian. However, the distribution is the sum of two independent exponential distributions, with parameter 1. So we can think of the branching process as X(t) = (X_1(t), X_2(t)), each X_i a Markov branching process; try to derive the u function of this two-type branching process. Here u should be a vector u = (u_1, u_2) and each u_i has two variables s_1, s_2. KT p431


Chapter 4: Second Order Processes

§ review of Hilbert space L²(Ω, F, P): CS inequality, inner product, completeness, GS orthonormalization, decomposition theorem

§ 2nd order process and its covariance function

§ continuity and differentiation of continuous time 2nd order processes

§ integration of continuous time 2nd order processes

§ covariance–stationary 2nd order processes

§ spectral measure of covariance function

§ spectral representation of 2nd order processes

§ predictions: linear prediction and optimal prediction

§ Wold decomposition

§ Gaussian process and system

§ exercise of Chapter 4

1. Prove CS inequality and GS o.n. procedure.

2. BPR: WLLN and SLLN; let ξ_n be iid with mean 0 and variance 1. Let X_n = (1/n) Σ_{k=1}^n ξ_k. Prove that X_n → 0 both in probability and in the mean square sense. Give an example showing that in general the convergence need not hold w.p. 1. Prove that it does hold w.p. 1 if the ξ_n are assumed to have 4th moments.

3. Express the mean square ||X_t − X_s||² in terms of the covariance function K. Then use completeness to prove that if K(s, t) is continuous at (t_0, t_0), then the process is continuous at t_0.

4. A Wiener process is a continuous time 2nd order process such that the process has independent and stationary increments (recall the Poisson process) and each X_t is Gaussian with mean 0 and variance σt, for some constant σ > 0. Compute its cov function, and see that the process is continuous yet not differentiable.

5. Do Ex 4 for the centered Poisson process N_t − λt, where N_t is a Poisson process with parameter λ.

6. Try to formulate the "fundamental theorem of Riemann calculus" for 2nd order continuous time processes.

7. Let X(t), Y(t) be two independent 2nd order processes, with cov functions K_1, K_2. Prove that K_1 · K_2 is the cov function of the product process X(t)Y(t).

8. Define the random cosine process by X_t = A cos(ηt + φ), where A, η are real-valued rv's and φ is Unif(0, 2π) and independent of A, η. Compute the cov function of the process and see whether it is cov-stationary.

9. Compute the cov function of the AR process X_n = αX_{n−1} + βξ_n, n ∈ Z.

10. Let A_1, · · · , A_n be orthogonal rv's and λ_1, · · · , λ_n be reals. Define the process X_t = Σ_{j=1}^n A_j e^{iλ_j t}. Compute its cov function and spectral measure dF(λ).

11. Let X_t be a cov-stationary process; write down, formally, the spectral representations of the derivative process X′_t and the integral process ∫^t X(s) ds from that of X_t. What should sufficient conditions be to ensure the formal expressions?

12. Let X_t be an orthogonal increments process with associated function F. Fix t_0. Define the integral process Y_t := ∫_{t_0}^{t} f(u) dX(u) = ∫ 1_{[t_0, t]}(u) f(u) dX(u). Prove that Y_t is also an orthogonal increments process and that the associated function of Y is given by G(t) = ∫_{t_0}^{t} |f|² dF. How about the integral ∫ g dY = ∫ g f dX?

13. BPR: recall two rv’s X, Y form a bivariate Gaussian distribution, if X, Y have jpdf f (x, y) :=· · ·; write down the conditional density of X with respect to Y .

14. Let ξ_n be a noise process, and let X_n = ξ_n − αξ_{n−1}. Show that the spectral measure dF of the process X_n has a density f(λ) = (1/2π)|1 − αe^{iλ}|².

15. Let X_n, n ∈ Z, be cov stationary, with cov function K(m), m = 0, ±1, · · ·. Let α_i, i = 1, · · · , p be such that X̂_n = α_1 X_{n−1} + · · · + α_p X_{n−p} is the best linear prediction of X_n w.r.t. X_{n−1}, · · · , X_{n−p}. Show that the α_i, i = 1, · · · , p solve the linear equations K(j) = α_1 K(j − 1) + · · · + α_p K(j − p), j = 1, · · · , p. Find the prediction error. KT p471
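
A numerical companion to Ex 15 (a sketch): given a covariance function K(m), the normal equations above are a p × p linear system. The example covariance K(m) = ρ^{|m|} is my own illustrative choice, for which the predictor should load only on X_{n−1}.

```python
import numpy as np

def best_linear_predictor(K, p):
    """K: function m -> K(m); returns (alpha, mean-square prediction error)."""
    T = np.array([[K(j - i) for i in range(1, p + 1)] for j in range(1, p + 1)])
    k = np.array([K(j) for j in range(1, p + 1)])
    alpha = np.linalg.solve(T, k)          # solves K(j) = sum_i alpha_i K(j - i)
    mse = K(0) - alpha @ k                 # E|X_n - Xhat_n|^2
    return alpha, mse

rho = 0.7
K = lambda m: rho ** abs(m)
for p in (1, 2, 3):
    alpha, mse = best_linear_predictor(K, p)
    print(p, np.round(alpha, 4), round(mse, 4))
# For K(m) = rho^|m| the weights should be (rho, 0, ..., 0) and the error 1 - rho^2.
```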


Chapter 5: Brownian Motions

§ math def of BM: Wiener process, sample path continuity

§ finite dim’l distributions of BM

§ Fourier–Wiener expansion of BM

§ Markov and strong Markov property

§ First passage time, reflection principle

§ nondifferentiability of Brownian paths

§ quadratic variation of Brownian paths

§ BM with drift

§ U(t) = B²(t) − t and V(t) = exp(λB(t) − (1/2)λ²t)

§ multi–dim’l BM

§ Brownian bridge and OU process

§ exercise of Chapter 5: Durrett p265, Ex 6.1–6.11
