
Dependent Laws of Large Numbers and Central Limit Theorems




For time series this is not too farfetched an assumption, for in reality they always start from scratch somewhere in the far past (e.g., 500 years ago for U.S. time series).

Definition 7.3: A time series process $X_t$ has a vanishing memory if the events in the remote $\sigma$-algebra $\mathscr{F}_{-\infty} = \bigcap_{t=0}^{\infty}\sigma(X_{-t}, X_{-t-1}, X_{-t-2}, \ldots)$ have either probability 0 or 1.

Thus, under the conditions of Theorems 7.1 and 7.2 and the additional assumption that the covariance stationary time series process involved has a vanishing memory, the deterministic term $W_t$ in the Wold decomposition is 0 or is a zero vector, respectively.
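As an added illustration (not part of the original text; the AR(1) model, the parameter values, and all variable names are assumptions chosen for the example), the following minimal Python sketch shows why a stationary AR(1) process has vanishing memory: its MA($\infty$) coefficients $\alpha_m = \beta^m$ decay geometrically, so $X_t$ is numerically indistinguishable from a function of recent innovations alone, and the deterministic Wold term is zero.

```python
import numpy as np

# Sketch: a stationary AR(1) process X_t = mu + beta*(X_{t-1} - mu) + U_t
# with |beta| < 1 has the MA(infinity) form X_t - mu = sum_m beta**m * U_{t-m},
# i.e., alpha_m = beta**m and sum_m alpha_m**2 = 1/(1 - beta**2) < infinity.
rng = np.random.default_rng(0)
beta, mu = 0.8, 1.0
n_burn, n_show, K = 1000, 5, 200   # burn-in, values to print, MA truncation lag

u = rng.normal(size=n_burn + n_show)     # innovations U_t
x = 10.0                                 # arbitrary remote-past starting value
path = []
for t in range(n_burn + n_show):
    x = mu + beta * (x - mu) + u[t]      # AR(1) recursion
    path.append(x)

# Rebuild the last few values from recent innovations only; the truncation
# error is O(beta**K), so the remote past (including x = 10.0) is irrelevant.
for t in range(n_burn, n_burn + n_show):
    ma_approx = mu + sum(beta**m * u[t - m] for m in range(K + 1))
    print(f"t={t}: recursion {path[t]:+.10f}  truncated MA {ma_approx:+.10f}")
```

The geometric decay of the dependence on the remote past is the intuition behind the 0-1 behavior that Definition 7.3 formalizes.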

7.2. Weak Laws of Large Numbers for Stationary Processes

I will show now that covariance stationary time series processes with a vanishing memory obey a weak law of large numbers and then specialize this result to strictly stationary processes.

Let $X_t \in \mathbb{R}$ be a covariance stationary process, that is, for all $t$, $E[X_t] = \mu$, $\mathrm{var}[X_t] = \sigma^2$, and $\mathrm{cov}(X_t, X_{t-m}) = \gamma(m)$. If $X_t$ has a vanishing memory, then by Theorem 7.1 there exist uncorrelated random variables $U_t \in \mathbb{R}$ with zero expectations and common finite variance $\sigma_u^2$ such that $X_t - \mu = \sum_{m=0}^{\infty} \alpha_m U_{t-m}$, where $\sum_{m=0}^{\infty} \alpha_m^2 < \infty$.

Then
$$\gamma(k) = E\left[\left(\sum_{m=0}^{\infty} \alpha_{m+k} U_{t-m}\right)\left(\sum_{m=0}^{\infty} \alpha_m U_{t-m}\right)\right] = \sigma_u^2 \sum_{m=0}^{\infty} \alpha_{m+k}\,\alpha_m. \tag{7.11}$$

Because $\sum_{m=0}^{\infty} \alpha_m^2 < \infty$, it follows that $\lim_{k\to\infty} \sum_{m=k}^{\infty} \alpha_m^2 = 0$. Hence, it follows from (7.11) and the Schwarz inequality that
$$|\gamma(k)| \leq \sigma_u^2 \sqrt{\sum_{m=k}^{\infty} \alpha_m^2}\,\sqrt{\sum_{m=0}^{\infty} \alpha_m^2} \to 0 \quad \text{as } k \to \infty.$$

Consequently,
$$\mathrm{var}\left[(1/n)\sum_{t=1}^{n} X_t\right] = \sigma^2/n + 2(1/n^2)\sum_{m=1}^{n-1}(n-m)\gamma(m) \leq \sigma^2/n + 2(1/n)\sum_{m=1}^{n-1}|\gamma(m)| \to 0 \quad \text{as } n \to \infty. \tag{7.12}$$
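As an added numerical check (not in the original; the AR(1) specification with $\alpha_m = \beta^m$ and all parameter values are assumptions), the sketch below evaluates $\gamma(k)$ from (7.11) in closed form, compares the exact variance of the sample mean from the first line of (7.12) with a Monte Carlo estimate, and evaluates the upper bound from the second line:

```python
import numpy as np

# Sketch: for alpha_m = beta**m, (7.11) gives
#   gamma(k) = sigma_u**2 * beta**k / (1 - beta**2).
rng = np.random.default_rng(1)
beta, sigma_u, n, reps = 0.8, 1.0, 200, 5000

def gamma(k):
    return sigma_u**2 * beta**k / (1 - beta**2)   # closed form of (7.11)

# First line of (7.12): exact variance of the sample mean.
var_exact = gamma(0) / n + (2 / n**2) * sum((n - m) * gamma(m) for m in range(1, n))
# Second line of (7.12): the upper bound, which tends to 0 as n grows.
var_bound = gamma(0) / n + (2 / n) * sum(abs(gamma(m)) for m in range(1, n))

# Monte Carlo estimate: many stationary AR(1) paths of length n (mu = 0).
x = rng.normal(scale=np.sqrt(gamma(0)), size=reps)  # stationary start X_0
s = np.zeros(reps)
for t in range(n):
    x = beta * x + sigma_u * rng.normal(size=reps)
    s += x
print(f"exact var[(1/n) sum X_t] = {var_exact:.5f}")
print(f"Monte Carlo estimate     = {(s / n).var():.5f}")
print(f"bound from (7.12)        = {var_bound:.5f}")
```

Since the bound in (7.12) vanishes as $n \to \infty$, Chebishev's inequality $P[|(1/n)\sum_{t=1}^{n} X_t - \mu| > \varepsilon] \leq \mathrm{var}[(1/n)\sum_{t=1}^{n} X_t]/\varepsilon^2$ delivers the weak law stated next.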

From Chebishev's inequality, it follows now from (7.12) that

Theorem 7.3: If $X_t$ is a covariance stationary time series process with vanishing memory, then $\mathrm{plim}_{n\to\infty}\,(1/n)\sum_{t=1}^{n} X_t = E[X_1]$.

This result requires that the second moment of $X_t$ be finite. However, this condition can be relaxed by assuming strict stationarity:

Theorem 7.4: If $X_t$ is a strictly stationary time series process with vanishing memory, and $E[|X_1|] < \infty$, then $\mathrm{plim}_{n\to\infty}\,(1/n)\sum_{t=1}^{n} X_t = E[X_1]$.
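An added illustration of why the relaxation matters (the Pareto distribution and its parameters are assumptions chosen for this sketch): i.i.d. draws form a strictly stationary process with vanishing memory (the remote $\sigma$-algebra is trivial by Kolmogorov's zero-one law), and Pareto(1.5) variates have $E[|X_1|] = 3 < \infty$ but infinite variance, so Theorem 7.3 is silent while Theorem 7.4 still applies:

```python
import numpy as np

# Sketch: i.i.d. Pareto(a) draws on [1, inf) with tail index a = 1.5:
# E[X_1] = a/(a-1) = 3 is finite, but var[X_1] is infinite.
rng = np.random.default_rng(2)
a = 1.5
for n in (10**3, 10**4, 10**5, 10**6):
    u = rng.uniform(size=n)                 # U in [0, 1)
    x = (1.0 - u) ** (-1.0 / a)             # inverse-CDF Pareto sampling
    print(f"n = {n:>7}: sample mean = {x.mean():.4f}   (E[X_1] = {a/(a-1):.4f})")
```

The sample mean settles near 3 even though its variance is undefined, consistent with the conclusion of Theorem 7.4.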

Proof: Assume first that $P[X_t \geq 0] = 1$. For any positive real number $M$, $X_t I(X_t \leq M)$ is a covariance stationary process with vanishing memory; hence, by Theorem 7.3,

$$\mathrm{plim}_{n\to\infty}\,(1/n)\sum_{t=1}^{n}\left(X_t I(X_t \leq M) - E[X_1 I(X_1 \leq M)]\right) = 0. \tag{7.13}$$

Next, observe that
$$\left|(1/n)\sum_{t=1}^{n}(X_t - E[X_1])\right| \leq \left|(1/n)\sum_{t=1}^{n}\left(X_t I(X_t \leq M) - E[X_1 I(X_1 \leq M)]\right)\right| + \left|(1/n)\sum_{t=1}^{n}\left(X_t I(X_t > M) - E[X_1 I(X_1 > M)]\right)\right|. \tag{7.14}$$

Because, for nonnegative random variables $Y$ and $Z$, $P[Y + Z > \varepsilon] \leq P[Y > \varepsilon/2] + P[Z > \varepsilon/2]$, it follows from (7.14) that for arbitrary $\varepsilon > 0$,
$$P\left[\left|(1/n)\sum_{t=1}^{n}(X_t - E[X_1])\right| > \varepsilon\right] \leq P\left[\left|(1/n)\sum_{t=1}^{n}\left(X_t I(X_t \leq M) - E[X_1 I(X_1 \leq M)]\right)\right| > \varepsilon/2\right] + P\left[\left|(1/n)\sum_{t=1}^{n}\left(X_t I(X_t > M) - E[X_1 I(X_1 > M)]\right)\right| > \varepsilon/2\right]. \tag{7.15}$$

For an arbitrary $\delta \in (0, 1)$, we can choose $M$ so large that $E[X_1 I(X_1 > M)] < \varepsilon\delta/8$. Hence, if we use Chebishev's inequality for first moments, the last probability in (7.15) can be bounded by $\delta/2$:

$$P\left[\left|(1/n)\sum_{t=1}^{n}\left(X_t I(X_t > M) - E[X_1 I(X_1 > M)]\right)\right| > \varepsilon/2\right] \leq \frac{4E[X_1 I(X_1 > M)]}{\varepsilon} < \delta/2. \tag{7.16}$$
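To see the truncation step numerically (an added sketch; the Pareto example and the specific $\varepsilon$, $\delta$ values are assumptions), note that for Pareto($a$) draws $E[X_1 I(X_1 > M)] = (a/(a-1))M^{1-a}$, so $M$ can always be chosen to push the tail mean below $\varepsilon\delta/8$; the sketch then checks the first-moment bound in (7.16) by simulation:

```python
import numpy as np

# Sketch: truncation step of the proof with Pareto(1.5) data.
a, eps, delta = 1.5, 0.1, 0.2
tail_mean = lambda M: (a / (a - 1)) * M ** (1 - a)   # E[X_1 I(X_1 > M)]

# Choose M so that E[X_1 I(X_1 > M)] < eps*delta/8 (factor 2 keeps it strict).
M = 2 * ((a / (a - 1)) * 8 / (eps * delta)) ** (1 / (a - 1))
print(f"M = {M:.0f}, tail mean = {tail_mean(M):.6f} < {eps*delta/8:.6f}")

# Empirical frequency of the event bounded in (7.16).
rng = np.random.default_rng(3)
n, reps, hits = 10**4, 2000, 0
for _ in range(reps):
    x = (1.0 - rng.uniform(size=n)) ** (-1.0 / a)    # Pareto draws
    y = np.where(x > M, x, 0.0)                      # X_t * I(X_t > M)
    hits += abs(y.mean() - tail_mean(M)) > eps / 2
print(f"empirical prob = {hits/reps:.4f} <= 4*tail/eps = "
      f"{4*tail_mean(M)/eps:.4f} < delta/2 = {delta/2:.4f}")
```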