
Norges teknisk-naturvitenskapelige universitet
Institutt for matematiske fag

SUGGESTED SOLUTIONS: EXAM IN TMA4295 STATISTICAL INFERENCE
Friday 6 June 2008, Time: 09:00-13:00

Problem 1

Let $X_1, \dots, X_n$ be iid from a beta distribution with parameters $(\theta, 4\theta)$, i.e. from a distribution with pdf
$$f(x \mid \theta) = \frac{\Gamma(5\theta)}{\Gamma(\theta)\Gamma(4\theta)}\, x^{\theta-1}(1-x)^{4\theta-1}, \quad 0 < x < 1,\ \theta > 0.$$

(a) List at least three different one-dimensional sufficient statistics.

Solution. The likelihood function is
$$f(\mathbf{x} \mid \theta) = \left(\frac{\Gamma(5\theta)}{\Gamma(\theta)\Gamma(4\theta)}\right)^{n} \left(\prod_{i=1}^{n} x_i(1-x_i)^4\right)^{\theta} \left(\prod_{i=1}^{n} x_i(1-x_i)\right)^{-1} = g(T(\mathbf{x}); \theta)\, h(\mathbf{x}),$$
where
$$T(\mathbf{x}) = \prod_{i=1}^{n} x_i(1-x_i)^4, \qquad g(t; \theta) = \left(\frac{\Gamma(5\theta)}{\Gamma(\theta)\Gamma(4\theta)}\right)^{n} t^{\theta}, \qquad h(\mathbf{x}) = \left(\prod_{i=1}^{n} x_i(1-x_i)\right)^{-1}.$$
Therefore (Factorization Theorem)
$$T(\mathbf{X}) = \prod_{i=1}^{n} X_i(1-X_i)^4$$
is a one-dimensional sufficient statistic. Any one-to-one transformation of a sufficient statistic is also a sufficient statistic; therefore, for example, each
$$T_k(\mathbf{X}) = [T(\mathbf{X})]^k, \quad k = 2, 3, \dots,$$
is a sufficient statistic.

(b) Can the method of moments estimator (MME) of $\theta$ be found using the first moment (expectation)? Find the MME using the second moment.

Solution. The first moment is $\mathrm{E}X = \theta/(\theta + 4\theta) = 1/5$ independently of $\theta$, therefore it cannot be used. Let $\mu_2$ be the second moment. Then
$$\mu_2 = \frac{\theta(\theta+1)}{5\theta(5\theta+1)} = \frac{1}{25} + \frac{4}{25(5\theta+1)},$$
or
$$\theta = \frac{1 - 5\mu_2}{25\mu_2 - 1}.$$
Thus the MME is
$$\hat\theta = \frac{1 - 5m_2}{25m_2 - 1}, \qquad \text{where } m_2 = \frac{1}{n}\sum_{i=1}^{n} X_i^2.$$

Problem 2

Let $X_1, \dots, X_n$ be iid from a geometric distribution with parameter $\theta$, i.e. from a distribution with pmf
$$\theta(1-\theta)^{x-1}, \quad x = 1, 2, \dots;\ 0 < \theta \le 1.$$

(a) Find the maximum likelihood estimator $\hat\tau_1$ of $\tau(\theta) = 1/\theta$.

Solution. The MLE of $\theta$ is the solution of the equation
$$\frac{\partial \ln f(\mathbf{x} \mid \theta)}{\partial \theta} = \frac{n}{\theta} - \frac{\sum_{i=1}^{n} x_i - n}{1-\theta} = 0,$$
which gives $n = \theta \sum_{i=1}^{n} x_i$, that is,
$$\hat\theta_{\mathrm{MLE}} = \frac{1}{\bar X}.$$
Therefore (the invariance property of the MLE) $\hat\tau_1 = \bar X$.

(b) Find the asymptotic variance of the estimator $\hat\tau_1$.

Solution. The asymptotic variance is (see theory)
$$v(\theta) = \frac{[\tau'(\theta)]^2}{I_0(\theta)},$$
where $I_0(\theta)$ is the Fisher information of one observation,
$$I_0(\theta) = -\mathrm{E}\,\frac{\partial^2 \ln f(X_i \mid \theta)}{\partial \theta^2} = \frac{1}{\theta^2(1-\theta)}.$$
Therefore, since $\tau'(\theta) = -1/\theta^2$, the asymptotic variance is
$$v(\theta) = \frac{1-\theta}{\theta^2}.$$
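As a numerical sanity check (not part of the original exam solution), the factorization identity from Problem 1(a) and the second-moment MME from 1(b) can be verified by simulation. The true value $\theta = 2$, the sample sizes and the seed below are arbitrary illustration choices; only the Python standard library is used.

```python
import math
import random

random.seed(1)

theta_true = 2.0
n = 200_000  # large sample so the moment estimate is stable
x = [random.betavariate(theta_true, 4 * theta_true) for _ in range(n)]

# --- Factorization check (Problem 1a) on a small subsample ---
# log f(x | theta) should equal log g(T(x); theta) + log h(x),
# where T(x) = prod x_i (1 - x_i)^4 and h(x) = [prod x_i (1 - x_i)]^{-1}.
xs = x[:5]
m = len(xs)
sum_log_x = sum(math.log(xi) for xi in xs)
sum_log_1mx = sum(math.log(1 - xi) for xi in xs)

def log_lik(theta):
    const = m * (math.lgamma(5 * theta) - math.lgamma(theta) - math.lgamma(4 * theta))
    return const + (theta - 1) * sum_log_x + (4 * theta - 1) * sum_log_1mx

def log_g_plus_log_h(theta):
    const = m * (math.lgamma(5 * theta) - math.lgamma(theta) - math.lgamma(4 * theta))
    log_T = sum_log_x + 4 * sum_log_1mx   # log of the sufficient statistic
    log_h = -(sum_log_x + sum_log_1mx)    # log h(x)
    return const + theta * log_T + log_h

for theta in (0.5, 2.0, 7.0):
    assert abs(log_lik(theta) - log_g_plus_log_h(theta)) < 1e-9

# --- Method of moments via the second moment (Problem 1b) ---
m2 = sum(xi * xi for xi in x) / n
theta_mme = (1 - 5 * m2) / (25 * m2 - 1)
print(theta_mme)  # should land near theta_true = 2
```

With the exact second moment $\mu_2 = 3/55 \approx 0.0545$ the formula returns exactly 2, so the simulated estimate should be close to that.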
(c) Find an asymptotic $(1-\alpha)$ maximum likelihood confidence interval for $\theta$.

Solution. Since
$$\sqrt{n}(\hat\theta - \theta) \xrightarrow{D} N\!\left(0, \frac{1}{I_0(\theta)}\right),$$
the variance is estimated by
$$\widehat{\mathrm{Var}}\,\hat\theta = \frac{1}{n I_0(\hat\theta)} = \frac{\hat\theta^2(1-\hat\theta)}{n} = \frac{\bar X - 1}{n \bar X^3}.$$
Then
$$1-\alpha \approx P\!\left(-z_{\alpha/2} \le \frac{\hat\theta - \theta}{\sqrt{\widehat{\mathrm{Var}}\,\hat\theta}} \le z_{\alpha/2}\right) = P\!\left(\frac{1}{\bar X} - z_{\alpha/2}\sqrt{\frac{\bar X - 1}{n\bar X^3}} \le \theta \le \frac{1}{\bar X} + z_{\alpha/2}\sqrt{\frac{\bar X - 1}{n\bar X^3}}\right).$$
Thus
$$\left[\frac{1}{\bar X} - z_{\alpha/2}\sqrt{\frac{\bar X - 1}{n\bar X^3}},\ \frac{1}{\bar X} + z_{\alpha/2}\sqrt{\frac{\bar X - 1}{n\bar X^3}}\right]$$
is an asymptotic $(1-\alpha)$ maximum likelihood confidence interval for $\theta$.

(d) Suppose that the first ten observations and each even observation are lost, and $\tau(\theta) = 1/\theta$ is estimated by
$$\hat\tau_2 = \frac{2}{n-10}\sum_{i=6}^{n/2} X_{2i-1}$$
(assume for simplicity that the sample size $n$ is always even). Find the asymptotic efficiency of $\hat\tau_2$ (that is, the asymptotic relative efficiency of $\hat\tau_2$ with respect to the asymptotically efficient estimator $\hat\tau_1$, for which all observations are used).

Solution. Since
$$\sqrt{\frac{n-10}{2}}\,(\hat\tau_2 - \tau(\theta)) \xrightarrow{D} N\!\left(0, \frac{[\tau'(\theta)]^2}{I_0(\theta)}\right)$$
and since $n \big/ \frac{n-10}{2} \to 2$, we have
$$\sqrt{n}\,(\hat\tau_2 - \tau(\theta)) \xrightarrow{D} N\!\left(0, \frac{2[\tau'(\theta)]^2}{I_0(\theta)}\right).$$
Therefore the asymptotic efficiency of $\hat\tau_2$ is $1/2$.

Problem 3

Let $X_1, \dots, X_n$ be a random sample drawn from a Poisson distribution with parameter $\theta$.

(a) Show that for testing $H_0\colon \theta \le \theta_0$ versus $H_1\colon \theta > \theta_0$, the rejection region of a uniformly most powerful test has the form
$$R = \left\{\mathbf{x} : \sum x_i \ge c\right\}.$$
Let $\alpha$ be the significance level. Find (approximately) $c$ if $n$ is large enough so that the Central Limit Theorem can be used.

Solution. The likelihood function is
$$L(\theta; \mathbf{x}) = e^{-n\theta}\,\theta^{\sum x_i}\left(\prod x_i!\right)^{-1};$$
therefore, if $\theta'' > \theta'$, the ratio
$$\frac{L(\theta''; \mathbf{x})}{L(\theta'; \mathbf{x})} = e^{n(\theta' - \theta'')}\left(\frac{\theta''}{\theta'}\right)^{\sum x_i}$$
is a monotone (increasing) function of $T(\mathbf{x}) = \sum x_i$. Therefore the rejection region of the UMP test has the form
$$R = \left\{\mathbf{x} : \sum x_i \ge c\right\}.$$
To find $c$ let us use the CLT. We have $\mathrm{E}X_i = \theta$, $\mathrm{Var}(X_i) = \theta$; therefore
$$\alpha = P_{\theta_0}\!\left(\sum X_i \ge c\right) = P_{\theta_0}\!\left(\frac{\sum X_i - n\theta_0}{\sqrt{n\theta_0}} \ge \frac{c - n\theta_0}{\sqrt{n\theta_0}}\right) \approx 1 - \Phi\!\left(\frac{c - n\theta_0}{\sqrt{n\theta_0}}\right),$$
and
$$c = n\theta_0 + \sqrt{n\theta_0}\, z_\alpha.$$
Thus the hypothesis is accepted if
$$\sum X_i < n\theta_0 + \sqrt{n\theta_0}\, z_\alpha, \quad \text{or} \quad \bar X < \theta_0 + \sqrt{\frac{\theta_0}{n}}\, z_\alpha.$$
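The CLT cutoff from part (a), and the normal-approximation power function derived in part (c), can be sketched numerically. This is an illustration rather than part of the exam solution; the values of $n$, $\theta_0$ and $\alpha$ are arbitrary, and `statistics.NormalDist` supplies $\Phi$ and the $z$-quantiles.

```python
import math
from statistics import NormalDist

N = NormalDist()  # standard normal

def cutoff(n, theta0, alpha):
    # Reject H0 when sum(X_i) >= c = n*theta0 + sqrt(n*theta0) * z_alpha
    z = N.inv_cdf(1 - alpha)
    return n * theta0 + math.sqrt(n * theta0) * z

def power(theta, n, theta0, alpha):
    # pi(theta) ~= 1 - Phi((c - n*theta) / sqrt(n*theta))
    c = cutoff(n, theta0, alpha)
    return 1 - N.cdf((c - n * theta) / math.sqrt(n * theta))

n, theta0, alpha = 100, 2.0, 0.05
# At theta = theta0 the normalized cutoff is exactly z_alpha,
# so the approximate power there equals alpha.
print(round(power(theta0, n, theta0, alpha), 4))  # 0.05
print(power(2.5, n, theta0, alpha))  # well above alpha
print(power(1.5, n, theta0, alpha))  # near zero
```

The three printed values illustrate the limits found in part (c): the power sits at $\alpha$ at $\theta_0$ and climbs toward 1 as $\theta$ grows.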
(b) Prove that the test of part (a) is unbiased.

Solution. We first prove the following. Let $Y_1 \sim \mathrm{Poisson}(\lambda_1)$, $Y_2 \sim \mathrm{Poisson}(\lambda_2)$, and $\lambda_1 \le \lambda_2$. Then $P(Y_2 \ge c) \ge P(Y_1 \ge c)$ for any $c \ge 0$. Indeed, consider $Y_3$ independent of $Y_1$ and such that $Y_3 \sim \mathrm{Poisson}(\lambda_2 - \lambda_1)$. Then $Y_1 + Y_3 \sim \mathrm{Poisson}(\lambda_2)$, and
$$P(Y_2 \ge c) = P(Y_1 + Y_3 \ge c) \ge P(Y_1 \ge c).$$
Now part (b) follows from the fact that
$$\sum X_i \sim \mathrm{Poisson}(n\theta),$$
so the power function is nondecreasing in $\theta$: it is at most $\alpha$ under $H_0$ and at least $\alpha$ under $H_1$. A less strong but also valid solution is based on the normal approximation (see the solution of part (c)).

(c) Find (approximately) and plot the power function $\pi(\theta)$ of the test of part (a). Find, in particular, $\lim_{\theta \to 0}\pi(\theta)$, $\pi(\theta_0)$ and $\lim_{\theta \to \infty}\pi(\theta)$.

Solution.
$$\pi(\theta) = P_\theta\!\left(\sum X_i \ge c\right) = P_\theta\!\left(\frac{\sum X_i - n\theta}{\sqrt{n\theta}} \ge \frac{n(\theta_0 - \theta) + \sqrt{n\theta_0}\, z_\alpha}{\sqrt{n\theta}}\right) \approx 1 - \Phi\!\left(\frac{n(\theta_0 - \theta) + \sqrt{n\theta_0}\, z_\alpha}{\sqrt{n\theta}}\right).$$
Simple analysis shows that
$$\lim_{\theta \to 0}\pi(\theta) = 0, \qquad \pi(\theta_0) = \alpha, \qquad \lim_{\theta \to \infty}\pi(\theta) = 1.$$
(The plot is an increasing curve from 0 to 1 passing through the level $\alpha$ at $\theta = \theta_0$; the original figure is not reproduced here.)

(d) Find the $(1-\alpha)$ one-sided confidence interval that results from inverting the test of part (a).

Solution. Inverting the test of part (a), i.e. solving the inequality
$$\bar X \le \theta + \sqrt{\frac{\theta}{n}}\, z_\alpha$$
with respect to $\theta$, we obtain the following $(1-\alpha)$ one-sided confidence interval:
$$\left[\frac{1}{4}\left(\sqrt{\frac{z_\alpha^2}{n} + 4\bar X} - \frac{z_\alpha}{\sqrt{n}}\right)^{2},\ \infty\right).$$
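The MLE and the asymptotic interval of Problem 2(a)-(c) can also be checked by simulation. The inverse-CDF sampler `rgeom`, the seed and the parameter values below are illustration assumptions, not part of the exam; only the standard library is used.

```python
import math
import random
from statistics import NormalDist

random.seed(2)

theta_true = 0.3
n = 10_000

def rgeom(theta):
    # Geometric(theta) on {1, 2, ...} by inverting the CDF;
    # 1 - random() lies in (0, 1], so the log is always defined.
    u = 1 - random.random()
    return math.floor(math.log(u) / math.log(1 - theta)) + 1

x = [rgeom(theta_true) for _ in range(n)]
xbar = sum(x) / n

theta_hat = 1 / xbar   # MLE of theta
tau_hat = xbar         # MLE of tau = 1/theta, by invariance

# Asymptotic (1 - alpha) interval: 1/xbar +- z_{alpha/2} * sqrt((xbar-1)/(n*xbar^3))
alpha = 0.05
z = NormalDist().inv_cdf(1 - alpha / 2)
se = math.sqrt((xbar - 1) / (n * xbar ** 3))
ci = (theta_hat - z * se, theta_hat + z * se)
print(theta_hat, ci)
```

With $n = 10{,}000$ the estimate should sit within a few thousandths of the true $\theta = 0.3$, and the interval is narrow.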
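The efficiency result of Problem 2(d) can be checked by a small Monte Carlo experiment: the variance of $\hat\tau_2$ (built from $(n-10)/2$ odd-indexed observations) should be roughly twice that of $\hat\tau_1 = \bar X$. The sampler, seed, $\theta$, $n$ and replication count are illustration choices.

```python
import math
import random

random.seed(3)
theta = 0.4
n, reps = 400, 2000

def rgeom(theta):
    # Geometric(theta) on {1, 2, ...} via the inverse CDF
    u = 1 - random.random()
    return math.floor(math.log(u) / math.log(1 - theta)) + 1

tau1, tau2 = [], []
for _ in range(reps):
    x = [rgeom(theta) for _ in range(n)]          # x[0] is X_1, ..., x[n-1] is X_n
    tau1.append(sum(x) / n)                        # uses all n observations
    kept = x[10::2]                                # X_11, X_13, ..., X_{n-1}: (n-10)/2 values
    tau2.append(sum(kept) / len(kept))

def var(v):
    m = sum(v) / len(v)
    return sum((vi - m) ** 2 for vi in v) / (len(v) - 1)

ratio = var(tau2) / var(tau1)
print(ratio)  # near 2, i.e. asymptotic efficiency of tau2 is about 1/2
```

For finite $n$ the expected ratio is $2n/(n-10) \approx 2.05$ here, consistent with the asymptotic value 2.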
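Finally, the inversion algebra of Problem 3(d) can be verified numerically: the lower bound is the positive root of $u^2 + (z_\alpha/\sqrt{n})u - \bar X = 0$ in $u = \sqrt\theta$, so plugging it back into the acceptance boundary must give equality. The values of $n$, $\alpha$ and $\bar X$ below are arbitrary.

```python
import math
from statistics import NormalDist

def lower_bound(xbar, n, alpha):
    # Smallest theta satisfying xbar <= theta + z_alpha * sqrt(theta/n)
    z = NormalDist().inv_cdf(1 - alpha)
    return 0.25 * (math.sqrt(z * z / n + 4 * xbar) - z / math.sqrt(n)) ** 2

n, alpha, xbar = 100, 0.05, 2.3
lo = lower_bound(xbar, n, alpha)
z = NormalDist().inv_cdf(1 - alpha)
# At theta = lo the boundary holds with equality, so this difference is ~0:
print(lo, lo + z * math.sqrt(lo / n) - xbar)
```

The confidence set is then $[\,\text{lo}, \infty)$, a one-sided interval for $\theta$.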
