A geometric distribution has a pdf given by P(X=x) = p(1-p)^x

A geometric distribution has a pdf given by P(X = x) = p(1-p)^x, where x = 0, 1, 2, …, and 0 < p < 1. This form of the geometric starts at x = 0, not at x = 1. Given are the following:

E(X) = (1-p)/p, and Var(X) = (1-p)/p^2

A random sample of size n is drawn; the data are X1, X2, …, Xn.

A. Derive the Fisher information function for the parameter p.

B. Find the Cramér-Rao lower bound (CRLB) for the variance of an unbiased estimator for p.

C. Find a sufficient statistic for the parameter p.

D. Show that the sample mean, xbar, is an unbiased estimate of E(X) = (1-p)/p. Find the variance of xbar.

E. Argue whether or not the sample mean is a minimum variance unbiased estimate (MVUE) of the population mean, μ.

The Correct Answer and Explanation is:

A. Fisher Information Function for Parameter ( p )

The Fisher information function ( I(p) ) is given by:

[
I(p) = -\mathbb{E} \left[ \frac{\partial^2}{\partial p^2} \log L(p) \right]
]

Where ( L(p) ) is the likelihood function of the sample. For a random sample of size ( n ), the likelihood function is:

[
L(p) = \prod_{i=1}^{n} P(X_i = x_i) = \prod_{i=1}^{n} p(1 - p)^{x_i}
]

Taking the logarithm of the likelihood:

[
\log L(p) = \sum_{i=1}^{n} \log \left[ p(1 - p)^{x_i} \right]
= n \log p + \sum_{i=1}^{n} x_i \log(1 - p)
]

Now, we compute the first and second derivatives with respect to ( p ):

  1. First derivative:

[
\frac{\partial}{\partial p} \log L(p) = \frac{n}{p} - \sum_{i=1}^{n} \frac{x_i}{1 - p}
]

  2. Second derivative:

[
\frac{\partial^2}{\partial p^2} \log L(p) = -\frac{n}{p^2} - \sum_{i=1}^{n} \frac{x_i}{(1 - p)^2}
]

The Fisher information is:

[
I(p) = -\mathbb{E} \left[ \frac{\partial^2}{\partial p^2} \log L(p) \right]
]

Since ( \mathbb{E}[X_i] = \frac{1 - p}{p} ), taking the negative expectation term by term gives the Fisher information function:

[
I(p) = \frac{n}{p^2} + \frac{n \mathbb{E}[X_1]}{(1 - p)^2}
= \frac{n}{p^2} + \frac{n}{p(1 - p)}
= \frac{n}{p^2 (1 - p)}
]
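As a quick sanity check on this result (not part of the derivation), the information for a single observation, ( \frac{1}{p^2 (1 - p)} ), can be estimated by simulation as the variance of the score ( \frac{1}{p} - \frac{X}{1 - p} ). The value of ( p ) and the number of draws below are arbitrary choices for illustration:

```python
# Monte Carlo sketch: estimate the per-observation Fisher information as the
# variance of the score U(X) = 1/p - X/(1 - p) and compare with 1/(p^2 (1 - p)).
import numpy as np

rng = np.random.default_rng(0)
p = 0.3
n_draws = 1_000_000

# NumPy's geometric counts trials until the first success (support 1, 2, ...),
# so subtract 1 to match the support x = 0, 1, 2, ... used in this problem.
x = rng.geometric(p, size=n_draws) - 1

score = 1.0 / p - x / (1.0 - p)
print("simulated Var(score):      ", score.var())
print("theoretical 1/(p^2 (1-p)): ", 1.0 / (p**2 * (1.0 - p)))
```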

B. Cramér-Rao Lower Bound (CRLB) for the Variance of an Unbiased Estimator of ( p )

The CRLB is given by:

[
\text{CRLB} = \frac{1}{I(p)}
]

From part A, ( I(p) = \frac{n}{p^2 (1 - p)} ). Therefore, the CRLB is:

[
\text{CRLB} = \frac{p^2 (1 - p)}{n}
]

No unbiased estimator of ( p ) can have a variance smaller than this bound.
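As a quick numerical illustration (the values ( p = 0.5 ) and ( n = 100 ) are arbitrary, not part of the problem):

[
\text{CRLB} = \frac{p^2 (1 - p)}{n} = \frac{(0.5)^2 (0.5)}{100} = 0.00125
]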

C. Sufficient Statistic for ( p )

By the factorization theorem, a statistic ( T(X) ) is sufficient for ( p ) if the likelihood can be written as ( L(p) = g\left(T(x), p\right) h(x) ), where ( g ) depends on the data only through ( T(x) ) and ( h ) does not depend on ( p ). The likelihood function is:

[
L(p) = p^n (1 - p)^{\sum_{i=1}^{n} x_i}
]

This is already in the required form, with ( h(x) = 1 ) and

[
g\left( \sum_{i=1}^{n} x_i, \, p \right) = p^n (1 - p)^{\sum_{i=1}^{n} x_i}
]

Thus, the statistic ( T(X) = \sum_{i=1}^{n} X_i ) is sufficient for ( p ).
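As an illustrative aside (the two small data sets below are made up), any two samples with the same total produce identical likelihood values, which is what sufficiency of ( \sum_{i=1}^{n} X_i ) means in practice:

```python
# Two hypothetical samples with the same sum give the same likelihood
# p^n (1 - p)^{sum x_i}: the likelihood sees the data only through the total.
import numpy as np

def geometric_likelihood(xs, p):
    """Likelihood of an i.i.d. geometric(p) sample with support 0, 1, 2, ..."""
    xs = np.asarray(xs)
    return p ** len(xs) * (1 - p) ** xs.sum()

sample_a = [0, 4, 1, 2]   # total = 7
sample_b = [3, 3, 1, 0]   # total = 7

p = 0.4
print(geometric_likelihood(sample_a, p))   # the two printed values are equal
print(geometric_likelihood(sample_b, p))
```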

D. Unbiased Estimate of ( E(X) = \frac{1 - p}{p} ) and Variance of ( \overline{X} )

The sample mean ( \overline{X} ) is an unbiased estimator of ( E(X) = \frac{1 - p}{p} ) because:

[
\mathbb{E}[\overline{X}] = \frac{1}{n} \sum_{i=1}^{n} \mathbb{E}[X_i] = \mathbb{E}[X_1] = \frac{1 - p}{p}
]

To find the variance of ( \overline{X} ), use the independence of the ( X_i ):

[
\text{Var}(\overline{X}) = \frac{\text{Var}(X_1)}{n}
]

Since ( \text{Var}(X_1) = \frac{1 - p}{p^2} ), we have:

[
\text{Var}(\overline{X}) = \frac{1 - p}{n p^2}
]
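A brief simulation sketch (with arbitrary values of ( p ), ( n ), and the number of replications) that checks both the unbiasedness and the variance formula; note again the shift of NumPy's geometric sampler from support ( \{1, 2, \dots\} ) to ( \{0, 1, 2, \dots\} ):

```python
# Simulation check: the average of many sample means should be close to
# (1 - p)/p, and their variance close to (1 - p)/(n p^2).  Settings arbitrary.
import numpy as np

rng = np.random.default_rng(1)
p, n, reps = 0.25, 40, 200_000

samples = rng.geometric(p, size=(reps, n)) - 1   # shift support to start at 0
xbar = samples.mean(axis=1)

print("mean of xbar:", xbar.mean(), " theory:", (1 - p) / p)
print("var of xbar: ", xbar.var(),  " theory:", (1 - p) / (n * p**2))
```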

E. Is the Sample Mean the Minimum Variance Unbiased Estimator (MVUE) of the Population Mean?

To determine whether ( \overline{X} ) is the MVUE of the population mean ( \mu = \frac{1 - p}{p} ), we check that it is unbiased and that its variance attains the Cramér-Rao lower bound for estimating ( \mu ). From part D, ( \overline{X} ) is unbiased for ( \mu ), and its variance is:

[
\text{Var}(\overline{X}) = \frac{1 - p}{n p^2}
]

The bound from part B applies to unbiased estimators of ( p ) itself; the quantity estimated here is ( \tau(p) = \frac{1 - p}{p} ), so the relevant bound is ( \frac{[\tau'(p)]^2}{I(p)} ). With ( \tau'(p) = -\frac{1}{p^2} ) and ( I(p) = \frac{n}{p^2 (1 - p)} ) from part A, this bound equals ( \frac{1 - p}{n p^2} ), which is exactly ( \text{Var}(\overline{X}) ).

Thus ( \overline{X} ) is an unbiased estimator of ( \mu ) whose variance attains the Cramér-Rao lower bound for ( \mu ), so it is the MVUE of the population mean. The same conclusion follows from the Lehmann-Scheffé theorem, since ( \overline{X} ) is an unbiased function of the complete sufficient statistic ( T(X) = \sum_{i=1}^{n} X_i ).
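Writing out the calculation of that bound:

[
\tau(p) = \frac{1 - p}{p}, \qquad \tau'(p) = -\frac{1}{p^2}, \qquad
\frac{[\tau'(p)]^2}{I(p)} = \frac{1/p^4}{n / \left( p^2 (1 - p) \right)} = \frac{1 - p}{n p^2} = \text{Var}(\overline{X})
]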
