The process of drawing a sample from a population and gathering information about a parameter through a reasonably close function of the sample observations is known as estimation. It is carried out because a population involves unknown parameters, and without knowledge of these we cannot draw conclusions about the population. The value obtained from such a process is called the estimated value, and the function used is called the estimator.
An estimator should possess the qualities stated below in order to perform well. They are:
- Unbiasedness
A statistic t is an unbiased estimator of a parameter θ if E[t] = θ.
If not, then the estimator is biased.
The theorems stated below illustrate this property.
Prove that the sample mean x̄ is an unbiased estimator of the population mean µ.
Let x₁, x₂, …, xₙ be a simple random sample with replacement from a finite population of size N, say X₁, X₂, …, X_N.
To prove: E(x̄) = µ.
While drawing xᵢ, it can be any one of the population members, i.e., the probability distribution of xᵢ can be taken as P(xᵢ = Xⱼ) = 1/N, j = 1, 2, …, N. Therefore,
E(xᵢ) = X₁·(1/N) + X₂·(1/N) + … + X_N·(1/N) = (X₁ + X₂ + … + X_N)/N = µ, i = 1, 2, …, n.
E(x̄) = E[(1/n)(x₁ + x₂ + … + xₙ)]
= (1/n)[E(x₁) + E(x₂) + … + E(xₙ)]
= (1/n)(nµ) = µ.
If the population is infinite, or if the sampling is done without replacement, the same result is obtained.
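The without-replacement case can be checked by brute-force enumeration. The sketch below uses a hypothetical five-value population (chosen only for illustration), lists every ordered sample of size two drawn without replacement, and confirms that the average of all possible sample means equals µ:

```python
from itertools import permutations

# Hypothetical small population, chosen only for illustration.
population = [2, 4, 6, 8, 10]
mu = sum(population) / len(population)

# All ordered samples of size 2 drawn WITHOUT replacement.
samples = list(permutations(population, 2))
sample_means = [sum(s) / len(s) for s in samples]

# Average of all possible sample means equals the population mean.
mean_of_means = sum(sample_means) / len(sample_means)
print(mu, mean_of_means)  # 6.0 6.0
```

By symmetry each population member appears equally often across the samples, which is why the average of the sample means recovers µ exactly.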
The sample variance s² = (1/n) Σ (xᵢ − x̄)² is a biased estimator of the population variance σ².
Let x₁, x₂, …, xₙ be a random sample from an infinite population with mean µ and variance σ².
Then E(xᵢ) = µ and Var(xᵢ) = E(xᵢ − µ)² = σ², where i = 1, 2, …, n.
s² = (1/n) Σ (xᵢ − x̄)²
= (1/n) Σ [(xᵢ − µ) − (x̄ − µ)]², since the variance is unaffected by a change of origin,
= (1/n) Σ (xᵢ − µ)² − (x̄ − µ)²
E(s²) = (1/n) Σ E(xᵢ − µ)² − E(x̄ − µ)²
= σ² − Var(x̄) = σ² − σ²/n = ((n − 1)/n) σ² ≠ σ².
Thus, s² is a biased estimator of σ².
Also, let S² = (1/(n − 1)) Σ (xᵢ − x̄)². Then
E(S²) = (n/(n − 1)) E(s²) = (n/(n − 1)) · ((n − 1)/n) σ² = σ².
Therefore, S² is an unbiased estimator of σ².
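Both results can be verified exactly by enumerating every with-replacement sample from a small population. The sketch below uses the four-value population 3, 7, 11, 15 with samples of size n = 2: the divisor-n estimator averages to ((n − 1)/n)σ², while the divisor-(n − 1) estimator averages to σ²:

```python
from itertools import product

population = [3, 7, 11, 15]
N = len(population)
mu = sum(population) / N
sigma2 = sum((x - mu) ** 2 for x in population) / N  # population variance

n = 2  # sample size
samples = list(product(population, repeat=n))  # all N**n ordered samples

def s2_biased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)        # divisor n

def S2_unbiased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # divisor n - 1

E_s2 = sum(s2_biased(s) for s in samples) / len(samples)
E_S2 = sum(S2_unbiased(s) for s in samples) / len(samples)

print(sigma2)  # 20.0
print(E_s2)    # ((n-1)/n) * sigma2 = 10.0
print(E_S2)    # 20.0
```

The enumeration is exact (no simulation noise), so the bias factor (n − 1)/n appears exactly.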
Example: A population consists of 4 values 3, 7, 11, 15. Draw all possible samples of size two with replacement. Verify that the sample mean is an unbiased estimator of the population mean.
No. of samples = 4² = 16, which are as below:
(3, 3), (3, 7), (3, 11), (3, 15)
(7, 3), (7, 7), (7, 11), (7, 15)
(11, 3), (11, 7), (11, 11), (11, 15)
(15, 3), (15, 7), (15, 11), (15, 15)
Population mean µ = (3 + 7 + 11 + 15)/4 = 36/4 = 9
Sampling distribution of the mean:
x̄:         3  5  7  9  11  13  15
frequency:  1  2  3  4  3   2   1
Mean of the sample means E(x̄) = (1·3 + 2·5 + 3·7 + 4·9 + 3·11 + 2·13 + 1·15)/16 = 144/16 = 9
Since E(x̄) = µ,
Therefore, sample mean is an unbiased estimator of population mean.
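The worked example can be reproduced programmatically. The following sketch enumerates the 16 with-replacement samples and averages their means:

```python
from itertools import product

population = [3, 7, 11, 15]               # the population from the example
mu = sum(population) / len(population)    # 9.0

samples = list(product(population, repeat=2))  # 4**2 = 16 ordered samples
sample_means = [sum(s) / 2 for s in samples]

E_xbar = sum(sample_means) / len(sample_means)
print(E_xbar)  # 9.0
```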
- Consistency
A statistic tₙ obtained from a random sample of size n is said to be a consistent estimator of a parameter θ if it converges in probability to θ as n tends to infinity.
Alternatively, if E[tₙ] → θ and Var[tₙ] → 0 as n → ∞, then the statistic tₙ is said to be a consistent estimator of θ.
When sampling from a normal population N(µ, σ²),
E(x̄) = µ and Var(x̄) = σ²/n → 0 as n → ∞.
Therefore, sample mean is a consistent estimator of population mean.
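A quick simulation illustrates the σ²/n behaviour: the empirical variance of x̄ across repeated samples shrinks as the sample size grows. The population parameters below are arbitrary choices for illustration:

```python
import random

random.seed(0)
mu, sigma = 5.0, 2.0  # arbitrary normal population parameters

def var_of_sample_mean(n, reps=2000):
    # Empirical variance of x-bar across many repeated samples of size n.
    means = [sum(random.gauss(mu, sigma) for _ in range(n)) / n
             for _ in range(reps)]
    m = sum(means) / reps
    return sum((x - m) ** 2 for x in means) / reps

for n in (10, 100, 1000):
    # Shrinks roughly like sigma**2 / n: about 0.4, 0.04, 0.004.
    print(n, var_of_sample_mean(n))
```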
- Efficiency
A parameter may have more than one consistent estimator. Let T₁ and T₂ be two consistent estimators of a parameter θ. If Var(T₁) < Var(T₂) for all n, then T₁ is said to be more efficient than T₂ for all sample sizes.
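As an illustration of relative efficiency, for samples from a normal population the sample mean is more efficient than the sample median, whose variance is roughly πσ²/(2n) for large n. A simulation sketch with arbitrary parameters:

```python
import random
import statistics

random.seed(1)
mu, sigma, n, reps = 0.0, 1.0, 25, 3000

means, medians = [], []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.fmean(xs))
    medians.append(statistics.median(xs))

var_mean = statistics.pvariance(means)      # about sigma**2 / n
var_median = statistics.pvariance(medians)  # about pi * sigma**2 / (2 * n)
print(var_mean, var_median)  # var_mean is the smaller of the two
```

Both estimators are consistent for µ here, but the variance of the mean is smaller (asymptotically by a factor of about 2/π), so the mean is the more efficient of the two.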
- Sufficiency
Let x₁, x₂, …, xₙ be a random sample from a population whose pmf or pdf is f(x, θ). Then a statistic T is said to be a sufficient estimator of θ if
f(x₁, θ)·f(x₂, θ)⋯f(xₙ, θ) = g₁(T, θ)·g₂(x₁, x₂, …, xₙ),
where g₁(T, θ) is the sampling distribution of T and g₂(x₁, x₂, …, xₙ) is independent of θ.
Even though sufficient estimators exist only in certain cases, when a random sample is drawn from a normal population, the sample mean x̄ is a sufficient estimator of µ.
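For instance, with σ² known, the factorization for a normal population can be sketched as follows, using the identity Σ(xᵢ − µ)² = Σ(xᵢ − x̄)² + n(x̄ − µ)²:

```latex
\prod_{i=1}^{n} f(x_i,\mu)
  = (2\pi\sigma^2)^{-n/2}
    \exp\!\Big(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2\Big)
  = \underbrace{\exp\!\Big(-\frac{n(\bar{x}-\mu)^2}{2\sigma^2}\Big)}_{g_1(\bar{x},\,\mu)}
    \cdot
    \underbrace{(2\pi\sigma^2)^{-n/2}
      \exp\!\Big(-\frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\bar{x})^2\Big)}_{g_2(x_1,\dots,x_n)}
```

Here g₁ depends on the data only through x̄, and g₂ does not involve µ, so x̄ is sufficient for µ.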