Point Estimation

RuhulAmin399 5,088 views 18 slides Feb 25, 2019

About This Presentation

According to Wikipedia, point estimation involves the use of sample data to calculate a single value (known as a point estimate, since it identifies a point in some parameter space) which serves as a "best guess" or "best estimate" of an unknown population parameter (for examp...


Slide Content

Welcome

Department of Statistics, Pabna University of Science & Technology

PRESENTATION ON POINT ESTIMATION. Group I. Group members: Afsana Afroz (161605), Ruhul Amin (161625), Sultana Ismat Jahan (161619), Masud Rana (161633), Mst. Zarin Tasnim (161601), Angana Basak (161611).

Table of contents: Introduction; Estimation; Types of estimation; Point estimation; Methods of point estimation; Properties of point estimation.

Introduction. One of the main objectives of statistics is to draw inferences about a population from the analysis of a sample drawn from that population. Two important problems in statistical inference are: 1. Estimation. 2. Testing of hypotheses.

Estimation. Estimation is the process of finding an estimate. Estimate: an estimate is the numerical value obtained for a population parameter. Estimator: an estimator is a rule or formula used to estimate an unknown parameter. There are two types of estimation: 1. Point estimation. 2. Interval estimation.

Point estimation

Point estimation. Point estimation involves the use of sample data to calculate a single value of an unknown population parameter. Point estimate: a point estimate is the single numerical value obtained for a population parameter. Point estimator: a point estimator is a rule or formula that produces a single estimate of an unknown population parameter. Example of point estimation: suppose we want to estimate the proportion of people in Bangladesh aged 18-24 who own a smartphone.
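The smartphone example can be sketched numerically. The survey counts below are made up for illustration; the point estimator here is the sample proportion:

```python
# Point estimate of a population proportion (illustrative, made-up survey data).
# Suppose we survey 500 people aged 18-24 and 410 report owning a smartphone.
n_surveyed = 500
n_with_phone = 410

# The sample proportion p_hat is the point estimate of the
# unknown population proportion p.
p_hat = n_with_phone / n_surveyed
print(p_hat)  # 0.82
```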

Advantages and disadvantages of point estimation. Advantage: point estimation gives a single, concrete value as the estimate of an unknown population parameter; a single number that represents the unknown parameter is easily obtained from the sample. Disadvantage: a point estimate is not an appropriate inference in every situation, since it carries no measure of its own uncertainty; in particular, for a continuous parameter the probability that the point estimate exactly equals the true value is essentially zero.

Methods of estimation: Maximum likelihood method (MLE). Least squares method. Minimum chi-square method. Method of moments. Bayesian method.

Maximum likelihood method. Suppose the data X1, X2, ..., Xn have joint density function f(x1, x2, ..., xn; θ1, θ2, ..., θp), where θ = (θ1, θ2, ..., θp) are unknown parameters and x1, ..., xn are the observed sample values. Regarded as a function of θ, f is called the likelihood function. The maximum likelihood estimates (MLEs) θ̂1, ..., θ̂p are those values of the parameters for which f(x1, x2, ..., xn; θ̂1, ..., θ̂p) ≥ f(x1, x2, ..., xn; θ1, θ2, ..., θp) for all θ1, θ2, ..., θp. When the Xi's are substituted in place of the xi's, the maximum likelihood estimators result. Maximum likelihood estimation (MLE) is a method for finding the parameter values under which the assumed density function is most likely to have generated the data.
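As a small sketch of this idea, the Bernoulli (coin-flip) model has a closed-form MLE. The data below are made up, and the grid search is only a numerical cross-check of the analytic answer:

```python
import math

# MLE for a Bernoulli(p) sample. The log-likelihood is
#   l(p) = k*log(p) + (n - k)*log(1 - p),   k = number of successes.
# Setting dl/dp = 0 gives the closed-form MLE  p_hat = k / n.
data = [1, 0, 1, 1, 0, 1, 1, 1, 0, 1]  # made-up sample of 0/1 outcomes
n = len(data)
k = sum(data)
p_hat_closed = k / n  # 0.7

# Cross-check: evaluate the log-likelihood on a grid of p values
# and confirm it peaks at the same place.
def log_lik(p):
    return k * math.log(p) + (n - k) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_hat_grid = max(grid, key=log_lik)
print(p_hat_closed, p_hat_grid)  # 0.7 0.7
```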

Method of moments. The method of moments is a way to estimate population parameters, such as the population mean or the population standard deviation. The basic idea is that you take known facts about the population and extend them to a sample. For example, it is a fact that within a population the expected value is E(X) = μ. For a sample, the method-of-moments estimator is just the sample mean x̄, whose formula is x̄ = (x1 + x2 + ... + xn)/n = (1/n) Σ xi. This is the first moment condition.
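A minimal sketch of the moment conditions, using made-up observations (the second moment condition for the variance is added here only to show how the method extends beyond the mean):

```python
# Method of moments sketch: equate population moments to sample moments
# and solve for the parameters. The data are made up for illustration.
sample = [4.2, 5.1, 3.8, 4.9, 5.0, 4.6]
n = len(sample)

# First moment condition:  E(X) = mu  ->  mu_hat = x_bar (the sample mean).
mu_hat = sum(sample) / n

# Second moment condition:  E(X^2) = sigma^2 + mu^2  ->
#   sigma2_hat = (1/n) * sum(x_i^2) - x_bar^2
sigma2_hat = sum(x * x for x in sample) / n - mu_hat ** 2

print(round(mu_hat, 2), round(sigma2_hat, 4))  # 4.6 and about 0.2167
```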

Properties of point estimation Unbiasedness. Consistency. Efficiency. Sufficiency.

UNBIASEDNESS. Any statistic whose mathematical expectation equals the parameter is called an unbiased estimator of that parameter. A statistic Tn = T(X1, X2, ..., Xn) is said to be an unbiased estimator of γ(θ) if E(Tn) = γ(θ) for all θ ∈ Θ. For example, if X follows a normal distribution with mean μ and variance σ², we have E(X̄) = E((1/n) Σ Xi) = (1/n) Σ E(Xi) = (1/n)(nμ) = μ. So the sample mean X̄ is an unbiased estimator of μ.
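A simulation sketch of unbiasedness (parameters chosen arbitrarily): averaged over many samples, the sample mean lands on μ, while the "divide by n" variance estimator is biased low, since its expectation is (n-1)/n · σ²:

```python
import random

# Simulation: average each estimator over many repeated samples and compare
# the averages to the true parameters. mu, sigma, n are arbitrary choices.
random.seed(0)
mu, sigma, n, reps = 10.0, 2.0, 5, 20000

mean_of_xbar = 0.0   # running average of the sample mean
mean_of_var_n = 0.0  # running average of the divide-by-n variance estimator
for _ in range(reps):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(x) / n
    mean_of_xbar += xbar / reps
    mean_of_var_n += sum((xi - xbar) ** 2 for xi in x) / n / reps

print(round(mean_of_xbar, 2))   # close to mu = 10 (unbiased)
print(round(mean_of_var_n, 2))  # close to (n-1)/n * sigma^2 = 3.2, not 4.0
```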

CONSISTENCY. An estimator Tn = T(X1, X2, ..., Xn) based on a random sample of size n is said to be a consistent estimator of γ(θ) if Tn converges in probability to γ(θ) as n → ∞. For example, if X1, X2, ..., Xn is a random sample from a population with finite mean μ < ∞, then by the weak law of large numbers (WLLN) the sample mean X̄ = (1/n) Σ Xi converges to μ as n → ∞. Hence the sample mean X̄ is always a consistent estimator of the population mean.
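The WLLN behavior can be sketched by simulation (distribution and parameters chosen arbitrarily): as the sample size grows, the sample mean's error shrinks toward zero:

```python
import random

# Consistency sketch: draw samples of increasing size from a population
# with mean mu = 5 and watch |x_bar - mu| shrink. Parameters are arbitrary.
random.seed(1)
mu = 5.0
errors = []
for n in (10, 1000, 100000):
    x = [random.gauss(mu, 3.0) for _ in range(n)]
    errors.append(abs(sum(x) / n - mu))
print(errors)  # errors typically shrink toward 0 as n grows
```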

EFFICIENCY. An estimator is said to be efficient if, within the class of unbiased estimators, it has minimum variance. Example: if T1 is the most efficient estimator, with variance V1, and T2 is any other unbiased estimator, with variance V2, then the efficiency of T2 is defined as E = V1/V2. Since T1 is most efficient, V1 ≤ V2, so E ≤ 1.
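A classic illustration (not from the slides) is comparing the sample mean and sample median as estimators of a normal mean: both are unbiased, but the mean has smaller variance, with relative efficiency approaching 2/π ≈ 0.64:

```python
import random
import statistics

# Relative efficiency sketch: estimate Var(mean) and Var(median) over many
# repeated normal samples. Sample size and repetition count are arbitrary.
random.seed(2)
n, reps = 25, 5000
means, medians = [], []
for _ in range(reps):
    x = [random.gauss(0.0, 1.0) for _ in range(n)]
    means.append(sum(x) / n)
    medians.append(statistics.median(x))

v_mean = statistics.pvariance(means)
v_median = statistics.pvariance(medians)
print(round(v_mean / v_median, 2))  # roughly 0.64..0.7, i.e. about 2/pi
```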

SUFFICIENCY. An estimator is said to be sufficient for a parameter if it contains all the information in the sample regarding that parameter. Example: if T = T(X1, X2, ..., Xn) is an estimator of a parameter θ based on the sample observations X1, X2, ..., Xn of size n from a population with density f(x; θ), such that the conditional distribution of X1, X2, ..., Xn given T does not depend on θ, then T is a sufficient estimator of θ.

Thanks to all