Moment generating function

eddyboadu · 26 slides · Dec 02, 2011

Slide Content

EDWIN OKOAMPA BOADU

DEFINITION
Definition 2.3.3. Let X be a random variable with cdf F_X. The moment generating function (mgf) of X (or of F_X), denoted M_X(t), is

M_X(t) = E\left[ e^{tX} \right],

provided that the expectation exists for t in some neighborhood of 0. That is, there is an h > 0 such that, for all t in -h < t < h, E[e^{tX}] exists. If the expectation does not exist in a neighborhood of 0, we say that the moment generating function does not exist.
More explicitly, the moment generating function can be defined as:

M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx for continuous random variables, and

M_X(t) = \sum_x e^{tx} P[X = x] for discrete random variables.

Theorem 2.3.2: If X has mgf M_X(t), then

E\left[ X^n \right] = M_X^{(n)}(0),

where we define

M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}.
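The theorem can be sanity-checked symbolically. The sketch below (not from the slides) uses the standard closed-form mgf of an Exponential(lam) variable as a hypothetical test case, assuming sympy:

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)

# Known mgf of an Exponential(lam) variable
M = lam / (lam - t)

# E[X^n] = M^(n)(0): take the nth derivative and evaluate at t = 0
moments = [sp.diff(M, t, n).subs(t, 0) for n in (1, 2, 3)]

# Exponential moments are n!/lam^n, matching the derivatives above
assert moments == [1 / lam, 2 / lam**2, 6 / lam**3]
```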

First note that e^{tx} can be approximated around zero using a Taylor series expansion:

M_X(t) = E\left[ e^{tx} \right] = E\left[ e^0 + e^0 x (t - 0) + \tfrac{1}{2} e^0 x^2 (t - 0)^2 + \tfrac{1}{6} e^0 x^3 (t - 0)^3 + \cdots \right]
= 1 + E[x]\,t + \tfrac{1}{2} E\left[x^2\right] t^2 + \tfrac{1}{6} E\left[x^3\right] t^3 + \cdots
L

Note for any moment n:

\frac{d^n}{dt^n} M_X(t) = M_X^{(n)}(t) = E\left[x^n\right] + E\left[x^{n+1}\right] t + \tfrac{1}{2} E\left[x^{n+2}\right] t^2 + \cdots

Thus, as t \to 0,

M_X^{(n)}(0) = E\left[x^n\right].

Leibnitz’s Rule: If f(x, θ), a(θ), and b(θ) are differentiable with respect to θ, then

\frac{d}{d\theta} \int_{a(\theta)}^{b(\theta)} f(x, \theta)\,dx = f(b(\theta), \theta)\,\frac{d}{d\theta} b(\theta) - f(a(\theta), \theta)\,\frac{d}{d\theta} a(\theta) + \int_{a(\theta)}^{b(\theta)} \frac{\partial}{\partial\theta} f(x, \theta)\,dx.

Casella and Berger proof: Assuming that we can differentiate under the integral sign using Leibnitz’s rule, we have

\frac{d}{dt} M_X(t) = \frac{d}{dt} \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = \int_{-\infty}^{\infty} \left( \frac{d}{dt} e^{tx} \right) f(x)\,dx = \int_{-\infty}^{\infty} x e^{tx} f(x)\,dx.

Letting t \to 0, this integral simply becomes

\int_{-\infty}^{\infty} x f(x)\,dx = E[x].

This proof can be extended to any moment of the distribution function.
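The interchange of derivative and integral can be checked for a concrete density. The sketch below (not from the slides) uses the Uniform(0,1) density as a hypothetical test case, assuming sympy:

```python
import sympy as sp

t = sp.symbols('t', positive=True)
x = sp.symbols('x', real=True)

# Differentiating the mgf integral of Uniform(0,1) in t ...
lhs = sp.diff(sp.integrate(sp.exp(t * x), (x, 0, 1)), t)
# ... matches moving d/dt inside the integral, as Leibnitz's rule allows.
rhs = sp.integrate(x * sp.exp(t * x), (x, 0, 1))
assert sp.simplify(lhs - rhs) == 0

# As t -> 0 the inner integral tends to E[x] = 1/2 for Uniform(0,1).
mean = sp.limit(rhs, t, 0)
```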

Moment Generating Functions for Specific Distributions

Application to the Uniform Distribution:

M_X(t) = \int_a^b \frac{e^{tx}}{b - a}\,dx = \frac{e^{tx}}{t(b - a)} \Big|_a^b = \frac{e^{bt} - e^{at}}{t(b - a)}.

Following the expansion developed earlier, we have:

M_X(t) = \frac{\left( 1 + bt + \tfrac{1}{2} b^2 t^2 + \tfrac{1}{6} b^3 t^3 + \cdots \right) - \left( 1 + at + \tfrac{1}{2} a^2 t^2 + \tfrac{1}{6} a^3 t^3 + \cdots \right)}{t(b - a)}
= 1 + \frac{1}{2}\frac{b^2 - a^2}{b - a}\,t + \frac{1}{6}\frac{b^3 - a^3}{b - a}\,t^2 + \cdots
= 1 + \frac{1}{2}(b + a)\,t + \frac{1}{6}\left( b^2 + ab + a^2 \right) t^2 + \cdots

Letting b = 1 and a = 0, the last expression becomes:

M_X(t) = 1 + \tfrac{1}{2} t + \tfrac{1}{6} t^2 + \tfrac{1}{24} t^3 + \cdots

The first three moments of the uniform distribution are then:

M_X^{(1)}(0) = \tfrac{1}{2}
M_X^{(2)}(0) = 2\left( \tfrac{1}{6} \right) = \tfrac{1}{3}
M_X^{(3)}(0) = 6\left( \tfrac{1}{24} \right) = \tfrac{1}{4}
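These three values can be recovered from the series coefficients directly; a short sketch (not from the slides), assuming sympy:

```python
import sympy as sp

t = sp.symbols('t')

# Uniform(0,1) mgf and its Taylor expansion around t = 0
M = (sp.exp(t) - 1) / t
expansion = sp.series(M, t, 0, 4).removeO()

# The coefficient of t**n is E[X^n]/n!, so multiply back by n!
moments = [expansion.coeff(t, n) * sp.factorial(n) for n in (1, 2, 3)]
assert moments == [sp.Rational(1, 2), sp.Rational(1, 3), sp.Rational(1, 4)]
```

The moments 1/2, 1/3, 1/4 agree with E[X^n] = 1/(n+1) for Uniform(0,1).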

Application to the Univariate Normal Distribution:

M_X(t) = \int_{-\infty}^{\infty} e^{tx}\,\frac{1}{\sqrt{2\pi\sigma^2}}\,e^{-\frac{1}{2}\frac{(x - \mu)^2}{\sigma^2}}\,dx = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[ tx - \frac{1}{2}\frac{(x - \mu)^2}{\sigma^2} \right] dx.

Focusing on the term in the exponent, we have

tx - \frac{1}{2}\frac{(x - \mu)^2}{\sigma^2} = -\frac{1}{2\sigma^2}\left[ (x - \mu)^2 - 2\sigma^2 t x \right]
= -\frac{1}{2\sigma^2}\left[ x^2 - 2\mu x + \mu^2 - 2\sigma^2 t x \right]
= -\frac{1}{2\sigma^2}\left[ x^2 - 2\left( \mu + \sigma^2 t \right) x + \mu^2 \right]

The next step is to complete the square in the numerator:

x^2 - 2\left( \mu + \sigma^2 t \right) x + \mu^2 + c = \left( x - \left( \mu + \sigma^2 t \right) \right)^2
\Rightarrow x^2 - 2\left( \mu + \sigma^2 t \right) x + \mu^2 + 2\mu\sigma^2 t + \sigma^4 t^2 = \left( x - \mu - \sigma^2 t \right)^2
\Rightarrow c = 2\mu\sigma^2 t + \sigma^4 t^2.

The complete expression then becomes:

-\frac{x^2 - 2\left( \mu + \sigma^2 t \right) x + \mu^2}{2\sigma^2} = -\frac{\left( x - \mu - \sigma^2 t \right)^2 - 2\mu\sigma^2 t - \sigma^4 t^2}{2\sigma^2}
= -\frac{\left( x - \mu - \sigma^2 t \right)^2}{2\sigma^2} + \mu t + \frac{1}{2}\sigma^2 t^2.

The moment generating function then becomes:

M_X(t) = \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right) \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{\left( x - \mu - \sigma^2 t \right)^2}{2\sigma^2} \right) dx = \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right),

since the remaining integrand is a normal density (with mean μ + σ²t and variance σ²) and therefore integrates to one.
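The whole derivation can be checked in one step by evaluating the integral symbolically; a minimal sketch (not from the slides), assuming sympy can evaluate the Gaussian integral:

```python
import sympy as sp

x, t, mu = sp.symbols('x t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# N(mu, sigma^2) density
f = sp.exp(-(x - mu)**2 / (2 * sigma**2)) / sp.sqrt(2 * sp.pi * sigma**2)

# Integrating exp(t*x) against the density reproduces the closed form
M = sp.integrate(sp.exp(t * x) * f, (x, -sp.oo, sp.oo))
closed = sp.exp(mu * t + sigma**2 * t**2 / 2)
assert sp.simplify(M - closed) == 0
```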

Taking the first derivative with respect to t, we get:

M_X^{(1)}(t) = \left( \mu + \sigma^2 t \right) \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right).

Letting t \to 0, this becomes:

M_X^{(1)}(0) = \mu.

The second derivative of the moment generating function with respect to t yields:

M_X^{(2)}(t) = \sigma^2 \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right) + \left( \mu + \sigma^2 t \right)^2 \exp\left( \mu t + \frac{1}{2}\sigma^2 t^2 \right).

Again, letting t \to 0 yields:

M_X^{(2)}(0) = \sigma^2 + \mu^2.
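Both derivative evaluations can be verified mechanically; a short sketch (not from the slides), assuming sympy:

```python
import sympy as sp

t, mu = sp.symbols('t mu', real=True)
sigma = sp.symbols('sigma', positive=True)

# Normal mgf derived above
M = sp.exp(mu * t + sigma**2 * t**2 / 2)

first = sp.diff(M, t).subs(t, 0)      # E[X]   = mu
second = sp.diff(M, t, 2).subs(t, 0)  # E[X^2] = sigma^2 + mu^2
assert first == mu
assert sp.expand(second) == sigma**2 + mu**2
```

Note E[X^2] = σ² + μ² is consistent with Var(X) = E[X^2] - (E[X])² = σ².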

Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Consider their sum Z = X + Y and its moment generating function:

M_Z(t) = E\left[ e^{tz} \right] = E\left[ e^{t(x + y)} \right] = E\left[ e^{tx} e^{ty} \right] = E\left[ e^{tx} \right] E\left[ e^{ty} \right] = M_X(t)\,M_Y(t).

We conclude that the moment generating function of the sum of two independent random variables equals the product of the moment generating functions of the individual variables.
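This product rule can be illustrated with two normal variables; a minimal sketch (not from the slides), assuming sympy:

```python
import sympy as sp

t = sp.symbols('t', real=True)
mu1, mu2 = sp.symbols('mu1 mu2', real=True)
s1, s2 = sp.symbols('sigma1 sigma2', positive=True)

# mgfs of two independent normal variables
M_X = sp.exp(mu1 * t + s1**2 * t**2 / 2)
M_Y = sp.exp(mu2 * t + s2**2 * t**2 / 2)

# Their product is the mgf of N(mu1 + mu2, sigma1^2 + sigma2^2),
# identifying the distribution of the sum X + Y.
M_sum = sp.exp((mu1 + mu2) * t + (s1**2 + s2**2) * t**2 / 2)
assert sp.simplify(M_X * M_Y - M_sum) == 0
```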

Skipping ahead slightly, the multivariate normal distribution function can be written as:

f(x) = \frac{1}{(2\pi)^{n/2} |\Sigma|^{1/2}} \exp\left( -\frac{1}{2} (x - \mu)' \Sigma^{-1} (x - \mu) \right),

where Σ is the variance matrix and μ is a vector of means.

In order to derive the moment generating function, we now need a vector t. The moment generating function can then be defined as:

M_X(t) = \exp\left( t'\mu + \frac{1}{2}\, t' \Sigma\, t \right).

Normal variables are independent if the variance
matrix is a diagonal matrix.
Note that if the variance matrix is diagonal, the
moment generating function for the normal can be
written as:

M_X(t) = \exp\left( t'\mu + \frac{1}{2}\, t' \begin{bmatrix} \sigma_1^2 & 0 & 0 \\ 0 & \sigma_2^2 & 0 \\ 0 & 0 & \sigma_3^2 \end{bmatrix} t \right)
= \exp\left( \mu_1 t_1 + \mu_2 t_2 + \mu_3 t_3 + \frac{1}{2}\left( \sigma_1^2 t_1^2 + \sigma_2^2 t_2^2 + \sigma_3^2 t_3^2 \right) \right)
= \exp\left( \mu_1 t_1 + \frac{1}{2}\sigma_1^2 t_1^2 \right) \exp\left( \mu_2 t_2 + \frac{1}{2}\sigma_2^2 t_2^2 \right) \exp\left( \mu_3 t_3 + \frac{1}{2}\sigma_3^2 t_3^2 \right)
= M_{X_1}(t_1)\,M_{X_2}(t_2)\,M_{X_3}(t_3).
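The factorization under a diagonal variance matrix can be checked numerically; a short sketch (not from the slides) with hypothetical numeric values, assuming numpy:

```python
import numpy as np

# Hypothetical numeric values for a 3-variable diagonal case
mu = np.array([1.0, -2.0, 0.5])
var = np.array([0.5, 1.0, 2.0])   # diagonal of Sigma
Sigma = np.diag(var)
t = np.array([0.3, -0.1, 0.2])    # an arbitrary vector t

# Joint mgf exp(t'mu + t'Sigma t / 2) ...
joint = np.exp(t @ mu + 0.5 * t @ Sigma @ t)
# ... equals the product of the three univariate mgfs when Sigma is diagonal
product = np.prod(np.exp(mu * t + 0.5 * var * t**2))
assert np.isclose(joint, product)
```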
% % %
Tags