Econometric Analysis 8th Edition Greene Solutions Manual

Chapter 14

Maximum Likelihood Estimation

Exercises

1. The density of the maximum is f(z) = n[z/θ]^(n-1)(1/θ), 0 < z < θ. Therefore, the expected value is
E[z] = ∫₀^θ z n z^(n-1)/θⁿ dz = [θ^(n+1)/(n+1)][n/θⁿ] = nθ/(n+1).
The variance is found likewise,
E[z²] = ∫₀^θ z² n(z/θ)^(n-1)(1/θ) dz = nθ²/(n+2),
so Var[z] = E[z²] - (E[z])² = nθ²/[(n+1)²(n+2)]. Using mean squared convergence we see that lim_(n→∞) E[z] = θ and lim_(n→∞) Var[z] = 0, so that plim z = θ.
2. The log-likelihood is lnL = -n lnθ - (1/θ)Σᵢ₌₁ⁿ xᵢ. The maximum likelihood estimator is obtained as the solution to
∂lnL/∂θ = -n/θ + (1/θ²)Σᵢ₌₁ⁿ xᵢ = 0, or θ̂_ML = (1/n)Σᵢ₌₁ⁿ xᵢ = x̄.
The asymptotic variance of the MLE is
{-E[∂²lnL/∂θ²]}⁻¹ = {-E[n/θ² - (2/θ³)Σᵢ₌₁ⁿ xᵢ]}⁻¹.
To find the expected value of this random variable, we need E[xᵢ] = θ. Therefore, the asymptotic variance is θ²/n. The asymptotic distribution is normal with mean θ and this variance.
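
The following sketch (Python/NumPy; the true θ = 3.0, the sample size, and the number of replications are illustrative assumptions, not values from the text) verifies numerically that the sample mean is the MLE here and that its sampling variance is close to θ²/n.

import numpy as np

rng = np.random.default_rng(7)
theta, n, reps = 3.0, 200, 20_000

x = rng.exponential(scale=theta, size=(reps, n))  # density (1/theta)exp(-x/theta)
theta_hat = x.mean(axis=1)                        # MLE is the sample mean

print("mean of MLE    :", theta_hat.mean(), " (true theta =", theta, ")")
print("variance of MLE:", theta_hat.var(), " asymptotic theta^2/n =", theta**2/n)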

3. The log-likelihood is
lnL = n lnθ - (θ+λ)Σᵢ₌₁ⁿ yᵢ + lnλ Σᵢ₌₁ⁿ xᵢ + Σᵢ₌₁ⁿ xᵢ ln yᵢ - Σᵢ₌₁ⁿ ln(xᵢ!).
The first and second derivatives are
∂lnL/∂θ = n/θ - Σᵢ yᵢ
∂lnL/∂λ = -Σᵢ yᵢ + Σᵢ xᵢ/λ
∂²lnL/∂θ² = -n/θ²
∂²lnL/∂λ² = -Σᵢ xᵢ/λ²
∂²lnL/∂θ∂λ = 0.
Therefore, the maximum likelihood estimators are θ̂_ML = 1/ȳ and λ̂_ML = x̄/ȳ, and the asymptotic covariance matrix is the inverse of
E [ n/θ²       0
    0          Σᵢ xᵢ/λ² ].
In order to complete the derivation, we will require the expected value of Σᵢ₌₁ⁿ xᵢ = nE[xᵢ]. In order to obtain E[xᵢ], it is necessary to obtain the marginal distribution of xᵢ, which is
f(x) = ∫₀^∞ θ e^(-(θ+λ)y) (λy)ˣ/x! dy = λˣ(θ/x!) ∫₀^∞ yˣ e^(-(θ+λ)y) dy.
This is λˣ(θ/x!) times a gamma integral. This is f(x) = λˣ(θ/x!)[Γ(x+1)]/(θ+λ)^(x+1). But, Γ(x+1) = x!, so the expression reduces to
f(x) = [θ/(θ+λ)][λ/(θ+λ)]ˣ.
Thus, x has a geometric distribution with parameter π = θ/(θ+λ). (This is the distribution of the number of tries until the first success of independent trials each with success probability 1-π.) Finally, we require the expected value of xᵢ, which is
E[x] = [θ/(θ+λ)] Σ_(x=0)^∞ x[λ/(θ+λ)]ˣ = λ/θ.
Then, the required asymptotic covariance matrix is
[ n/θ²         0          ]⁻¹     [ θ²/n     0    ]
[ 0            n(λ/θ)/λ²  ]    =  [ 0        θλ/n ].

The maximum likelihood estimator of θ/(θ+λ) is
θ̂/(θ̂+λ̂) = (1/ȳ)/[x̄/ȳ + 1/ȳ] = 1/(1 + x̄).
Its asymptotic variance is obtained using the variance of a nonlinear function,
V = [λ/(θ+λ)²]²(θ²/n) + [-θ/(θ+λ)²]²(θλ/n) = θ²λ/[n(θ+λ)³].
(The asymptotic variance could also be obtained as [-1/(1 + E[x])²]² Asy.Var[x̄].)
For part (c), we just note that π = θ/(θ+λ). For a sample of observations on x, the log-likelihood would be
lnL = n lnπ + ln(1-π) Σᵢ₌₁ⁿ xᵢ
∂lnL/∂π = n/π - Σᵢ₌₁ⁿ xᵢ/(1-π).
A solution is obtained by first noting that at the solution, (1-π)/π = x̄ = 1/π - 1. The solution for π is, thus, π̂ = 1/(1 + x̄). Of course, this is what we found in part b., which makes sense.
For part (d),
f(y|x) = f(x,y)/f(x) = [θ e^(-(θ+λ)y)(λy)ˣ/x!] × [(θ+λ)(θ+λ)ˣ]/(θλˣ).
Cancelling terms and gathering the remaining like terms leaves
f(y|x) = (θ+λ)[(θ+λ)y]ˣ e^(-(θ+λ)y)/x!,
so the density has the required form with μ = (θ+λ). The integral is [μ^(x+1)/x!] ∫₀^∞ yˣ e^(-μy) dy. This integral is a gamma integral which equals Γ(x+1)/μ^(x+1), which is the reciprocal of the leading scalar, so the product is 1. The log-likelihood function is
lnL = n lnμ - μ Σᵢ₌₁ⁿ yᵢ + lnμ Σᵢ₌₁ⁿ xᵢ + Σᵢ₌₁ⁿ xᵢ ln yᵢ - Σᵢ₌₁ⁿ ln xᵢ!
∂lnL/∂μ = (Σᵢ₌₁ⁿ xᵢ + n)/μ - Σᵢ₌₁ⁿ yᵢ
∂²lnL/∂μ² = -(Σᵢ₌₁ⁿ xᵢ + n)/μ².
Therefore, the maximum likelihood estimator of μ is (1 + x̄)/ȳ and the asymptotic variance, conditional on the xs, is Asy.Var[μ̂] = (μ²/n)/(1 + x̄).
Part (e). We can obtain f(y) by summing over x in the joint density. First, we write the joint density as
f(x,y) = θe^(-θy) e^(-λy)(λy)ˣ/x!.
The sum is, therefore, f(y) = θe^(-θy) Σ_(x=0)^∞ e^(-λy)(λy)ˣ/x!. The sum is that of the probabilities for a Poisson distribution, so it equals 1. This produces the required result. The maximum likelihood estimator of θ and its asymptotic variance are derived from
lnL = n lnθ - θ Σᵢ₌₁ⁿ yᵢ
∂lnL/∂θ = n/θ - Σᵢ₌₁ⁿ yᵢ
∂²lnL/∂θ² = -n/θ².
Therefore, the maximum likelihood estimator is 1/ȳ and its asymptotic variance is θ²/n. Since we found f(y) by factoring f(x,y) into f(y)f(x|y) (apparently, given our result), the answer follows immediately. Just divide the expression used in part e. by f(y). This is a Poisson distribution with parameter λy. The log-likelihood function and its first derivative are
lnL = -λ Σᵢ₌₁ⁿ yᵢ + lnλ Σᵢ₌₁ⁿ xᵢ + Σᵢ₌₁ⁿ xᵢ ln yᵢ - Σᵢ₌₁ⁿ ln xᵢ!
∂lnL/∂λ = -Σᵢ₌₁ⁿ yᵢ + Σᵢ₌₁ⁿ xᵢ/λ,
from which it follows that λ̂ = x̄/ȳ.

4. The log-likelihood and its two first derivatives are
logL = n logα + n logβ + (β-1) Σᵢ₌₁ⁿ logxᵢ - α Σᵢ₌₁ⁿ xᵢ^β
∂logL/∂α = n/α - Σᵢ₌₁ⁿ xᵢ^β
∂logL/∂β = n/β + Σᵢ₌₁ⁿ logxᵢ - α Σᵢ₌₁ⁿ (logxᵢ)xᵢ^β.
Since the first likelihood equation implies that at the maximum, α̂ = n/Σᵢ₌₁ⁿ xᵢ^β, one approach would be to scan over the range of β and compute the implied value of α. Two practical complications are the allowable range of β and the starting values to use for the search.
The second derivatives are
∂²logL/∂α² = -n/α²
∂²logL/∂β² = -n/β² - α Σᵢ₌₁ⁿ (logxᵢ)²xᵢ^β
∂²logL/∂α∂β = -Σᵢ₌₁ⁿ (logxᵢ)xᵢ^β.
If we had estimates in hand, the simplest way to estimate the expected values of the Hessian would be to evaluate the expressions above at the maximum likelihood estimates, then compute the negative inverse. First, since the expected value of ∂logL/∂α is zero, it follows that E[xᵢ^β] = 1/α. Now,
E[∂logL/∂β] = n/β + E[Σᵢ₌₁ⁿ logxᵢ] - αE[Σᵢ₌₁ⁿ (logxᵢ)xᵢ^β] = 0
as well. Divide by n, and use the fact that every term in a sum has the same expectation to obtain
1/β + E[lnxᵢ] - E[(lnxᵢ)xᵢ^β]/E[xᵢ^β] = 0.
Now, multiply through by E[xᵢ^β] to obtain E[xᵢ^β]/β = E[(lnxᵢ)xᵢ^β] - E[lnxᵢ]E[xᵢ^β],
or 1/(αβ) = Cov[lnxᵢ, xᵢ^β].

5. As suggested in the previous problem, we can concentrate the log-likelihood over α. From ∂logL/∂α = 0, we find that at the maximum, α = 1/[(1/n) Σᵢ₌₁ⁿ xᵢ^β]. Thus, we scan over different values of β to seek the value which maximizes logL as given above, where we substitute this expression for each occurrence of α. Values of β and the log-likelihood for a range of values of β are listed in the table below.
β       logL
0.1 -62.386
0.2 -49.175
0.3 -41.381
0.4 -36.051
0.5 -32.122
0.6 -29.127
0.7 -26.829
0.8 -25.098
0.9 -23.866
1.0 -23.101
1.05 -22.891
1.06 -22.863
1.07 -22.841
1.08 -22.823
1.09 -22.809
1.10 -22.800
1.11 -22.796
1.12 -22.797
1.2 -22.984
1.3 -23.693

The maximum occurs at β = 1.11. The implied value of α is 1.179. The negative of the second derivatives matrix at these values and its inverse are
I = [ 25.55     9.6506  ]        I⁻¹ = [ .04506    -.01566 ]
    [ 9.6506    27.7552 ]              [ -.01566   .04148  ].
The Wald statistic for the hypothesis that β = 1 is W = (1.11 - 1)²/.041477 = .276. The critical value for a test of size .05 is 3.84, so we would not reject the hypothesis.
If β = 1, then α̂ = n/Σᵢ₌₁ⁿ xᵢ = 0.88496. The distribution specializes to the geometric distribution if β = 1, so the restricted log-likelihood would be
logLr = n logα - α Σᵢ₌₁ⁿ xᵢ = n(logα - 1) at the MLE.
logLr at α = .88496 is -22.44435. The likelihood ratio statistic is -2logλ = 2(23.10068 - 22.44435) = 1.3126. Once again, this is a small value. To obtain the Lagrange multiplier statistic, we would compute
[ ∂logL/∂α   ∂logL/∂β ] [ -∂²logL/∂α²      -∂²logL/∂α∂β ]⁻¹ [ ∂logL/∂α ]
                        [ -∂²logL/∂α∂β     -∂²logL/∂β²  ]   [ ∂logL/∂β ]
at the restricted estimates of α = .88496 and β = 1. Making the substitutions from above, at these values, we would have
∂logL/∂α = 0
∂logL/∂β = n + Σᵢ₌₁ⁿ logxᵢ - (1/x̄) Σᵢ₌₁ⁿ xᵢ logxᵢ = 9.400342
∂²logL/∂α² = -nx̄² = -25.54955
∂²logL/∂β² = -n - (1/x̄) Σᵢ₌₁ⁿ xᵢ(logxᵢ)² = -30.79486
∂²logL/∂α∂β = -Σᵢ₌₁ⁿ xᵢ logxᵢ = -8.265.
The lower right element in the inverse matrix is .041477. The LM statistic is, therefore, (9.40032)²(.041477) = 2.9095. This is also well under the critical value for the chi-squared distribution, so the hypothesis is not rejected on the basis of any of the three tests.
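
The grid search described above is easy to automate. The sketch below (Python/NumPy) is only illustrative: the data vector x is simulated here because the data used in the solution are not reproduced in this section, so the numerical results will differ from the values reported above. It concentrates α out of the log-likelihood, scans β, and forms the likelihood ratio comparison against the restriction β = 1.

import numpy as np

rng = np.random.default_rng(0)
x = rng.weibull(1.1, size=20)          # placeholder data; the text's data are not shown here
n = x.size

def conc_loglike(beta):
    alpha = n / np.sum(x**beta)        # alpha that maximizes logL for given beta
    value = (n*np.log(alpha) + n*np.log(beta)
             + (beta - 1.0)*np.sum(np.log(x)) - alpha*np.sum(x**beta))
    return value, alpha

grid = np.arange(0.1, 2.01, 0.01)
values = [conc_loglike(b)[0] for b in grid]
b_hat = grid[int(np.argmax(values))]
logL_max, a_hat = conc_loglike(b_hat)
logL_r, a_r = conc_loglike(1.0)        # restricted model, beta = 1

print("beta_hat =", b_hat, " alpha_hat =", a_hat)
print("LR statistic =", 2.0*(logL_max - logL_r), " vs chi-squared(1) critical value 3.84")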

6. a. The full log likelihood is logL = Σᵢ log f_(y,x)(yᵢ, xᵢ|α, β).
b. By factoring the density, we obtain the equivalent logL = Σᵢ [log f_(y|x)(yᵢ|xᵢ, α, β) + log f_x(xᵢ|α)].
c. We can solve the first order conditions in each case. From the marginal distribution for x,
Σᵢ ∂log f_x(xᵢ|α)/∂α = 0
provides a solution for α. From the joint distribution, factored into the conditional plus the marginal, we have
Σᵢ [∂log f_(y|x)(yᵢ|xᵢ, α, β)/∂α + ∂log f_x(xᵢ|α)/∂α] = 0
Σᵢ ∂log f_(y|x)(yᵢ|xᵢ, α, β)/∂β = 0.

d. The asymptotic variance obtained from the first estimator would be the negative inverse of the expected second derivative, Asy.Var[a] = {-E[Σᵢ ∂²log f_x(xᵢ|α)/∂α²]}⁻¹. Denote this A⁻¹. Now, consider the second estimator for α and β jointly. The negative of the expected Hessian is shown below. Note that the A from the marginal distribution appears there, as the marginal distribution appears in the factored joint distribution.

-E[∂²lnL/∂(α,β)∂(α,β)′] = [ B_αα   B_αβ ]   [ A   0 ]   [ A + B_αα   B_αβ ]
                          [ B_βα   B_ββ ] + [ 0   0 ] = [ B_βα       B_ββ ]

The asymptotic covariance matrix for the joint estimator is the inverse of this matrix. To compare this to the asymptotic variance for the marginal estimator of α, we need the upper left element of this matrix. Using the formula for the partitioned inverse, we find that this upper left element in the inverse is

[(A + B_αα) - B_αβ B_ββ⁻¹ B_βα]⁻¹ = [A + (B_αα - B_αβ B_ββ⁻¹ B_βα)]⁻¹,

which is smaller than A⁻¹ as long as the second term is positive.

e. (Unfortunately, this is an error in the text.) In the preceding expression, B_αβ is the cross derivative. Even if it is zero, the asymptotic variance from the joint estimator is still smaller, being [A + B_αα]⁻¹. This makes sense. If α appears in the conditional distribution, then there is additional information in the factored joint likelihood that is not in the marginal distribution, and this produces the smaller asymptotic variance.

7. The log likelihood for the Poisson model is

logL = -nλ + lnλ Σᵢ yᵢ - Σᵢ ln yᵢ!

The expected value of 1/n times this function with respect to the true distribution is

E₀[(1/n)logL] = -λ + lnλ E₀[ȳ] - E₀[(1/n) Σᵢ ln yᵢ!].

The first expectation is λ₀. The second expectation can be left implicit since it will not affect the solution for λ; it is a function of the true λ₀. Maximizing this function with respect to λ produces the necessary condition

∂E₀[(1/n)logL]/∂λ = -1 + λ₀/λ = 0,

which has solution λ = λ₀, which was to be shown.

8. The log likelihood for a sample from the normal distribution is

logL = -(n/2)log2π - (n/2)logσ² - [1/(2σ²)] Σᵢ (yᵢ - μ)².

E₀[(1/n)logL] = -(1/2)log2π - (1/2)logσ² - [1/(2σ²)] E₀[(1/n) Σᵢ (yᵢ - μ)²].

The expectation term equals E₀[(yᵢ - μ)²] = E₀[(yᵢ - μ₀)²] + (μ₀ - μ)² = σ₀² + (μ₀ - μ)². Collecting terms,

E₀[(1/n)logL] = -(1/2)log2π - (1/2)logσ² - [1/(2σ²)][σ₀² + (μ₀ - μ)²].

To see where this is maximized, note first that the term (μ₀ - μ)² enters negatively as a quadratic, so the maximizing value of μ is obviously μ₀. Since this term is then zero, we can ignore it, and look for the σ² that maximizes -(1/2)log2π - (1/2)logσ² - σ₀²/(2σ²). The -1/2 is irrelevant as is the leading constant, so we wish to minimize (after changing sign) logσ² + σ₀²/σ² with respect to σ². Equating the first derivative to zero produces 1/σ² = σ₀²/(σ²)², or σ² = σ₀², which gives us the result.

9. The log likelihood for the classical normal regression model is

logL = Σᵢ -(1/2)[log2π + logσ² + (1/σ²)(yᵢ - xᵢ′β)²].

If we reparameterize this in terms of γ = 1/σ and δ = β/σ, then after a bit of manipulation,

logL = Σᵢ -(1/2)[log2π - logγ² + (γyᵢ - xᵢ′δ)²].

The first order conditions for maximizing this with respect to γ and δ are

∂logL/∂γ = n/γ - Σᵢ yᵢ(γyᵢ - xᵢ′δ) = 0

∂logL/∂δ = Σᵢ xᵢ(γyᵢ - xᵢ′δ) = 0.

Solve the second equation for δ, which produces δ = γ(X′X)⁻¹X′y = γb. Insert this implicit solution into the first equation to produce n/γ = Σᵢ yᵢ(γyᵢ - γxᵢ′b). By taking γ outside the summation and multiplying the entire expression by γ, we obtain n = γ² Σᵢ yᵢ(yᵢ - xᵢ′b), or γ² = n/[Σᵢ yᵢ(yᵢ - xᵢ′b)]. This is an analytic solution for γ that is only in terms of the data; b is a sample statistic. Inserting the square root of this result into the solution for δ produces the second result we need. By pursuing this a bit further, you can show that the solution for γ² is just n/e′e from the original least squares regression, and the solution for δ is just b times this solution for γ. The second derivatives matrix is

∂²logL/∂γ² = -n/γ² - Σᵢ yᵢ²
∂²logL/∂δ∂δ′ = -Σᵢ xᵢxᵢ′
∂²logL/∂δ∂γ = Σᵢ xᵢyᵢ.

We'll obtain the expectations conditioned on X. E[yᵢ|xᵢ] is xᵢ′β from the original model, which equals xᵢ′δ/γ. E[yᵢ²|xᵢ] = (1/γ²)(xᵢ′δ)² + 1/γ². (The cross term has expectation zero.) Summing over observations and collecting terms, we have, conditioned on X,

E[∂²logL/∂γ²|X] = -2n/γ² - (1/γ²)δ′X′Xδ
E[∂²logL/∂δ∂δ′|X] = -X′X
E[∂²logL/∂δ∂γ|X] = (1/γ)X′Xδ.

The negative inverse of the matrix of expected second derivatives is

Asy.Var[δ̂, γ̂] = [ X′X             -(1/γ)X′Xδ             ]⁻¹
                 [ -(1/γ)δ′X′X     (1/γ²)[2n + δ′X′Xδ]    ].

(The off diagonal term does not vanish here as it does in the original parameterization.)
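
A small numerical check of the closed-form solutions γ̂² = n/e′e and δ̂ = γ̂b is sketched below (Python/NumPy, with simulated X and y as stand-ins for data; this is not part of the original solution). It verifies that the two first order conditions above are satisfied at those values.

import numpy as np

rng = np.random.default_rng(3)
n, k = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k-1))])
beta_true = np.array([1.0, 0.5, -0.25])
y = X @ beta_true + rng.normal(scale=2.0, size=n)

b = np.linalg.solve(X.T @ X, X.T @ y)   # OLS
e = y - X @ b
gamma = np.sqrt(n / (e @ e))            # gamma_hat = sqrt(n/e'e)
delta = gamma * b                       # delta_hat = gamma_hat * b

# first order conditions from the reparameterized log likelihood
foc_gamma = n/gamma - y @ (gamma*y - X @ delta)
foc_delta = X.T @ (gamma*y - X @ delta)
print("FOC for gamma (should be ~0):", foc_gamma)
print("FOC for delta (should be ~0):", foc_delta)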

The first derivatives of the log likelihood function are ∂logL/∂μ = -[1/(2σ²)] Σᵢ -2(yᵢ - μ). Equating this to zero produces the vector of means for the estimator of μ. The first derivative with respect to σ² is

∂logL/∂σ² = -nM/(2σ²) + [1/(2σ⁴)] Σᵢ (yᵢ - μ)′(yᵢ - μ).

Each term in the sum is Σₘ (yᵢₘ - μₘ)². We already deduced that the estimators of μₘ are the sample means. Inserting these in the solution for σ² and solving the likelihood equation produces the solution given in the problem. The second derivatives of the log likelihood are

∂²logL/∂μ∂μ′ = (1/σ²) Σᵢ -I
∂²logL/∂μ∂σ² = [1/(2σ⁴)] Σᵢ -2(yᵢ - μ)
∂²logL/∂(σ²)² = nM/(2σ⁴) - (1/σ⁶) Σᵢ (yᵢ - μ)′(yᵢ - μ).

The expected value of the first term is (-n/σ²)I. The second term has expectation zero. Each term in the summation in the third term has expectation Mσ², so the summation has expected value nMσ². Adding gives the expectation for the third term of -nM/(2σ⁴). Assembling these in a block diagonal matrix, then taking the negative inverse produces the result given earlier.
For the Wald test, the restriction is

H₀: μ - μ⁰i = 0.

The unrestricted estimator of μ is x̄. The variance of x̄ is given above, so the Wald statistic is simply

(x̄ - μ⁰i)′ {Var[x̄ - μ⁰i]}⁻¹ (x̄ - μ⁰i).

Inserting the covariance matrix given above produces the suggested statistic.

(Extra problem from 7th edition. Prove the claim made in this example.)

The asymptotic variance of the MLE is, in fact, equal to the Cramer-Rao Lower Bound for the variance of a consistent, asymptotically normally distributed estimator, so this completes the argument.
In Example 4.7, we proposed a regression with a gamma distributed disturbance,

yᵢ = α + β′xᵢ + εᵢ,
where
f(εᵢ) = [λᴾ/Γ(P)] εᵢ^(P-1) exp(-λεᵢ),  εᵢ > 0, λ > 0, P > 2.

(The fact that εᵢ is nonnegative will shift the constant term, as shown in Example 4.7. The need for the restriction on P will emerge shortly.) It will be convenient to assume the regressors are measured in deviations from their means, so Σᵢxᵢ = 0. The OLS estimator of β remains unbiased and consistent in this model, with variance

Var[b|X] = σ²(X′X)⁻¹,

where σ² = Var[εᵢ|X] = P/λ². [You can show this by using gamma integrals to verify that E[εᵢ|X] = P/λ and E[εᵢ²|X] = P(P+1)/λ². See B-39 and (E-1) in Section E2.3. A useful device for obtaining the variance is Γ(P) = (P-1)Γ(P-1).] We will now show that in this model, there is a more efficient consistent estimator of β. (As we saw in Example 4.7, the constant term in this regression will be biased because E[εᵢ|X] = P/λ; a estimates α + P/λ.) In what follows, we will focus on the slope estimators.
The log likelihood function is
lnL = Σᵢ₌₁ⁿ [P lnλ - lnΓ(P) + (P-1)lnεᵢ - λεᵢ].
The likelihood equations are

∂lnL/∂α = Σᵢ [-(P-1)/εᵢ + λ] = 0,
∂lnL/∂β = Σᵢ [-(P-1)/εᵢ + λ]xᵢ = 0,
∂lnL/∂λ = Σᵢ [P/λ - εᵢ] = 0,
∂lnL/∂P = Σᵢ [lnλ - ψ(P) + lnεᵢ] = 0.

(The function ψ(P) = dlnΓ(P)/dP is defined in Section E2.3.) To show that these expressions have expectation zero, we use the gamma integral once again to show that E[1/εᵢ] = λ/(P-1). We used the result E[lnεᵢ] = ψ(P) - lnλ in Example 13.5. To show that E[∂lnL/∂β] = 0, we only require E[1/εᵢ] = λ/(P-1) because xᵢ and εᵢ are independent. The second derivatives and their expectations are found as follows: Using the gamma integral once again, we find E[1/εᵢ²] = λ²/[(P-1)(P-2)]. And, recall that Σᵢxᵢ = 0. Thus, conditioned on X, we have

-E[∂²lnL/∂α²] = E[Σᵢ (P-1)(1/εᵢ²)] = nλ²/(P-2),
-E[∂²lnL/∂α∂β′] = E[Σᵢ (P-1)(1/εᵢ²)xᵢ′] = 0′,
-E[∂²lnL/∂α∂λ] = E[Σᵢ (-1)] = -n,
-E[∂²lnL/∂α∂P] = E[Σᵢ (1/εᵢ)] = nλ/(P-1),
-E[∂²lnL/∂β∂β′] = E[Σᵢ (P-1)(1/εᵢ²)xᵢxᵢ′] = Σᵢ [λ²/(P-2)]xᵢxᵢ′ = [λ²/(P-2)](X′X),
-E[∂²lnL/∂β∂λ] = E[Σᵢ (-1)xᵢ] = 0,
-E[∂²lnL/∂β∂P] = E[Σᵢ (1/εᵢ)xᵢ] = 0,
-E[∂²lnL/∂λ²] = E[Σᵢ (P/λ²)] = nP/λ²,
-E[∂²lnL/∂λ∂P] = E[Σᵢ (-1/λ)] = -n/λ,
-E[∂²lnL/∂P²] = E[Σᵢ ψ′(P)] = nψ′(P).

Since the expectations of the cross partials with respect to β and the other parameters are all zero, it follows that the asymptotic covariance matrix for the MLE of β is simply

Asy.Var[β̂_MLE] = {-E[∂²lnL/∂β∂β′]}⁻¹ = [(P-2)/λ²](X′X)⁻¹.

Recall, the asymptotic covariance matrix of the ordinary least squares estimator is

Asy.Var[b] = [P/λ²](X′X)⁻¹.

(Note that the MLE is ill defined if P is less than 2.) Thus, the ratio of the variance of the MLE of any element of β to that of the corresponding element of b is (P-2)/P, which is the result claimed in Example 4.7.
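
The gamma-integral facts used above, E[εᵢ] = P/λ, E[εᵢ²] = P(P+1)/λ², E[1/εᵢ] = λ/(P-1), and E[1/εᵢ²] = λ²/[(P-1)(P-2)], are easy to confirm by simulation. The sketch below (Python/NumPy; P = 4 and λ = 2 are arbitrary values satisfying P > 2, not taken from the text) also reports the implied efficiency ratio (P-2)/P.

import numpy as np

rng = np.random.default_rng(11)
P, lam, R = 4.0, 2.0, 1_000_000

eps = rng.gamma(shape=P, scale=1.0/lam, size=R)   # f(e) = lam^P/Gamma(P) e^(P-1) exp(-lam*e)

print("E[eps]    :", eps.mean(),        " theory:", P/lam)
print("E[eps^2]  :", (eps**2).mean(),   " theory:", P*(P+1)/lam**2)
print("E[1/eps]  :", (1/eps).mean(),    " theory:", lam/(P-1))
print("E[1/eps^2]:", (1/eps**2).mean(), " theory:", lam**2/((P-1)*(P-2)))
print("Var(MLE)/Var(OLS) for slopes:", (P-2)/P)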


Applications

1. a. For both probabilities, the symmetry implies that 1 - F(t) = F(-t). In either model, then,

Prob(y=1) = F(t) and Prob(y=0) = 1 - F(t) = F(-t).

These are combined in Prob(Y=y) = F[(2yᵢ-1)tᵢ] where tᵢ = xᵢ′β. Therefore,

lnL = Σᵢ ln F[(2yᵢ-1)xᵢ′β].

b. ∂lnL/∂β = Σᵢ₌₁ⁿ {f[(2yᵢ-1)xᵢ′β]/F[(2yᵢ-1)xᵢ′β]}(2yᵢ-1)xᵢ = 0,
where f[(2yᵢ-1)xᵢ′β] is the density function. For the logit model, f = F(1-F). So, for the logit model,
∂lnL/∂β = Σᵢ₌₁ⁿ {1 - F[(2yᵢ-1)xᵢ′β]}(2yᵢ-1)xᵢ = 0.

Evaluating this expression for yᵢ = 0, we get simply -F(xᵢ′β)xᵢ. When yᵢ = 1, the term is [1 - F(xᵢ′β)]xᵢ. It follows that both cases are [yᵢ - F(xᵢ′β)]xᵢ, so the likelihood equations for the logit model are
∂lnL/∂β = Σᵢ₌₁ⁿ [yᵢ - Λ(xᵢ′β)]xᵢ = 0.

For the probit model, F[(2yᵢ-1)xᵢ′β] = Φ[(2yᵢ-1)xᵢ′β] and f[(2yᵢ-1)xᵢ′β] = φ[(2yᵢ-1)xᵢ′β], which does not simplify further, save for that the term 2yᵢ inside φ may be dropped since φ(t) = φ(-t). Therefore,

∂lnL/∂β = Σᵢ₌₁ⁿ {φ[(2yᵢ-1)xᵢ′β]/Φ[(2yᵢ-1)xᵢ′β]}(2yᵢ-1)xᵢ = 0.

c. For the logit model, the result is very simple.

∂²lnL/∂β∂β′ = -Σᵢ₌₁ⁿ Λ(xᵢ′β)[1 - Λ(xᵢ′β)]xᵢxᵢ′.

For the probit model, the result is more complicated. We will use the result that

dφ(t)/dt = -tφ(t).

It follows, then, that d[φ(t)/Φ(t)]/dt = [-φ(t)/Φ(t)][t + φ(t)/Φ(t)]. Using this result directly, it follows that

∂²lnL/∂β∂β′ = -Σᵢ₌₁ⁿ {φ[(2yᵢ-1)xᵢ′β]/Φ[(2yᵢ-1)xᵢ′β]}{(2yᵢ-1)xᵢ′β + φ[(2yᵢ-1)xᵢ′β]/Φ[(2yᵢ-1)xᵢ′β]}(2yᵢ-1)²xᵢxᵢ′.

This actually simplifies somewhat because (2yᵢ-1)² = 1 for both values of yᵢ and φ[(2yᵢ-1)xᵢ′β] = φ(xᵢ′β).

d. Denote by H the actual second derivatives matrix derived in the previous part. Then, Newton's method is

β̂(j+1) = β̂(j) - {H[β̂(j)]}⁻¹ ∂lnL[β̂(j)]/∂β,

where the terms on the right hand side indicate first and second derivatives evaluated at the "previous" estimate of β.
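
A compact implementation of this iteration for the logit model is sketched below (Python/NumPy with simulated data; it is an illustration of the update in part d, not the NLOGIT program used later in this application).

import numpy as np

rng = np.random.default_rng(5)
n = 1000
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
beta_true = np.array([0.5, 1.0, -1.0])
y = (rng.uniform(size=n) < 1/(1 + np.exp(-X @ beta_true))).astype(float)

beta = np.zeros(X.shape[1])
for it in range(25):
    p = 1/(1 + np.exp(-X @ beta))            # Lambda(x'b)
    g = X.T @ (y - p)                        # gradient: sum (y_i - Lambda_i) x_i
    H = -(X * (p*(1-p))[:, None]).T @ X      # Hessian: -sum Lambda_i(1-Lambda_i) x_i x_i'
    step = np.linalg.solve(H, g)
    beta = beta - step                       # Newton update b_(j+1) = b_j - H^{-1} g
    if np.max(np.abs(step)) < 1e-10:
        break

print("iterations:", it + 1, " beta_hat:", beta)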

e. The method of scoring uses the expected Hessian instead of the actual Hessian in the iterations. The methods are the same for the logit model, since the Hessian does not involve yᵢ. The methods are different for the probit model, since the expected Hessian does not equal the actual one. For the logit model

-[E(H)]⁻¹ = {Σᵢ₌₁ⁿ Λ(xᵢ′β)[1 - Λ(xᵢ′β)]xᵢxᵢ′}⁻¹.

For the probit model, we need first to obtain the expected value. To obtain this, we take the expected value, with Prob(y=0) = 1 - Φ(xᵢ′β) and Prob(y=1) = Φ(xᵢ′β). The expected value of the ith term in the negative Hessian is the expected value of the term

{φ[(2yᵢ-1)xᵢ′β]/Φ[(2yᵢ-1)xᵢ′β]}{(2yᵢ-1)xᵢ′β + φ[(2yᵢ-1)xᵢ′β]/Φ[(2yᵢ-1)xᵢ′β]}xᵢxᵢ′.

This is

[1 - Φ(xᵢ′β)]{φ(-xᵢ′β)/Φ(-xᵢ′β)}[-xᵢ′β + φ(-xᵢ′β)/Φ(-xᵢ′β)]xᵢxᵢ′ + Φ(xᵢ′β){φ(xᵢ′β)/Φ(xᵢ′β)}[xᵢ′β + φ(xᵢ′β)/Φ(xᵢ′β)]xᵢxᵢ′

= φ(xᵢ′β)[-xᵢ′β + φ(xᵢ′β)/(1 - Φ(xᵢ′β))]xᵢxᵢ′ + φ(xᵢ′β)[xᵢ′β + φ(xᵢ′β)/Φ(xᵢ′β)]xᵢxᵢ′

= [φ(xᵢ′β)]²{1/[1 - Φ(xᵢ′β)] + 1/Φ(xᵢ′β)}xᵢxᵢ′

= [φ(xᵢ′β)]²/{Φ(xᵢ′β)[1 - Φ(xᵢ′β)]}xᵢxᵢ′.

?====================================================
? Application 14.1 Part f.
?====================================================
Namelist ; x = one,age,educ,hsat,female,married $
LOGIT ; Lhs = Doctor ; Rhs = X $
Calc ; L1 = logl $
| Binary Logit Model for Binary Choice |
| Dependent variable DOCTOR |
| Number of observations 27326 |
| Log likelihood function -16405.94 |
| Number of parameters 6 |
| Info. Criterion: AIC = 1.20120 |
| Info. Criterion: BIC = 1.20300 |
| Restricted log likelihood -18019.55 |
+--------------------------------------------- +
+--------+--------------+----------------+--------+--------+----------+
|Variable| Coefficient | Standard Error |b/St.Er.|P[|Z|>z]| Mean of X|
+--------+--------------+----------------+--------+--------+----------+
---------+Characteristics in numerator of Prob[Y = 1]
Constant| 1.82207669 .10763712 16.928 .0000
AGE | .01235692 .00124643 9.914 .0000 43.5256898
EDUC | -.00569371 .00578743 -.984 .3252 11.3206310
HSAT | -.29276744 .00686076 -42.673 .0000 6.78542607
FEMALE | .58376753 .02717992 21.478 .0000 .47877479
MARRIED | .03550015 .03173886 1.119 .2634 .75861817

g.
Matr ; bw = b(5:6) ; vw = varb(5:6,5:6) $
Matrix ; list ; WaldStat = bw'<vw>bw $
Calc ; list ; ctb(.95,2) $
LOGIT ; Lhs = Doctor ; Rhs = One,age,educ,hsat $
Calc ; L0 = logl $
Calc ; List ; LRStat = 2*(l1 -l0) $
Matrix WALDSTAT has 1 rows and 1 columns.

1
+--------------
1| 461.43784
--> Calc ; list ; ctb(.95,2) $
+------------------------------------ +
| Listed Calculator Results |
+------------------------------------ +
Result = 5.991465
--> Calc ; L0 = logl $
--> Calc ; List ; LRStat = 2*(l1 -l0) $
+------------------------------------ +
| Listed Calculator Results |
+------------------------------------ +
LRSTAT = 467.336374
Logit ; Lhs = Doctor ; Rhs = X ; Start = b,0,0 ; Maxit = 0 $
+--------------------------------------------- +
| Binary Logit Model for Binary Choice |
| Maximum Likelihood Estimates |
| Model estimated: May 17, 2007 at 11:49:42PM.|
| Dependent variable DOCTOR |
| Weighting variable None |
| Number of observations 27326 |
| Iterations completed 1 |
| LM Stat. at start values 466.0288 |
| LM statistic kept as scalar LMSTAT |
| Log likelihood function -16639.61 |
| Number of parameters 6 |
| Info. Criterion: AIC = 1.21830 |
| Finite Sample: AIC = 1.21830 |
| Info. Criterion: BIC = 1.22010 |
| Info. Criterion:HQIC = 1.21888 |
| Restricted log likelihood -18019.55 |
| McFadden Pseudo R-squared .0765802 |
| Chi squared 2759.883 |
| Degrees of freedom 5 |
| Prob[ChiSqd > value] = .0000000 |
| Hosmer-Lemeshow chi-squared = 23.44388 |
| P-value= .00284 with deg.fr. = 8 |
+--------------------------------------------- +


h. The restricted log likelihood given with the initial results equals -18019.55. This is the log
likelihood for a model that contains only a constant term. The log likelihood for the model is
-16405.94. Twice the difference is about 3,200, which vastly exceeds the critical chi squared with
5 degrees of freedom. The hypothesis would be rejected.







Application 14.2.
The results for the Poisson model are shown in Table 14.11 with the results for the geometric model.
-----------------------------------------------------------------------------
Poisson Regression
Dependent variable DOCVIS
Log likelihood function -104607.04078
Restricted log likelihood -108662.13583
Chi squared [ 4](P= .000) 8110.19010

Significance level .00000
McFadden Pseudo R-squared .0373184
Estimation based on N = 27326, K = 5
Inf.Cr.AIC = 209224.1 AIC/N = 7.657
Chi- squared =255585.13599 RsqP= .0802
G - squared =156175.50075 RsqD= .0494
Overdispersion tests: g=mu(i) : 22.155
Overdispersion tests: g=mu(i)^2: 22.450
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
DOCVIS| Coefficient Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
Constant| 1.04805*** .02718 38.56 .0000 .99478 1.10132
AGE| .01840*** .00033 55.46 .0000 .01775 .01905
EDUC| -.04325*** .00173 -25.01 .0000 -.04663 -.03986
INCOME| -.52070*** .02195 -23.73 .0000 -.56371 -.47768
HHKIDS| -.16094*** .00796 -20.23 .0000 -.17653 -.14534
--------+--------------------------------------------------------------------
Partial derivatives of expected val. with
respect to the vector of characteristics.
Effects are averaged over individuals.
Observations used for means are All Obs.
Sample average conditional mean 3.1835
Scale Factor for Marginal Effects 3.1835
--------+--------------------------------------------------------------------
| Partial Standard Prob. 95% Confidence
DOCVIS| Effect Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
AGE| .05858*** .00107 54.50 .0000 .05648 .06069
EDUC| -.13767*** .00552 -24.93 .0000 -.14850 -.12685
INCOME| -1.65765*** .07009 -23.65 .0000 -1.79503 -1.52027
HHKIDS| -.49998*** .02415 -20.70 .0000 -.54732 -.45264 #
--------+--------------------------------------------------------------------

Application 14.3
? Mixture of normals problem
sample;1-1000$
calc;ran(12345)$
create ; y1=rnn(1,1) ; y2 = rnn(5,1) $
create ; c = rnu(0,1) $
create ; if(c < .3)y=y1 ; (else) y=y2 $
calc ; yb = xbr(y) ; sb = sdv(y) $
calc ; yb1=.9*yb ; sb1 = .9*sb
; yb2=1.1*yb ; sb2=1.1*sb $
maximize
; labels = lambda,mu1,s1,mu2,s2
; start = .5, yb1,sb1,yb2,sb2
; fcn = log(lambda*1/s1*n01((y -mu1)/s1) + (1-lambda)*1/s2*n01((y-mu2)/s2)) $

sample;1-1000$
calc;ran(12345)$
create ; y1=rnn(1,1) ; y2 = rnn(2,1) $
create ; c = rnu(0,1) $
create ; if(c < .3)y=y1 ; (else) y=y2 $
calc ; yb = xbr(y) ; sb = sdv(y) $
calc ; yb1=.9*yb ; sb1 = .9*sb
; yb2=1.1*yb ; sb2=1.1*sb $
maximize
; labels = lambda,mu1,s1,mu2,s2
; start = .5, yb1,sb1,yb2,sb2
; fcn = log(lambda*1/s1*n01((y -mu1)/s1) + (1-lambda)*1/s2*n01((y-mu2)/s2)) $

First try uses the same starting values for the two segments. Iterations never get started.
|-> calc ; yb = xbr(y) ; sb = sdv(y) $
|-> maximize
; labels = lambda,mu1,s1,mu2,s2
; start = .5, yb,sb,yb,sb
; fcn = log(lambda*1/s1*n01((y -mu1)/s1) + (1-lambda)*1/s2*n01((y-
mu2)/s2)) $
Iterative procedure has converged
NOTE: Convergence in initial iterations is rarely
at a true function optimum. This may not be a
solution (especially if initial iterations stopped).
Exit from iterative procedure. 3 iterations completed.
Convergence values:
Gradient Norm: Tolerance= .1000D -05, current value= .4896D -07
Function Change: Tolerance= .0000D+00, current value= .1958D -08
Parameter Change: Tolerance= .0000D+00, current value= .1691D -07
Smallest abs. param. change from start value = .0000D+00
At least one parameter did not leave start value.
Normal exit: 3 iterations. Status=0, F= .2135387D+04
Error 143: Models - estimated variance matrix of estimates is singular
Error 447: Current estimated covariance matrix for slopes is singular

Second try with better starting values.
|-> calc ; yb1=.9*yb ; sb1 = .9*sb
; yb2=1.1*yb ; sb2=1.1*sb $
|-> maximize
; labels = lambda,mu1,s1,mu2,s2
; start = .5, yb1,sb1,yb2,sb2
; fcn = log(lambda*1/s1*n01((y -mu1)/s1) + (1-lambda)*1/s2*n01((y-
mu2)/s2)) $
Iterative procedure has converged
Normal exit: 19 iterations. Status=0, F= .1942574D+04

-----------------------------------------------------------------------------
User Defined Optimization
Dependent variable Function
Log likelihood function -1942.57381
Estimation based on N = 1000, K = 5
Inf.Cr.AIC = 3895.1 AIC/N = 3.895
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
UserFunc| Coefficient Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
LAMBDA| .30986*** .01608 19.27 .0000 .27835 .34138
MU1| .99937*** .06079 16.44 .0000 .88023 1.11850
S1| .88624*** .05345 16.58 .0000 .78149 .99100
MU2| 4.91473*** .04250 115.65 .0000 4.83144 4.99802
S2| .98467*** .03351 29.38 .0000 .91899 1.05036
--------+--------------------------------------------------------------------
***, **, * ==> Significance at 1%, 5%, 10% level.
Model was estimated on Aug 19, 2017 at 04:45:10 PM
-----------------------------------------------------------------------------


Third try moves segments closer together. Results are a bit less precise.
|-> sample;1-1000$
|-> calc;ran(12345)$
|-> create ; y1=rnn(1,1) ; y2 = rnn(2,1) $
|-> create ; c = rnu(0,1) $
|-> create ; if(c < .3)y=y1 ; (else) y=y2 $
|-> calc ; yb = xbr(y) ; sb = sdv(y) $
|-> calc ; yb1=.9*yb ; sb1 = .9*sb
; yb2=1.1*yb ; sb2=1.1*sb $
|-> maximize
; labels = lambda,mu1,s1,mu2,s2
; start = .5, yb1,sb1,yb2,sb2
; fcn = log(lambda*1/s1*n01((y -mu1)/s1) + (1-lambda)*1/s2*n01((y-
mu2)/s2)) $
Iterative procedure has converged
Normal exit: 24 iterations. Status=0, F= .1461581D+04

-----------------------------------------------------------------------------
User Defined Optimization
Dependent variable Function
Log likelihood function -1461.58073
Estimation based on N = 1000, K = 5
Inf.Cr.AIC = 2933.2 AIC/N = 2.933
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
UserFunc| Coefficient Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
LAMBDA| .21683 .25504 .85 .3952 -.28304 .71670
MU1| .59512 .43775 1.36 .1740 -.26285 1.45310
S1| .70248*** .17154 4.10 .0000 .36627 1.03870
MU2| 1.92993*** .32142 6.00 .0000 1.29997 2.55990
S2| .93527*** .11326 8.26 .0000 .71329 1.15726
--------+--------------------------------------------------------------------
***, **, * ==> Significance at 1%, 5%, 10% level.
Model was estimated on Aug 19, 2017 at 04:46:37 PM
-----------------------------------------------------------------------------

Chapter 15

Simulation Based Estimation and Inference and Random Parameter Models

Exercises

1. Exponential: The pdf is f(x) = θexp(-θx). The CDF is
F(x) = ∫₀ˣ θexp(-θt) dt = -exp(-θx) - [-exp(-θ·0)] = 1 - exp(-θx).
We would draw observations from the U(0,1) population, say Fᵢ, and equate these to F(xᵢ). Inverting the function, we find that 1 - Fᵢ = exp(-θxᵢ), or xᵢ = -(1/θ)ln(1 - Fᵢ).
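
A minimal sketch of this inverse-probability transformation in Python (θ = 2 is an arbitrary illustrative value) is:

import numpy as np

rng = np.random.default_rng(9)
theta = 2.0
F = rng.uniform(size=100_000)          # draws from U(0,1)
x = -np.log(1.0 - F) / theta           # invert F(x) = 1 - exp(-theta*x)

print("sample mean:", x.mean(), " theoretical mean 1/theta:", 1/theta)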

2. Weibull: If the survival function is S(x) = p exp[-(λx)^p], then we may equate random draws from the uniform distribution, Sᵢ, to this function (a draw of Sᵢ is the same as a draw of Fᵢ = 1 - Sᵢ). Solving for xᵢ, we find
lnSᵢ = ln(p) - (λxᵢ)^p, so xᵢ = (1/λ)[ln(p) - lnSᵢ]^(1/p).

3. The derivative of the simulated sum of squares is

∂S(β,σᵤ)/∂(β,σᵤ)′ = Σᵢ Σₜ -2[yᵢₜ - (1/R)Σᵣ h(xᵢₜ′β + σᵤvᵢᵣ)] [(1/R)Σᵣ h′(xᵢₜ′β + σᵤvᵢᵣ)(xᵢₜ′, vᵢᵣ)′].

To estimate the asymptotic covariance matrix, we can rely on the results for the nonlinear least squares estimator. The gradient is of the form Σᵢₜ -2eᵢₜxᵢₜ⁰. The approach taken in Theorem 7.2 would be as follows:

σ̂² = [1/(nT)] Σᵢ Σₜ [yᵢₜ - (1/R)Σᵣ h(xᵢₜ′β̂ + σ̂ᵤvᵢᵣ)]².

The counterpart to (X′X)⁻¹ is

{Σᵢ Σₜ [(1/R)Σᵣ h′(xᵢₜ′β + σᵤvᵢᵣ)(xᵢₜ′, vᵢᵣ)′][(1/R)Σᵣ h′(xᵢₜ′β + σᵤvᵢᵣ)(xᵢₜ′, vᵢᵣ)]}⁻¹.
Application

?================================================================
? Application 15.1. Monte Carlo Simulation
?================================================================
? Set seed of RNG for replicability
Calc ; Ran(123579) $
? Sample size is 50. Generate x(i) and z(i) held fixed
Sample ; 1 - 50 $
Create ; xi = rnn(0,1) ; zi = rnn(0,1) $
Namelist ; X = one,xi,zi ; X0 = one,xi $
? Moment Matrices
Matrix ; XXinv = <X'X> ; X0X0inv = <X0'X0> $
Matrix ; Waldi = init(1000,1,0) $
Matrix ; LMi = init(1000,1,0) $

?****************************************************************
? Procedure studies the LM statistic
?****************************************************************
Proc = LM (c) $
? Three kinds of disturbances
Create ?; Eps = Rnt(5) ? Nonnormal distribution
; vi=exp(.2*xi) ; eps = vi*rnn(0,1) ? Heteroscedasticity
?;eps= Rnn(0,1) ? Standard normal distribution
; y = 0 + xi + c*zi +eps $
Matrix ; b0 = X0X0inv*X0'y $
Create ; e0 = y - X0'b0 $
Matrix ; g = X'e0 $
Calc ; lmstat = qfr(g,xxinv)/(e0'e0/n) ; i = i + 1 $
Matrix ; Lmi (i) = lmstat $
EndProc $
Calc ; i = 0 ; gamma = -1 $
Exec ; Proc=LM(gamma) ; n = 1000 $
samp;1-1000$
create;LMv=lmi $
create;reject=lmv>3.84$
Calc ; List ; Type1 = xbr(reject) ; pwr = 1 -Type1 $

?****************************************************************
? Procedure studies the Wald statistic
?****************************************************************
Calc ; type = 1 or 2 or 3 … set this before you run the program. $
Proc = Wald(c) $
Create ; if(type=1)Eps = Rnn(0,1) ? Standard normal distribution
; if(type=2)vi=exp(.2*xi) ? eps = vi*rnn(0,1) ? Heteroscedasticity
; if(type=3)eps= Rnt(5) ? Nonnormal distribution
; y = 0 + xi + c*zi +eps $
Matrix ; b0=XXinv*X'y $
Create ; e0=y-X'b0$
Calc ; ss0 = e0'e0/(47)
; v0 = ss0*xxinv(3,3)
; wald0=(b0(3))^2/v0
; i=i+1 $
Matrix ; Waldi(i)=Wald0 $
EndProc $
? Set the values for the simulation
Calc ; i = 0 ; gamma = 0 ; type=1 $
Sample ; 1-50 $
Exec ; Proc=Wald(gamma) ; n = 1000 $
samp;1-1000$
create;Waldv=Waldi $
create;reject=Waldv > 3.84$

Calc ; List ; Type1 = xbr(reject) ; pwr = 1 -Type1 $

To carry out the simulation, execute the procedure for different values of “gamma” and “type.” Summarize
the results with a table or plot of the rejection probabilities as a function of gamma.
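
For readers not using NLOGIT, the same experiment can be sketched in Python/NumPy as below. The design mirrors the program above only loosely: n = 50, fixed regressors, a Wald test of gamma = 0 in y = 0 + x + gamma*z + eps, and 1,000 replications; a standard normal disturbance is used here, and gamma and the error type would be varied in the same way as in the program.

import numpy as np

rng = np.random.default_rng(123579)
n, reps, gamma = 50, 1000, 0.0
x, z = rng.normal(size=n), rng.normal(size=n)
X = np.column_stack([np.ones(n), x, z])
XtXinv = np.linalg.inv(X.T @ X)

reject = 0
for _ in range(reps):
    eps = rng.normal(size=n)     # swap in t(5) or exp(.2*x)*normal draws for the other designs
    y = 0.0 + x + gamma*z + eps
    b = XtXinv @ (X.T @ y)
    e = y - X @ b
    s2 = (e @ e) / (n - 3)
    wald = b[2]**2 / (s2 * XtXinv[2, 2])
    reject += (wald > 3.84)

print("empirical rejection rate:", reject / reps)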

?================================================================
Application 15.2
?================================================================
regress ; lhs = lwage
;rhs=exp,expsq,one,wks,occ,ind,south,smsa,ms,fem,union,ed$
matrix ; be=b(1:2) ; ve = varb(1:2,1:2) $
? Delta method
wald ; fn1 = -c1/(2*c2)
; parameters=be
; covariance=ve
; labels = c1,c2 $

? Bootstrapping $
proc $
regress ; quietly ; lhs = lwage
;rhs=exp,expsq,one,wks,occ,ind,south,smsa,ms,fem,union,ed$
calc ; mx = -b(1)/(2*b(2)) $
endproc $
exec;n=100 ; bootstrap = mx $

-----------------------------------------------------------------------------
Ordinary least squares regression ............
LHS=LWAGE Mean = 6.67635
Standard deviation = .46151
---------- No. of observations = 4165 DegFreedom Mean square
Regression Sum of Squares = 373.138 11 33.92168
Residual Sum of Squares = 513.767 4153 .12371
Total Sum of Squares = 886.905 4164 .21299
---------- Standard error of e = .35172 Root MSE .35122
Fit R-squared = .42072 R -bar squared .41919
Model test F[ 11, 4153] = 274.20378 Prob F > F* .00000
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
LWAGE| Coefficient Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
EXP| .03991*** .00217 18.36 .0000 .03565 .04417
EXPSQ| -.00067*** .4776D-04 -14.10 .0000 -.00077 -.00058
Constant| 5.22697*** .07170 72.90 .0000 5.08644 5.36749
WKS| .00431*** .00109 3.96 .0001 .00217 .00644
OCC| -.14531*** .01474 -9.86 .0000 -.17420 -.11642
IND| .04986*** .01187 4.20 .0000 .02660 .07311
SOUTH| -.06754*** .01251 -5.40 .0000 -.09206 -.04302
SMSA| .14010*** .01205 11.62 .0000 .11647 .16372
MS| .06387*** .02061 3.10 .0019 .02348 .10426
FEM| -.38103*** .02521 -15.12 .0000 -.43043 -.33163
UNION| .08833*** .01287 6.86 .0000 .06310 .11356
ED| .05786*** .00263 22.04 .0000 .05272 .06301
--------+--------------------------------------------------------------------
-----------------------------------------------------------------------------
WALD procedure. Estimates and standard errors for nonlinear functions and
joint test of nonlinear restrictions.
Wald Statistic = 2001.26761
Prob. from Chi-squared[ 1] = .00000
Functions are computed at means of variables
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
WaldFcns| Function Error z |z|>Z* Interval

--------+--------------------------------------------------------------------
Fncn(1)| 29.6227*** .66217 44.74 .0000 28.3249 30.9205
--------+--------------------------------------------------------------------
Completed 100 bootstrap iterations.
-----------------------------------------------------------------------------
Results of bootstrap estimation of model.
Model has been reestimated 100 times.
The statistics shown below are centered
around the original estimate based on
the original full sample of observations.
Result is MX = 29.62271
Bootstrap samples have 4165 observations.
Estimate RtMnSqDev Skewness Kurtosis
29.62271 .77585 1.18455 5.29601
Minimum = 28.14656 Maximum = 32.90254
--------+--------------------------------------------------------------------
| Standard Prob. 95% Confidence
BootStrp| Coefficient Error z |z|>Z* Interval
--------+--------------------------------------------------------------------
MX| 29.6227*** .77585 38.18 .0000 28.1021 31.1433
--------+--------------------------------------------------------------------
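
The delta-method calculation reported by the WALD command can be reproduced directly from the two coefficients and their covariance matrix. The sketch below (Python/NumPy) uses the rounded coefficients shown above and a placeholder diagonal covariance matrix; with the full covariance matrix from the regression it would reproduce the 29.62 and .662 in the output, while the diagonal V used here will give a different standard error.

import numpy as np

# placeholder inputs: slope on EXP, slope on EXPSQ, and their 2x2 covariance matrix
b1, b2 = 0.03991, -0.00067
V = np.array([[0.00217**2, 0.0],
              [0.0, 4.776e-05**2]])    # illustrative only; the true V has a nonzero covariance

mx = -b1 / (2.0*b2)                    # experience level at which log wage peaks
grad = np.array([-1.0/(2.0*b2), b1/(2.0*b2**2)])   # d mx / d(b1, b2)
se = np.sqrt(grad @ V @ grad)          # delta-method standard error

print("maximizing experience:", mx, " standard error:", se)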

Chapter 16

Bayesian Estimation and Inference

Exercise

a. The likelihood function is

L(y|θ) = Πᵢ₌₁ⁿ f(yᵢ|θ) = Πᵢ₌₁ⁿ [exp(-θ)θ^(yᵢ)/Γ(yᵢ+1)] = exp(-nθ) θ^(Σᵢyᵢ) Πᵢ₌₁ⁿ [1/Γ(yᵢ+1)].

b. The posterior is

p(θ|y₁,...,yₙ) = p(y₁,...,yₙ|θ)p(θ) / ∫₀^∞ p(y₁,...,yₙ|θ)p(θ) dθ.

The product of factorials will fall out. This leaves

p(θ|y₁,...,yₙ) = exp(-nθ)θ^(nȳ) / ∫₀^∞ exp(-nθ)θ^(nȳ) dθ = [n^(nȳ+1)/Γ(nȳ+1)] θ^(nȳ) exp(-nθ),

where we have used the gamma integral at the last step. The posterior defines a two parameter gamma distribution, G(n, nȳ).

c. The estimator of θ is the mean of the posterior. There is no need to do the integration. This falls simply out of the posterior density, E[θ|y] = nȳ/n = ȳ.

d. The posterior variance also drops out simply; it is nȳ/n² = ȳ/n.

Application

a. p(Fᵢ|Kᵢ,θ) = C(Kᵢ,Fᵢ) θ^(Fᵢ)(1-θ)^(Kᵢ-Fᵢ), so the log likelihood function is

lnL(θ|y) = Σᵢ₌₁ⁿ [lnC(Kᵢ,Fᵢ) + Fᵢ lnθ + (Kᵢ - Fᵢ)ln(1-θ)].

The MLE is obtained by setting ∂lnL(θ|y)/∂θ = Σᵢ [Fᵢ/θ - (Kᵢ-Fᵢ)/(1-θ)] = 0. Multiply both sides by θ(1-θ) to obtain

Σᵢ [(1-θ)Fᵢ - θ(Kᵢ-Fᵢ)] = 0.

A line of algebra reveals that the solution is θ̂ = (ΣᵢFᵢ)/(ΣᵢKᵢ) = 0.651596.

b. The posterior density is

p(θ|y) = {Πᵢ₌₁ⁿ C(Kᵢ,Fᵢ) θ^(Fᵢ)(1-θ)^(Kᵢ-Fᵢ)} [Γ(a+b)/(Γ(a)Γ(b))] θ^(a-1)(1-θ)^(b-1)
         / ∫₀¹ {Πᵢ₌₁ⁿ C(Kᵢ,Fᵢ) θ^(Fᵢ)(1-θ)^(Kᵢ-Fᵢ)} [Γ(a+b)/(Γ(a)Γ(b))] θ^(a-1)(1-θ)^(b-1) dθ.

This simplifies considerably. The combinatorials and gamma functions fall out, leaving

p(θ|y) = θ^((ΣᵢFᵢ+a)-1)(1-θ)^((Σᵢ(Kᵢ-Fᵢ)+b)-1) / ∫₀¹ θ^((ΣᵢFᵢ+a)-1)(1-θ)^((Σᵢ(Kᵢ-Fᵢ)+b)-1) dθ.

The denominator is a beta integral, so the posterior density is

p(θ|y) = {Γ[(ΣᵢFᵢ+a) + (Σᵢ(Kᵢ-Fᵢ)+b)] / [Γ(ΣᵢFᵢ+a) Γ(Σᵢ(Kᵢ-Fᵢ)+b)]} θ^((ΣᵢFᵢ+a)-1)(1-θ)^((Σᵢ(Kᵢ-Fᵢ)+b)-1).

c-e. The posterior distribution is a beta distribution with parameters a* = a + ΣᵢFᵢ and b* = b + Σᵢ(Kᵢ - Fᵢ). The mean of this beta random variable is a*/(a* + b*) = (a + ΣᵢFᵢ)/(a + b + ΣᵢKᵢ). In the data, ΣᵢFᵢ = 49 and ΣᵢKᵢ = 75. For the values given, the posterior means are
(a=1,b=1): Result = .647668
(a=2,b=2): Result = .643939
(a=1,b=2): Result = .639386
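
The posterior means in parts c-e are one-line computations once ΣᵢFᵢ and ΣᵢKᵢ are known. The sketch below (Python) uses the sums quoted above; its output will be close to, but need not exactly match, the results reported in the solution, which were computed from the full data set.

sum_F, sum_K = 49, 75     # sums quoted in the solution above

for a, b in [(1, 1), (2, 2), (1, 2)]:
    post_mean = (a + sum_F) / (a + b + sum_K)
    print(f"a={a}, b={b}: posterior mean = {post_mean:.6f}")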

Another Random Scribd Document
with Unrelated Content

Oldfuks. Polidor.
Polidor (tultuaan, syleilee ja suutelee Oldfuksia). Oi kultainen herra
kultaseni! Jumala itse on kulettanut teidät tänne, palkitsemaan
minua pitkällisestä työstäni ja moniwuotisesta hikoilemisestani, mitä
minä olen kestänyt turhaan. Keino on kelwollinen, te olette kunnialla
ansainneet nuo neljätuhatta taalaria, jopa kaksinkertaisestikin.
Oldfuks. Minä en ota penniäkään enemmän, kuin
suostumuksemme on.
Polidor. Ähä! nyt minä pistän tulpan niiden suuhun, jotka owat
aina nauraneet minulle, kerskaan wihollisilleni ja niille, jotka owat
kääntäneet selkänsä minulle, ikäänkuin peläten minun muka
joutuwan köyhyyteen, ja sen wuoksi owat unohtaneet kaikki, mitä
hywää minä olen tehnyt heille, warakkaana ollessani.
Oldfuks. Niin, herra hywä! Se on maailman tapa.
Polidor. Mutta nyt on minun wuoroni halweksia heitä.
Oldfuks. Älkää, herraseni! Sillä mitä tärkeintä minun mainio
mestarini Albufagomar-Fagius neuwoo oppilaillensa, on nöyryyttä.
Näettehän te, minkälainen minä olen. Minä woisin elää
ruhtinaallisesti, jos tahtoisin, mutta se on wastoin meidän
mieltämme; se on myöskin syynä siihen, että meidän ei luulla
tietäwän tuota kallista mahtia, kuin me, näette, elämme köyhällä
tawoin.
Polidor. Kyllä minäkin seuraan hänen opetuksiansa, sitä saatte
wakuuttaa Albufagomar-Fagiukselle, jolle minä pyydän teidän

ilmoittamaan nöyrintä kunnioitusta minulta, milloin waan kirjoitatte
hänelle.
Oldfuks. Sitä en unhota.
Polidor. Tässä, herra kulta, on minun sinettini; kun wiette sen
Benjamin Juutalaiselle, niin saatte rahat paikalla.
(Antaa sinetin).
Oldfuks. Koettakaa wielä kerran, warmuuden wuoksi.
Polidor. Niin, tahdottenko wiipyä täällä niin kauan? Waimoni olkoon
seurassanne sillä aikaa.
(Menee).
Yhdestoista kohtaus.
Oldfuks. Leonora. Nilla. Sitten Polidor.
Leonora (tullessaan Nillan keralla samasta owesta, mistä Polidor
meni). Oi herra kulta! jos saisin sanoiksi kiitollisuuttani, niinkuin
tahtoisin.
Oldfuks. Rouwa kulta! Teidän miehenne on kelpo mies, ja
sentähden olen minä ilmaissut mahtiani hänelle.
Nilla (suutelee Oldfuksin kättä).
Oldfuks. Se on liian nöyrästi, mamseli! Minun käteni on wähän
likainenkin.

Nilla. Eikö mitä! se on kallis käsi, joka ansaitsee useampiakin
suutelemisia.
Leonora. Ettekö halweksi tätä sormusta, pitääksenne sitä minun
tähteni?
Oldfuks. Ei, rouwa kulta! minä en ota mitään antimia teiltä.
Leonora. Ottakaa, herra kulta! Minä katson sen ystäwyyden
merkiksi teiltä, älkää halweksiko sitä, pyydän nöyrimmästi.
Nilla. Oi herra kulta! tehkää toki rouwalle sellainen mielihywä.
Oldfuks. No, minä otan sen, ett'en pahoittaisi rouwan mieltä.
Nilla. Oi hywäinen aika! Ollapa minullakin jotakin hywää
lahjoittaakseni teille. Ettekö halweksi tätä muistorahaa, jonka minä
olen perinyt wanhemmiltani?
Oldfuks. Sääli olisi ottaa teiltä perintöosaanne.
Nilla. Herra kultaseni! minä en laske teitä, ennenkuin otatte sen.
Oldfuks. No, minä säilytän sitä teidän tähden, ja lähetän teille sen
sijaan kultaisen rahan.
Polidor (tullen owesta). Hehei! minä olen niin iloinen, ett'en tiedä
miten olla. Keino on kelwollinen. Wiime keitoksesta sain minä yhtä
paljon kultaa. Hei, herra kulta! käykää meillä joka päiwä, niin kauan
kuin te olette näillä tienoilla.
Oldfuks. Hywin mielelläni. Ehkä minä wiiwyn muutamia kuukausia
näillä seuduilla.

Polidor. Ettekö tekisi minulle sitä kunniata, että tulette luokseni
päiwälliselle?
Oldfuks. Minulla ei ole aikaa; waan kyllä minä tulen iltaiselle teidän
luoksenne. Nyt on minulla wähän pieniä toimituksia.
Polidor. No, minä en tahdo wiiwyttää teitä. Rahat saatte
Juutalaiselta, kun näytätte hänelle sinetin.
Oldfuks. Sitä en epäile. Hywästi niin kauan.
Polidor. Minä sanon jäähywäiset koko perheeni puolesta!
(Oldfuks menee).
Kahdestoista kohtaus.
Polidor. Leonora. Nilla. Sitten Henrik.
Polidor. No, rouwa! mitäs sanotte nyt? Olenko minä nyt häwittänyt
teidän taloutenne hullulla työlläni?
Leonora. Oi, kultaseni! Älkää toki panko pahaksi minun
ajattelemattomuuttani.
Nilla. Minäkin pyydän nöyrimmästi anteeksi, kun olen niin monta
kertaa rohkeasti nauranut sille.
Polidor. Minä annan anteiksi teille lapsukaiset! kaikesta
sydämestäni. Oppikaatte waan siitä, ett'ette wasta kajoa asioihin,
joita teidän ymmärryksenne ei käsitä. Kas, Henrik! Mistä sinä tulet?

Henrik (tulee). Oh, herra! onko se totta?
Polidor. Mikä on totta?
Henrik. Että herra osaa tehdä kultaa.
Polidor. Niin, Henrik, aiwan oikein. Mutta mistä sinä olet saanut
tietää sen?
Henrik. Hoho! koko kaupunki tietää sen. Minä kuulin sitä ensin
tässä likellä taimikauppiaalta, hän pyrki senkin seitsemän kertaa
minun suosioni ja hywätahtoisuuteni turwiin, kaatoi suuren pullon
täyteen donskoita minulle, eikä tahtonut penniäkään siitä, waikka
tuo nälkäinen koira ennen ei antanut minulle wiinaryyppyäkään,
ennenkuin panin rahat pöydälle.
Polidor. Kas, sellainen on maailma; niin pian kuin jollekin käypi
hywin, pyrkiwät kaikki hänelle ystäwiksi. Mutta mistä sellainen tulee
niin pian tiedyksi?
Henrik. Juutalainen lienee sanonut sitä ensin yhdelle, ja kun yksi
kerran saapi tietää jotakin, menee se paikalta ympäri kaupunkia.
Kaikki ihmiset, jotka näkiwät minun kadulla, terwehtiwät minua,
niinkuin minä olisin ollut kokonainen kreiwi; ja Risto Woikukkanen,
joka ennen ei ole ollut näkewinänsäkään minua, kumarsi niin
sywään, että milt'ei kaatunut katuojaan, — mutta minä, näettekös,
menin hänen siwuitsensa yhtä pönäkästi, kuin hän ennen on mennyt
minun siwuitseni.
Polidor. Eiköhän siellä kolkuteta. Henrik! mene owelle.
Henrik (awaa owea, waan lyöpi sen heti kiini taas). Siellä on
Leanderi herra, joka pyrkii Teidän luokse kunniaterweisille.

Polidor. Oho! Oikeinko totta? Se mies on ennen halweksinut minua
kaikin tawoin.
Leonora. Sano, Henrik, että me emme laske puheillemme sellaisia
ihmisiä!
Polidor. Ei niin, eukkoseni! Malttakaamme mieltämme
onnellisuudessamme.
Tulkaan waan sisälle.
Kolmastoista kohtaus.
    Leander. Entiset. Sitten kolme wierasta herraa.
    Leanderin rouwa ja kolme wierasta rouwaa.
Leander. Ah, armahin herra Polidorini! Minun sanoiksi saamaton
iloni on nähdä teitä hywässä terweydessä. Minä en woi sanoilla
kuwailla sitä harrasta halua, kuin minulla on ollut saada nähdä teitä.
Polidor. Sitäpä en ole huomannut; sillä minä olen monta kertaa
käynyt tawoittamassa teitä, waan te ette muka ole olleet kotona, ja
kadulla ette ole koskaan tahtoneet terwehtiä minua.
Leander. Armahin herra Polidorini! Te teette minulle,
wähäarwoisimmalle palwelijallenne, wäärin; sillä Jumala tietää, että
koko maailmassa ei ole ketään toista ihmistä, jota minä kunnioitan
niin paljon.
Polidor. Arwoisa herra! Olkaa wähemmällä siewistelemisellä — —

Henrik. Ei, herra! Kyllä minä uskon, että Leanderi herra on teidän
ystäwänne; sillä saatuansa tietää, mikä onni teille oli tapahtunut,
muuttui hän niin, että tahtoo haleta pelkästä rakkaudesta teitä
kohtaan.
Leander. Minä wannon, kaiken pyhyyden nimessä, että minä olen
entiselläni, ja se onni, joka on tapahtunut teille, arwoisa herra, ei
ollenkaan ole syynä minun tänne tulooni. Minä olen aina pitänyt teitä
parempana kaikkia muita tuttawiani, ja paras onni minulle olisi olla
teidän wähäarwoisin turwalaisenne.
Polidor. Saattaa olla, herra! Nyt minun täytyy mennä pois teidän
luotanne; minulla on wähän toimituksia.
(Menee sisälle).
Leander (suutelee Henrikkiä). Ah, weikkoseni monsieur von
Henrik!
Ruwetkaa minulle ystäwäksi ja puolustajaksi. Lupaatteko?
Henrik. No miksi ei? Serviteur treshumble! (Suutelewat toinen
toisiansa. Kolme wierasta herraa tulee, ylpeissä pukineissa, tähdet
rinnoissa, he syleilewät Henrikkiä ja menewät Leanderin kanssa
sisälle Polidorin sisähuoneesen).
Leanderin rouwa (tulee ja suutelee Leonoran esiliinaa).
Leonora. Ohoh, rouwaseni! Mistä tulee tuo suuri nöyryys?
Rouwa. Ah, armollinen rouwa! Woipiko kukaan olla liian nöyrä niin
arwokasta waimo-ihmistä kohtaan, kuin te olette? Loistaahan teistä
waloa kaikille näillä seuduilla.

Leonora. No, ennen on teillä ollut ihan toisellaisia ajatuksia
minusta.
Rouwa. Minä en rohkene wannoa teidän edessänne, armollinen
rouwa, — waan jos tohtisin, niin saattaisin kalleimmalla walallani
wakuuttaa, että — — (Kolme wierasta rouwaa tulee sisälle ja
suutelewat hekin Leonoran esiliinaa).
Leonora. Hywät rouwat! Menkäämme toiseen huoneesen. Koko
maailma näkyy pyrkiwän tänne sisälle; siis emme saata olla
kauemmin täällä eteisessä.
(Henrik ja Nilla jääwät kahden kesken; ne kolme herraa
tulewat takaisin ulos, kumarrellen Polidorille, joka on
sisäpuolella, niin että eräs heistä kaatuu päin lattialle;
suutelewat Henrikkiä ja ottawat nöyrät jäähywäiset,
ennenkuin lähtewät; rouwat tekewät samoin Nillalle ja
suutelewat hänen kättänsä).
Nilla. Eikö tämä ole oiwallista, Henrik? Nuo kolme rouwaa
suuteliwat jokainen minun kättäni.
Henrik. No, johan sinä sait kunnioitusta enemmän kuin minä, sillä
herrat suuteliwat minua suulle waan.
Nilla. Tämä historia kuwailee kyllin maailman menoa.
Henrik. Niin, kuka olisi uskonut, että ylhäiset rouwat suutelisiwat
mokoman hetaleen kättä, kuin sinä olet?
Nilla. Sanopa sitä! ja että mokomat ylhäiset herrat suutelisiwat
sellaista törkysuuta, kuin sinun on. (Katsoo ikkunasta). Mutta tuolla

seisattuu koko joukko waunuja; meille tulee wielä muitakin kunnia-
terweisille.
Scene Fourteen.
Henrik. Nilla. Two gentlemen. Two ladies. Then two poets.
Two gentlemen (enter with two ladies). Might we beg the favor of a
word with the master?
Henrik. Good people! I do not know whether the master and mistress
are receiving. Wait here, though, until they come out. (Places them
on one side. Meanwhile two men in black clothes enter). What manner
of men are you?
Poet. We are poets.
Henrik. Good! You come just when needed. My cat died yesterday, and
I should like to have some fine verses made for it.
Poet. Only command our humble talent, sir.
Henrik (aside). The devil take you, you dog-snouts! (Aloud).
What other business do you have?
Poet. We have a few humble verses in honor of the master and
mistress.
Henrik. Good! Stand over there at the side until the master and
mistress come out. Won't you dogs take those hats off your heads!
Don't you know what kind of house you are in? (They take off their
hats and look down modestly. Henrik walks back and forth). Listen,
men! How many verses can you make in a day?
Poet (bowing). Genius does the work.
Henrik. Can you find me a rhyme for the words Henrik Lassinpoika?
Poet. That is a little difficult.
Henrik. Well, then you must be poor scoundrels of poets! Can you
also make verses in prose?
Poet. No, sir! That is against nature.
Henrik. What, against nature? Are you mocking me, you blockheads?
Poet. Not at all.
Henrik. How many feet are there in a verse? I have forgotten that
book-learning again.
Poet. That depends on the kind of verse.
Henrik. What are you mumbling about! Aren't all verses the same
length? But why have you never before made verses in honor of the
master and mistress, though they have always deserved praise just as
much as they do now?
Poet. Because we have not had the good fortune to know their good
qualities until now.
Henrik. Say rather that you did not care to know them until you
heard what good fortune had befallen them. If the master follows my
advice, then upon my word, one poet shall be hanged every day until
the whole breed is wiped out root and branch. But here come the
master and mistress; now you shall hear what they themselves have to
say.
Scene Fifteen.
The same. Polidor. Leonora. Then the innkeeper and the Jew.
Polidor (enters with Leonora, both in their finery). Henrik! Run at
once and fetch me four marks' worth of the same powder, so that you
need not go so often.
Henrik. At once, sir!
(Exits).
Polidor. What business have you, good people? Do you wish to speak
with me? (The visitors step forward, bowing humbly, and say they
have come, some to pay their humble respects, others to hear whether
the master has any commands; the ladies do the same to Leonora and
kiss her apron; the poets then step forward and offer their papers).
What papers are these?
Poets. A few humble verses in honor of the master and mistress.
Polidor. Listen, good people, every one of you! When I was in
difficulties, and it was thought that long and fruitless labor had
left me utterly ruined, you saw nothing good in me; you held my house
cheap and mocked me to my face with your talk. But now, when my work
has succeeded and my house is blessed with riches, you see with your
bare eyes what before you could not see even through spectacles. If I
were now the greatest fool, you would take me for a Solomon; if I
were the ugliest of men, you would call me an Absalom; were I the
worst of all, I should still be the best in your eyes. Such is the
world nowadays; it honors no one but the man who is fortunate. Oh!
when fortune turns away, love and respect vanish with it. Do not
think me so stupid, however, that I fail to see your falseness;
for — —
Henrik (comes back). Oh, dear master! What sorcery can be at work
now? Before, I got a whole load of Arabian powder for fifty pennies,
but now I cannot get a scrap of it, though I should offer a barrel of
gold.
Polidor. What are you saying?
Henrik. Wherever I went, at the market and at the apothecary's, they
laughed at me and said I was talking rubbish.
Polidor. God in heaven! What marvels are these!
Henrik. I fear, sir, that we have been cheated; for everyone says
that no such Arabian powder exists at all. — Well, the deuce! What
can that man want? (The innkeeper bursts in, dressed as a cook). Are
you mad, man? Do you dare to run into a noble gentleman's house
looking like that?
Innkeeper. Oh dear, oh dear! It would not vex me so much if it had
not been an inherited goblet.
Polidor. It seems to be the cook who lives over on the other side.
See him home so that he may calm down again; the poor man has gone
out of his wits.
Innkeeper. Oh! If only it had not borne my dear parents' name!
Polidor. I am truly sorry, for he is a capable man and as good a
cook as any.
Innkeeper. The devil took the spoon as well, — and looking more
closely, other things may have gone with it too.
Henrik. Listen, Master Risto! Have you been mad for long? (Aside).
If I had a good cudgel to frighten him with, the man would soon be
cured.
Innkeeper. Such a thing is enough to drive anyone mad.
Polidor. What is the matter, my good Master Risto?
Innkeeper. A guest who said he was a gold-maker has run off from my
house and took a silver goblet and a silver spoon with him. I
trusted him as an honest man, since I heard that he visited Master
Polidor's house. Just before he left, there was with him a one-eyed
devil in long black clothes.
Henrik. Did he have a slouch cap on his head?
Innkeeper. He did, and a black patch over his eye.
Henrik. Aha, — — We have been caught in the same trap. The very man
who sold me the Arabian powder.
Polidor. Woe is me! There went my four thousand thalers.
    (The visitors and the poets put their hats back on and
    strut insolently about the floor).
Jew (enters). Is de gold-master here? I gave him de costly jewels,
and he got my money.
Polidor. It is well that you too have been cheated; it was you who
brought him here to me.
Jew. Oh, is he a swindler? Ai, ai, ai! My costly jewels!
Polidor. Of my four thousand thalers I have nothing left but a
little scrap of paper with a few Arabic words written on it, which I
was to read aloud while boiling the gold.
(Takes the paper from his breast).
One of the visitors. Show it here! I understand a little Arabic.
(Looks at the paper). This is not Arabic, nor will it ever become
Arabic. What the deuce is this? When I read the last word backwards,
it says Fool. Let us look at the first. Here it is, sure enough:
gold-makers are swindlers, and you are a fool. Ho-ho-ho.
    (The visitors go out laughing. The poets bow to Polidor,
    turning their backs on him).
Leonora. Woe is me! And I even gave that devil my best ring.
Nilla. Oh! I am not so vexed about the keepsake coin I gave him as
that I kissed that wretch's dirty hand.
Polidor. Let us go inside. We shall move to the country and live
there on the small estate we still have left. I will never again
have anything to do with gold-making, but leave it to my worst
enemy. It has brought poverty upon me, and upon many other good
people besides. Good people, learn from this and other such examples
to be on your guard!
    (All go out, the Jew and the innkeeper lamenting).
(The End).
