The exam is open book, including handouts. It is closed notes. You may use a calculator.
Put all of your work on this test form (use the back if necessary). Show your work or give an explanation of your answer. No credit for numbers with no indication of where they came from.
The points for the questions total 200. There are 8 problems.
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables and, as usual, X̄n denotes the sample mean. What is the asymptotic distribution of X̄n? You must give the parameters of the asymptotic distribution as functions of the mean and variance of the Xi for full credit.
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables. Find a method of moments estimator of the unknown parameter and its asymptotic distribution. You must give the parameters of the asymptotic distribution as functions of the unknown parameter for full credit.
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables. Perform an asymptotic (large sample) test of the hypotheses corresponding to sample size n = 100, the observed sample mean, and sample variance S2n = 0.036. Give the P-value for the test and also say whether H0 is accepted or rejected at the 0.05 level of significance.
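The null value and the observed sample mean are not shown above, but the mechanics of the large-sample test can be sketched. A minimal Python check (the course uses R, but the arithmetic is identical), with mu0 and xbar as purely hypothetical stand-ins:

```python
from math import sqrt
from scipy.stats import norm

# Given in the problem
n = 100
S2n = 0.036

# HYPOTHETICAL values, not from the exam: the null mean and the
# observed sample mean were omitted in this copy of the problem.
mu0 = 0.5
xbar = 0.56

# Large-sample (asymptotic) test statistic: (xbar - mu0) / sqrt(S2n / n)
z = (xbar - mu0) / sqrt(S2n / n)

# Two-tailed P-value from the standard normal reference distribution
p_value = 2 * norm.sf(abs(z))
print(z, p_value)

# Decision rule: reject H0 at the 0.05 level when p_value < 0.05
```

With these hypothetical numbers z = sqrt(10) ≈ 3.162, so H0 would be rejected at the 0.05 level.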
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables, and we observe a sample of size n = 16. We want to do a Bayesian analysis with a prior distribution for the unknown parameter. Find a 95% HPD region for it.
[25 pts.] In Problem 6-6 in the notes, part of the posted solution was

xlow <- ifelse(x < 11, x - 11, 0)
xhig <- ifelse(x < 11, 0, x - 11)
out <- lm(y ~ xlow + xhig)
summary(out)

Recall that this fits a regression model whose regression function is piecewise linear in x with a break (knot) at x = 11.
> out.too <- lm(y ~ x)
> anova(out.too, out)
Analysis of Variance Table

Model 1: y ~ x
Model 2: y ~ xlow + xhig
  Res.Df Res.Sum Sq Df Sum Sq F value   Pr(>F)
1     19    195.966
2     18    123.005  1 72.961  10.677 0.004277 **

Also explain why these are nested models and what conclusion about the fit of these two models can be drawn from the printout.
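The F statistic in the printout can be reproduced from the two residual sums of squares alone. A quick check in Python (the exam itself uses R; this is just the nested-model F arithmetic):

```python
from scipy.stats import f

# Residual sums of squares and residual degrees of freedom from the printout
rss_small, df_small = 195.966, 19   # Model 1: y ~ x
rss_big, df_big = 123.005, 18       # Model 2: y ~ xlow + xhig

# F statistic for comparing nested models:
# (drop in RSS per extra parameter) / (error mean square of the big model)
F = ((rss_small - rss_big) / (df_small - df_big)) / (rss_big / df_big)

# P-value from the F distribution with (1, 18) degrees of freedom
p_value = f.sf(F, df_small - df_big, df_big)
print(round(F, 3), round(p_value, 6))
```

The computed values agree with the printout (F ≈ 10.677, P ≈ 0.004277), confirming how anova() builds the table.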
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables and the prior distribution of the unknown parameter is given. Find the posterior distribution of the parameter.
[25 pts.] Suppose we have regression data with variables x and y and fit a quadratic model
out <- lm(y ~ x + I(x^2))
options(show.signif.stars=FALSE)
summary(out)

getting the following (partial) output
Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -0.544281   0.767327  -0.709    0.488
x            1.216011   0.168287   7.226 1.42e-06
I(x^2)      -0.011464   0.007784  -1.473    0.159

Residual standard error: 1.031 on 17 degrees of freedom
Multiple R-Squared: 0.9723, Adjusted R-squared: 0.969

Give a 90% confidence interval for the coefficient of x^2 in the regression function.
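The interval follows from the estimate and standard error in the printout together with a t critical value on 17 degrees of freedom. A quick numeric check in Python (the exam uses R; same arithmetic):

```python
from scipy.stats import t

# From the summary() printout: estimate and standard error of the
# I(x^2) coefficient, and the residual degrees of freedom
est, se, df = -0.011464, 0.007784, 17

# 90% CI: estimate +/- t_{0.95, 17} * SE
tcrit = t.ppf(0.95, df)
lo, hi = est - tcrit * se, est + tcrit * se
print(round(lo, 5), round(hi, 5))
```

Since the interval straddles zero, it is consistent with the non-significant t value (-1.473, P = 0.159) shown for I(x^2).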
[25 pts.] Suppose X1, X2, …, Xn are i.i.d. random variables with density