1 Bootstrap and Jackknife
  1.1 The Bootstrap
    1.1.1 Bootstrap Estimation of Standard Error
    1.1.2 Bootstrap Estimation of Bias
  1.2 Jackknife
  1.3 Jackknife-after-Bootstrap
  1.4 Bootstrap Confidence Intervals
    1.4.1 The Standard Normal Bootstrap Confidence Interval
    1.4.2 The Percentile Bootstrap Confidence Interval
    1.4.3 The Basic Bootstrap Confidence Interval
    1.4.4 The Bootstrap t Interval
  1.5 Better Bootstrap Confidence Intervals
  1.6 Application: Cross-Validation
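To make the bootstrap standard-error idea of 1.1.1 concrete, here is a minimal sketch (in Python rather than the R used in these notes; the data and function names are illustrative, not from the source): resample the data with replacement B times, recompute the statistic on each resample, and take the standard deviation of the replicates.

```python
import random
import statistics

def bootstrap_se(data, stat, B=1000, seed=1):
    """Bootstrap SE: resample `data` with replacement B times,
    recompute `stat` on each resample, return the SD of the replicates."""
    rng = random.Random(seed)
    n = len(data)
    replicates = [
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(B)
    ]
    return statistics.stdev(replicates)

# Illustrative data; the bootstrap SE of the mean should be close to s / sqrt(n).
data = [2.1, 3.5, 4.0, 1.7, 2.9, 3.3, 4.8, 2.2]
se_hat = bootstrap_se(data, statistics.mean)
```

The same resampling loop, with the statistic replaced, also yields the bias estimate of 1.1.2 (mean of replicates minus the observed statistic).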
1 Methods for Generating Random Variables
  1.1 Generating Uniform(0,1) Random Numbers
  1.2 Random Generators of Common Probability Distributions in R
    1.2.1 The Inverse Transform Method
    1.2.2 The Acceptance-Rejection Method
    1.2.3 Transformation Methods
    1.2.4 Sums and Mixtures
  1.3 Multivariate Distributions
    1.3.1 Multivariate Normal Distribution
    1.3.2 Mixtures of Multivariate Normals
    1.3.3 Wishart Distribution
    1.3.4 Uniform Distribution on the d-Sphere
  1.4 Stochastic Processes
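As an illustration of the inverse transform method of 1.2.1 (a Python sketch, though the notes work in R; the function name is my own): if U ~ Uniform(0,1), then F⁻¹(U) has distribution F. For the Exponential(rate) distribution, F⁻¹(u) = −log(1 − u)/rate.

```python
import math
import random

def rexp_inverse(n, rate=1.0, seed=1):
    """Inverse transform method: for U ~ Uniform(0,1),
    F^{-1}(U) = -log(1 - U) / rate is Exponential(rate)."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / rate for _ in range(n)]

sample = rexp_inverse(10_000, rate=2.0)  # sample mean should be near 1/rate = 0.5
```

Using 1 − U rather than U keeps the argument of the logarithm strictly positive, since `random()` can return 0 but never 1.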
1 Monte Carlo Methods in Inference
  1.1 Monte Carlo Methods for Estimation
    1.1.1 Monte Carlo Estimation and Standard Error
    1.1.2 Estimation of MSE
  1.2 Estimating a Confidence Level
  1.3 Monte Carlo Methods for Hypothesis Tests
  1.4 Empirical Type I Error Rate
    1.4.1 Power of a Test
    1.4.2 Power Comparisons
  1.5 Application: “Count Five” Test for Equal Variance
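The basic estimator of 1.1.1 can be sketched as follows (Python rather than R, with illustrative names): estimate E[g(X)] by the sample mean of g over m independent draws, and attach the usual standard error of a sample mean.

```python
import math
import random

def mc_estimate(g, draw, m=10_000, seed=1):
    """Plain Monte Carlo: estimate E[g(X)] by the sample mean of g over
    m independent draws, and report its estimated standard error."""
    rng = random.Random(seed)
    vals = [g(draw(rng)) for _ in range(m)]
    est = sum(vals) / m
    se = (sum((v - est) ** 2 for v in vals) / (m * (m - 1))) ** 0.5
    return est, se

# Example: theta = E|Z| for Z ~ N(0,1); the exact value is sqrt(2/pi).
est, se = mc_estimate(abs, lambda rng: rng.gauss(0.0, 1.0))
```

The empirical Type I error rate of 1.4 is the same recipe with g as the indicator that a test rejects under the null.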
1 EM Optimization Method
  1.1 EM Algorithm
  1.2 Convergence
  1.3 Usage in Exponential Families
  1.4 Usage in Finite Normal Mixtures
  1.5 Variance Estimation
    1.5.1 Louis Method
    1.5.2 SEM Algorithm
    1.5.3 Bootstrap Method
    1.5.4 Empirical Information
  1.6 EM Variants
    1.6.1 Improving the E Step
    1.6.2 Improving the M Step
  1.7 Pros and Cons
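A minimal sketch of EM for a two-component normal mixture, the setting of 1.4 (Python rather than R; the starting values and synthetic data are my own illustrative choices, not from the notes): the E step computes each point's responsibility under component 1, and the M step re-estimates the weight, means and variances from those responsibilities.

```python
import math
import random

def norm_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_two_normals(data, iters=200):
    """EM for a two-component normal mixture: E step computes
    responsibilities, M step re-estimates all parameters."""
    mu1, mu2 = min(data), max(data)           # crude but workable start
    m = sum(data) / len(data)
    var1 = var2 = sum((x - m) ** 2 for x in data) / len(data)
    w = 0.5                                   # mixing weight of component 1
    for _ in range(iters):
        # E step: responsibilities r_i = P(component 1 | x_i)
        r = []
        for x in data:
            p1 = w * norm_pdf(x, mu1, var1)
            p2 = (1 - w) * norm_pdf(x, mu2, var2)
            r.append(p1 / (p1 + p2))
        # M step: responsibility-weighted parameter updates
        n1 = sum(r)
        n2 = len(data) - n1
        w = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / n2
        var1 = sum(ri * (x - mu1) ** 2 for ri, x in zip(r, data)) / n1
        var2 = sum((1 - ri) * (x - mu2) ** 2 for ri, x in zip(r, data)) / n2
    return w, (mu1, var1), (mu2, var2)

# Synthetic data from an equal mixture of N(0,1) and N(5,1).
rng = random.Random(7)
data = [rng.gauss(0.0, 1.0) for _ in range(300)] + \
       [rng.gauss(5.0, 1.0) for _ in range(300)]
w, comp1, comp2 = em_two_normals(data)
```

The bootstrap variance estimate of 1.5.3 would rerun this fit on resampled data sets and take the variance of the resulting parameter estimates.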
1 Markov Chain Monte Carlo Methods
  1.4 The Gibbs Sampler
    1.4.1 The Slice Gibbs Sampler
  1.5 Monitoring Convergence
    1.5.1 Convergence Diagnostic Plots
    1.5.2 Monte Carlo Error
    1.5.3 The Gelman-Rubin Method
  1.6 WinBUGS Introduction
    1.6.1 Building Bayesian Models in WinBUGS
    1.6.2 Model Specification in WinBUGS
    1.6.3 Data and Initial Value Specification
    1.6.4 Compiling the Model and Simulating Values
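A standard textbook instance of the Gibbs sampler of 1.4, sketched here in Python (the notes use R/WinBUGS; the target and parameter choices are illustrative): for a standard bivariate normal with correlation rho, both full conditionals are known normals, so each iteration alternates two univariate draws.

```python
import random

def gibbs_bvn(n, rho=0.8, burn=500, seed=1):
    """Gibbs sampler for a standard bivariate normal with correlation rho,
    using the full conditionals X|Y=y ~ N(rho*y, 1-rho^2) and
    Y|X=x ~ N(rho*x, 1-rho^2)."""
    rng = random.Random(seed)
    sd = (1 - rho ** 2) ** 0.5
    x = y = 0.0
    chain = []
    for i in range(n + burn):
        x = rng.gauss(rho * y, sd)   # draw X from its full conditional
        y = rng.gauss(rho * x, sd)   # draw Y from its full conditional
        if i >= burn:                # discard the burn-in portion
            chain.append((x, y))
    return chain

chain = gibbs_bvn(5000)
```

The diagnostics of 1.5 would then be applied to this chain, e.g. trace plots of the x-coordinate or Gelman-Rubin statistics across several chains started from dispersed initial values.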