Open Access
Bootstrap Methods: Another Look at the Jackknife
B. Efron
Ann. Statist. 7(1): 1-26 (January, 1979). DOI: 10.1214/aos/1176344552

Abstract

We discuss the following problem: given a random sample $\mathbf{X} = (X_1, X_2, \cdots, X_n)$ from an unknown probability distribution $F$, estimate the sampling distribution of some prespecified random variable $R(\mathbf{X}, F)$, on the basis of the observed data $\mathbf{x}$. (Standard jackknife theory gives an approximate mean and variance in the case $R(\mathbf{X}, F) = \theta(\hat{F}) - \theta(F), \theta$ some parameter of interest.) A general method, called the "bootstrap," is introduced, and shown to work satisfactorily on a variety of estimation problems. The jackknife is shown to be a linear approximation method for the bootstrap. The exposition proceeds by a series of examples: variance of the sample median, error rates in a linear discriminant analysis, ratio estimation, estimating regression parameters, etc.
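The bootstrap procedure described above can be sketched in a few lines: draw repeated samples of size $n$ with replacement from the observed data (i.e., from the empirical distribution $\hat{F}$), recompute the statistic on each resample, and use the spread of the replicates to estimate the sampling variance. The sketch below applies this to the paper's first example, the variance of the sample median; the sample data, the number of replications `B`, and the helper name `bootstrap_variance` are illustrative choices, not taken from the paper.

```python
import random
import statistics

def bootstrap_variance(x, stat, B=2000, seed=0):
    """Estimate Var(stat(X)) by resampling x with replacement B times.

    Illustrative sketch of the bootstrap idea: each replication draws n
    points from the empirical distribution (x with replacement) and
    recomputes the statistic.
    """
    rng = random.Random(seed)
    n = len(x)
    reps = [stat([x[rng.randrange(n)] for _ in range(n)]) for _ in range(B)]
    mean = sum(reps) / B
    return sum((r - mean) ** 2 for r in reps) / (B - 1)

# Hypothetical data: a small sample from an unknown distribution F.
sample = [3.2, 1.9, 4.7, 2.8, 5.1, 3.9, 2.2, 4.4, 3.5, 6.0]
var_median = bootstrap_variance(sample, statistics.median)
```

The same routine estimates the variance of any plug-in statistic by swapping in a different `stat`; the paper's point is that this one resampling scheme handles cases (like the median) where no simple closed-form variance formula exists.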

Citation


B. Efron. "Bootstrap Methods: Another Look at the Jackknife." Ann. Statist. 7 (1) 1 - 26, January, 1979. https://doi.org/10.1214/aos/1176344552

Information

Published: January, 1979
First available in Project Euclid: 12 April 2007

zbMATH: 0406.62024
MathSciNet: MR515681
Digital Object Identifier: 10.1214/aos/1176344552

Subjects:
Primary: 62G05
Secondary: 62G15 , 62H30 , 62J05

Keywords: bootstrap, discriminant analysis, error rate estimation, jackknife, nonlinear regression, nonparametric variance estimation, resampling, subsample values

Rights: Copyright © 1979 Institute of Mathematical Statistics
