Ebook: A Distribution-Free Theory of Nonparametric Regression
- Tags: Statistical Theory and Methods
- Series: Springer Series in Statistics
- Year: 2002
- Publisher: Springer-Verlag New York
- Edition: 1
- Language: English
- Format: pdf
The regression estimation problem has a long history. Already in 1632 Galileo Galilei used a procedure which can be interpreted as fitting a linear relationship to contaminated observed data. Such fitting of a line through a cloud of points is the classical linear regression problem. A solution of this problem is provided by the famous principle of least squares, which was discovered independently by A. M. Legendre and C. F. Gauss and published in 1805 and 1809, respectively. The principle of least squares can also be applied to construct nonparametric regression estimates, where one does not restrict the class of possible relationships, and will be one of the approaches studied in this book. Linear regression analysis, based on the concept of a regression function, was introduced by F. Galton in 1889, while a probabilistic approach in the context of multivariate normal distributions was already given by A. Bravais in 1846. The first nonparametric regression estimate of local averaging type was proposed by J. W. Tukey in 1947. The partitioning regression estimate he introduced, by analogy to the classical partitioning (histogram) density estimate, can be regarded as a special least squares estimate.
This book provides a systematic in-depth analysis of nonparametric regression with random design. It covers almost all known estimates, such as classical local averaging estimates (including kernel, partitioning, and nearest neighbor estimates), least squares estimates using splines, neural networks and radial basis function networks, penalized least squares estimates, local polynomial kernel estimates, and orthogonal series estimates. The emphasis is on distribution-free properties of the estimates; most consistency results are valid for all distributions of the data. Whenever it is not possible to derive distribution-free results, as in the case of the rates of convergence, the emphasis is on results which require as few constraints on the distributions as possible, on distribution-free inequalities, and on adaptation. The relevant mathematical theory is systematically developed and requires only a basic knowledge of probability theory. The book will be a valuable reference for anyone interested in nonparametric regression and is a rich source of many useful mathematical techniques widely scattered in the literature. In particular, the book introduces the reader to empirical process theory, martingales, and approximation properties of neural networks.
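To make the "local averaging" terminology in the description concrete, here is a minimal univariate sketch (not taken from the book) of the three classical local averaging estimates it mentions: kernel, partitioning, and k-nearest-neighbor regression. The function names, the naive window kernel, and the toy data are all assumptions of this sketch, not the book's notation.

```python
# Illustrative sketch of local averaging regression estimates.
# Assumptions: univariate X, a naive (window) kernel, grid cells of width h.
import numpy as np

def kernel_estimate(x, X, Y, h):
    """Nadaraya-Watson estimate with a window kernel of bandwidth h."""
    w = (np.abs(X - x) <= h).astype(float)  # weight 1 for points within h of x
    return np.sum(w * Y) / np.sum(w) if w.sum() > 0 else 0.0

def partitioning_estimate(x, X, Y, h):
    """Average of the Y_i whose X_i fall in the same grid cell as x."""
    same_cell = np.floor(X / h) == np.floor(x / h)
    return Y[same_cell].mean() if same_cell.any() else 0.0

def knn_estimate(x, X, Y, k):
    """Average of the Y_i over the k nearest neighbors of x."""
    idx = np.argsort(np.abs(X - x))[:k]
    return Y[idx].mean()

# Toy usage: regress noisy observations of m(x) = x**2 on [0, 1].
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 200)
Y = X**2 + 0.1 * rng.standard_normal(200)
for f, param in [(kernel_estimate, 0.1), (partitioning_estimate, 0.1), (knn_estimate, 10)]:
    print(f.__name__, f(0.5, X, Y, param))  # each should be near m(0.5) = 0.25
```

All three are weighted averages of the observed Y_i with weights concentrated near the query point x; they differ only in how the neighborhood is chosen, which is exactly the viewpoint the book's early chapters develop.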
Content:
Front Matter....Pages i-xvi
Why Is Nonparametric Regression Important?....Pages 1-17
How to Construct Nonparametric Regression Estimates?....Pages 18-30
Lower Bounds....Pages 31-51
Partitioning Estimates....Pages 52-69
Kernel Estimates....Pages 70-85
k-NN Estimates....Pages 86-99
Splitting the Sample....Pages 100-111
Cross-Validation....Pages 112-129
Uniform Laws of Large Numbers....Pages 130-157
Least Squares Estimates I: Consistency....Pages 158-182
Least Squares Estimates II: Rate of Convergence....Pages 183-221
Least Squares Estimates III: Complexity Regularization....Pages 222-234
Consistency of Data-Dependent Partitioning Estimates....Pages 235-251
Univariate Least Squares Spline Estimates....Pages 252-282
Multivariate Least Squares Spline Estimates....Pages 283-296
Neural Networks Estimates....Pages 297-328
Radial Basis Function Networks....Pages 329-352
Orthogonal Series Estimates....Pages 353-379
Advanced Techniques from Empirical Process Theory....Pages 380-406
Penalized Least Squares Estimates I: Consistency....Pages 407-432
Penalized Least Squares Estimates II: Rate of Convergence....Pages 433-447
Dimension Reduction Techniques....Pages 448-458
Strong Consistency of Local Averaging Estimates....Pages 459-492
Semirecursive Estimates....Pages 493-511
Recursive Estimates....Pages 512-539
Censored Observations....Pages 540-563
Dependent Observations....Pages 564-588
Back Matter....Pages 589-647