Ebook: Neural Networks for Conditional Probability Estimation: Forecasting Beyond Point Predictions
Author: Dirk Husmeier PhD (auth.)
- Tags: Pattern Recognition, Statistical Physics Dynamical Systems and Complexity, Neurobiology
- Series: Perspectives in Neural Computing
- Year: 1999
- Publisher: Springer-Verlag London
- Edition: 1
- Language: English
- pdf
Conventional applications of neural networks usually predict a single value as a function of given inputs. In forecasting, for example, a standard objective is to predict the future value of some entity of interest on the basis of a time series of past measurements or observations. Typical training schemes aim to minimise the sum of squared deviations between predicted and actual values (the 'targets'), by which, ideally, the network learns the conditional mean of the target given the input. If the underlying conditional distribution is Gaussian, or at least unimodal, this may be a satisfactory approach. However, for a multimodal distribution, the conditional mean does not capture the relevant features of the system, and the prediction performance will, in general, be very poor. This calls for a more powerful and sophisticated model, which can learn the whole conditional probability distribution. Chapter 1 demonstrates that even for a deterministic system and 'benign' Gaussian observational noise, the conditional distribution of a future observation, conditional on a set of past observations, can become strongly skewed and multimodal. In Chapter 2, a general neural network structure for modelling conditional probability densities is derived, and it is shown that a universal approximator for this extended task requires at least two hidden layers. A training scheme is developed from a maximum likelihood approach in Chapter 3, and the performance of this method is demonstrated on three stochastic time series in Chapters 4 and 5.
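The failure mode described above can be illustrated with a minimal simulation (not from the book; the two-branch target below is an illustrative toy example): when the conditional distribution of the target is bimodal, the conditional mean that squared-error training converges to lies between the two modes, a value the target almost never takes.

```python
import numpy as np

rng = np.random.default_rng(0)
x = 2.0       # a fixed input value
n = 10_000    # samples of the target at this input

# Toy bimodal conditional distribution: with equal probability the
# target is +sin(x) or -sin(x), plus small Gaussian observational noise.
branch = rng.integers(0, 2, n)
y = np.where(branch == 1, np.sin(x), -np.sin(x)) + rng.normal(0.0, 0.05, n)

cond_mean = y.mean()            # what squared-error training would learn
modes = (np.sin(x), -np.sin(x)) # where the probability mass actually is

print(cond_mean)  # close to 0 -- between the modes at roughly +-0.91
```

The conditional mean minimises the expected squared error, yet it describes neither branch of the process; only a model of the full conditional density reveals the two modes.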
This volume presents a neural network architecture for the prediction of conditional probability densities - which is vital when carrying out universal approximation on variables which are either strongly skewed or multimodal. Two alternative approaches are discussed: the GM network, in which all parameters are adapted in the training scheme, and the GM-RVFL model, which draws on the random vector functional link net approach. Points of particular interest are:
- it examines the modifications to standard approaches needed for conditional probability prediction;
- it provides the first real-world test results for recent theoretical findings about the relationship between the generalisation performance of committees and the over-flexibility of their members.
This volume will be of interest to all researchers, practitioners and postgraduate / advanced undergraduate students working on applications of neural networks - especially those related to finance and pattern recognition.
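The GM ("Gaussian mixture") network mentioned above outputs, for each input, the parameters of a mixture of Gaussians that represents the conditional density p(y|x). A minimal sketch of the output-side computation follows; the function name and parameter layout are illustrative, not the book's notation, and in the actual model the priors, means and widths would be produced by the network for a given input x.

```python
import numpy as np

def mixture_density(y, priors, means, sigmas):
    """Evaluate a Gaussian-mixture conditional density at y.

    priors : mixing coefficients (non-negative, summing to 1)
    means  : component centres
    sigmas : component widths
    In a GM-style network these three vectors would be the network's
    outputs for the current input x; here they are supplied directly.
    """
    priors = np.asarray(priors, dtype=float)
    means = np.asarray(means, dtype=float)
    sigmas = np.asarray(sigmas, dtype=float)
    comps = priors / (np.sqrt(2 * np.pi) * sigmas) \
        * np.exp(-0.5 * ((y - means) / sigmas) ** 2)
    return comps.sum()

# A bimodal conditional density with modes near -1 and +1:
p_at_mode = mixture_density(1.0, [0.5, 0.5], [-1.0, 1.0], [0.1, 0.1])
p_at_mean = mixture_density(0.0, [0.5, 0.5], [-1.0, 1.0], [0.1, 0.1])
# The density is large at a mode and essentially zero at the
# conditional mean -- the quantity a squared-error network predicts.
```

Maximum likelihood training of such a model (Chapter 3 of the book) amounts to maximising the sum of log-densities of the training targets under this mixture.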
Content:
Front Matter....Pages i-xxiii
Introduction....Pages 1-19
A Universal Approximator Network for Predicting Conditional Probability Densities....Pages 21-37
A Maximum Likelihood Training Scheme....Pages 39-55
Benchmark Problems....Pages 57-67
Demonstration of the Model Performance on the Benchmark Problems....Pages 69-85
Random Vector Functional Link (RVFL) Networks....Pages 87-97
Improved Training Scheme Combining the Expectation Maximisation (EM) Algorithm with the RVFL Approach....Pages 99-119
Empirical Demonstration: Combining EM and RVFL....Pages 121-135
A Simple Bayesian Regularisation Scheme....Pages 137-145
The Bayesian Evidence Scheme for Regularisation....Pages 147-163
The Bayesian Evidence Scheme for Model Selection....Pages 165-177
Demonstration of the Bayesian Evidence Scheme for Regularisation....Pages 179-191
Network Committees and Weighting Schemes....Pages 193-201
Demonstration: Committees of Networks Trained with Different Regularisation Schemes....Pages 203-219
Automatic Relevance Determination (ARD)....Pages 221-227
A Real-World Application: The Boston Housing Data....Pages 229-250
Summary....Pages 251-253
Appendix: Derivation of the Hessian for the Bayesian Evidence Scheme....Pages 255-265
Back Matter....Pages 267-275