
Ebook: Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach


We wrote this book to introduce graduate students and research workers in various scientific disciplines to the use of information-theoretic approaches in the analysis of empirical data. These methods allow the data-based selection of a “best” model and a ranking and weighting of the remaining models in a pre-defined set. Traditional statistical inference can then be based on this selected best model. However, we now emphasize that information-theoretic approaches allow formal inference to be based on more than one model (multimodel inference). Such procedures lead to more robust inferences in many cases, and we advocate these approaches throughout the book. The second edition was prepared with three goals in mind. First, we have tried to improve the presentation of the material. Boxes now highlight essential expressions and points. Some reorganization has been done to improve the flow of concepts, and a new chapter has been added. Chapters 2 and 4 have been streamlined in view of the detailed theory provided in Chapter 7. Second, concepts related to making formal inferences from more than one model (multimodel inference) have been emphasized throughout the book, but particularly in Chapters 4, 5, and 6. Third, new technical material has been added to Chapters 5 and 6. Well over 100 new references to the technical literature are given. These changes result primarily from our experiences while giving several seminars, workshops, and graduate courses on material in the first edition.




This book is unique in that it covers the philosophy of model-based data analysis and a strategy for the analysis of empirical data. The book introduces information-theoretic approaches and focuses critical attention on a priori modeling and the selection of a good approximating model that best represents the inference supported by the data. Kullback-Leibler information represents a fundamental quantity in science and is Hirotugu Akaike's basis for model selection. The maximized log-likelihood function can be bias-corrected to provide an estimate of expected, relative Kullback-Leibler information. This leads to Akaike's Information Criterion (AIC) and various extensions. These are relatively simple and easy to use in practice. The information-theoretic approaches provide a unified and rigorous theory, an extension of likelihood theory, an important application of information theory, and are objective and practical to employ across a very wide class of empirical problems. Model selection, under the information-theoretic approach presented here, attempts to identify the (likely) best model, orders the models from best to worst, and measures the plausibility ("calibration") that each model is really the best as an inference. Model selection methods are extended to allow inference from more than a single "best" model. The book presents several new approaches to estimating model selection uncertainty and incorporating selection uncertainty into estimates of precision. An array of examples is given to illustrate various technical issues. This is an applied book written primarily for biologists and statisticians using models for making inferences from empirical data. People interested in the empirical sciences will find this material useful as it offers an alternative to hypothesis testing and Bayesian approaches.
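The AIC construction described above, and the ranking and weighting of models it supports, can be sketched in a few lines of Python. This is only an illustrative sketch of the standard formulas (AIC = −2 log L + 2K, and Akaike weights computed from AIC differences); the log-likelihoods and parameter counts below are hypothetical, not taken from the book.

```python
import math

def aic(log_likelihood, k):
    """Akaike's Information Criterion: AIC = -2 log(L) + 2K,
    where K is the number of estimable parameters in the model."""
    return -2.0 * log_likelihood + 2.0 * k

def akaike_weights(aic_values):
    """Akaike weights: the relative plausibility that each model
    in the candidate set is the best (K-L closest) model."""
    best = min(aic_values)
    deltas = [a - best for a in aic_values]      # Delta_i = AIC_i - AIC_min
    rel = [math.exp(-0.5 * d) for d in deltas]   # relative likelihoods of models
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical (maximized log-likelihood, K) pairs for three candidate models
models = [(-45.2, 3), (-44.9, 5), (-47.8, 2)]
aics = [aic(ll, k) for ll, k in models]
weights = akaike_weights(aics)   # weights sum to 1; larger = more plausible
```

The weights can then serve as the basis for multimodel inference, for example by model-averaging an estimate across the candidate set rather than relying on the single best model.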


Content:
Front Matter....Pages I-XXVI
Introduction....Pages 1-48
Information and Likelihood Theory: A Basis for Model Selection and Inference....Pages 49-97
Basic Use of the Information-Theoretic Approach....Pages 98-148
Formal Inference From More Than One Model: Multimodel Inference (MMI)....Pages 149-205
Monte Carlo Insights and Extended Examples....Pages 206-266
Advanced Issues and Deeper Insights....Pages 267-351
Statistical Theory and Numerical Results....Pages 352-436
Summary....Pages 437-454
Back Matter....Pages 455-488

