Ebook: Mathematical Control Theory: Deterministic Finite Dimensional Systems
Author: Eduardo D. Sontag (auth.)
- Tags: Systems Theory Control, Calculus of Variations and Optimal Control, Optimization, Control Robotics Mechatronics
- Series: Texts in Applied Mathematics 6
- Year: 1990
- Publisher: Springer-Verlag New York
- Edition: 1
- Language: English
- pdf
Mathematics is playing an ever more important role in the physical and biological sciences, provoking a blurring of boundaries between scientific disciplines and a resurgence of interest in the modern as well as the classical techniques of applied mathematics. This renewal of interest, both in research and teaching, has led to the establishment of the series Texts in Applied Mathematics (TAM). The development of new courses is a natural consequence of a high level of excitement on the research frontier as newer techniques, such as numerical and symbolic computer systems, dynamical systems, and chaos, mix with and reinforce the traditional methods of applied mathematics. Thus, the purpose of this textbook series is to meet the current and future needs of these advances and to encourage the teaching of new courses. TAM will publish textbooks suitable for use in advanced undergraduate and beginning graduate courses, and will complement the Applied Mathematical Sciences (AMS) series, which will focus on advanced textbooks and research-level monographs.
Preface
This textbook introduces the basic concepts and results of mathematical control and system theory. Based on courses that I have taught during the last 15 years, it presents its subject in a self-contained and elementary fashion. It is geared primarily to an audience consisting of mathematically mature advanced undergraduate or beginning graduate students. In addition, it can be used by engineering students interested in a rigorous, proof-oriented systems course that goes beyond the classical frequency-domain material and more applied courses.
This textbook, based on courses taught at Rutgers University, introduces the core concepts and results of Control and System Theory in a self-contained and elementary fashion. Unique in its emphasis on foundational aspects, it is intended to be used in a rigorous, proof-oriented course for an audience of advanced undergraduate or beginning graduate students. Since the necessary techniques are developed from scratch, the only background assumed is basic mathematics. An introductory chapter describes the main contents of the book in an intuitive and informal manner and gives the reader a valuable perspective on modern control theory. While linear systems are the focus of much of the presentation, most definitions and many results are given in a far more general framework. And though mostly elementary, the text includes illustrations of how techniques from Lie groups, nonlinear analysis, commutative algebra, and other areas of "pure" mathematics are applied in control. With an emphasis on a complete and totally self-contained presentation, and containing an extensive (almost 400 entries) up-to-date bibliography and a detailed index, Mathematical Control Theory will also serve as an excellent research reference. The book covers the algebraic theory of linear systems, including controllability, observability, feedback equivalence, families of systems, controlled invariant subspaces, realization, and minimality; stability via Lyapunov as well as input/output methods; ideas of optimal control; observers and dynamic feedback; parameterization of stabilizing controllers; tracking; Kalman filtering (introduced through a deterministic version of "optimal observation"); and basic facts about the frequency domain, such as the Nyquist criterion. Several nonlinear topics, such as Volterra series, smooth feedback stabilization, and finite-experiment observability, as well as many results in automata theory of relevance for discrete-event control, are also included. The text highlights the distinctions and the similarities between continuous- and discrete-time systems, as well as the sampling process that relates them.
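To give a flavor of the kind of result the book develops (this sketch is not taken from the book), controllability of a linear system x' = Ax + Bu can be tested by the Kalman rank condition: the pair (A, B) is controllable exactly when the matrix [B, AB, ..., A^(n-1)B] has rank n. The short Python sketch below, with hypothetical example matrices and NumPy assumed, illustrates the test.

import numpy as np

def controllability_matrix(A, B):
    # Stack B, AB, ..., A^(n-1)B side by side.
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.hstack(blocks)

def is_controllable(A, B):
    # Kalman rank condition: full rank n means (A, B) is controllable.
    return np.linalg.matrix_rank(controllability_matrix(A, B)) == A.shape[0]

# Hypothetical example: a double integrator with a single input.
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])
print(is_controllable(A, B))  # True: rank [B, AB] = 2

By duality, observability of a pair (A, C) can be checked by applying the same test to (A^T, C^T).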
Content:
Front Matter....Pages i-xiii
Introduction....Pages 1-24
Systems....Pages 25-78
Reachability and Controllability....Pages 79-129
Feedback and Stabilization....Pages 131-188
Outputs....Pages 189-242
Observers and Dynamic Feedback....Pages 243-272
Optimal Control....Pages 273-318
Back Matter....Pages 319-396