Mathematical Theory of Control Systems Design

Give, and it shall be given unto you. ST. LUKE, VI, 38.

The book is based on several courses of lectures on control theory and applications which were delivered by the authors for a number of years at the Moscow Electronics and Mathematics University. The book, originally written in Russian, was first published by the Vysshaya Shkola (Higher School) Publishing House in Moscow in 1989. In preparing a new edition of the book we planned to make only minor changes in the text. However, we soon realized that we, like many scholars working in control theory, had learned many new things and gained many new insights into control theory and its applications since the book was first published. We therefore rewrote the book especially for the English edition, so this is substantially a new book with many new topics. The book consists of an introduction and four parts. Part One deals with the fundamentals of modern stability theory: general results concerning stability and instability, sufficient conditions for the stability of linear systems, methods for determining the stability or instability of systems of various types, and theorems on stability under random disturbances.




The many interesting topics covered in Mathematical Theory of Control Systems Design are spread over an Introduction and four parts. Each chapter concludes with a brief review of the main results and formulae, and each part ends with an exercise section.
Part One treats the fundamentals of modern stability theory. Part Two is devoted to the optimal control of deterministic systems. Part Three is concerned with problems of the control of systems under random disturbances of their parameters, and Part Four provides an outline of modern numerical methods of control theory. The many examples included illustrate the main assertions, teaching the reader the skills needed to construct models of relevant phenomena, to design nonlinear control systems, to explain the qualitative differences between various classes of control systems, and to apply what they have learned to the investigation of particular systems.
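To give a flavor of the material in Part One, the sketch below checks one classical stability criterion: a linear system x' = Ax is asymptotically stable if and only if every eigenvalue of A has a negative real part. The matrix used here is an illustrative example, not one taken from the book.

```python
import numpy as np

# Illustrative matrix (not from the book): companion form of the
# damped oscillator x'' + 3x' + 2x = 0. The system x' = A x is
# asymptotically stable iff all eigenvalues of A have negative real part.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])

eigenvalues = np.linalg.eigvals(A)
print(np.sort(eigenvalues.real))           # -> [-2. -1.]
print(bool(np.all(eigenvalues.real < 0)))  # -> True
```

For linear time-invariant systems this eigenvalue test is equivalent to the Lyapunov-function approach developed in the book, but it is the simplest criterion to verify numerically.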
Audience: This book will be valuable to both graduate and postgraduate students in such disciplines as applied mathematics, mechanics, engineering, automation and cybernetics.


Content:
Front Matter....Pages i-xxiii
Front Matter....Pages 1-1
Continuous and Discrete Deterministic Systems....Pages 3-71
Stability of Stochastic Systems....Pages 73-103
Front Matter....Pages 125-125
Description of Control Problems....Pages 127-150
The Classical Calculus of Variations and Optimal Control....Pages 151-181
The Maximum Principle....Pages 183-237
Linear Control Systems....Pages 239-300
Dynamic Programming Approach. Sufficient Conditions for Optimal Control....Pages 301-341
Some Additional Topics of Optimal Control Theory....Pages 343-373
Front Matter....Pages 391-391
Control of Stochastic Systems. Problem Statements and Investigation Techniques....Pages 393-434
Optimal Control on a Time Interval of Random Duration....Pages 435-466
Optimal Estimation of the State of the System....Pages 467-507
Optimal Control of the Observation Process....Pages 509-542
Front Matter....Pages 553-553
Linear Time-Invariant Control Systems....Pages 555-579
Numerical Methods for the Investigation of Nonlinear Control Systems....Pages 581-618
Numerical Design of Optimal Control Systems....Pages 619-639
Back Matter....Pages 657-671
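As an illustration of the kind of numerical design treated in Part Four, the sketch below computes a linear-quadratic regulator gain by fixed-point iteration of the discrete-time Riccati equation. The system matrices are hypothetical and not taken from the book; the iteration itself is a standard textbook technique.

```python
import numpy as np

# Hypothetical discretized double integrator (not from the book).
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
B = np.array([[0.0],
              [0.1]])
Q = np.eye(2)          # state cost
R = np.array([[1.0]])  # control cost

# Iterate P <- Q + A'PA - A'PB (R + B'PB)^{-1} B'PA until convergence.
P = np.copy(Q)
for _ in range(1000):
    BtP = B.T @ P
    K = np.linalg.solve(R + BtP @ B, BtP @ A)  # feedback gain
    P_next = Q + A.T @ P @ (A - B @ K)
    if np.max(np.abs(P_next - P)) < 1e-12:
        P = P_next
        break
    P = P_next

# With u_k = -K x_k, the closed loop x_{k+1} = (A - BK) x_k should be
# stable: all eigenvalues strictly inside the unit circle.
closed_loop_eigs = np.linalg.eigvals(A - B @ K)
print(np.max(np.abs(closed_loop_eigs)) < 1.0)  # -> True
```

In practice one would use a dedicated Riccati solver (e.g. SciPy's `solve_discrete_are`), but the plain iteration above makes the numerical structure of the design problem visible.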

