This textbook, based on courses taught at Rutgers University, introduces the core concepts and results of Control and System Theory in a self-contained and elementary fashion. Unique in its emphasis on foundational aspects, it is intended for a rigorous, proof-oriented course aimed at advanced undergraduate or beginning graduate students. Since the necessary techniques are developed from scratch, the only background assumed is basic mathematics. An introductory chapter describes the main contents of the book in an intuitive and informal manner and gives the reader a valuable perspective on modern control theory. While linear systems are the focus of much of the presentation, most definitions and many results are given in a far more general framework. And though mostly elementary, the text includes illustrations of how techniques from Lie groups, nonlinear analysis, commutative algebra, and other areas of "pure" mathematics are applied in control. With its complete and entirely self-contained presentation, an extensive (almost 400 entries) up-to-date bibliography, and a detailed index, Mathematical Control Theory will also serve as an excellent research reference.

The book covers the algebraic theory of linear systems (including controllability, observability, feedback equivalence, families of systems, controlled invariant subspaces, realization, and minimality), stability via Lyapunov as well as input/output methods, ideas of optimal control, observers and dynamic feedback, parameterization of stabilizing controllers, tracking, Kalman filtering (introduced through a deterministic version of "optimal observation"), and basic facts about frequency-domain methods such as the Nyquist criterion.

Several nonlinear topics, such as Volterra series, smooth feedback stabilization, and finite-experiment observability, as well as many results in automata theory of relevance for discrete-event control, are also included. The text highlights the distinctions and the similarities between continuous- and discrete-time systems, as well as the sampling process that relates them.
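As a brief orientation to the kind of material listed above (this sketch is standard textbook background, not quoted from the book itself), the finite-dimensional linear systems in question have the form

\[
\dot{x}(t) = A\,x(t) + B\,u(t), \qquad y(t) = C\,x(t),
\]

with state \(x \in \mathbb{R}^{n}\), input \(u \in \mathbb{R}^{m}\), and output \(y \in \mathbb{R}^{p}\). Controllability of such a system is characterized by the classical Kalman rank condition

\[
\operatorname{rank}\bigl[\,B \;\; AB \;\; \cdots \;\; A^{n-1}B\,\bigr] = n,
\]

and observability by the dual rank condition on the pair \((C, A)\).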