Comprehensive and accessible guide to the three main approaches to robust control design and its applications
Optimal control is a mathematical field concerned with control policies that can be deduced using optimization algorithms. The optimal control approach to robust control design differs from the more commonly discussed direct approaches to robust control: it first translates the robust control problem into its optimal control counterpart, and then solves the optimal control problem.
Robust Control Design: An Optimal Control Approach offers a complete presentation of this approach to robust control design, presenting modern control theory in a concise manner. The other two major approaches to robust control design, the H-infinity approach and the Kharitonov approach, are also covered and described in the simplest terms possible, in order to provide a complete overview of the area. It includes up-to-date research, and offers both theoretical and practical applications that include flexible structures, robotics, and automotive and aircraft control.
Robust Control Design: An Optimal Control Approach will be of interest to those needing an introductory textbook on robust control theory, design and applications, as well as graduate and postgraduate students involved in systems and control research. Practitioners will also find the applications presented useful when solving practical problems in the engineering field.