Introduction
1. Maxwell — On Governors
2. Bateman — The Control of an Elastic Fluid
3. Bellman and Kalaba — The Work of Lyapunov and Poincaré
4. Hurwitz — On the Conditions Under Which an Equation Has Only Roots With Negative Real Parts
5. Nyquist — Regeneration Theory
6. Bode — Feedback — The History of an Idea
7. Van der Pol — Forced Oscillations in a Circuit with Non-linear Resistance (Reception with Reactive Triode)
8. Minorsky — Self-excited Oscillations in Dynamical Systems Possessing Retarded Action
9. Zadeh and Ragazzini — An Extension of Wiener's Theory of Prediction
10. LaSalle — Time Optimal Control Systems
11. Boltyanskii, Gamkrelidze, and Pontryagin — On the Theory of Optimal Processes
12. Bellman — On the Application of the Theory of Dynamic Programming to the Study of Control Processes
13. Bellman and Kalaba — Dynamic Programming and Adaptive Processes: Mathematical Foundation
This collection of historically and technically important papers follows a logical line of development from early work in mathematical control theory to studies in adaptive control processes. The book touches upon all the major themes: stability theory, feedback control, time lag, prediction theory, dynamic programming, "bang-bang" control, and maximum principles.
The book opens with J. C. Maxwell's "On Governors" and continues with "The Control of an Elastic Fluid" by H. Bateman; an essay by editors Bellman and Kalaba, "The Work of Lyapunov and Poincaré"; Hurwitz's "On the Conditions Under Which an Equation Has Only Roots With Negative Real Parts"; Nyquist's "Regeneration Theory"; "Feedback — The History of an Idea" by H. W. Bode; a paper on forced oscillations in a circuit by B. van der Pol; "Self-excited Oscillations in Dynamical Systems Possessing Retarded Action" by N. Minorsky; "An Extension of Wiener's Theory of Prediction" by Zadeh and Ragazzini; "Time Optimal Control Systems" by J. P. LaSalle; "On the Theory of Optimal Processes" by Boltyanskii, Gamkrelidze, and Pontryagin; Bellman's "On the Application of the Theory of Dynamic Programming to the Study of Control Processes"; and the editors' study "Dynamic Programming and Adaptive Processes: Mathematical Foundation." Each paper is introduced with a brief account of its significance and with some suggestions for further reading.
Dover (2017) republication of Selected Papers on Mathematical Trends in Control Theory, originally published by Dover Publications in 1964.
www.doverpublications.com
Richard Bellman (1920–1984) received his Ph.D. from Princeton and worked for many years at the RAND Corporation, where he developed the mathematical optimization method known as dynamic programming. He was Professor of Mathematics, Electrical Engineering, and Medicine at the University of Southern California, and in 1979 he received the IEEE Medal of Honor for his contributions to decision processes and control system theory. Dr. Bellman was one of the major advisors to Dover's math program in the 1950s–1960s.
Robert Kalaba (1926–2004) contributed 12 books and more than 600 articles to the scientific community, and he worked with Richard Bellman at the RAND Corporation. He served on the faculty of the University of Southern California as Professor of Electrical Engineering and Economics, and he founded the institution's Department of Biomedical Engineering.