# Bertsekas

### From Wikimization


# Dimitri P. Bertsekas

Dimitri Bertsekas is an applied mathematician and computer scientist, and a professor in the Department of Electrical Engineering and Computer Science at the Massachusetts Institute of Technology (MIT), Cambridge, Massachusetts.

## Biographical sketch

Dimitri P. Bertsekas was born in Greece and lived his childhood there. He studied for five years at the National Technical University of Athens, Greece (a time that, by his account, was spent mostly playing poker and chess, and dating his future wife Ioanna), for about a year and a half at George Washington University, Washington, DC (at night, while working as a research engineer), and for about two years at MIT, where he obtained his doctorate in system science. He also taught for three years at the Engineering-Economic Systems Dept. of Stanford University, and for five years at the Electrical and Computer Engineering Dept. of the University of Illinois at Urbana-Champaign.

He is known for his research work, and for his fourteen textbooks and monographs on theoretical and algorithmic optimization and control, and on applied probability. His work ranges from theoretical and foundational research, to algorithmic analysis and design for optimization problems, to applications such as data communication and transportation networks and electric power generation. He is featured among the top 100 most-cited computer science authors in the CiteSeer academic database and digital library. In 1995, he co-founded the publishing company Athena Scientific, which, among other titles, publishes most of his books.

In the late 1990s Bertsekas developed a strong interest in digital photography. His photographs have been exhibited on several occasions at M.I.T., and can also be accessed from his website http://web.mit.edu/dimitrib/www/home.html.

## Awards and honors list

Bertsekas was awarded the INFORMS 1997 Prize for Research Excellence in the Interface Between Operations Research and Computer Science for his book "Neuro-Dynamic Programming" (co-authored with J. N. Tsitsiklis); the 2000 Greek National Award for Operations Research; and the 2001 ACC John R. Ragazzini Education Award for outstanding contributions to education. In 2001, he was elected to the US National Academy of Engineering for "pioneering contributions to fundamental research, practice and education of optimization/control theory, and especially its application to data communication networks" (election citation by the National Academy of Engineering).

## Free Books to Download

- Network Optimization
- Data Networks
- Approximate Dynamic Programming
- Constrained Optimization and Lagrange Multiplier Methods
- Parallel and Distributed Computation: Numerical Methods
- Stochastic Optimal Control: The Discrete-Time Case

## Textbooks and research monographs

Bertsekas' textbooks include:

- Dynamic Programming and Optimal Control (1996)
- Data Networks (1989, co-authored with Robert G. Gallager)
- Nonlinear Programming (1996)
- Introduction to Probability (2003, co-authored with J. N. Tsitsiklis)

All of these are widely used for classroom instruction at many universities, including MIT, have been published in multiple editions, and have been translated into foreign languages.

He has also written several widely referenced research monographs, which collectively contain most of his research. These include:

- Stochastic Optimal Control: The Discrete-Time Case (1978, co-authored with S. E. Shreve), a mathematically complex work, establishing the measure-theoretic foundations of dynamic programming and stochastic control.

- Constrained Optimization and Lagrange Multiplier Methods (1982), the first monograph that addressed comprehensively the algorithmic convergence issues around augmented Lagrangian and sequential quadratic programming methods.

- Parallel and Distributed Computation: Numerical Methods (1989, co-authored with J. N. Tsitsiklis), which, among other contributions, established the fundamental theoretical structures for the analysis of distributed asynchronous algorithms.

- Linear Network Optimization (1991) and Network Optimization: Continuous and Discrete Models (1998), which, among other topics, comprehensively discuss the class of auction algorithms for assignment and network flow optimization, developed by Bertsekas over a period of 20 years starting in 1979.

- Neuro-Dynamic Programming (1996, co-authored with J. N. Tsitsiklis), which laid the theoretical foundations for suboptimal approximations of highly complex sequential decision-making problems.

- Convex Analysis and Optimization (2002, co-authored with A. Nedic and A. Ozdaglar) and Convex Optimization Theory (2009), which provided a new line of development for optimization duality theory, a new connection between the theory of Lagrange multipliers and nonsmooth analysis, and a comprehensive development of incremental subgradient methods.
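To give a flavor of the auction algorithms mentioned in the network optimization monographs above, here is a minimal sketch for the symmetric assignment problem: each unassigned person bids for its best object, raising that object's price by the margin over the second-best choice plus a small increment eps. This is a deliberately simplified single-scale variant for illustration, not the eps-scaled form developed in the books; the function and variable names are our own.

```python
import math

def auction_assignment(benefit, eps=None):
    """Assign n persons to n objects, maximizing total benefit, via a
    naive single-scale auction.  benefit[i][j] is the value of object j
    to person i.  With integer benefits, any eps < 1/n yields an
    optimal assignment."""
    n = len(benefit)
    if eps is None:
        eps = 1.0 / (n + 1)
    prices = [0.0] * n
    owner = [None] * n       # owner[j]: person currently holding object j
    assigned = [None] * n    # assigned[i]: object held by person i
    unassigned = list(range(n))
    while unassigned:
        i = unassigned.pop()
        # Net value of each object to person i at current prices
        values = [benefit[i][j] - prices[j] for j in range(n)]
        best_j = max(range(n), key=values.__getitem__)
        best_v = values[best_j]
        second_v = max((values[j] for j in range(n) if j != best_j),
                       default=-math.inf)
        # Bid: raise the price just enough that person i stays
        # (almost) indifferent between best and second-best objects
        prices[best_j] += best_v - second_v + eps
        # Displace the previous owner of the object, if any
        if owner[best_j] is not None:
            prev = owner[best_j]
            assigned[prev] = None
            unassigned.append(prev)
        owner[best_j] = i
        assigned[i] = best_j
    return assigned, prices
```

For example, `auction_assignment([[10, 5], [7, 3]])` returns the assignment `[0, 1]` (person 0 gets object 0, person 1 gets object 1), the maximum-benefit matching with total value 13.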

## External links

- Publications from Google Scholar.
- Publications from DBLP.
- Biography from National Academy of Engineering
- Bertsekas' home page at MIT
- Athena Scientific
- Laboratory for Information and Control Systems, MIT
- Department of Electrical Engineering and Computer Science, MIT


## VITA

Dimitri P. Bertsekas received a combined B.S.E.E. and B.S.M.E. from the National Technical University of Athens, Greece, an M.S.E.E. from George Washington University, and a Ph.D. in system science from the Massachusetts Institute of Technology.

Dr. Bertsekas has held faculty positions with the Engineering-Economic Systems Dept. at Stanford University (1971-1974) and the Electrical Engineering Dept. of the University of Illinois, Urbana (1974-1979).

Since 1979 he has been teaching at the Electrical Engineering and Computer Science Department of the Massachusetts Institute of Technology (M.I.T.), where he is currently McAfee Professor of Engineering.

He consults regularly with private industry and has held editorial positions in several journals.

His research at M.I.T. spans several fields, including optimization, control, large-scale computation, and data communication networks, and is closely tied to his teaching and book authoring activities.

He has written numerous research papers, and thirteen books, several of which are used as textbooks in MIT classes.

Professor Bertsekas was awarded the INFORMS 1997 Prize for Research Excellence in the Interface Between Operations Research and Computer Science
for his book *Neuro-Dynamic Programming* (co-authored with John Tsitsiklis), the 2000 Greek National Award for Operations Research, and the 2001 ACC John R. Ragazzini Education Award.

In 2001, he was elected to the United States National Academy of Engineering.

Dr. Bertsekas' recent books are *Introduction to Probability* (2002), *Convex Analysis and Optimization* (2003), and *Dynamic Programming and Optimal Control: 3rd Edition* (2007), all published by Athena Scientific.

He has written a new book on Convex Optimization Theory:

## *Convex Optimization Theory*, Dimitri P. Bertsekas, Athena Scientific

### Excerpt from the **Preface**:

This textbook aims to provide a simple, intuitive, and mathematically rigorous introduction to convexity theory and its connections to optimization. The book evolved from an earlier (2003) book of the author on the subject (written in collaboration with Angelia Nedić and Asuman Ozdaglar), but has different objectives and substantially different content. A major focus of the 2003 book was to bridge the gap between convex and nonconvex optimization, by placing special emphasis on the convex analysis topics that bear a connection to both. By contrast, the present book provides a simpler, more focused exposition, by concentrating exclusively on convexity, and developing in greater depth core subjects such as polyhedral convexity, subdifferential theory, and conjugate function theory. The methodology and choice of topics are in the spirit and tradition of Fenchel's 1951 lecture notes and Rockafellar's 1970 text on convex analysis: the material and the ideas are similar, but the exposition, while on occasion less comprehensive, is more accessible, and includes much theory and algorithms developed in the long period since the appearance of Fenchel's and Rockafellar's seminal works.

Despite the differences, the styles and levels of mathematical sophistication of the present book and the 2003 book are similar. In particular, both books place primary emphasis on theory, rather than algorithms and applications. Also, both books use geometric visualization as a principal tool for exposition, and use end-of-chapter exercises to extend the range of exposition with substantial theoretical and algorithmic results. Detailed solutions of all the exercises are posted on the book's web page.

Another common stylistic element of the two books is the close link between the theoretical treatment of convexity and its application to optimization. For example, simultaneously with the development of some of the basic facts about convexity in Chapters 1 and 2, we discuss elementary optimality conditions, and the question of existence of optimal solutions; in Chapter 3, after presenting the theory of hyperplane separation, we develop some of its applications to duality and saddle point theory; in Chapter 4, after the discussion of polyhedral convexity, we discuss its application in linear, integer, and convex programming; and in Chapter 6, after the discussion of subgradients, we discuss their use in optimality conditions and minimization algorithms. We follow this style in the remaining chapters, although having developed in Chapters 1-6 most of the needed convexity theory, the discussion in Chapters 7-8 is more heavily weighted towards optimization.

Some of the novel expository features of the 2003 book have been maintained in the present version as well. In particular, minimax theory and constrained optimization duality have been developed in a unified way, as special cases of the duality between two simple but fundamental geometrical problems: the min common point problem and the max crossing point problem. The min common/max crossing framework is essentially a geometric version of convex conjugate function theory, but is simpler and unencumbered by functional descriptions. In several contexts, including duality, it provides a powerful and insightful analytical machinery, which is far more convenient than conjugate function theory. The min common/max crossing framework is also useful in the analysis and interpretation of conjugate functions, theorems of the alternative, and subdifferential theory.
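For concreteness, the two geometrical problems of this framework can be stated compactly (this is a sketch following the standard formulation, with $M$ a given subset of $\mathbb{R}^{n+1}$):

```latex
% Min common point problem: find the lowest point of M on the (n+1)-st axis
w^* \;=\; \inf_{(0,\,w)\,\in\, M} w

% Max crossing point problem: over hyperplanes with normal (\mu, 1) that
% support M from below, maximize the crossing level on the (n+1)-st axis
q(\mu) \;=\; \inf_{(u,\,w)\,\in\, M} \bigl\{\, w + \mu' u \,\bigr\},
\qquad
q^* \;=\; \sup_{\mu \in \mathbb{R}^n} q(\mu)

% Weak duality always holds:  q^* \le w^*.
% The framework studies when strong duality, q^* = w^*, obtains.
```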

Another feature shared with the 2003 book is the unified approach for developing conditions for existence of solutions of convex optimization problems, conditions for the minimax equality to hold, and conditions for the absence of a duality gap in constrained optimization. This unification is traced to a simple and fundamental issue: the question whether a nested family of closed sets has a nonempty intersection.

To provide an alternative, more accessible path to convexity, the most advanced sections have been marked with a star, and can be skipped at first reading. The principle followed here is that "non-starred" sections do not make use of material in "starred" sections.

An instructor who wishes to teach a course from the book can choose among a few different plans:

- Cover the entire book. This requires a full semester course at a fast pace.

- Cover primarily the first six chapters, emphasizing convexity theory, yet maintaining substantial optimization content.

- Cover the entire book, but omit the starred sections (or just summarize them). This is the option followed by the author in his one-semester MIT class (course slides available on the Internet).

Other plans may include some applications or some additional theoretical topics of the instructor's choice.
