Dr. Darinka Dentcheva Awarded NSF Grant to Aid High-Stakes Decision Making

Mathematical models quantify risk over time as they predict future states of real-world systems.

Hoboken, NJ, October 04, 2013 --(PR.com)-- Dr. Darinka Dentcheva of the Department of Mathematical Sciences at Stevens Institute of Technology has spent years applying her expertise in stochastic optimization to create mathematical models that aid decision-making in a wide variety of real-world situations. These types of models have been applied to problems like generating electric power efficiently, making a profit in financial markets, and assessing risk in medical operations. She has recently been awarded a grant by the National Science Foundation (NSF) to study optimal control of multi-dimensional dynamic stochastic systems with risk aversion. Working with frequent collaborator Andrzej Ruszczyński of Rutgers University, Dr. Dentcheva is advancing research on risk-averse discrete-time models and developing a general methodology for incorporating risk models into continuous-time optimal control problems of Markov structure.

“Mathematical models for optimization and risk analysis inform decisions in government and industry that help to protect and enrich our lives, guiding many aspects of policy in energy production and distribution, telecommunication, insurance and finance, logistics, medicine, security and the military,” says Dr. Michael Bruno, Dean of the Charles V. Schaefer, Jr. School of Engineering and Science. “Dr. Dentcheva’s numerous grants and publications on the subject demonstrate that she is at the forefront of this important field of research.”

Dr. Dentcheva’s latest efforts entail the application of risk aversion in problems involving controlled Markov chains, which are a type of stochastic process named after Andrey Markov, the Russian mathematician who first investigated them. Markov chains have states that change over time, with the defining property that the probability distribution of the next state depends only on the present state and the chosen control, not on the past history of the system. Markov systems have numerous applications in the modeling of real-world processes.
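The Markov property described above can be illustrated with a minimal sketch. The three-state system, the control names, and the transition probabilities below are all hypothetical examples, not models from the research itself; the point is only that sampling the next state requires nothing beyond the current state and the chosen control.

```python
import random

# Hypothetical controlled Markov chain with 3 states (0, 1, 2).
# For each control, a transition matrix gives P(next state | current state).
TRANSITIONS = {
    "maintain": [[0.9, 0.1, 0.0],   # rows: current state; columns: next state
                 [0.2, 0.7, 0.1],
                 [0.0, 0.3, 0.7]],
    "replace":  [[1.0, 0.0, 0.0],
                 [0.9, 0.1, 0.0],
                 [0.8, 0.2, 0.0]],
}

def step(state, control, rng):
    """Sample the next state: only the current state and control matter."""
    row = TRANSITIONS[control][state]
    return rng.choices(range(len(row)), weights=row)[0]

rng = random.Random(0)
state = 0
for t in range(5):
    control = "maintain" if state < 2 else "replace"  # a simple policy
    state = step(state, control, rng)
```

A policy, as in the example's last lines, is just a rule mapping the observed state to a control; the evolution of the chain then depends on both the policy and the random transitions.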

In most decision and control problems under uncertainty that arise in the real world, decisions have to be made over long periods of time. Every choice that is made, as well as random factors in the environment, affects the evolution of the system and creates a need for new decisions. Therefore, a policy has to be designed that incorporates rules for responding to future states of the system. So far, most theoretical models of such control processes have been based on average performance criteria. In some cases, however, e.g., when making decisions about medical procedures like organ transplants, it may be more important to avoid the worst possible outcome for patients than it is to ensure a high average performance or success rate. Dr. Dentcheva’s research therefore incorporates crucial considerations about the risk of highly undesirable scenarios, establishing mathematical models for quantifying risk in dynamical systems that evolve in a continuous way. It also provides methods to determine the best policy when risk aversion is essential.
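The contrast between an average criterion and a risk-averse one can be made concrete with a toy comparison. The two policies and their outcome values below are invented for illustration; the risk-averse criterion shown is a simple lower-tail average (in the spirit of Conditional Value-at-Risk), one common risk measure, and is not claimed to be the measure used in the project.

```python
def mean(outcomes):
    """Average performance criterion over equally likely outcomes."""
    return sum(outcomes) / len(outcomes)

def lower_tail_average(outcomes, alpha=0.2):
    """Risk-averse criterion: average of the worst alpha-fraction of outcomes."""
    k = max(1, int(len(outcomes) * alpha))
    return sum(sorted(outcomes)[:k]) / k

# Hypothetical outcome values (higher is better) under two policies.
policy_a = [12, 12, 12, 12, -30]   # better on average, catastrophic worst case
policy_b = [3, 3, 3, 3, 3]         # lower average, no bad outcomes

# The average criterion prefers A; the risk-averse criterion prefers B.
```

Here the average criterion ranks policy A higher (mean 3.6 vs. 3), yet policy A admits a disastrous outcome that the risk-averse criterion penalizes heavily, so the two criteria select different policies.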

Dr. Dentcheva and Dr. Ruszczyński will face three major challenges during the course of their project. First, they will develop proper mathematical tools for measuring risk in a time-consistent manner, suitable for continuous-time Markov systems. Second, the investigators will develop optimality theory for control problems involving time-consistent dynamic models of risk. This includes the analysis of the structure of the control models, the existence of solutions, and the properties of the solutions. Finally, when developing risk-averse control models, researchers must ensure that the problems can be solved numerically in an efficient way.
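One standard way to obtain time consistency, sketched below under stated assumptions, is to apply a one-step risk mapping recursively through the stages of the process (a "nested" risk measure) rather than measuring the risk of the total cost once at the end. The scenario tree, the mean-upper-semideviation mapping, and the parameter `kappa` are illustrative choices, not the specific constructions of this project.

```python
def one_step_risk(costs, kappa=0.5):
    """Mean-upper-semideviation over equally likely costs:
    rho(Z) = E[Z] + kappa * E[max(Z - E[Z], 0)]."""
    m = sum(costs) / len(costs)
    return m + kappa * sum(max(c - m, 0) for c in costs) / len(costs)

def nested_risk(node):
    """Evaluate risk backwards through a scenario tree.
    node = (immediate_cost, children); a leaf has an empty children list.
    Applying the one-step mapping stage by stage makes the evaluation
    time-consistent: the risk seen at any node depends only on the subtree."""
    cost, children = node
    if not children:
        return cost
    return cost + one_step_risk([nested_risk(ch) for ch in children])

# Two-stage tree: cost 1.0 now, then equally likely future costs 0.0 or 4.0.
tree = (1.0, [(0.0, []), (4.0, [])])
```

Evaluating the tree backwards gives 1.0 plus the risk-adjusted value of the two leaves, which exceeds the plain expected cost because the dispersion around the mean is penalized.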

According to Dr. Alexei Miasnikov, Director of the Department of Mathematical Sciences, “Dr. Dentcheva’s longstanding commitment to impactful research in stochastic systems has produced tremendously valuable results. Her latest work on Markov systems promises to continue her extensive record of excellence in collaborative research and advance the analysis of stochastic systems.”

Dr. Darinka Dentcheva is the Director of the graduate program for Analysis and Optimization of Stochastic Systems, a unique interdisciplinary program providing a solid and rigorous education in the analysis and optimization of complex stochastic systems under uncertainty and risk. She is a member of the Society for Industrial and Applied Mathematics and serves as Associate Editor of the SIAM Journal on Optimization, the SIAM Review, and the journal “Control, Optimisation and Calculus of Variations,” which is part of the European Series on Applied and Industrial Mathematics. In addition, she is a member of the publication committee of the Mathematical Optimization Society, a past member of the Stochastic Programming Committee of the Mathematical Optimization Society, and a reviewer for Mathematical Reviews.

About the Department of Mathematical Sciences
The Stevens Department of Mathematical Sciences provides students with a background vital for pursuing a job in industry, while also offering students the depth and rigor required for graduate studies in mathematics or related fields. Students discover how scientific and engineering problems have often inspired new developments in mathematics, and, conversely, how mathematical results have frequently had an impact on business, engineering, the sciences, and technology. The Department’s areas of emphasis are stochastic systems and optimization, and algebraic cryptography.

Learn more: http://www.stevens.edu/ses/math/
Contact
Stevens Institute of Technology
Christine del Rosario
201-216-5561
http://buzz.stevens.edu/news.php?id=4043