Decision, Optimization and Learning at the California Institute of Technology

Core Faculty


Yisong Yue, Director
Professor Yue's research interests lie primarily in the theory and application of statistical machine learning. He is particularly interested in developing novel methods for spatiotemporal reasoning, structured prediction, interactive learning systems, and learning with humans in the loop. In the past, his research has been applied to information retrieval, recommender systems, text classification, learning from rich user interfaces, analyzing implicit human feedback, data-driven animation, sports analytics, policy learning in robotics, and adaptive routing & allocation problems.


Anima Anandkumar will join DOLCIT in Summer 2017. Her research interests are in the areas of large-scale machine learning, non-convex optimization and high-dimensional statistics. In particular, she has been spearheading the development and analysis of spectral methods that involve matrix and tensor decompositions. She is currently a principal scientist at Amazon Web Services, where she explores the intersection of theory and practice in large-scale machine learning.
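
As a toy illustration of the spectral viewpoint (this is not code from the group; the synthetic data model and dimensions are invented for the example), the span of a few latent components can be estimated from an empirical second-moment matrix via a singular value decomposition:

    # Illustrative sketch only: recover the span of k latent components
    # from noisy observations using the top singular vectors of the
    # empirical second-moment matrix (synthetic data, made-up sizes).
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 2000, 50, 3
    W = rng.standard_normal((d, k))                  # latent components
    Z = rng.standard_normal((n, k))                  # latent weights
    X = Z @ W.T + 0.1 * rng.standard_normal((n, d))  # noisy observations

    M2 = X.T @ X / n                 # empirical second moment
    U, s, _ = np.linalg.svd(M2)      # spectral decomposition
    subspace = U[:, :k]              # estimated span of the components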


Joel Burdick focuses on robotics, kinematics, mechanical systems and control. Active research areas include: robotic locomotion, sensor-based motion planning algorithms, multi-fingered robotic manipulation, applied nonlinear control theory, neural prosthetics, and medical applications of robotics.



Venkat Chandrasekaran's research interests broadly lie in mathematical optimization and its application to the information sciences. He seeks a deeper understanding of the power as well as the limitations of convex optimization, with a focus on the development of efficient algorithms for challenging problems in statistical signal processing. A recurring theme in his work is the prominent role played by various notions of algebraic structure in explaining the effectiveness of convex relaxation methods.
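
A canonical example of this program, quoted here only as an illustration, is the relaxation of the intractable affine rank-minimization problem to a nuclear-norm problem; the relaxation is convex because the nuclear norm is the convex envelope of the rank function on the spectral-norm ball:

    \min_{X} \ \operatorname{rank}(X) \ \text{ s.t. } \ \mathcal{A}(X) = b
    \quad\longrightarrow\quad
    \min_{X} \ \|X\|_{*} \ \text{ s.t. } \ \mathcal{A}(X) = b,
    \qquad \|X\|_{*} = \sum_{i} \sigma_{i}(X).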


Mani Chandy works in two related areas: (1) sense & respond systems that detect and respond to critical events and (2) methods for developing reliable distributed systems, particularly distributed control systems. This work develops theory and builds prototypes, and is interdisciplinary, straddling control theory, stochastic processes and statistics, optimization and game theory, and temporal logic.



Babak Hassibi's research is in communications, information theory, signal processing, and control. He is currently most interested in various information-theoretic and algorithmic aspects of wireless communications, especially wireless networks. Other interests include adaptive signal processing and neural networks; blind channel equalization; statistical signal processing; robust estimation and control, especially connections between robustness and adaptation; and linear algebra, with emphasis on fast algorithms, random matrices and group representation theory.

Katrina Ligett's research is centered on algorithms, particularly online algorithms, algorithmic game theory, and data privacy. Her work in game theory explores how simple modeling and learning assumptions about rational agents shape the behavior of the complex systems in which they participate. Her data privacy work seeks to provide a mathematical framework for understanding the fundamental tensions and tradeoffs involved in the use of databases of sensitive information.
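
One standard formalization in this area (differential privacy, stated here only for illustration) makes the privacy–utility tradeoff precise: a randomized mechanism M is ε-differentially private if, for all databases D and D' differing in a single record and all outcome sets S,

    \Pr[M(D) \in S] \;\le\; e^{\epsilon} \, \Pr[M(D') \in S].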

Pietro Perona is interested in vision: how useful information about the environment may be computed from pictures. He is interested both in biological vision (perception, neural computation) and in computer vision (representations, algorithms). Current work includes visual recognition, the perception of behavior, and the control of behavior from visual input.


Andrew Stuart's research is focused on the development of foundational mathematical and algorithmic frameworks for the seamless integration of models with data. He works in the Bayesian formulation of inverse problems for differential equations, and in data assimilation for dynamical systems.
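
As a schematic example of the Bayesian formulation (standard notation, not tied to any specific project), an unknown u is inferred from data y = G(u) + η with Gaussian noise η ~ N(0, Γ); the posterior μ^y reweights a prior μ₀ by the data misfit:

    \frac{d\mu^{y}}{d\mu_{0}}(u) \;\propto\;
    \exp\!\Big( -\tfrac{1}{2}\,\big\| \Gamma^{-1/2}\big( y - G(u) \big) \big\|^{2} \Big).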





Omer Tamuz studies how people exchange information and learn from each other, using game-theoretic, probabilistic, and statistical tools. He is also interested in machine learning, and in particular in how machines and people can learn, work, and play together.




Joel Tropp invents and analyzes algorithms for fundamental problems in linear algebra, optimization, and machine learning. He also develops mathematics to make this analysis more transparent. His current interests include constrained matrix factorization, random matrix theory, and universality laws for statistical problems.
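
A representative result in this direction, quoted here in one common form purely for illustration, is the matrix Bernstein inequality: for independent, centered random matrices X₁, …, Xₙ of size d₁ × d₂ with ‖X_k‖ ≤ L,

    \Pr\Big[ \Big\| \sum_{k} X_{k} \Big\| \ge t \Big]
    \;\le\; (d_{1} + d_{2}) \exp\!\left( \frac{-t^{2}/2}{\sigma^{2} + Lt/3} \right),
    \qquad
    \sigma^{2} = \max\Big\{ \Big\| \sum_{k} \mathbb{E}[X_{k} X_{k}^{*}] \Big\|,\;
    \Big\| \sum_{k} \mathbb{E}[X_{k}^{*} X_{k}] \Big\| \Big\}.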



Adam Wierman's research interests center on resource allocation and scheduling decisions in computer systems and services. More specifically, his work focuses both on developing analytic techniques in stochastic modeling, queueing theory, scheduling theory, and game theory, and on applying these techniques to application domains such as energy-efficient computing, data centers, social networks, and the electricity grid.
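
A textbook building block for this kind of analysis (the M/M/1 queue, shown only as an illustration) already captures how response time degrades as load approaches capacity: with Poisson arrivals at rate λ and exponential service at rate μ > λ,

    \mathbb{E}[T] = \frac{1}{\mu - \lambda},
    \qquad
    \mathbb{E}[N] = \frac{\rho}{1 - \rho}, \quad \rho = \frac{\lambda}{\mu},

and Little's law ties the two together via E[N] = λ E[T].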

Affiliated Faculty at Caltech

Affiliated Groups

Postdoctoral Fellows

Current

Recent Alumni

Graduate Students

Current

Recent Alumni

  • Elizabeth Bodine-Baron, RAND Corporation
  • Subhonmesh Bose, Assistant Professor, University of Illinois Urbana-Champaign
  • Agostino Capponi, Assistant Professor, Columbia University
  • Matthew Faulkner
  • Alex Gittens, AMPLab, UC Berkeley
  • Na Li, Assistant Professor, Harvard University
  • Minghong Lin, Facebook
  • Zhenhua Liu, Assistant Professor, Stony Brook University
  • Michael McCoy, Qadium
  • Judy Xingyue Mou, Google
  • Peter Stobbe, Optiver