Invited Speakers

Professor Venkat Anantharam (Keynote Session 1, Feb 20, 10 AM)


Professor, Electrical Engineering and Computer Science
University of California, Berkeley

“Information Theory of Coordination”

Venue: CSL B02

Abstract:

Problems of coordination between multiple agents have traditionally been studied by
the control community under the assumption of perfect communication channels between
the agents. Information theory, for its part, has largely focused on the
problem of communicating information in a probabilistically reliable sense between
the agents, without worrying about what the information is going to be used for.

In systems where the time scales of the eventual application
are not long enough to code away channel uncertainties,
where there is the possibility of deriving coordination
gains from a global view of the communication infrastructure (rather than from a collection
of point-to-point links), or where the information needs to be transferred
in a more precisely defined way (e.g., for strategic reasons), this separation of
roles between control and communication becomes inappropriate.

Building an information theory of coordination to deal with such scenarios is an area
rich with challenging open questions. We will survey this landscape in this talk.

Biography:

Venkat Anantharam received his B.Tech. in Electrical Engineering from the Indian Institute of Technology in 1980, and an M.S. in EE (1982), an M.A. in Mathematics (1983), a C.Phil. in Mathematics (1984), and a Ph.D. in EE (1986), all from UC Berkeley. Prior to joining the Berkeley EECS faculty in 1994, he was a member of the faculty at Cornell University.

Professor Justin Romberg (Keynote Session 2, Feb 20, 2 PM)


Associate Professor, Electrical and Computer Engineering
Georgia Institute of Technology

“Using structure to solve underdetermined systems of linear equations and overdetermined systems of quadratic equations”

Venue: CSL B02

Abstract:

We will start by discussing some of the fundamental results in the field that has come to be known as compressive sensing.  The central theme of this body of work is that underdetermined systems of linear equations can be meaningfully “inverted” if they have structured solutions.  Two examples of such structure are an unknown vector that is sparse (has only a few “active” entries) and an unknown matrix that is low-rank.  We will discuss some of the applications of this theory in signal processing and machine learning.
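The sparse-vector case the abstract mentions can be illustrated with a small basis-pursuit sketch: recover a sparse solution of an underdetermined system by minimizing the l1 norm subject to the linear constraints. The dimensions are hypothetical, and SciPy's generic LP solver stands in for the specialized algorithms the talk surveys; this is an illustration of the idea, not the speaker's method.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 50, 100, 4          # 50 equations, 100 unknowns, 4 active entries
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)       # sparse ground truth
b = A @ x_true                                 # underdetermined: m < n

# Basis pursuit:  min ||x||_1  s.t.  A x = b,
# written as a linear program via the split x = u - v with u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x_true)))          # recovery error
```

With m well above the sparsity level, the l1 solution typically coincides with the sparse ground truth even though the system has infinitely many solutions.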

In the second part of the talk, we will show how some of these structured recovery results give us new insights into solving systems of quadratic and bilinear equations.  In particular, we will show how recasting classical problems like channel separation and blind deconvolution as structured matrix factorizations suggests new ways to solve them.

Biography:

Dr. Justin Romberg is an Associate Professor in the School of Electrical and Computer Engineering at the Georgia Institute of Technology.  Dr. Romberg received the B.S.E.E. (1997), M.S. (1999) and Ph.D. (2004) degrees from Rice University in Houston, Texas.  From Fall 2003 until Fall 2006, he was a Postdoctoral Scholar in Applied and Computational Mathematics at the California Institute of Technology.  In the Fall of 2006, he joined the Georgia Tech ECE faculty.  In 2008 he received an ONR Young Investigator Award, in 2009 he received a PECASE award and a Packard Fellowship, and in 2010 he was named a Rice University Outstanding Young Engineering Alumnus.

Dr. Chris Burges (Keynote Session 3, Feb 21, 10 AM)


Research Manager/Principal Researcher
Microsoft Research

“From Machine Learning to Machine Intelligence?”

Venue: CSL B02

Abstract:

Machine Learning has made impressive gains recently, but Machine Intelligence – the ability of machines to reason over general meaning representations in the service of doing something useful – still seems distant.  Why is this?  What might progress towards truly intelligent machines look like?  Is current research on machine learning likely to lead us there?  If not, what should we start doing differently, and what might such a roadmap look like?  On the other hand, will intelligent machines ever actually be necessary, or might the problems they solve be as easily solved using statistical learning or other simpler techniques?  Such questions are very broad, but in the spirit of keeping our eye on the prize, it seems wise, and fun, to ask them now and then.

Biography:

Chris Burges received his PhD in theoretical physics, on constraints on new particle mass scales and on models of early universe cosmology, at Brandeis University in 1984.  After a two-year postdoc in the theoretical physics department at MIT, during which he worked on supersymmetry in anti-de Sitter space, cellular automata models of thermal fluids, and the gravitational Aharonov-Bohm effect, and after the arrival of his family’s first baby, he rather abruptly switched to a more family-friendly field and became a systems engineer for AT&T Bell Labs.  There he worked on network performance and routing: AT&T still uses his algorithms to route their CCS7 signaling network (the nervous system of the long distance network).  When he saw a cool demo of neural networks reading handwritten digits, he switched fields again, and began his long descent into machine learning.  He has worked on handwriting and machine print recognition (he worked on a system now used by NCR to read millions of checks daily, and on zip code and handwritten address recognition for the USPS), support vector machines, audio fingerprinting (his work is currently used in Xbox and Windows Media Player to identify music), speaker verification, information retrieval, and semantic modeling.  His ranking algorithm is currently used in Bing for ranking web search results.  Chris was program co-chair of Neural Information Processing Systems 2012 and general co-chair of NIPS 2013.  His main current research interest is modeling meaning in text.

Dr. Ron Benson (Keynote Session 4, Feb 21, 1:30 PM)


Vice President, Engineering
Walmart Labs

“Big Data driving e-Commerce at WalmartLabs”

Venue: CSL B02

Abstract:

Walmart has massive data from store transactions and, online, from browsing behavior and transactions. We stream much of this data into a large distributed file system (Hadoop). Many teams across WalmartLabs have built applications that depend on this big data to power the e-commerce and in-store experience. Dr. Benson will discuss big data at Walmart, how it is used, and several applications that build on it.

Biography:

As VP of Engineering, Ron Benson is helping to pioneer the next generation of eCommerce. An expert in machine learning, social data, and search, Ron focuses on key consumer initiatives including search, item recommendations, site personalization, and personalized offers. His team also provides critical insights into users’ shopping habits using social data. Ron received a PhD in Computation and Neural Systems from Caltech.