University of Texas at Austin
Surbhi Goel is a PhD student in the Department of Computer Science at the University of Texas at Austin, advised by Adam Klivans. Prior to this, she received a bachelor's degree in computer science from the Indian Institute of Technology (IIT) Delhi. Her interests lie at the intersection of theory and machine learning. Her honors include a JP Morgan AI Research Fellowship, a Simons-Berkeley Research Fellowship for the "Foundations of Deep Learning" program, and multiple fellowships from the University of Texas at Austin.
In recent years, neural networks have seen a surge in popularity due to their unprecedented practical success; however, they remain poorly understood in theory. Gradient descent is the workhorse for training these networks, and essentially the entire field rests on this algorithm and its variants. A major drawback of gradient descent is that it is difficult to analyze mathematically.
My research focuses on developing alternative, efficient algorithms for training neural networks that are backed by strong theoretical guarantees. We have proposed algorithms for learning shallow neural networks, based on techniques such as kernel methods and isotonic regression, that are provably efficient and generalize to unseen data.
I am also interested in understanding binary graphical models, such as Ising models and Restricted Boltzmann Machines, from a learning-theory perspective.