Session 12: Modern Multivariate Statistics: Tensors and Networks

Organizer: Jacob Bien (Cornell)
Chair: Jacob Bien (Cornell)
Time: June 4th, 1:45pm – 3:15pm
Location: VEC 404

Talk 1: Computationally Efficient Tensor Completion with Statistical Optimality

Speaker: Dong Xia (Columbia)
Abstract: We develop methods for estimating a low-rank tensor from noisy observations of a subset of its entries that achieve both statistical and computational efficiency. There has been much recent interest in this problem of noisy tensor completion. Much of the attention has focused on the fundamental computational challenges associated with higher-order tensors, yet very little is known about the statistical performance of existing methods. To fill this void, we characterize the fundamental statistical limits of noisy tensor completion by establishing minimax optimal rates of convergence for estimating a k-th order low-rank tensor; these rates suggest significant room for improvement over existing approaches. Furthermore, we propose a polynomial-time estimation procedure, based on power iteration with a second-order spectral initialization, that achieves the optimal rates of convergence. Our method is easy to implement, and numerical experiments demonstrate its practical merits.
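The general recipe described in the abstract (a spectral initialization computed from the zero-filled, rescaled observation tensor, followed by iterative low-rank refinement) can be sketched in NumPy. The sketch below is an illustrative assumption, not the authors' procedure: it substitutes a simple hard-impute loop with truncated HOSVD for their exact power-iteration and second-order initialization, and all function names are hypothetical.

```python
import numpy as np

def hosvd_truncate(X, rank):
    """Rank-(r, r, r) approximation of an order-3 tensor via truncated HOSVD."""
    U = []
    for mode in range(3):
        M = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)  # mode-m unfolding
        u, _, _ = np.linalg.svd(M, full_matrices=False)
        U.append(u[:, :rank])                                   # leading subspace
    G = X
    for m in range(3):  # contract each mode onto its leading subspace ...
        G = np.moveaxis(np.tensordot(U[m].T, np.moveaxis(G, m, 0), axes=1), 0, m)
    for m in range(3):  # ... then expand back to the original dimensions
        G = np.moveaxis(np.tensordot(U[m], np.moveaxis(G, m, 0), axes=1), 0, m)
    return G

def complete(T_obs, mask, rank, n_iter=200):
    """Complete a partially observed low-rank order-3 tensor (illustrative).

    Spectral initialization on the zero-filled, rescaled tensor, then a
    hard-impute loop: fill missing entries with the current estimate and
    re-truncate to low rank.
    """
    p = mask.mean()  # observed fraction; assumes uniform random sampling
    T_hat = hosvd_truncate(np.where(mask, T_obs, 0.0) / p, rank)
    for _ in range(n_iter):
        X = np.where(mask, T_obs, T_hat)  # keep observed entries, impute the rest
        T_hat = hosvd_truncate(X, rank)
    return T_hat
```

On noiseless simulated data with a reasonable observed fraction, this simplified loop already recovers the tensor to small relative error, which illustrates why spectral initialization plus iterative refinement is a natural template for the problem.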

Talk 2: Structured shrinkage of tensor parameters
Speaker: Peter Hoff (Duke)
Abstract: Tensor-valued parameters arise in many multivariate statistical models, such as network autoregression, where the relationship between a pair of nodes may depend on that of any other pair. Parameters in such models are likely to be near, but not necessarily in, low-dimensional subspaces of the parameter space. In this talk we discuss adaptive empirical Bayes methods for shrinking parameter estimates toward an appropriately chosen subspace.
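To make "shrinking toward a subspace" concrete, here is a minimal James-Stein-style sketch: it pulls a noisy estimate toward its projection onto the column space of a matrix B, with a data-driven weight on the residual. This is a generic illustration of subspace shrinkage under an assumed known noise variance, not the adaptive empirical Bayes procedure of the talk.

```python
import numpy as np

def shrink_toward_subspace(theta_hat, B, sigma2):
    """Shrink a noisy estimate toward the column space of B (illustrative).

    The weight on the off-subspace residual is estimated from the data:
    when the residual is barely larger than its expected noise level,
    the estimate is pulled almost entirely into the subspace.
    """
    Q, _ = np.linalg.qr(B)            # orthonormal basis for the subspace
    proj = Q @ (Q.T @ theta_hat)      # projection onto the subspace
    resid = theta_hat - proj          # component orthogonal to the subspace
    q = theta_hat.size - Q.shape[1]   # residual degrees of freedom
    # James-Stein-style weight: shrink more when the residual is small
    w = max(0.0, 1.0 - (q - 2) * sigma2 / (resid @ resid))
    return proj + w * resid
```

The key point mirrored from the abstract: the estimate is shrunk *toward* the subspace but not forced into it, so parameters that are merely near the subspace are not distorted.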

Talk 3: Global Spectral Clustering for Dynamic Networks
Speaker: Patrick Perry (NYU)
Abstract: In this talk, we present a new method (PisCES) for finding time-varying community structure in dynamic networks. The method implements spectral clustering with a smoothing penalty that promotes similarity across time periods. We prove that this method converges to the global solution of a nonconvex optimization problem, which can be interpreted as the spectral relaxation of a smoothed K-means clustering objective. We also show that smoothing is applied in a time-varying and data-dependent manner; for example, when a drastic change point exists in the data, smoothing is automatically suppressed at the time of the change point. Finally, we show that the detected time-varying communities can be effectively visualized through the use of Sankey plots.
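The idea of smoothing spectral clustering across time periods can be sketched as follows. The iteration below augments each period's adjacency matrix with projections onto the neighboring periods' eigenvector subspaces before re-extracting leading eigenvectors; it is inspired by the description above but is not the exact PisCES algorithm, and `alpha` is a hypothetical smoothing parameter.

```python
import numpy as np

def smoothed_spectral_clustering(adjs, k, alpha=0.1, n_iter=30):
    """Smooth leading eigenvector subspaces across time (illustrative sketch).

    adjs: list of symmetric adjacency matrices, one per time period.
    Returns one n-by-k orthonormal eigenvector matrix per period; cluster
    the rows of each with k-means to read off communities.
    """
    def top_k(M):
        w, v = np.linalg.eigh(M)          # eigenvalues in ascending order
        return v[:, np.argsort(w)[-k:]]   # k leading eigenvectors
    V = [top_k(A) for A in adjs]          # per-period spectral initialization
    for _ in range(n_iter):
        for t in range(len(adjs)):
            S = adjs[t].astype(float)
            if t > 0:
                S = S + alpha * V[t - 1] @ V[t - 1].T  # pull toward previous period
            if t < len(adjs) - 1:
                S = S + alpha * V[t + 1] @ V[t + 1].T  # pull toward next period
            V[t] = top_k(S)
    return V
```

Note the data-dependent flavor mentioned in the abstract: the projection terms only reward eigenvector subspaces that the neighboring periods' data already support, so a drastic change in structure between periods weakens the effective smoothing at that point.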