Exact Gaussian Processes on a Million Data Points

May 10, 2019 - 9:30 am to 10:30 am

SLAC, Kavli 3rd Floor Conf. Room 

Greg Green

Join us Friday, May 10th for the latest KIPAC Stats & ML Journal Club. We'll meet at 9:30 am on the Kavli third floor and on Zoom <https://stanford.zoom.us/j/2038764923>. This week, Greg Green will be leading a discussion on a new method for scaling exact Gaussian processes to large datasets.

Gaussian processes (GPs) are flexible models with state-of-the-art performance on many impactful applications. However, computational constraints with standard inference procedures have limited exact GPs to problems with fewer than about ten thousand training points, necessitating approximations for larger datasets. In this paper, we develop a scalable approach for exact GPs that leverages multi-GPU parallelization and methods like linear conjugate gradients, accessing the kernel matrix only through matrix multiplication. By partitioning and distributing kernel matrix multiplies, we demonstrate that an exact GP can be trained on over a million points in 3 days using 8 GPUs and can compute predictive means and variances in under a second using 1 GPU at test time. Moreover, we perform the first-ever comparison of exact GPs against state-of-the-art scalable approximations on large-scale regression datasets with 10^4 to 10^6 data points, showing dramatic performance improvements.
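The core idea in the abstract can be sketched in a few lines: rather than factorizing the kernel matrix, solve (K + σ²I)α = y with conjugate gradients, which touches the matrix only through matrix-vector products. A minimal NumPy sketch (with hypothetical hyperparameters; the paper additionally partitions these products across multiple GPUs, which is omitted here):

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel matrix (illustrative hyperparameters).
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def cg_solve(matvec, b, tol=1e-8, max_iter=1000):
    # Conjugate gradients: solves A x = b using only products A @ v,
    # so A never needs to be factorized (or even stored in one piece).
    x = np.zeros_like(b)
    r = b - matvec(x)
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = matvec(p)
        step = rs / (p @ Ap)
        x += step * p
        r -= step * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
noise_var = 0.1**2

# Here K is formed densely for simplicity; at the paper's scale the
# matvec would instead be computed in distributed blocks.
K = rbf_kernel(X, X) + noise_var * np.eye(len(X))
alpha = cg_solve(lambda v: K @ v, y)

# Exact GP posterior mean at test points: mu = K_* (K + sigma^2 I)^{-1} y
X_test = np.linspace(-3, 3, 5)[:, None]
mu = rbf_kernel(X_test, X) @ alpha
```

The key design point is that `cg_solve` only ever calls `matvec`, so the same solver works whether the kernel products are computed densely on one machine or partitioned across eight GPUs.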