About Me
In Spring 2018, I joined the Theory of Computation Group at Harvard as a postdoc under Jelani Nelson.
In the Fall 2017 semester, I was a research fellow at the Simons Institute at UC Berkeley. I finished my PhD at the Department of Computer Science at Yale in the summer of 2017, where I was advised by Daniel A. Spielman. Before attending Yale, I received a B.A. in Computer Science from the University of Cambridge in 2011.
My Curriculum Vitae (January 2018).
My research focuses on fast algorithms and their applications to machine learning.
Papers
- Approximate Gaussian Elimination
- Rasmus Kyng.
- My PhD dissertation.
- Iterative Refinement for \(\ell_p\)-norm Regression
- Deeksha Adil, Rasmus Kyng, Richard Peng, and Sushant Sachdeva.
- In SODA 2019.
- A Matrix Chernoff Bound for Strongly Rayleigh Distributions and Spectral Sparsifiers from a few Random Spanning Trees
- Rasmus Kyng, Zhao Song.
- In FOCS 2018.
- Solving Directed Laplacians in Nearly Linear Time through Sparse LU Factorizations
- Michael B. Cohen, Jonathan Kelner, Rasmus Kyng, John Peebles, Richard Peng, Anup B. Rao, Aaron Sidford.
- In FOCS 2018. The result is available as Chapter 5 of my dissertation.
- Incomplete Nested Dissection
- Rasmus Kyng, Richard Peng, Robert Schwieterman, Peng Zhang.
- In STOC 2018.
- Hardness Results for Structured Linear Systems
- Rasmus Kyng, Peng Zhang.
- In FOCS 2017. Won the Machtey Award for Best Student Paper.
- Sampling Random Spanning Trees Faster than Matrix Multiplication
- David Durfee, Rasmus Kyng, John Peebles, Anup B. Rao, Sushant Sachdeva.
- In STOC 2017.
- A Framework for Analyzing Resparsification Algorithms
- Rasmus Kyng, Jakub Pachocki, Richard Peng, and Sushant Sachdeva.
- In SODA 2017.
- Approximate Gaussian Elimination for Laplacians: Fast, Sparse, and Simple
- Rasmus Kyng, Sushant Sachdeva.
- In FOCS 2016.
- Sparsified Cholesky and Multigrid Solvers for Connection Laplacians
- Rasmus Kyng, Yin Tat Lee, Richard Peng, Sushant Sachdeva, Daniel A. Spielman.
- In STOC 2016.
- Fast, Provable Algorithms for Isotonic Regression in all \(\ell_p\)-norms
- Rasmus Kyng, Anup B. Rao, Sushant Sachdeva.
- In NIPS 2015.
- Our implementation of the least-squares isotonic regression algorithm is available on the Isotonic Github repository.
- Algorithms for Lipschitz Learning on Graphs
- Rasmus Kyng, Anup B. Rao, Daniel A. Spielman, Sushant Sachdeva.
- In COLT 2015.
- Our implementations of the lex-minimization algorithms are available on the YINSlex Github repository.
- Solving SDD Linear Systems in Nearly \( m\log^{1/2}n \) Time
- Michael B. Cohen, Rasmus Kyng, Gary L. Miller, Jakub W. Pachocki, Richard Peng, Anup B. Rao and Shen Chen Xu.
- In STOC 2014.
- Preconditioning in Expectation
- Michael B. Cohen, Rasmus Kyng, Jakub W. Pachocki, Richard Peng, and Anup B. Rao.
- Stretching Stretch
- Michael B. Cohen, Gary L. Miller, Jakub W. Pachocki, Richard Peng, and Shen Chen Xu.
Teaching
- Instructor for Harvard AM 221: Advanced Optimization. Spring 2018.
Recent and Upcoming Talks
- Caltech Theory Seminar; Caltech, November 2018.
- TBA.
- Northwestern Quarterly Theory Workshop; Northwestern, November 2018.
- TBA.
- FOCS; Paris, October 2018.
- A Matrix Chernoff Bound for Strongly Rayleigh Distributions and Spectral Sparsifiers from a few Random Spanning Trees.
- FOCS; Paris, October 2018.
- Solving Directed Laplacians in Nearly Linear Time through Sparse LU Factorizations.
- Laplacian 2.0 Workshop; FOCS, Paris, October 2018.
- Analysis Using Matrix Martingales (Slides).
- RNLA Workshop; Simons Institute, Berkeley, September 2018.
- Matrix Martingales in Randomized Numerical Linear Algebra (Video).
- High-Performance Graph Algorithms Seminar; Dagstuhl, June 2018.
- Optimization on Graphs.
- Discrepancy & IP Workshop; CWI Amsterdam, June 2018.
- Matrix Approximation by Row Sampling (Notes).
- GraphXD Workshop; UC Berkeley, March 2018.
- Optimization on Graphs (Video), (Slides).
- Michael Cohen Memorial Symposium; Simons Institute, Berkeley, November 2017.
- Michael Cohen and Directed Laplacians (Video).
More Talks
- Stanford Theory Seminar; October 2017. Approximate Gaussian Elimination.
- FOCS, Berkeley; October 2017. Hardness Results for Structured Linear Systems.
- UC Berkeley; September 2017. Hardness Results for Structured Linear Systems.
- Google Research Mountain View; August 2017. Hardness Results for Structured Linear Systems.
- YPNG Seminar, Yale Dept. of Statistics and Data Science; July 2017. Approximate Gaussian Elimination.
- MSR Redmond; January 2017. Regression, Elimination, and Sampling on Graphs.
- University of Copenhagen; January 2017. Approximate Gaussian Elimination.
- CMU; December 2016. Approximate Gaussian Elimination.
- Georgia Tech; November 2016. Approximate Gaussian Elimination.
- Princeton; October 2016. Approximate Gaussian Elimination.
- UC Berkeley; October 2016. Approximate Gaussian Elimination.
- Google Research NYC; October 2016. Approximate Gaussian Elimination.
- FOCS, New Brunswick; October 2016. Approximate Gaussian Elimination.
- MIT A&C Seminar; September 2016. Approximate Gaussian Elimination.
- Aarhus University; September 2016. Lipschitz Learning on Graphs.
- China Theory Week, Hong Kong; August 2016. Approximate Gaussian Elimination.
- SIAM Annual Meeting, Boston; July 2016. Approximate Cholesky Factorization.
- STOC, Boston; June 2016. Sparsified Cholesky and Multigrid Solvers for Connection Laplacians.
- IT University of Copenhagen; December 2015. Lipschitz Learning and Isotonic Regression on Graphs.
Code
- Laplacians.jl
- Laplacians.jl is a Julia package containing graph algorithms, especially ones related to spectral and algebraic graph theory. It was started by Dan Spielman, and has contributions from Dan, Serban Stan, myself, and several others.
- YINSlex Github Repository
- Our implementations of the lex-minimization algorithms from the paper Algorithms for Lipschitz Learning on Graphs. The code was written by Anup Rao, Sushant Sachdeva, Dan Spielman, and myself.
- Isotonic Github Repository
- An implementation of the least-squares isotonic regression algorithm from the paper Fast, Provable Algorithms for Isotonic Regression in all \(\ell_p\)-norms. The code was written by Anup Rao, Sushant Sachdeva, and myself.
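To illustrate the problem the Isotonic repository addresses, here is a minimal from-scratch sketch of the classic pool-adjacent-violators (PAV) algorithm for the special case of least-squares (\(\ell_2\)) isotonic regression on a chain. This is not the repository's code, which handles general DAG orderings and all \(\ell_p\)-norms; the function name `isotonic_l2` is chosen for this sketch.

```python
def isotonic_l2(y):
    """Least-squares isotonic regression on a sequence (PAV sketch).

    Returns the nondecreasing x minimizing sum_i (x_i - y_i)^2,
    i.e. the isotonic fit for the simplest total-order case.
    """
    # Each block pools a contiguous segment; store its [sum, count].
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Merge adjacent blocks while their means violate monotonicity.
        while len(blocks) > 1 and \
                blocks[-2][0] / blocks[-2][1] > blocks[-1][0] / blocks[-1][1]:
            s, c = blocks.pop()
            blocks[-1][0] += s
            blocks[-1][1] += c
    # Expand: every point in a block receives the block's mean.
    out = []
    for s, c in blocks:
        out.extend([s / c] * c)
    return out
```

For example, `isotonic_l2([1, 3, 2])` pools the violating pair into its mean, giving `[1, 2.5, 2.5]`; a fully decreasing input collapses to a single constant block.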
Contact
I can be reached by email at
rj [last name] @ gmail.com
[last name] = kyng