About
I am a PhD student in computational math at Stanford University, advised by Jure Leskovec. My research focuses on network science, data mining, and high-performance matrix and tensor computations. I have interned with Google (four times: 2016, 2015, 2012, and 2011), Sandia National Labs (2014), and HP Labs (2013). Before coming to Stanford, I studied EE/CS and Applied Math across the bay at Berkeley.
News and events
- 07/08/2016: My paper with David Gleich and Jure Leskovec, Higher-order organization of complex networks, is now out in Science (code and data are available here). This work has been covered by Stanford News, Phys.org, and DARPA.
- Summer 2016: I am interning at Google Research in Mountain View. Happy to be back!
Papers
- Higher-order organization of complex networks.
Austin R. Benson, David F. Gleich, and Jure Leskovec.
Science, 353(6295):163–166, 2016.
- The spacey random walk: a stochastic process for higher-order data.
Austin R. Benson, David F. Gleich, and Lek-Heng Lim.
arXiv, cs.NA:1602.02102, 2016.
- General tensor spectral co-clustering for higher-order data.
Tao Wu, Austin R. Benson, and David F. Gleich.
To appear in NIPS, 2016.
- On the relevance of irrelevant alternatives.
Austin R. Benson, Ravi Kumar, and Andrew Tomkins.
In Proceedings of the 25th International Conference on World Wide Web (WWW), 2016.
- Modeling user consumption sequences.
Austin R. Benson, Ravi Kumar, and Andrew Tomkins.
In Proceedings of the 25th International Conference on World Wide Web (WWW), 2016.
- Improving the numerical stability of fast matrix multiplication algorithms.
Grey Ballard, Austin R. Benson, Alex Druinsky, Benjamin Lipshitz, and Oded Schwartz.
To appear in SIAM Journal on Matrix Analysis and Applications, 2016.
- Tensor spectral clustering for partitioning higher-order network structures.
Austin R. Benson, David F. Gleich, and Jure Leskovec.
In Proceedings of the 2015 SIAM International Conference on Data Mining (SDM), 2015.
- A framework for practical parallel fast matrix multiplication.
Austin R. Benson and Grey Ballard.
In Proceedings of the 20th ACM SIGPLAN Symposium on Principles and Practice of Parallel Programming (PPoPP), 2015.
- Scalable methods for nonnegative matrix factorizations of near-separable tall-and-skinny matrices.
Austin R. Benson, Jason D. Lee, Bartek Rajwa, and David F. Gleich.
In Proceedings of Neural Information Processing Systems (NIPS), 2014.
Selected for spotlight presentation.
- Learning multifractal structure in large networks.
Austin R. Benson, Carlos Riquelme, and Sven Schmit.
In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD), 2014.
- A parallel directional Fast Multipole Method.
Austin R. Benson, Jack Poulson, Kenneth Tran, Björn Engquist, and Lexing Ying.
SIAM Journal on Scientific Computing, 36(4):C335–C352, 2014.
- Silent error detection in numerical time-stepping schemes.
Austin R. Benson, Sven Schmit, and Robert Schreiber.
International Journal of High Performance Computing Applications, 29:403–421, 2015 (first published April 2014).
- Direct QR factorizations for tall-and-skinny matrices in MapReduce architectures.
Austin R. Benson, David F. Gleich, and James Demmel.
In Proceedings of the 2013 IEEE International Conference on Big Data (IEEE BigData), 2013.
Teaching
- Instructor, Discrete Mathematics and Algorithms ICME refresher course, Summer 2014. Lecture notes are available.
- Volunteer TA, CME 181: Projects in Applied and Computational Mathematics, Winter 2014.
- Instructor, CME 193: Introduction to Scientific Python, Spring 2013.
- Instructor, CME 193: Introduction to Scientific Python, Winter 2013.