
I am a PhD candidate in Applied and Computational Mathematics at Caltech, advised by Joel A. Tropp. My research focuses on designing computational techniques for solving large-scale linear algebra problems, with applications in scientific computing and data science.
I received my undergraduate degrees in Mathematics and Computing from the College of Creative Studies at UCSB, where I worked with Professor Shivkumar Chandrasekaran and the scientific computing group. I have held internships at Sandia National Laboratories, working with Ryan Sills, Don Ward, and Jonathan Hu; Lawrence Livermore National Laboratory, working with Andrew Barker; and Lawrence Berkeley National Laboratory, working with Lin Lin.
My work has been recognized with the UCSB Chancellor’s Award for Excellence in Undergraduate Research, the UCSB Mathematics Department’s Raymond L. Wilder Award, finalist status for the Hertz Foundation Fellowship, and the Caltech Thomas A. Tisch Prize for Graduate Teaching in CMS. I am supported by a Department of Energy Computational Science Graduate Fellowship.
Recent Publications
- M. Díaz, E. N. Epperly, Z. Frangella, J. A. Tropp, & R. J. Webber (2023). Robust, randomized preconditioning for kernel ridge regression. arXiv preprint arXiv:2304.12465 [math.NA].
- E. N. Epperly & J. A. Tropp (2023). Efficient error and variance estimation for randomized matrix computations. arXiv preprint arXiv:2207.06342 [math.NA].
- E. N. Epperly, J. A. Tropp, & R. J. Webber (2023). XTrace: Making the most of every sample in stochastic trace estimation. arXiv preprint arXiv:2301.07825 [math.NA].
- Y. Chen, E. N. Epperly, J. A. Tropp, & R. J. Webber (2022). Randomly pivoted Cholesky: Practical approximation of a kernel matrix with few entry evaluations. arXiv preprint arXiv:2207.06503 [math.NA].
- E. N. Epperly, L. Lin, & Y. Nakatsukasa (2022). A theory of quantum subspace diagonalization. SIAM Journal on Matrix Analysis and Applications. (preprint)