Who am I?

  • Computational Mathematician at Argonne National Laboratory since September 2019; I started there in June 2017 as a Postdoctoral Appointee.
  • A mathematical optimizer. See research interests below for more specifics.
  • A well-rounded and friendly individual. I’m good enough, I’m smart enough, and doggone it, people like me.

Publications/Conference Proceedings/Preprints:

Let’s be real. Anyone who tries to put this information on their website is just going to fall behind eventually. I’ve certainly been guilty of this many times. Google Scholar is such a great public service.

What you really want to get from a website is this:

Research Interests:

  • Derivative-free optimization. Check out this cool survey paper I did with fellow Argonne-ites Jeff Larson and Stefan Wild. I know a lot about DFO methods.
  • Optimization under uncertainty (stochastic and robust). Optimization’s all well and good, but how often do you actually know your problem data? Sometimes uncertainty needs to be encoded in your problem statement because you simply don’t trust (all of) your data, or you just don’t have enough of it to make the best decisions. I’ve worked on adapting trust-region algorithms for stochastic optimization (here and here). I’ve also surveyed nonlinear robust optimization and developed my own nonlinear robust optimization method.
  • Nonsmooth optimization. Especially algorithms that try to approximate Clarke subdifferentials. I developed (and am actively writing software for) a cool algorithm called manifold sampling. Here is the latest and greatest on manifold sampling.
  • The intersection of optimization and machine learning. Because who in applied math hasn’t dabbled in machine learning in this millennium? I’ve done some work in optimal decision trees and graphical anomaly detection.
  • Variational quantum algorithms. Quantum is way cool. Here’s a recent paper I wrote on using adaptive stochastic optimizers in the parameter update step of VQAs.
  • Some software development. In particular, I contribute to and maintain the optimization methods in IBCDFO (Interpolation-Based Composite Derivative-Free Optimization), which is part of the POptUS (Practical Optimization Using Structure) GitHub organization.
  • Really, anything involving numerical optimization. Let's just chat; my contact info's at the top of this page.
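If you've never seen derivative-free optimization in action, here's a minimal taste: minimizing a function using only its values, no gradients. This sketch uses SciPy's Nelder-Mead simplex method purely for illustration; it's a classic DFO method, not one of my own algorithms.

```python
# Minimal derivative-free optimization example: Nelder-Mead (a classic
# simplex-based DFO method) on the Rosenbrock test function. Only
# function VALUES are used -- no gradients anywhere.
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x):
    # Classic nonconvex test function with its minimum at (1, 1).
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

result = minimize(
    rosenbrock,
    x0=np.array([-1.2, 1.0]),       # standard hard starting point
    method="Nelder-Mead",
    options={"xatol": 1e-8, "fatol": 1e-8},
)
print(np.round(result.x, 4))        # should land near (1, 1)
```

Model-based DFO methods (like the interpolation-based ones in IBCDFO) typically beat simplex methods like this one when function evaluations are expensive, which is a big part of why they're worth studying.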

A SPECTRE is haunting numerical optimization — the spectre of derivative-free optimization. 

Non-Research Stuff: 

  • I’m a trained classical pianist. If we’re ever at a meeting with a piano, odds are I’ll play something if you ask me to. Here’s a really old recording from my junior recital in university.
  • I’m queer. As I get older, I find myself more comfortable using they/them pronouns than he/him.
  • I had a baby in 2022! Here’s my family and me being cute:

Email me if you want to know more. I (generally) don’t bite.