Research

  1. Online Covariance Matrix Estimation in Sketched Newton Methods
    Wei Kuang, Sen Na, and Mihai Anitescu
    Preprint, 2024 [Link]
    Preliminary results in the NeurIPS Workshop OPT 2023: Optimization for Machine Learning
    [Link]

    • We develop a consistent estimator of the limiting covariance matrix for sketched Newton methods. The estimator is updated recursively, is batch-free, and requires no Hessian inverse. We establish a theoretical upper bound on its convergence rate; combined with asymptotic normality results, this enables real-time statistical inference on model parameters.
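    The recursive, batch-free flavor of such an update can be illustrated with a toy sketch (not the paper's estimator): a running average of gradient outer products, maintained in O(d^2) work per step with no stored batch. The i.i.d. Gaussian vectors below are stand-ins for stochastic gradients.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 3
S_bar = np.zeros((d, d))  # running estimate of E[g g^T]

for k in range(2000):
    g = rng.standard_normal(d)                   # stand-in stochastic gradient
    S_bar += (np.outer(g, g) - S_bar) / (k + 1)  # recursive update, batch-free

# For standard normal g, E[g g^T] is the identity, so S_bar approaches I.
```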

  2. Sequential quadratic programming method for inequality constrained stochastic optimization using only equality constrained subproblems
    Wei Kuang, Sen Na, and Mihai Anitescu
    Working paper

    • We aim to develop an online sequential quadratic programming method for stochastic optimization with both deterministic equality and inequality constraints. At each iteration, the method identifies an active set and then solves a subproblem involving only equality constraints, which significantly reduces the computational cost compared to methods that solve inequality constrained subproblems.
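    An equality-constrained QP subproblem of this kind can be solved directly from its KKT system. Below is a minimal sketch with hypothetical toy data (not the working paper's algorithm), where an identified active inequality is simply treated as an equality constraint.

```python
import numpy as np

def eq_qp_step(H, g, A, c):
    """Solve min_d 0.5 d^T H d + g^T d  s.t.  A d + c = 0  via its KKT system."""
    n, m = H.shape[0], A.shape[0]
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = -np.concatenate([g, c])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]  # primal step d and Lagrange multipliers

# Hypothetical toy problem: one active constraint, handled as an equality.
H = np.eye(2)
g = np.array([1.0, -2.0])
A = np.array([[1.0, 1.0]])
c = np.array([0.0])
d, lam = eq_qp_step(H, g, A, c)
```

Solving one small KKT system per iteration is what makes the equality-only subproblem cheaper than a full inequality-constrained QP solve.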

  3. Compressed sensing for diffuse scattering
    Wei Kuang, Vishwas Rao, Alexis Montoison, François Pacaud, and Mihai Anitescu
    Working paper

    • We use compressed sensing to improve the resolution of a very large (500 \(\times\) 500 \(\times\) 500) diffuse scattering data set. We apply a matrix-free interior point method to solve the compressed sensing problem, with the Newton systems solved by the preconditioned conjugate gradient (CG) method. We show that we can achieve “perfect” preconditioning in a neighborhood of the central path: the number of CG iterations is uniformly bounded along the central path and, in contrast to typical interior point experience, does not degrade as the barrier parameter goes to infinity. We also provide a GPU implementation of the code.
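    A minimal matrix-free PCG sketch in the spirit of this approach (toy operator and preconditioner, not the paper's code): the Newton system is applied through a SciPy LinearOperator and never formed explicitly, and a good preconditioner keeps the CG iteration count small.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

n = 100
rng = np.random.default_rng(0)
D = 1.0 + rng.random(n)  # toy stand-in for an interior-point diagonal scaling

# Apply the system matrix (I + diag(D)) without ever forming it.
A = LinearOperator((n, n), matvec=lambda v: v + D * v)
# Toy preconditioner that inverts the operator exactly: CG converges in one step.
M = LinearOperator((n, n), matvec=lambda v: v / (1.0 + D))

b = rng.standard_normal(n)
iters = []
x, info = cg(A, b, M=M, callback=lambda xk: iters.append(1))
```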