Research
I aim to make neural network training faster, more efficient, and more scalable.
My research focuses on second-order optimization, with a particular emphasis
on the Gauss-Newton method and diagonal Hessian approximations.
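To make these ideas concrete, here is a minimal NumPy sketch (illustrative only, not code from somax) of a single Gauss-Newton step for nonlinear least squares, together with a diagonal variant that keeps only the diagonal of the Gauss-Newton matrix:

```python
import numpy as np

# Illustrative sketch: one step for min_theta 0.5 * ||r(theta)||^2.
# Gauss-Newton approximates the Hessian by J^T J, where J is the
# Jacobian of the residual r; the diagonal variant keeps only
# diag(J^T J), which is cheap to store and trivial to invert.

def gauss_newton_step(jac, res, damping=1e-9):
    """Full Gauss-Newton step: solve (J^T J + damping*I) d = J^T r."""
    jtj = jac.T @ jac
    return np.linalg.solve(jtj + damping * np.eye(jtj.shape[0]), jac.T @ res)

def diagonal_gn_step(jac, res, damping=1e-9):
    """Diagonal approximation: d_i = (J^T r)_i / (diag(J^T J)_i + damping)."""
    diag = np.sum(jac**2, axis=0)  # diag(J^T J) without forming J^T J
    return (jac.T @ res) / (diag + damping)

# Toy linear residual r(theta) = A @ theta - b: here Gauss-Newton
# reduces to the normal equations and converges in a single step.
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 3))
b = rng.normal(size=20)
theta = np.zeros(3)
theta -= gauss_newton_step(A, A @ theta - b)
print(np.allclose(A.T @ (A @ theta - b), 0.0, atol=1e-6))  # → True
```

The trade-off the diagonal step illustrates is the one driving this line of work: it discards curvature cross-terms, but its cost per step is linear in the number of parameters rather than quadratic.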
Software
As part of my thesis, I developed and currently maintain
somax, a second-order complement to
optax, a library focused primarily
on first-order solvers.
Somax: Stochastic Second-Order Optimization in JAX
Mikalai Korbit
code / arXiv
Somax is a JAX library of stochastic second-order methods for machine learning
optimization. It builds on the JAXopt StochasticSolver API
and can be used as a drop-in replacement for both JAXopt and Optax solvers.