Thursday, January 15, 2026

Show HN: The Hessian of tall-skinny networks is easy to invert https://ift.tt/fUHnXQD

It turns out the inverse of the Hessian of a deep net is easy to apply to a vector. Doing this naively takes a number of operations cubic in the number of layers (impractical), but it can be done in time linear in the number of layers (very practical). This is possible because the Hessian of a deep net has a matrix-polynomial structure that factorizes nicely. The Hessian-inverse-product algorithm that exploits this structure is similar to running backprop on a dual version of the deep net, and it echoes an old idea of Pearlmutter's for computing Hessian-vector products. Perhaps this idea is useful as a preconditioner for stochastic gradient descent? https://ift.tt/Uvh1Ydu January 16, 2026 at 02:06AM
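
A minimal sketch of the Pearlmutter primitive the post references, in JAX: a Hessian-vector product is forward-mode differentiation of the gradient, and a generic inverse-Hessian-vector product can be approximated with conjugate gradients on top of it. This is a standard baseline, not the post's linear-time factorization; loss_fn, params, and the damping value are illustrative placeholders.

    import jax
    import jax.numpy as jnp
    from jax.scipy.sparse.linalg import cg

    def hvp(loss_fn, params, v):
        # Pearlmutter's trick: forward-over-reverse differentiation gives
        # H @ v at roughly the cost of two gradient evaluations, without
        # ever materializing the Hessian.
        return jax.jvp(jax.grad(loss_fn), (params,), (v,))[1]

    def ihvp(loss_fn, params, v, damping=1e-3):
        # Generic baseline for H^{-1} @ v: solve (H + damping*I) x = v by
        # conjugate gradients using only Hessian-vector products. The post's
        # algorithm instead exploits the layerwise factorization to get the
        # product in time linear in depth.
        matvec = lambda x: hvp(loss_fn, params, x) + damping * x
        x, _ = cg(matvec, v)
        return x

    # Toy check on a quadratic: H = diag(2, 4), so H^{-1} v = (v1/2, v2/4).
    loss_fn = lambda p: p[0] ** 2 + 2.0 * p[1] ** 2
    params = jnp.array([1.0, 1.0])
    v = jnp.array([1.0, 1.0])
    print(ihvp(loss_fn, params, v, damping=0.0))  # ~[0.5, 0.25]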

Show HN: Interactive California Budget (By Claude Code) https://ift.tt/K7jLM2I

There's been a lot of discussion around the California budget and some proposed ...