Second-order Optimization Made Practical – Talk by Tomer Koren, TAU
Location: Gonda Building (901), Room 101

Title: Second-order Optimization Made Practical

Abstract: Optimization in machine learning, both theoretical and applied, is presently dominated by first-order gradient methods such as stochastic gradient descent. Higher-order (preconditioned) optimization methods have become far less prevalent, despite compelling theoretical properties, due to their impractical computation, memory and communication costs. I will present some recent theoretical, algorithmic ...
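As a rough illustration of the cost gap the abstract refers to, the sketch below contrasts a plain first-order SGD step with a full-matrix preconditioned (Adagrad-style) step. All names, dimensions, and the choice of accumulator are hypothetical and are not taken from the talk; this is only meant to show where the extra memory and computation of preconditioning come from.

    # Illustrative sketch only (assumed setup, not the speaker's algorithm):
    # compare a first-order SGD update with a full-matrix preconditioned update.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 1000                      # number of parameters (hypothetical)
    w = rng.normal(size=d)        # current parameters
    g = rng.normal(size=d)        # a stochastic gradient
    lr = 0.1

    # First-order step: O(d) memory and time per update.
    w_sgd = w - lr * g

    # Full-matrix preconditioning: the accumulator G is a d x d matrix,
    # so memory is O(d^2) and the inverse square root below costs roughly
    # O(d^3) per step -- the kind of overhead the abstract calls impractical.
    G = np.outer(g, g) + 1e-3 * np.eye(d)   # one accumulated outer product shown
    vals, vecs = np.linalg.eigh(G)
    G_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    w_precond = w - lr * G_inv_sqrt @ g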