The UCF Mathematics and Applications Seminar, which takes place every Friday from 11 a.m. to 12 p.m. in MSB 318, provides a venue for researchers to present their current work, foster new collaborations and showcase foundational mathematics and its applications to graduate and undergraduate students.
Dr. Gerrit Welper will speak at this seminar on approximation rates for gradient descent trained shallow neural networks in 1d.
Abstract:
Two aspects of neural networks that have been studied extensively in the recent literature are their function approximation properties and their training by gradient descent methods. The approximation problem seeks accurate approximations with a minimal number of weights. In most of the current literature, these weights are fully or partially handcrafted, demonstrating the capabilities of neural networks but not necessarily their practical performance. In contrast, gradient descent theory for neural networks relies heavily on over-parametrization, and thus on an abundance of weights.
The talk presents an approximation result for shallow networks in 1d with weights optimized by gradient descent. We consider an infinite sample limit, as is standard for the classical approximation question, so technically the problem is no longer over-parametrized. However, some form of redundancy reappears as a loss in approximation rate compared to the best possible rates.
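To fix ideas about the setting the abstract describes, the sketch below trains a shallow 1d ReLU network, f(x) = Σ_k a_k σ(w_k x + b_k), by full-batch gradient descent on a dense grid standing in for the infinite sample limit. This is only an illustrative toy, not the speaker's construction or analysis; the target function, network width, learning rate, and step count are arbitrary choices made for the example.

```python
# Illustrative sketch only: a shallow 1d ReLU network trained by plain
# gradient descent on a dense grid. All hyperparameters are ad hoc choices.
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    return np.sin(2 * np.pi * x)  # illustrative 1d target function

n, width, lr, steps = 256, 50, 0.05, 5000
x = np.linspace(0.0, 1.0, n)      # dense grid as a stand-in for infinitely many samples
y = target(x)

# random initialization of the shallow network's parameters
w = rng.normal(size=width)
b = rng.normal(size=width)
a = rng.normal(size=width) / np.sqrt(width)

for _ in range(steps):
    z = np.outer(x, w) + b        # pre-activations, shape (n, width)
    h = np.maximum(z, 0.0)        # ReLU features
    r = h @ a - y                 # residual of the current prediction
    # gradients of the mean-squared loss for each parameter group
    grad_a = h.T @ r / n
    grad_w = ((r * x)[:, None] * (z > 0) * a).sum(axis=0) / n
    grad_b = (r[:, None] * (z > 0) * a).sum(axis=0) / n
    a -= lr * grad_a
    w -= lr * grad_w
    b -= lr * grad_b

pred = np.maximum(np.outer(x, w) + b, 0.0) @ a
print("final RMS error:", np.sqrt(np.mean((pred - y) ** 2)))
```

Questions such as how fast this error can decay as the width grows, and how that rate compares to the best achievable with handcrafted weights, are the kind of approximation-rate questions the talk addresses.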