Representation power of neural networks
This talk will survey a number of classical results on the representation power of neural networks, and will also present a new result separating shallow and deep networks: namely, there exist classification problems where a shallow network needs exponentially many more nodes to match the performance of a deep network.
All proofs will be elementary and the talk will require no knowledge of machine learning.
The paper is available at http://arxiv.org/abs/1509.08101
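The flavor of the separation can be illustrated with a small sketch (this is an illustration of the general idea, not the paper's exact construction or statement): a deep ReLU network with O(k) nodes can realize a function that oscillates roughly 2^k times on [0, 1], whereas a shallow network needs exponentially many nodes to produce that many linear pieces.

```python
def relu(z):
    return max(z, 0.0)

def triangle(x):
    # "Tent" map on [0, 1], exactly computable by two ReLU units:
    # rises linearly from 0 to 1 on [0, 1/2], falls back to 0 on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def deep_net(x, k):
    # k-fold composition: depth O(k) and O(k) nodes in total, yet the
    # result is piecewise linear with on the order of 2^k pieces.
    for _ in range(k):
        x = triangle(x)
    return x

def count_crossings(f, level=0.5, n=100000):
    # Count how often f crosses the given level on a fine grid; for
    # deep_net(., k) this doubles with every additional layer.
    above = [f(i / n) > level for i in range(n + 1)]
    return sum(1 for a, b in zip(above, above[1:]) if a != b)

for k in range(1, 6):
    print(k, count_crossings(lambda x: deep_net(x, k)))
# 1 2
# 2 4
# 3 8
# 4 16
# 5 32
```

Each extra layer doubles the number of oscillations, so matching this behavior with a single hidden layer requires exponentially many units.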