On the Approximation Properties of Neural Networks
Authors: Jonathan W. Siegel, Jinchao Xu
Abstract: We prove two new results concerning the approximation properties of neural networks. Our first result gives conditions under which the outputs of the neurons in a two-layer neural network are linearly independent functions. Our second result concerns the rate of approximation of a two-layer neural network as the number of neurons increases. We improve upon existing results in the literature by significantly relaxing the required assumptions on the activation function and by providing a better rate of approximation. We also provide a simplified proof that the class of functions represented by a two-layer neural network is dense in the space of continuous functions on any compact set if the activation function is not a polynomial.