NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at Vancouver Convention Center
The reviewers found the paper original and agreed that it deserves acceptance. Nevertheless, a concern about the claims in Section 3.3 was raised during the discussion. The authors state that a "multi-layer fully connected network with ‘leaky ReLU’ activations" satisfies assumptions H1-H4. This is false: the leaky ReLU is not differentiable at the origin, so at least H1 (C^\infty regularity) fails. This means that their neural-network implementations do not fit within their theoretical framework (although they work very well in practice). The paper should be updated in the final revision to correct any such false claims.
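For concreteness, a minimal sketch of the failure (the generic negative slope $\alpha$ is an assumed parameterization, not quoted from the paper):

\[
\operatorname{LeakyReLU}_\alpha(x) =
\begin{cases}
x, & x \ge 0, \\
\alpha x, & x < 0,
\end{cases}
\qquad 0 < \alpha < 1.
\]

The one-sided derivatives at $x = 0$ are $1$ from the right and $\alpha$ from the left; since $\alpha \neq 1$, the function is not even $C^1$ at the origin, let alone C^\infty.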