Machine Learning
If Dropout Limits Trainable Depth, Does Critical Initialisation Still Matter? A Large-scale Statistical Analysis on ReLU Networks
Recent work in signal propagation theory has shown that dropout limits the depth to which information can propagate through a neural …
Arnu Pretorius, Elan van Biljon, Benjamin van Niekerk, Ryan Eloff, Matthew Reynard, Steven James, Benjamin Rosman, Herman Kamper, Steve Kroon
PDF