Frozen Weights, Hot Results: Batch Norm Gets 83% on CIFAR-10

Turns out that if you freeze every layer of a neural network at its randomly initialized weights and train only the batch norm parameters (the per-channel scale and shift), you can still get 83% accuracy on CIFAR-10!

https://arxiv.org/abs/2003.00152
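
For reference, here's a minimal PyTorch-style sketch of the idea, assuming a standard torchvision ResNet-18 with a 10-class head (the paper uses deeper ResNets and its own training setup): freeze every parameter, then re-enable gradients only for the BatchNorm affine parameters gamma and beta.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

# Hypothetical setup: a torchvision ResNet-18 with a 10-class output.
# This only illustrates the "train BatchNorm and nothing else" idea;
# architecture and schedule differ from the paper's.
model = resnet18(num_classes=10)

# Freeze everything at its random initialization...
for param in model.parameters():
    param.requires_grad = False

# ...then unfreeze only the BatchNorm affine parameters
# (per-channel scale gamma and shift beta).
for module in model.modules():
    if isinstance(module, nn.BatchNorm2d):
        module.weight.requires_grad = True  # gamma
        module.bias.requires_grad = True    # beta

trainable = [p for p in model.parameters() if p.requires_grad]
print(f"trainable parameters: {sum(p.numel() for p in trainable)}")

# The optimizer only ever sees the BatchNorm parameters.
optimizer = torch.optim.SGD(trainable, lr=0.1, momentum=0.9)
```

Only a tiny fraction of the network's parameters end up trainable, which is what makes the 83% result surprising.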
