AutoDropout: Learning Dropout Patterns to Regularize Deep Networks

Review of paper by Hieu Pham (Google Research and Carnegie Mellon University) and Quoc V. Le (Google Research), 2021.

Existing Dropout regularization variants for deep neural networks (e.g., regular Dropout, SpatialDropout, DropBlock) impose randomized dropout structures with hand-picked, fixed parameters. To improve on them, the authors develop a reinforcement learning approach that searches for better dropout patterns tailored to various network architectures.
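
To make the concept concrete, here is a minimal NumPy sketch of the kind of structured dropout pattern such a controller could propose: square blocks of activations dropped on a regular grid. The block, stride, and p parameters are illustrative stand-ins, not the paper's actual search space.

```python
import numpy as np

def structured_dropout(x, block=4, stride=8, p=0.3, rng=np.random):
    """Drop square blocks of activations tiled on a regular grid.

    x: activation map of shape (H, W, C). block/stride/p stand in for
    the pattern parameters that a learned controller would choose.
    """
    h, w, _ = x.shape
    mask = np.ones((h, w, 1), dtype=x.dtype)
    for i in range(0, h, stride):
        for j in range(0, w, stride):
            if rng.random() < p:  # drop this entire block
                mask[i:i + block, j:j + block] = 0.0
    keep = mask.mean()  # rescale so the expected activation is unchanged
    return x * mask / max(keep, 1e-8)

x = np.random.rand(32, 32, 16).astype(np.float32)
y = structured_dropout(x)
```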

CONTINUE READING >

MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks

Review of paper by Zhiqiang Shen and Marios Savvides, Carnegie Mellon University, 2020.

The authors apply a variant of the recently proposed MEAL technique, which distills knowledge from multiple large teacher networks into a smaller student network via adversarial learning, to raise the top-1 accuracy of ResNet-50 on ImageNet with a 224×224 input size to 80.67%, without external training data or modifications to the network architecture.
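
The core distillation step is easy to sketch. Below is a minimal PyTorch illustration of the knowledge-distillation part only: the student is trained against the averaged soft labels of several teachers with a KL-divergence loss; the adversarial discriminator of the full MEAL setup is omitted, and the random logits merely stand in for real networks.

```python
import torch
import torch.nn.functional as F

def ensemble_distillation_loss(student_logits, teacher_logits_list):
    """KL divergence between the student's predictions and the averaged
    soft labels of several teacher networks (discriminator omitted)."""
    with torch.no_grad():
        teacher_probs = torch.stack(
            [F.softmax(t, dim=1) for t in teacher_logits_list]
        ).mean(dim=0)
    log_student = F.log_softmax(student_logits, dim=1)
    return F.kl_div(log_student, teacher_probs, reduction="batchmean")

# Toy usage: random logits stand in for real teacher/student networks.
student = torch.randn(8, 1000)
teachers = [torch.randn(8, 1000) for _ in range(3)]
loss = ensemble_distillation_loss(student, teachers)
```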

CONTINUE READING >

Compounding the Performance Improvements of Assembled Techniques in a Convolutional Neural Network

Review of paper by Jungkyu Lee, Taeryun Won, and Kiho Hong, Clova Vision, NAVER Corp, 2019.

A great review of many state-of-the-art tricks for improving the performance of a deep convolutional network (ResNet), combined with actual implementation details, source code, and performance results. A must-read for Kaggle competitors and anyone who wants to squeeze maximum performance out of computer vision tasks.
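
As a taste of the kind of trick the paper assembles, here is a short PyTorch sketch of label smoothing, one of the regularization techniques it covers; the eps=0.1 value is a common default rather than a figure taken from the paper.

```python
import torch
import torch.nn.functional as F

def label_smoothing_loss(logits, target, eps=0.1):
    """Cross-entropy against a smoothed target distribution:
    1 - eps on the true class, eps spread over the other classes."""
    n = logits.size(1)
    log_probs = F.log_softmax(logits, dim=1)
    smoothed = torch.full_like(log_probs, eps / (n - 1))
    smoothed.scatter_(1, target.unsqueeze(1), 1.0 - eps)
    return -(smoothed * log_probs).sum(dim=1).mean()

logits = torch.randn(8, 100)
target = torch.randint(0, 100, (8,))
loss = label_smoothing_loss(logits, target)
```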

CONTINUE READING >
