Mish: A Self Regularized Non-Monotonic Neural Activation Function
This paper introduces Mish, a new neural network activation function, and shows across a number of benchmarks that it often improves the accuracy of deep networks.
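The function itself is simple: Mish(x) = x · tanh(softplus(x)), where softplus(x) = ln(1 + eˣ). A minimal reference sketch in plain Python:

```python
import math

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)),
    # rewritten to avoid overflow for large |x|.
    return math.log1p(math.exp(-abs(x))) + max(x, 0.0)

def mish(x):
    # Mish(x) = x * tanh(softplus(x)).
    # Smooth, non-monotonic, and unbounded above:
    # slightly negative for small negative inputs,
    # approaching 0 as x -> -inf and x as x -> +inf.
    return x * math.tanh(softplus(x))
```

In practice one would use a framework's built-in version (it is available in both PyTorch and TensorFlow), which also provides the gradient; this standalone sketch just makes the formula concrete.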
A great review of many state-of-the-art tricks for improving the performance of a deep convolutional network (here, ResNet), combined with concrete implementation details, source code, and performance results. A must-read for Kaggle competitors or anyone who wants to squeeze maximum performance out of computer vision models.