Abstract: Dropout is a classical regularization method for convolutional neural networks that effectively prevents overfitting. A Dropout-based convolutional neural network removes nodes completely at random during training, so the resulting local sub-networks do not discriminate between different samples. To address this problem, a sparse Dropout regularization method is proposed. It introduces a sparsity constraint on the nodes during training and removes nodes selectively according to their activation values, so that nodes with lower activations are removed with higher probability while more nodes with higher activations are retained, enhancing the network's feature-extraction capability. In the test phase, all removed nodes and their trained parameters are restored, thereby integrating the multiple local sub-networks. Experimental results on public datasets show that combining sparsity with Dropout achieves better generalization than the traditional method.
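The sketch below illustrates the general idea of activation-dependent dropout described above: units with larger activations are kept with higher probability during training, and all units are restored at test time. It is a minimal PyTorch sketch under assumptions; the class name `SparseDropout`, the `base_rate` parameter, and the per-sample normalization used to turn activations into keep probabilities are illustrative choices, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class SparseDropout(nn.Module):
    """Activation-dependent dropout (illustrative sketch).

    Units with low activation magnitudes are dropped with higher
    probability; the keep-probability formula is an assumption made
    for this example, not the paper's exact definition.
    """

    def __init__(self, base_rate: float = 0.5, eps: float = 1e-8):
        super().__init__()
        self.base_rate = base_rate  # upper bound on the drop probability
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if not self.training:
            # Test phase: restore all nodes and their trained parameters.
            return x
        # Score each unit by its activation magnitude, normalized per sample.
        score = x.abs()
        score = score / (score.amax(dim=tuple(range(1, x.dim())), keepdim=True) + self.eps)
        # Keep probability grows with activation: near-zero activations are
        # dropped with probability close to base_rate, large ones rarely.
        keep_prob = (1.0 - self.base_rate) + self.base_rate * score
        mask = torch.bernoulli(keep_prob)
        # Rescale so the expected output matches the full network at test time.
        return x * mask / keep_prob.clamp_min(self.eps)


# Example usage: place the layer after a convolution + ReLU block.
layer = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), SparseDropout(0.5))
out = layer(torch.randn(4, 3, 32, 32))
```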