M. V. Zingerenko, E. E. Limonova. Layer-Wise Knowledge Distillation for Simplified Bipolar Morphological Neural Networks

Various neuron approximations can be used to reduce the computational complexity of neural networks. One such approximation, based on summation and maximum operations, is the bipolar morphological neuron. This paper presents an improved structure of the bipolar morphological neuron that enhances its computational efficiency, together with a new training approach based on continuous approximations of the maximum and on knowledge distillation. Experiments were conducted on the MNIST dataset with a LeNet-like neural network architecture and on the CIFAR10 dataset with a ResNet-22 architecture. The proposed training method achieves 99.45% classification accuracy with the LeNet-like model, matching the accuracy of the classical network, and 86.69% accuracy with the ResNet-22 model, compared to 86.43% for the classical model. The results show that the proposed method, combining the log-sum-exp (LSE) approximation of the maximum with layer-by-layer knowledge distillation, yields a simplified bipolar morphological network that is not inferior to classical networks.
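The log-sum-exp approximation mentioned above replaces the non-smooth maximum with a differentiable surrogate, which is what makes gradient-based training of the morphological layers possible. A minimal sketch of the idea (the smoothing parameter `mu` and the function name are illustrative assumptions, not the paper's notation):

```python
import math

def lse_max(xs, mu=10.0):
    """Smooth log-sum-exp approximation of max(xs).

    (1/mu) * log(sum(exp(mu * x))) tends to max(xs) as mu grows.
    Subtracting the true maximum before exponentiating avoids overflow;
    the error is bounded by log(len(xs)) / mu.
    """
    m = max(xs)
    return m + math.log(sum(math.exp(mu * (x - m)) for x in xs)) / mu

xs = [0.1, 2.0, -1.5, 1.9]
# The approximation sits just above the hard maximum (here 2.0)
# and tightens as mu increases, while staying differentiable.
print(lse_max(xs, mu=10.0))
print(lse_max(xs, mu=100.0))
```

Unlike the hard maximum, this surrogate propagates gradients to every input, which is why it is a natural fit for training-time use while the exact maximum can still be used at inference.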


Keywords: bipolar morphological networks, approximations, artificial neural networks, computational efficiency.

PP. 46-54.

DOI 10.14357/20718632230305

