Entropy Error-based Convolutional Neural Network Sparse Optimization and Application

Authors

  • Kaifang Yang
  • Dewei Yang
  • Fen Wang
  • Jing Zhao
  • Qinwei Fan

DOI:

https://doi.org/10.54097/rxbm7088

Keywords:

Convolutional Neural Network (CNN), Smooth L1 Regularization, Cross-Entropy Loss Function, Classification Problems

Abstract

Convolutional Neural Networks (CNNs), as quintessential representatives of deep neural networks, have found widespread application in domains such as image recognition, object detection, and image generation, owing to their strong nonlinear mapping capabilities. In practice, however, the intricacy of the network structure leaves CNNs prone to low learning efficiency and poor accuracy. This paper adds a smooth L1 regularization term to the conventional cross-entropy loss function, effectively improving both the learning efficiency and the accuracy of the network. The efficacy of the improved algorithm is substantiated through numerical simulations.
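The abstract's objective, cross-entropy loss plus a smooth L1 penalty on the weights, can be sketched as follows. This is only an illustration: the paper's exact smoothing function is not given here, so the sketch assumes one common choice, |w| ≈ √(w² + ε), which is differentiable at zero; the function names and the coefficient `lam` are hypothetical.

```python
import numpy as np

def smoothed_l1(w, eps=1e-3):
    """One common smooth approximation of sum(|w|): sum(sqrt(w^2 + eps)).
    Differentiable at w = 0, unlike the exact L1 norm. (Assumed form;
    the paper's smoothing may differ.)"""
    return np.sum(np.sqrt(w ** 2 + eps))

def cross_entropy(probs, labels, eps=1e-12):
    """Mean cross-entropy for predicted class probabilities `probs`
    (shape [n, classes]) and integer class `labels` (shape [n])."""
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), labels] + eps))

def total_loss(probs, labels, weights, lam=1e-4):
    """Regularized objective: cross-entropy + lam * smooth-L1 penalty
    summed over all weight arrays of the network."""
    penalty = sum(smoothed_l1(w) for w in weights)
    return cross_entropy(probs, labels) + lam * penalty
```

Because the penalty is smooth everywhere, ordinary gradient descent applies directly, while the L1-like shape still drives small weights toward zero and thus encourages the sparsity the title refers to.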



Published

03-02-2024

Section

Articles

How to Cite

Yang, K., Yang, D., Wang, F., Zhao, J., & Fan, Q. (2024). Entropy Error-based Convolutional Neural Network Sparse Optimization and Application. Frontiers in Computing and Intelligent Systems, 7(1), 69-71. https://doi.org/10.54097/rxbm7088