Can Hu, Shanqing Zhang*, Kewei Tao, Gaoming Yang, Li Li
CMC-Computers, Materials & Continua, Vol.82, No.3, pp. 4913-4930, 2025, DOI:10.32604/cmc.2025.059770
06 March 2025
Abstract: The surge of large-scale models in recent years has led to breakthroughs in numerous fields, but it has also introduced higher computational costs and more complex network architectures. These increasingly large and intricate networks pose challenges for deployment and execution while also exacerbating the issue of network over-parameterization. To address this issue, various network compression techniques have been developed, such as network pruning. A typical pruning algorithm follows a three-step pipeline involving training, pruning, and retraining. Existing methods often directly set the pruned filters to zero during retraining, significantly reducing the parameter space. However, this…
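The three-step pipeline mentioned in the abstract (train, prune, retrain) is commonly instantiated with magnitude-based filter pruning, where the filters with the smallest L1 norms are zeroed out. The sketch below is an illustrative example of that convention, not the paper's specific method; the function name and the choice of L1 norm are assumptions for demonstration.

```python
import numpy as np

def prune_filters_by_l1(weights, prune_ratio=0.5):
    """Zero out the conv filters with the smallest L1 norms.

    `weights` is assumed to have shape (num_filters, in_channels, kh, kw).
    Setting pruned filters to zero (rather than removing them) mirrors the
    retraining convention described in the abstract.
    Returns the pruned weight tensor and a boolean mask of kept filters.
    """
    num_filters = weights.shape[0]
    # L1 norm of each filter, flattened across its channels and spatial dims
    norms = np.abs(weights).reshape(num_filters, -1).sum(axis=1)
    num_pruned = int(num_filters * prune_ratio)
    pruned_idx = np.argsort(norms)[:num_pruned]  # smallest-norm filters
    mask = np.ones(num_filters, dtype=bool)
    mask[pruned_idx] = False
    pruned = weights.copy()
    pruned[~mask] = 0.0  # zero out, keeping the tensor shape intact
    return pruned, mask

# Usage: prune a quarter of the filters in a random 8-filter conv layer
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
pruned_w, kept = prune_filters_by_l1(w, prune_ratio=0.25)
```

After this step, retraining would proceed on `pruned_w`, typically with the mask reapplied after each update so the zeroed filters stay zero.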