Journal of Internet Computing and Services
    ISSN 2287 - 1136 (Online) / ISSN 1598 - 0170 (Print)
    https://jics.or.kr/

A Method for Progressive Accumulative Data-Free Pruning and Restoration of Convolutional Neural Networks


Yeje Choi, Kyoungwon Park, Juhyoung Sung, Kiwon Kwon, Byoungchul Song, Taeho Im, Journal of Internet Computing and Services, Vol. 26, No. 6, pp. 109-120, Dec. 2025
DOI: 10.7472/jksii.2025.26.6.109
Keywords: Network Pruning, Filter Pruning, Data-Free Restoration, Structured Pruning, Edge AI

Abstract

Deep Neural Networks (DNNs) have revolutionized artificial intelligence with their exceptional performance; however, their immense size, stemming from millions of parameters, poses significant challenges for deployment on resource-constrained edge devices such as mobile and wearable systems. Network pruning has emerged as an effective strategy for eliminating redundancy, yet conventional approaches often necessitate costly retraining with the original dataset, which may be restricted due to privacy or commercial constraints. This paper introduces Progressive Accumulative Data-Free Compression and Restoration (PA-DFCR), a novel data-free and training-free structural pruning method that incrementally compresses models by applying a many-to-one compensation mechanism at each step. By decomposing the target compression rate into smaller increments, PA-DFCR dynamically reassesses layer-wise importance post-compensation to identify optimal subnetworks, mitigating the severe information loss commonly associated with one-shot approaches. This framework distributes reconstruction errors across multiple iterations, enhancing the resilience of deeper layers where filter similarity is often low. Evaluations on standard benchmarks, such as VGG-16 on CIFAR-10 and ResNet-50 on CIFAR-100, demonstrate significant advantages. Notably, at a 50% pruning ratio, PA-DFCR achieves an average accuracy improvement of 8.18% over the baseline LBYL on VGG-16/CIFAR-10. These results underscore PA-DFCR’s efficacy for on-demand compression in real-world industrial settings without requiring data or retraining.
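The progressive scheme described above — decomposing the target compression rate into small increments, re-ranking filter importance after each compensation step, and approximating every pruned filter as a combination of the remaining ones (many-to-one compensation) — can be illustrated with a minimal, purely linear sketch. This is not the authors' implementation: the function name, the L2-norm importance score, the least-squares compensation, and the per-step ratio are illustrative assumptions; real convolutional layers, normalization, and nonlinearities are omitted.

```python
import numpy as np

def progressive_prune(W_curr, W_next, target_ratio, step_ratio=0.1):
    """Illustrative sketch of progressive, data-free structured pruning.

    W_curr: (F, d) matrix whose rows are the filters of the current layer.
    W_next: (O, F) matrix of the next layer, consuming the F filter outputs.
    target_ratio: overall fraction of filters to remove, applied in
    increments of at most step_ratio per iteration.
    """
    keep = list(range(W_curr.shape[0]))
    n_target = int(round(W_curr.shape[0] * (1 - target_ratio)))
    while len(keep) > n_target:
        # Re-assess importance on the *current* kept set (here: L2 norm,
        # an assumed stand-in for the paper's importance criterion).
        norms = np.linalg.norm(W_curr[keep], axis=1)
        n_drop = min(max(1, int(len(keep) * step_ratio)), len(keep) - n_target)
        drop_idx = [keep[i] for i in np.argsort(norms)[:n_drop]]
        keep = [i for i in keep if i not in drop_idx]
        # Many-to-one compensation: express each pruned filter as a
        # least-squares combination of the kept filters, then fold those
        # coefficients into the next layer so its output is preserved.
        K = W_curr[keep].T                       # columns = kept filters
        for j in drop_idx:
            coeffs, *_ = np.linalg.lstsq(K, W_curr[j], rcond=None)
            W_next[:, keep] += np.outer(W_next[:, j], coeffs)
    return keep, W_next[:, keep]
```

In this linear toy setting, whenever a pruned filter lies exactly in the span of the kept filters (e.g. it duplicates one of them), the folded compensation reproduces the next layer's output exactly; with nonlinear activations the compensation becomes approximate, which is why spreading it over many small steps, as the abstract argues, limits the accumulated reconstruction error.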




Cite this article
[APA Style]
Choi, Y., Park, K., Sung, J., Kwon, K., Song, B., & Im, T. (2025). A Method for Progressive Accumulative Data-Free Pruning and Restoration of Convolutional Neural Networks. Journal of Internet Computing and Services, 26(6), 109-120. DOI: 10.7472/jksii.2025.26.6.109.

[IEEE Style]
Y. Choi, K. Park, J. Sung, K. Kwon, B. Song, T. Im, "A Method for Progressive Accumulative Data-Free Pruning and Restoration of Convolutional Neural Networks," Journal of Internet Computing and Services, vol. 26, no. 6, pp. 109-120, 2025. DOI: 10.7472/jksii.2025.26.6.109.

[ACM Style]
Yeje Choi, Kyoungwon Park, Juhyoung Sung, Kiwon Kwon, Byoungchul Song, and Taeho Im. 2025. A Method for Progressive Accumulative Data-Free Pruning and Restoration of Convolutional Neural Networks. Journal of Internet Computing and Services, 26, 6, (2025), 109-120. DOI: 10.7472/jksii.2025.26.6.109.