Acta Scientiarum Naturalium Universitatis Pekinensis ›› 2023, Vol. 59 ›› Issue (5): 764-772.DOI: 10.13209/j.0479-8023.2022.115


AdaPruner: Adaptive Channel Pruning and Effective Weights Inheritance

LIU Xiangcheng1, CAO Jian1,†, YAO Hongyi1, XU Pengtao2, ZHANG Yuan1, WANG Yuan3   

  1. School of Software and Microelectronics, Peking University, Beijing 102600; 2. Xi'an Nuclear Instrument Co., Ltd., Xi'an 710061; 3. School of Integrated Circuits, Peking University, Beijing 100871
  • Received: 2022-09-14  Revised: 2022-11-11  Online: 2023-09-20  Published: 2023-09-18
  • Contact: CAO Jian, E-mail: caojian(at)

Previous channel pruning methods require complex search and fine-tuning procedures and are prone to falling into local optima. To address this problem, the authors propose AdaPruner, a novel channel pruning framework that, with a single sparse-training pass, adaptively generates sub-networks for various budget complexities and efficiently selects initialization weights suited to the current structure. Experimental results show that the proposed method outperforms previous pruning methods on both commonly used residual networks and lightweight networks across multiple image classification datasets.
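The paper itself is not reproduced here, but the general idea behind budget-driven channel selection after sparse training can be sketched in a few lines. The sketch below is illustrative only and is not the authors' implementation: it assumes (as in network-slimming-style pruning) that each channel has a scale magnitude learned under an L1 sparsity penalty, and it keeps the globally largest scales until a target prune ratio is met. The function name `select_channels` is hypothetical.

```python
def select_channels(layer_scales, prune_ratio):
    """Given per-layer channel scale magnitudes (e.g., BN scaling factors
    learned under an L1 penalty), drop roughly `prune_ratio` of all channels
    network-wide by thresholding at a global percentile.
    Returns one boolean keep-mask per layer."""
    # Pool all scale magnitudes and find the global pruning threshold.
    all_scales = sorted(abs(s) for scales in layer_scales for s in scales)
    k = int(len(all_scales) * prune_ratio)  # number of channels to drop
    threshold = all_scales[k] if k < len(all_scales) else float("inf")

    masks = []
    for scales in layer_scales:
        mask = [abs(s) >= threshold for s in scales]
        # Keep at least one channel per layer so the network stays connected.
        if not any(mask):
            best = max(range(len(scales)), key=lambda i: abs(scales[i]))
            mask[best] = True
        masks.append(mask)
    return masks


# Two layers with 3 and 2 channels; prune ~40% of the 5 channels globally.
masks = select_channels([[0.9, 0.05, 0.4], [0.02, 0.7]], prune_ratio=0.4)
# → [[True, False, True], [False, True]]
```

The budget here is expressed as a channel ratio for simplicity; mapping a FLOPs or parameter budget to a prune ratio, and how the surviving structure inherits weights, are exactly the parts the paper's framework addresses and are not modeled in this sketch.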

Key words: convolutional neural network, channel pruning, sparse training, neural architecture search, image classification
