Paper Title

PLATON: Pruning Large Transformer Models with Upper Confidence Bound of Weight Importance

Paper Authors

Qingru Zhang, Simiao Zuo, Chen Liang, Alexander Bukharin, Pengcheng He, Weizhu Chen, Tuo Zhao

Paper Abstract

Large Transformer-based models have exhibited superior performance in various natural language processing and computer vision tasks. However, these models contain an enormous number of parameters, which restricts their deployment in real-world applications. To reduce the model size, researchers prune these models based on the weights' importance scores. However, such scores are usually estimated on mini-batches during training, which incurs large variability/uncertainty due to mini-batch sampling and complicated training dynamics. As a result, some crucial weights could be pruned by commonly used pruning methods because of such uncertainty, which makes training unstable and hurts generalization. To resolve this issue, we propose PLATON, which captures the uncertainty of importance scores by the upper confidence bound (UCB) of importance estimation. In particular, for weights with low importance scores but high uncertainty, PLATON tends to retain them and explore their capacity. We conduct extensive experiments with several Transformer-based models on natural language understanding, question answering, and image classification to validate the effectiveness of PLATON. Results demonstrate that PLATON manifests notable improvements under different sparsity levels. Our code is publicly available at https://github.com/QingruZhang/PLATON.
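To make the UCB mechanism described above concrete, the following is a minimal PyTorch sketch, assuming per-mini-batch importance is estimated with the common first-order sensitivity proxy |w · ∇w| and that the final score multiplies a smoothed importance by a smoothed uncertainty. The function names, the EMA coefficients `beta1`/`beta2`, and the threshold-based pruning step are illustrative assumptions for this sketch, not the paper's exact implementation.

```python
import torch

def ucb_scores(weight, grad, ema_importance, ema_uncertainty,
               beta1=0.85, beta2=0.95):
    """One scoring step. `ema_importance` / `ema_uncertainty` are running
    statistics with the same shape as `weight` (initialize them to zeros).
    `beta1` / `beta2` are illustrative EMA coefficients, not the paper's."""
    # Noisy per-mini-batch sensitivity estimate: |w * dL/dw|.
    importance = (weight * grad).abs()
    # Smooth the importance estimate across mini-batches with an EMA.
    ema_importance = beta1 * ema_importance + (1.0 - beta1) * importance
    # Deviation of the current estimate from its running mean serves as an
    # uncertainty proxy, itself smoothed by a second EMA.
    uncertainty = (importance - ema_importance).abs()
    ema_uncertainty = beta2 * ema_uncertainty + (1.0 - beta2) * uncertainty
    # UCB-style score: a weight with low smoothed importance but high
    # uncertainty still scores high enough to be retained and explored,
    # rather than being pruned immediately.
    scores = ema_importance * ema_uncertainty
    return scores, ema_importance, ema_uncertainty

def prune_by_scores(weight, scores, sparsity):
    """Zero out the `sparsity` fraction of weights with the lowest scores."""
    k = max(1, int(sparsity * weight.numel()))
    threshold = scores.flatten().kthvalue(k).values
    mask = (scores > threshold).float()
    return weight * mask, mask
```

In a training loop, `grad` would be `weight.grad` after a backward pass, and the target sparsity would typically be increased gradually over training steps according to a schedule rather than applied all at once.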
