Paper Title

SA-DPSGD: Differentially Private Stochastic Gradient Descent based on Simulated Annealing

Paper Authors

Jie Fu, Zhili Chen, XinPeng Ling

Paper Abstract

Differential privacy (DP) provides a formal privacy guarantee that prevents adversaries with access to machine learning models from extracting information about individual training points. Differentially private stochastic gradient descent (DPSGD) is the most popular training method with differential privacy in image recognition. However, existing DPSGD schemes lead to significant performance degradation, which hinders the application of differential privacy. In this paper, we propose a simulated annealing-based differentially private stochastic gradient descent scheme (SA-DPSGD), which accepts a candidate update with a probability that depends on both the update quality and the number of iterations. Through this random update screening, we make the differentially private gradient descent proceed in the right direction in each iteration and finally obtain a more accurate model. In our experiments, under the same hyperparameters, our scheme achieves test accuracies of 98.35%, 87.41%, and 60.92% on the datasets MNIST, FashionMNIST, and CIFAR10, respectively, compared to the state-of-the-art results of 98.12%, 86.33%, and 59.34%. Under freely adjusted hyperparameters, our scheme achieves even higher accuracies of 98.89%, 88.50%, and 64.17%. We believe that our method makes a significant contribution toward closing the accuracy gap between private and non-private image classification.
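The screening step described in the abstract, accepting or rejecting each noisy DP-SGD update with a probability that depends on both the update quality and the iteration count, can be sketched with a simulated-annealing acceptance test. The following is a minimal illustration, not the authors' implementation: the use of training loss as the quality measure, the exponential cooling schedule, and the names cooling_temperature, accept_update, and sa_dpsgd_step are all assumptions made here for clarity.

```python
import math
import random

def cooling_temperature(step, t0=1.0, decay=0.999):
    """Illustrative exponential cooling schedule (assumed, not from the paper):
    the temperature shrinks as training proceeds."""
    return max(t0 * (decay ** step), 1e-12)

def accept_update(delta_energy, step, t0=1.0, decay=0.999):
    """Metropolis-style acceptance test.

    delta_energy: change in the quality measure (here, training loss) caused by
                  the candidate noisy update; negative means the update improves it.
    step:         current iteration index, which drives the temperature down so that
                  poor updates become less likely to be accepted as training goes on.
    """
    if delta_energy <= 0:  # the candidate improves the objective: always accept
        return True
    temperature = cooling_temperature(step, t0, decay)
    # Worse updates are accepted with a probability that decays with both the
    # amount of degradation and the iteration count (via the temperature).
    return random.random() < math.exp(-delta_energy / temperature)

def sa_dpsgd_step(params, noisy_grad, loss_fn, lr, step):
    """One screened update: form a candidate DP-SGD update from the clipped,
    noised gradient, then keep or discard it using the acceptance rule above."""
    candidate = [p - lr * g for p, g in zip(params, noisy_grad)]
    delta_energy = loss_fn(candidate) - loss_fn(params)
    return candidate if accept_update(delta_energy, step) else params
```

In this sketch, improving updates are always kept, while degrading ones are accepted with a probability that shrinks over iterations, matching the abstract's description of acceptance depending on both update quality and the number of iterations.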
