Paper Title

Highly Accurate CNN Inference Using Approximate Activation Functions over Homomorphic Encryption

Paper Authors

Takumi Ishiyama, Takuya Suzuki, Hayato Yamana

Abstract

In the big data era, cloud-based machine learning as a service (MLaaS) has attracted considerable attention. However, when handling sensitive data, such as financial and medical data, a privacy issue emerges because the cloud server can access clients' raw data. A common method of handling sensitive data in the cloud uses homomorphic encryption, which allows computation over encrypted data without decryption. Previous research usually adopted a low-degree polynomial mapping function, such as the square function, for data classification. However, this technique results in low classification accuracy. In this study, we seek to improve the classification accuracy of inference processing in a convolutional neural network (CNN) while using homomorphic encryption. We adopt an activation function that approximates Google's Swish activation function with a fourth-order polynomial. We also adopt batch normalization to normalize the inputs to the Swish function, fitting the input range so as to minimize the approximation error. We implemented CNN inference labeling over homomorphic encryption using Microsoft's Simple Encrypted Arithmetic Library (SEAL) with the Cheon-Kim-Kim-Song (CKKS) scheme. The experimental evaluations confirmed classification accuracies of 99.22% and 80.48% on MNIST and CIFAR-10, respectively, representing improvements of 0.04% and 4.11% over previous methods.
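To illustrate the core idea in the abstract, the sketch below fits a fourth-order polynomial to Google's Swish activation (x · sigmoid(x)) on a bounded interval, the role batch normalization plays by constraining the activation's inputs. This is a minimal illustration, not the paper's method: the interval [-4, 4] and the least-squares fit via `numpy.polyfit` are assumptions; the paper's actual fitting range and coefficients may differ.

```python
import numpy as np

def swish(x):
    """Google's Swish activation: x * sigmoid(x)."""
    return x / (1.0 + np.exp(-x))

# Assumed fitting interval: batch normalization keeps the layer inputs
# in a bounded range, so the polynomial only needs to be accurate there.
xs = np.linspace(-4.0, 4.0, 1001)

# Least-squares fit of a fourth-order polynomial to Swish on that range.
coeffs = np.polyfit(xs, swish(xs), deg=4)
poly = np.poly1d(coeffs)

# Maximum absolute approximation error on the fitting interval.
max_err = np.max(np.abs(poly(xs) - swish(xs)))
print("coefficients (highest degree first):", coeffs)
print(f"max approximation error on [-4, 4]: {max_err:.4f}")
```

A low-degree polynomial like this is evaluable with only the additions and multiplications that the CKKS scheme supports, which is why it can replace Swish during encrypted inference.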
