Paper Title

Cheetah: Optimizing and Accelerating Homomorphic Encryption for Private Inference

Paper Authors

Brandon Reagen, Wooseok Choi, Yeongil Ko, Vincent Lee, Gu-Yeon Wei, Hsien-Hsin S. Lee, David Brooks

Abstract

As the application of deep learning continues to grow, so does the amount of data used to make predictions. While traditionally, big-data deep learning was constrained by computing performance and off-chip memory bandwidth, a new constraint has emerged: privacy. One solution is homomorphic encryption (HE). Applying HE to the client-cloud model allows cloud services to perform inference directly on the client's encrypted data. While HE can meet privacy constraints, it introduces enormous computational challenges and remains impractically slow in current systems. This paper introduces Cheetah, a set of algorithmic and hardware optimizations for HE DNN inference to achieve plaintext DNN inference speeds. Cheetah proposes HE-parameter tuning optimization and operator scheduling optimizations, which together deliver 79x speedup over the state-of-the-art. However, this still falls short of plaintext inference speeds by almost four orders of magnitude. To bridge the remaining performance gap, Cheetah further proposes an accelerator architecture that, when combined with the algorithmic optimizations, approaches plaintext DNN inference speeds. We evaluate several common neural network models (e.g., ResNet50, VGG16, and AlexNet) and show that plaintext-level HE inference for each is feasible with a custom accelerator consuming 30W and 545mm^2.
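To make the client-cloud flow described above concrete, the sketch below walks through HE inference on a single linear layer. This is a minimal illustration only: it uses a toy Paillier-style additively homomorphic scheme with deliberately small, insecure parameters, not the lattice-based schemes (e.g., BFV/CKKS) that HE DNN inference systems such as Cheetah build on, and every name and constant in it is assumed for illustration. The pattern it shows is the one the abstract describes: the client encrypts its input, the server computes directly on ciphertexts with plaintext weights, and only the client can decrypt the result.

```python
# Toy illustration of client-cloud HE inference (NOT Cheetah's scheme).
# Paillier is additively homomorphic: Enc(a)*Enc(b) = Enc(a+b) and
# Enc(a)^k = Enc(k*a), which is enough to evaluate a linear layer with
# plaintext weights on encrypted inputs without ever decrypting.
import random
from math import gcd

def lcm(a, b):
    return a * b // gcd(a, b)

# Small fixed primes for readability; real deployments use >1024-bit primes.
p, q = 2_147_483_647, 2_147_483_629
n, n2 = p * q, (p * q) ** 2
lam = lcm(p - 1, q - 1)
g = n + 1
# mu = (L(g^lam mod n^2))^-1 mod n, where L(u) = (u - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """Client-side encryption of integer m (0 <= m < n)."""
    r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Client-side decryption; the server never runs this."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# --- Client: encrypt the input vector element-wise and send it. ---
x = [3, 1, 4, 1, 5]
enc_x = [encrypt(v) for v in x]

# --- Server: dot product with plaintext weights, on ciphertexts only. ---
w = [2, 0, 1, 3, 1]
acc = encrypt(0)
for ci, wi in zip(enc_x, w):
    acc = (acc * pow(ci, wi, n2)) % n2  # accumulates Enc(sum(wi * xi))

# --- Client: decrypt the returned result and verify. ---
assert decrypt(acc) == sum(a * b for a, b in zip(x, w))  # 18
print(decrypt(acc))
```

A full HE DNN pipeline is far more demanding than this sketch: stacked layers require multiplicative depth and ciphertext packing, and the cost of those operations depends heavily on the chosen HE parameters, which is the kind of overhead the paper's HE-parameter tuning and operator scheduling optimizations target.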
