Paper Title
CryptoGRU: Low Latency Privacy-Preserving Text Analysis With GRU
Paper Authors
Paper Abstract
Billions of text analysis requests containing private emails, personal text messages, and sensitive online reviews are processed by recurrent neural networks (RNNs) deployed on public clouds every day. Although prior secure networks combine homomorphic encryption (HE) and garbled circuits (GC) to preserve users' privacy, naively adopting the HE and GC hybrid technique to implement RNNs suffers from long inference latency due to slow activation functions. In this paper, we present an HE and GC hybrid gated recurrent unit (GRU) network, CryptoGRU, for low-latency secure inference. CryptoGRU replaces the computationally expensive GC-based $tanh$ with a fast GC-based $ReLU$, and then quantizes $sigmoid$ and $ReLU$ with a smaller bit length to accelerate activations in a GRU. We evaluate CryptoGRU with multiple GRU models trained on four public datasets. Experimental results show that CryptoGRU achieves top-notch accuracy and improves secure inference latency by up to $138\times$ over one of the state-of-the-art secure networks on the Penn Treebank dataset.