Paper Title

Fast Deep Autoencoder for Federated learning

Paper Authors

Novoa-Paradela, David; Fontenla-Romero, Oscar; Guijarro-Berdiñas, Bertha

Paper Abstract

This paper presents a novel, fast, and privacy-preserving implementation of deep autoencoders. DAEF (Deep Autoencoder for Federated learning), unlike traditional neural networks, trains a deep autoencoder network in a non-iterative way, which drastically reduces its training time. Its training can be carried out in a distributed way (several partitions of the dataset in parallel) and incrementally (aggregation of partial models), and due to its mathematical formulation, the data that is exchanged does not endanger the privacy of the users. This makes DAEF a valid method for edge computing and federated learning scenarios. The method has been evaluated and compared to traditional (iterative) deep autoencoders on seven real anomaly detection datasets, and their performance has been shown to be similar despite DAEF's faster training.
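To make the "non-iterative, aggregation-of-partial-models" idea in the abstract concrete, below is a minimal sketch of the general pattern such methods follow: each client shares only sufficient statistics (not raw samples), and a regularized least-squares layer is then solved in closed form from the aggregated statistics. This is not DAEF's actual formulation (which builds a full deep autoencoder); the names `partial_statistics`, `aggregate_and_solve`, and the parameter `reg` are illustrative assumptions.

```python
import numpy as np

def partial_statistics(X, Y):
    # Per-client sufficient statistics for a closed-form least-squares layer:
    # only X^T X and X^T Y leave the client, never the raw samples.
    return X.T @ X, X.T @ Y

def aggregate_and_solve(stats, reg=1e-3):
    # Aggregator side: sum the partial statistics from all clients and solve
    # the regularized normal equations (X^T X + reg*I) W = X^T Y in closed form.
    A = sum(s[0] for s in stats)
    B = sum(s[1] for s in stats)
    return np.linalg.solve(A + reg * np.eye(A.shape[0]), B)

# Toy usage: two clients, each holding its own partition of the data.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(100, 8)), rng.normal(size=(150, 8))
W_true = rng.normal(size=(8, 3))
Y1, Y2 = X1 @ W_true, X2 @ W_true
W = aggregate_and_solve([partial_statistics(X1, Y1),
                         partial_statistics(X2, Y2)])
print(np.allclose(W, W_true, atol=1e-2))  # True: matches training on pooled data
```

Because the partial statistics sum exactly to the statistics of the pooled dataset, the aggregated closed-form solution is identical to training on all data at once, which is what enables both the distributed and the incremental training modes described above.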
