Paper Title
FEATHERS: Federated Architecture and Hyperparameter Search
Paper Authors
Paper Abstract
Deep neural architectures have a profound impact on the performance achieved in many of today's AI tasks, yet their design still relies heavily on human prior knowledge and experience. Neural architecture search (NAS) together with hyperparameter optimization (HO) helps to reduce this dependence. However, state-of-the-art NAS and HO rapidly become infeasible as increasing amounts of data are stored in a distributed fashion, typically violating data privacy regulations such as GDPR and CCPA. As a remedy, we introduce FEATHERS - $\textbf{FE}$derated $\textbf{A}$rchi$\textbf{T}$ecture and $\textbf{H}$yp$\textbf{ER}$parameter $\textbf{S}$earch, a method that not only optimizes neural architectures and optimization-related hyperparameters jointly in distributed data settings, but also adheres to data privacy through the use of differential privacy (DP). We show that FEATHERS efficiently optimizes architectural and optimization-related hyperparameters alike, while demonstrating convergence on classification tasks with no detriment to model performance when complying with privacy constraints.