Paper Title
Differentially Private Algorithms for Statistical Verification of Cyber-Physical Systems
Paper Authors
Abstract
Statistical model checking is a class of sequential algorithms that can verify specifications of interest on an ensemble of cyber-physical systems (e.g., whether 99% of cars from a batch meet a requirement on their energy efficiency). These algorithms infer the probability that given specifications are satisfied by the systems, with provable statistical guarantees, by drawing sufficient numbers of independent and identically distributed samples. During the process of statistical model checking, the values of the samples (e.g., a user's car energy efficiency) may be inferred by intruders, causing privacy concerns in consumer-level applications (e.g., automobiles and medical devices). This paper addresses the privacy of statistical model checking algorithms from the point of view of differential privacy. These algorithms are sequential, drawing samples until a condition on their values is met. We show that revealing the number of samples drawn can violate privacy. We also show that the standard exponential mechanism, which randomizes the output of an algorithm to achieve differential privacy, fails to do so in the context of sequential algorithms. Instead, we relax the conservative requirement in differential privacy that the sensitivity of the algorithm's output be bounded for any perturbation of any data set. We propose a new notion of differential privacy, which we call expected differential privacy. We then develop a novel expected sensitivity analysis for sequential algorithms and propose a corresponding exponential mechanism that randomizes the termination time to achieve expected differential privacy. We apply the proposed mechanism to statistical model checking algorithms to preserve the privacy of the samples they draw. The utility of the proposed algorithm is demonstrated in a case study.
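For reference, the standard exponential mechanism that the abstract argues is insufficient for sequential algorithms can be sketched as follows. This is a minimal illustrative sketch, not the paper's construction: the utility function (negated distance to the true sample count), the candidate range, and the sensitivity bound of 1 are all assumptions chosen for the example.

```python
import math
import random

def exponential_mechanism(true_count, candidates, epsilon, sensitivity=1.0):
    """Standard exponential mechanism: sample an output r with probability
    proportional to exp(epsilon * u(r) / (2 * sensitivity)).

    Here u(r) = -|r - true_count| is an illustrative utility that favors
    outputs close to the true number of samples drawn; the paper's actual
    utility and sensitivity analysis are not reproduced here.
    """
    scores = [math.exp(epsilon * (-abs(r - true_count)) / (2.0 * sensitivity))
              for r in candidates]
    total = sum(scores)
    weights = [s / total for s in scores]
    return random.choices(list(candidates), weights=weights, k=1)[0]

# Release a privatized version of a hypothetical sample count (120),
# drawn from a candidate window around it.
noisy_count = exponential_mechanism(120, range(100, 141), epsilon=1.0)
```

The key point of the abstract is that this mechanism bounds sensitivity over *all* data-set perturbations, which is too conservative (or outright fails) when the released quantity is the data-dependent stopping point of a sequential algorithm; the paper's expected differential privacy instead randomizes the termination time under an expected sensitivity bound.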