TY - CHAP
T1 - Differentially Private Anomaly Detection for Interconnected Systems
AU - Ferrari, Riccardo M.G.
AU - Degue, Kwassi H.
AU - Le Ny, Jérôme
PY - 2021
AB - Detecting anomalies in large-scale distributed systems, such as cyber-attacks launched against intelligent transportation systems and other critical infrastructures, or epidemics spreading in human populations, requires the collection and processing of privacy-sensitive data from individuals, such as location traces or medical records. Differential privacy is a powerful theoretical tool that, by applying so-called randomized mechanisms to individual data, makes it possible to perform meaningful computations at the population level whose results are insensitive, in a probabilistic sense, to the data of any given individual. So far, differential privacy has been applied to several control problems, such as distributed optimization and estimation, filtering, and anomaly detection. Still, several issues remain open regarding the balance between the accuracy of the computation results and the privacy level guaranteed to individuals, as well as the dependence of this balance on the type of randomized mechanism used and on where, in the data acquisition and processing pipeline, the noise is applied. In this chapter, we explore the possibility of using differentially private mechanisms to develop fault-detection algorithms with privacy guarantees, and we discuss the resulting trade-offs between detection performance and privacy level.
UR - http://www.scopus.com/inward/record.url?scp=85107987094&partnerID=8YFLogxK
DO - 10.1007/978-3-030-65048-3_10
M3 - Chapter
AN - SCOPUS:85107987094
SN - 978-3-030-65047-6
T3 - Lecture Notes in Control and Information Sciences
SP - 203
EP - 230
BT - Safety, Security and Privacy for Cyber-Physical Systems
A2 - Ferrari, Riccardo M.G.
A2 - Teixeira, André M.H.
PB - Springer
ER -