On the Difficulty of Hiding Keys in Neural Networks / Kupek, Tobias; Pasquini, Cecilia; Böhme, Rainer. - (2020), pp. 73-78. (Paper presented at the 8th ACM Workshop on Information Hiding and Multimedia Security, IH&MMSec 2020, held in Denver, CO, 22nd–24th June 2020) [10.1145/3369412.3395076].
On the Difficulty of Hiding Keys in Neural Networks
Kupek, Tobias; Pasquini, Cecilia; Böhme, Rainer
2020-01-01
Abstract
To defend neural networks against malicious attacks, recent approaches propose the use of secret keys in the training or inference pipelines of learning systems. While this concept is innovative and the results are promising in terms of attack mitigation and classification accuracy, its effectiveness relies on the secrecy of the key, an aspect that is often not discussed. In this short paper, we explore this issue for a recently proposed key-based deep neural network. White-box experiments on multiple models and datasets, using the original key-based method and our own extensions, show that secret key bits can currently be extracted with relatively limited effort.
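To make the setting concrete, the sketch below illustrates the general class of defenses the abstract refers to: a secret bit string parameterizes a transform in the inference pipeline, and a white-box adversary who can probe that pipeline tries to recover the bits. This is only a minimal illustrative stand-in written for this record, not the specific key-based scheme analysed in the paper; the block-wise pixel-flip transform, block size, and probing strategy are assumptions chosen for simplicity.

```python
import numpy as np

BLOCK = 4  # block size of the keyed transform (assumption, not from the paper)

def keyed_transform(img, key_bits):
    """Hypothetical keyed preprocessing: each secret key bit decides whether
    the corresponding image block has its pixel values flipped (1 - x)."""
    out = img.copy()
    h, w = img.shape
    idx = 0
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            if key_bits[idx]:
                out[y:y + BLOCK, x:x + BLOCK] = 1.0 - out[y:y + BLOCK, x:x + BLOCK]
            idx += 1
    return out

def recover_key_bits(transform, h, w):
    """White-box-style probe: feed an all-zero image and read each key bit
    off whether the corresponding block was altered by the transform."""
    observed = transform(np.zeros((h, w)))
    bits = []
    for y in range(0, h, BLOCK):
        for x in range(0, w, BLOCK):
            bits.append(int(observed[y:y + BLOCK, x:x + BLOCK].mean() > 0.5))
    return bits

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    h = w = 8
    true_key = [int(b) for b in rng.integers(0, 2, size=(h // BLOCK) * (w // BLOCK))]
    recovered = recover_key_bits(lambda im: keyed_transform(im, true_key), h, w)
    print("key recovered:", recovered == true_key)
```

The point of the toy example is only that a keyed transform which acts independently and observably on parts of the input leaks its key bits to an adversary with white-box access; the paper's experiments make the analogous observation for the actual key-based deep neural network under study.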
| File | Type | License | Size | Format | Access |
|---|---|---|---|---|---|
| IH2020.pdf | Publisher's version (Publisher's layout) | All rights reserved | 1.04 MB | Adobe PDF | Restricted to repository managers |