
Illuminating the Dark or how to recover what should not be seen in FE-based classifiers

Abstract : Classification algorithms and tools are becoming ever more powerful and pervasive. Yet, for some use cases, it is necessary to protect data privacy while still benefiting from the functionalities they provide. Among the tools that may be used to ensure such privacy, this paper focuses on functional encryption. These relatively new cryptographic primitives enable the evaluation of functions over encrypted inputs, outputting cleartext results. In theory, this property makes them well suited to a 'privacy by design' approach to classification over encrypted data: the classification algorithm runs over encrypted inputs (i.e. without knowing the inputs), while only the resulting input classes are revealed in the clear. In this paper, we study the security and privacy issues of classifiers built with today's practical functional encryption schemes. We provide an analysis of the information leakage about the input data processed in the encrypted domain with state-of-the-art functional encryption schemes. This study, based on experiments run on the MNIST and Census Income datasets, shows that neural networks are able to partially recover information that should have been kept secret. Hence, great care should be taken when using the currently available functional encryption schemes to build privacy-preserving classification services. It should be emphasized that this work does not attack the cryptographic security of functional encryption schemes; rather, it warns the community that these schemes should be used with caution for some use cases, and that the current state of the art may lead to operational weaknesses that could be mitigated in the future, once more powerful functional encryption schemes are available.
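The kind of leakage the abstract warns about can be illustrated with a toy sketch (a deliberately simplified assumption, not the paper's neural-network attack): with an inner-product functional encryption scheme, each function key for a vector w_i lets the evaluator learn the cleartext value ⟨w_i, x⟩. If the evaluator accumulates keys for d linearly independent vectors, the "hidden" input x in R^d is fully determined by the decrypted values and can be recovered by solving a linear system. All names and dimensions below are illustrative.

```python
import numpy as np

# Hypothetical illustration: an evaluator holds function keys for the
# rows of W. Decryption reveals the inner products W @ x in the clear,
# which is exactly what inner-product FE is designed to output.
rng = np.random.default_rng(0)
d = 8
x_secret = rng.normal(size=d)      # plaintext input, never seen directly

W = rng.normal(size=(d, d))        # d function keys, one per row
leaked = W @ x_secret              # cleartext inner products the evaluator gets

# With d linearly independent keys, the "encrypted" input is recoverable
# exactly by solving the linear system W x = leaked.
x_recovered = np.linalg.solve(W, leaked)
assert np.allclose(x_recovered, x_secret)
```

The paper's setting is harder (fewer outputs than input dimensions, so exact inversion is impossible), which is why it resorts to learned models to recover partial information rather than a closed-form solve.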
Contributor : Caroline Fontaine
Submitted on : Thursday, December 17, 2020 - 6:12:07 PM
Last modification on : Saturday, June 26, 2021 - 3:42:19 AM
Long-term archiving on : Thursday, March 18, 2021 - 8:30:42 PM




Distributed under a Creative Commons Attribution - NonCommercial - NoDerivatives 4.0 International License



Sergiu Carpov, Caroline Fontaine, Damien Ligier, Renaud Sirdey. Illuminating the Dark or how to recover what should not be seen in FE-based classifiers. Proceedings of Privacy Enhancing Technologies, 2020, 2020 (2), pp.5-23. ⟨10.2478/popets-2020-0015⟩. ⟨hal-02413588⟩


