Homomorphic Encryption
Homomorphic encryption (HE) allows computation on encrypted data without decrypting it, addressing privacy concerns in machine learning. Current research focuses on applying HE to preserve privacy in large language models (LLMs) and federated learning (FL), typically through optimizations for specific architectures such as transformers and new algorithms that reduce computational overhead and close known vulnerabilities. The field is significant because it enables secure collaborative machine learning and AI services that protect sensitive data, with impact across sectors including healthcare, finance, and personal data management.
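To make the core idea concrete, here is a minimal, self-contained sketch of an additively homomorphic scheme (textbook Paillier with toy parameters), not the TFHE or garbled-circuit constructions used in the papers below: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a server can aggregate encrypted values it cannot read. All parameters and function names are illustrative.

```python
import math
import secrets

# Toy Paillier keypair. Hard-coded small primes for illustration only;
# real deployments use moduli of 2048 bits or more.
p, q = 1009, 1013
n = p * q                      # public modulus
n_sq = n * n
g = n + 1                      # standard choice of generator
lam = math.lcm(p - 1, q - 1)   # private key part: lambda(n)
mu = pow(lam, -1, n)           # private key part: modular inverse used in decryption

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2 for a fresh random r coprime to n."""
    while True:
        r = secrets.randbelow(n - 1) + 1
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    return (((pow(c, lam, n_sq) - 1) // n) * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts.
a, b = 42, 1337
c_sum = (encrypt(a) * encrypt(b)) % n_sq
assert decrypt(c_sum) == (a + b) % n

# Scalar multiplication: exponentiating a ciphertext scales the plaintext.
c_scaled = pow(encrypt(a), 5, n_sq)
assert decrypt(c_scaled) == (5 * a) % n
print("homomorphic sum:", decrypt(c_sum), "scaled:", decrypt(c_scaled))
```

Paillier supports only additions and scalar multiplications on ciphertexts; fully homomorphic schemes such as TFHE, used in the tree-based and deep-network inference papers listed here, additionally support encrypted multiplications and non-linear operations at the cost of significantly higher computational overhead.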
Papers
Dash: Accelerating Distributed Private Convolutional Neural Network Inference with Arithmetic Garbled Circuits
Jonas Sander, Sebastian Berndt, Ida Bruhns, Thomas Eisenbarth
Privacy-Preserving Tree-Based Inference with TFHE
Jordan Frery, Andrei Stoian, Roman Bredehoft, Luis Montero, Celia Kherfallah, Benoit Chevallier-Mames, Arthur Meyre
Deep Neural Networks for Encrypted Inference with TFHE
Andrei Stoian, Jordan Frery, Roman Bredehoft, Luis Montero, Celia Kherfallah, Benoit Chevallier-Mames