Paper Deep Dive – HETAL
Date:
Summary:
In this lab meeting, I reviewed HETAL: Efficient Privacy-preserving Transfer Learning with Homomorphic Encryption (ICML 2023). Transfer learning is widely used for data-scarce problems: a pre-trained model is fine-tuned on a small task-specific dataset. While previous studies on homomorphic encryption focused mainly on encrypted inference, HETAL is the first practical scheme for encrypted training, fine-tuning a classifier on client data that remains encrypted (under the CKKS scheme) throughout.
🔑 Research Question:
- How can transfer learning be made both privacy-preserving and efficient when client data must remain encrypted?
⚙️ Key Mechanism:
- Encrypted Softmax Approximation: A softmax approximation algorithm that stays precise over wide input ranges while using only HE-friendly operations (additions and multiplications).
- Efficient Matrix Multiplication: New encrypted matrix multiplication algorithms that are 1.8×–323× faster than the best prior methods.
- End-to-end Encrypted Training: Adopted validation-based early stopping, achieving accuracy comparable to plaintext training.
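Since CKKS supports only additions and multiplications on ciphertexts, both the exponential and the division inside softmax must be replaced by polynomials. Below is a minimal plaintext sketch of that idea, not the paper's actual algorithm (HETAL additionally approximates the max itself and extends the valid input domain); the function names and the degree/iteration counts are my own choices for illustration.

```python
import numpy as np
from math import factorial

def poly_exp(x, degree=12):
    # Taylor polynomial of exp around 0; accurate only on a bounded
    # interval, which is why inputs are first shifted by the max.
    return sum(x**k / factorial(k) for k in range(degree + 1))

def poly_inverse(a, iters=8):
    # Newton's iteration x <- x*(2 - a*x) converges to 1/a for a in (0, 2)
    # using only additions and multiplications, the cheap CKKS operations.
    x = 2 - a
    for _ in range(iters):
        x = x * (2 - a * x)
    return x

def he_friendly_softmax(z):
    # Plaintext simulation of a division-free softmax. In HETAL the max
    # is itself computed with a polynomial comparison approximation; here
    # np.max is called directly to keep the sketch short (simplification).
    z = z - np.max(z)                 # shift inputs into a bounded range
    e = poly_exp(z)
    s = np.sum(e)
    # Rescale the denominator into (1/2, 1] so the inverse iteration
    # converges. A real HE circuit would use a fixed public bound rather
    # than this data-dependent log (simplification).
    scale = 2.0 ** np.ceil(np.log2(s))
    return e * (poly_inverse(s / scale) / scale)
```

On well-scaled logits this tracks the exact softmax closely; the engineering difficulty HETAL addresses is keeping such approximations accurate and cheap across the value ranges that actually occur during training.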
📊 Main Results:
- Fine-tuning on encrypted client data succeeded end to end (a milestone relative to prior inference-only work).
- Training times ranged from 567 to 3442 seconds (under an hour) across five benchmark datasets.
- Accuracy was comparable to non-encrypted training.
⚠️ Limitations:
- Accuracy degradation in certain tasks due to approximation constraints.
- Evaluation limited to moderate-sized models/datasets.
Slides:
PDF (Korean)