
Publication

Design and Security Analysis of Lattice-based Post-Quantum Encryption

Book - Dissertation

More and more electronic devices are connected to the internet. To secure these devices, designers rely on cryptographic standards that provide encryption, authentication and data integrity. However, future advances in quantum computing threaten the security of these standards, as a sufficiently powerful quantum computer is predicted to break the underlying hard mathematical problems. It is therefore of the utmost importance to find new standards that are safe against quantum computers. The design and analysis of such quantum-safe algorithms is studied in the field of post-quantum cryptography. In 2016, NIST announced its post-quantum standardization process, soliciting candidate public-key encryption and signature schemes to replace the current standards. In the wake of this announcement, 82 candidates were submitted for standardization. At the time of writing, the process has just entered its third and final round, with 4 finalists and 5 backup schemes competing to become the new encryption standard, and 3 finalists and 3 backup schemes in the running for the new digital signature standard.

In this thesis, I focus on the encryption side of post-quantum cryptography. In the first part, I present our own encryption scheme, called Saber, which was selected as a finalist in the NIST Post-Quantum Cryptography standardization process. Saber was devised with a strong focus on simplicity, efficiency and flexibility. I show the design choices that lead to these properties and analyse the correctness and security of our algorithm.

Many post-quantum encryption schemes are subject to decryption failures: even after a proper execution of the algorithm, there is a (very small) chance that the message or key is not transmitted correctly. In the second part of the thesis, I show that these decryption failures open up a new attack vector, in which failing ciphertexts are collected and used to reconstruct the secret key. Furthermore, I introduce an attack strategy, called failure boosting, that optimizes the search for these failing ciphertexts.

To reduce the impact of decryption failures, some encryption schemes resort to error-correcting codes that lower the failure probability. The third part of this thesis examines two complications that can arise from the use of error correction: an underestimation of the failure probability due to dependencies between individual errors, and side-channel attacks enabled by complex implementations of error decoding. I discuss these complications and show their relevance by applying them to two NIST submissions.

To provide chosen-ciphertext security, lattice-based encryption schemes are typically combined with a generic transformation such as the Fujisaki-Okamoto transformation. This transformation includes a re-encryption of the message into the ciphertext, which generally dominates the computational cost of decapsulation. The fourth and final contribution of this thesis is a new transformation, designed specifically for lattice-based encryption schemes, that avoids the expensive re-encryption step by validating the error term of the ciphertext instead. I discuss the efficiency and security of this transformation and apply our design to the NIST submission ThreeBears.
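Saber's hardness rests on the Module Learning-With-Rounding (Mod-LWR) problem, in which deterministic rounding replaces the explicitly sampled noise of LWE-style schemes. The toy sketch below illustrates only that rounding idea: it uses single integers instead of Saber's vectors of polynomials, and is not Saber's actual algorithm. The moduli mirror Saber's choice of powers of two (q = 2^13, p = 2^10), which is what makes the rounding a cheap bit operation.

```python
import secrets

q = 2**13  # large modulus; a power of two, as in Saber
p = 2**10  # rounding modulus, p < q

def round_q_to_p(x: int) -> int:
    # Scale x from Z_q down to Z_p. The low-order bits discarded here
    # play the role of the noise term that LWE-based schemes must
    # sample explicitly, so no randomness sampling is needed.
    return ((x * p + q // 2) // q) % p

def lwr_sample(s: int) -> tuple[int, int]:
    # One toy LWR sample (a, b) with b = round(a * s mod q). Real
    # Mod-LWR schemes work over modules of polynomial rings.
    a = secrets.randbelow(q)
    return a, round_q_to_p(a * s % q)
```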
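Failure boosting, from the second part, exploits the fact that the failure probability is not uniform over ciphertexts: ciphertexts whose error terms have larger norm are more likely to trigger a decryption failure, so an attacker can precompute and submit only those. The simulation below uses a made-up Gaussian failure model with an invented threshold, not the thesis's actual estimator, purely to show the boosting effect.

```python
import random

THRESHOLD = 3.0  # invented failure threshold

def decryption_fails(error_norm: float) -> bool:
    # Toy model: a failure occurs when the secret-dependent noise,
    # whose spread grows with the ciphertext error norm, overflows.
    return abs(random.gauss(0.0, error_norm)) > THRESHOLD

def failure_rate(error_norm: float, trials: int = 200_000) -> float:
    return sum(decryption_fails(error_norm) for _ in range(trials)) / trials

if __name__ == "__main__":
    print("random ciphertexts: ", failure_rate(1.0))  # ~0.3% fail
    print("boosted ciphertexts:", failure_rate(1.5))  # ~4.6% fail
```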
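The underestimation discussed in the third part arises when the bit errors entering an error-correcting code are treated as independent even though they share a common source (the same secret and ciphertext error vectors). The toy simulation below uses an invented correlation model and invented code parameters: both models have the same ~1% per-bit error rate, but a shared factor makes the correlated errors arrive in bursts, so a code correcting up to T errors fails far more often than the independence assumption predicts.

```python
import random

N_BITS = 128   # bits protected by the code (invented)
T = 5          # the code corrects up to T bit errors (invented)
P_BIT = 0.01   # marginal per-bit error rate, identical in both models

def errors_independent() -> int:
    return sum(random.random() < P_BIT for _ in range(N_BITS))

def errors_correlated() -> int:
    # A shared Gaussian factor nudges every bit at once. The threshold
    # 3.29 keeps the marginal per-bit error rate at ~1%, matching the
    # independent model, but the errors are now strongly correlated.
    shared = random.gauss(0.0, 1.0)
    return sum(random.gauss(shared, 1.0) > 3.29 for _ in range(N_BITS))

def block_failure_rate(sampler, trials: int = 20_000) -> float:
    # The block fails whenever more than T bits flip at once.
    return sum(sampler() > T for _ in range(trials)) / trials

if __name__ == "__main__":
    print("independence assumption:", block_failure_rate(errors_independent))
    print("correlated errors:      ", block_failure_rate(errors_correlated))
```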
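The re-encryption step of the Fujisaki-Okamoto transformation is sketched below in a minimal key-encapsulation skeleton. The toy deterministic PKE (an insecure hash pad, with the public and secret key collapsed into a single value) is only scaffolding and stands in for no real scheme; the point is the shape of decapsulate(), which must re-derive the encryption coins from the decrypted message and redo a full encryption just to validate the ciphertext. That full re-encryption is the cost the thesis's new transformation removes by checking the ciphertext's error term instead.

```python
import hashlib
import secrets

def toy_encrypt(pk: bytes, m: bytes, coins: bytes) -> bytes:
    # Stand-in deterministic PKE (insecure): pads the 32-byte message
    # with a hash of the key and coins. Only here to make the flow run.
    pad = hashlib.sha3_256(pk + coins).digest()
    return coins + bytes(x ^ y for x, y in zip(m, pad))

def toy_decrypt(sk: bytes, ct: bytes) -> bytes:
    coins, body = ct[:32], ct[32:]
    pad = hashlib.sha3_256(sk + coins).digest()  # toy: pk and sk coincide
    return bytes(x ^ y for x, y in zip(body, pad))

def encapsulate(pk: bytes) -> tuple[bytes, bytes]:
    m = secrets.token_bytes(32)
    coins = hashlib.sha3_256(b"coins" + m).digest()  # derandomization
    ct = toy_encrypt(pk, m, coins)
    return ct, hashlib.sha3_256(m + ct).digest()     # (ciphertext, key)

def decapsulate(sk: bytes, pk: bytes, ct: bytes) -> bytes:
    m = toy_decrypt(sk, ct)
    coins = hashlib.sha3_256(b"coins" + m).digest()
    # Re-encryption check: a deterministic re-encryption of the
    # recovered message must reproduce the ciphertext bit for bit.
    if toy_encrypt(pk, m, coins) != ct:
        # Implicit rejection: return a pseudorandom key, so submitting
        # malformed ciphertexts teaches the attacker nothing.
        return hashlib.sha3_256(b"reject" + sk + ct).digest()
    return hashlib.sha3_256(m + ct).digest()

if __name__ == "__main__":
    key = secrets.token_bytes(32)  # toy: the same value serves as pk and sk
    ct, k1 = encapsulate(key)
    assert decapsulate(key, key, ct) == k1
```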
Publication year: 2021
Accessibility: Open