How to navigate the transition to post-quantum cryptography

Dr. Víctor Mateu, Acting Chief Researcher, Cryptography Research Center, Technology Innovation Institute.

Security professionals worldwide are preparing for a major upgrade: a migration to new post-quantum cryptographic standards as the era of quantum computing draws closer to reality.

The U.S. National Institute of Standards and Technology (NIST) has been leading a standardisation process to transition from classical public-key cryptosystems to quantum-resistant alternatives.  

Governments and businesses can now plan their transition to post-quantum cryptography (PQC) to ensure long-term data security against quantum-enabled threats. 

However, this shift must be approached with caution to avoid unintended vulnerabilities. 

Recent research from the Technology Innovation Institute (TII)'s Cryptography Research Center (CRC) in Abu Dhabi and the Polytechnic University of Turin highlights a key concern: schemes that rely on variants of the computationally hard problems underpinning PQC algorithms, whether to improve performance or to provide added functionality, require additional scrutiny.

An example is the Linear Code Equivalence (LCE) problem, which plays a role in PQC signature schemes.

The study, Don't Use it Twice! Solving Relaxed Linear Code Equivalence Problems, warns that modifying computational problems, even slightly, can significantly change their complexity, sometimes making them solvable with today's technology.

This is a caution to designers of new schemes to double-check that the tweaks they introduce don't lead to weaker security guarantees than intended.

Lessons from the Linear Code Equivalence Problem
LCE is the problem of recovering the linear transformation that maps one linear code to an equivalent one. It has been studied extensively by cryptanalysts and is used to construct secure cryptosystems such as digital signatures. The research warns against using relaxed versions of LCE in cryptographic applications without rigorous security validation, which could lead to vulnerabilities.
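In standard notation (a sketch of the usual formulation, not drawn from the article itself), the search version of LCE can be stated as follows:

```latex
\textbf{LCE (search version).}
Given generator matrices $G, G' \in \mathbb{F}_q^{k \times n}$ of two
$[n,k]$ linear codes, find an invertible matrix
$S \in \mathrm{GL}_k(\mathbb{F}_q)$ and a monomial matrix
$Q \in \mathrm{Mono}_n(\mathbb{F}_q)$ (a permutation of the $n$ coordinates
combined with nonzero column scalings) such that
\[
  G' = S \, G \, Q .
\]
```

The monomial matrix $Q$ acts as the secret; the security of LCE-based signatures rests on the belief that recovering it from a single pair $(G, G')$ is infeasible for suitable parameters.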

A key takeaway is that even for well-established hard problems, providing additional data, such as multiple instances of a problem that share the same secret, can make it easier for attackers to recover the secret information. This serves as a reminder to designers that seemingly minor adjustments to cryptographic structures can unintentionally reduce security. 
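Concretely, and again assuming the standard formulation sketched above, the situation the paper's title warns against is an attacker who sees two independent instances that reuse the same secret:

```latex
\[
  G_1' = S_1 \, G_1 \, Q
  \qquad \text{and} \qquad
  G_2' = S_2 \, G_2 \, Q ,
\]
```

where the same monomial matrix $Q$ links both pairs. Each pair on its own is a hard LCE instance, but the shared secret correlates them, and it is exactly this kind of extra structure that can make recovery tractable.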

While the study highlights potential vulnerabilities, it by no means suggests abandoning PQC development. Instead, organizations should begin transitioning to quantum-safe cryptography while keeping in mind the importance of careful validation and measured adoption. 

For example, security practitioners should focus on rigorous cryptanalysis to assess the long-term security of any PQC scheme built on novel or modified computational problems.  

They must also avoid relying on less-studied assumptions, or at least approach them with skepticism, to ensure that relaxations of known problems don't introduce unintended vulnerabilities.

The transition to PQC should be a gradual process, informed by ongoing cryptanalysis and contributions from the global cryptographic community. Refinement will be a natural part of that journey in the coming years.

The Road Ahead
The industry must navigate this shift with an understanding that cryptographic design is inherently iterative. New threats emerge and countermeasures must adapt accordingly.  

Governments and organizations embarking on their PQC migration journey must recognise that while PQC is still maturing, it presents an exciting opportunity to build a stronger, more resilient cryptographic foundation for the future.  

This opinion piece is authored by Dr. Víctor Mateu, Acting Chief Researcher, Cryptography Research Center at TII.
