Overview
The Quantum-Secure Internet Browsing (IB) pilot demonstrates the successful system-level integration of Post-Quantum Cryptography (PQC) into a real-world client-server setup. The objective is to validate that PQC, as standardized by NIST, can be adopted across multiple software layers while preserving security, interoperability, and usability in practice. This is achieved efficiently by implementing PQC functionality in loadable modules and browser extensions, requiring only minimal changes to large pre-existing projects such as OpenSSL and Firefox.
Architecture and building blocks
The pilot demonstrator consists of the following key components:
- OS: Fedora Linux (client and server)[1].
- Server side: OpenSSL + Provider[2].
- Client side: Firefox / NSS[3] + qryptotoken, plus a wallet for Verifiable Credentials[4].
These components form a complete client-server setup capable of establishing secure TLS 1.3 connections using hybrid post-quantum and traditional cryptographic algorithms. On the server side, OpenSSL is configured with a post-quantum provider, Aurora, that enables support for ML-KEM-768 and ML-DSA-65, in line with the NIST standardization process. On the client side, Firefox is built with NSS and extended through the external qryptotoken PKCS#11 module, which delegates key exchange and signature operations to a Rust-based implementation of the same algorithms.
The system was tested under multiple configurations to ensure compatibility and validate the performance of PQ/T hybrid handshakes. This integration demonstrates that post-quantum cryptographic support can be added to existing infrastructure in a modular way, without requiring disruptive changes to the software architecture or user-facing behaviour.
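To make the client-side configuration concrete, the sketch below (illustrative only, not taken from the pilot code base) shows how an OpenSSL 3.x TLS 1.3 client can be pointed at a PQ-enabled server and asked to offer a PQ/T hybrid key-exchange group. The group name X25519MLKEM768 and the test hostname are assumptions that depend on the OpenSSL version, the loaded provider, and the chosen endpoint.

```c
/*
 * Illustrative sketch: a TLS 1.3 client that offers a PQ/T hybrid group and
 * reports what was negotiated. Requires OpenSSL 3.x; the group name
 * "X25519MLKEM768" must be supplied either by OpenSSL itself (3.5+) or by a
 * loaded PQC provider, and the hostname below is only an example endpoint.
 * Build with: cc pqtls.c -lssl -lcrypto
 */
#include <stdio.h>
#include <openssl/bio.h>
#include <openssl/ssl.h>

int main(void)
{
    const char *host = "pq.cloudflareresearch.com"; /* example PQ test server */
    SSL_CTX *ctx = SSL_CTX_new(TLS_client_method());
    SSL *ssl = NULL;
    BIO *bio = NULL;
    char target[256];

    if (ctx == NULL)
        return 1;

    SSL_CTX_set_min_proto_version(ctx, TLS1_3_VERSION);
    SSL_CTX_set_default_verify_paths(ctx);
    SSL_CTX_set_verify(ctx, SSL_VERIFY_PEER, NULL);

    /* Offer the hybrid group first, with a traditional fallback. */
    if (SSL_CTX_set1_groups_list(ctx, "X25519MLKEM768:X25519") != 1) {
        fprintf(stderr, "hybrid group not available in this build\n");
        SSL_CTX_free(ctx);
        return 1;
    }

    bio = BIO_new_ssl_connect(ctx);
    if (bio == NULL) {
        SSL_CTX_free(ctx);
        return 1;
    }
    BIO_get_ssl(bio, &ssl);
    SSL_set_tlsext_host_name(ssl, host);
    snprintf(target, sizeof(target), "%s:443", host);
    BIO_set_conn_hostname(bio, target);

    if (BIO_do_handshake(bio) == 1) {
        const char *group =
            SSL_group_to_name(ssl, SSL_get_negotiated_group(ssl));
        printf("negotiated %s, key-exchange group: %s\n",
               SSL_get_version(ssl), group != NULL ? group : "unknown");
    } else {
        fprintf(stderr, "handshake failed\n");
    }

    BIO_free_all(bio);
    SSL_CTX_free(ctx);
    return 0;
}
```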
Relevant environment
The demonstrator targets Technology Readiness Level (TRL) 6[5]: "Technology demonstrated in relevant environment (industrially relevant environment in the case of key enabling technologies)". The building blocks are integrated with software such as Firefox that is already in widespread real-world use, rather than being developed only as standalone proofs-of-concept. The pilot demonstrates end-to-end functionality for PQC-enabled web browsing and user authentication, showing a viable, complete user experience in a controlled environment.
Use cases and validation plan
The demonstrator focuses on three web browsing use cases that progressively build on one another to explore post-quantum cryptography at both the transport and application layers:
- UC01: Establishing a secure connection using TLS 1.3 with post-quantum key exchange and server-side authentication.
- UC03: Extends UC01 by adding user authentication at the application level using plaintext Verifiable Credentials, implemented with both pure post-quantum and PQ/T hybrid cryptographic schemes.
- UC04: Further extends UC01 by using privacy-preserving, anonymous credentials for user authentication, enabling selective disclosure of identity attributes.
To validate these use cases, the demonstrator is tested against public servers configured to support PQ/T hybrid TLS. This includes interoperability testing with real-world PQC-enabled endpoints to assess performance, connection reliability, and correct negotiation of cipher suites. The validation helps ensure that the integration is robust across different environments and compatible with emerging standards.
Perspectives on the transition exercise in system integration
PQ Crypto at the OS level: from components to the whole system
Fedora Linux is one of the building blocks of the QUBIP development. To make PQ crypto part of Fedora Linux, we defined the scope as follows:
- Algorithms should be available in basic crypto libraries (OpenSSL, NSS).
- There should be an option to enable them system-wide.
Fedora manages the default settings of crypto algorithms via so-called crypto policies, which provide snippets of system-wide configuration for the major crypto libraries (OpenSSL, GnuTLS, NSS). Applications, such as web servers, normally inherit the system-wide settings from the crypto library they rely on. In most cases the settings defined by the crypto policies can be overridden, and we used this extensively for experimental purposes during the project development.
Different libraries take different approaches to implementing new cryptography. NSS ships all the algorithms it supports within its own library, and Fedora followed this approach. OpenSSL, on the other hand, uses loadable modules via its provider mechanism.
From an OS maintenance perspective, we chose the OQS project and two of its components, liboqs and oqs-provider, to bring PQ crypto to OpenSSL. We chose the Nginx web server for the server-side experiments. For the initial attempts we configured Nginx to enable PQ cryptography manually; this worked without problems, so the same approach was kept for setting up the server side. We also ran interoperability tests against Google Chrome and Cloudflare infrastructure to ensure that we are interoperable with the rest of the world.
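As a minimal, hedged illustration of what "bringing PQ crypto to OpenSSL" means at the library level, the C sketch below loads a PQC provider at run time and asks the EVP layer whether an ML-KEM-768 implementation is now available. The provider name "oqsprovider" and the algorithm string are assumptions that depend on the provider version; on Fedora the same effect is normally achieved through configuration (crypto policies and the OpenSSL configuration file) rather than application code.

```c
/*
 * Sketch only: load a PQC provider into OpenSSL 3.x at run time and check
 * whether an ML-KEM-768 implementation is available. The default provider is
 * kept loaded for the traditional algorithms used in hybrid schemes.
 * Build with: cc pqprov.c -lcrypto
 */
#include <stdio.h>
#include <openssl/provider.h>
#include <openssl/evp.h>

int main(void)
{
    OSSL_PROVIDER *deflt = OSSL_PROVIDER_load(NULL, "default");
    OSSL_PROVIDER *pqc   = OSSL_PROVIDER_load(NULL, "oqsprovider");

    if (pqc == NULL) {
        fprintf(stderr, "PQC provider not found (check OPENSSL_MODULES)\n");
        OSSL_PROVIDER_unload(deflt);
        return 1;
    }

    /* Ask the EVP layer whether any loaded provider offers the KEM;
     * the exact algorithm name may differ between providers. */
    EVP_KEM *kem = EVP_KEM_fetch(NULL, "ML-KEM-768", NULL);
    printf("ML-KEM-768 %savailable\n", kem != NULL ? "" : "NOT ");

    EVP_KEM_free(kem);
    OSSL_PROVIDER_unload(pqc);
    OSSL_PROVIDER_unload(deflt);
    return 0;
}
```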
Preparing for Pilot demonstration
In August 2024, NIST published the final standards for the first PQ algorithms: ML-KEM (based on Kyber) for key exchange, and ML-DSA (based on Dilithium) for digital signatures. We updated liboqs and oqs-provider to the relevant versions and decided to simplify the setup for anyone who wants to experiment (including, of course, ourselves). Since a proper TLS server installation required several steps (installing liboqs and oqs-provider, configuring OpenSSL to load oqs-provider, and enabling PQ cryptography system-wide), Red Hat crafted a dedicated crypto policy for enabling PQ crypto, named PQ-TEST. A Linux container image, pq-container[6], was then created to automate these steps for any user who wants a PQ-ready installation. The container is based on Fedora 42, the latest released version of Fedora at the time of writing, and ships liboqs 0.12 and oqs-provider 0.8. To run test servers, several containers with slightly different Nginx configurations were started on the machine dedicated to experiments.
System integration at the library level: OpenSSL+Aurora
Aurora served as a compatibility layer between OpenSSL (slotting into its provider architecture) and various implementations of PQC algorithms, particularly ML-KEM-768 for TLS handshakes and ML-DSA-65 for digital signatures and X.509 certificates. The loadable module design of the provider architecture enabled different implementations to be used with no changes required within the implementations themselves or within OpenSSL, demonstrating a high level of cryptographic agility.
The OQS provider was chosen as a functional compatibility target, as it already provided the functionality that was being developed in Aurora. This enabled the other partners to develop their building blocks at their own pace by using the OQS provider while certain features they needed were not yet implemented in the Aurora provider, preparing for a seamless switchover once Aurora provides the required features.
Using OQS as a functional compatibility target also aided the internal development process of the Aurora provider, as OQS was used for interoperability testing and occasionally as a reference implementation to understand aspects of the OpenSSL provider interface. For example, after implementing ML-DSA-65 key generation in Aurora, using the generated keys to issue certificates with the OQS provider offered reassurance that key generation was working correctly before moving on to implementing certificate generation in Aurora. Interoperability testing against the publicly available PQ testing servers hosted by Cloudflare was also beneficial in developing Aurora's TLS capabilities.
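The key-generation step mentioned above can be pictured with the following sketch (not Aurora's actual code): it requests an ML-DSA-65 key pair through OpenSSL's generic EVP_PKEY interface, and OpenSSL routes the request to whichever loaded provider registers that algorithm name, which is exactly what makes Aurora and the OQS provider interchangeable behind the same application code. The algorithm name shown is an assumption that may differ between providers.

```c
/*
 * Sketch, not Aurora's actual code: generate an ML-DSA-65 key pair through
 * the generic EVP_PKEY API. The call is served by whichever loaded provider
 * (Aurora, oqs-provider, or OpenSSL 3.5's built-in code) registers the name.
 * Build with: cc mldsa_keygen.c -lcrypto
 */
#include <stdio.h>
#include <openssl/evp.h>

int main(void)
{
    EVP_PKEY_CTX *ctx = EVP_PKEY_CTX_new_from_name(NULL, "ML-DSA-65", NULL);
    EVP_PKEY *pkey = NULL;

    if (ctx == NULL || EVP_PKEY_keygen_init(ctx) <= 0) {
        fprintf(stderr, "ML-DSA-65 not offered by any loaded provider\n");
        EVP_PKEY_CTX_free(ctx);
        return 1;
    }

    if (EVP_PKEY_generate(ctx, &pkey) <= 0) {
        fprintf(stderr, "key generation failed\n");
        EVP_PKEY_CTX_free(ctx);
        return 1;
    }

    /* The key could now back an X.509 certificate request or TLS auth. */
    printf("generated an ML-DSA-65 key pair\n");

    EVP_PKEY_free(pkey);
    EVP_PKEY_CTX_free(ctx);
    return 0;
}
```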
System integration at the library level: Firefox/NSS+qryptotoken
The integration of PQC into Firefox via the Network Security Services (NSS) library and the externally developed Rust-based PKCS#11 module qryptotoken was a key milestone in enabling quantum-secure internet browsing. This soft token exposes ML-KEM-768 for key exchange and ML-DSA-65 for digital signatures; both algorithms are aligned with the NIST PQC standardization process. Through NSS's pluggable architecture, these algorithms were injected into Firefox's TLS 1.3 handshake flow without requiring deep modifications to Firefox's core cryptographic engine.
To make this possible, several enhancements were made to NSS to allow the negotiation of PQ/T hybrid key exchanges and authentication mechanisms during TLS sessions. Since NSS does not natively support KEMs or ML-DSA, custom vendor extensions were introduced to define new mechanisms and signature schemes. The browser was then configured to load the qryptotoken module at runtime, allowing all cryptographic operations to be routed through the external implementation.
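Loading a PKCS#11 module at run time can be sketched with nothing more than the standard Cryptoki entry points, as below; the module path is hypothetical, and the PQC mechanisms that qryptotoken registers would appear as vendor-defined values in the token's mechanism list. NSS performs an equivalent loading step internally when the module is added to its module database.

```c
/*
 * Illustrative sketch of loading a PKCS#11 soft token such as qryptotoken at
 * run time, using only standard Cryptoki calls. The module path is
 * hypothetical; adjust it to wherever the module is installed.
 * Build with: cc loadtoken.c $(pkg-config --cflags p11-kit-1) -ldl
 */
#include <stdio.h>
#include <dlfcn.h>
#include <p11-kit/pkcs11.h>

/* Standard prototype of the PKCS#11 bootstrap function. */
typedef CK_RV (*get_function_list_fn)(CK_FUNCTION_LIST_PTR_PTR);

int main(void)
{
    /* Hypothetical install location of the Rust-based module. */
    void *dso = dlopen("/usr/lib64/pkcs11/libqryptotoken.so", RTLD_NOW);
    if (dso == NULL) {
        fprintf(stderr, "dlopen: %s\n", dlerror());
        return 1;
    }

    get_function_list_fn get_list =
        (get_function_list_fn)dlsym(dso, "C_GetFunctionList");
    CK_FUNCTION_LIST_PTR p11 = NULL;

    if (get_list == NULL || get_list(&p11) != CKR_OK ||
        p11->C_Initialize(NULL) != CKR_OK) {
        fprintf(stderr, "failed to initialize the PKCS#11 module\n");
        dlclose(dso);
        return 1;
    }

    /* Count the slots that currently have a token present. */
    CK_ULONG nslots = 0;
    if (p11->C_GetSlotList(CK_TRUE, NULL, &nslots) == CKR_OK)
        printf("module initialized, %lu slot(s) with a token present\n",
               (unsigned long)nslots);

    p11->C_Finalize(NULL);
    dlclose(dso);
    return 0;
}
```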
This setup made it possible to establish full PQ/T hybrid TLS 1.3 connections between a modified Firefox client and servers configured with compatible post-quantum cryptography, such as those using OpenSSL and Aurora. Firefox, running on Fedora, successfully completed real-world handshakes with public test servers including those from Cloudflare and the Open Quantum Safe project. These tests confirmed interoperability and correctness across the full TLS stack.
Throughout the integration process, particular attention was given to maintaining modularity, avoiding intrusive changes to upstream code, and enabling reproducible testing. Firefox was packaged with the modified NSS and the qryptotoken module into a pre-configured Flatpak bundle to simplify deployment. Logging and debugging tools were used to trace operations and validate handshake behaviour during end-to-end testing.
This integration shows how modern browsers can be extended to support advanced cryptographic mechanisms like PQC through well-structured interfaces. It also demonstrates the feasibility of injecting experimental algorithms at runtime, enabling flexible experimentation and deployment without sacrificing compatibility or security. The Firefox/NSS+qryptotoken stack forms a foundational client-side component of the QUBIP pilot and serves as a practical reference for post-quantum readiness in real-world web applications.
System integration at the application level
The Quantum-secure Internet Browsing pilot aims to achieve the following goals at the application level:
- Secure and privacy-preserving Internet browsing using Post-Quantum/Traditional (PQ/T) Hybrid methods.
- Testing realistic deployments of PQ/T Hybrid algorithms in existing applications through OpenSSL and NSS libraries.
- Exploring the transition to PQC of the Self-Sovereign Identity (SSI)[7] ecosystem in a web environment.
Concerning the last item, the SSI reference framework involves peer-to-peer interactions, but it is also intended to be used in the client-server architecture typical of the web. In this scenario, SSI is a promising decentralized alternative for implementing client authentication, taking advantage of Decentralized Identifiers (DIDs)[8] and Verifiable Credentials (VCs)[9], currently under standardization by the World Wide Web Consortium (W3C). The client establishes a secure communication channel with the server using the Transport Layer Security (TLS) protocol with server authentication only. Then, assuming the client already has its self-sovereign identity, it creates a Verifiable Presentation (VP), signs it, and sends it to the server for authentication. Upon successful authentication, the server also checks the client's VC claim(s) for authorization before granting access to the requested service or resource.
QUBIP has taken a practical step towards the PQC transition of the SSI ecosystem by designing and implementing plaintext VCs with both PQ-only and PQ/T hybrid approaches, as well as PQ anonymous VCs with selective disclosure capabilities. Note that PQ anonymous VCs are a privacy-preserving alternative to plaintext VCs, enabling the Holder to manage its VC by choosing the level of information disclosure (for more details, please refer to the previous blog post "Post-Quantum Verifiable Credentials"[10]).
LINKS has selected the IOTA Identity library[11] as the target of this transition exercise. This is a widely used SSI library written in the Rust programming language and is the result of a large, open source, community-led SSI project maintained by the IOTA Foundation. The library provides all the functionalities to handle W3C compliant DIDs, DID documents, VCs, and VPs. In essence, it is a general purpose SSI library and, therefore, the perfect target for a practical transition to PQC of the SSI ecosystem.
Among the software integrated in the pilot demonstrator, LINKS has also developed a browser extension, built on top of the Mozilla Firefox building block, that functions as a digital wallet, enabling users to manage their self-sovereign identity for authentication purposes. The wallet extension interacts with the IOTA Identity library, allowing the user to generate and store VCs and to present VPs.
Human-Centred Quantum-Resistant Security
User-Centric Approach
The User-Centric Approach (UCA) is a design and development philosophy that places the user, stakeholders, and citizens at the centre of all design decisions when creating digital products.
It is based on understanding the needs, objectives, and context of the users, and on actively involving them in all phases of the solution lifecycle.
To follow this approach, one should first gather user needs and requirements, and later onboard end users in the development and final validation of the product. The aim is to create digital products that make sense in the user's context and respond to their real needs. This encompasses ease of use, a sense of reward while interacting with the product, and the overall holistic experience of use. By understanding the user's goals and needs, the designer can create a digital product tailored to them. This benefits not only the user but also the designer, as the solution will be more effective and achieve higher user engagement and satisfaction.
As an iterative development and design methodology, the UCA involves users throughout the design process and puts them at the centre of all technical development and interface design decisions. It follows these core principles:
- The design is based upon an explicit understanding of user characteristics.
- The design is based upon a deep understanding of the user’s context.
- The solutions developed are refined by users’ evaluation.
- The development addresses the whole user experience and user context, not simply the use of the technology.
Many UCA methodologies follow the international standard ISO 13407:1999 on human-centred design and have been proven to lead to highly successful products.
It is worth noting that the ISO standard defines the Human-Centred Approach (HCA) rather than the UCA, as it considers the two terms synonymous. Its successor, ISO 9241-210:2010, prefers the former term, on the grounds that the approach affects all the humans involved in the system, not only the end users of the product. However, we justify the use of the term "User-Centric Approach" by its common usage in the market. Moreover, the UCA considers the relationship between the user and any type of element, whether an interactive system or not, for example the cases that Don Norman describes in the book The Design of Everyday Things[12].
A five-step design thinking process is used to identify problems and iterate solutions by focusing on users, with the goal of creating a product that is effectively tailored to the user's needs. This approach is both a work methodology and a design philosophy; hence, it should be adapted to the context of each project and its form adjusted accordingly. The process involves:
- Empathize: Understanding the users and their context.
- Define: Specifying requirements for both the business and user.
- Ideate: Designing solutions in accordance with the previous two steps.
- Prototype: Building testable versions of the designed solutions.
- Test: Evaluating the prototypes with users, then iterating and repeating the process.
To better explain the concept of UCA, it is useful to contrast it with initiatives and projects that, while involving users, cannot be called user-centric, since the user is not at the centre of the process. Some common examples are:
- Projects that are restricted to technological and academic endeavours.
- Projects that “use” users as tools for measuring the capabilities of a technology without caring for user feedback.
- Projects in which the user-centric cycle is not complete.
- Projects that believe that user centricity is only about the User Interface (UI).
The case for Internet Browsing
Web browsers are gateways to the internet, responsible for securing billions of daily interactions—from email to banking, from government services to private chats. The transition to quantum-resistant cryptography begins here. Yet browsers present a unique challenge: they are simultaneously complex technical platforms and everyday tools for non-expert users.
The goal is not only to ensure that the browser remains secure in a quantum future, but to explore how these changes are experienced by users. Do they notice differences in performance? Do they feel more secure? Do they even understand what’s changing?
If users don’t understand, trust, or adopt quantum-resistant solutions, we risk designing for theoretical security while failing in practice. This transition poses unique human factors challenges:
- Transparency vs. Complexity: Quantum-resistant cryptography is unfamiliar to most users. Overexposure risks confusion; underexposure risks invisibility. We must find ways to communicate changes meaningfully, without technical overload.
- Latency and Perception: Quantum-resistant operations can increase handshake times and message sizes. If browsing feels slower or less responsive, users may blame the browser—even when delays are tied to stronger security.
- Trust and Interpretability: Security is ultimately about trust. Users must believe that the browser is protecting them, and that changes are justified. The best cryptographic protocol won’t help if it erodes user confidence.
Testing in the wild, learning in context
To address these challenges, our Quantum-Secure Internet Browsing pilot integrates a multidisciplinary user evaluation process, drawing from cybersecurity, behavioural science, interaction design, and ethics.
We are working with a diverse sample of end users, testing real-world browsing sessions using the quantum-resistant enhanced browser. This includes:
- Quantitative telemetry, e.g. connection setup times, fallback behaviour.
- Surveys on trust, usability, and perceived security.
- Scenario-based tasks that reveal how users react to subtle cryptographic differences.
Our approach reflects established best practices in usable security and seeks to capture not just what works, but what is understood, accepted, and adopted.
Toward Quantum-Resistant Transition Guidelines
The pilot is not just a technical demo: it’s a critical step in producing concrete guidelines for the global transition to quantum-resistant internet systems.
By grounding our evaluation in real user feedback, we aim to deliver recommendations that reflect:
- Interface design principles for quantum-resistant browsing.
- Risk communication strategies for end-user environments.
- Deployment considerations for hybrid cryptographic schemes.
- Policy guidance for institutions managing large-scale transitions.
These guidelines will support not only browser developers, but also policymakers, IT managers, and standards bodies engaged in shaping a secure quantum-ready internet.
Conclusions
After achieving the Mid-Term Review milestone of successfully deploying the first working integration of all the building blocks, we now look forward to completing the deployment activities to provide all the planned functionalities. We will then start comprehensive validation activities involving end users, and will use that experience to produce a valuable handbook for the PQ/T transition, documenting the lessons learned, critical challenges, and resulting recommendations.
References
[1] https://qubip.eu/fedora-linux-transition-to-support-quantum-secure-internet-browsing/
[2] https://qubip.eu/transition-of-openssl-for-implementing-pq-t-tls/
[3] https://qubip.eu/transition-of-nss-and-firefox-to-support-the-quantum-secure-internet-browsing/
[4] https://qubip.eu/post-quantum-verifiable-credentials/
[5] https://en.wikipedia.org/wiki/Technology_readiness_level
[6] https://github.com/QUBIP/pq-container
[7] https://www.manning.com/books/self-sovereign-identity
[8] https://www.w3.org/TR/did-core/
[9] https://www.w3.org/TR/vc-data-model-2.0/
[10] https://qubip.eu/post-quantum-verifiable-credentials/
[11] https://github.com/iotaledger/identity.rs
[12] https://mitpress.mit.edu/9780262525671/the-design-of-everyday-things/