Identity standards have been streamlined and made more accessible. OAuth 2.0 and OpenID Connect (OIDC) offer many options for transferring personal data across internet services. They are very popular with social media, but also with online retail, health, and banking. In other words, anything online. Data travels from centrally held user stores to multiple parties, or between the different services of a single legal entity.
This article discusses how to select signatures; a follow-up article will address encryption. In the following sections, I talk about the signatures supported by the OAuth 2.0 and OIDC standards: HMAC, RSA, and EC; respectively, Hash-based Message Authentication Code, Rivest-Shamir-Adleman (a.k.a. private/public key cryptography), and Elliptic Curve (i.e., keys derived from said curves).
Table 1 has a lot of information; in fact, it is a copy/paste of the IETF documentation (so, not bedside reading material). Let’s break it down a bit. Looking at the supported signature solutions, we can recognize three types of signatures: HS[xxx], RS/PS[xxx], and ES[xxx], respectively corresponding to three families: HMAC, the RSA Signature Scheme with Appendix (RSASSA), and ECDSA (the Elliptic Curve Digital Signature Algorithm).
I would like to compare the different algorithms in terms of performance and security. I considered the following characteristics:
- Security level: The strength of a cryptographic primitive or function². It measures the number of operations an attacker would need to compromise the security.
- Performance and time complexity: The time complexity reflects the number of operations necessary to sign or verify a message. It is affected by the algorithm and the input size; therefore, we need to make sure the input size is identical for a fair comparison.
As I am not a cryptographer, the strength analysis is not explored in any depth. The actual intent is to review the performance of secure solutions for the different types of algorithms, so that it becomes easier to make an informed decision and select the best option for a given use case. When it comes to signatures, the most important characteristic is whether an attacker can forge that signature or not. We do not want keys to be guessed, nor do we want fraudulent messages to be granted legitimacy by cleverly reusing previous signatures.
For comparison purposes, I reduced the strength of the algorithms to their theoretical robustness, ignoring attacks on implementations or hardware flaws. This gives me a very black-and-white view: either a solution is acceptable, or it is not.
In choosing a signature algorithm for OAuth 2.0 and OIDC tokens, we want to select one that is secure, but also fast. To obtain performance data, I ran the signature algorithms on my laptop. For reference, here is the processing power available:
2.4 GHz Intel Core i9, 8 cores, with a T2 hardware encryption acceleration chip (no specification details known)
The tests presented here compute signatures for a string of 250 characters, which is roughly representative of an OAuth 2.0 or OIDC token. The size is the same for every algorithm presented. The test runs measure the time, in seconds, needed for the laptop to perform 1 million signatures and 1 million verifications.
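If you want to reproduce this kind of measurement on your own hardware, a minimal sketch using Python's standard library might look like the following. It times HMAC-SHA-256 signatures of a 250-character payload with `timeit`; the payload and key here are placeholders, and the iteration count is reduced from the article's 1 million for a quick run.

```python
import hashlib
import hmac
import timeit

# A 250-character payload, roughly the size of an OAuth 2.0 / OIDC token.
message = ("x" * 250).encode()
# A 256-bit key, matching the SHA-256 output size.
key = b"\x01" * 32

def sign():
    return hmac.new(key, message, hashlib.sha256).digest()

# Scale `number` up to 1_000_000 to reproduce the article's measurement.
seconds = timeit.timeit(sign, number=10_000)
print(f"10,000 HMAC-SHA-256 signatures in {seconds:.3f} s")
```

The same harness works for the asymmetric algorithms: swap the `sign` function for the relevant library call, keeping the message size constant for a fair comparison.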
The diagrams in Figures 1 and 2 illustrate the performance of signing and verifying messages by comparing the three types of algorithms head to head (note that two variations of the RSA algorithm are represented, so four algorithms appear on the diagrams). The HS256 algorithm does not show on the diagrams because its response time is so fast that the bars can hardly represent it. Already, it is obvious that the hashing solution is the most performant, both for signing and verifying. The diagrams also clearly show that asymmetric algorithms fall into two categories: solutions that are fast to sign but slow to verify (e.g., ES256), and solutions that are slow to sign but fast to verify (e.g., RS256 and PS256).
The following sections describe in more detail the different options available for each type of signature.
HS[xxx]: HMAC-based signatures
An HMAC signature uses hashing algorithms. In particular, the algorithm used is called SHA, an acronym for Secure Hash Algorithm. In SHA[xxx], the number (xxx) reflects the output size of the hash in bits. The HMAC construction takes a secret key and a message as input, and produces a hash as output. Its security level is related to the size of both the secret key and the output hash, and is considered to be the lesser of the two. For example, a SHA-256 hash with a 256-bit key offers 2²⁵⁶ brute-force “options”, which is already considered fully resistant. But a SHA-256 hash with a key of, say, 64 bits (e.g., your secret is “password”) gives only 2⁶⁴ possibilities, and that might be brute-forced. The important thing to remember is to choose a key at least as big as the output size, so you do not dumb down security.
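As a concrete sketch, here is HMAC-SHA-256 signing and verification with Python's standard library. The key is generated at 32 bytes (256 bits) to match the hash output size, as recommended above; the JSON payload is a made-up example, not a real token.

```python
import hashlib
import hmac
import secrets

# Key at least as large as the hash output (32 bytes for SHA-256),
# so the key does not become the weakest link.
key = secrets.token_bytes(32)
payload = b'{"sub":"alice","iss":"https://issuer.example"}'  # hypothetical claims

signature = hmac.new(key, payload, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, signature: bytes) -> bool:
    # Verification is simply recomputing the HMAC and comparing.
    expected = hmac.new(key, message, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, signature)

print(verify(key, payload, signature))           # True
print(verify(key, b"tampered message", signature))  # False
```

Note that both sides need the same secret key, which is the main deployment constraint of HMAC discussed later in this article.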
Note there are possible attacks against hashing techniques in other contexts, such as passwords (rainbow tables, collisions, etc.). However, HMAC signatures are a different scenario altogether. They are much harder to attack because messages cannot be random as in a collision attack, so even finding a collision alone does not help an attacker.
This means you and I can go ahead and trust any of the SHA-2 algorithms (all three options in Table 1 belong to this sub-family of SHA algorithms). Next, I need to know how much CPU is required to compute the signatures. In the case of HMAC, this depends on the key and output sizes. The complexity grows linearly with the input size; therefore, the time complexity is often simplified with the O(n) notation. Here are some results from the tests I ran on my laptop (making sure the key size is exactly as big as the output size).
Surprisingly, Figure 3 shows no significant difference between the time required to compute an HMAC-SHA-256 hash and any of the other hashes. It is possible that some hardware encryption acceleration is responsible for this levelling out.
As expected, verification times are roughly the same as signing times, because verification is the same operation (recompute the hash and compare). In just under 2 seconds, the laptop was able to sign or verify 1 million messages, regardless of which SHA-2 flavor was selected.
RS/PS[xxx]: RSA-based signatures
The RSASSA signature leverages a private/public key scheme. Let’s spend a minute clarifying a possible source of confusion here. There are two variations of this algorithm supported by the IETF JWA (i.e., signatures) standard; namely, Public-Key Cryptography Standards #1 v1.5 (PKCS) and the Probabilistic Signature Scheme (PSS). In Table 1, they are referred to as RS[xxx] and PS[xxx]. The former is the original standard; the latter adds security, including a salt, which makes verification of the signature a little more complicated (for programmers). Both are therefore RSA algorithms, and PSS is the more robust of the two, but generally speaking, PKCS is considered safe enough for signatures by authoritative organizations (e.g., IETF³) and by the literature.
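The difference between the two paddings is easiest to see in code. Here is a sketch using the third-party `cryptography` package (an assumption; any mainstream crypto library exposes the same two schemes). The message is a placeholder for a token's signing input, not real JOSE serialization.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

message = b"header.payload"  # placeholder for a token's signing input

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# RS256-style: deterministic PKCS#1 v1.5 padding.
sig_pkcs = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# PS256-style: randomized PSS padding; the salt makes each signature unique.
pss = padding.PSS(
    mgf=padding.MGF1(hashes.SHA256()),
    salt_length=hashes.SHA256.digest_size,  # JWA's PS256 uses a hash-sized salt
)
sig_pss = private_key.sign(message, pss, hashes.SHA256())

# verify() returns None on success and raises InvalidSignature on failure.
public_key.verify(sig_pkcs, message, padding.PKCS1v15(), hashes.SHA256())
public_key.verify(sig_pss, message, pss, hashes.SHA256())
```

Sign the same message twice with PKCS#1 v1.5 and you get identical bytes; with PSS, the salt makes the two signatures differ even though both verify.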
The security level of the algorithm is also related to the key size. Frequently used key sizes are 1024, 2048, or 4096 bits. However, the real strength of RSA algorithms is extremely difficult to express, even with a solid math degree (which I don’t have); therefore, the values are basically there to compare orders of magnitude between key sizes. Unfortunately, this does not really reflect how robust those options are in the wide, wild cyberworld. What is known for sure is that computing power is closing in on key sizes under 1024 bits; in fact, those keys are considered insecure, full stop. At the time of writing, published academic material demonstrates that keys of 829 bits can be cracked⁴, reportedly achieved with the following hardware:
2700 core-years, using Intel Xeon Gold 6130 CPUs (2.1 GHz)
Therefore, the recommendation for the RSA algorithm is to not use key sizes of 1024 bits. So, what are the options? Traditionally, computer programmers have a particular affection for powers of two, and would jump from 1024 to 2048 and then to 4096; it is more about convention than anything else. We know, however, that key sizes of 2048 bits are considered safe (but only until 2030, apparently⁵). Still, if for whatever reason you think future-proofing is important, you may select larger keys of 4096 bits or more, if your servers can take the hit (during these tests, I found that 4096-bit keys took about four and a half times longer to produce a signature than 2048-bit keys). In the end, you may find that a stronger key size is futile protection anyway, given recent progress in quantum computing... but who knows? Figures 4 and 5 show a comparison of the time it takes on the same laptop to compute RSA-based signatures with 2048-bit keys (remember, 1024 is not safe). As you can see, PKCS and PSS have very comparable performance, with the expected minor hit on the PSS results.
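You can observe the signing cost of larger keys yourself with a sketch like this one (again assuming the third-party `cryptography` package; the iteration count is kept small so the run stays quick, and absolute numbers will vary by machine).

```python
import timeit

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

message = b"x" * 250  # same 250-byte payload as the article's tests

def sign_seconds(key_size: int, number: int = 50) -> float:
    """Time `number` PKCS#1 v1.5 signatures with a fresh key of `key_size` bits."""
    key = rsa.generate_private_key(public_exponent=65537, key_size=key_size)
    return timeit.timeit(
        lambda: key.sign(message, padding.PKCS1v15(), hashes.SHA256()),
        number=number,
    )

# On the article's laptop, the 4096-bit key took roughly 4.5x longer
# per signature than the 2048-bit key; expect the same order of magnitude.
print(f"2048-bit: {sign_seconds(2048):.3f} s")
print(f"4096-bit: {sign_seconds(4096):.3f} s")
```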
A striking measurement is RSA signature verification: no matter how big the key is, verification is extremely fast. This makes it a popular choice for OAuth 2.0 and OIDC clients. Both RSA algorithms can verify 1 million signatures in between 44 and 56 seconds. Testing shows that the verification duration increases roughly linearly with the size of the key; i.e., double the size of the key, and you roughly double the time it takes to verify a signature.
ES[xxx]: Elliptic Curve signatures
Finally, Elliptic Curve signatures. Again, I am reporting on theoretical strength, referring to the comparison tables reported by OpenSSL⁶. They show that the least strong of the HMAC signatures (HS256), with its 2²⁵⁶ possible combinations, is as good as the best Elliptic Curve, P-521. Nonetheless, the P-256 and P-384 curves are considered robust against brute-force attacks.
When it comes to performance, however, things are more interesting. Signing with ECDSA on curve P-256 takes half the time of RSA with a 2048-bit key, and the strength is equivalent. If we select a stronger curve, either P-384 or P-521, performance drops significantly (see Figure 6), but the equivalent RSA key strength would have to grow beyond reasonable proportions for most cases. Indeed, P-384 is equivalent to a 7680-bit RSA key, and P-521 to a 15360-bit RSA key. Therefore, like for like, ECDSA consistently performs better at signing.
Verifying a signature, on the other hand, takes an awfully long time; it is slower than signing with an RSA key, and can therefore be a real showstopper if you are on the verifying end of the protocol.
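For completeness, here is an ES256-style sketch with the same assumed `cryptography` package: ECDSA over P-256 with SHA-256. Note this produces a DER-encoded signature, whereas JWS serializes the raw r||s values; the message is again a placeholder, not real JOSE input.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

message = b"header.payload"  # placeholder for a token's signing input

# ES256-style: ECDSA over the P-256 curve (SECP256R1) with SHA-256.
private_key = ec.generate_private_key(ec.SECP256R1())
public_key = private_key.public_key()

signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))

# Verification is the expensive side for ECDSA; it raises
# InvalidSignature if the signature or message was tampered with.
try:
    public_key.verify(signature, message, ec.ECDSA(hashes.SHA256()))
    print("signature valid")
except InvalidSignature:
    print("signature invalid")
```

Swapping `ec.SECP256R1()` for `ec.SECP384R1()` or `ec.SECP521R1()` gives the stronger (and slower) curves discussed above.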
Elliptic Curve keys of 256 bits are considered as robust as RSA keys of 3072 bits⁷. However, given the drop in verification performance, one has to be careful to choose the correct key size in relation to the expected traffic. The main trouble with this scheme is that there are many concepts to grasp, and poor implementations will lead to flawed security. Choosing this solution therefore demands skilled people to validate your implementation.
Quantum-capable monster machines will probably break asymmetric cryptography sooner or later, whether RSA or Elliptic Curves, but we do not know when. Therefore, choosing a key size beyond 2048 or 256 bits, respectively, is an attempt to protect data beyond the horizon of 2030, provided quantum computers do not come first. So, unless the data will remain sensitive beyond the decade, larger keys would not increase security that much. Finally, hashing algorithms are reportedly more resistant⁸ to quantum crypto-power than the asymmetric options.
As for SHA-2 (HMAC) signatures, they are the most performant option; but, because you need to exchange a secret with the party verifying the signature, the implementation is not always easy. In some controlled environments, it might be a sensible choice; e.g., an all-internal deployment.
So, factors such as the amount of traffic, the length of time for which privacy matters, how much computing power is available, or how near capacity a full deployment is may all have an impact on which signature algorithm is used. And hopefully, by now, Table 1 makes more sense to the non-cryptographers amongst us.
Thanks to Ali S and Christian B, who helped me with both parts 1 and 2 and gave me invaluable advice.