Java KeyStores – the gory details

Java KeyStores are used to store key material and associated certificates in an encrypted and integrity protected fashion. Like all things Java, this mechanism is pluggable and so there exist a variety of different options. There are lots of articles out there that describe the different types and how you can initialise them, load keys and certificates, etc. However, there is a lack of detailed technical information about exactly how these keystores store and protect your key material. This post attempts to gather those important details in one place for the most common KeyStores.

Each key store has an overall password used to protect the entire store, and can optionally have per-entry passwords for each secret- or private-key entry (if your backend supports it).

Java Key Store (JKS)

The original Sun JKS (Java Key Store) format is a proprietary binary format file that can only store asymmetric private keys and associated X.509 certificates.

Individual private key entries are protected with a simple home-spun stream cipher: the password is combined with a 160-bit salt and hashed with SHA-1 in a trivially chained construction until enough output bytes have been generated to XOR into the private key. A simple authenticator tag is then stored, consisting of SHA-1(password + private key bytes) — that is, the unencrypted private key bytes. In other words, this is an Encrypt-and-MAC scheme with home-spun constructions, both based on simple prefix-keyed SHA-1.
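The keystream derivation can be sketched roughly as follows. This is a reconstruction from the description above, not the actual Sun source, and the class and method names are my own:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class JksKeyStream {
    // Sketch of the JKS chained SHA-1 keystream: each 20-byte block is
    // SHA-1(passwordBytes || previousBlock), seeded with the 160-bit salt.
    static byte[] keystream(char[] password, byte[] salt, int length) throws Exception {
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        // JKS encodes the password as big-endian UTF-16 (two bytes per char)
        byte[] passBytes = new String(password).getBytes(StandardCharsets.UTF_16BE);
        byte[] keystream = new byte[length];
        byte[] block = salt;
        for (int i = 0; i < length; i += 20) {
            sha1.update(passBytes);
            block = sha1.digest(block); // SHA-1(password || previous block)
            System.arraycopy(block, 0, keystream, i, Math.min(block.length, length - i));
        }
        return keystream;
    }

    // "Encryption" is just XOR with the keystream; the authenticator tag
    // SHA-1(passwordBytes || plaintextKey) is stored alongside the result.
    static byte[] xor(byte[] data, byte[] ks) {
        byte[] out = new byte[data.length];
        for (int i = 0; i < data.length; i++) out[i] = (byte) (data[i] ^ ks[i]);
        return out;
    }
}
```

Decryption is identical: regenerate the keystream from the password and the stored salt, then XOR it with the ciphertext.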

The whole archive is again integrity protected by a home-spun prefix-keyed hash construction: the SHA-1 hash of the UTF-16 bytes of the raw keystore password, followed by the UTF-8 bytes of the phrase “Mighty Aphrodite” (I’m not kidding), followed by the bytes of the encoded key store entries.
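That checksum is easy to reproduce (again a reconstruction from the description; the class and method names are mine):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class JksIntegrity {
    // Sketch of the JKS archive checksum:
    // SHA-1(UTF-16BE(password) || "Mighty Aphrodite" || storeContents)
    static byte[] checksum(char[] password, byte[] storeContents) throws Exception {
        MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
        sha1.update(new String(password).getBytes(StandardCharsets.UTF_16BE));
        sha1.update("Mighty Aphrodite".getBytes(StandardCharsets.UTF_8));
        return sha1.digest(storeContents);
    }
}
```

Note that there is no salt and no iteration here: anyone who can guess the password can recompute this hash over tampered contents, and a single unsalted SHA-1 offers essentially no resistance to password guessing.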

If every part of this description has not got you screaming at your screen in equal parts terror and bemusement, then you probably haven’t fully grasped how awful this is. Don’t use it, even just for storing certificates — its tamper resistance is, if anything, even worse than the encryption.

JCE Key Store (JCEKS)

Sun later updated the cryptographic capabilities of the JVM with the Java Cryptography Extensions (JCE). With this they also introduced a new proprietary key store format: JCEKS.

JCEKS uses “PBEWithMD5AndTripleDES” to encrypt individual key entries, with a 64-bit random salt and 200,000 iterations of PBKDF1 to derive the key. TripleDES is used with 3 keys (“encrypt-decrypt-encrypt”) in CBC mode. There is no separate integrity protection of individual keys, which is fine if the archive as a whole is integrity protected, but it means that access control is effectively at the level of the whole keystore. This is not terrible from a crypto point of view, but it can definitely be improved: neither MD5 nor TripleDES is considered secure any more, and it’s been a long time since anyone recommended them for new projects. However, it would also not be a trivial effort to break.

JCEKS uses the same ridiculous “Mighty Aphrodite” prefix-keyed hash as JKS for integrity protection of the entire archive. It is probably best to assume that there is no serious integrity protection of either of these key stores.

PKCS#12

Apart from these proprietary key stores, Java also supports “standard” PKCS#12 format key stores. The reason for the scare quotes around “standard” is that while it is indeed a standard format, it is a very flexible one, so in practice there are significant differences in which “key bag” formats and encryption algorithms are supported by different software. For instance, when you store symmetric SecretKey objects in a PKCS#12 key store from Java, OpenSSL cannot read them, as they use a bag type (“secretBag”) that it does not understand.

Java uses version 3 of the PKCS#12 standard format. It stores secret keys in the aforementioned “secretBag” format, and asymmetric private keys in “PKCS#8 Shrouded Key Bag” format. These just dictate the format of the bytes on disk; in both cases the actual key material is encrypted using some form of password-based encryption (PBE). By default this is “PBEWithSHA1AndDESede” — “DESede” is another name for TripleDES in encrypt-decrypt-encrypt mode, so this is pretty similar to the mode used by JCEKS, apart from using a slightly better (but still deprecated) hash in the form of SHA-1. By default it uses a 160-bit salt and 50,000 iterations.

But, there is an important improvement in the PKCS#12 implementation—you get to choose the encryption algorithm! By passing in a PasswordProtection parameter (from Java 8 onwards) when saving a key you can specify a particular (password-based) cipher to use. I haven’t checked exactly what ciphers are allowed, but you can at least specify a stronger PBE mode, such as “PBEWithHmacSHA512AndAES_256”, which will derive a 256-bit AES key using salted PBKDF2 and then encrypt the stored key using AES/CBC/PKCS5Padding with that key. You can also increase the number of iterations of PBKDF2 used. For example:


import java.io.FileOutputStream;
import java.security.KeyStore;
import java.security.KeyStore.PasswordProtection;
import java.security.KeyStore.SecretKeyEntry;
import java.security.SecureRandom;
import javax.crypto.SecretKey;
import javax.crypto.spec.PBEParameterSpec;
import javax.crypto.spec.SecretKeySpec;

public class scratch {
    public static void main(String... args) throws Exception {
        KeyStore keyStore = KeyStore.getInstance("PKCS12");
        keyStore.load(null, null); // Initialize a blank keystore

        SecretKey key = new SecretKeySpec(new byte[32], "AES");

        char[] password = "changeit".toCharArray();
        byte[] salt = new byte[20];
        new SecureRandom().nextBytes(salt);
        keyStore.setEntry("test", new SecretKeyEntry(key),
            new PasswordProtection(password, "PBEWithHmacSHA512AndAES_256",
                new PBEParameterSpec(salt, 100_000)));
        keyStore.store(new FileOutputStream("/tmp/keystore.p12"), password);
    }
}

Note that despite the inclusion of “HmacSHA512” in the above PBE mode, it applies only to the key derivation from the password. There is no integrity protection at the level of individual entries.

It is also worth noting that the keystore and individual key passwords should be the same. I don’t think this is a fundamental limitation of PKCS#12 in Java, but standard Java tools such as the command-line “keytool” utility will fail to handle PKCS#12 keystores that use different passwords for the store and for individual keys. If you don’t need to use those tools, you may be able to get away with a different password for each key.
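A minimal round trip illustrating this, with the same password for the store and for the entry, can be sketched as follows. The class and method names are mine, in-memory streams stand in for a file, and the default PBE algorithms are used rather than the stronger settings shown earlier:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.security.KeyStore;
import java.security.KeyStore.PasswordProtection;
import java.security.KeyStore.SecretKeyEntry;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

public class Pkcs12RoundTrip {
    // Store a secret key in a PKCS#12 keystore and read it back, using the
    // same password both for the store integrity check and the key entry.
    static SecretKey roundTrip() throws Exception {
        char[] password = "changeit".toCharArray();

        KeyStore ks = KeyStore.getInstance("PKCS12");
        ks.load(null, null); // blank keystore
        ks.setEntry("test", new SecretKeyEntry(new SecretKeySpec(new byte[32], "AES")),
                new PasswordProtection(password));
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ks.store(out, password);

        KeyStore loaded = KeyStore.getInstance("PKCS12");
        loaded.load(new ByteArrayInputStream(out.toByteArray()), password);
        SecretKeyEntry entry = (SecretKeyEntry)
                loaded.getEntry("test", new PasswordProtection(password));
        return entry.getSecretKey();
    }
}
```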

In contrast to the previous entries, the PKCS#12 key store format does actually encrypt certificates too. It does this with a hard-coded algorithm, “PBEWithSHA1AndRC2_40”. This uses 50,000 rounds of salted PBKDF1 to derive a 40-bit key for RC2 encryption. RC2 is an old block cipher that I certainly wouldn’t recommend, and a 40-bit key is far too small to provide any serious security. It makes me wonder why you would bother applying 50,000 rounds of PBKDF1 to protect the password while generating a key that is itself vulnerable to brute force: it is probably faster to brute-force the derived key than the original password. I can only assume it is maintaining compatibility with some decision taken way back in the depths of time that everyone involved now deeply regrets.
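To put rough numbers on that claim: testing one password guess costs 50,000 KDF iterations, so exhausting the entire 2^40 RC2 key space costs about as much raw hashing work as trying only ~22 million passwords. (This treats one hash iteration and one trial decryption as comparable work, which is a crude assumption, but the orders of magnitude are what matter.)

```java
public class Rc2BruteForceCost {
    public static void main(String[] args) {
        long keySpace = 1L << 40;          // 2^40 possible 40-bit RC2 keys
        long iterationsPerGuess = 50_000;  // PBKDF1 rounds per password guess
        // Number of password guesses whose total KDF cost roughly equals
        // exhausting the RC2 key space directly
        long equivalentGuesses = keySpace / iterationsPerGuess;
        System.out.println(equivalentGuesses); // 21990232, i.e. ~22 million
    }
}
```

So unless the password would fall within the first ~22 million guesses anyway, attacking the 40-bit key directly is the cheaper route.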

The integrity of the overall PKCS#12 key store is protected with “HmacPBESHA1”: HMAC-SHA1 using a key derived from the store password with 100,000 iterations of salted PBKDF2-HMAC-SHA1. This is all hard-coded and cannot be changed. It is an OK choice, although it would be nice to be able to use something other than SHA-1 here, as PKCS#12 appears to allow other MACs. For HMAC usage, SHA-1 is still just about acceptable for now, but it would be better to move off it, and it would also be nice to be able to tune the iteration count.

Overall, the PKCS#12 key store is considerably better than either of the Sun-designed proprietary options. If you specify your own PasswordProtection instances with AES and SHA-2, and use high iteration counts and good random salts, then it’s actually a pretty solid design even by modern standards. The only really ugly part is the 40-bit RC2 encryption of trusted certificates, but if you do not care about the confidentiality of certificates then we can overlook that detail and just consider them lightly obfuscated. At least the HMAC-SHA1 checksum finally provides decent integrity protection.

PKCS#11

There’s not much to say about PKCS#11. It is a standard interface, intended for use with hardware security tokens of various kinds: in particular Hardware Security Modules (HSMs). These range from 50 Euro USB sticks up to network-attached behemoths that cost tens or hundreds of thousands of dollars. The hardware is usually proprietary and closed, so it’s hard to say exactly how your keys will be stored. Generally, though, there are significant protections against access to keys from either remote attackers or even those with physical access to the hardware and a lot of time on their hands. This isn’t a guarantee of security, as there are lots of ways that keys might accidentally leak from the hardware, as the recent ROCA vulnerability in Infineon hardware demonstrated. Still, a well-tested HSM is probably a pretty secure option for high-value keys.

I won’t go into the details of how to set up a PKCS#11 key store, as it really varies from vendor to vendor. As with PKCS#12, while the interface is standardised, there is enormous room for variation within that standard. In most cases you would let the HSM generate keys in the secure hardware and never export the private key material (except perhaps for backup).

Recommendations

Use a HSM or a PKCS#12 keystore, and specify manual PasswordProtection arguments when storing keys. Avoid the proprietary key stores.

Alternatively, farm out key management to somebody else and use a Key Management System (KMS) like Hashicorp Vault.


So how *do* you validate (NIST) ECDH public keys?

Updated 20th July 2017 to clarify notation for the point at infinity. A previous version used the symbol 0 (zero) rather than O, which may have been confusing.

In the wake of the recent critical security vulnerabilities in some JOSE/JWT libraries around ECDH public key validation, a number of implementations scrambled to implement specific validation of public keys to eliminate these attacks. But how do we know whether these checks are sufficient? Is there any guidance on what checks should be performed? The answer is yes, but it can be a bit hard tracking down exactly what validation needs to be done in which cases. For modern elliptic curve schemes like X25519 and Ed25519, there is some debate over whether validation should be performed at all in the basic primitive implementations, as the curve eliminates some of the issues while high-level protocols can be designed to eliminate others. However, for the NIST standard curves used in JOSE, the question is more clear cut: it is absolutely critical that public keys are correctly validated, as evidenced by the linked security alert.

Continue reading “So how *do* you validate (NIST) ECDH public keys?”

Updating OpenAM’s encryption

Updated 30th March 2017 to reflect updated information (see comments), add additional links and add some clarifying text about why misuse-resistance is useful.

With the impending release of the ForgeRock Identity Platform, I thought I’d spend some time writing up a few of the bits of OpenAM 14 that I was directly involved with creating. One of my last acts before leaving FR to go solo, was to put in place the first phase of modernising AM’s aging system credential encryption scheme. Before I start, I should say that this encryption scheme is not used for encrypting user passwords (which are hashed by the LDAP user store, not AM). Instead, this scheme is used for encrypting various system credentials (passwords for SMTP servers, HMAC shared secrets, etc) in the config store and in exported system configurations and in a few other places.

The original (and still default) encryption method was first mentioned in Dante’s Inferno. Actually it dates from the original iPlanet codebase from the mid-90s, and uses correspondingly ancient cryptographic algorithms (MD5 and DES). It is best to regard it as providing only limited obfuscation of credentials, rather than any true security guarantees, and the advice has always been to secure the config store by traditional means (TLS, access controls) rather than rely on this encryption. Still, we can do much better than this now, so AM 14 ships with a new AESWrapEncryption scheme that provides significantly improved security:

Continue reading “Updating OpenAM’s encryption”

Should you use JWT/JOSE?

In the wake of some more recent attacks against popular JSON Web Token (JWT)/JSON Object Signing and Encryption (JOSE) libraries, there has been some renewed criticism of the JWT/JOSE standards themselves (see also the discussion, with an excellent comment from Thomas Ptacek summarising some of the problems with the standard). Given these criticisms, should you use JOSE at all? Are articles like my recent “best practices” one just encouraging adoption of bad standards that should be left to die a death?

Certainly, there are lots of potential gotchas in the specs, and it is easy for somebody without experience to shoot themselves in the foot using these standards. I agree with pretty much all of the criticisms levelled against the standards. They are too complicated with too many potentially insecure options. It is far too easy to select insecure combinations or misconfigure them. Indeed, much of the advice in my earlier article can be boiled down to limiting which options you use, understanding what security properties those options do and do not provide, and completely ignoring some of the more troublesome aspects of the spec. If you followed my advice of using “headless” JWTs and direct authenticated encryption with a symmetric key, you’d end up not far off from the advice of just encrypting a JSON object with libsodium or using Fernet.

So in that sense, I am already advocating for not really using the specs as-is, at least not without significant work to understand them and how they fit with your requirements. But there are some cases where using JWTs still makes sense:

  • If you need to implement a standard that mandates their use, such as OpenID Connect. In this case you do not have much of a choice.
  • If you need to interoperate with third-party software that is already using JWTs. Again, in this case you also do not have a choice.
  • You have complex requirements mandating particular algorithms/parameters (e.g. NIST/FIPS-approved algorithms) and don’t want to hand-roll a message format or are required to use something with a “standard”. In this case, JWT/JOSE is not a terrible choice, so long as you know what you are doing (and I hope you do if you are in this position).

If you do have a choice, then you should think hard about whether you need the complexity of JWTs, or whether you can use a simpler approach that takes care of most of the choices for you, or simply store state on the server and use opaque cookies. In addition to the options mentioned in the referenced posts, I would also like to mention Macaroons, which can be a good alternative for some authorization-token use-cases, and whose existing libraries tend to build on solid foundations (libsodium/NaCl).

So, should you use JWT/JOSE at all? In many cases the answer is no, and you should use a less error-prone alternative. If you do need to use them, then make sure you know what you are doing.

Critical thinking for software engineers

I am sometimes asked whether doing a PhD was worth it, given that I left academia and research to become a full-time software developer. My answer is an unequivocal “yes”, despite the fact that my thesis is about as relevant to what I do now as a book on the sex lives of giraffes.

By far the most important skill I learnt during that time was not any particular technical knowledge, but rather a general approach to critical thinking—how to evaluate evidence and make rational choices. In a profession such as software engineering, where we are constantly bombarded with new technologies, products and architectural styles, it is absolutely essential to be able to step back and evaluate the pros and cons to form sensible technology choices. In this post I’ll try and summarise the approach I take to making these decisions.

Continue reading “Critical thinking for software engineers”

Bloom Filter session logout – some numbers

The previous post on Stateless Session Logout in OpenAM 13 has proved quite popular by this blog’s standards. While it went into some detail about the technology and the problems that need to be solved in a production system, it was a bit short on actual figures to illustrate the gains. In this post we will rectify that, with some theoretical numbers on memory usage and some measurements from early performance testing.

Continue reading “Bloom Filter session logout – some numbers”