Tokenisation – Introduction
What we are primarily looking at now is mobile payments, but we are designing solutions for e-commerce as well. Payment processing involves re-encryption of PINs (PIN blocks) for card-present transactions.
E-commerce does not use PINs, but it certainly works with transaction data, and that includes credit card numbers. PCI audits (a quarterly pain for payment processing companies) are basically interested in two things: the security of card PINs and the security of card numbers. Secure processing of PINs provided as a service solves one of the issues, but it still doesn't help limit the scope of PCI audits.
This is where tokenisation of credit card numbers comes in – and with it the world of e-commerce. The PCI Security Standards Council updated its guidelines in April. Things may change somewhat, but I don't believe the general idea will change significantly. When you get tokenisation right according to these guidelines, the reward is a reduction in the scope of PCI audits.
Tokenisation should protect the card number from misuse by converting it to something random. Most card numbers are 16 digits long – e.g., 5602 2123 3212 3332 – and their official name is Primary Account Number (PAN). Unfortunately, card numbers (or PANs) have an internal structure, and some parts must not be changed:
- six-digit issuer identification number (IIN), sometimes called a bank identification number (BIN) – 5602 XX above is Bankcard Australia;
- account number – variable – 23 3212 above;
- sub-account number – e.g., the order number of the credit card issued to you;
- control number – usually the last digit, computed with the Luhn algorithm.
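For concreteness, the control number computation can be sketched as follows – a minimal Python sketch of the Luhn algorithm, using the made-up example PAN above:

```python
def luhn_check_digit(partial_pan: str) -> int:
    """Compute the Luhn check digit for the first 15 digits of a PAN."""
    total = 0
    # Walk right to left; double every second digit, subtracting 9 on overflow.
    for i, ch in enumerate(reversed(partial_pan)):
        d = int(ch)
        if i % 2 == 0:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def luhn_valid(pan: str) -> bool:
    """A full PAN is valid when its last digit equals the Luhn check digit."""
    return int(pan[-1]) == luhn_check_digit(pan[:-1])

print(luhn_valid("5602212332123332"))  # the example PAN above → True
```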
When you want to tokenise a PAN (and be able to reverse it, as well as route it via, e.g., the VISA network), you must be quite careful. You have to preserve the IIN/BIN and also the sub-account number. The token must also be adjusted so that the control number is correct, as it may be checked anyway en route from a payment terminal to the issuing bank (the bank where you have your money).
This leaves only 5 digits that can be tokenised or randomised. The PCI guidelines therefore state that the probability of guessing the PAN must not be higher than 1 in 100,000. This is an inherent limitation imposed by preserving the correct behaviour of card payment processing systems; anything else is likely to break things.
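As an illustration of how little room that leaves, here is a Python sketch that randomises only five digits of a 16-digit PAN and then fixes up the control number. The field boundaries used here (free digits in positions 6–10) are my own illustrative choice – real layouts vary by issuer:

```python
import secrets

def luhn_check_digit(partial_pan: str) -> int:
    """Luhn check digit over the first 15 digits of a PAN."""
    total = 0
    for i, ch in enumerate(reversed(partial_pan)):
        d = int(ch)
        if i % 2 == 0:
            d = d * 2 - 9 if d >= 5 else d * 2
        total += d
    return (10 - total % 10) % 10

def tokenise(pan: str) -> str:
    """Replace the five 'free' digits with random ones and fix the check digit.

    Illustrative layout for a 16-digit PAN (not issuer-accurate):
      digits 0-5   IIN/BIN        (preserved)
      digits 6-10  free digits    (randomised; 100,000 possibilities)
      digits 11-14 sub-account    (preserved)
      digit  15    control number (recomputed with Luhn)
    """
    bin_part, sub_account = pan[:6], pan[11:15]
    random_part = f"{secrets.randbelow(100_000):05d}"
    body = bin_part + random_part + sub_account
    return body + str(luhn_check_digit(body))
```

Note that the Luhn step makes the token pass the same sanity checks a real PAN would on its way through the payment network.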
The way you do tokenisation is, however, a pretty local issue, and the requirements there are much stricter:
- tokenisation must be done with a cryptographic function;
- the length of encryption keys must be at least 128 bits;
- key management (e.g., generation of keys) must be done in FIPS 140-2 Level 3 hardware;
- any information stored outside secure hardware must not increase the probability of a random guess to find the correct PAN.
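One way to satisfy the "cryptographic function" and 128-bit key requirements is to derive token digits from an HMAC, sketched below. This is purely my illustration, not the method from the guidelines: this variant is not reversible without a lookup table, and a real system would keep the key inside FIPS 140-2 Level 3 hardware rather than in process memory.

```python
import hashlib
import hmac
import secrets

# 128-bit key, matching the minimum key length in the requirements above.
# (Illustration only: generated in software here, not in an HSM.)
KEY = secrets.token_bytes(16)

def token_digits(bin_part: str, free_digits: str) -> str:
    """Map the free digits to a 5-digit token via HMAC-SHA256.

    Including the BIN in the MAC input diversifies tokens between banks.
    The small modulo bias from reducing to 100,000 values is ignored here.
    """
    mac = hmac.new(KEY, (bin_part + free_digits).encode(), hashlib.sha256)
    return f"{int.from_bytes(mac.digest(), 'big') % 100_000:05d}"
```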
The advantages of tokenisation include a substantial reduction in audit requirements. Our goal is to be able to take card processing systems entirely out of PCI audit scope and allow anyone to process payments – be it an internet application or a smartphone app.
A final note
So, that's what we've been looking at for the last couple of weeks, and there are a number of ways to do it. They will, however, always have one common feature – a lookup table that maps tokens onto the original PANs. That is, if one wants reversible tokens. Our solution follows all the guidelines and includes:
- tables of 100,000 items for each user (=card processing company);
- diversification of tokens with the BIN – so that PANs from different banks map onto different tokens;
- adjustment of the token to match the Luhn control number;
- replication of any lookup tables with a single encryption key used by FIPS 140-2 Level 3 hardware;
- fast tokenisation and de-tokenisation operations; and
- as we are geeks, we also provide an additional token that does not preserve anything but is simply an encrypted value of the PAN.
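The lookup-table idea can be sketched as a toy in-memory model. The class and method names, and the random-permutation approach, are my own illustration – a real deployment keeps the tables encrypted under a single key held in FIPS 140-2 Level 3 hardware, as listed above:

```python
import secrets

class TokenTable:
    """Toy per-BIN lookup tables mapping 5-digit free parts to random tokens."""

    SIZE = 100_000  # one entry per possible 5-digit free part

    def __init__(self):
        self._tables = {}  # BIN -> (permutation, inverse mapping)

    def _table_for(self, bin_part: str):
        # Diversify by BIN: each bank's PANs map through a different permutation.
        if bin_part not in self._tables:
            perm = list(range(self.SIZE))
            secrets.SystemRandom().shuffle(perm)
            inverse = {token: i for i, token in enumerate(perm)}
            self._tables[bin_part] = (perm, inverse)
        return self._tables[bin_part]

    def tokenise(self, bin_part: str, free_digits: str) -> str:
        perm, _ = self._table_for(bin_part)
        return f"{perm[int(free_digits)]:05d}"

    def detokenise(self, bin_part: str, token_digits: str) -> str:
        _, inverse = self._table_for(bin_part)
        return f"{inverse[int(token_digits)]:05d}"
```

Both operations are single table lookups, which is what makes fast tokenisation and de-tokenisation possible.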