On Key Management and Tokenization
August 17, 2012 | POSTED BY JERALD DAWKINS IN SECURITY, ENCRYPTION, TOKENIZATION
I often get into debates about encryption being the panacea of data protection. While encryption has proven itself a viable solution for many years, the problem is rarely in the algorithm, but rather in the management of the keys. For encryption to occur, the system must have the key to encrypt and decrypt the data. This means the key resides somewhere on a computer system accessible to the application. The most significant question is how well the organization is protecting that key and ensuring that the application handles it appropriately.
Take RSA and their SecurID token, for example. The breach wasn't targeted against the encryption algorithm. The attackers attacked the weakest link, RSA themselves, to expose the means by which the tokens were being generated (Wikipedia Reference). In a similar way, attackers will attack the weakest link of encryption: the key management process. I've been party to many application penetration tests where developers stored the encryption key within the application itself, often protected with a rather obvious password such as "5UP3RS3cuR3!".
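To make the anti-pattern concrete, here is a minimal sketch in Python of the difference between a key baked into the source and a key provisioned from outside the code base. The environment variable name `APP_ENCRYPTION_KEY` and the helper function are illustrative assumptions, not a prescribed design; in production the external source would be a KMS or HSM rather than an environment variable.

```python
import os
import base64

# Anti-pattern: a key hard-coded in the application source. Anyone with
# access to the repository (or a decompiled binary) has the key.
HARDCODED_KEY = b"5UP3RS3cuR3!"  # the kind of "secret" found in real audits

def load_key_from_environment(var_name="APP_ENCRYPTION_KEY"):
    """Fetch the key from outside the code base (an environment variable
    here; a KMS or HSM call in a real deployment). Fails loudly if the
    key was never provisioned."""
    encoded = os.environ.get(var_name)
    if encoded is None:
        raise RuntimeError(f"encryption key not configured ({var_name} unset)")
    return base64.b64decode(encoded)

# Example: the operator provisions a random 256-bit key at deploy time,
# so the key never appears in version control.
os.environ["APP_ENCRYPTION_KEY"] = base64.b64encode(os.urandom(32)).decode()
key = load_key_from_environment()
assert len(key) == 32 and key != HARDCODED_KEY
```

The point is not that environment variables are a complete answer (they have their own exposure paths), but that the key's lifecycle is managed by operations, not by whoever can read the source tree.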
Encryption is great, but very few people actually know how to use it correctly. How well do your software developers understand key management principles? How well do your vendors' developers understand the issue, and have you even asked the question? How many assumed RSA had all their ducks in a row (given that the company's name is in fact an encryption algorithm itself)? How well are your internal key management processes documented?
While tokenization eliminates many of the risks associated with encryption, it has attack vectors of its own. The first is how to get the sensitive data into the token vault; the second is access to the de-tokenization feature. One advantage of tokenization is its ability to isolate those functions. In other words, a good tokenization system is one that is solely focused on tokenization. That singular focus makes it more easily defensible.
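The isolation of those two functions can be sketched in a few lines of Python. This is a toy in-memory vault, not a production design: the `caller_role` check is a hypothetical stand-in for real access control, and the point is simply that tokenization and de-tokenization are separate entry points that can be locked down independently.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens to sensitive values.
    Tokenize and de-tokenize are distinct operations so access to
    de-tokenization can be restricted on its own."""

    def __init__(self):
        self._vault = {}  # token -> sensitive value; in-memory for the sketch

    def tokenize(self, sensitive_value):
        # The token is random, so it carries no information about the
        # underlying value and is useless outside the vault.
        token = secrets.token_hex(16)
        self._vault[token] = sensitive_value
        return token

    def detokenize(self, token, caller_role):
        # Only a narrowly scoped role may reverse a token (hypothetical
        # role check standing in for real authorization).
        if caller_role != "detokenizer":
            raise PermissionError("caller not authorized to de-tokenize")
        return self._vault[token]

vault = TokenVault()
token = vault.tokenize("4111-1111-1111-1111")
assert token != "4111-1111-1111-1111"
assert vault.detokenize(token, caller_role="detokenizer") == "4111-1111-1111-1111"
```

Because the token reveals nothing on its own, an attacker who compromises an application holding only tokens still has to breach the vault's de-tokenization interface, which is exactly the narrow surface a dedicated tokenization system can defend.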