
Does Tokenization Solve Anything?

I've been on the hunt for a solution to the PCI problem. You might be wondering "which problem is that?" The problem, as I see it, is that L2-4 merchants have to accept, handle, transmit, and ultimately store cardholder data. Hence the high risk associated with merchants and their generally lax (or outright absent) security practices.

To me, the solution here is to get the data out of the hands of the merchants. If the merchants don't have the cardholder data, then you don't need to worry (as much) about them getting compromised. The question is whether there are solutions on the market that accomplish this goal. One such solution is "tokenization" - but I'm not convinced it actually saves you much.

With tokenization solutions, your card processor provides you with a non-sensitive token to store in your billing system in lieu of the CCN itself. Ok, all good and fine, but that's where things stop. With this solution I see two major problems. First, you're still accepting the CCN through your site, and thus have to secure those systems in accordance with PCI. Second, it would appear that you then rely on APIs (or compiled software) that can conduct transactions with the CCN on your behalf (though I would hope you couldn't get the CCN back).
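To make the mechanics concrete, here's a rough sketch of the flow - nothing vendor-specific, with a made-up processor object standing in for the real API:

```python
# A minimal sketch of the tokenization flow described above. Everything here is
# illustrative -- there is no real processor behind it -- but it shows the shape
# of the exchange: the PAN still passes through the merchant's code on its way
# to the processor before a token comes back.

import secrets

class FakeProcessor:
    """Stands in for the card processor's tokenization API."""

    def __init__(self):
        self._vault = {}  # token -> (PAN, expiry), held on the *processor's* side

    def tokenize(self, pan: str, expiry: str) -> str:
        token = "tok_" + secrets.token_hex(12)
        self._vault[token] = (pan, expiry)
        return token

    def charge(self, token: str, amount_cents: int) -> bool:
        pan, _expiry = self._vault[token]   # only the processor can resolve this
        return len(pan) in (15, 16)         # pretend the authorization succeeded

processor = FakeProcessor()
billing_db = {}                             # merchant-side storage

# Checkout: the PAN flows through the merchant app on its way to the processor.
token = processor.tokenize(pan="4111111111111111", expiry="12/27")
billing_db["customer-42"] = token           # the merchant stores only the token

# Recurring billing later: reference the token, never the PAN.
assert processor.charge(billing_db["customer-42"], amount_cents=1999)
```

Note where the PAN appears: it has to reach the processor somehow, and in this model it goes through the merchant's own code first.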

My primary concern with a tokenization solution is the first one listed above. If you still have cardholder data flowing through your site, then you're still subject to almost all of the PCI requirements. The main ones you can side-step are encryption, key management, and certain nuances of the network security requirements (such as the proxy). Beyond that, you still have to shoulder the cost of compliance for the rest of your site.

Given all that, one has to wonder: is outsourcing to a tokenization provider really more cost-effective than simply fixing the problem directly on your own systems? Sure, it transfers some of the responsibility away from you, to an org that is geared toward protecting that data. But at what cost? Moreover, now you have to trust this third party not to get compromised, while you still share in the responsibility of protecting the data.

In the end, it seems to me that the right solution is to move all of the CC acceptance and processing off of the merchant platforms. I think this can be done in a couple of ways, and there may be a couple of solutions out there (EPX, Paymetric), but again it's not entirely clear whether they in fact fully transfer CC acceptance, processing, and storage, or whether they're really just a more involved tokenization company. Paymetric's site talks extensively about ERP integration, suggesting a tokenization practice. EPX explicitly talks about tokenization, while making the claim that merchants "Never process, transmit or store data". Of course, the EPX site also says "Eliminate the nightmare of PCI" - which I think is probably not true.
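For what it's worth, the general shape of that model is simple: the merchant never renders a card field at all; it hands the shopper off to a payment page hosted by the processor and gets back only an opaque reference. A rough sketch - the URLs and parameter names below are made up purely for illustration:

```python
# Sketch of the "move acceptance off the merchant platform" model: the shopper
# is redirected to the processor's hosted payment page, and the merchant later
# receives only an opaque payment reference -- no PAN ever touches its systems.

from urllib.parse import urlencode

def hosted_payment_redirect(order_id: str, amount_cents: int) -> str:
    """Build the URL that hands the shopper off to the processor's hosted page."""
    params = urlencode({
        "merchant_id": "MERCHANT-123",   # issued by the processor at signup
        "order_id": order_id,
        "amount_cents": amount_cents,
        "return_url": f"https://shop.example.com/payment-result/{order_id}",
    })
    return f"https://pay.example-processor.com/hosted?{params}"

def handle_return(query_params: dict) -> str:
    """The processor redirects back with a result; the merchant stores only this."""
    return query_params.get("payment_reference", "")

print(hosted_payment_redirect("order-42", 1999))
```

Whether EPX, Paymetric, or anyone else actually works this way is exactly the open question - if the card form still posts to the merchant first, you're back to tokenization as described above.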

Tokenization may reduce the burden of PCI, but it definitely does not eliminate it completely, and I'm left to wonder if the reduction is even enough to warrant the outsourcing cost. What else am I missing out there? Is this problem being solved, and well?

TrackBack

Listed below are links to weblogs that reference Does Tokenization Solve Anything?:

» More on Tokenization from The Falcon's View
In a previous post, "Does Tokenization Solve Anything?", I questioned what the value was if the cardholder data still passed through your site. I've had the opportunity this week to look at three of these solutions, and have been pleasantly... [Read More]

» Tokenization: Someone Else Gets It from The Falcon's View
Apparently I'm not in fact insane, but do in fact know a little something about things that don't make much sense. One of those things is the mythical tokenization that has been heralded in marketing hype as the next greatest... [Read More]

Comments (4)

Alex Pezold:

I approach this comment very humbly, as I believe you've probably done much more research on this topic than I have. With that said, I believe the purpose of tokenization is 1) to reduce risk to the customer's environment by eliminating as much CC data as possible, and 2) to reduce the impact PCI has on an environment by shrinking the scope. Based on what I know of tokenization, it is no silver bullet, but it does achieve both goals above.

Ultimately, like you, I would like to see that silver bullet for removing all CC data from our client’s environments.

Ben:

@Alex -

I think you're right; I just question whether it's achieving either of those objectives. Too often I see orgs with multiple billing systems, and I wonder: will the tokenization solution just become yet another layer of complexity in an already over-complex environment?

In terms of shrinking the scope, I'm not fully convinced it actually achieves that goal. If it's a 1-to-1 match for an existing billing system, it's simply a shell game, moving the CC data from one box to another. Most solutions are built on the premise that your flows don't change, which means your PCI scope very likely stays almost exactly the same. Not much gain there.

Thanks for the comment!

-ben

Nice debate topic. The root issue is PCI scope, as "defined" by QSAs who may or may not fully understand security, or the merchant or enterprise IT environment. Anyone who has attended QSA training will have seen the gaps emerge quickly: knowledge levels vary from industry-hardened experts to extremely inexperienced technicians who lack basic infosec awareness. Hence we have the drive toward scope reduction over risk management - that is, if scope can even be easily pinned down :)

One of the reasons tokenization has taken off is that it permits merchants to side-step a lot of PCI DSS, because tokenization is not specifically defined in the PCI DSS standard. This is bound to change, but right now it is difficult for a QSA to assess whether tokenization is "correctly" implemented - there's no guidance, no "Section 3.X - Tokenization Requirements". Defining what "cardholder data" is is also a grey area in PCI, and debate rages. QSAs are trained that encrypted cardholder data is still cardholder data - a bizarre stance to anyone experienced in encryption and key management - and one that seems to be missing the critical addendum "...if there is explicit access to the specific encryption key used to encrypt the cardholder data". This seems to have come about because of breaches of badly implemented systems - e.g., putting your keys in the same database as the encrypted data, or encrypting with a key left sitting unprotected in a store that anyone can copy. Bzzzt! Danger! But was the encryption itself bad? No. Was the key management bad? Yes. So why not reduce scope with encryption, one asks? Well, true end-to-end encryption will do just that - from swipe to acquirer/processor, for example (not to be confused with point-to-point... but that's another topic).
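To make the distinction concrete, a quick sketch (generic symmetric crypto, nothing vendor- or PCI-specific): the cipher is the same in both cases; the only difference is where the key lives.

```python
# Sketch: same encryption, different key management.

from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()
ciphertext = Fernet(key).encrypt(b"4111111111111111")

# Bad key management: the key sits in the same database as the ciphertext,
# so anyone who copies the database copies the key along with it.
bad_record = {"card_ct": ciphertext, "key": key}

# Better: the application database holds only ciphertext plus a key *reference*;
# the key material lives elsewhere (an HSM or separate key service) under its
# own access controls and is fetched only at decryption time.
good_record = {"card_ct": ciphertext, "key_id": "kms://cards/key-7"}
```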

Tokenization, on the other hand, leverages that grey area of the standard and thus creates the concept of a "disassociated" data item - the token - which can therefore be used to reduce scope, at least for now. This, too, is missing a critical addendum: "...can reduce scope if and only if the entity holding the token cannot easily detokenize it back to clear data, the token is not reversible or weakly constructed, and it does not retain a 1:1 relationship to live data or permit replay attacks in token-dependent systems as a means to defraud them". Such reversibility - whether through a token service or an in-house solution - may be as simple as a phone call or a dispute request, or access to a weakly managed service API or an in-house database and/or its API. I've heard so many QSAs talk as though tokenization is "stronger" than encryption without fully understanding either. Quite alarming - both for what may have passed as "compliant" and for the level of knowledge behind such statements. Both encryption and tokenization have their place, and format-preserving encryption approaches can essentially provide the benefits of tokenization without the ugly, massive token-lookup database that still has to be managed by someone, somewhere.

The challenge is that the PCI Council published a clarification FAQ on scope and encrypted data, but the answer as written is still unclear. Tokenization, yet again, is not mentioned in the same way, and the encryption model the clarification is based on approaches the problem of scope and encrypted data from the assumption that keys are simply dished out to people. A modern cryptographic approach - perhaps involving hardware key management and public-key techniques - eliminates this, but that fact seems to be lost in the confusion, and we're left with the mantra that encrypted data is cardholder data, again without analysis, logic, or reason.

My view is that scope and tokenization will change in 2010. Oddly, despite encryption being a very mature technology that, with care, can be implemented extremely well, it comes under more scrutiny for correct implementation in PCI DSS precisely because the risks of its use and implementation are well known and can be tested, managed, and controlled - hence it is "in scope".

Taking a regulation-side-stepping approach is short-lived. This is especially true if, by tokenizing, all one is doing is becoming an issuer of a "number" that has a local 1:1 mapping to live data and represents value. How is that descoping? It's making a previously insecure entity an "issuer" of a proxy card number... and, just as you noted, billing, booking, and forward-reservation systems can make active decisions based on that proxy - and thus open themselves to potential abuse without an attacker ever knowing a live credit card number. Fraud by proxy, if you will.

Tokenization also suffers from a widely varying range of implementations, and indeed from scale and security problems - that big database is going to get bigger. Some tokenization approaches are good, and some are scarily bad and insecure - e.g., using poor hashing techniques to create tokens with about as much strength as 30 bits of encryption, or describing techniques that hide the fact that encryption is used up front to produce the token in the first place. And, sadly, there is an almost implicit assumption from many QSAs that scope is reduced, without detailed analysis or a risk assessment of exactly why. To them, token = out of scope, end of story. Bzzzt! Danger!
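To put numbers on that "30 bits": with a known six-digit BIN, and the Luhn check digit determined by the rest of the number, an unsalted hash of a 16-digit PAN leaves only the nine account digits to guess - about 10^9 candidates. A quick sketch (the test PAN is chosen so the search finishes instantly):

```python
# Why an unsalted hash makes a poor "token": with a known BIN, only the nine
# account digits remain to guess, and the check digit follows from them --
# roughly 10^9 candidates, on the order of 30 bits of work.

import hashlib

def luhn_check_digit(partial: str) -> str:
    """Compute the final (check) digit for the first 15 digits of a PAN."""
    total = 0
    for i, ch in enumerate(reversed(partial)):
        d = int(ch)
        if i % 2 == 0:      # double every second digit, starting from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return str((10 - total % 10) % 10)

def recover_pan(token_hash, bin_prefix="411111"):
    """Brute-force a 16-digit PAN from its unsalted SHA-1 "token"."""
    for account in range(10 ** 9):
        partial = bin_prefix + f"{account:09d}"
        candidate = partial + luhn_check_digit(partial)
        if hashlib.sha1(candidate.encode()).hexdigest() == token_hash:
            return candidate
    return None

# "Tokenize" a test PAN by hashing it, then recover it from the hash alone.
# (The account number is tiny here so the demo finishes immediately; searching
# the full space offline is still entirely practical.)
pan = "4111110000000013"
token = hashlib.sha1(pan.encode()).hexdigest()
print(recover_pan(token))   # -> 4111110000000013
```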

Since scope reduction reduces QSA costs, the drive toward it is economic rather than risk-driven: a short-term benefit, but without care a potential long-term disaster. So, as the quality and experience of QSAs improve and the PCI Council moves to the new standard in May 2010, I expect tokenization will come under more intense scrutiny and potentially receive some form of standardization and guidance.

Those rare but very experienced QSAs who really know IT risk and payment-processing environments, and who truly help in managing risk, already know this. Time will tell, though, and hopefully logic and best practice will prevail to deliver the original intent of PCI: reducing risk.

-Mark

I've been reading up on tokenization since my current employer uses it (not for credit card info as far as I know, but for other purposes). Everything I've read has left me (not an InfoSec expert by any means, and more and more I'm learning what I don't know) thinking, "I don't really see how that's much better, since 1) someone still has to secure the user info, and 2) now someone has to be responsible for securing a database where the tokens can be mapped to real data."

Somewhere on this page the term "shell game" was used. That's exactly what it is ... and if the shell game isn't played well (and considering the players involved, one would guess it might be no better than the "original game" they were playing), tokenization doesn't seem to gain much.

At the very least, encryption shouldn't be ignored when using tokens - the two should be used together.

It may not be "as dangerous" for (an unauthorized) someone to have the token as it would be for them to have the actual information ... but it still seems dangerous to me. So they have a little more work to do before they can get actionable data ... that's never stopped them before.

