
CLOSING THE GAP IN TOKENIZATION: REMOVING THE LAST VULNERABILITY

A Mercator Advisory Group Research Brief sponsored by Voltage

© 2013 Mercator Advisory Group, Inc.
8 Clock Tower Place, Suite 420 | Maynard, MA 01754
phone: 1(781) 419-1700 | e-mail: [email protected]
www.mercatoradvisorygroup.com

May 2013


Table of Contents

Introduction
Tokenization: A Cost-Effective Way to Reduce Exposure
Why Invest in Tokenization?
Selecting a Tokenization Vendor
   Regulatory Compliance
   Transparency
   Maintenance
   Ease of Implementation
   Scalability
   Auditability
Key Features to Look for in Selecting a Tokenization Solution
   Stateless Tokenization
   Data Integrity
   Randomly Generated Tokens
   Support for Fine-Grained Permissions
   Token Multiplexing
Summary
About Voltage Secure Stateless Tokenization™
About Mercator Advisory Group


Introduction

When the card networks adopted the Payment Card Industry Data Security Standard (PCI DSS), they effectively raised the bar on how sensitive data should be protected. Their objectives had consequences for any company that accepts network-branded cards for payment. Companies that accept payment cards were suddenly required to comply with data security requirements at levels that were previously the exclusive domain of companies specializing in financial services, such as banks and securities firms. Many were not prepared.

Controversy shrouded the launch of PCI DSS as merchants balked at the complexity and cost of compliance. This raised public awareness of the importance of payment card security. With the market increasingly moving toward a "card-not-present" environment of online shopping and e-commerce, companies that processed large volumes of payment transactions quickly became targets for criminals aiming to commit fraud and identity theft.

The market responded to the security standards by rushing to either end of the compliance continuum. At one end were companies that viewed the processing of payments as neither strategic nor core to their business. These firms quickly outsourced the entire payment process; their approach to compliance was avoidance. By externalizing the payment, a company could avoid coming in contact with the card number and thus remain outside the scope of PCI compliance.

At the other end of the continuum, companies embraced the standard. For many, the focus was on its foundational elements: securing the card number while in transit and at rest. Through encryption (e.g., using the Secure Sockets Layer, or SSL, protocol) and by exploiting the loopholes of "compensating controls," the IT departments of many organizations convinced their management that these baseline requirements were relatively easy to meet. As the standards evolved and new approaches emerged in the market, such primitive approaches were increasingly viewed as risky.

As the card schemes increased their emphasis on monitoring compliance and Qualified Security Assessors (QSAs) performed their assessments, companies came to understand the full implications and complexity of PCI compliance. Every system that came in contact with the card number fell within its scope. Programmers who had access to any of these systems had to be authenticated with two factors, and their system activities came under increased scrutiny through Open Web Application Security Project (OWASP) scans and continual penetration testing of the secure perimeter.

Complicating matters further, the standards continue to evolve, and the general requirements are subject to interpretation by the QSA: actions deemed compliant by one auditor may be considered insufficient by another. Worse, companies that had remediated their systems and received a QSA's Report on Compliance (ROC) have still been breached, suffering not only the loss of the sensitive data that PCI was supposed to protect but also negative publicity and significant costs.


Companies increasingly saw the overall cost of compliance skyrocketing while the corresponding reputational and financial risk did not seem to diminish. PCI compliance did not necessarily mean a company was impervious to breach; an ROC simply documents that, at the point in time when the assessment was prepared, the standards and procedures as communicated met the requirements set forth by PCI.

Data security requires both vigilance and an expense proportional to the size and number of systems with access to card data. The greater the number of systems involved, the more hardware and software must be monitored, the more programmers and their programs must be reviewed, and the more users have access to the data. As PCI continues to evolve, changes in the standards must be applied across all of the involved systems, further increasing cost and complexity.

Tokenization: A Cost-Effective Way to Reduce Exposure

As companies wrestled with the cost and complexity of PCI compliance, tokenization emerged as a viable alternative to the "secure the universe" approach. In tokenization, the sensitive information (the card number) is removed from all of the systems and replaced by a surrogate value that has no value to external parties such as thieves. The security of this surrogate, called a token, is preserved because it cannot be reverse engineered to ascertain the original number.

Tokenization converts credit card numbers into randomly generated values. At the front end, when a card number is captured at the point of sale or through an e-commerce website, it is transmitted securely (encrypted) to the tokenization engine, which immediately converts the sensitive data into a token that is returned to the requesting application. The token is maintained within the various applications; when the card number must be presented to the acquirer or card schemes, the token is detokenized, replaced with the card number, and securely presented to the destination. The challenge of tokenization is to properly secure the environment (called the trusted zone) where the card number is tokenized and detokenized, as well as the storage area where the cross-reference is typically maintained.

Internally, the card number is typically retained only for customer service (e.g., to resolve disputes and handle charge-backs) or to reverse or repeat a previous payment. A token enables both of these functions without bringing the system within the scope of a PCI assessment. Some organizations also use the card number as a means to identify the customer and his or her bank, analyzing the card number to produce loyalty or sales analysis. While such use is frowned upon by the card industry, with a properly structured token the activity can be performed without the risk of divulging sensitive payment information.

For most companies to implement a tokenization solution cost effectively, the token has to resemble many of the characteristics of an actual card. Systems often have internal functions that look for valid lead numbers (BINs, or Bank Identification Numbers) that correspond to the card scheme (4 for Visa, 5 for MasterCard, etc.), for the appropriate number of digits for the card (13-19), or even for a "mod-10" (modulus-10, or Luhn formula) check digit. Tokenization addressed these challenges with "format-preserving" approaches in which the token replicates the characteristics of a card: certain characteristics of the original card are maintained (e.g., the last four digits of the card number) and/or some representation of the type of card and issuer. As best practices have evolved, the industry has driven toward a standard of "distinguishability," whereby the characteristics of the token make it clear that it is not an actual card number.

Removing the card number from all enterprise systems significantly reduces the scope of a merchant's PCI compliance requirement and minimizes the merchant's exposure to crimes of convenience and social engineering. Tokenization itself is rather simple, typically consisting of fewer than a dozen programs or services that convert a card number into a token and vice versa. This further reduces the complexity of compliance monitoring: the number of programmers required to maintain the application is minimal, the in-scope hardware is nominal, and the administrators' ability to audit system access is significantly enhanced. However, maintaining a secure processing environment that can scale and remain impervious to compromise continues to present challenges for even the most sophisticated IT departments.

One approach to tokenization is to outsource the process. In the long run, however, this can be an expensive proposition, as many vendors apply a "per-access" charge every time the data vault is accessed. Outsourcers also tend to be inflexible and restrictive in the way the service is configured, since a standardized offering drives economies of scale and margin. Additionally, some are aligned with specific acquirers, potentially requiring new relationships, and their proprietary approaches often pose difficulties when changing providers.

In-house solutions provide merchants and enterprises with better control and typically a more cost-effective solution. Most solutions, however, require maintenance of the cross-reference database that holds the card data. A token database requires regular rotation of the security keys, which has implications for disaster recovery and business continuity backups, database release management and upgrades, audit log management, dual controls, and more. Addressing these challenges distracts scarce resources from mission-critical business issues. And even after removing the card number from the various enterprise systems and restricting the presence of credit card information to a well-protected database, companies remain exposed to database administrators, superusers, and system administrators who can gain access to the sensitive data and could potentially transfer a vault's contents to personal storage devices.

The beauty of tokenization is that it reduces the number of systems and users that have access to the actual card data, which improves control. The downside is that it concentrates the sensitive information into a small footprint that becomes the target of criminals. The optimal solution is to further neutralize the attractiveness of the tokenization database by eliminating the sensitive data altogether.
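To make these format constraints concrete, the following is a minimal sketch in Python (a hypothetical illustration, not any vendor's actual algorithm) of a format-preserving token generator: it keeps the leading digit and the last four digits of the PAN, randomizes the middle, and then adjusts one digit so the result still passes the mod-10 (Luhn) check that legacy validation routines apply. A deployment following the newer distinguishability practice might instead make the token deliberately fail that check.

```python
import secrets

def luhn_checksum(pan: str) -> int:
    """Mod-10 (Luhn) checksum of a digit string; 0 means the number validates."""
    total = 0
    for i, ch in enumerate(reversed(pan)):
        d = int(ch)
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10

def format_preserving_token(pan: str) -> str:
    """Random surrogate that keeps the first digit and last four digits of
    the PAN and still satisfies the mod-10 check legacy systems apply."""
    head, tail = pan[0], pan[-4:]
    middle = [secrets.randbelow(10) for _ in range(len(pan) - 5)]
    for candidate in range(10):  # one of the ten digits always balances the sum
        middle[0] = candidate
        token = head + "".join(map(str, middle)) + tail
        if luhn_checksum(token) == 0:
            return token
    raise AssertionError("unreachable: some digit always satisfies mod-10")

# Example with a well-known test PAN; the surrogate varies per run.
print(format_preserving_token("4111111111111111"))
```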


Why Invest in Tokenization?

Firms must consider many factors when deciding whether to implement tokenization in-house with a premises-based solution or to outsource it to a service provider. Any consumer-facing company that collects personally identifiable information (PII) on Web forms as well as credit card numbers should lean toward managing a PCI tokenization solution in-house. Many of the standards and best-practice guidelines that have been applied to payment card data will ultimately apply to the management and use of PII, and having a solid data security practice and a mechanism for storing sensitive data will enhance a company's ability to safely expand its operations, particularly as cloud-based models and big data analytics increasingly require data to be shared.

Tokenization provides a number of benefits, the primary one being the reduction in the number of applications and hardware components that handle sensitive data. By limiting the presence of card numbers to a small number of servers and programs, significantly fewer applications, devices, and people fall within the scope of the compliance assessment. This simplifies the assessment and reduces costs, along with the need for encryption software licenses, two-factor authentication tokens, software change management, and transaction monitoring activities.

Ongoing maintenance of a tokenization environment is also reduced. A tokenization environment typically consists of a limited number of programs and transactions, which reduces the number of programmers who require access to the actual card number data. With a small number of predictable transactions, ongoing monitoring of the environment for suspicious activity can be enhanced and aberrations readily identified for action. The result is a more secure and auditable environment.

Selecting a Tokenization Vendor

Once a decision is made to implement an on-premises tokenization solution, there are a number of factors to be considered.

Regulatory Compliance

Many solutions on the market claim to be PCI compliant, but in reality compliance requirements extend well beyond individual hardware or software components. PCI compliance involves ongoing monitoring of activity, penetration testing, reviews of code changes, control over system and software access, and more. The ability to minimize the scope of these ancillary processes should be a primary consideration, as it will reduce the overall cost of compliance.

An assessment is not an exact science, and PCI leaves much to the discretion of the Qualified Security Assessor (QSA). It is important to obtain a solution from a vendor with a proven track record of helping clients obtain Reports on Compliance (ROCs) and satisfy QSAs' remediation requirements. Such vendors typically have invested in their products and created innovative solutions that consistently address the challenges of PCI compliance and tokenization.


Transparency

Transparency is key. Vendors need to be able to provide proof of their data protection claims and to explain their methods in detail to customers and their QSAs. The real proof, however, is independent verification: the vendor's solution has been put to the test by multiple QSAs, clients have received their ROCs, and the vendor's customers have stood the test of time without security compromises.

Maintenance

Solutions that are specific to particular hardware platforms, operating systems, or other supporting software become dependent on those environments for their compliance. As those platforms and systems change, the software that runs on them typically has to change as well. Solutions that are agnostic to platform and operating system provide a considerable advantage over those that are tied to specific environments and those environments' vulnerabilities.

Part of maintenance is key management, the rotation of master keys, which is paramount to a solid security practice. In a typical tokenization environment, synchronizing a new key with previously issued tokens can be problematic. Look for a solution that recognizes the importance of key management and provides tools that simplify the process.

Ease of Implementation

Implementing a tokenization solution requires that the conversion from card number to token and back be placed in-stream with mission-critical operations, and there are often multiple sources and destinations for payment card data. The solution needs to fit seamlessly within existing operations rather than requiring that operations be revamped to accommodate the software; its impact on overall operations should be minimal. The solution should provide a variety of format-preserving options for configuring the token. It should minimize the remediation of existing systems while providing back-office and customer service personnel with the information they need to perform their tasks (e.g., the BIN or the last four digits of the card number).

Scalability

Once a tokenization solution has been selected, it should require minimal monitoring to ensure that it can support the growth of the business. Resource consumption for tokenization should be minimal and predictable so that the solution does not demand attention as the data center expands and platforms are upgraded. Memory-based processing is much faster and offers higher availability than database access, which is vulnerable to channel and database synchronization bottlenecks as well as hardware failure.


Auditability

One of the advantages of tokenization is that it concentrates the sensitive data in a single location with a very small universe of permitted activities. This enables a high level of scrutiny of the activity that takes place. The tokenization solution should be able to provide extensive detail about any activity in which the card number is accessed, whether by an application or by a user.

Key Features to Look for in Selecting a Tokenization Solution

Features that can help your business overcome the challenges of traditional tokenization solutions include the following.

Stateless Tokenization

Token databases, also known as token vaults, are always in scope for PCI compliance because they contain the actual card data. Traditional tokenization solutions are vulnerable because they maintain the cross-reference between the token and the encrypted card number in a database table. "Stateless" tokenization removes the storage of card data from any system, eliminating the need for token databases. In addition to eliminating the exposure to file theft, a stateless tokenization engine is typically agnostic to platform and operating system, which reduces both implementation cost and complexity. Note, however, that not all stateless tokenization solutions are provably secure. Look for independent, published third-party validation and security proofs substantiating a vendor's solution.

Data Integrity

Traditional tokenization solutions cannot guarantee a consistent one-to-one mapping of credit card number to token. Enterprises and merchants that implemented first-generation tokenization approaches, whether commercial or homegrown, have encountered critical data integrity problems, including inaccurate analytics and broken application correlation, because a credit card number was sometimes replaced by more than one token (a side effect of a distributed token database).

Randomly Generated Tokens

Creating tokens from a pregenerated token table removes the need for a token database and eliminates the exposure of the token to reverse engineering, since no "feasible reversibility" relationship exists between the primary account number (PAN) and the resulting token. This method also improves security by removing the storage of cardholder data, and it improves overall throughput. Pregenerated token tables can be used in distributed architectures, with each installation prepopulated with the same look-up table, which scales easily to support regional and global growth. And because token generation is performed in memory, there is no need for specialized hardware or software (which typically require expensive licenses), further driving down total cost of ownership.
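As a rough illustration of the table-driven idea, the Python sketch below builds a pregenerated look-up table of one random digit permutation per token position. This is a deliberately simplified toy, not Voltage's actual SST algorithm (which is based on published research and far larger random tables); it only shows why any installation loaded with the same table produces the same token for the same PAN, giving the consistent one-to-one mapping discussed above without a vault. The table itself then becomes the secret to protect and distribute.

```python
import secrets

def generate_token_table(num_positions: int) -> list[list[int]]:
    """Pregenerated look-up table: one random permutation of the digits 0-9
    per token position, produced with a CSPRNG (Fisher-Yates shuffle)."""
    tables = []
    for _ in range(num_positions):
        perm = list(range(10))
        for i in range(9, 0, -1):
            j = secrets.randbelow(i + 1)
            perm[i], perm[j] = perm[j], perm[i]
        tables.append(perm)
    return tables

# Generated once, then distributed to every installation so all sites map a
# given PAN to the identical token (consistent 1-to-1 mapping, no vault).
TOKEN_TABLE = generate_token_table(11)  # middle digits of a 16-digit PAN

def tokenize(pan: str) -> str:
    """Substitute the middle digits through the table, preserving the first
    digit and the last four for back-office use."""
    mapped = "".join(str(TOKEN_TABLE[i][int(d)]) for i, d in enumerate(pan[1:-4]))
    return pan[0] + mapped + pan[-4:]

def detokenize(token: str) -> str:
    """Invert the look-up; possible only for holders of the secret table."""
    inverse = [{v: k for k, v in enumerate(col)} for col in TOKEN_TABLE]
    original = "".join(str(inverse[i][int(d)]) for i, d in enumerate(token[1:-4]))
    return token[0] + original + token[-4:]
```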


Support for Fine-Grained Permissions

One of the primary advantages of tokenization is that the reduced footprint provides much greater control over who has access to the card information. Fine-grained permissions enable companies to lock down access to sensitive data without impeding business processes. Individual applications or users can be limited to tokenized data only, to detokenized data, or to detokenized data with certain digits "masked." Fine-grained permissions are typically accompanied by detailed audit logs that provide indisputable evidence of who had access to which card.
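A minimal sketch of what such a policy might look like in code (the caller names, access levels, and log format are hypothetical, not features of any particular product): detokenization consults a permissions table, returns only as much data as the caller is entitled to, and writes an audit record for every access.

```python
import logging
from enum import Enum, auto

logging.basicConfig(format="%(asctime)s %(message)s", level=logging.INFO)
audit = logging.getLogger("token.audit")

class Access(Enum):
    TOKEN_ONLY = auto()  # application works with tokens, never sees card data
    MASKED = auto()      # e.g., customer service: last four digits visible
    FULL = auto()        # e.g., settlement: complete PAN

# Hypothetical policy table mapping callers to access levels.
PERMISSIONS = {
    "loyalty-batch": Access.TOKEN_ONLY,
    "call-center": Access.MASKED,
    "settlement": Access.FULL,
}

def detokenize_for(caller: str, token: str, lookup) -> str:
    """Return the most data the caller is entitled to, auditing every access.
    `lookup` is whatever function recovers the PAN from the token."""
    level = PERMISSIONS.get(caller, Access.TOKEN_ONLY)
    audit.info("caller=%s token=%s level=%s", caller, token, level.name)
    if level is Access.TOKEN_ONLY:
        return token                             # no card data released
    pan = lookup(token)
    if level is Access.MASKED:
        return "*" * (len(pan) - 4) + pan[-4:]   # mask all but the last four
    return pan
```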

Token Multiplexing

Multiplexing is a means of avoiding "high-value" tokens by enabling token independence, that is, unique token mappings between merchants, applications, and lines of business. A high-value token is a token that takes on the characteristics of the card number it replaces, such that the token itself can be used to initiate a payment transaction within the organization; in such instances, the token falls under the same PCI scrutiny as the actual card. Token multiplexing restricts the use of the token without the cost and complexity of multiple databases.
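Continuing the simplified table sketch above (again a hypothetical illustration, not a vendor algorithm), multiplexing can be pictured as giving each line of business its own pregenerated table, so the same PAN yields a different token in each domain and a token copied out of one system carries no value in any other.

```python
import secrets

def _permutation() -> list[int]:
    """One random permutation of the digits 0-9 (CSPRNG Fisher-Yates)."""
    digits = list(range(10))
    for i in range(9, 0, -1):
        j = secrets.randbelow(i + 1)
        digits[i], digits[j] = digits[j], digits[i]
    return digits

# One pregenerated table per line of business: the mappings are independent,
# so a token from one domain cannot initiate activity in another.
DOMAIN_TABLES = {
    domain: [_permutation() for _ in range(11)]
    for domain in ("e-commerce", "call-center", "loyalty")
}

def tokenize_for_domain(pan: str, domain: str) -> str:
    """Same PAN, different token per domain; first digit and last four kept."""
    table = DOMAIN_TABLES[domain]
    mapped = "".join(str(table[i][int(d)]) for i, d in enumerate(pan[1:-4]))
    return pan[0] + mapped + pan[-4:]

# The same card produces unrelated surrogates in each domain:
pan = "4111111111111111"
print({d: tokenize_for_domain(pan, d) for d in DOMAIN_TABLES})
```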

Summary

PCI DSS set the standard for how sensitive data must be protected. For any enterprise engaged in consumer-facing commerce, establishing capabilities for secure transactions is strategic and core to the business. Minimizing the footprint where sensitive data is stored reduces the complexity, the cost, and ultimately the exposure associated with protecting that data. Removing sensitive data from core business applications and replacing it with a token (the process of tokenization) is increasingly recognized as the industry best practice for securing sensitive data.

Although it reduces the presence of the card number, traditional tokenization does not remove the card's associated sensitive data from the enterprise, so heightened security procedures must be implemented to protect the physical storage of the card numbers. The industry trend toward so-called stateless tokenization, in which processing is done in memory instead of through database look-ups, minimizes this storage exposure while enhancing the throughput and scalability of a tokenization solution.


About Voltage Secure Stateless Tokenization™

The features described in this report are available as part of the new Voltage Secure Stateless Tokenization™ (SST) solution from Voltage Security. Voltage SST has been designed by cryptographic experts, is based on published and proven academic research, and has been validated by third-party QSAs and cryptography experts. It provides maximum protection against data exposure from security breaches while offering a proven technique for PCI DSS compliance and maximum reduction of PCI audit scope. Contact Voltage at http://www.voltage.com/company/contact-us/ for follow-up.

Voltage Security®, Inc. is the leading provider of scalable and proven data-centric security and key management solutions, enabling customers to effectively combat new and emerging security threats. Powered by groundbreaking innovations including Identity-Based Encryption™, Format-Preserving Encryption™, Page-Integrated Encryption™, and Secure Stateless Tokenization™, our powerful data protection solutions allow any company to seamlessly secure all types of sensitive corporate and customer information, wherever it resides, while efficiently meeting regulatory compliance and privacy requirements. For more information, visit www.voltage.com.


About Mercator Advisory Group

Mercator Advisory Group is the leading independent research and advisory services firm exclusively focused on the payments and banking industries. We deliver a unique blend of services designed to help clients uncover the most lucrative opportunities to maximize revenue growth and contain costs.

Advisory Services: Unparalleled, independent, and objective analysis across practice areas that include Banking Channels; Credit; Commercial and Enterprise Payments; Debit; Emerging Technologies; Fraud, Risk, and Analytics; International; and Prepaid.

The CustomerMonitor Survey Series: A set of topically grouped reports gleaned from a unique set of specific, pragmatic, and detailed questions, updated yearly to capture critical topic content.

Custom Research and Consulting Services: Services enabling clients to gain actionable insights, implement more effective strategies, and accelerate go-to-market plans. Offerings include tailored project-based expertise, customized primary research, go-to-market collateral, market sizing, competitive intelligence, and payments industry training.

PaymentsJournal.com: The industry's only free online payments and banking news information portal, delivering focused content, expert insights, and timely news.

For information, contact Mercator Advisory Group at 781-419-1700.

Copyright Notice

External publication terms for Mercator Advisory Group information and data: any Mercator Advisory Group information that is to be used in advertising, press releases, or promotional materials requires prior written approval from the appropriate Mercator Advisory Group research director. A draft of the proposed document should accompany any such request. Mercator Advisory Group reserves the right to deny approval of external usage for any reason.

Copyright 2013, Mercator Advisory Group, Inc. Reproduction without written permission is completely forbidden.
