Last year, the number of personal records exposed by cyber attacks on the financial services industry reached an incredible 446,575,334 – more than triple the figure from the year before. The financial and reputational damage from these data breaches can be immense.
However, customers are increasingly demanding more personalized and engaging experiences, which means being able to maximize the value of the data you hold. De-identification technologies can help strike the right balance between data security and utility. One of the most mature is data tokenization, which has become almost the default approach to securing payment card information.
What is data tokenization?
Data tokenization is the process of replacing sensitive data – such as account numbers or social security numbers – with random values called tokens. Unlike encryption, tokenization uses no mathematical process to transform the information: a token acts only as a reference to the original data and can't be used either to access that data or to guess its values.
There is no key that can be used to derive the original data from the token. Instead, tokenization relies on a highly secure and scalable database – known as a token vault – that stores the relationship between each token and its sensitive data. In this way, the sensitive data is never shared between applications and, if the tokens are stolen, the attacker is left with completely useless and unusable information.
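To make the vault concept concrete, here is a minimal Python sketch of vault-based tokenization. It is illustrative only: an in-memory dictionary stands in for the hardened, access-controlled token vault, and all names are hypothetical.

```python
import secrets

# Illustrative stand-ins for a token vault (assumption: a real vault is a
# secured, audited database, not an in-memory dict).
_vault: dict[str, str] = {}    # token -> sensitive value
_reverse: dict[str, str] = {}  # sensitive value -> existing token

def tokenize(sensitive_value: str) -> str:
    """Return a random token for the value, reusing any token already issued."""
    if sensitive_value in _reverse:
        return _reverse[sensitive_value]
    token = secrets.token_urlsafe(16)  # pure randomness: no key derives the value
    _vault[token] = sensitive_value
    _reverse[sensitive_value] = token
    return token

def detokenize(token: str) -> str:
    """Resolve a token back to the original value; only the vault can do this."""
    return _vault[token]

ssn_token = tokenize("123-45-6789")  # dummy SSN for illustration
print(ssn_token)                     # random string, useless to an attacker
print(detokenize(ssn_token))         # '123-45-6789'
```

Because the token is drawn at random rather than computed from the value, stealing the tokens without access to the vault yields nothing.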
Most financial services companies – and their customers – have experience with data tokenization through credit card payments and ecommerce transactions. Since 2006, Payment Card Industry (PCI) standards have prohibited storing credit card numbers on a retailer's point of sale (POS) system or internal databases after a transaction. Data tokenization has become an accepted way to complete the transaction while keeping cardholder data locked down.
Beyond payments: Key use cases for a financial services organization
Data tokenization reduces the risk to data in use and minimizes the amount of sensitive data exposed to systems that don't need it. In this way, your customers' PII stays secure while business operations can still use the information they need. This can help in various parts of your business.
Order and transaction processing
- In many cases, PII such as social security numbers and driver's license numbers is used as a unique identifier in application and transaction processes. As a result, this sensitive information is stored in and accessed from many different back-end systems to accommodate transactions such as orders, billing, marketing and customer service. Data tokenization allows the sensitive data to be held centrally, with the other back-end systems using only the token (see the sketch below). This increases the speed and efficiency of the process while reducing duplication and risk.
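As a rough sketch of that pattern, reusing the hypothetical tokenize/detokenize helpers from the earlier example: downstream systems persist and join on the token, while only an authorized service ever resolves it.

```python
# Downstream records store only the token, never the SSN itself.
customer_token = tokenize("123-45-6789")

order_record   = {"order_id": "ORD-1001",   "customer": customer_token, "total": 99.50}
billing_record = {"invoice_id": "INV-2001", "customer": customer_token}

# Orders, billing and marketing can all join on the same token...
assert order_record["customer"] == billing_record["customer"]

# ...while only an authorized service ever resolves it to the real value.
print(detokenize(customer_token))
```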
Research and analysis
- To deliver more targeted and personalized services, financial services companies need to take advantage of all the data available to them. However, strict regulations dictate that personal data can't be held for longer than needed, and there may be limits on using it for analytics if it can directly identify an individual. Data tokenization enables you to 'pseudonymize' the data so that it can be retained and made available for research and analysis, as sketched below.
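A sketch of that pseudonymization step, again using the hypothetical tokenize helper: the directly identifying column is swapped for a token, while the fields needed for analysis are kept.

```python
raw_transactions = [
    {"ssn": "123-45-6789", "product": "mortgage", "amount": 1500},
    {"ssn": "123-45-6789", "product": "savings",  "amount": 200},
    {"ssn": "987-65-4321", "product": "savings",  "amount": 350},
]

# Swap the identifier for a token; keep the analytic fields.
analytics_rows = [
    {"customer": tokenize(t["ssn"]), "product": t["product"], "amount": t["amount"]}
    for t in raw_transactions
]

# The same person always maps to the same token, so per-customer analysis
# still works, but the dataset no longer directly identifies anyone.
```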
Data privacy
- New global data privacy regulations – such as GDPR – are placing a much greater focus on how organizations manage personal data, with hefty fines for transgressions. Data tokenization helps ensure data privacy at rest and in transit. It also has another major benefit: if all sensitive data is held within your token vault, it becomes much easier and more cost-effective to demonstrate that your data is secure and well managed, and to meet regulatory stipulations such as GDPR's 'right to be forgotten' (sketched below).
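Building on the earlier hypothetical vault, a sketch of why centralization simplifies erasure: honoring a 'right to be forgotten' request can be as simple as deleting one vault entry, which permanently severs every stored token from the person it referenced.

```python
def forget(sensitive_value: str) -> None:
    """Remove the vault mapping so every token for this value is unresolvable."""
    token = _reverse.pop(sensitive_value, None)
    if token is not None:
        del _vault[token]

forget("123-45-6789")
# detokenize(customer_token) would now raise KeyError: the link is severed,
# without having to hunt the identifier down across every back-end system.
```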
Regulatory compliance
- In addition to supporting your own data security policies, data tokenization helps you meet regulatory requirements such as PCI DSS, GLBA and HIPAA. For example, systems that hold only tokenized data can fall outside the scope of a PCI DSS audit.
Embedding data tokenization as part of your business processes
Today, there are data tokenization platforms – such as OpenText™ Protect™ – that can help financial organizations secure their sensitive data. These cloud-based platforms provide the security and scalability to ensure tokens are properly managed and provisioned rapidly, so customers can complete transactions without ever being aware that tokenization is taking place. Financial services companies gain maximum value from their data in a safe and secure environment.
In our next data security-focused blog, we'll look at what data tokenization offers companies in other sectors. In the meantime, visit our website to learn more about how OpenText can help you secure your sensitive information.