Your bank is concerned about protecting personally identifiable information (PII), which is one of its most valuable assets. PII is present throughout your data; keeping it safe is essential to retaining customer trust and maintaining regulatory compliance.
AI/ML use has elevated the demand for data, and much of that data use may occur outside of your bank. Data use for analytics and reporting has traditionally taken place inside your bank, whereas AI/ML processing often happens outside it.
Your bank is unsure of the best approach to protect its PII. Traditional access controls are no longer effective because AI/ML requires access to large data sets to create value for your bank.
Our Advice
Critical Insight
- The use of AI is new in your bank. Your employees are eager to explore and try new ideas, but they don’t understand how AI uses and permanently stores data.
- AI presents new challenges for your bank that have not been considered before. Your bank doesn’t have formal acceptable-use policies, your employees don’t fully understand how AI works, and your data security controls were not designed with applications like AI in mind.
- You are learning about the challenges AI presents, but you are not sure of potential solutions. New processes and applications will secure your PII and ensure you remain secure and compliant.
Impact and Result
You must implement several changes to your bank to secure PII and educate your employees about AI:
- Employee training on AI and its use is essential within the bank and may also help shape the overall culture.
- You must scan your existing systems and data to locate PII, then encrypt or tokenize it. New tools can scan your local, on-premises, and cloud storage to identify existing PII that needs to be retroactively encrypted or tokenized (see the sketch after this list).
- All new PII collected by your bank must immediately be encrypted or tokenized. Real-time encryption or tokenization protects PII from the moment it enters your systems.
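To make the scanning step above more concrete, the following Python sketch illustrates retroactive PII discovery and tokenization over a file store. The directory path, regex patterns, and in-memory token vault are illustrative assumptions only; a production deployment would rely on a dedicated PII-discovery tool and a hardened, HSM-backed token vault rather than this minimal approach.

```python
# Minimal sketch: retroactively discover PII in stored files and replace it
# with surrogate tokens. Paths, patterns, and the vault are illustrative.
import os
import re
import secrets

# Hypothetical detection patterns for common PII formats (US SSN, email).
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

# Stand-in for a token vault: maps each generated token back to the raw value.
token_vault: dict[str, str] = {}


def tokenize(value: str) -> str:
    """Replace a raw PII value with a random surrogate token."""
    token = f"TKN-{secrets.token_hex(8)}"
    token_vault[token] = value  # real systems keep this mapping in a secured vault
    return token


def scan_and_tokenize(root_dir: str) -> None:
    """Walk a storage tree, find PII matches, and rewrite files with tokens."""
    for dirpath, _dirs, files in os.walk(root_dir):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "r", encoding="utf-8") as fh:
                    text = fh.read()
            except (UnicodeDecodeError, OSError):
                continue  # skip binary or unreadable files in this sketch
            updated = text
            for pattern in PII_PATTERNS.values():
                updated = pattern.sub(lambda m: tokenize(m.group(0)), updated)
            if updated != text:
                with open(path, "w", encoding="utf-8") as fh:
                    fh.write(updated)
                print(f"Tokenized PII in {path}")


if __name__ == "__main__":
    scan_and_tokenize("/data/customer-exports")  # illustrative path only
```

The same pattern generalizes to databases and cloud object stores: enumerate records, detect PII with patterns or classifiers, and replace raw values with tokens while the originals move into a controlled vault.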
Protecting Personally Identifiable Information When Using AI in Banks
The sooner you protect, the safer you are.
Analyst Perspective
The best time to protect personally identifiable information (PII) is at the moment of collection.
The use of artificial intelligence (AI)/machine learning (ML) and generative artificial intelligence (Gen AI) is growing daily. As appealing as the benefits are, their use presents a considerable threat to your bank and its customers. AI requires large amounts of data to power the deep insights that it offers. This need for data represents the greatest threat to your bank and its customers’ PII.
Unlike other applications, the PII threats that arise from AI/ML come primarily from within your bank. This contrasts with most of the other data-related threats that your bank is used to encountering, which are primarily external. The internal nature of the risks associated with AI/ML has meant that many banks are not well prepared. Preparing for internal risks is quite different from preparing for external threats.
Perhaps the greatest challenge is that every employee in your bank has the potential to expose PII while experimenting with or using AI/ML tools. Unless there is a well-defined threat, your bank must assume that everyone is a potential risk, regardless of whether the threat is intentional or accidental. The outcome in both cases is the same.
Unlike other applications and tools, AI and ML are unpredictable and difficult to control. The most serious problem is the inability to permanently retract or delete data that has been sent to an AI/ML application. This means that you cannot predict when the data might re-emerge in responses to other users of the AI/ML application. Your bank must take immediate action to safeguard its PII.
David Tomljenovic, MBA, LLM, CIM
Head of Financial Services Industry Research
Info-Tech Research Group
Executive Summary
Your Challenge
Your bank is concerned about protecting personally identifiable information, which is one of its most valuable assets. PII is present throughout your data, so keeping it safe is essential for retaining customer trust and maintaining regulatory compliance. AI/ML use has elevated the demand for data, and much of that data use may occur outside your bank. The use of data for analytics and reporting has traditionally occurred inside your bank, while AI/ML processing often happens outside it. Your bank is unsure of the best approach to protect its PII. Traditional access controls are no longer effective because AI and ML require access to large data sets to create value for your bank.
Common Obstacles
The use of AI is new in your bank and your employees are eager to explore and try new ideas. However, they do not understand how AI uses and permanently stores data. AI presents new challenges for your bank – challenges that have not been considered before. Your bank does not have formal acceptable-use policies, your employees do not fully understand how AI works, and your data security controls were not designed with applications like AI in mind. You are learning about the challenges that AI presents, but you are not sure about potential solutions. You need new processes and applications to secure your PII and ensure that you remain secure and compliant.
Info-Tech's Approach
You must implement several changes to your bank to secure PII and educate your employees about AI: train employees on AI and its use, scan existing systems and storage to locate PII and retroactively encrypt or tokenize it, and encrypt or tokenize all new PII at the moment of collection.
Info-Tech Insight
Encryption and tokenization of PII provide your bank with the protection it needs, not only against threats related to the use of AI but also against other data-related threats. Real-time and retroactive encryption and tokenization of PII will also allow free access to the data that your bank needs to drive product, service, and customer experience innovation.
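As a sketch of what real-time protection at the point of collection could look like, the Python example below uses deterministic, HMAC-based surrogate tokens. This choice is an assumption for illustration, not a prescribed method: because the same input always yields the same token, tokenized fields remain joinable and usable as analytics or AI/ML features without exposing the raw values. The field names and key handling are hypothetical; a production system would draw keys from an HSM or managed key service and might instead use format-preserving encryption or a vault-based tokenization service.

```python
# Minimal sketch: tokenize PII fields at the moment a record is collected,
# using deterministic HMAC-based surrogates so the data stays analytically useful.
import hmac
import hashlib

SECRET_KEY = b"replace-with-key-from-your-kms"  # placeholder; never hard-code keys


def tokenize_field(value: str) -> str:
    """Derive a stable surrogate token from a PII value using HMAC-SHA256."""
    digest = hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256)
    return f"TKN-{digest.hexdigest()[:16]}"


def ingest_customer_record(record: dict) -> dict:
    """Tokenize PII fields immediately on collection; pass other fields through."""
    pii_fields = {"name", "email", "ssn", "phone"}  # assumed schema
    return {
        key: tokenize_field(str(val)) if key in pii_fields else val
        for key, val in record.items()
    }


# The stored record contains no raw PII, but the stable tokens still support
# joins, segmentation, and model features across systems.
raw = {"name": "Jane Doe", "email": "jane@example.com", "balance": 1520.75}
print(ingest_customer_record(raw))
```

Because the tokens are stable, downstream analytics and AI/ML workloads can operate on tokenized records without ever handling the underlying PII, which is the access model described in the insight above.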
PII is frequently exposed as part of larger data breaches
PII was exposed in 80% of data breaches.
Source: BigID, 2021
40% of Americans had PII exposed during the Equifax data breach.
Source: UpGuard, 2024
Info-Tech Insight
PII may not always be the primary target of a data breach, but it must always be treated as if it were.
There is a real cost to data breaches in banks
Customer account information is highly sought after.
JPMorgan Chase had 83 million customer records breached, costing the company US$100 million.
Financial PII is widely collected and highly vulnerable.
Equifax had 143 million customer records breached, which cost the company US$300 million.
Source: Finextra, 2022
Info-Tech Insight
Beyond financial penalties, financial institutions that experience data breaches risk their reputation and their customers’ trust.