Why APAC banks must rethink their data architecture in the age of AI-powered digital banking.
In the financial services sector, data is the business. From fraud detection to trading and personalized customer experiences, every decision hinges on fast, accurate, and trustworthy data.
However, across APAC, many financial institutions continue to struggle with fragmented data environments – scattered across cloud, on-premises, and legacy systems. A global study found that close to 3 in 5 banking executives cited data silos as a major barrier to digital progress.
These silos not only slow innovation but also create blind spots that weaken cyber resilience and complicate regulatory reporting.
As banks accelerate AI adoption across operations, fragmented infrastructure threatens both trust and performance. According to a recent IDC report, APAC FSI organizations are bracing for stricter digital resilience mandates and the cost of inaction is rising.
In a discussion with Steve Rackham, CTO for Financial Services at NetApp, we find out how financial institutions across the region can overcome data fragmentation and build AI-ready infrastructure that strengthens performance, compliance, and cyber resilience.
How does data fragmentation hinder AI performance and innovation in the FSI sector?
Steve Rackham (SR): Financial services organizations are rapidly adopting AI to enhance efficiency, detect fraud, bolster cybersecurity, and improve customer experience. Yet, a significant hurdle persists: data fragmentation.
Data is scattered across IT environments – from on-premises systems to various clouds – and across multiple ingestion points such as ATMs and mobile banking apps. All of this creates silos that directly undermine AI initiatives, which need a unified view of critical business data to succeed.
AI algorithms demand vast, high-quality datasets to train robust models. When data is siloed, preparing these datasets becomes arduous, time-consuming, and error-prone, impacting model accuracy and deployment speed. This often prevents AI projects from reaching production or delivering scalable outcomes.
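As a toy illustration of why siloed schemas make dataset preparation arduous (every source name, field, and record below is invented, not any real bank's data), unifying three silos into one training-ready shape might look like:

```python
# Hypothetical records from three siloed channels; schemas deliberately differ,
# as they often do across legacy systems, mobile apps, and ATM networks.
atm_events = [{"cust": "C1", "amount": 200.0, "channel": "atm"}]
mobile_events = [{"customer_id": "C1", "amt": 35.5, "channel": "mobile"}]
branch_events = [{"id": "C2", "value": 120.0, "channel": "branch"}]

def normalize(record, id_key, amount_key):
    """Map one silo's schema onto a single shared schema for model training."""
    return {
        "customer_id": record[id_key],
        "amount": float(record[amount_key]),
        "channel": record["channel"],
    }

# Unify all three silos into one consistently shaped dataset.
unified = (
    [normalize(r, "cust", "amount") for r in atm_events]
    + [normalize(r, "customer_id", "amt") for r in mobile_events]
    + [normalize(r, "id", "value") for r in branch_events]
)

print(len(unified))  # 3
```

Every extra silo adds another bespoke `normalize` mapping to write, validate, and maintain – which is precisely the arduous, error-prone preparation work a unified architecture removes.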
According to the NetApp 2024 Data Complexity Report, 85% of APAC tech executives agree that unifying data is essential for optimal AI outcomes by 2025, highlighting the urgency of this challenge.
Looking beyond performance, fragmentation introduces severe governance and security risks, particularly for heavily regulated financial institutions. The operational overhead of managing fragmented data also drains resources, diverting IT teams from developing new intelligence-driven financial products and services. This delays time-to-market and stifles competitive advantage.
AI-ready data architecture, built for secure, scalable, and governed access across hybrid environments, is foundational for accelerating AI success and innovation in financial services and beyond.
How do siloed legacy systems compromise both regulatory compliance and cyber resilience, and create blind spots that delay incident detection and regulatory reporting?
SR: From a regulatory compliance perspective, disparate legacy systems make it extremely difficult to achieve a unified view of data. Imagine trying to accurately track, categorize, and report sensitive customer information for strict financial regulations when everything is spread across disconnected databases.
This absence of a single control plane means applying consistent data governance policies is a nightmare, resulting in increased manual work, error-prone reporting, and a higher risk of non-compliance fines and reputational damage.
When it comes to cyber defense, these siloed legacy systems are inviting targets for attacks. They often lack modern, built-in security and use outdated protection. Finding unusual activity, spotting threats, and responding quickly to incidents becomes much harder because security teams cannot easily connect the dots across all these different systems.
These delays make incidents tougher to investigate and stretch recovery processes to weeks or even months.
True cyber resilience requires all teams to work together, and deploying the right, intelligent IT architecture enables just that. By removing data silos, it also allows organizations to stay ahead of threats, respond quickly, and maintain continuity no matter what.
Why is there a need for AI-ready data architecture that spans hybrid environments?
SR: Instead of looking at AI-readiness as a technical issue to be solved, we should be looking at AI-ready data architecture across hybrid environments as a business necessity, shaped by enterprise data needs and AI demands.
Now, AI isn’t new to the financial services industry. It has already reshaped business operations and value; the question organizations face today is how to operationalize AI at scale, and that depends on accessible, well-managed data.
Three reasons why this is important:
- As enterprises operate across diverse landscapes – on-premises data centers, private clouds, and multiple public cloud providers – an AI-ready architecture provides unified data access and services across this hybrid reality. Data teams can then access, prepare, and leverage data wherever it resides without compromise. Without this unification, fragmentation leads to inaccessible datasets, inconsistent data quality, and stifled innovation.
- AI workloads are dynamic and resource-intensive. Training large models requires massive compute power and high-performance data access, best delivered in the cloud. However, fine-tuning or inferencing might benefit from proximity to on-premises data or specific hardware configurations. An AI-ready hybrid architecture provides the flexibility to run AI workflows – from training to fine-tuning to inferencing – wherever they perform best and are most cost-effective, ensuring data availability and governance across the entire AI data lifecycle.
- AI success hinges on data governance and security. In hybrid environments, ensuring secure, compliant, and governed data access for AI is paramount. AI-ready architecture must provide built-in data security, policy-based governance, and observability to manage data effectively, minimizing risks associated with sensitive information and intellectual property. This intelligent foundation is critical for moving AI projects from concept to production.
How can organizations unify storage without compromising data residency or sovereignty?
SR: Unifying storage across hybrid environments without compromising data residency or sovereignty is a critical challenge, particularly for highly regulated industries like financial services.
The key is to modernize and simplify data infrastructure by breaking down silos and delivering a unified data architecture that spans on-premises, private cloud, and public cloud environments, ensuring data remains in the required geographical location while still being fully accessible and manageable.
NetApp’s approach, powered by ONTAP, exemplifies this. ONTAP is a single storage operating system that runs seamlessly on-premises, as storage-as-a-service, and in the world’s largest public clouds – AWS, Azure, and Google Cloud.
This architectural design enables organizations to maintain their data within the physical locations required by regulatory mandates and under the legal jurisdiction of their operating country, while still benefiting from centralized visibility and consistent operations.
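As a minimal sketch of what residency enforcement can mean in practice (the dataset names, regions, and rules below are invented for illustration and are not NetApp functionality), a placement check might look like:

```python
# Hypothetical residency policy: which regions each dataset may legally live in.
RESIDENCY_RULES = {
    "sg_customer_pii": {"allowed_regions": {"ap-southeast-1"}},
    "au_transactions": {"allowed_regions": {"ap-southeast-2", "ap-southeast-4"}},
}

def placement_violations(placements):
    """Return datasets currently stored outside the regions their rules permit."""
    violations = []
    for dataset, region in placements.items():
        rule = RESIDENCY_RULES.get(dataset)
        if rule and region not in rule["allowed_regions"]:
            violations.append((dataset, region))
    return violations

# One dataset has drifted outside its mandated jurisdiction.
current = {"sg_customer_pii": "us-east-1", "au_transactions": "ap-southeast-2"}
print(placement_violations(current))  # [('sg_customer_pii', 'us-east-1')]
```

The point of centralized visibility is that a check like this can run continuously against every environment from one place, rather than being re-implemented per silo.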
Through intelligent data services, institutions gain a centralized view of their data assets, improving traceability, enforcing consistent governance policies, and automating critical compliance workflows. Capabilities such as automated data classification, audit-ready backup and retention policies, and integrated data protection reduce the operational burden on compliance teams while improving accuracy and resilience.
A unified control plane like BlueXP provides centralized orchestration so that policies for data access, security, compliance, and governance are consistently applied across the entire data fabric, regardless of where the data physically resides.
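To illustrate the "define once, apply everywhere" idea behind a unified control plane (this is a generic sketch with invented names and fields, not the BlueXP API), a single policy set pushed to every environment might look like:

```python
# One governance policy, defined once for the whole data estate.
POLICY = {
    "encryption_at_rest": True,
    "retention_days": 2555,  # roughly 7 years, a common financial-records horizon
    "pii_access_groups": ["compliance", "fraud-ops"],
}

# Hypothetical environments spanning on-premises and public cloud.
ENVIRONMENTS = ["on_prem_dc1", "private_cloud", "aws_ap_southeast_1"]

def apply_policy(env, policy):
    """Stand-in for pushing one policy to an environment's control plane."""
    return {"env": env, **policy}

applied = [apply_policy(env, POLICY) for env in ENVIRONMENTS]
print(len(applied))  # 3
```

The alternative – maintaining a separate copy of the policy per environment – is exactly where drift, manual errors, and compliance gaps creep in.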
This level of visibility and control enables financial institutions to shift from reactive compliance to proactive risk management – building the agility and trust needed to respond confidently to regulatory changes, security threats, and market volatility.
With this approach, organizations can harness the agility and optimization benefits of cloud transformation without trade-offs: they retain full control over data placement and access, optimize cost and performance at scale across hybrid, multi-cloud environments, and rigorously adhere to data residency and sovereignty obligations.