Core banking platforms are the beating heart of financial services. Decades of accumulated expertise, regulatory rigour, and customer trust are embedded in these systems, which are battle-tested and resilient. But they are also products of a different era. Architectures conceived in the 1980s and 1990s – and refined throughout the 2000s – were never designed for real-time payments, embedded finance, continuous product experimentation, or AI-driven personalisation. What sustained banks through previous decades is no longer sufficient for today’s digital-first institutions, let alone the AI-native banks of the future.
Traditional core banking platforms are increasingly a barrier to innovation in an economy defined by instant payments, embedded finance, and customer experiences benchmarked against Amazon and Apple. The problem is not a lack of effort. Banks have poured billions into modernisation programmes. Too often, however, these initiatives amount to incremental change – technical sleight of hand that keeps the lights on but fails to address the structural constraints at the heart of the monolith.

Banks cannot simply bolt modern customer expectations onto foundations designed for a slower, analogue era. The industry is undergoing a deeper transition: from the monolithic platforms of the past to modular architectures of the early cloud era, and now towards a molecular, cloud-native approach that promises genuine agility and economic scalability.

The first cloud era: Monoliths in virtual clothing

The first wave of cloud transformation in banking was, in reality, dominated by lift-and-shift strategies. Traditional core systems, designed for on-premises environments, were virtualised and deployed into cloud infrastructure with minimal architectural change.

This reduced physical infrastructure management but left the underlying economics largely untouched. A monolith in the cloud remains a monolith. Scaling is blunt and expensive. If account opening volumes surge, banks cannot scale that capability in isolation; they must scale the entire platform, replicating functions that are not under pressure. The result is inefficient capacity expansion and rising cost.
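
The economics can be made concrete with a deliberately simplified, purely illustrative calculation; the capability names and per-instance costs below are hypothetical:

    # Illustrative only: compare the cost of scaling a monolith, which can
    # only be replicated whole, with scaling a single capability in isolation.
    COMPONENT_COST = {            # hypothetical monthly cost per instance
        "account_opening": 100,
        "payments": 300,
        "lending": 250,
        "reporting": 150,
    }

    def monolith_scale_cost(extra_instances: int) -> int:
        # Every capability is duplicated, whether or not it is under load.
        return extra_instances * sum(COMPONENT_COST.values())

    def component_scale_cost(component: str, extra_instances: int) -> int:
        # Only the capability under pressure is scaled.
        return extra_instances * COMPONENT_COST[component]

    # A surge in account opening that needs three extra instances:
    print(monolith_scale_cost(3))                      # 2400
    print(component_scale_cost("account_opening", 3))  # 300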

Many of these systems also continue to rely on end-of-day or near-batch processing, limiting real-time insight and responsiveness. It is therefore unsurprising that many CIOs now view cloud migration with a degree of scepticism. Costs escalate, promised agility fails to materialise, and boards begin to question whether the transformation delivered its intended value.

Modular architectures: Progress or a false dawn?

The industry’s next response was modularisation. By decomposing the monolith into functional modules – such as deposits, lending or payments – vendors promised greater flexibility. In principle, banks could adopt only the components they needed and evolve incrementally.

This represented progress, but only to a point. Modular cores often introduce new forms of rigidity. Each additional module brings incremental cost, licensing complexity, and integration overhead. More importantly, many modular architectures fail to isolate resource demand effectively at a technical level.

A common example is transaction processing. In many systems, transaction ingestion and reporting workloads remain tightly coupled to the same underlying data stores. A spike in reporting activity forces banks to scale ingestion capacity as well, and vice versa. The result is distorted economics and inefficient use of infrastructure.
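
A minimal sketch of the alternative, using hypothetical service names and in-memory stand-ins for the data stores, separates the two workloads: ingestion writes to its own store and publishes events, while reporting builds a separate read model from those events, so each side can scale against its own demand:

    import queue

    class IngestionService:
        """Write path: accepts transactions and emits events."""
        def __init__(self, event_bus: queue.Queue):
            self.ledger = []             # stand-in for a write-optimised store
            self.event_bus = event_bus

        def ingest(self, txn: dict) -> None:
            self.ledger.append(txn)      # fast append-only write
            self.event_bus.put(txn)      # publish for downstream read models

    class ReportingService:
        """Read path: builds its own query-optimised view from events."""
        def __init__(self, event_bus: queue.Queue):
            self.totals = {}             # stand-in for a read-optimised store
            self.event_bus = event_bus

        def consume(self) -> None:
            while not self.event_bus.empty():
                txn = self.event_bus.get()
                acct = txn["account"]
                self.totals[acct] = self.totals.get(acct, 0) + txn["amount"]

    bus = queue.Queue()
    ingestion, reporting = IngestionService(bus), ReportingService(bus)
    ingestion.ingest({"account": "A1", "amount": 250})
    reporting.consume()
    print(reporting.totals)              # {'A1': 250}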

Modularity, in practice, often becomes monolithic thinking expressed in smaller units. It reduces immediate pressure but does not fundamentally change the operating model of the core.

The microservices era: Valuable, but not sufficient

As banks pushed further into cloud-native territory, many turned to microservices architectures. Microservices delivered genuine benefits. They enabled independent scaling of technical capabilities, improved deployment velocity, and allowed teams to work more autonomously. For infrastructure and platform concerns, microservices played a critical role in modernising how banking systems are built and operated.

However, microservices were often treated as a universal design principle rather than a contextual one. In core banking, this sometimes led to over-decomposition: financial products and behaviours were split across dozens of fine-grained services, each with its own data model, APIs, and operational lifecycle.

This approach works well for capabilities that are naturally independent. It is far less effective for financial products, whose constituent elements – balances, interest, fees, schedules, limits, and events – are inherently interdependent and must interact in real time. Designing a single product from tens of microservices introduces excessive coordination, chatty communication patterns, and significant effort to reassemble a coherent data model at runtime.

The result is unnecessary complexity. Stitching financial state back together across service boundaries adds latency, operational risk, and development overhead. It also makes it harder to reuse common product concepts consistently, as shared behaviour becomes fragmented across services rather than expressed once at the product level.
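
A caricature of the pattern, with hypothetical functions standing in for remote calls to separate services, shows how a single product query fans out across service boundaries:

    # Caricature of over-decomposition: each function stands in for a
    # remote call to a separate fine-grained service (all hypothetical).
    def get_balance(account_id: str) -> float:
        return 1000.00                   # network hop 1: balance service

    def get_accrued_interest(account_id: str) -> float:
        return 4.10                      # network hop 2: interest service

    def get_pending_fees(account_id: str) -> float:
        return 12.00                     # network hop 3: fee service

    def product_position(account_id: str) -> dict:
        # Three calls, three failure modes, and runtime reassembly of
        # state that belongs to a single financial product.
        return {
            "balance": get_balance(account_id),
            "interest": get_accrued_interest(account_id),
            "fees": get_pending_fees(account_id),
        }

    print(product_position("A1"))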

Microservices remain a powerful tool for scaling systems and teams. But they are not, on their own, a suitable abstraction for modelling financial products. Product design requires composability without fragmentation – reusability without constant orchestration.

Molecular banking: From modules and microservices to micro-components

Molecular banking represents a more fundamental architectural shift. Instead of organising systems around coarse-grained modules, molecular architectures decompose the core into its smallest functional components – the equivalent of product DNA – which can be configured, combined, and scaled independently in real time.

Rather than defining products as static constructs, molecular systems define granular configuration groups: interest calculation logic, repayment schedules, fee models, term structures, and behavioural rules. These components can be recombined dynamically and applied across different products, ledgers, and customer contexts without standing up new modules or bespoke code paths.
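
A minimal sketch of the idea, using hypothetical component and product names, might model a product as a composition of small, reusable configuration objects, with sub-ledgers attached to an existing account (anticipating the buy-now-pay-later example below):

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Interest:
        annual_rate: float               # e.g. 0.0 for an interest-free facility

    @dataclass(frozen=True)
    class Fee:
        name: str
        amount: float

    @dataclass(frozen=True)
    class Schedule:
        instalments: int                 # number of repayments
        frequency_days: int              # days between repayments

    @dataclass
    class Product:
        name: str
        components: list = field(default_factory=list)
        sub_ledgers: list = field(default_factory=list)

    # Reusable components, shared across products rather than re-coded:
    zero_interest = Interest(annual_rate=0.0)
    late_fee = Fee(name="late_payment", amount=12.0)
    four_payments = Schedule(instalments=4, frequency_days=14)

    # A buy-now-pay-later facility assembled from existing components and
    # attached as a sub-ledger of an existing current account:
    bnpl = Product("bnpl", components=[zero_interest, late_fee, four_payments])
    current_account = Product("current_account", components=[Interest(0.01)])
    current_account.sub_ledgers.append(bnpl)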

This approach enables entirely new forms of product design. A buy-now-pay-later facility can be introduced as a sub-ledger within an existing current account. Hybrid products can combine deposit, payment, and credit functionality without creating additional core products. Product innovation becomes an exercise in configuration rather than construction.

The implications are material:

  • Precision scaling and cost control: Only the components under pressure need to scale, aligning infrastructure cost more closely with actual usage.
  • Iterative product innovation: Banks can launch minimum viable products quickly, observe real-world behaviour, and iterate in short cycles.
  • Parallel experimentation: Multiple product variants can run concurrently, with real-time insight informing which designs to scale and which to retire.
  • Customer-level personalisation: Product parameters can adapt dynamically, enabling personalised financial experiences at scale, including those shaped by AI-driven agents.

Migration without the shock

One of the most persistent barriers to core transformation is migration risk. Large-scale, ‘big bang’ core replacements have a long history of cost overruns and operational disruption.

Molecular banking enables a different path. Products and capabilities can be migrated incrementally, one at a time, while legacy and modern systems coexist. Over time, usage naturally shifts toward the new architecture without requiring a single, high-risk cutover. This approach reduces operational risk, spreads cost, and allows banks to modernise pragmatically.
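
One way to picture this coexistence, with hypothetical names throughout, is a thin routing layer that directs each product to whichever core currently owns it; products move to the new platform one at a time simply by updating the routing set:

    # Hypothetical routing layer: each product is served by whichever core
    # currently owns it, so migration proceeds one product at a time.
    MIGRATED_PRODUCTS = {"savings", "bnpl"}      # migrated so far

    def handle_on_new_core(product: str, request: dict) -> str:
        return f"new-core:{product}"             # stand-in for the real call

    def handle_on_legacy_core(product: str, request: dict) -> str:
        return f"legacy-core:{product}"          # stand-in for the real call

    def route(product: str, request: dict) -> str:
        if product in MIGRATED_PRODUCTS:
            return handle_on_new_core(product, request)
        return handle_on_legacy_core(product, request)

    print(route("savings", {}))                  # new-core:savings
    print(route("mortgage", {}))                 # legacy-core:mortgage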

Future-proofing for an AI-native era

Banking is entering an era characterised by AI-enabled services, real-time decisioning, and hyper-personalised financial management. Architectures built around static product definitions and batch processing will struggle to keep pace.

Molecular banking provides a foundation that is inherently compatible with AI-native models. Granular, parameterised components can be reasoned over, tested, and optimised algorithmically. Product managers gain the ability to evolve offerings with the same discipline and velocity that software teams apply to code. The core system itself evolves from a system of record into a system of continuous innovation.

The industry must be realistic about the limits of past approaches. Cloud hosting did not resolve the structural constraints of the monolith. Modularisation improved flexibility but stopped short of true architectural change.

As customer expectations accelerate and AI-driven capabilities become standard, these compromises will become increasingly visible. Molecular banking is not a wrapper or an incremental enhancement; it is a re-engineering of the core operating model. By embracing granular, composable, and independently scalable components, banks can align their core systems with the realities of modern financial services.

Institutions that adopt molecular architectures will be better equipped to innovate continuously, personalise at scale, and adapt as technology and customer expectations evolve. Those that do not will remain constrained by core systems optimised for historical operating models, limiting their ability to respond effectively to the demands of modern and future financial services.

Paul Payne, CTO at SaaScada