by Eugene Morozov
On 10 July 2020

What does the new normal of ‘multi-cloud’ mean for financial institutions?

Cloud computing has been widely adopted by enterprises seeking data storage and computing power that is faster, more agile, and more scalable than on-premise solutions. Multi-cloud, applied correctly, can push these improvements even further.

This insight was originally published as a contributor article in Data Economy.

The sustained shift away from on-premise storage and computing towards cloud-based solutions has reframed the debate around cloud within the financial services industry. Fundamental concerns about security and compliance have largely dissipated, while the potential benefits of greater speed, agility, efficiency, and scalability are well understood. Choosing the right private, public, or hybrid model undoubtedly remains a significant challenge for banks’ and asset managers’ technology teams. But the overall direction of travel is clear: legacy systems migrating onto the cloud, accompanied by new cloud-native services and applications built to take advantage of the new possibilities cloud can offer.

That is not to say that all IT systems within financial markets need to become cloud-native. Some legacy operations might be better decommissioned than migrated, and in some cases complete reliance on cloud-based solutions can be cost-ineffective. Nonetheless, the pertinent question has become not whether enterprises will use the cloud, but how they will best use these services.

Multi-cloud’s value proposition

Organizations are increasingly choosing to run different parts of their business on different cloud computing services. A ‘multi-cloud’ architecture can leverage any combination of AWS, Azure, GCP, and private cloud capabilities. Anecdotally, the majority of the financial services organizations we work with are pursuing a multi-cloud architecture as part of their target operating model.

A common motivation for multi-cloud architectures is the avoidance of vendor lock-in, both from an operational risk and a cost perspective. Enterprises can select their cloud vendor and services based on value for money, rather than being captive to their incumbent provider. This, however, requires that applications be able to run in multiple environments, which introduces significant complexity and trade-offs.

One trade-off is that applications may not realize the full benefit of features unique to AWS, Azure, or GCP unless they are designed to take advantage of them. In the long term, this may be mitigated by convergence in services and the commoditization of the cloud computing market. But the current reality is that each platform is highly idiosyncratic, with a significant learning curve. Writing software that can run interchangeably on different providers means targeting a common denominator between them, which can yield diminishing returns compared with the productivity gains from platform-specific features.
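A common pattern for achieving this kind of portability is to hide provider-specific services behind a thin, application-owned interface, with a concrete adapter per cloud. The minimal Python sketch below illustrates the idea; the `ObjectStore` interface and all names in it are hypothetical, not any vendor's real SDK, and the in-memory backend merely stands in for adapters that would wrap the S3, Azure Blob Storage, or GCS client libraries.

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Provider-agnostic interface for object storage (hypothetical)."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None:
        ...

    @abstractmethod
    def get(self, key: str) -> bytes:
        ...


class InMemoryStore(ObjectStore):
    """Stand-in backend for local testing; production adapters would
    wrap each cloud provider's storage SDK behind the same interface."""

    def __init__(self) -> None:
        self._blobs = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def make_store(provider: str) -> ObjectStore:
    # The backend is chosen by configuration at deploy time, so the
    # same application code can target whichever cloud hosts it.
    backends = {"memory": InMemoryStore}
    return backends[provider]()


store = make_store("memory")
store.put("reports/2020-07-10.csv", b"id,value\n1,42\n")
print(store.get("reports/2020-07-10.csv"))  # b'id,value\n1,42\n'
```

The cost of this design is precisely the common-denominator problem described above: the interface can only expose capabilities that every targeted provider supports.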


Finance should look at and learn from other industries that are more advanced in solving this problem. Silicon Valley has led the way in pioneering containerization, orchestration, and Infrastructure-as-Code (IaC) tools. Systems and components are packaged independently, with configuration management and environment provisioning fully automated as code. All major cloud providers support tools like Docker, Kubernetes, Puppet, Chef, and Terraform, often with their own flavor of each technology, and teams have to establish new standards and ways of working.

Financial institutions need to invest in building expertise in these new technologies and rethinking their software development, deployment, and support lifecycles to take advantage. Understanding new paradigms such as microservice architectures and continuous delivery is equally important for organizations to be successful in this new world.

Cross-border complexities

The issues of data provenance and data sovereignty are also of paramount concern for multi-cloud models. As data is piped between cloud-based applications that sit on different infrastructures, it is essential to capture multiple layers of data provenance so that these flows are trustworthy and can be effectively integrated. Recent legal, regulatory, and legislative developments in jurisdictions across the globe have increased the complexity of cross-border and domiciling requirements to be tackled.

The EU General Data Protection Regulation only permits the transfer of personal data for storage or processing outside of the European Economic Area to jurisdictions with “adequate” data protections. The list of these is constantly reviewed, with fewer than 10 countries currently qualifying, and some to only limited extents.[1]  Otherwise the data “controller” – the user of the cloud service, rather than the provider – must put in place “appropriate safeguards”. As the geographic footprints of different providers’ data centers inevitably vary, enterprises adopting a multi-cloud model must be mindful of which vendor satisfies these obligations to host data locally or in compliant third countries.

But a multi-cloud model can also aid regulatory compliance. Four Australian banks account for c.80% of New Zealand’s banking system.[2]  The Reserve Bank of New Zealand has been pushing for these subsidiaries of “overseas-incorporated” banks to be independent of their parent organization – “to be able to stand on their own”, in the words of the RBNZ’s deputy-governor.[3]  Running these different businesses on different cloud computing services is a clear operational solution for this.

But for the Australian banks, regulators have cautioned against over-reliance on any one provider, and urged firms to consider Australia-hosted solutions before crossing borders. The recommended best practice is to treat each provider as a third party who could potentially be breached, and to ensure that critical business operations could not be significantly impacted in the event of a failure at the provider.[4]  This requires a thoughtful approach to multi-cloud whereby specific business functions and technical capabilities are identified and hardened against the potential risks.

Applied correctly, cloud has provided enterprises with data storage and computing power that is faster, more agile, and more scalable. Multi-cloud is a lever that organizations can pull to achieve these outcomes with greater cost efficiency. But as with many other aspects of the technically complex transition to cloud-based systems, the quality of execution will be the key determinant as to how much of that value potential is realized.


[2] p.14