IDG Contributor Network: Bank battle for innovation and market share needs huge live data crunching


When there isn’t much else to choose between brands, customer service becomes an important differentiator. As regulators continue to make it easier for customers to switch providers, financial institutions must spend as much time keeping existing account holders happy as they do wooing new ones. Issuing apps and making it easier for customers to bank and source products online is a good start, but account holders will soon notice, and defect, if such moves are really a thinly disguised attempt to cut costs and close branches.

Research has found that nearly two-thirds of consumers perceive little or no differentiation in products and services across the banking sector; the last major technical innovation in banking was, after all, the ATM. Bain has demonstrated a clear correlation between a bank’s Net Promoter Score (the willingness of customers to recommend a company’s products or services to others) and how customers rate simple things such as the ease of bill-paying.
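For readers unfamiliar with the metric, Net Promoter Score is computed from answers to a single 0–10 "would you recommend us?" question: respondents scoring 9–10 are promoters, 0–6 are detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch (the survey data is hypothetical):

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'would you recommend us?' ratings.

    Promoters score 9-10, detractors 0-6 (7-8 are passive); NPS is the
    percentage of promoters minus the percentage of detractors, giving
    a value between -100 and +100.
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical survey of ten account holders
print(net_promoter_score([10, 9, 9, 8, 7, 7, 6, 5, 9, 10]))  # → 30
```

The point of Bain's finding is that this single number tracks mundane experience details such as bill-paying, so improving those details moves the score.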

Institutions are well aware of this: it is no coincidence that removing friction from the customer journey has been identified as the most significant trend in the industry. Improved use of data and advanced analytics ranked almost as highly, suggesting the two are closely connected. Yet without advanced use of technology and data to support innovation, costs will soar as established players try to play catch-up.

Smarter services need smarter data processes

A number of leading organizations, from Bank of America to Barclays Africa, are currently trialing artificial intelligence to improve the customer experience by accelerating the resolution of queries and the completion of tasks. Beyond banking, insurance companies are embracing technology to simplify the claims process and offer lower premiums, as long as customers agree to connect their cars, boilers and home alarm systems so they can be monitored.


The big challenge is that many of these emerging applications and big ideas rely on the ability to process huge volumes of data in real time, so that the next action can happen immediately. This is not an after-hours number-crunching exercise.
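To make the batch-versus-real-time distinction concrete, here is a minimal stream-processing sketch: each transaction is scored against an account's running average the moment it arrives, instead of being reconciled in an overnight job. The account names, amounts and threshold are all illustrative assumptions, not a real bank's rules:

```python
from collections import defaultdict

def process_stream(transactions, threshold=3.0):
    """Flag transactions as they arrive, not in an after-hours batch.

    A transaction is flagged if it exceeds `threshold` times the
    account's running average of prior amounts (illustrative rule).
    """
    totals = defaultdict(lambda: [0, 0.0])  # account -> [count, sum]
    flags = []
    for account, amount in transactions:
        count, total = totals[account]
        if count and amount > threshold * (total / count):
            flags.append((account, amount))  # act now, while it matters
        totals[account] = [count + 1, total + amount]
    return flags

# Two ordinary payments, then one far outside the running average
print(process_stream([("a1", 20.0), ("a1", 25.0), ("a1", 400.0)]))
# → [('a1', 400.0)]
```

The logic itself is trivial; the engineering problem the article describes is running it over millions of such events with low latency, which is where on-premises capacity runs out.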

And this isn’t something that can be achieved within a private data center – even one owned by a large bank. The scale and performance required would make this far too expensive, by up to a factor of 20 compared with using the cloud, if indeed it were possible to build out the infrastructure in a reasonable timeframe. (It isn’t.)

But nor is it something that can be easily moved offsite – or at least not without special provision. That’s because the data, being live, needs to exist in more than one place at the same time: in core systems within the bank’s main data center; and in the public cloud where the rapid analysis really needs to happen.
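One common pattern for keeping live data in two places at once is change-data-capture: every write lands in the on-premises system of record and is simultaneously published to a feed that a cloud-side consumer drains into an analytics copy. The sketch below illustrates the idea in-process with a queue and a thread; all names are assumptions, and a real deployment would use a durable log rather than an in-memory queue:

```python
import queue
import threading

core_store = {}                   # system of record, on-premises
replication_log = queue.Queue()   # stand-in for a durable change feed
cloud_replica = {}                # eventually consistent cloud copy

def write(account, balance):
    """Apply a write locally and publish it for cloud-side analysis."""
    core_store[account] = balance
    replication_log.put((account, balance))

def replicate():
    """Drain the change feed into the replica; None is the stop signal."""
    while True:
        item = replication_log.get()
        if item is None:
            break
        account, balance = item
        cloud_replica[account] = balance

consumer = threading.Thread(target=replicate)
consumer.start()
write("acct-1", 120.50)
write("acct-1", 95.25)
replication_log.put(None)  # shut down the consumer
consumer.join()
print(cloud_replica["acct-1"])  # → 95.25
```

Because the feed is ordered, the replica converges on the latest core value; the "special provision" the article alludes to is exactly this kind of pipeline, hardened for latency, durability and regulatory requirements.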