With big data comes big responsibility, especially in financial services.
In my earlier posts, I introduced the five key technology trends identified in our Technology Vision 2018 survey and gave some insights into the potential uses of artificial intelligence (AI) and extended reality (XR) technologies in financial services.
In this post, I will look at the core principle that needs to form the basis for all these new key technologies: data veracity.
Data is driving more and more critical decisions
Data isn’t new territory for banks. Given their pivotal place at the center of almost all commercial activity, banks have always had wide access to all sorts of customer data. However, as the Internet of Things (IoT), Open Banking application programming interfaces (APIs) and AI increase the number of sources and the volume of data, banks are faced with both the upside and the downside of this privilege.
While big data and analytics offer financial services institutions the opportunity to create increasingly personalized services, the fears around incorrect or falsified data, data hacks and identity theft are all justified.
In our Technology Vision 2018 survey:¹
- More than 84 percent of surveyed bankers agreed that their organizations are increasingly using data to drive critical and automated decision-making.
- However, 77 percent also agreed that most organizations are not ready to confront the impending waves of corrupted insights, bad decisions, and potential compliance failures that could occur as falsified data starts to infiltrate their information systems.
Reliable data is indispensable
In addition to the traditional data they have always held on customers, partners, services, suppliers and products, banks are now increasingly adding data from external “unstructured” sources, such as newly accessible government and third-party databases, and from a growing number of distribution channels, such as social media.
But even with new regulations around bank data, such as the General Data Protection Regulation (GDPR), the Revised Directive on Payment Services (PSD2), and the review of Open Banking by the Government of Canada, some banks still have a lot of work to do when it comes to data management.
In our survey:¹
- Twenty-eight percent of surveyed bankers said that, most of the time, they do not validate or examine the data they receive from ecosystem or strategic partners.
- Five percent said they rarely or never validate such data.
Key aspects of data management
How can banks protect themselves from vulnerabilities relating to data that is of questionable value at best and corrupted at worst?
By strengthening their capabilities in three key aspects of data management:
- Provenance. Verifying the history of data from its origin throughout its life cycle.
- Context. Considering the circumstances of its use.
- Integrity. Securing data and maintaining its accuracy and consistency over time.
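To make the three aspects concrete, here is a minimal sketch (in Python, with hypothetical names) of what a single data record might carry so that its provenance, context, and integrity can all be checked before it feeds a decision:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
import hashlib

# Hypothetical sketch: one record carrying all three aspects of data management.
@dataclass
class DataRecord:
    payload: bytes
    source: str                        # provenance: where the data originated
    obtained_at: datetime              # provenance: when it entered our systems
    intended_use: str                  # context: which decision it will feed
    checksum: str = field(init=False)  # integrity: detects later tampering

    def __post_init__(self):
        # Fingerprint the payload at the moment of ingestion.
        self.checksum = hashlib.sha256(self.payload).hexdigest()

    def verify_integrity(self) -> bool:
        # Recompute the hash; a mismatch means the payload has changed.
        return hashlib.sha256(self.payload).hexdigest() == self.checksum

record = DataRecord(
    payload=b'{"customer_id": 42, "balance": 1250.00}',
    source="partner-api.example.com",
    obtained_at=datetime.now(timezone.utc),
    intended_use="credit-decision",
)
assert record.verify_integrity()

record.payload = b'{"customer_id": 42, "balance": 9999.99}'  # simulated tampering
assert not record.verify_integrity()
```

Real bank data platforms are far more elaborate, but the principle is the same: provenance and context travel with the data, and integrity is something that can be re-verified at any point in the life cycle.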
The skills and tools needed to build these capabilities are within reach. In Canada in particular, we are seeing significant investments by all the banks, both in leveraging the wealth of customer data they hold and in the data governance needed to drive marked improvements in data quality and security.
With a new “data intelligence” function, it will be possible to grade the truth within data by establishing, implementing, and enforcing standards for data provenance, context and integrity. To grade data, banks will also have to develop an understanding of the “behavior” around it, and institutions should build the capability to track this behavior as data is recorded, used, and maintained.
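One way such a "data intelligence" function could grade the truth within data is by tracking the behavior of each source over time. The sketch below (hypothetical names and thresholds, not any bank's actual method) logs whether each incoming record passed validation and derives a trust grade per source from its pass rate:

```python
from collections import defaultdict

# Hypothetical sketch of a "data intelligence" ledger: each source accumulates
# a behaviour log, and a trust grade is derived from how often its records
# passed validation checks. Thresholds here are illustrative assumptions.
class TrustLedger:
    def __init__(self):
        self.events = defaultdict(list)  # source -> list of bools (check passed?)

    def record_event(self, source: str, check_passed: bool) -> None:
        self.events[source].append(check_passed)

    def grade(self, source: str) -> str:
        history = self.events[source]
        if not history:
            return "ungraded"            # no behaviour observed yet
        pass_rate = sum(history) / len(history)
        if pass_rate >= 0.99:
            return "trusted"
        if pass_rate >= 0.90:
            return "review"              # usable, but flag for human review
        return "quarantine"              # too unreliable to drive decisions

ledger = TrustLedger()
for ok in [True, True, True, False, True, True, True, True, True, True]:
    ledger.record_event("partner-api.example.com", ok)

print(ledger.grade("partner-api.example.com"))  # 9/10 checks passed -> "review"
```

The design choice worth noting is that the grade is earned from observed behavior rather than declared by the source, which is what lets falsified data lower a partner's standing automatically.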
With this understanding, banks can ensure that the vast amounts of data at their disposal can be trusted to drive critical decisions in the future.
In my next post, I will discuss another development made possible by new technologies. We call it “frictionless business.”
1. “Building the Future-Ready Bank: Banking Technology Vision 2018,” Accenture, 2018. Access at: https://www.accenture.com/us-en/insights/banking/technology-vision-banking-2018?src=SOMS.