Financial Services Blog

In the previous blogs in this series, we introduced the concept of Integrated Risk Management (IRM) and outlined the capabilities needed to implement a modern IRM framework. In this blog, we take a closer look at the foundation of IRM: the ability to organize, share, and analyze data.

To realize the benefits of today’s innovative tools and customer experience offerings used for IRM, organizations need to ensure their data and processes are of high quality and ready for interoperability. This means the data itself should be treated as a matter of basic hygiene: standardized for consistency and made available for sharing across all three lines of defense, as well as with senior management, so that meaningful insights can be established. In short, risk data is the cornerstone of effective risk technology and of a risk function capable of providing transparency.

In addition to quality data, there are three additional elements needed for IRM:

  1. AI- and machine-learning-driven tools and innovation. On their own, tools are important for IRM but limited in the benefits they can provide. With the addition of Artificial Intelligence (AI) and Machine Learning (ML) capabilities, however, these tools can deliver even greater value. For example, workflow accelerators can leverage AI and ML technologies to deliver meaningful real-time insights at relatively low cost.
  2. Fully integrated processes. Some companies have succeeded at integrating processes and reducing siloed activities, but have not carried this over to the supporting technology. This leaves an opportunity to complete the integration and speed access to required data by consolidating and rationalizing platforms so that data and information are directly and readily accessible across processes.
  3. Exceptional user experience. Risk functions rely on the people who manage the data and assess risk for the organization. Risk management teams given inadequate tools with awkward, counterintuitive interfaces are more likely to feel unmotivated to perform critical risk management activities. A simple yet flexible user interface on the IRM system can encourage users to carry out their responsibilities more effectively and efficiently, leading to better overall risk management. This is why user experience has become a major differentiator in selecting and implementing IRM technology.
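To make the first point above concrete, the sketch below flags unusual spikes in a stream of daily loss-event counts using a simple z-score test. This is a deliberately simplified stand-in for the kind of statistical scoring an AI/ML-driven workflow accelerator might apply in real time; the function name, threshold, and data are illustrative assumptions, not part of any particular product.

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.5):
    """Return the values whose z-score exceeds the threshold.

    A minimal stand-in for the statistical scoring a workflow
    accelerator could apply to incoming risk metrics.
    """
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # no variation, nothing to flag
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Illustrative data: daily loss-event counts with one obvious spike.
daily_events = [4, 5, 3, 6, 4, 5, 48, 4, 5, 3]
print(flag_anomalies(daily_events))  # [48]
```

A production tool would of course use richer models than a z-score, but the workflow is the same: score incoming data continuously and surface only the outliers to the risk team.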

All of the elements above are essential but add little value in the absence of data that is properly governed, well-integrated and easily consumable. As a result, an IRM solution needs to take the following additional considerations into account:

  • Sources and ingestion – Accurate internal and external data needs to come from well-defined, authoritative sources that are verified and connected to the ecosystem. Cloud-based data ingestion and processing promotes flexibility and agility.
  • Architecture – Taxonomies are foundational to IRM data architecture, so careful consideration needs to be given to the design of the data model so that products, channels, and organizational structures can be aligned to it. This alignment provides a common framework for identifying, measuring, managing, and monitoring risk.
  • Integration – Standardized taxonomies and hierarchies facilitate full integration across all risk modules in the ecosystem so that risk functions can fully utilize their inputs and outputs. To support this, capabilities should be harmonized across technology platforms using a “single source of truth,” common reporting, and robust data sharing. The goal is to streamline the process of responding to complex internal and external requirements.
  • Centralized location – A trusted, governed source of truth is required for all IRM ecosystem modules. To achieve this, internal and external data should ideally be sourced into one or more common data lakes that serve as the enterprise risk data hub and can be shared across domains. Efficiency and data quality can also be enhanced by using natural language processing (NLP) and natural language generation (NLG) to clean and move appropriate, accurate data into the hub.
  • Consumption – Robust business intelligence tools can produce reporting and analytics that improve the value and effectiveness of aggregated data and information. Firms pursuing IRM would benefit from focusing on outbound consumption, which is often overlooked but essential to a sustainable, widely adopted ecosystem.
  • Governance – As alluded to in our previous two blogs, robust data governance is critical for IRM; in many cases it improves naturally as the risk function consumes standardized datasets from authoritative sources. This provides guardrails for the inputs that guide decision-making.
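The architecture, integration, and centralization points above can be sketched in a few lines of code. The snippet below models a shared risk taxonomy that every record must conform to before it enters a central hub; the taxonomy entries, class names, and fields are simplified assumptions for illustration, not a prescribed design.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, simplified two-level taxonomy. A real taxonomy would be
# far richer and maintained under central governance.
RISK_TAXONOMY = {
    "operational": {"process_failure", "fraud", "system_outage"},
    "credit": {"default", "downgrade"},
    "market": {"rate_shock", "fx_move"},
}

@dataclass(frozen=True)
class RiskEvent:
    category: str      # top-level taxonomy node
    subcategory: str   # second-level taxonomy node
    business_unit: str
    occurred: date

    def __post_init__(self):
        # Reject records that do not conform to the shared taxonomy,
        # so every module in the ecosystem speaks the same language.
        if self.subcategory not in RISK_TAXONOMY.get(self.category, set()):
            raise ValueError(
                f"{self.category}/{self.subcategory} is not in the taxonomy"
            )

class RiskDataHub:
    """Minimal stand-in for a governed, centralized risk data store."""

    def __init__(self):
        self._events = []

    def ingest(self, event: RiskEvent):
        self._events.append(event)

    def by_category(self, category: str):
        return [e for e in self._events if e.category == category]

hub = RiskDataHub()
hub.ingest(RiskEvent("operational", "fraud", "retail-bank", date(2023, 1, 15)))
print(len(hub.by_category("operational")))  # 1
```

Validating against the taxonomy at the point of ingestion is what makes downstream sharing workable: every consumer, from the first line of defense to senior-management reporting, can query the hub using the same categories.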

Data is the foundation of IRM, but it needs to be properly managed within the ecosystem for maximum effectiveness. In the next blog in this series, we will look in more detail at another element of IRM: technology and innovation.
