Remember paging through encyclopedias as a kid to find the information you needed for a project? The books were musty and outdated, and volumes were often missing. We’ve come a long way since then thanks to technology. Our kids have information right at their fingertips, literally.

I wish the same were true for financial services customer care staff. With today’s knowledge management systems and processes, they’re still chasing information the way we used to. In this case, though, there are major consequences when information is old, inaccurate or unavailable. Problems aren’t solved quickly; some aren’t solved at all. Customers and employees have poor experiences. And companies end up spending a lot of money on compliance and remediation. Nobody wins.

When call center reps are hired, they face a firehose of information about where, when and how to find answers. The deluge is overwhelming. And after this initial training, it can be difficult for employees to keep track of and assimilate all of the new information about process flows from daily stand-ups. Their knowledge can become stuck in time, despite the fact that their employers have made big investments in training.

Artificial intelligence (AI) can change all of this. (In previous blogs in this series, I’ve written about other ways AI can impact customer care.) I am most excited about AI’s potential to reduce investments in training, shorten the ramp-up to proficiency, improve compliance and customer experience, and provide new opportunities for the workforce.

One way that AI streamlines knowledge management is through its ability to quickly factor in contextual information. A basic example: when someone from North Carolina calls in to ask about account fees, the representative is automatically directed to information and regulatory policies for that state only. It seems simple, yes, but it is not always happening in call centers today.
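The state-based routing described above can be sketched in a few lines. This is a minimal, hypothetical illustration, assuming a simple mapping from a caller’s state to relevant policy documents; the data, document names and the `route_knowledge` function are illustrative, not part of any real system.

```python
# Hypothetical sketch of context-aware knowledge routing: the caller's
# state narrows the knowledge base before the representative searches it.
STATE_POLICIES = {
    "NC": ["NC account fee schedule", "NC overdraft disclosure rules"],
    "NY": ["NY account fee schedule", "NY surcharge regulations"],
}

def route_knowledge(caller_state: str, topic: str) -> list:
    """Return only the policy documents relevant to the caller's state."""
    docs = STATE_POLICIES.get(caller_state, [])
    # Filter the state's documents down to the topic the caller asked about.
    return [d for d in docs if topic.lower() in d.lower()]
```

A North Carolina caller asking about fees would see only `["NC account fee schedule"]`; a production system would of course pull this context from telephony and CRM data rather than a hard-coded table.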

AI-powered chatbots can also take over routine queries, allowing representatives to do more meaningful work—and build better customer relationships. Just imagine if, instead of navigating countless decision trees or asking the person in the next chair, a representative could query a bot to get customers the information they need. In a world where we rely on Alexa to turn off the lights and ask virtual agents to place retail orders, this is not so far-fetched. And with AI constantly learning on the job, it can do more than locate information with its sophisticated search capabilities; it can eventually anticipate exactly what representatives need to know.

We are seeing some financial services companies rolling out chatbots now, often internally first to support an IT help desk or human services function. For example, here at Accenture, we developed a chatbot that provides employees with an anonymous and interactive way to access information related to our Code of Business Ethics.1 Once chatbots have been trained, companies can make them available externally. Saudi Arabia’s largest Islamic bank is using a virtual agent that understands and speaks Arabic and takes calls through the bank’s IVR.

Getting results from AI-powered chatbots requires more than plugging them in. It requires a multi-dimensional approach. Process re-engineering is key; it eliminates the problem of putting good bots on bad processes. Companies also need the right people to train the bots, change management programs to acclimate people to new ways of working, and high-quality information so people trust the knowledge they get from the bots.

AI is an exciting alternative to spending millions on knowledge management only to get low adoption and high maintenance costs. And it’s a whole lot more exciting than those musty encyclopedias.

As always, I hope you’ll keep following my blog. Watch for my last post in this AI series, which will explore the impact of AI on leadership in financial services customer care.


1 “Accenture Reimagines its Code of Business Ethics Through Intelligent Technology,” Accenture, December 20, 2017.
