Leveraging Generative AI to Accelerate and Personalize Client Onboarding
Banking clients today demand quick, easy and personalized onboarding experiences, a demand shared by retail, private and corporate clients alike. Client onboarding is a critical process for ensuring financial institutions (FIs) onboard the right clients; to meet client expectations, however, it must also be seamless and swift. Built on traditional, decades-old and often highly regulated processes, client onboarding can be complex and time-consuming. That is a disadvantage in a globalized, digital world where FIs must compete aggressively to win market share from competitors and retain their own share of wallet.
Most advisors looking to add to their books of business can get clients to the main door; getting them through the door, however, is another challenge altogether. Although compliance teams now have a myriad of RegTech tools in their arsenal to accelerate the process (e.g., know your customer (KYC) checks, OFAC sanctions screening, risk engines and transaction monitoring systems), onboarding new private banking clients is still an involved process.
While many banks aim to complete the onboarding process in about a week, it is not uncommon to hear of onboarding lasting 20 days or longer, depending on the complexity of the client. Could these figures be improved if banks incorporated large language models such as ChatGPT, a form of generative artificial intelligence (AI), into the client onboarding process?
Incorporating Generative AI into the Client Onboarding Process
Traditionally, the client onboarding process is led by bankers who guide their clients on obtaining source of wealth information and other personal documentation. They also help clients with mapping complex investment structures (i.e., identifying ultimate beneficial owners, personal investment companies, offshore holding companies, etc.) and acquiring related corroborating evidence.
The time spent on and intricacy of this process is magnified for clients with highly complex structures or business arrangements. Compliance officers often conduct enhanced due diligence on these clients, which requires additional processes that can further delay onboarding.
Generative AI can act as a “voice of compliance” during the onboarding process, giving bankers the opportunity to initiate conversations and receive instant feedback and verification of information, akin to speaking with a personal compliance officer. Importantly, safeguards should be put in place (such as human approval whenever the model is asked to make a decision) to prevent deviations and hallucinations.
These AI models can be trained on comprehensive use cases and information from historical and current onboarding data, learning to tailor scenarios to each individual onboarding case. This, in turn:
- Enables bankers to verify the completeness and accuracy of client information upfront during the discovery phase, thereby saving time on required follow-up.
- Helps bankers validate key questions that are typically asked by compliance officers, eliminating bottlenecks in onboarding.
- Validates information and documents against a checklist adapted for each unique case, reducing the need for back-and-forth between bankers and compliance officers, again speeding up the overall onboarding process.
- Enables faster decisions by compliance officers and helps compliance officers quickly flag unique cases that require special attention, further reducing bottlenecks.
- Enables personalized onboarding experiences, with each client case treated uniquely.
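The upfront completeness check in the first bullet can be pictured as validating collected client information against a checklist adapted to the client's segment, with a human closing the gaps before compliance ever sees the case. The following is a minimal sketch of that idea; the segments, field names and rules are invented for illustration and are not a real bank's schema.

```python
# Hypothetical sketch: check client information against a segment-specific
# checklist before it reaches a compliance officer. Field names and segment
# definitions below are illustrative assumptions, not a real bank's schema.

REQUIRED_FIELDS = {
    "retail": ["full_name", "address", "id_document"],
    "private": ["full_name", "address", "id_document",
                "source_of_wealth", "beneficial_owners"],
}

def missing_items(client_profile, segment):
    """Return checklist items still missing or empty for this segment."""
    checklist = REQUIRED_FIELDS.get(segment, REQUIRED_FIELDS["retail"])
    return [item for item in checklist if not client_profile.get(item)]

profile = {"full_name": "Jane Doe", "address": "1 Main St",
           "id_document": "passport"}
gaps = missing_items(profile, "private")  # banker can close these gaps upfront
```

In practice, the checklist itself could be generated per case by the AI model and reviewed by a human, while the mechanical gap check stays deterministic.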
How Generative AI Can Boost KYC and AML Processes
Broadly, AI can help banks automate KYC and anti-money laundering (AML) processes by analyzing large amounts of client data, including personal information, transaction history, adverse news, OFAC sanctions screening results and client document validation. Upon detecting discrepancies or sanctions hits across these datasets, the AI model can automatically alert the compliance team.
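The screening-and-flagging step described above can be sketched as a simple rules pass over a client record that collects findings for compliance review. This is an illustrative stand-in only: the watchlist is placeholder data, and real screening relies on official OFAC lists and fuzzy name matching rather than exact lookups.

```python
# Illustrative sketch: check a client record for discrepancies and possible
# sanctions hits, collecting findings so the compliance team can be prompted
# automatically. The watchlist is placeholder data, not real sanctions data.

WATCHLIST = {"acme holdings ltd"}  # placeholder; real screening uses OFAC data

def screen_client(record):
    """Return findings that should trigger a compliance review."""
    findings = []
    if record.get("name", "").lower() in WATCHLIST:
        findings.append("possible sanctions hit")
    if not record.get("id_verified", False):
        findings.append("identity document not verified")
    if record.get("declared_income", 0) < 0:
        findings.append("discrepancy: negative declared income")
    return findings

findings = screen_client({"name": "Acme Holdings Ltd", "id_verified": True,
                          "declared_income": 100000})
```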
Generative AI can analyze this information and generate a large pool of potential scenarios and test cases for monitoring and training purposes. Compliance teams can be fully equipped with various case permutations, which helps accelerate decision making and identify critical cases for prioritization.
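To make the idea of "case permutations" concrete, a simple enumeration over a few risk dimensions already yields a sizable scenario pool; a generative model would produce richer, narrative variants on top of this. The dimensions below are invented for illustration.

```python
# Sketch of scenario generation for monitoring and training: enumerate case
# permutations across a few risk dimensions, as a simple stand-in for the
# larger pools a generative model could produce. Dimensions are invented.

from itertools import product

RISK_TIERS = ["low", "medium", "high"]
RESIDENCY = ["domestic", "offshore"]
STRUCTURES = ["individual", "trust", "holding company"]

scenarios = [
    {"risk": r, "residency": d, "structure": s}
    for r, d, s in product(RISK_TIERS, RESIDENCY, STRUCTURES)
]
# 3 tiers x 2 residencies x 3 structures = 18 permutations
```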
In addition, generative AI can be programmed to learn from and analyze global events and potential economic and political risks, serving as an impact analysis tool for the bank. It can also develop various use cases and trigger alerts to bankers, investment managers and compliance officers. To safeguard against AI hallucinations, the alerts can be approved or overruled by humans. Leveraging its self-learning capabilities, the generative AI model can continuously learn from false alerts and improve its predictive processes over time.
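The human-in-the-loop alert flow above can be pictured as a queue: model-raised alerts wait for a human decision, and overruled alerts are recorded as false alerts that the model side could later learn from. The class and its fields are invented for illustration.

```python
# Hedged sketch of a human-in-the-loop alert flow: alerts wait for a human
# decision; overruled alerts are kept as the feedback signal the model could
# use to reduce future false positives. All names here are illustrative.

from dataclasses import dataclass, field

@dataclass
class AlertQueue:
    pending: list = field(default_factory=list)
    false_alerts: list = field(default_factory=list)

    def raise_alert(self, message):
        self.pending.append(message)

    def review(self, message, approved):
        """A human approves or overrules each alert; overruled alerts feed
        the feedback loop used to improve the model's precision."""
        self.pending.remove(message)
        if not approved:
            self.false_alerts.append(message)

queue = AlertQueue()
queue.raise_alert("unusual wire activity for client 42")
queue.review("unusual wire activity for client 42", approved=False)
```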
Complementing, Rather than Replacing, Humans
Contrary to common belief, AI has not replaced human input. Rather, it is being deployed in a complementary role, enhancing the working experience of bankers and advisors. AI chatbots have evolved rapidly to provide additional self-service options working alongside traditional customer service agents. For example, chatbots in bank call centers can recognize spoken words in different accents. Based on natural language processing inputs, the chatbot can automatically route cases to task-specific customer service officers or direct customers to self-service portals for less urgent matters (e.g., a change of address). This reduces the load on the call center and frees up the line for more urgent or time-consuming cases, like fraud.
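The routing step can be sketched as mapping a caller's transcript to a destination queue, assuming an upstream speech recognition step has already produced the transcript. Keyword matching here stands in for a real natural language classifier, and the queue names are invented.

```python
# Minimal sketch of intent-based call routing. Keyword matching is a stand-in
# for a real NLP intent classifier; destination queue names are invented.

ROUTES = [
    ("fraud", "fraud_team"),             # urgent: human specialist
    ("card", "card_services"),
    ("address", "self_service_portal"),  # low urgency: self-service
]

def route_call(transcript):
    """Pick a destination queue from the caller's transcript."""
    text = transcript.lower()
    for keyword, destination in ROUTES:
        if keyword in text:
            return destination
    return "general_queue"  # fall back to a human agent
```

Ordering the routes by urgency means a transcript mentioning both fraud and a routine matter still reaches the fraud team first.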
One particularly challenging issue for banks is ensuring consistency in the evaluation of cases and documentation. For example, if two compliance officers were given the same set of information and documentation about a certain medium-risk client, one officer might be content with straightforward and recent corroborating evidence on the source of wealth, while the other might request more detailed evidence spanning decades. This raises the question, “How much evidence is enough evidence to onboard a client?”
While largely driven by applicable regulation, the answer is also somewhat subjective. This is where generative AI may be useful: it can formulate various scenarios and permutations from a given set of information to support decision making, and it can bring consistency to the evaluation process. Ultimately, however, it remains up to each compliance officer’s human judgment to make the call.
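One way consistency can be supported is to have both officers start from the same evidence baseline derived from the client's risk tier, with human judgment adding requirements on top. The tiers and evidence items below are invented for illustration; real requirements come from policy and regulation.

```python
# Illustrative sketch of a shared evidence baseline per risk tier, so two
# officers reviewing the same client start from the same minimum requirements.
# Tiers and evidence items are invented; officers may still request more.

EVIDENCE_BASELINE = {
    "low": ["recent bank statement"],
    "medium": ["recent bank statement", "source-of-wealth letter"],
    "high": ["recent bank statement", "source-of-wealth letter",
             "multi-year corroborating records"],
}

def required_evidence(risk_tier):
    """Return the minimum evidence set; unknown tiers default to 'high'."""
    return EVIDENCE_BASELINE.get(risk_tier, EVIDENCE_BASELINE["high"])
```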
Conclusion
Bankers want to onboard clients quickly, and compliance teams want to ensure that the client is legitimate and safe to onboard. Generative AI is a powerful technology that can help banks accelerate client onboarding, comply with regulatory requirements, manage risks and reduce fraud.
Human input and judgment are still required to mitigate the risk of AI hallucinations, but the advantages outweigh those concerns, which can be addressed with a robust operating model that has checks in place. Since faster onboarding means reduced cost of acquisition, reduced revenue leakage and happier customers, banks still on the fence about embracing solutions like generative AI risk falling behind their competitors.