Adam Lieberman, Head of Artificial Intelligence and Machine Learning, Finastra

With ChatGPT reaching 100 million users within two months of its release, generative AI has become one of the hottest topics in technology, as individuals and industries ponder its benefits and implications. This momentum is further fueled by the fact that ChatGPT has inspired a wave of new generative AI projects across industries, including the financial services ecosystem. Recently, it was reported that JPMorgan Chase is developing a ChatGPT-like software service for its customers.

On the flip side, as news stories spread about generative AI tools and applications, so do conversations about the technology's potential risks. On May 30, the Center for AI Safety released a statement, signed by over 400 AI scientists and notable leaders including Bill Gates, OpenAI CEO Sam Altman, and "The Godfather of AI," Geoffrey Hinton, expressing concern about serious potential risks.

At Finastra, we have been closely following developments in AI for many years, and our team is optimistic about what the future holds, especially for the application of this technology in financial services. In fact, Finastra's AI-related efforts are broad, touching areas ranging from financial product recommendations to mortgage process document summarization and more.

However promising AI may be, bank leaders, who are responsible for keeping customers' money safe (a job they don't take lightly), must also have a clear picture of how tools like ChatGPT differ from previous chatbot offerings, what the initial use cases of generative AI in financial institutions look like, and the risks that come with such a rapidly advancing technology.

Not Your Grandma’s Chatbot

AI is no stranger to financial services, with artificial intelligence being deployed in tasks such as customer interaction, fraud detection and analytics even before the release of ChatGPT.

However, unlike today's large language models (LLMs), previous financial services chatbots were archaic: far simpler and more rule-based than ChatGPT. In response to an inquiry, these earlier iterations would look for a matching question in a predefined list, and if no such question had been entered, they would return an irrelevant answer, an experience many of us have no doubt had.
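The lookup behavior described above can be sketched in a few lines. This is a deliberately minimal illustration, not any bank's actual chatbot: the questions, answers, and matching logic below are all hypothetical, and real rule-based systems used somewhat fuzzier matching, but the failure mode is the same.

```python
# A minimal sketch of why rule-based chatbots fall short: they answer only
# questions that match a predefined entry, so any paraphrase falls through
# to an unhelpful fallback. All entries here are illustrative.

FAQ = {
    "what is my account balance": "Your balance is shown on the Accounts page.",
    "how do i reset my password": "Use the 'Forgot password' link on the login screen.",
}

FALLBACK = "Sorry, I don't understand the question."

def rule_based_answer(question: str) -> str:
    """Return a canned answer only if the normalized question matches verbatim."""
    key = question.strip().lower().rstrip("?")
    return FAQ.get(key, FALLBACK)

# An exact question succeeds; a natural paraphrase of the same intent fails.
print(rule_based_answer("What is my account balance?"))
print(rule_based_answer("How much money do I have?"))
```

An LLM-based assistant, by contrast, maps both phrasings to the same underlying intent rather than comparing strings.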

Large language models, by contrast, understand the semantics of what a person is asking and can then provide a useful response. ChatGPT and its peers pair domain knowledge with a human-like ability to discuss topics, and they are trained at a scale that gives users a far more seamless experience than previous offerings.

Potential Use Cases

With a better understanding of how new generative AI tools differ from what came before, bank leaders need to understand the potential use cases of these innovations in their own work. The applications will no doubt multiply as the technology develops further, but initial use cases include:

Document review and summarization: Mortgage and other financial documents can be hundreds of pages long and often take at least three days for one person to review manually. With AI, that review can be reduced to seconds. Furthermore, as the technology evolves, AI models may progress from reviewing documents to actually drafting them, once trained to produce them with all of their required clauses and concepts.

Administrative tasks: Tools like ChatGPT can save bank employees meaningful time by completing tasks such as answering emails and supporting incoming tickets.

Domain expertise: The home mortgage process, for example, raises many questions for consumers who may not understand all the complex terms in applications and forms. Advanced chatbots can be integrated into the customer's digital experience to answer those questions in real time.

Considerations

While this technology has many exciting potential use cases, much remains unknown. Many of Finastra's clients, whose job it is to be risk-aware, have questions about the risks AI presents, and indeed many in the financial services industry are already moving to restrict the use of ChatGPT among employees. Based on our experience as a provider to banks, Finastra sees several key risks that bank leaders should be aware of.

Data integrity is table stakes in financial services. Customers trust their banks to keep their personal data secure. However, at this stage, it is not clear what ChatGPT does with the data it receives. That raises an even more pointed question: could ChatGPT generate a response that shares sensitive customer data? With older-style chatbots, the questions and answers are predefined, controlling what is returned. But controlling what is asked of, and returned by, the new LLMs can prove difficult. It is a top consideration that bank leaders must weigh and keep a close eye on.
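One common mitigation for the data-leakage concern above is to scrub obvious sensitive identifiers from a prompt before it ever leaves the bank for an external LLM service. The sketch below is illustrative only: the patterns are hypothetical examples, not an exhaustive or production-grade redaction policy.

```python
import re

# A hedged sketch of prompt redaction: replace likely-sensitive substrings
# with placeholder tokens before sending text to a third-party LLM.
# These patterns are examples; a real deployment would need a much
# broader PII/PCI detection strategy.

REDACTIONS = [
    (re.compile(r"\b\d{12,19}\b"), "[CARD_OR_ACCOUNT]"),      # long digit runs
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),          # US SSN format
    (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"), "[EMAIL]"),  # email addresses
]

def scrub(prompt: str) -> str:
    """Replace likely-sensitive substrings before the prompt leaves the bank."""
    for pattern, token in REDACTIONS:
        prompt = pattern.sub(token, prompt)
    return prompt

print(scrub("Summarize: card 4111111111111111, contact jane@example.com"))
```

Redaction only addresses what is sent out; what the model returns still needs separate review, which is part of what makes LLM governance harder than the predefined question-and-answer model of older chatbots.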

Ensuring fairness and guarding against bias is another important consideration. Bias in AI is a well-known problem in financial services: if bias is present in historical data, it will taint the AI solutions trained on that data. Data scientists in the financial industry and beyond must continue to explore and understand their data and look for any biases. Finastra and its customers have been working on and developing products to combat bias for years. Knowing how important it is to the industry, Finastra named blinks, a decentralized application designed to create a fairer fintech future, as the winner of our 2021 hackathon.
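Exploring historical data for bias, as described above, can start with very simple checks. The sketch below applies one well-known fairness signal, the "four-fifths rule" on approval rates across groups; the records and group labels are synthetic, and a real audit would use far richer statistical tests.

```python
# Illustrative fairness check on historical approval data: flag adverse
# impact when one group's approval rate falls below 80% of another's
# (the "four-fifths rule"). All data here is synthetic.

def approval_rate(records, group):
    """Fraction of applications approved for the given group."""
    members = [r for r in records if r["group"] == group]
    return sum(r["approved"] for r in members) / len(members)

def passes_four_fifths(records, group_a, group_b, threshold=0.8):
    """True if the lower approval rate is at least `threshold` of the higher."""
    ra = approval_rate(records, group_a)
    rb = approval_rate(records, group_b)
    return min(ra, rb) / max(ra, rb) >= threshold
```

A failed check does not prove discrimination on its own, but it tells a data science team exactly where to dig deeper before that data trains a model.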

The Road Ahead

Balancing innovation and regulation is not a new dance for financial services. The AI revolution is here and, as with past innovations, the industry will continue to evaluate the technology as it evolves, considering applications that benefit customers, always with an eye on customer safety.

