
Launching Everyday GenAI With a Pilot Team

By Ben Udell, SVP Digital Innovation

Lake Ridge Bank

Embracing artificial intelligence in banking, part one

Generative Artificial Intelligence (GenAI) is no longer a future promise; it has arrived and is already shaping the future of financial services. Financial Institutions (FIs) have begun using artificial intelligence (AI) to work more productively and creatively while enhancing their employees’ skills. These organizations are building a competitive advantage with AI, yet every organization can find practical, everyday benefits.

In this two-part series, we’ll examine GenAI solutions like ChatGPT. For only $20 per month per user (the paid version is worth every penny), you can unlock significant productivity gains. In this blog, we’ll use ChatGPT and AI interchangeably; you may prefer another model like Microsoft’s Copilot or Bing AI, Gemini (formerly known as Bard), or any other large language model (LLM). All of these applications produce human-like responses through natural language interaction. In part two, we’ll take a deeper look at applying AI.

If you haven’t taken steps to pilot or deploy a GenAI solution like ChatGPT, you’re beginning to fall behind in an area that’s growing exponentially. Your first step into AI should be with a pilot team that will help you create guardrails, understand capabilities, and lay the groundwork for the future of your organization. In part one of helping you get started with AI, I will:

  • Walk you through common concerns and how to address them      
  • Help you consider who should be on your pilot team
  • Provide basic ideas on where and how to implement GenAI

 

Overcoming AI Concerns

There are common themes holding FIs back from even attempting to use AI in any meaningful way. Headlines about GenAI often focus on privacy and bias concerns, hallucinations, or large technology investments.

Privacy concerns around AI tools are an important topic, but they are not actually new to the industry. Bank and credit union employees are entrusted with handling sensitive data daily, yet many critics of artificial intelligence tools still raise concerns about data privacy. I would argue that with the right training, financial services industry veterans are worthy of the utmost trust when it comes to handling new tools and software solutions.

 
“We already have an IT Acceptable Use Policy that addresses the appropriate handling of private data and how it relates to technology. We clarified that AI is part of this policy and before granting users access, they are trained on ChatGPT and review the policy to ensure compliance and understand the risks.”
 
– Julie Redfern, Chief Banking Officer at Lake Ridge Bank

This point reinforces that privacy with ChatGPT and other AI models is not a new phenomenon, but it does require transparency and training. Reinforce these well-learned expectations to ensure users handle private data appropriately, just as they already do.

Topics such as bias, model training, and hallucinations are just as important as privacy, but by designing a consistent approach for your team’s artificial intelligence tools, you can address them and offset risk. Simply put, never trust the full output of AI until you’ve read and edited the work. Think of ChatGPT as your assistant. When you distribute content to clients, you already work with compliance or have a process to ensure quality and accuracy. The same applies to AI: while AI output is strong in the financial services industry, you must review AI-created content before using it.

  • Bias is an industry-wide concern, particularly given the regulatory landscape. FIs are uniquely positioned to mitigate bias through their strong compliance frameworks. Content created at FIs typically undergoes rigorous compliance review by employees trained to create fair and unbiased content. Our industry is built to communicate and apply financial products and tools fairly and without bias. Edit your work, follow compliance procedures, and you’ll keep yourself from taking any missteps.
  • Factual accuracy is another concern with AI-generated content, but the solution lies in proactive measures: editing content, sourcing from credible databases, and fact-checking rigorously. FIs can leverage their expertise to ensure the information produced by AI remains accurate and reliable. Trust but verify is an industry mantra that transfers directly to AI content creation. When you find incorrect information, data, or references, simply ask AI to provide references that align with your goals. You can incorporate accurate data and information from appropriate sources and cite that work when necessary. Fact-checking concerns are overblown when you can easily find and verify sources.
  • Hallucinations: What are they and how do you prevent them? AI hallucinations occur when artificial intelligence tools, in an attempt to provide a human-like response to an account holder’s question, simply make things up. This happened recently with a major airline whose chatbot invented a refund policy that did not exist. The reality is that with good training, prompting, and direction, especially on financial products and services where information is readily available, you should see limited hallucinations. Your editing and fact-checking process will help protect you against publishing incorrect or erroneous content. Embrace these challenges, acknowledge them with stakeholders, and focus on the steps, policies, and procedures to deliver relevant output.

Julie Redfern summed up these concerns well: “The realization that all of these concerns already exist in organizations helped us move forward with piloting AI. We didn’t have to recreate procedures, we needed to ensure we followed our existing procedures when looking for bias, creating factual content, and managing other AI concerns.”

 

Start Your Pilot Team

Once you’ve addressed your organization’s policies, it’s time to build your pilot team. This group should be focused on finding practical, everyday applications that improve the quality and productivity of their work. Here are some key suggestions for finding the right group of employees to help you move forward with AI.

  • Tech Savvy – Look for employees who can’t wait to begin using AI. They’ve asked about it, use it personally, and may already have a head start. Use their excitement and experience to get the pilot group off the ground. 
  • Active Writers – One of ChatGPT’s greatest assets is the ability to write a high-quality first draft at incredible speed. Marketing is an obvious choice, but many pockets of the organization write policies, procedures, and communications to clients or colleagues. You want employees who will embrace AI writing opportunities: they’ll input ideas and concepts, let AI write for them, then edit and brand the final drafts.
  • Curious Minds – Some people will never be comfortable with AI. The best approach is to talk to AI as you would a person, and a curious mind that engages, prompts, thinks creatively, and pushes boundaries is a great asset in learning how to best use AI.
  • Leaders – AI will create and necessitate change. Finding leaders who will embrace this disruptive technology, evangelize it, and apply it to help others will accelerate that change.

Gathering a pilot group that is excited to usher in change will help you accelerate usage and gain efficiencies. At some point, organizations had to mandate the use of Microsoft Excel and computer spreadsheets over paper ledgers. There had to be a line in the sand that pushed employees to use email over paper memos. Other examples of technology evolution are not hard to find. AI is one more disruptive technology that everyone will need to address.

 

Develop AI Use Cases

Encourage your pilot group to find use cases that demonstrate how powerful GenAI can be in your organization. While ideas are nearly endless, here are some to get you started. Applying these use cases will help you demonstrate how AI can improve productivity and quality of work.

 

Use Case #1: Write Marketing Content


Blogs, emails, copy, taglines, communications, or any other content need is a great opportunity to showcase AI’s capabilities.

Use this prompt in ChatGPT: “Write me a blog that takes five minutes to read promoting the use of high yield savings accounts for Gen Z savers wanting to buy their first home. Include three bullet points with features and benefits of this product, include a call to action to contact a banker, and make it educational in nature because this is a new product for this demographic. Remind them that an automatic savings plan will help them reach their down payment goals. Provide 10 blog title ideas.”

The output won’t be your final draft, but it will be done in seconds! Ask ChatGPT to make follow-up modifications that dial it in to your needs.
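If your pilot team includes someone comfortable with a little scripting, the same prompt can also be sent outside the chat window. The sketch below is only an illustration using OpenAI’s Python SDK, not part of any bank’s pilot; the model name is an assumption, and your organization’s approved tools and policies should dictate what you actually use.

# Minimal sketch, assuming the openai Python package is installed and an
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Write me a blog that takes five minutes to read promoting the use of "
    "high yield savings accounts for Gen Z savers wanting to buy their first "
    "home. Include three bullet points with features and benefits of this "
    "product, include a call to action to contact a banker, and make it "
    "educational in nature because this is a new product for this demographic. "
    "Remind them that an automatic savings plan will help them reach their "
    "down payment goals. Provide 10 blog title ideas."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name; substitute your approved model
    messages=[{"role": "user", "content": prompt}],
)

# The draft still needs the same human editing and compliance review
# described above before it goes anywhere near a client.
print(response.choices[0].message.content)

Whether the prompt goes through the chat window or a script, the review process is the same: the output is a first draft, not a finished piece.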

 

Use Case #2: Employee or Client Conversations

ChatGPT is a great tool to improve how you interact with others. 

Try these prompts in ChatGPT: “Rewrite this email, adding in excitement and praise.” “I need to have a conversation with an employee giving them the good news on their promotion. How can I approach this conversation?”

ChatGPT can help you tailor your communication to other styles, write questions to ask, create agendas, and point out pitfalls to avoid.

Use Case #3: Use ChatGPT as a Consultant


ChatGPT is powered by vast amounts of information from the internet. Using it as a consultant to learn new skills or apply new concepts helps employees improve their quality of work. 

Simply copy and paste your information into ChatGPT and let it act as a personal consultant to help improve anyone’s work. “Walk me through the process of creating a pivot table in Excel.” “Write an executive summary of my report; the reader will be a Chief Technology Officer.” “Give me a point and counterpoint on my recommendation to help me prepare to overcome objections and strengthen my case.”

 

 
“Once we began to see how much ChatGPT could improve how our associates worked, we quickly added more associates to the pilot group. The team started to share ideas that led to new ways to apply AI, which leads to a stronger internal and external client experience.”
 
– Julie Redfern, Chief Banking Officer at Lake Ridge Bank

 

Your Next Steps

Following this guidance will help you create a personalized pilot plan for your organization. You already have the people, policies, procedures, and opportunities in place to begin testing ChatGPT or other GenAI assistants. With basic guidelines and a plan that encourages the use of AI, you can begin to find amazing, practical solutions with meaningful impact on your organization.

Stay tuned for part two, where you’ll read specific, in-depth usage examples of ChatGPT. We’ll explore using data to transform how you write content, allowing you to be hyper-relevant to client needs.
 

Additional Resources:

Implementing generative AI with speed and safety – Article by McKinsey & Company, March 2024
Navigating the Regulatory Landscape of Artificial Intelligence in Banking – Blog by Alkami’s Chief Compliance Officer, Dennis Irwin, February 2024
