AI in Finance
Traditional, Generative, and Agentic applications in Finance
AI in Global Banking: Flagship Initiatives from the World’s Leading Institutions
“AI in Finance” is far from a single, unified concept.
Rather, it’s an umbrella term that captures a wide spectrum of tools and technologies being applied in very different ways across the financial industry. The rise of generative AI, machine learning, and natural language processing has touched everything from how bankers build models to how traders execute orders to how compliance teams detect fraud.
This article is a primer on AI in the finance industry, introducing how 10 of the largest financial institutions are implementing AI across Employee Productivity, Retail Banking, Investment Banking & Markets, Wealth Management, and Risk & Compliance.
To understand the scale at which AI is being integrated into the financial industry, one need look no further than the IT budgets for 2025.
Across North America, Europe, and Asia, leading institutions are collectively investing over $35 billion in bringing AI into their core operations – in most cases accounting for more than 35% of each bank’s total IT budget.
LLMs are being deployed across every division of major banks—from retail banking to wealth management, from markets to compliance, and from internal developer tools to client-facing assistants.
1. JP Morgan Chase
Over the past 10 years, JPMorgan Chase has cemented itself as one of the largest players in the AI arena – investing $2 billion a year in its internal LLM products (LLM Suite) and its extensive AI cloud infrastructure, Omni AI.
Internal Productivity: LLM Suite & DocLLM
JP Morgan LLM Suite (Learn More)
In 2023, JPMorgan rolled out its in-house LLM Suite to more than 200,000 employees, marking the formal start of AI integration across the entire firm. According to Lori Beer (CIO) and Teresa Heitsenrether (head of AI strategy), the suite is designed to streamline day-to-day tasks such as email drafting, document summarization, and knowledge search. While its current value lies in productivity gains, JPMorgan’s long-term vision is for the Suite to evolve into a deeply integrated AI layer—embedded directly into workflows across banking, markets, risk, and compliance—making AI assistance as natural as any other enterprise tool.
DocLLM (Learn More)
DocLLM is a layout-aware generative language model developed by JPMorgan for multimodal document understanding. It handles documents like forms, invoices, receipts, reports, and contracts by combining text semantics with spatial layout (how text is positioned and structured on a page). It marks a significant step forward in how banks might deal with unstructured document data in the future.
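To make the idea concrete, the sketch below shows one way a layout-aware model's input could pair OCR'd tokens with their page coordinates. This is an illustrative simplification (the class and helper names are hypothetical), not DocLLM's published architecture.

```python
# Illustrative only: pair OCR'd tokens with their bounding boxes so a layout-aware
# model can attend over both what a token says and where it sits on the page.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class LayoutToken:
    text: str                        # the OCR'd token
    bbox: Tuple[int, int, int, int]  # (x0, y0, x1, y1) position on the page

def build_model_input(tokens: List[LayoutToken]) -> dict:
    """Split a page into parallel text and spatial streams (hypothetical helper)."""
    return {
        "input_text": [t.text for t in tokens],
        "input_boxes": [t.bbox for t in tokens],
    }

invoice_page = [
    LayoutToken("Invoice", (40, 30, 120, 50)),
    LayoutToken("Total:", (40, 700, 90, 720)),
    LayoutToken("$12,400.00", (400, 700, 520, 720)),  # amount aligned with its label
]
print(build_model_input(invoice_page))
```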
Retail Banking: EVEE & Pilots of Various AI Products
EVEE Intelligent Q&A (Learn More)
EVEE is an internal document butler built to support JPMorgan Chase’s call center agents. It lets employees ask internal queries (about policies, documentation, procedures, etc.) and get concise, context-aware answers. EVEE is used in many of JPMorgan Chase’s call centers across a range of functions (customer service, fraud & claims, home lending, wealth management, collections).
IndexGPT (Learn More)
JPMorgan is testing client-facing AI products like IndexGPT, an AI tool for creating customized thematic investment strategies.
Investment Banking & Markets: SpectrumGPT & Banking CoPilots
SpectrumGPT (Learn More)
SpectrumGPT is JPMorgan Chase’s “nudge engine” for portfolio managers. It helps them spot market signals, monitor risk, and stay on top of research, news, and earnings events that affect their portfolios.
Banking CoPilots (Learn More)
As part of LLM Suite, JPMorgan Chase is also piloting various investment banking copilots to automate repetitive tasks handled by entry-level IB analysts: producing first drafts, gathering data, summarizing documents, compiling materials for pitch books, and more.
Wealth Management: Connect Coach AI
Connect Coach AI (Learn More)
Connect Coach AI is JPMorgan Chase’s wealth advisor copilot, designed to enhance how financial advisors prepare for and engage with clients. The tool retrieves data from vast internal and external research libraries, surfaces timely market insights, and even generates draft talking points or ideas for client outreach.
Software Engineering: GitHub Copilot, Codeium, PRBuddy, and more.
AI Coding Assistants (Learn More)
JPMorgan Chase’s software engineering teams use a variety of coding assistants to handle repetitive or mundane coding tasks. JPMorgan uses tools such as GitHub Copilot, Codeium, Tabnine, and DevGPT internally, as well as PullRequestBuddy (PRBuddy) to help automate part of the code review process.
Risk and Compliance: AI in KYC and AML
KYC & AML AI Solutions (Learn More)
Anti-money laundering (AML) monitoring, contract review, and Know Your Customer (KYC) checks have long been a cornerstone of JPMorgan Chase’s application of artificial intelligence, reflecting an industry-wide priority among global banks. These tools help automate the analysis of vast amounts of structured and unstructured data—from transaction flows to customer documents—enabling the bank to identify risks and anomalies faster than manual teams could manage.
Future Outlook: Currently in 2025, JPMorgan Chase is leading the charge on AI implementation in finance. Over 200,000 employees are using the LLM Suite daily, and the firm remains poised to expand its reach beyond productivity and compliance into client-facing applications. With ongoing pilots like IndexGPT and advanced copilots across banking, wealth, and markets, JPMorgan is laying the foundation for AI-driven products and services that could reshape how institutional and retail clients interact with financial data.
2. Bank of America
Bank of America is treating AI as core infrastructure. Flagships include the client-facing Erica assistant (now over 3 billion interactions and 20 million users) and a growing set of internal copilots. In 2025, the bank allocated $4B of its $13B tech budget to AI, focusing on job-specific tools.
Internal Productivity
Erica for Employees
Erica for Employees is Bank of America’s internal version of its flagship AI assistant, Erica, designed to streamline support across IT, HR, and operations. Adopted by over 90% of employees, it began as a simple IT helpdesk tool—handling routine tasks like password resets, device activations, and troubleshooting—but by 2023 it had expanded into a full-service employee assistant. Today, Erica helps staff locate payroll and tax documents, review health benefits, navigate HR policies, and get answers to internal questions in real time, reducing support bottlenecks and improving productivity across the organization.
Retail Banking: Erica
Erica for Retail (Learn More)
On the consumer side, Erica is BoA’s AI-powered mobile virtual assistant, baked into the Bank of America app. It helps users perform various everyday banking tasks via voice/text/chat, including viewing account information (routing/account numbers), searching for transactions, managing debit/credit cards (locking/unlocking, reporting lost cards), and receiving notifications and insights on spending. It also assists with bill payments, money transfers (including via Zelle), and helps customers track recurring charges or changes in spending habits.
Investment Banking & Markets: Banker Assist
BofA Banker Assist (Learn More)
Bank of America’s Banker Assist is an internal generative AI platform designed to streamline work for banking employees, particularly in the areas of meeting preparation and information gathering. It consolidates financial data from both internal systems (e.g., in-house research, reports, client data) and external sources (market data, news, third-party databases), thereby reducing the need to switch between multiple tools.
Wealth Management: ASK MERRILL and ASK PRIVATE BANK
Ask MERRILL® and ask PRIVATE BANK® (Learn More)
Bank of America has rolled out specialized AI copilots within its wealth management divisions, branded as ask MERRILL® for Merrill advisors and ask PRIVATE BANK® for private banking teams. These tools serve as natural-language assistants that can extract insights from vast client, portfolio, and market data, enabling advisors to prepare for meetings and client outreach more efficiently.
By 2024, these AI assistants had supported over 23 million interactions and had deep adoption across the advisor network. Advisors rely on them to surface new business opportunities, identify timely portfolio actions, generate tailored talking points, and assemble research summaries — all with the goal of enabling more proactive, personalized, and timely engagement with clients.
Future Outlook: Under new CTIO Hari Gopalkrishnan, Bank of America is expanding job-specific AI and continuing heavy investment in AI patents and platforms.
3. HSBC
HSBC is embedding generative AI across customer service, institutional banking, and internal productivity, with over 600 AI use cases in flight.
Internal Productivity: Internal LLMs and ORRA
Unnamed Internal Productivity LLMs (Learn More)
HSBC has not widely publicized internal names for its LLM tools beyond “productivity tool” (there is no “HSBC Assist” or similar), but the bank says colleagues across the Group have access to an LLM-based productivity tool for day-to-day tasks like translation, document analysis, and text assistance.
ORRA (Learn More)
HSBC has a policy chatbot called ORRA (Operational Resilience and Risk Application) that employees can use to ask questions about internal policies and frameworks. This leverages natural language conversation (Dialogflow + Google Cloud) and document search. While not strictly labelled as an LLM, it’s an AI/NLP/ML internal assistant.
Retail Banking: Personetics
Personetics (Learn More)
HSBC’s AI offerings in retail banking are currently limited to personalized insights based on customer spending, delivered through Personetics. HSBC has also invested in conversational AI chatbots through its partnership with LivePerson.
Investment Banking & Markets: HSBC AI Markets
AI Markets (Learn More)
AI Markets is a tool that gives traders access to HSBC’s global research and trading data, market analysis, pricing, and execution. Generative AI also assists credit analysts by drafting write-ups from rich data sources, and internal copilots support servicing teams handling millions of client interactions annually—reducing turnaround times and improving satisfaction.
Wealth Management: HSBC AiPEX and HSBC AIGT
HSBC AiPEX (Learn More) and HSBC AIGT (Learn More)
These are AI-powered indices. In addition, AI-powered assistants help relationship managers prepare for client meetings with faster access to research and portfolio insights, enabling more personalized advice and timely follow-ups.
Risk and Compliance: HSBC Dynamic Risk Assessment
HSBC Dynamic Risk Assessment in AML (Learn More)
As with most major banks, HSBC is applying AI extensively to fraud detection and compliance monitoring, and it has partnered with Google on its Dynamic Risk Assessment system for AML.
Future Outlook: HSBC is scaling proprietary and partner-built AI solutions, aiming to embed AI into nearly every employee workflow while carefully extending use to customer-facing channels. Plans to automate up to 90% of certain data and analytics tasks could save billions annually, underscoring the bank’s commitment to both efficiency and responsible innovation.
4. Citigroup
Citigroup has implemented AI across its global operations, reaching over 150,000 employees in 80 countries with proprietary generative AI tools. Backed by a $12 billion annual tech budget, the bank is working to develop an internal “AI-first” culture. Citi is embedding copilots into employee workflows while piloting early agentic AI systems designed for autonomous decision-making.
Internal Productivity:
Citi Assist (Learn More)
Citi Assist is a chatbot tool rolled out to about 140,000 employees across eight countries. Its main function is to search internal policies and procedures across domains like HR, compliance, finance, and risk—acting like a “super-smart coworker” to help staff quickly find answers. It’s integrated into employee workflows (including via Microsoft Teams) and the bank plans to expand its content coverage further.
Citi Stylus (Learn more)
Citi Stylus is a document intelligence tool also available to those ~140,000 employees. It can summarize, compare, and search multiple documents simultaneously—making it easier to extract insights, locate relevant passages, or contrast different versions. Over time, Citi has added features like real-time chat, a browser plugin, and “Stylus Workspaces” capabilities (drafting emails, slide speaker notes, etc.).
Retail Banking:
CitiService Agent Assist (Learn More)
The CitiService Agent Assist platform equips call center staff with generative AI, enabling faster and more accurate responses across millions of customer interactions globally.
Wealth Management:
AskWealth
An AI tool that answers advisors’ questions about their clients and portfolios.
Advisor Insights
An AI tool for market intelligence, paired with a dashboard featuring timely messages about market turns, portfolios, and current events.
Software and Development
Google’s Vertex AI
A series of AI-powered developer tools built on Google’s Vertex AI have been rolled out across Citi’s development teams.
Future Outlook: Citi is expanding copilots firm-wide and exploring agentic AI that can handle autonomous decision-making.
5. Wells Fargo
Wells Fargo has positioned AI at the center of both its customer-facing and internal operations. The flagship Fargo virtual assistant, launched in March 2023, represents one of the earliest large-scale generative AI deployments by a U.S. retail bank. At the same time, the bank has built Tachyon, an internal AI platform that runs multiple LLMs to support diverse business tasks.
Internal Productivity:
Tachyon (Learn More)
Tachyon is Wells Fargo’s internal AI platform designed to serve as a centralized, model- and service-agnostic engine for their AI initiatives. Its architecture allows the bank to swap in different models and providers over time—without reengineering the entire system. By aggregating AI capabilities under one flexible platform, Tachyon supports use cases across document analysis, summarization, risk monitoring, and more, enabling consistent governance, scalability, and adaptability.
Retail Banking:
Fargo (Learn More)
Fargo, powered by Google’s Dialogflow and PaLM 2, handles customer requests through text or voice. It can pay bills, transfer funds, and answer account questions. Fargo logged 242.4 million interactions in 2024. Wells Fargo also launched Wells Fargo Vantage for its corporate clients.
Investment Banking and Markets
Wells Fargo Benchmark Intelligence (Learn More)
Wells Fargo Benchmark Intelligence, built with i2i Logic, provides data solutions for middle-market clients, combining Wells Fargo’s industry insights and U.S. middle-market expertise with public data to generate thousands of unique benchmarks.
Wealth Management:
Customer Decision Hub powered by Pega (Learn More)
The Customer Decision Hub is a centralized reporting dashboard designed to provide customer insights and suggest the “next-best action/conversation” to have with each customer.
Software and Development
Google’s Vertex AI (Learn More)
A series of AI-powered developer tools built on Google’s Vertex AI have been rolled out across Wells Fargo’s development teams.
Risk and Compliance
Tachyon (Learn More)
AI initiatives under Tachyon include enhanced fraud detection and risk monitoring, leveraging LLMs to flag unusual activity in transaction streams.
Future Outlook: Wells Fargo plans to scale Fargo’s capabilities into proactive financial guidance while broadening Tachyon’s role in internal productivity and compliance. Together, they form a dual-track strategy—consumer-facing AI for daily banking and enterprise LLM platforms for operational efficiency.
6. Goldman Sachs
Goldman Sachs’ flagship AI is the GS AI Assistant. The assistant is designed to help with tasks like summarizing complex documents, drafting content, performing data analysis, and even translating research for clients in preferred languages. To support a wide variety of use cases and user preferences, GS AI Assistant is reportedly model-agnostic: it can interact with a suite of LLMs (such as OpenAI’s GPT-4o, Google’s Gemini, and Anthropic’s Claude) so that the best model can be employed depending on the task.
Internal Productivity:
GS AI Assistant (Learn More)
Goldman Sachs launched GS AI Assistant firmwide in mid-2025 after piloting it with about 10,000 employees. The generative AI tool is designed to support knowledge workers by handling tasks such as summarizing complex documents, drafting content, analyzing data, and translating research into client-preferred languages.
GS AI Assistant is built to be model-agnostic, giving employees secure access to multiple underlying LLMs—such as OpenAI’s GPT, Google’s Gemini, and Anthropic’s Claude—within Goldman’s audited environment. The rollout is part of the bank’s broader AI strategy to embed generative tools across its divisions, enabling developers, bankers, analysts, and wealth professionals to work more efficiently in their day-to-day workflows.
Investment Banking and Markets
Banker Copilot
Goldman has piloted a generative AI assistant for bankers and traders, helping draft pitch materials, summarize financial reports, and speed deal preparation. AI is also used to parse earnings transcripts, Fed statements, and research data to give front-office staff faster insights.
Wealth Management:
GS AI Assistant
Wealth managers use the GS AI Assistant to translate research for international clients.
Software and Development
Goldman engineers use AI copilots such as GitHub Copilot within secure environments, gaining real-time code suggestions and automated reviews. Internal platforms also generate test cases and accelerate development cycles.
Future Outlook: Goldman Sachs is expanding its banker/trader copilots across divisions and continuing to test large language models for research, compliance, and client service. Its focus is on high-value internal use cases that boost productivity and risk controls, with client-facing generative AI likely to follow once accuracy and compliance thresholds are met.
7. Morgan Stanley
Morgan Stanley deployed an internal GPT-4-powered assistant known informally as “AI @ Morgan Stanley Assistant.” This wealth management chatbot, rolled out to financial advisors in Sept 2023, answers advisors’ questions in natural language by drawing on ~100,000 research reports and documents.
Wealth Management:
AI@ Morgan Stanley Assistant (Learn More)
AI @ Morgan Stanley Assistant is a GPT-4-powered knowledge chatbot that lets the firm’s 16,000+ financial advisors query Morgan Stanley’s internal document library (100,000+ documents), retrieve research, synthesize information, and get quick answers during their workflow. The assistant summarizes reports, surfaces investment ideas, and generates content for client outreach, significantly reducing prep time.
AI @ Morgan Stanley Debrief (Learn More)
AI @ Morgan Stanley Debrief, introduced June 2024, is a meeting assistant. With client consent, it transcribes advisor-client meetings, extracts key points, generates draft follow-up emails, and enters meeting notes into the CRM. It’s designed to reduce administrative burden post-meeting so advisors can focus more on client interaction.
Investment Banking and Markets
AskResearchGPT
In 2024, Morgan Stanley released AskResearchGPT, a generative AI assistant for its Institutional Securities division that helps staff surface, synthesize, and summarize insights from the firm’s expansive research libraries (70,000+ proprietary reports annually). AskResearchGPT is built on GPT-4 and is integrated into analysts’ workflows, including one-click exports into email drafts with links to source documents.
Future Outlook: Morgan Stanley plans to expand its GPT-4 assistant across wealth management while developing more role-specific copilots in banking and risk. The focus remains on advisor enablement and compliance-friendly adoption, with client-facing generative AI services considered only once accuracy and governance are assured.
8. Deutsche Bank
Deutsche Bank’s in-house generative AI platform includes a tool called “DB Lumina.” This is an AI-powered research assistant (built on Google’s Gemini LLM via Vertex AI) that can rapidly summarize market research and analysis. DB Lumina accelerates the creation of financial reports and notes – work that once took analysts days can be done in minutes – while meeting strict data privacy controls. (Deutsche Bank reported that as of 2024 it had 200+ AI use cases in production, with its LLMs generating millions of lines of code, credit reports, and audit memos.)
Internal Productivity:
DB Lumina (Learn More)
An AI research assistant built in partnership with Google Cloud. It uses Gemini models via Vertex AI, with a conversational chat interface, prompt templates, and a retrieval-augmented architecture. It helps employees summarize documents, draft content, analyze data, and refine writing—accelerating internal workflows.
Retail Banking
Conversational AI Assistants (no public product name)
Deutsche Bank uses conversational AI assistants in its digital channels to answer customer inquiries and guide transactions. Generative AI is being piloted to support call center staff with faster query resolution and to improve personalization in retail banking services.
Investment Banking and Markets
DB Lumina
DB Lumina is in use within the origination, advisory, and trading divisions, where analysts use it to ask market questions, generate report summaries, highlight signals, and refine research outputs. AI copilots are being tested to support trading desks and research teams by summarizing market reports, analyzing data streams, and generating draft client materials, and LLMs are also being piloted to help parse complex regulatory communications for strategists and risk officers.
Software and Development
Through its Google Cloud partnership, Deutsche Bank is developing internal AI assistants to streamline document analysis, automate reporting, and support developer productivity with code-generation and review tools.
Risk and Compliance
Risk & Compliance AI
LLMs are being integrated into fraud detection, AML monitoring, and compliance reviews. AI tools scan contracts and regulatory documents, while also helping compliance teams process and triage alerts more effectively.
Future Outlook: Deutsche Bank aims to expand its AI capabilities across all divisions, with a strong focus on efficiency and regulatory compliance. The bank is positioning itself to scale generative AI adoption securely while preparing for more advanced applications, including agentic AI, in the coming years.
9. UBS
UBS has built an enterprise AI platform named “Eliza” (after Eliza Hamilton, wife of the bank’s founder). Eliza functions as an internal AI marketplace and knowledge hub: it hosts approved AI models and tools that employees can use for tasks like answering policy questions, retrieving research, or automating workflows. One such tool is an internal chatbot assistant nicknamed “Red,” which integrates UBS’s knowledge base to help employees better serve clients. By early 2025, over 46,000 UBS employees (90% of staff) had been onboarded on Eliza’s generative AI capabilities.
Internal Productivity:
UBS RED (Learn More)
UBS uses an internal AI assistant called Red, which taps into the bank’s institutional knowledge and internal library to make it readily accessible to employees across divisions. Red is built on Azure OpenAI Service + Azure AI Search, leveraging a digitized knowledge base of roughly 60,000 investment advice and product documents. Through multilingual capabilities and concept-based retrieval (rather than strict keyword matching), Red helps employees quickly find content, draft communications, and support client interactions.
Investment Banking and Markets
M&A Copilot (Learn More)
For Investment Banking / M&A, UBS built a specialized AI tool that can scan over 300,000 companies in about 20 seconds to identify sell-side or acquisition opportunities. This M&A “copilot” accelerates idea sourcing and deal flow analysis for bankers.
Wealth Management
STAAT-powered advisory tools (Learn More)
In Wealth & Advisory, UBS’s AI functions are powered by its Smart Technologies & Advanced Analytics Team (STAAT). The AI tools generate pre-meeting briefings, client alerts, and personalized advisory prompts by extracting signals from client data and market trends. Some advisors report saving 3–4 hours per meeting by offloading repetitive preparation tasks.
Future Outlook: UBS plans to scale AI across its enlarged global footprint post-merger, with a particular emphasis on advisor productivity and client-facing digital experiences. The bank is also exploring agentic AI use cases while maintaining a strong governance framework to balance innovation with risk management.
10. RBC
RBC has been a pioneer in adopting AI among Canadian banks, with investments focused on both customer-facing applications and enterprise productivity. Through its research division Borealis AI, RBC develops proprietary AI models for banking, fraud prevention, and trading. The bank emphasizes safe, explainable AI while steadily introducing generative AI copilots to improve staff efficiency and client service.
Retail Banking
NOMI (Learn More)
RBC’s mobile app features NOMI, an AI-powered digital assistant that helps customers manage spending, savings, and cash flow. NOMI analyzes spending, surfaces insights (“NOMI Insights”), automates savings via NOMI Find & Save, and forecasts cash flows with NOMI Forecast.
Generative AI is also being tested to provide more conversational interactions and to support call center agents with faster, more accurate responses.
Investment Banking and Markets
RBC’s Aiden platform / Aiden QuickTakes
RBC’s Aiden platform / Aiden QuickTakes is used in its Capital Markets division to accelerate research: the system ingests earnings announcements, generates draft summaries, and reduces research turnaround times by 20–60%. Analysts refine and publish these drafts.
ATOM
RBC describes ATOM as akin to a “Large Transaction Model (LTM),” parallel to how LLMs act over text. It’s trained on large-scale financial and transactional datasets (payments, trades, client behavior, loyalty interactions) to enable predictions, pattern recognition, and inference across a variety of business tasks.
Wealth Management:
Borealis AI
Advisors use AI-powered insights from Borealis AI to personalize portfolio recommendations and improve meeting preparation. Generative AI copilots are being piloted to draft client communications and summarize research for wealth teams.
TIFIN AG
RBC’s U.S. wealth management arm has piloted AI tools in partnership with TIFIN AG to help advisors surface client opportunities automatically—creating “ideal client profiles,” detecting life events (e.g. inflows, consolidation), and prioritizing outreach.
Future Outlook: RBC plans to expand generative AI into more client-facing services while strengthening internal copilots for employees. By leveraging Borealis AI as its in-house innovation engine, RBC is positioning itself to compete globally in safe and scalable AI adoption, with a strong emphasis on explainability and governance.
Concluding Thoughts
As financial institutions race to deploy AI at scale, the industry has entered an arms race to embed LLMs and generative tools into core workflows as quickly as possible. But building the systems is only the first step.
The real challenge ahead lies in onboarding and upskilling employees so that bankers, advisors, analysts, and developers can fully leverage these capabilities in their day-to-day work. Ultimately, the firms that succeed will be those that not only invest in cutting-edge AI platforms, but also prepare their talent to use them effectively, responsibly, and efficiently.
Barclays
Barclays created an internal generative AI helper called the “Colleague AI Agent.” This is an employee-facing assistant integrating Microsoft 365 Copilot with Barclays’ systems. The Colleague AI Agent lets staff perform tasks like booking business travel, checking HR policies, or searching compliance systems through one natural-language interface. It was piloted with 15,000 users and is being rolled out to ~100,000 employees globally as part of Barclays’ AI-driven productivity push. (No distinct consumer-facing LLM has been named by Barclays yet.)
- Retail Banking: Barclays is expanding its use of AI-powered chatbots within its mobile and online platforms to assist retail customers with account inquiries, payments, and loan services. Generative AI is also being tested to provide real-time summaries of customer interactions for support staff, improving speed and resolution.
- Enterprise Productivity & Engineering: Barclays developers and operations teams are adopting AI assistants for coding, document analysis, and process automation. These tools integrate into the bank’s secure systems to improve efficiency while maintaining regulatory standards.
- Investment Banking & Markets: Barclays is piloting AI copilots that help analysts and bankers draft research summaries, generate pitch materials, and analyze market data. Early deployments aim to reduce prep time and streamline workflows across capital markets and corporate banking teams.
- Wealth Management: Advisors use internal AI tools to access research libraries and produce tailored client updates. These copilots assist with meeting preparation and portfolio insights, freeing time for more strategic client engagement.
- Risk & Compliance: LLMs support fraud detection, transaction monitoring, and regulatory compliance checks by analyzing patterns in large volumes of structured and unstructured data. AI also assists legal teams in reviewing contracts and regulatory texts more quickly.
Future Outlook: Barclays plans to broaden AI copilots across all divisions and scale customer-facing conversational AI. Its focus remains on safe adoption—embedding AI into internal processes now while preparing for more expansive client-facing generative AI once accuracy, governance, and regulatory conditions are met.
MUFG (Mitsubishi UFJ Financial Group)
MUFG, Japan’s largest bank, is steadily embedding AI into its global operations, with a strong focus on customer service, risk management, and operational efficiency. The bank partners with technology firms and develops proprietary tools, while aligning AI adoption with Japan’s strict regulatory and data-privacy frameworks.
- Investment Banking & Markets: MUFG is piloting LLMs to assist research analysts and trading teams by summarizing market data, parsing regulatory announcements, and preparing first drafts of client presentations. AI tools are also being tested to monitor market signals and detect trading anomalies.
- Retail Banking: MUFG has deployed conversational AI assistants in Japan to help customers with account queries, transfers, and loan applications. Generative AI is being integrated into call centers to provide agent-assist functions, reducing wait times and improving service quality.
- Wealth Management: Advisors use AI copilots to access investment research, generate client-ready reports, and prepare meeting briefs. These tools allow relationship managers to deliver more personalized advice and handle larger client books efficiently.
- Enterprise Productivity & Engineering: MUFG has introduced internal AI platforms for document translation, summarization, and compliance checks. Developer assistants are being tested to accelerate coding and reduce IT backlogs as part of its broader digital transformation.
- Risk & Compliance: AI systems enhance fraud detection, AML/KYC monitoring, and regulatory compliance. LLMs help compliance teams scan contracts and communications, while predictive models support early risk identification in lending portfolios.
Future Outlook: MUFG aims to scale AI across its domestic and international operations, with a cautious but deliberate approach to generative AI. The bank is exploring advanced copilots for employees and customer-facing assistants, positioning itself to compete with global peers while maintaining a strong emphasis on trust, safety, and compliance.
BNP Paribas
BNP Paribas is embedding AI across its European and global operations, with a strong focus on operational efficiency, risk management, and client experience. The bank leverages both proprietary AI systems and partnerships, while emphasizing responsible AI adoption under strict EU regulatory frameworks.
- Retail Banking: BNP has introduced AI-driven virtual assistants in digital channels to help customers with everyday banking tasks such as transfers, account queries, and loan applications. Generative AI tools also support contact center agents with real-time suggested responses, cutting call times and improving customer satisfaction.
- Investment Banking & Markets: BNP applies LLMs to accelerate research and trading workflows, including summarizing analyst reports, parsing regulatory announcements, and generating first drafts of client presentations. AI is also tested in risk modeling and trade surveillance to improve speed and accuracy.
- Wealth Management: Advisors are using AI copilots to synthesize market research and client portfolio data, enabling personalized investment recommendations and reducing preparation time for client meetings.
- Enterprise Productivity & Engineering: BNP is piloting internal generative AI assistants to help employees with translation, summarization, and document comparison. Developer copilots are being tested for code review and automation to modernize IT systems more efficiently.
- Risk & Compliance: BNP is adopting AI for fraud detection, KYC/AML monitoring, and regulatory compliance. LLMs are used to scan large volumes of contracts, communications, and transaction data, helping compliance teams triage risks more effectively while maintaining regulatory standards.
Future Outlook: BNP Paribas plans to scale AI use across all divisions, focusing on hybrid in-house/partner solutions to balance innovation with compliance. With the EU AI Act shaping governance, the bank is positioning itself as a leader in responsible, large-scale AI deployment across Europe.
Crédit Agricole
Crédit Agricole, one of Europe’s largest cooperative banking groups, is steadily expanding its use of generative AI to improve customer service, employee productivity, and compliance. The bank combines internal development with partnerships, ensuring AI adoption aligns with European regulatory and ethical standards.
- Investment Banking & Markets: LLM pilots support research teams and corporate bankers by drafting summaries of financial reports, preparing pitch materials, and scanning regulatory updates. AI is also being tested for risk modeling and trade monitoring.
- Retail Banking: Crédit Agricole has introduced AI-driven virtual assistants to handle customer queries on accounts, cards, and mortgages, while generative AI tools assist call center staff with real-time suggested responses, cutting call resolution times.
- Wealth Management: Advisors use internal copilots to generate personalized investment insights, prepare meeting notes, and access research faster, enabling more tailored and efficient client service.
- Enterprise Productivity & Engineering: Internal generative AI tools are being used for translation, document drafting, and summarization. Developer copilots are piloted to accelerate coding tasks, supporting the bank’s broader IT modernization efforts.
- Risk & Compliance: AI is embedded in fraud detection, AML/KYC checks, and contract analysis. Generative AI helps compliance teams parse through regulatory texts and large volumes of documentation, improving oversight and reducing manual workload.
Future Outlook: Crédit Agricole plans to expand generative AI adoption across its cooperative network and business lines, balancing efficiency gains with its strong commitment to ethical, responsible AI under the EU AI Act. The bank sees AI as a way to boost both client satisfaction and employee effectiveness while maintaining trust at the center of its brand.
Section 1: Productivity & Decision-Support Tools
In 2025, most finance professionals’ first meaningful exposure to AI is via an LLM-powered productivity assistant embedded in their daily software.
For many, this means Microsoft Copilot embedded in Excel, Word, Outlook, or Teams. For others, it’s ChatGPT or Anthropic Claude in the browser for quick research, drafting, or summarization.
Increasingly, however, major banks and asset managers are deploying in-house LLM platforms— trained on proprietary datasets and aligned with strict compliance controls.
This section of the article dives into all four.
Tool Overview: Copilot, ChatGPT, Claude, and In-House LLMs
The most common LLMs you are likely to come across in the finance industry are Copilot, ChatGPT, Claude, and in-house LLMs.

Microsoft Copilot
Website: https://copilot.microsoft.com
Documentation: https://learn.microsoft.com/en-us/microsoft-copilot
Microsoft Copilot is an AI productivity assistant built directly into the Microsoft 365 suite—Excel, Word, Outlook, Teams, and more. Powered by large language models, Copilot can analyze spreadsheets, summarize email threads, generate presentation outlines, and automate repetitive tasks inside the tools finance professionals already use daily. Its tight integration with enterprise data and Microsoft Graph means it can securely reference company documents, calendars, and communications to provide tailored outputs. For finance teams, this translates into faster report building, cleaner client communications, and instant insights—without ever leaving familiar software.

ChatGPT
Website: https://openai.com/chatgpt
Documentation: https://platform.openai.com/docs
ChatGPT, developed by OpenAI, is a conversational AI platform capable of answering questions, generating narratives, summarizing documents, and reasoning over structured or unstructured information. Its versatility makes it useful across finance workflows—from summarizing market research and drafting memos to brainstorming investment ideas and preparing pitch materials. Accessible through a web interface, desktop app, or API, ChatGPT supports natural language prompts, making it easy for professionals to interact with data and documents without specialized technical skills. With advanced capabilities like Code Interpreter (also known as Advanced Data Analysis), it can even work directly with datasets, perform calculations, and visualize results.
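For teams working through the API rather than the chat interface, a minimal summarization call might look like the sketch below. The model name, prompt, and transcript variable are illustrative placeholders, not any firm's production configuration.

```python
# Minimal sketch: summarizing an earnings-call excerpt via the OpenAI API.
# The model name and prompt are illustrative placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript_excerpt = "..."  # an earnings-call excerpt loaded from your own source

response = client.chat.completions.create(
    model="gpt-4o",  # example model choice
    messages=[
        {"role": "system", "content": "You are a sell-side research assistant."},
        {"role": "user",
         "content": f"Summarize the key takeaways in five bullets:\n{transcript_excerpt}"},
    ],
)
print(response.choices[0].message.content)
```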

Claude
Website: https://claude.ai
Documentation: https://docs.anthropic.com
Claude, built by Anthropic, is an AI assistant designed for safer, more explainable interactions with text and data. Known for its extended context window and conversational depth, Claude can process and synthesize large volumes of information—such as long reports, regulatory filings, or historical transaction records—in a single session. It’s particularly well-suited for analytical and compliance-focused work where nuanced understanding and careful reasoning are critical. Finance professionals use Claude to summarize lengthy documents, prepare structured briefs, draft policy-aligned communications, and support research efforts. Available via browser or API, Claude can be integrated into custom workflows or accessed as a stand-alone assistant.
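A comparable sketch against Anthropic's API, framed here around briefing a long filing, is shown below; the model alias, system prompt, and file path are assumptions for illustration only.

```python
# Minimal sketch: asking Claude to brief a long regulatory filing via the Anthropic API.
# The model alias, prompt, and document source are illustrative placeholders.
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

filing_text = open("filing.txt").read()  # a long filing from your own document store

message = client.messages.create(
    model="claude-3-5-sonnet-latest",  # example model choice
    max_tokens=1024,
    system="You are a compliance analyst. Be precise and cite section numbers.",
    messages=[
        {"role": "user",
         "content": f"List the obligations this filing creates for a retail bank:\n\n{filing_text}"},
    ],
)
print(message.content[0].text)
```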
In-House LLMs
In addition to the commercial assistants above, many institutions have built in-house LLM platforms of their own, such as JPMorgan’s LLM Suite, Wells Fargo’s Tachyon, Goldman Sachs’ GS AI Assistant, and UBS’s Red, each covered in the bank profiles earlier in this article.

Information Synthesis & Summarization
Tasks where the LLM quickly condenses large volumes of unstructured information into actionable insights.
- Summarizing market news, research reports, or earnings calls
- Condensing regulatory updates into key compliance implications
- Producing meeting summaries and action items from call transcripts
- Creating bullet-point takeaways from analyst notes or pitch books

Document & Communication Drafting
LLMs accelerate the creation of client-facing and internal written materials.
- Drafting client memos, investment letters, or market commentary
- Preparing internal updates for leadership or team distribution
- Writing first drafts of pitch decks, proposals, or RFP responses
- Standardizing tone, style, and formatting in communications

Interacting with structured data directly from spreadsheets, databases, or BI tools.
- Conversational queries in Excel or BI dashboards (“Show YoY revenue change by region”)
- Spotting trends or anomalies in portfolio performance data
- Running quick comparative analyses without writing formulas or SQL
- Translating plain-English questions into pivot tables or visualizations

Research & Knowledge Retrieval
Using natural language to surface relevant facts, precedents, or data points.
- Pulling historical market data or economic indicators
- Identifying peer benchmarks for valuation or risk metrics
- Searching internal knowledge bases for deal precedents or templates

Compliance & Risk Support
- Flagging sections of regulations that may impact a specific product
- Checking communications for language that may pose compliance risk
- Automating KYC document review summaries
- Screening counterparties for sanction or PEP (politically exposed person) mentions

Streamlining repetitive or administrative steps inside familiar tools.
- Auto-filling report templates with the latest figures
- Creating recurring meeting agendas based on project updates
- Generating standardized commentary for routine portfolio reviews
- Pre-formatting decks and spreadsheets for management sign-off

Document & Communication Drafting
LLMs accelerate the creation of client-facing and internal written materials.
- Drafting client memos, investment letters, or market commentary
- Preparing internal updates for leadership or team distribution
- Writing first drafts of pitch decks, proposals, or RFP responses
- Standardizing tone, style, and formatting in communications
Morgan Stanley: Debrief – the firm’s AI-powered tool summarizes meetings and drafts client communication. After a client meeting, Debrief automatically creates a structured summary email for the advisor to review and send, essentially serving as a “first draft” writer for follow-ups. This allows advisors to quickly send thorough recap emails and focus on personalization rather than starting from scratch.
Goldman Sachs: According to industry reports, Goldman has developed a proprietary GS AI Assistant that can generate first drafts of pitchbooks and client presentations. The system, built on in-house LLMs and trained on the bank’s style templates, can produce an initial pitch deck in minutes. Early use has shown it can cut the time to create pitch materials by roughly 50%, while keeping slides consistent with brand and compliance guidelines.

Information Synthesis & Summarization
Tasks where the LLM quickly condenses large volumes of unstructured information into actionable insights.
- Summarizing market news, research reports, or earnings calls
- Condensing regulatory updates into key compliance implications
- Producing meeting summaries and action items from call transcripts
- Creating bullet-point takeaways from analyst notes or pitch books
Financial firms are leveraging LLMs to distill large volumes of unstructured content – from market news and research reports to meeting transcripts – into concise, actionable summaries. This saves professionals hours of reading and keeps them informed. Recent examples include:
Morgan Stanley: The firm’s Wealth Management division uses an OpenAI-powered “AI @ Morgan Stanley Debrief” tool that automatically summarizes client meeting discussions. Integrated into Zoom, it generates key-point meeting notes and even drafts follow-up emails for advisors, a feature that can save each advisor up to 15 hours per week (reuters.com; morganstanley.com).
HSBC: In HSBC’s wealth management unit, an AI assistant nicknamed “Amy” automatically generates natural-language summaries of investment research reports for clients. In the first quarter of 2024 alone, Amy processed over 3 million reports, providing clients with bite-sized, tailored insights from detailed research – a task that would be impractical to do manually at such scale (bfsi.eletsonline.com).
Bank of America: BofA developed an internal GenAI platform for its Global Markets team that can search and summarize the bank’s research reports and market commentary within seconds. Sales and trading staff use it to quickly glean insights from the firm’s vast research library, vastly speeding up information flow. The bank also uses GenAI to summarize call center recordings – extracting key client feedback and action items from lengthy calls – which improves service while saving time on manual review (newsroom.bankofamerica.com).

When it comes to LLMs, how are they actually being used?
In 2025, we are seeing six common categories of use cases for generalized LLMs in finance:

Interacting with structured data directly from spreadsheets, databases, or BI tools.
- Conversational queries in Excel or BI dashboards (“Show YoY revenue change by region”)
- Spotting trends or anomalies in portfolio performance data
- Running quick comparative analyses without writing formulas or SQL
- Translating plain-English questions into pivot tables or visualizations
LLMs provide a natural language interface to financial data, allowing users to ask questions and get analytical answers without writing SQL or complex code. This “conversational analytics” approach democratizes data access.
BlackRock: The asset manager integrated a generative AI Copilot into its Aladdin platform (used by portfolio managers for analytics). In 2023, BlackRock launched eFront Copilot on its private markets system, enabling users to ask analytical questions and instantly get quick analytics and visualizations of portfolio data. For example, an investor can request, “Show me the risk exposure breakdown of our portfolio” and the Copilot will output charts or analysis of risk factors by asset, saving manual work. BlackRock reports that this AI can surface on-demand insights on performance and risk, making Aladdin more interactive and data-driven for users (microsoft.com).
Wells Fargo: The bank’s customer-facing AI assistant, “Fargo,” acts as a conversational interface to account data. Customers can simply ask Fargo questions like “How much did I spend on groceries last month?” or “What’s my current balance?” and receive an immediate answer. Fargo handles everyday banking requests via voice/text – from bill pay and funds transfers to providing transaction details – all by parsing natural-language queries and fetching the relevant data (venturebeat.com). This makes interacting with one’s financial information as easy as chatting.
Standard Chartered: The bank introduced a virtual assistant called “Dot” that combines voice and visual interfaces for customers to explore financial products and data conversationally. A client can verbally ask Dot things like, “What would my mortgage payments be if I borrow $250k over 30 years?” and Dot will process the request, perform the calculation, and display an interactive graph or answer. This voice-command and AI-powered interface lets clients visually explore scenarios and get advice on complex financial questions in a user-friendly way (bfsi.eletsonline.com), rather than having to manually input data into calculators or read fine print.
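Under the hood, most of these conversational-analytics features follow the same basic pattern: pass the user's question plus a description of the data schema to an LLM and have it draft a query for review. The sketch below is a vendor-neutral illustration of that pattern, with complete() standing in for whichever LLM endpoint a firm has approved; it is not modeled on any specific bank's implementation.

```python
# Sketch of "conversational analytics": turn a plain-English question into SQL
# against a known table schema. complete() is a stand-in for any approved LLM
# endpoint (in-house or vendor); nothing here mirrors a specific bank's tooling.

SCHEMA = "Table revenue(region TEXT, fiscal_year INTEGER, amount NUMERIC)"

def question_to_sql(question: str, complete) -> str:
    """Ask the LLM to draft a SQL query for a known schema; a human (or a
    sandboxed runner) reviews the query before it touches production data."""
    prompt = (
        f"Given this schema:\n{SCHEMA}\n\n"
        f"Write a single SQL query that answers: {question}\n"
        "Return only the SQL."
    )
    return complete(prompt)

# Example usage with any completion function:
#   sql = question_to_sql("Show YoY revenue change by region", complete=my_llm)
```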

Research & Knowledge Retrieval
Using natural language to surface relevant facts, precedents, or data points.
- Pulling historical market data or economic indicators
- Identifying peer benchmarks for valuation or risk metrics
- Searching internal knowledge bases for deal precedents or templates
LLMs excel at searching and synthesizing knowledge from vast information archives – far beyond simple keyword search. In finance, banks are deploying AI assistants that act as intelligent research analysts, quickly retrieving data, precedents, or insights from internal and external knowledge bases. For example:
Morgan Stanley: In 2023 the firm rolled out the AI @ Morgan Stanley Assistant, a generative AI chatbot for its wealth management division. This tool lets financial advisors query the firm’s enormous repository of research and intellectual capital in plain English. An advisor can ask, for instance, “What’s our latest view on emerging market equities?” and the assistant will pull relevant points from Morgan Stanley’s ~70,000 research reports (published annually) and provide a concise answer (morganstanley.com). This internal tool, built with OpenAI’s technology, has been hugely successful – by late 2023, 98% of advisor teams were using it regularly to retrieve research insights for clients (morganstanley.com). The same underlying LLM tech is now being extended to Morgan Stanley’s investment banking and trading staff via an “AskResearchGPT” interface (morganstanley.com).
JPMorgan Chase: The bank has been developing a tool called IndexGPT – essentially a chatbot for investment ideas. It is designed to let portfolio managers or clients describe a thematic idea in plain language and then retrieve or suggest relevant stock picks or index components that match the theme. For example, a user could input a theme like “aging population healthcare,” and IndexGPT will leverage an LLM to suggest a basket of stocks aligned with that idea. JPMorgan’s leadership has noted that IndexGPT was rolled out for testing with big fund clients, using AI to parse keywords and recommend investments for the given theme (reuters.com). (This is an early foray into AI-driven product advice, and JPMorgan is proceeding carefully with compliance as it integrates such tools.)
UBS: The Swiss bank’s investment bankers have developed an AI-powered M&A “co-pilot” to accelerate deal research. This tool can scan a database of 300,000 companies in under 30 seconds and identify potential acquisition targets or buyers based on a client’s criteria (swissinfo.ch). In practice, bankers can ask the AI for, say, “mid-sized tech firms in Europe that would fit our client’s expansion strategy,” and the system will analyze vast company data and spit out a shortlist of candidates. It even compares subtle data like the tone of management comments in earnings calls to flag firms that might attract activist investors (swissinfo.ch). UBS has been using this AI co-pilot for about a year to generate buy-side ideas and pitch lists for clients, significantly speeding up what was once a very manual research process.
Royal Bank of Canada (RBC): RBC’s capital markets division has been testing generative AI in its equity research department, allowing analysts to quickly retrieve information from internal research notes and even draft portions of reports. They are also piloting an AI assistant in their retail call center (“Advice Centre”) so that service reps can pull up relevant product info or precedent cases by asking the AI, thereby speeding up responses to customer inquiries (rbccm.com). While still in the experimental stage, RBC sees these use cases as promising ways to let employees tap the firm’s collective knowledge more efficiently, with AI doing the heavy lookup work in the background. (Source: RBC Capital Markets, Aug 2024)
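The common pattern behind these research assistants is retrieval-augmented generation: embed the question, pull the most relevant internal documents, and ask the model to answer only from that context. The sketch below is a bare-bones illustration; embed() and generate() are stand-ins for whatever embedding model and LLM a firm has approved, not references to any bank's stack.

```python
# Minimal retrieval-augmented generation (RAG) sketch: embed the question, rank
# internal documents by similarity, and ground the answer in the top matches.
# embed() and generate() are stand-ins for an approved embedding model and LLM.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def answer(question, doc_texts, doc_vectors, embed, generate, k=3):
    q_vec = embed(question)
    # Rank documents by similarity to the question and keep the top k as context.
    ranked = sorted(range(len(doc_texts)),
                    key=lambda i: cosine(q_vec, doc_vectors[i]), reverse=True)
    context = "\n\n".join(doc_texts[i] for i in ranked[:k])
    prompt = ("Answer using only the excerpts below, and say which excerpt you used.\n\n"
              f"{context}\n\nQuestion: {question}")
    return generate(prompt)
```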

Compliance & Risk Support
- Flagging sections of regulations that may impact a specific product
- Checking communications for language that may pose compliance risk
- Automating KYC document review summaries
- Screening counterparties for sanction or PEP (politically exposed person) mentions
Given strict regulatory requirements in finance, institutions are carefully deploying AI to assist (not replace) compliance and risk management functions. LLMs and related AI can help flag issues in vast compliance documents or monitor transactions and communications for red flags. Some examples:
HSBC: The bank built a generative AI system called “Ava” to bolster its anti-money laundering efforts. Ava scans billions of transactions, communications, and documents across HSBC’s global operations to detect patterns of potential financial crime. Thanks to the LLM’s ability to contextualize and learn, HSBC reports Ava is 65% more accurate at identifying money laundering activities than their previous rules-based systems (bfsi.eletsonline.com). This dramatically improves detection of suspicious activities while reducing false alarms. (HSBC also uses other AI models to continuously monitor transactions – in fact, they partner with Google on “Dynamic Risk Assessment” AI, which has cut false positives in fraud detection by 60% (hsbc.com) – showing the wider trend of AI in compliance.)
Wells Fargo: In a push toward automation, Wells Fargo’s technology team used an LLM-driven agent system to re-underwrite 15 years’ worth of old loan documents in an archival review project. They set up a network of specialized AI agents (using tools like LangChain/LangGraph) that autonomously retrieved loan files, extracted key data, cross-checked it against internal systems, and even performed necessary calculations – tasks that traditionally would require a team of human analysts. A human only needed to review the final outputs. The project demonstrated that AI agents could handle the bulk of this legacy compliance review work with minimal intervention (venturebeat.com). This kind of AI-assisted archival analysis can help banks ensure past portfolios meet current standards or identify hidden risks, far faster than manual audits.
JPMorgan Chase: The bank employs AI-driven risk analytics to help its trading and risk teams manage exposure. For example, JPMorgan has machine learning models (including language models for news analysis) monitoring its trading books for early warning signals. The AI combs through market data, trading positions, counterparty news, etc., and has helped flag anomalies in the firm’s risk profile that might have been missed. The company noted that such AI risk tools have reduced the frequency of Value-at-Risk (VaR) limit breaches or anomalies by about 40% (smartdev.com), enabling risk managers to take preemptive action more often. While not purely an LLM use-case (it’s a mix of AI techniques), it highlights how AI summarization of market signals and pattern-detection is supporting risk oversight on trading desks.
Silent Eight & Compliance Automation: Many banks are also leveraging third-party AI solutions for compliance. For instance, HSBC has worked since 2021 with the fintech firm Silent Eight to automate labor-intensive compliance decisions like sanctions screening and transaction alert adjudication. The AI models ingest customer and transaction data and can decide or prioritize alerts (e.g., flagging a sanctions hit or clearing a false positive) much faster than human review (paymentsjournal.com). This kind of AI co-pilot in compliance departments is becoming more common – it doesn’t make final decisions on its own, but it drastically cuts down the manual workload by triaging alerts and providing risk officers with AI-vetted recommendations. Banks like Standard Chartered and ING have similarly tested LLM-based tools that read through regulation texts or communications to highlight compliance risks (e.g., spotting forbidden language in emails or cross-referencing new regulatory rules with internal policies), augmenting the efforts of compliance staff. (Sources: PaymentsJournal on HSBC/Silent Eight, Feb 2024; Standard Chartered internal reports)
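The alert-triage pattern described above can be approximated as a two-step flow: a cheap fuzzy-match score decides whether a name hit deserves attention, and an adjudication step (stubbed here, where a real system would call an LLM with fuller context) drafts a clear-or-escalate recommendation for the compliance officer. The list entries, thresholds, and wording below are illustrative assumptions, not Silent Eight’s or HSBC’s actual logic.

```python
from difflib import SequenceMatcher

# Illustrative list entries only; real screening uses official sanctions data.
SANCTIONS_LIST = ["Ivan Petrov", "Global Trade Holdings Ltd"]

def match_score(name: str, listed: str) -> float:
    """Fuzzy similarity between a transaction party and a list entry."""
    return SequenceMatcher(None, name.lower(), listed.lower()).ratio()

def adjudicate(name: str, listed: str, score: float) -> str:
    """Stub for an LLM adjudication call that would weigh context
    (country, date of birth, aliases) and draft a recommendation
    for a human risk officer, who makes the final decision."""
    if score >= 0.9:
        return f"ESCALATE: '{name}' closely matches listed party '{listed}' ({score:.2f})"
    return f"LIKELY FALSE POSITIVE: '{name}' vs '{listed}' ({score:.2f}) - propose clearing"

def triage(party_name: str) -> list[str]:
    """Score the party against every list entry and keep only material hits."""
    recommendations = []
    for listed in SANCTIONS_LIST:
        score = match_score(party_name, listed)
        if score >= 0.6:  # ignore clearly unrelated names
            recommendations.append(adjudicate(party_name, listed, score))
    return recommendations or ["No material hits - auto-clear"]

for rec in triage("Ivan Petrof"):
    print(rec)
```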

Streamlining repetitive or administrative steps inside familiar tools.
- Auto-filling report templates with the latest figures
- Creating recurring meeting agendas based on project updates
- Generating standardized commentary for routine portfolio reviews
- Pre-formatting decks and spreadsheets for management sign-off
Beyond individual use cases, banks are embedding LLMs as a “background layer” in their workflows and software – automating repetitive tasks and seamlessly integrating AI into day-to-day operations. This turns the AI into a utility that employees might use without even realizing it, improving efficiency across the board. For example:
Morgan Stanley: The firm envisions AI as an “efficiency-enhancing interaction layer” between employees and the many systems they use (from trading and CRM platforms to reporting tools). In practice, this means an LLM might sit behind the scenes of internal apps, ready to take natural-language instructions and execute tasks across systems. Morgan Stanley’s Head of AI noted they are moving toward a world where AI mediates interactions with execution systems, CRMs, reporting, risk analysis, etc., so that employees can input requests in plain English and the AI will handle the cross-application work (morganstanley.com). A concrete example already live is the Debrief tool: it not only writes notes but also auto-logs the summary and action items directly into Salesforce CRM after a meeting (morganstanley.com), fully automating what used to be a manual data entry task.
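One common way to build such an interaction layer is to have the model translate a plain-English request into a structured action that existing systems already expose – the “function calling” pattern. The sketch below stubs the model step and shows only the routing; the action schema and the crm_log_meeting function are hypothetical, not Morgan Stanley’s interfaces.

```python
import json

def llm_to_action(request: str) -> str:
    """Stub for a function-calling LLM step that maps a plain-English request
    to a JSON action. A real system would send the request plus an action
    schema to the model; here we return a canned response so the sketch runs."""
    return json.dumps({
        "action": "crm_log_meeting",
        "arguments": {
            "client": "Acme Corp",
            "summary": "Discussed portfolio rebalancing",
            "follow_ups": ["Send updated allocation proposal"],
        },
    })

def crm_log_meeting(client: str, summary: str, follow_ups: list[str]) -> str:
    # Stand-in for a real CRM API call (e.g., creating a Salesforce activity).
    return f"Logged meeting for {client}: {summary} ({len(follow_ups)} follow-up task(s))"

ACTIONS = {"crm_log_meeting": crm_log_meeting}

def handle(request: str) -> str:
    """Route the model's structured output to the matching backend function."""
    parsed = json.loads(llm_to_action(request))
    handler = ACTIONS[parsed["action"]]
    return handler(**parsed["arguments"])

print(handle("Log my 2pm meeting with Acme and note the follow-ups"))
```

In production, this routing is usually handled through the model provider’s native tool-calling support rather than free-form JSON, but the division of labor is the same: the model proposes a structured action, and deterministic code executes it.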
Bank of America: BofA has made AI a ubiquitous part of its internal operations. An illustrative case is how the bank auto-fills and updates recurring reports. For quarterly business reviews and portfolio summaries, the AI system pulls the latest data from databases and populates pre-formatted report templates with up-to-date charts, commentary, and figures. Employees then simply verify and adjust the narrative. Additionally, BofA’s in-house virtual assistants are deeply integrated: Erica for Employees (an internal chatbot used by 90% of BofA staff) is being enhanced with generative AI to answer a wide range of internal queries (HR, IT, product info) instantly (newsroom.bankofamerica.com). This effectively automates many routine support tasks (like password resets or locating policy documents) and has cut certain help-desk workloads by over 50%.
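Mechanically, the report-refresh piece is simple to picture: pull the latest figures from a data store, drop them into a pre-approved template, and let the model draft only the commentary for a human to verify. Below is a minimal sketch, with invented figures, template text, and a stubbed draft_commentary call standing in for the LLM.

```python
# Stand-in for the latest figures pulled from an internal database.
latest = {"quarter": "Q3 2025", "aum_bn": 312.4, "net_flows_bn": 4.1, "fee_margin_bps": 42}

TEMPLATE = (
    "Quarterly Business Review - {quarter}\n"
    "Assets under management: ${aum_bn:.1f}bn\n"
    "Net flows: ${net_flows_bn:.1f}bn\n"
    "Fee margin: {fee_margin_bps} bps\n"
    "Commentary: {commentary}"
)

def draft_commentary(figures: dict) -> str:
    """Stub for an LLM call that drafts narrative from the figures.
    The draft is always reviewed and edited by the report owner."""
    direction = "inflows" if figures["net_flows_bn"] >= 0 else "outflows"
    return (f"AUM ended {figures['quarter']} at ${figures['aum_bn']:.1f}bn "
            f"with net {direction} of ${abs(figures['net_flows_bn']):.1f}bn.")

report = TEMPLATE.format(commentary=draft_commentary(latest), **latest)
print(report)
```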
JPMorgan Chase: In mid-2024, JPMC announced its new “LLM Suite” – a proprietary generative AI platform – which it is integrating across various divisions. About 50,000 employees (15% of the workforce) were already using it at launch, making it one of the largest enterprise LLM deployments on Wall Street (financialit.net). The LLM Suite gives employees a one-stop AI helper for many routine workflow needs – from drafting emails and slide decks to summarizing lengthy documents – all within the bank’s secure environment (financialit.net). By providing a sanctioned, in-house AI, JPMC is embedding AI into everyday workflows (e.g. bankers can ask the LLM to pull data or generate talking points from within their standard software tools). JPMorgan’s leadership has emphasized that this kind of AI integration is aimed at boosting productivity (they estimate current AI uses are already contributing $1–1.5 billion in value) while maintaining strict data security (financialit.net).
HSBC: Another example of back-end integration is at HSBC, which deployed a generative AI assistant for its operations teams in corporate banking. This assistant is integrated with their customer service platform and handles about 3 million internal queries a year from frontline staff (e.g. relationship managers asking for product details or procedural checks) (hsbc.com). By automating these lookups and providing instant answers, the AI has reduced turnaround times for client service requests and improved consistency. HSBC is also integrating AI into workflow tools to generate standardized commentary (for instance, auto-generating first drafts of portfolio review notes that bankers can tweak) and to create checklists for routine processes. All of these small automations add up to a smoother, “AI-augmented” workflow where employees spend less time on administrative prep and more on high-value work.
Keys to Success
Artificial Intelligence (AI), and in particular large language models (LLMs), is reshaping the finance industry – not in abstract future terms, but in the real, immediate day-to-day work of analysts, associates, and senior professionals.
From investment research to deal modeling, AI is being woven into workflows through both publicly available tools and proprietary, firm-specific systems. Finance professionals are no longer asking whether AI will impact their work—they’re asking how to adapt to the change already underway.
What We Mean by “AI in Finance”
AI in finance refers to the use of advanced computational models—especially large language models like GPT-4 and Claude—to support, augment, or automate financial tasks. These models process text, code, numbers, and documents to perform tasks such as:
- Generating and editing text (e.g., drafting memos, summaries, reports)
- Synthesizing large datasets (e.g., extracting insights from earnings transcripts)
- Supporting analysis (e.g., benchmarking financials or identifying trends)
- Automating repetitive tasks (e.g., updating slides or cleaning data)
Unlike traditional algorithmic tools in finance (such as Excel macros or rule-based automation), LLMs work in natural language, allowing users to ask questions, give commands, and interact conversationally.
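In practice, “working in natural language” usually means sending a plain-English instruction plus the relevant document to a model API. Below is a minimal sketch using the OpenAI Python client – the model name and transcript snippet are assumptions, and any hosted or in-house LLM endpoint would follow the same shape.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

transcript_snippet = (
    "CEO: We expect mid-single-digit revenue growth next year, driven by "
    "pricing actions, while capex stays roughly flat."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; swap in your firm's approved model
    messages=[
        {"role": "system", "content": "You are an assistant for equity analysts."},
        {"role": "user", "content": (
            "Summarize the guidance in two bullet points:\n\n" + transcript_snippet
        )},
    ],
)
print(response.choices[0].message.content)
```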
Get Started
Train your team to apply AI
AI is changing the world of finance – are you changing the way you train your teams?
From using Copilot in Excel to firm-specific AI tools, we train all levels of professionals to utilize large language models (LLMs) to drive results in diligence, modeling, and value creation.