Zing Forum

Financial AI Copilot: An Intelligent Financial Analysis Assistant Based on Agentic RAG

An open-source autonomous financial analyst application that combines the Agentic RAG architecture with large language models to enable real-time stock data acquisition, intelligent financial news retrieval, and real-time web search, providing data-driven risk-return assessments for investment decisions.

Tags: Financial AI · Agentic RAG · LLM · Real-time stock data · LangChain · LangGraph · Open-source project · Investment analysis
Published 2026-04-07 01:14 · Recent activity 2026-04-07 01:20 · Estimated read 7 min

Section 01

[Main Thread Guide] Financial AI Copilot: An Intelligent Financial Analysis Assistant Based on Agentic RAG

Financial AI Copilot is an open-source autonomous financial analyst application designed to address two pain points of traditional financial analysis: high barriers to entry, and the difficulty most AI assistants have in accessing real-time data. It combines the Agentic RAG architecture with large language models to enable real-time stock data acquisition, intelligent financial news retrieval, and real-time web search, providing users with data-driven risk-return assessments. Positioned as the user's "financial co-pilot", it supports needs ranging from public company analysis to macro trend judgment.

Section 02

Project Background and Positioning

In financial investment, the timeliness of information and the comprehensiveness of analysis determine the quality of decisions. Traditional tools require programming skills or professional financial knowledge, and most AI assistants struggle to access real-time market data directly. This project is positioned as a "financial co-pilot": by combining Agentic RAG with an LLM, it lets users obtain professional analysis through natural language. It covers needs such as public/private company analysis and macro trend judgment, understands context, tracks conversation history, and converts high-risk requests into objective analysis.

Section 03

System Architecture and Tech Stack

The system uses a microservice architecture and is deployed as containers via Docker Compose. The backend is built on FastAPI (for asynchronous handling of concurrent requests), and the frontend uses Streamlit for the interactive interface. Tech stack:

- LLM engine: Llama 3.3 70B, served on Groq for low latency
- Agent framework: LangChain (tool binding) + LangGraph (stateful execution and memory)
- Observability: LangSmith
- Vector database: Qdrant, storing financial news embeddings
- Embedding model: all-MiniLM-L6-v2
- Real-time stock data: yfinance
- Web search: DuckDuckGo (no API key required)
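The retrieval side of this stack can be illustrated with a toy sketch. The real pipeline embeds news with all-MiniLM-L6-v2 and stores the vectors in Qdrant, but the core idea, ranking stored documents by cosine similarity to the query embedding, works the same with hand-made vectors. The headlines and 3-dimensional vectors below are purely illustrative:

```python
import math

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for all-MiniLM-L6-v2 vectors stored in Qdrant
news_store = {
    "Apple beats earnings expectations": [0.9, 0.1, 0.0],
    "Fed signals rate cut":              [0.1, 0.8, 0.3],
}

# Toy embedding of the user question "How did AAPL perform?"
query_vec = [0.85, 0.2, 0.05]

# Semantic search = pick the stored document closest to the query
best = max(news_store, key=lambda doc: cosine_similarity(news_store[doc], query_vec))
# best -> "Apple beats earnings expectations"
```

In the actual system, Qdrant performs this nearest-neighbor ranking at scale with approximate search, so the application only issues a query and receives the top-scoring news chunks.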

Section 04

Core Function Analysis

1. Intelligent tool routing: automatically decides whether to query real-time stock data (public companies) or perform a web search (private companies, macro trends).
2. Dynamic RAG pipeline: when a public stock is queried, news is retrieved in real time, embedded, and stored in Qdrant for semantic search.
3. Stateful conversation: LangGraph checkpoints maintain cross-session context and support natural follow-up questions.
4. Anti-rejection and request reconstruction: when a high-risk request is detected, it is converted into a data-driven risk-return assessment, balancing compliance with user needs.
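The routing decision in point 1 can be sketched as follows. In the real project the LLM itself selects tools via LangChain/LangGraph tool binding; this stand-in uses a hard-coded lookup purely to make the decision rule concrete, and the ticker table is an illustrative subset:

```python
# Hypothetical routing sketch: a public-company mention routes to real-time
# stock data (plus the news RAG pipeline); anything else falls back to web search.
PUBLIC_TICKERS = {"apple": "AAPL", "microsoft": "MSFT"}  # illustrative subset

def route_query(query: str) -> str:
    q = query.lower()
    for name in PUBLIC_TICKERS:
        if name in q:
            return "stock_data"  # public company -> yfinance + news retrieval
    return "web_search"          # private company or macro trend -> web search

route_query("Compare Apple and Microsoft margins")     # -> "stock_data"
route_query("What is Stripe's latest funding round?")  # -> "web_search"
```

An LLM-driven router generalizes this beyond a fixed table, but the branch structure, market-data path versus search path, is the same.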

Section 05

Practical Application Scenarios and Test Evaluation

Application scenarios include: public company analysis (e.g., comparing Apple's and Microsoft's market capitalization and profit margins), private company research (e.g., Stripe's funding rounds), and macro trend queries (e.g., reasons for a decline in the tech sector). The project uses an LLM-as-a-Judge pipeline to evaluate RAGAS metrics, covering scenarios from simple stock price queries to complex mixed ones, including boundary tests such as fake-company traps and future-prediction traps to verify that the system responds correctly.
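The shape of an LLM-as-a-Judge check can be sketched with the judge replaced by a stub. The real pipeline asks an LLM to score RAGAS-style metrics (e.g., faithfulness: are the answer's claims supported by the retrieved context?); here a hypothetical rule, "every number in the answer must appear in the context", stands in for the grader so the harness itself is visible:

```python
def stub_judge(question: str, answer: str, context: str) -> bool:
    # Stand-in for an LLM grader: treat the answer as faithful only if
    # every numeric token it contains also appears in the retrieved context.
    numbers = [tok for tok in answer.split() if tok.replace(".", "").isdigit()]
    return all(n in context for n in numbers)

# (question, answer, retrieved context, expected verdict) -- illustrative cases
cases = [
    ("AAPL price?", "AAPL closed at 189.5", "AAPL closed at 189.5 today", True),
    ("AAPL price?", "AAPL closed at 200.0", "AAPL closed at 189.5 today", False),
]

results = [stub_judge(q, a, c) == expected for q, a, c, expected in cases]
# results -> [True, True], i.e. the stub judge matches both expected verdicts
```

Swapping `stub_judge` for an actual LLM call turns this loop into the evaluation pipeline the post describes, with the trap cases (fake companies, future predictions) added as further entries.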

Section 06

Deployment and Secondary Development Guide

Deployment steps: Prepare Docker Desktop and Groq API key (optional LangSmith key) → Clone the repository → Configure environment variables → Run docker-compose up --build. The frontend uses port 8501 by default, and the backend API documentation is available at port 8000 via Swagger UI. The code structure is modular (independent directories for agent configuration, API services, RAG pipeline, etc.), facilitating secondary development and customization by developers.
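The steps above can be condensed into a short command sequence. The repository URL is not given in this post, and the directory and env-file names below are assumptions; check the repository's own README before running:

```shell
git clone <repository-url>   # URL not given in the post
cd financial-ai-copilot      # assumed directory name
cp .env.example .env         # assumed env template; set GROQ_API_KEY
                             # (and optionally LANGSMITH_API_KEY)
docker-compose up --build    # frontend: http://localhost:8501
                             # API docs (Swagger UI): http://localhost:8000/docs
```

Once the containers are up, the Streamlit frontend and the FastAPI backend start together, so no separate process management is needed.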

Section 07

Summary and Outlook

Financial AI Copilot demonstrates how combining LLMs with real-time data retrieval can produce a practical financial tool, lowering the high barrier of traditional analysis software. Its Agentic architecture gives the system autonomous decision-making and intelligent routing capabilities. For developers and investors exploring AI financial applications, it is an open-source project worth studying in depth.