LangChain and AutoGen
🧠 Multi-Agent QA System with RAG
🚀 If you haven't signed up for LanceDB Cloud yet, click here to get started!
This notebook demonstrates an end-to-end question-answering system that combines retrieval-augmented generation (RAG) with multi-agent collaboration to produce accurate, context-aware answers. Built on the SQuAD v2 benchmark, the implementation uses the following key components:
- LanceDB-powered context retrieval
- AutoGen-managed agent validation cycles
- LangChain-optimized document processing
💡 Example Output
QUESTION: What is the capital of France?
INSTRUCTIONS: Provide verified answer using the context retrieval workflow
--------------------------------------------------------------------------------
Next speaker: QA_Specialist
QA_Specialist (to chat_manager):
1. Context 1: "Paris is the most populous city in France and the capital of the country."
Context 2: "The capital and the most populous city of France is Paris, often called the City of Light."
Context 3: "France, officially the French Republic, has its capital in Paris."
2. Answers from Contexts:
- From Context 1: Paris
- From Context 2: Paris
- From Context 3: Paris
3. Final Answer: Paris.
--------------------------------------------------------------------------------
Next speaker: Fact_Checker
Fact_Checker (to chat_manager):
Paris
--------------------------------------------------------------------------------
Next speaker: Coordinator
Coordinator (to chat_manager):
TERMINATE
--------------------------------------------------------------------------------
Question: What is the capital of France?
Answer: paris
Exact Match: 100%
F1: 100%
🛠️ What You'll Build
- RAG Pipeline with LanceDB vector store (all-MiniLM-L6-v2 embeddings)
- AutoGen GroupChat with specialized agent roles (retrieve → generate → verify)
- Conversation Visualizer showing agent interactions
Step 1: Install Required Libraries
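The walkthrough does not list the packages explicitly; a plausible install line follows (package names are assumptions and may differ slightly by version, e.g. AutoGen ships on PyPI as pyautogen):

```shell
pip install lancedb pyautogen langchain langchain-text-splitters sentence-transformers datasets openai
```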
Step 2: Obtain the API key from the dashboard
- Get the db URI: the db URI starts with db:// and can be found on the project page of the dashboard. In the following example, the db URI is db://test-sfifxz.
- Get the API key: obtain a LanceDB Cloud API key by clicking GENERATE API KEY on the table page.
💡 Copy the code block for connecting to LanceDB Cloud that is shown at the last step of API key generation.
Paste your OpenAI API key (OPENAI_API_KEY).
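Connecting might look like the following sketch; the URI, API key, and region are placeholders you would replace with the values from your own dashboard:

```python
import os

import lancedb

# Placeholders -- substitute the values from your own dashboard.
os.environ["OPENAI_API_KEY"] = "sk-..."    # used later by the AutoGen agents

db = lancedb.connect(
    uri="db://your-project-slug",          # your db:// URI from the project page
    api_key="your-lancedb-api-key",        # the generated LanceDB Cloud API key
    region="us-east-1",                    # illustrative region; use your own
)
```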
Step 3: Import libraries
Step 4: Load the SQuAD dataset and chunk the contexts
Note: we load only the first 1,000 examples from the training set to speed up this example.
Step 5: Create embeddings and ingest them to LanceDB Cloud
This step may take a few minutes while the embeddings are generated.
Step 6: Configure agents
In the setup, we have:
- QA agent (QA_Specialist): generates answers from the retrieved context
- Review agent (Fact_Checker): validates the generated answers
- User proxy (Coordinator): manages the conversation flow
Step 7: Orchestrate the workflow
The agents collaborate to generate an initial answer → verify it against the context → refine the answer if needed.
Step 8: Set up the evaluation pipeline
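The SQuAD metrics are straightforward to reproduce; a sketch following the standard normalization (lowercase, strip punctuation and articles) used by the official SQuAD evaluation script:

```python
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """Lowercase, drop punctuation and articles, collapse whitespace (SQuAD convention)."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(pred: str, gold: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(pred) == normalize(gold))

def f1_score(pred: str, gold: str) -> float:
    """Token-level F1 between prediction and gold answer."""
    pred_toks, gold_toks = normalize(pred).split(), normalize(gold).split()
    overlap = sum((Counter(pred_toks) & Counter(gold_toks)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_toks)
    recall = overlap / len(gold_toks)
    return 2 * precision * recall / (precision + recall)
```

With these, `exact_match("paris", "Paris.")` returns 1.0, matching the 100% scores in the transcripts below.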
Step 9: Let's test!
Coordinator (to chat_manager):
QUESTION: What is the capital of France?
INSTRUCTIONS: Provide verified answer using the context retrieval workflow
--------------------------------------------------------------------------------
Next speaker: QA_Specialist
QA_Specialist (to chat_manager):
1. Context 1: "Paris is the most populous city in France and the capital of the country."
Context 2: "The capital and the most populous city of France is Paris, often called the City of Light."
Context 3: "France, officially the French Republic, has its capital in Paris."
2. Answers from Contexts:
- From Context 1: Paris
- From Context 2: Paris
- From Context 3: Paris
3. Final Answer: Paris.
--------------------------------------------------------------------------------
Next speaker: Fact_Checker
Fact_Checker (to chat_manager):
Paris
--------------------------------------------------------------------------------
Next speaker: Coordinator
Coordinator (to chat_manager):
TERMINATE
--------------------------------------------------------------------------------
Question: What is the capital of France?
Answer: paris
Exact Match: 100%
F1: 100%
-----------
Coordinator (to chat_manager):
QUESTION: Who wrote Romeo and Juliet?
INSTRUCTIONS: Provide verified answer using the context retrieval workflow
--------------------------------------------------------------------------------
Next speaker: QA_Specialist
QA_Specialist (to chat_manager):
1. Retrieved Context 1: "Romeo and Juliet is a tragedy written by William Shakespeare early in his career about two young star-crossed lovers whose deaths ultimately reconcile their feuding families."
Retrieved Context 2: "The Shakespearean work Romeo and Juliet is among the most popularly known pieces by the playwright."
Retrieved Context 3: "Shakespeare became famous for several plays, including Romeo and Juliet which remains one of his most performed and well-known works."
2. Exact Answer from Context: William Shakespeare
3. Final Answer: William Shakespeare
--------------------------------------------------------------------------------
Next speaker: Fact_Checker
Fact_Checker (to chat_manager):
William Shakespeare
--------------------------------------------------------------------------------
Next speaker: Coordinator
Coordinator (to chat_manager):
TERMINATE
--------------------------------------------------------------------------------
Question: Who wrote Romeo and Juliet?
Answer: william shakespeare
Exact Match: 100%
F1: 100%
-----------