What is Model Context Protocol (MCP)?
The Model Context Protocol (MCP) is emerging as a critical standard for structured communication between LLM-powered agents and external tools. By providing a unified way to pass context, MCP unlocks new possibilities for building intelligent, context-aware applications. As a lightweight protocol that standardizes tool interfaces, MCP is particularly beneficial for agents, which often require complex interactions with multiple tools and data sources. This means:
- Improved Agent Reasoning: Agents can reason more effectively by accessing structured context from various tools.
- Enhanced Tool Utilization: MCP enables agents to use tools more efficiently, leading to more accurate and relevant responses.
- Complex Task Orchestration: Agents can orchestrate complex tasks involving multiple tools and data sources seamlessly.
- Increased Agent Autonomy: Agents can autonomously access and utilize external information, reducing the need for human intervention.
This makes it easier to modularize and scale the entire agent or LLM application: individual external tool integrations can be added or updated without changing your core application. For more details, refer to the documentation.
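To make that modularity concrete, here is a toy illustration in plain Python (not the MCP SDK itself): when tools are registered behind a uniform interface, new tools can be added or swapped without touching the core dispatch logic. The tool names and return values below are invented for the example.

```python
# A toy tool registry illustrating the modularity idea (not the MCP SDK).
# New tools are registered by name; the core dispatch code never changes.
TOOLS = {}

def tool(func):
    """Register a function as a callable tool under its own name."""
    TOOLS[func.__name__] = func
    return func

@tool
def get_weather(city: str) -> str:
    # Stand-in for a real external call
    return f"Sunny in {city}"

@tool
def retrieve_answers(query: str) -> str:
    # Stand-in for a vector-store retrieval
    return f"Top document for: {query}"

def dispatch(name: str, argument: str) -> str:
    """Core application code: looks tools up by name rather than hard-coding them."""
    if name not in TOOLS:
        raise KeyError(f"Unknown tool: {name}")
    return TOOLS[name](argument)

print(dispatch("get_weather", "Austin"))  # -> Sunny in Austin
```

MCP plays the role of the registry here, but across process boundaries: servers advertise their tools, and clients discover and call them through one standardized interface.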
Building the MCP RAG Server
Oracle OCI GenAI provides a powerful platform for building LLM-powered agents. By integrating OCI GenAI with MCP, we can create agents that are not only intelligent but also deeply integrated with Oracle’s data and services. In this step, we create an MCP server that provides a single tool—retrieve_answers. This tool connects to an Oracle vector database, performs a retrieval using a pre-configured vectorstore (using OracleVS), and returns relevant information on Oracle topics.
```python
# rag_server.py
from mcp.server.fastmcp import FastMCP
from langchain_community.vectorstores.utils import DistanceStrategy
from langchain_community.vectorstores import OracleVS
from langchain.tools.retriever import create_retriever_tool

# Import your custom embedding function and Oracle DB connection
from my_project.embeddings import MyEmbeddingFunction  # Replace with your actual embedding function
from my_project.database import oracle_connection  # Replace with your actual DB client/connection

# Initialize the MCP server with a name
mcp = FastMCP("RAG")

# Initialize your embedding function and Oracle vector store (OracleVS)
embedder = MyEmbeddingFunction()
vectorstore = OracleVS(
    embedding_function=embedder,
    client=oracle_connection,  # Your Oracle DB connection/client
    table_name="EMPLOYEE_BENEFITS",
    distance_strategy=DistanceStrategy.DOT_PRODUCT,
)
retriever = vectorstore.as_retriever()

# Wrap the retriever into a tool with a descriptive name and instructions
retriever_tool = create_retriever_tool(
    retriever,
    "retrieve_answers",
    "Search and return information regarding Oracle topics",
)

# Expose the tool as an MCP tool. The tool function is what gets called by the client.
@mcp.tool()
def retrieve_answers(query: str) -> str:
    """
    Given a query, retrieve relevant Oracle benefits information from the vector database.
    """
    # Delegate to the retriever tool; LangChain tools are called via .invoke()
    return retriever_tool.invoke(query)

if __name__ == "__main__":
    # Run the MCP server with a chosen transport method; stdio is used for demonstration.
    mcp.run(transport="stdio")
```
Note: This example assumes you have an existing Oracle vector database and an embedding function (or OracleVS wrapper) in place. Replace the placeholders with your actual implementations and configuration.
Integrating with an Oracle Agent Client
Now that our MCP server is in place, we can build a client that loads its tool and integrates it into an agent workflow using Oracle OCI GenAI. The client connects to the server over stdio, loads the server's tools dynamically, and wires them into a LangGraph ReAct agent.
```python
# agent_client.py
# Create server parameters for stdio connection
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
import asyncio
from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from langchain_community.chat_models.oci_generative_ai import ChatOCIGenAI

llm = ChatOCIGenAI(
    model_id="cohere.command-r-plus-08-2024",  # Replace with your model_id
    service_endpoint="YOUR_SERVICE_ENDPOINT",  # Replace with your actual service endpoint
    compartment_id="YOUR_COMPARTMENT_ID",  # Replace with your actual compartment id
    model_kwargs={"temperature": 0, "max_tokens": 4000},
)

server_params = StdioServerParameters(
    command="python",
    # Make sure to update to the full absolute path to your rag_server.py file
    args=["rag_server.py"],
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            await session.initialize()

            # Get tools
            tools = await load_mcp_tools(session)

            # Create and run the agent
            agent = create_react_agent(llm, tools)
            agent_response = await agent.ainvoke({"messages": "What are Flex Credits?"})
            print("Response: ", agent_response["messages"][-1].content)

if __name__ == "__main__":
    asyncio.run(main())
```
Explanation:
- MCP Server: Remains the same, exposing the vector store’s search functionality as an MCP tool.
- MCP Agent Client:
- Creates a LangChain agent using the tools loaded from the MCP server.
- Runs the Agent with a question.
- The Agent will use the tool and the LLM to provide a response.
Response: Flex Credits are credits provided by Oracle on each paycheck to help offset the cost of some of the benefit choices you make. Flex credits are listed on your payslip and include amounts for employee life insurance, accidental death and dismemberment insurance and long-term disability insurance.
Scaling Up with Multiple MCP Servers
If you want to integrate multiple external tools (for example, combining multiple RAG servers with real-time weather and finance functionality), you can use the MultiServerMCPClient. This lets you connect to several MCP servers simultaneously and aggregate their tools into a single agent. See the LangChain MCP Adapters documentation for more details.
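A minimal sketch of that pattern is shown below, assuming a MultiServerMCPClient API along the lines of the LangChain MCP Adapters package. The server names, paths, and the weather-server URL are placeholders, not real endpoints.

```python
# multi_server_client.py -- sketch only; server names, paths, and URLs are placeholders.
import asyncio
import inspect

# Each entry maps a server name to its connection details. stdio servers are
# launched as subprocesses; HTTP-based servers are reached at a URL.
SERVER_CONFIG = {
    "rag": {
        "command": "python",
        "args": ["rag_server.py"],  # use the absolute path in practice
        "transport": "stdio",
    },
    "weather": {
        "url": "http://localhost:8000/sse",  # hypothetical weather MCP server
        "transport": "sse",
    },
}

async def main():
    # Deferred imports so the config above can be inspected without the packages installed.
    from langchain_mcp_adapters.client import MultiServerMCPClient
    from langgraph.prebuilt import create_react_agent

    client = MultiServerMCPClient(SERVER_CONFIG)
    tools = await client.get_tools()  # tools from every configured server, aggregated

    # `llm` would be the same ChatOCIGenAI instance configured earlier:
    # agent = create_react_agent(llm, tools)
    # response = await agent.ainvoke({"messages": "What are Flex Credits?"})

# asyncio.run(main())  # uncomment once your servers and LLM are configured
```

Note that the exact call pattern varies across langchain-mcp-adapters versions; older releases used MultiServerMCPClient as an async context manager, so check the adapters documentation for the version you have installed.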
Conclusion
MCP empowers the creation of highly intelligent and context-aware agents within the Oracle ecosystem. By standardizing tool integration, MCP enables agents to access and utilize information with unprecedented precision, leading to more effective and efficient applications. As the MCP ecosystem continues to evolve, we can expect to see even more innovative agent-based solutions that transform how we interact with Oracle’s data and services.
Learn more about Oracle GenAI.