How to Interact with MCP Servers from Your Code in 5 Minutes
This article introduces a library that lets you communicate with your MCP servers directly from your code, using any LLM you prefer. Previously, talking to MCP servers required a dedicated client such as Windsurf, Cursor, or Claude Desktop. Now you can use this MCP client library, which works through an agent and comes with some pretty cool features.
Please note that this guide requires some coding knowledge to follow properly. Even if you're new to it, though, the process is straightforward, and we'll also explore how to vibe code with it.
Installation and Setup
The library is Python-based. The first step is to ensure Python is installed on your system.
- Create a Virtual Environment: Once you have Python, create and activate a virtual environment. The commands for Windows and Mac/Linux are shown below.
* **Windows:**
```bash
python -m venv venv
.\venv\Scripts\activate
```
* **macOS/Linux:**
```bash
python3 -m venv venv
source venv/bin/activate
```
- Install the Library: The installation commands are also listed on the library's GitHub repository. If you are using Python 3, make sure to run everything with `pip3`:
```bash
pip3 install mcp-client-lib
```
- Install LLM Provider Libraries: Depending on the LLM you plan to use, you'll need to install additional packages.
    * For OpenAI, install `langchain-openai`.
    * For Anthropic, install `langchain-anthropic`.
    * For other providers such as Grok or Llama, the required libraries are listed in the official documentation.
Configuration and Code Example
Once everything is installed, open your terminal within the project directory.
- Create an Environment File: Create a new file named `.env` in the project directory. At this point your project will likely contain only the virtual environment folder, so you'll need to create this file yourself.
- Add Your API Key: In the `.env` file, paste the following line with your API key. You only need to provide the key for the provider you are using.
```
OPENAI_API_KEY="YOUR_API_KEY_HERE"
```
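To confirm the key is actually being picked up, a quick check can help. This is a minimal sketch: the `load_env_file` helper below is a hypothetical stdlib-only stand-in for `python-dotenv`'s `load_dotenv()`, useful if that package isn't installed yet.

```python
import os

# Hypothetical stdlib-only stand-in for python-dotenv's load_dotenv(),
# shown here so the check works even without extra packages installed.
def load_env_file(path=".env"):
    if not os.path.exists(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                # Don't overwrite variables already set in the shell
                os.environ.setdefault(key.strip(), value.strip().strip('"'))

load_env_file()
print("OPENAI_API_KEY set:", bool(os.getenv("OPENAI_API_KEY")))
```

If this prints `False`, the file is in the wrong directory or the variable name doesn't match.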
Understanding the Code
Here is a basic example of how the code works. At the top, you can see various imports from the MCP library, Langchain, and OpenAI.
```python
# Imports from the MCP client library, Langchain, and OpenAI
from mcp import McpAgent, McpClient, McpConfig
from langchain_openai import ChatOpenAI
from dotenv import load_dotenv

# Load environment variables (including the API key) from the .env file
load_dotenv()

# Create an MCP client using a configuration from a separate file
# For this example, we assume 'airbnb_mcp_config' is defined elsewhere
mcp_config = ...
client = McpClient(mcp_config)

# Define the LLM we want to use (e.g., OpenAI)
# The setup will differ slightly for other providers like Anthropic
llm = ChatOpenAI(model="gpt-4")

# Create an agent from the LLM and the client
agent = McpAgent.from_llm_and_client(
    llm=llm,
    client=client,
    max_iterations=5,
    prompt="Find me a place to stay with a pool and good ratings.",
)

# Run the agent and print the result
result = agent.run()
print(result)
```
When this script is executed, the server starts, and the agent gets to work. We received a list of links to listings from the Airbnb MCP that match our preferences for a pool and good ratings. It's a cool implementation that demonstrates the power of creating custom agents.
The possibilities for creating different agents are endless. You no longer need a separate client; you can bind an LLM to an MCP to build modular and autonomous applications.
Getting Help from Your Editor
One challenge you might face is that your editor (like Cursor) may not have the context of this framework. To provide it with the necessary context:
- Navigate to the "docs" feature in your editor.
- Add a new doc.
- Go to the library's GitHub repository and open the `README.md` file.
- Copy the link to the `README.md` file and paste it into the doc link field.
Your editor will read, index, and use this documentation as context. To reference it in your code, type the `@` sign, select the MCP docs, and the editor will generate code based on the framework's conventions.
Making the Repository Readable for LLMs
Another useful trick is to convert the repository into an LLM-ingestible format. You can do this by replacing `hub` with `ingest` in the GitHub URL, so `github.com` becomes `gitingest.com`. This opens the repository as plain text that you can paste into any LLM to ask questions or get clarification.
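The URL transformation is a simple string replacement; the repository path below is illustrative:

```python
# Turn a GitHub repo URL into its gitingest equivalent by swapping
# "hub" for "ingest" in the domain (the repo path is illustrative)
def to_ingest_url(github_url: str) -> str:
    return github_url.replace("github.com", "gitingest.com", 1)

print(to_ingest_url("https://github.com/example/mcp-client-lib"))
# https://gitingest.com/example/mcp-client-lib
```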
Advanced Features
The framework supports numerous advanced use cases:
- HTTP Connections: You can connect to servers running over HTTP, for example on `localhost`.
- Multi-Server Support: Define multiple servers in a single configuration file. You can either specify which request should go to which server or let the agent route between them dynamically by setting `use_service_manager` to `true`.
- Tool Access Control: You can restrict which tools the agent has access to.
- Example Use Cases: The repository includes other examples, such as using Playwright with the Airbnb MCP or connecting to a Blender MCP server.
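As a sketch of what a multi-server setup might look like, here is a hypothetical configuration: the schema, server names, and the placement of the `use_service_manager` flag are assumptions based on the description above, not the library's documented format.

```python
# Hypothetical multi-server configuration: two MCP servers plus a flag
# letting the agent route requests dynamically. The schema and commands
# are illustrative assumptions, not the library's documented format.
mcp_config = {
    "mcpServers": {
        "airbnb": {
            "command": "npx",
            "args": ["-y", "@openbnb/mcp-server-airbnb"],
        },
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
        },
    },
    # When enabled, the agent decides per request which server's
    # tools to call instead of being told explicitly.
    "use_service_manager": True,
}
```

With dynamic routing enabled, a single prompt like "find a listing and open it in the browser" could fan out across both servers without you wiring the steps together by hand.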
This is a solid framework for building new and innovative applications. If you need further assistance, you can use an LLM to make sense of the documentation or even write the code for you.
Join the 10xdev Community
Subscribe and get 8+ free PDFs that contain detailed roadmaps with recommended learning periods for each programming language or field, along with links to free resources such as books, YouTube tutorials, and courses with certificates.