MCP Protocol Upgrade: Why Streamable HTTP Beats SSE

By the 10xdev team, August 17, 2025

Anthropic's Model Context Protocol (MCP) has undergone some significant changes, and arguably the most impactful is the shift from Server-Sent Events (SSE) to streamable HTTP as its transport method. In this article, you'll learn why this transition is a major improvement for the protocol and significantly enhances its capabilities. We'll cover how to set up both stateful and stateless MCP servers, and even demonstrate how to use a proxy server to adopt the new transport without altering your existing SSE-based MCP server.

The Rationale Behind the Switch

So, why revise the protocol at all? Well, Server-Sent Events (SSE) inherently require state. They depend on a persistent, long-lived connection between a client and a server. This introduces considerable overhead when accessing tools via HTTP, especially since many common use cases involve centralized tools where a stateful connection isn't necessary.

However, stateful connections remain incredibly useful for more complex tools, particularly those involving processes like sampling. Sampling means a server requests LLM calls directly from the client. This is why we will demonstrate how to implement both stateful and stateless scenarios. Let's dive straight into the examples.

Implementing Stateful and Stateless Servers

Let's start with a basic MCP server with a single tool that adds two numbers. For those familiar with MCP, you'll notice a new stateless_http argument within the FastMCP class, which can be set to True or False. We'll begin with a stateful connection by setting it to False.

Here is the basic server setup in server.py:

# server.py: a basic MCP server with a single tool that adds two numbers
# ("Adder" is an illustrative server name)
from mcp.server.fastmcp import FastMCP

fast_mcp = FastMCP("Adder", stateless_http=False)  # stateful streamable HTTP

@fast_mcp.tool()
def add(a: int, b: int) -> int:
    return a + b

if __name__ == "__main__":
    fast_mcp.run(transport="streamable-http")  # serves on http://localhost:8000/mcp

Setting stateless_http to False establishes a stateful connection using streamable HTTP as the transport method.

Interacting with a Stateful Server

To start the server, you would run python server.py, which spins up the MCP server on port 8000. To communicate with it, we can use a shell script, script-stateful.sh.

Here’s what the interaction involves, in three steps:

  1. Initialize Session: First, we must initialize a session with the server to establish a long-lived connection. This is done with a POST request carrying a JSON-RPC payload that specifies "jsonrpc": "2.0" and "method": "initialize". The server responds with a session ID in the mcp-session-id response header.

  2. Confirm Connection: We then use the received session ID in a custom header (mcp-session-id) for a follow-up request. The payload for this request uses the method notifications/initialized to confirm the client connection.

  3. Execute Tool Call: Finally, we make the actual tool call. The method is tools/call, and the parameters include the tool's name (add) and its arguments as a dictionary object.

Here is an example of the shell script content:

# 1. Initialize the session; the response carries an mcp-session-id header
# ... curl POST to /mcp with '{"jsonrpc": "2.0", "id": 1, "method": "initialize", "params": {...}}' ...

# 2. Notify the server that the client is initialized
# ... curl POST with the mcp-session-id header and '{"jsonrpc": "2.0", "method": "notifications/initialized"}' ...

# 3. Call the tool
# ... curl POST with the mcp-session-id header and '{"jsonrpc": "2.0", "id": 2, "method": "tools/call", "params": {"name": "add", "arguments": {"a": 2, "b": 3}}}' ...

Executing this script successfully returns the result 5 without any errors.
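For reference, the three JSON-RPC bodies used above can be sketched in plain Python before being embedded in curl calls. The field values here (the protocolVersion string and the clientInfo entries) are illustrative; the exact shapes are defined by the MCP specification.

```python
import json

# 1. initialize: a request (it has an "id"), so the server sends a reply
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "curl-demo", "version": "0.1"},
    },
}

# 2. notifications/initialized: a notification (no "id"), so no reply is expected
initialized = {"jsonrpc": "2.0", "method": "notifications/initialized"}

# 3. tools/call: the actual tool invocation
tool_call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
}

for payload in (initialize, initialized, tool_call):
    print(json.dumps(payload))
```

The distinction between the payloads is a JSON-RPC 2.0 convention: requests carry an id and expect a response, while notifications omit the id, which is why step 2 never returns a body.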

Shifting to a Stateless Connection

While the stateful approach works, it introduces significant overhead. For a simple task like adding two numbers, a persistent connection is unnecessary. We can simplify this by using a stateless connection, eliminating the need for session initialization and confirmation notifications.

With stateless HTTP, a single request containing the tool call and its arguments is sufficient. If we try to run a simplified script (script-stateless.sh) that omits the first two steps against our current stateful server, it will fail with a Bad Request: Missing session ID error.

To fix this, we modify the server configuration:

# In server.py
fast_mcp = FastMCP(..., stateless_http=True)

After restarting the server with this change, the stateless script works perfectly. This is a much more efficient approach and the recommended way to set up an MCP server for functions that do not require state.
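To see how little a stateless call needs, here is a sketch that assembles the single POST request by hand. The /mcp path and payload match the add server above; the dual-value Accept header is what the streamable HTTP transport expects on POST requests. Note that no mcp-session-id header appears anywhere.

```python
import json

# The one and only message a stateless tool call requires
body = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "add", "arguments": {"a": 2, "b": 3}},
})

# Headers for a streamable HTTP request; there is no mcp-session-id
headers = {
    "Content-Type": "application/json",
    "Accept": "application/json, text/event-stream",
}

# Assemble the raw HTTP message for inspection
request = "POST /mcp HTTP/1.1\r\nHost: localhost:8000\r\n"
request += "".join(f"{k}: {v}\r\n" for k, v in headers.items())
request += f"Content-Length: {len(body)}\r\n\r\n{body}"
print(request)
```

Everything the server needs, the tool name and its arguments, travels in this single request, which is exactly why the initialization round trips can be dropped.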

Using the Python Client

The MCP library provides a convenient client for connecting to your server. The modern approach imports the streamablehttp_client helper from the mcp.client.streamable_http module.

You configure the client with the server URL, which now points at the /mcp endpoint instead of /sse. Although we are operating statelessly, the ClientSession class (imported from the top-level mcp package) is still used to wrap the connection. From there, you can use the call_tool method, passing the tool name and its arguments.

Here is a sample client implementation in client.py:

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

url = "http://localhost:8000/mcp"

async def main():
    # streamablehttp_client yields (read_stream, write_stream, get_session_id)
    async with streamablehttp_client(url) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            result = await session.call_tool("add", {"a": 21, "b": 21})
            print(f"Result: {result.content[0].text}")

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())

# Running this would output: Result: 42

Interestingly, the client remains compatible in both modes: even though the initialize call normally establishes a long-lived session, a server with stateless_http set to True simply answers it per-request without persisting any session state.

Upgrading Legacy Servers with a Proxy

This approach is great for new MCP servers, but what about existing, live servers that use the older SSE transport? You can upgrade them to support streamable HTTP without modifying the original server code by using a proxy.

Imagine a legacy backend server running on port 9001 that uses SSE. We can create a proxy server that translates modern, stateless HTTP requests into the SSE format that the legacy server understands.

Here’s how it works in proxy_server.py:

  1. Connect to Legacy Server: First, we create a client that connects to the legacy backend server using the SSE transport.

  2. Create a Proxy from the Client: Next, we point an SSE transport at our legacy server and wrap it in a client. We then use the from_client class method of FastMCP (a feature of the FastMCP 2.x library rather than the reference SDK), pass in our SSE-backed client, and set stateless_http to True.

# proxy_server.py
# Note: the proxy helpers below come from the FastMCP 2.x package
# ("pip install fastmcp"), not the reference mcp SDK; exact names
# can vary between versions.
from fastmcp import FastMCP, Client
from fastmcp.client.transports import SSETransport

# 1. Client that talks to the legacy SSE server
legacy_client = Client(SSETransport("http://localhost:9001/sse"))

# 2. Build a proxy FastMCP server around the client
proxy_server = FastMCP.from_client(legacy_client, stateless_http=True)

# 3. Run the proxy with the streamable HTTP transport on port 8000
if __name__ == "__main__":
    proxy_server.run(transport="streamable-http", port=8000)

This creates a proxy server that listens for streamable HTTP requests on port 8000. When it receives a request, it forwards it to the legacy SSE server on port 9001, effectively bridging the two transport methods.

To make this work, you would first run the legacy backend server (using SSE on port 9001) and then start the proxy server (listening for streamable HTTP on port 8000). A modern client can then send requests to the proxy server, which seamlessly routes them to the legacy system, making the old logic compatible with the new protocol.

Conclusion

This update significantly improves the MCP protocol. By moving away from mandatory stateful connections, the protocol becomes more flexible and efficient. Stateless communication is now the recommended default for simple functions, while the option for stateful connections remains available for more complex servers that require it. This change addresses previous criticisms and makes the protocol much more versatile.

