Neo4j's Memory MCP Server Explained in 5 Minutes
Welcome to this guide. Today, we'll explore Neo4j's memory MCP server. This powerful tool serves as a near drop-in replacement for Anthropic's own knowledge graph memory MCP server.
Both servers expose a nearly identical suite of tools. The primary distinction lies in a single function: the Neo4j server uses `find_nodes`, while Anthropic's version uses `open_nodes`. Beyond this minor difference, the two servers share the exact same schema.
The main purpose of this server is to provide an AI tool with long-term conversational memory, eliminating the need to fill its context window with the entire chat history.
Core Components of the Memory Server
There are several key components to this system:
- Entities: These are the fundamental nouns of the memory; each entity becomes a node representing a person, place, or thing.
- Observations: These are specific facts or details associated with the entities.
- Relationships: The server intelligently connects different entities, and sometimes observations, to build a comprehensive knowledge graph.
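To make the model concrete, here is a rough sketch of the kind of data these memory tools pass around. The field names (name, entityType, observations, from, to, relationType) follow the schema published for Anthropic's knowledge graph memory server, which the Neo4j server mirrors, but treat the exact payload shape as illustrative; "Alice" and "Bob" are placeholder names:

```json
{
  "entities": [
    {
      "name": "Alice",
      "entityType": "person",
      "observations": ["likes to use FastAPI", "decent at Python"]
    },
    {
      "name": "Bob",
      "entityType": "person",
      "observations": ["building a project using Neo4j and Weaviate"]
    }
  ],
  "relations": [
    { "from": "Alice", "to": "Bob", "relationType": "friends_with" }
  ]
}
```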
This article builds upon previous tutorials on installing MCP servers for tools like Cursor and Claude. We will focus on integrating the memory server and configuring it for continuous, automatic operation.
Configuring the Memory Server in Claude
First, let's configure Claude. The MCP servers are set up by editing the `config.json` file via the Developer tab, and the configuration should point to your local Neo4j instance.
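As a rough illustration, an entry in `config.json` for this server might look like the sketch below. The command, package name, and environment variable names here are assumptions based on a typical uvx-based MCP setup, so check the Neo4j memory MCP server's README for the exact values:

```json
{
  "mcpServers": {
    "neo4j-memory": {
      "command": "uvx",
      "args": ["mcp-neo4j-memory"],
      "env": {
        "NEO4J_URL": "bolt://localhost:7687",
        "NEO4J_USERNAME": "neo4j",
        "NEO4J_PASSWORD": "<your-password>"
      }
    }
  }
}
```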
While this setup works out of the box, you would typically need to explicitly ask Claude to use the memory tool. To enable automatic memory usage, navigate to Settings > General > Claude Settings > Configure. In this section, you'll find a profile area for personal preferences that Claude should consider in its responses.
Anthropic provides a personalization prompt in their memory server repository. Adding this user prompt instructs the AI to consistently utilize the memory tool. You simply copy this prompt and paste it into the personal preferences section in Claude's settings.
With this configuration saved, Claude will now use the memory server by default. For instance, if you provide a personal fact, such as:
"I really like using FastAPI and I'm half-decent at writing Python code."
Claude will recognize this as important information to retain. It will first check the knowledge graph for existing data about the user before deciding how to store the new information. To give the memory more to work with, you could add another fact:
"One of my good friends is building a project using Neo4j and Weaviate."
In response, Claude will access the graph. Even if the graph is initially empty, it will proceed to create the necessary entities and relationships based on this new data.
Upon inspecting the Neo4j database, you would see that it has successfully created a knowledge graph. The graph would contain nodes representing the user with attributes like "likes to use FastAPI" and "decent at Python," as well as a node for the "friend" who is "building a project using Weaviate," with a relationship connecting them as "friends." This demonstrates the beginning of a dynamic and evolving knowledge graph.
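If you prefer to check the graph from code rather than the Neo4j Browser, a short script with the official Python driver does the job. The connection details below are placeholders for a local instance, and the query assumes the server stores entity names in a `name` property:

```python
from neo4j import GraphDatabase  # pip install neo4j

# Placeholder credentials for a local Neo4j instance.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

with driver.session() as session:
    # Print every relationship the memory server has created so far.
    query = "MATCH (a)-[r]->(b) RETURN a.name AS a, type(r) AS rel, b.name AS b"
    for record in session.run(query):
        print(f"{record['a']} -[{record['rel']}]-> {record['b']}")

driver.close()
```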
Enabling Continuous Memory in Cursor
The process is similar for the Cursor editor. After adding the same MCP configuration, you can enable default memory usage by navigating to Cursor's settings. Under the 'Rules' section, you will find 'User Rules.' Paste the same system prompt from Anthropic into this area, and it will be saved automatically.
Now, you can test the integration. For example, if you ask Cursor:
"Build a 'Hello, World' application for my friend."
Cursor will access its memory, recall that you prefer FastAPI, and immediately generate the appropriate code for a FastAPI server.
Example FastAPI Code:

```python
from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def read_root():
    return {"Hello": "World"}
```
You can then follow up with a request like:
"Modify this app to use the databases my friend is interested in."
The system will correctly pull "Neo4j" and "Weaviate" from the knowledge graph and attempt to modify the application accordingly, demonstrating its ability to recall and apply stored information contextually.
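What Cursor generates will vary from run to run, but a minimal sketch of the modified app might look something like the following. It assumes local Neo4j and Weaviate instances and the `neo4j` and `weaviate-client` Python packages (v4 API for the latter); the credentials and endpoint names are placeholders, not part of the original example:

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from neo4j import GraphDatabase  # pip install neo4j
import weaviate                  # pip install weaviate-client


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Open database connections on startup and close them on shutdown.
    app.state.neo4j = GraphDatabase.driver(
        "bolt://localhost:7687", auth=("neo4j", "password")
    )
    app.state.weaviate = weaviate.connect_to_local()
    yield
    app.state.neo4j.close()
    app.state.weaviate.close()


app = FastAPI(lifespan=lifespan)


@app.get("/")
def read_root():
    return {"Hello": "World"}


@app.get("/health")
def health():
    # Report whether both databases are reachable.
    try:
        app.state.neo4j.verify_connectivity()
        neo4j_ok = True
    except Exception:
        neo4j_ok = False
    return {"neo4j": neo4j_ok, "weaviate": app.state.weaviate.is_ready()}
```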
This article provided a quick introduction to using Neo4j's memory MCP server. This is just one example of how you can leverage this technology to create more intelligent and context-aware AI applications.