---
title: "Overview"
description: "Give your AI assistants persistent memory with the Model Context Protocol"
icon: "brain-circuit"
---
Supermemory MCP Server 4.0 gives AI assistants (Claude, Cursor, Windsurf, etc.) persistent memory across conversations. Built on Cloudflare Workers with Durable Objects for scalable, persistent connections.
## Quick Install
```bash
npx -y install-mcp@latest https://mcp.supermemory.ai/mcp --client claude --oauth=yes
```
Replace `claude` with your MCP client: `cursor`, `windsurf`, `vscode`, etc.
## Manual Configuration
Add to your MCP client config:
```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp"
    }
  }
}
```
The server uses **OAuth** by default. Your client will discover the authorization server via `/.well-known/oauth-protected-resource` and prompt you to authenticate.
### API Key Authentication (Alternative)
If you prefer API keys over OAuth, get one from [app.supermemory.ai](https://app.supermemory.ai) and pass it in the `Authorization` header:
```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "Authorization": "Bearer sm_your_api_key_here"
      }
    }
  }
}
```
API keys start with `sm_` and skip OAuth when provided.
### Project Scoping
Scope all operations to a specific project with `x-sm-project`:
```json
{
  "mcpServers": {
    "supermemory": {
      "url": "https://mcp.supermemory.ai/mcp",
      "headers": {
        "x-sm-project": "your-project-id"
      }
    }
  }
}
```
## Tools
### `memory`
Save or forget information about the user.
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `content` | string | Yes | The memory content to save or forget |
| `action` | `"save"` \| `"forget"` | No | Default: `"save"` |
| `containerTag` | string | No | Project tag to scope the memory |
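As a concrete sketch, calling this tool over the wire uses the standard MCP `tools/call` method; the argument values below are illustrative:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "memory",
    "arguments": {
      "content": "The user prefers TypeScript over JavaScript",
      "action": "save"
    }
  }
}
```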
### `recall`
Search memories and get user profile.
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `query` | string | Yes | Search query to find relevant memories |
| `includeProfile` | boolean | No | Include user profile summary. Default: `true` |
| `containerTag` | string | No | Project tag to scope the search |
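A `recall` call follows the same `tools/call` shape; for example, a search scoped to a hypothetical project tag:

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "recall",
    "arguments": {
      "query": "language preferences",
      "includeProfile": true,
      "containerTag": "your-project-id"
    }
  }
}
```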
### `whoAmI`
Get the current logged-in user's information. Returns `{ userId, email, name, client, sessionId }`.
## Resources
| URI | Description |
|-----|-------------|
| `supermemory://profile` | User profile with stable preferences and recent activity |
| `supermemory://projects` | List of available memory projects |
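Resources are fetched with the standard MCP `resources/read` method. For example, reading the profile resource:

```json
{
  "jsonrpc": "2.0",
  "id": 3,
  "method": "resources/read",
  "params": {
    "uri": "supermemory://profile"
  }
}
```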
## Prompts
### `context`
Inject user profile and preferences as system context for AI conversations. Returns a formatted message with the user's stable preferences and recent activity.
In Cursor and Claude Code, you can invoke this prompt by typing `/context`, which gives the model just enough context to use and query Supermemory effectively.
**Purpose:** Unlike the `recall` tool (which searches for specific information) or the `profile` resource (which returns raw data), the `context` prompt provides a pre-formatted system message designed for context injection at the start of conversations.
| Parameter | Type | Required | Description |
|-----------|------|----------|-------------|
| `containerTag` | string | No | Project tag to scope the profile (max 128 chars) |
| `includeRecent` | boolean | No | Include recent activity in the profile. Default: `true` |
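Prompts are retrieved with the standard MCP `prompts/get` method; note that the MCP spec passes prompt argument values as strings. An illustrative request (the `containerTag` value is a placeholder):

```json
{
  "jsonrpc": "2.0",
  "id": 4,
  "method": "prompts/get",
  "params": {
    "name": "context",
    "arguments": {
      "containerTag": "your-project-id",
      "includeRecent": "true"
    }
  }
}
```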
**Output format:**
- Includes instructions to save new memories using the `memory` tool
- **Stable Preferences:** Long-term user facts and preferences
- **Recent Activity:** Recent interactions and context (when `includeRecent` is `true`)
- Fallback message when no profile exists yet
**When to use:**
- Use `context` prompt for automatic system context injection at conversation start
- Use `recall` tool when you need to search for specific information
- Use `profile` resource when you need raw profile data for custom processing
<Card title="MCP Server Source Code" icon="github" href="https://github.com/supermemoryai/supermemory/tree/main/apps/mcp">
View the open-source implementation
</Card>