author     Dhravya Shah <[email protected]>  2026-01-18 16:55:32 -0800
committer  GitHub <[email protected]>     2026-01-18 16:55:32 -0800
commit     87b361c26bf5fc16049cd2727825891aa14b8e8b (patch)
tree       c2f9f4f6223a7c1734578b772a16490ba2eb8b96 /apps/docs/vibe-coding.mdx
parent     Add Claude Code GitHub Workflow (#681) (diff)
docs changes (#678)
Co-authored-by: Claude Opus 4.5 <[email protected]>
Diffstat (limited to 'apps/docs/vibe-coding.mdx')
-rw-r--r--  apps/docs/vibe-coding.mdx  14
1 file changed, 7 insertions(+), 7 deletions(-)
diff --git a/apps/docs/vibe-coding.mdx b/apps/docs/vibe-coding.mdx
index 54b00321..bb5cd0e9 100644
--- a/apps/docs/vibe-coding.mdx
+++ b/apps/docs/vibe-coding.mdx
@@ -2,7 +2,7 @@
title: "Vibe Coding Setup"
description: "Automatic Supermemory integration using AI coding agents"
icon: "zap"
-sidebarTitle: "Automatic setup"
+sidebarTitle: "Install with AI"
---
Get your AI coding agent to integrate Supermemory in minutes. Copy the prompt below, paste it into Claude/GPT/Cursor, and let it do the work.
@@ -170,7 +170,7 @@ const { profile, searchResults } = await client.profile({
const context = `
Static facts: ${profile.static.join('\n')}
Recent context: ${profile.dynamic.join('\n')}
-${searchResults ? `Memories: ${searchResults.results.map(r => r.content).join('\n')}` : ''}
+${searchResults ? `Memories: ${searchResults.results.map(r => r.memory).join('\n')}` : ''}
`
// Send to LLM
@@ -180,7 +180,7 @@ const messages = [
]
// After LLM responds:
-await client.memories.add({
+await client.add({
content: `user: ${userMessage}\nassistant: ${response}`,
containerTag: userId
})
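The two hunks above change the profile flow: search results expose a `memory` field rather than `content`, and conversations are stored with `client.add` instead of `client.memories.add`. A minimal sketch of the updated context assembly, using hypothetical sample data in place of a real `client.profile()` response (only the field names `static`, `dynamic`, and `memory` come from the diff; the values are made up):

```typescript
// Hypothetical stand-in for a client.profile() response.
type SearchResult = { memory: string }

const profile = {
  static: ["Prefers TypeScript", "Works at Acme"],
  dynamic: ["Currently debugging the auth flow"],
}
const searchResults = {
  results: [{ memory: "Uses pnpm workspaces" }] as SearchResult[],
}

// Build the LLM context string; note `r.memory`, not `r.content`.
const context = `
Static facts: ${profile.static.join("\n")}
Recent context: ${profile.dynamic.join("\n")}
${searchResults ? `Memories: ${searchResults.results.map(r => r.memory).join("\n")}` : ""}
`
```

The same shape then feeds the store step (`client.add({ content, containerTag })`) once the LLM has responded.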
@@ -202,7 +202,7 @@ const results = await client.search({
})
// Build context
-const context = results.results.map(r => r.content).join('\n')
+const context = results.results.map(r => r.memory || r.chunk).join('\n')
// Send to LLM with context
const messages = [
@@ -211,7 +211,7 @@ const messages = [
]
// Store the conversation
-await client.memories.add({
+await client.add({
content: `user: ${userMessage}\nassistant: ${response}`,
containerTag: userId
})
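The search-flow hunks apply the same rename, with one extra wrinkle: `r.memory || r.chunk` falls back to `chunk` for rows that carry chunk-level text instead of a memory. A small sketch with hypothetical result rows (the field names come from the diff; the row contents are invented):

```typescript
// Some rows carry a distilled `memory`, others a raw document `chunk`.
type Row = { memory?: string; chunk?: string }

// Hypothetical stand-in for a client.search() response.
const results = {
  results: [
    { memory: "User prefers dark mode" },
    { chunk: "From docs: deploy with wrangler deploy" },
  ] as Row[],
}

// Prefer the memory text, fall back to the chunk text.
const context = results.results.map(r => r.memory || r.chunk).join("\n")
```

Without the fallback, chunk-only rows would render as `undefined` in the context string.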
@@ -372,11 +372,11 @@ The skill asks questions interactively and generates code for your specific setu
Manual integration guide
</Card>
- <Card title="User Profiles" icon="user" href="/user-profiles/overview">
+ <Card title="User Profiles" icon="user" href="/concepts/user-profiles">
Deep dive into profiles
</Card>
- <Card title="Search API" icon="search" href="/search/overview">
+ <Card title="Search API" icon="search" href="/search">
Search modes and parameters
</Card>