| field | value | |
|---|---|---|
| author | Prasanna <[email protected]> | 2026-01-10 15:19:31 -0800 |
| committer | GitHub <[email protected]> | 2026-01-10 15:19:31 -0800 |
| commit | d015036b05133a0a836db51e1fd7157120947302 | |
| tree | da3cfdd2f1fe2d0a6e6bbd9be7bfd360f9889d1d /packages/pipecat-sdk-python/Agents.md | |
| parent | docs: add S3 connector documentation (#657) | |
pipecat-sdk (#663)
Diffstat (limited to 'packages/pipecat-sdk-python/Agents.md')
| -rw-r--r-- | packages/pipecat-sdk-python/Agents.md | 72 |
1 file changed, 72 insertions, 0 deletions
diff --git a/packages/pipecat-sdk-python/Agents.md b/packages/pipecat-sdk-python/Agents.md
new file mode 100644
index 00000000..0d4f687c
--- /dev/null
+++ b/packages/pipecat-sdk-python/Agents.md
@@ -0,0 +1,72 @@

# AGENTS.md

## Overview

This package adds persistent memory to Pipecat voice AI pipelines using Supermemory.

**Tech Stack:** Python >=3.10, Pipecat, Supermemory SDK

## Commands

```bash
pip install supermemory-pipecat
```

## Integration Pattern

Place `SupermemoryPipecatService` between the context aggregator and the LLM in the pipeline:

```python
from pipecat.pipeline.pipeline import Pipeline

from supermemory_pipecat import SupermemoryPipecatService

# transport, stt, llm, tts, and context_aggregator come from a
# standard Pipecat setup and are constructed elsewhere.
memory = SupermemoryPipecatService(
    user_id="user-123",        # Required: identifies the user
    session_id="session-456",  # Optional: groups conversations
)

pipeline = Pipeline([
    transport.input(),
    stt,
    context_aggregator.user(),
    memory,  # <- Memory service here
    llm,
    tts,
    transport.output(),
    context_aggregator.assistant(),
])
```

## Configuration

```python
memory = SupermemoryPipecatService(
    api_key="...",  # Or use SUPERMEMORY_API_KEY env var
    user_id="user-123",
    session_id="session-456",
    params=SupermemoryPipecatService.InputParams(
        search_limit=10,       # Max memories to retrieve
        search_threshold=0.1,  # Similarity threshold 0.0-1.0
        mode="full",           # "profile" | "query" | "full"
        system_prompt="Based on previous conversations:\n\n",
    ),
)
```

## Memory Modes

| Mode | Retrieves | Use When |
|------|-----------|----------|
| `"profile"` | User profile only | Personalization without search |
| `"query"` | Search results only | Finding relevant past context |
| `"full"` | Profile + search | Complete memory (default) |

## Environment Variables

- `SUPERMEMORY_API_KEY` - Supermemory API key
- `OPENAI_API_KEY` - For OpenAI services (STT/LLM/TTS)

## Boundaries

- Always place the memory service after `context_aggregator.user()` and before `llm`
- Always provide `user_id` - it is required
- Never hardcode API keys in code - use environment variables
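
The last boundary can be sketched with the standard library alone. Note that `resolve_api_key` below is a hypothetical helper for illustration, not part of the SDK; the actual `SupermemoryPipecatService` constructor may handle the env-var fallback differently.

```python
import os


def resolve_api_key(explicit_key=None):
    """Return an explicit key if one was passed, otherwise fall back to
    the SUPERMEMORY_API_KEY environment variable. Raises if neither is set,
    so a missing key fails loudly at startup rather than mid-call."""
    key = explicit_key or os.environ.get("SUPERMEMORY_API_KEY")
    if not key:
        raise ValueError(
            "No API key: set SUPERMEMORY_API_KEY or pass api_key explicitly"
        )
    return key
```

The same pattern applies to `OPENAI_API_KEY` for the STT/LLM/TTS services.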