| Commit message (Collapse) | Author | Age | Files | Lines | |
|---|---|---|---|---|---|
| * | Support for conversations in SDKs (#618) | Dhravya | 2025-12-20 | 2 | -0/+2 |
| | | |||||
| * | feat(@supermemory/tools): capture assistant responses with filtered memory (#539) | MaheshtheDev | 2025-10-28 | 2 | -6/+36 |
| | **Added streaming support to the Supermemory middleware and improved memory handling in the AI SDK integration.**<br><br>What changed?<br>- Refactored the middleware architecture to support both streaming and non-streaming responses<br>- Extracted memory prompt functionality into a separate module (`memory-prompt.ts`)<br>- Added memory saving capability for streaming responses<br>- Improved the formatting of memory content with a "User Supermemories:" prefix<br>- Added a utility function to filter out supermemories from content<br>- Created a new streaming example in the test app with a dedicated route and page<br>- Updated version from 1.3.0 to 1.3.1 in package.json<br>- Simplified installation instructions in README.md | | | | |
| * | feat: withSupermemory for openai sdk (#531) | MaheshtheDev | 2025-10-27 | 1 | -0/+31 |
| | **TL;DR** Added OpenAI SDK middleware support for Supermemory integration, allowing direct memory injection without an AI SDK dependency.<br><br>What changed?<br>- Added `withSupermemory` middleware for the OpenAI SDK that automatically injects relevant memories into chat completions<br>- Implemented memory search and injection functionality for OpenAI clients<br>- Restructured the OpenAI module to separate tools and middleware functionality<br>- Updated the README with comprehensive documentation and examples for the new OpenAI middleware<br>- Added a test implementation with a Next.js API route example<br>- Reorganized package exports to support the new structure | | | | |
| * | chat app withSupermemory test | Mahesh Sanikommmu | 2025-10-22 | 1 | -0/+23 |
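The commits above describe two pieces of memory handling: formatting retrieved memories into the prompt under a "User Supermemories:" prefix, and a utility that filters that injected block back out of content before responses are stored. A minimal sketch of that idea, with illustrative names and shapes (`withMemories`, `filterSupermemories`, and the message type are assumptions, not the package's actual API):

```typescript
// Hypothetical sketch of memory injection and filtering, based on the
// commit descriptions. Function and type names are illustrative only.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

const MEMORY_PREFIX = "User Supermemories:";

// Prepend a system message carrying the retrieved memories, so the model
// sees them as context without the caller rewriting their own messages.
function withMemories(messages: ChatMessage[], memories: string[]): ChatMessage[] {
  if (memories.length === 0) return messages;
  const block = `${MEMORY_PREFIX}\n${memories.map((m) => `- ${m}`).join("\n")}`;
  return [{ role: "system", content: block }, ...messages];
}

// Strip the injected memory block (prefix through the next blank line)
// so that saved content does not re-store the memories themselves.
function filterSupermemories(content: string): string {
  const start = content.indexOf(MEMORY_PREFIX);
  if (start === -1) return content;
  const end = content.indexOf("\n\n", start);
  const rest = end === -1 ? "" : content.slice(end + 2);
  return (content.slice(0, start) + rest).trim();
}
```

In a middleware like `withSupermemory`, the injection step would run before each chat-completion call and the filter would run over content captured for storage; the actual package wraps this around the OpenAI client rather than exposing these helpers directly.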