🧠 Conversation Memory
Store, retrieve, and clear multi-turn conversation history for AI workflows.
Overview
The Conversation Memory node enables multi-turn conversations in your workflows by persisting chat history across executions. Store user and assistant messages, retrieve recent context for AI prompts, and auto-summarize older messages to keep token usage efficient.
Actions
- Retrieve — fetch recent messages and an optional summary for a conversation. The output includes a messages array, summary text, and the total message count.
- Store — save a message with a role (user, assistant, or system) and content. Supports variable interpolation for dynamic content.
- Clear — permanently delete all messages and summaries for a conversation. Use when a conversation should be reset.
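The three actions map to simple operations on a per-conversation message store. A minimal in-memory sketch (the real node persists history server-side; the class and method names here are illustrative, not the node's actual API):

```python
class ConversationMemory:
    """Illustrative in-memory stand-in for the Conversation Memory node."""

    def __init__(self):
        self._threads = {}    # conversation_id -> list of {"role", "content"}
        self._summaries = {}  # conversation_id -> summary of older messages

    def store(self, conversation_id, role, content):
        # Save one message with a role of user, assistant, or system.
        assert role in ("user", "assistant", "system")
        self._threads.setdefault(conversation_id, []).append(
            {"role": role, "content": content}
        )

    def retrieve(self, conversation_id, recent=10, include_summary=True):
        # Fetch the last `recent` messages, the summary, and the total count.
        messages = self._threads.get(conversation_id, [])
        summary = self._summaries.get(conversation_id, "") if include_summary else ""
        return {"messages": messages[-recent:], "summary": summary, "count": len(messages)}

    def clear(self, conversation_id):
        # Permanently delete all messages and summaries for the conversation.
        self._threads.pop(conversation_id, None)
        self._summaries.pop(conversation_id, None)
```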
Configuration
- Action — the action to perform: retrieve, store, or clear
- Conversation ID — a unique identifier for the conversation (supports {{variables}}, e.g. {{trigger.from}} for SMS threads)
- Recent Messages — the number of recent messages to retrieve (default: 10)
- Include Summary — whether to include an auto-generated summary of older messages
- Output Variable — the variable name under which retrieved memory is stored (default: memory)
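Assuming the settings are expressed as JSON with camelCase field names (an assumption for illustration; only the field meanings come from the list above), a retrieve configuration for an SMS thread might look like:

```json
{
  "action": "retrieve",
  "conversationId": "{{trigger.from}}",
  "recentMessages": 10,
  "includeSummary": true,
  "outputVariable": "memory"
}
```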
AI Prompt Integration
The AI Prompt node has built-in memory support. Enable "Conversation memory" in the AI Prompt config to automatically load history, send it with your prompt, and store both the user message and assistant response — without needing separate Memory nodes.
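What this setting automates can be sketched as: load recent history, send it ahead of the new user message, then persist both sides of the turn. A minimal sketch, where `history` is a plain dict standing in for the persisted store and `call_llm` is a hypothetical model call (neither is the node's real API):

```python
def prompt_with_memory(history, conversation_id, user_message, call_llm, recent=10):
    # Load recent history for this conversation (empty list on first turn).
    thread = history.setdefault(conversation_id, [])
    # Send the recent window plus the new user message to the model.
    messages = thread[-recent:] + [{"role": "user", "content": user_message}]
    reply = call_llm(messages)  # hypothetical LLM call
    # Persist both the user message and the assistant response
    # so the next execution sees the full exchange.
    thread.append({"role": "user", "content": user_message})
    thread.append({"role": "assistant", "content": reply})
    return reply
```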
Common Use Cases
- SMS chatbot — use {{trigger.from}} as the conversation ID so each phone number maintains its own thread
- Support agent — persist context across webhook calls so the AI remembers prior questions and answers
- Long-running conversations — enable auto-summarization to compress older messages and stay within LLM context limits
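The auto-summarization in the last use case keeps token usage bounded by folding messages older than the recent window into a running summary. A rough sketch of the idea, with `summarize` standing in for an LLM summarization call (names and shapes are illustrative):

```python
def compact(messages, summary, summarize, keep_recent=10):
    """Fold messages older than the recent window into the summary."""
    if len(messages) <= keep_recent:
        return messages, summary  # nothing old enough to compress
    older, recent = messages[:-keep_recent], messages[-keep_recent:]
    # Flatten the older turns into text and merge with any prior summary.
    text = "\n".join(f"{m['role']}: {m['content']}" for m in older)
    new_summary = summarize((summary + "\n" if summary else "") + text)
    # Only the recent window is kept verbatim for the next prompt.
    return recent, new_summary
```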