🤖 AI Prompt
Call LLMs (OpenAI, Anthropic, Google Gemini, Ollama) with custom prompts.
Overview
The AI Prompt node calls large language models with custom prompts. Use it to generate text, analyze data, classify inputs, extract information, or perform any other LLM task within your workflows.
Supported Providers
- OpenAI — GPT-4o, GPT-4, GPT-3.5 Turbo, and other OpenAI models
- Anthropic — Claude Opus 4.6, Claude Sonnet 4.5, Claude Haiku 4.5
- Google — Gemini 2.5 Flash, Gemini 2.5 Pro
- Ollama — any self-hosted model (Llama, Mistral, etc.)
Configuration
- Provider — select OpenAI, Anthropic, Google, or Ollama
- Credential — select the API key credential for your chosen provider
- Model — choose the specific model (updates when you change provider)
- System Prompt — optional instructions that set the AI's behavior
- User Prompt — the main prompt with variable interpolation ({{variable}})
- Temperature — controls randomness (0 = deterministic, 1 = creative)
- Max Tokens — maximum response length
- Output Variable — name for storing the response (default: ai_response)
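The `{{variable}}` interpolation in the User Prompt can be sketched as a simple template substitution. The helper below is a hypothetical illustration (the node's real templating engine may differ): it resolves dotted paths such as `{{$json.customer_name}}` against the incoming item's JSON payload.

```python
import re

def interpolate(template: str, context: dict) -> str:
    """Replace {{path}} placeholders with values looked up in `context`.

    The conventional `$json` root refers to the incoming item's JSON
    payload; dotted segments descend into nested dicts. Missing keys
    resolve to an empty string.
    """
    def resolve(match: re.Match) -> str:
        path = match.group(1).strip()
        parts = path.split(".")
        # Map the `$json` root onto the context dict itself.
        value = context if parts[0] == "$json" else context.get(parts[0], "")
        for part in parts[1:]:
            value = value.get(part, "") if isinstance(value, dict) else ""
        return str(value)

    return re.sub(r"\{\{(.*?)\}\}", resolve, template)

prompt = interpolate(
    "Write a follow-up email for {{$json.customer_name}}",
    {"customer_name": "Ada Lovelace"},
)
# prompt == "Write a follow-up email for Ada Lovelace"
```

The resolved string is what gets sent to the provider; the model's reply is then stored under the configured Output Variable (ai_response by default) for downstream nodes to read.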
Usage Examples
- Text generation — “Write a follow-up email for {{$json.customer_name}}”
- Data extraction — “Extract the phone number from: {{$json.email.text}}”
- Classification — “Classify this support ticket as bug, feature, or question: {{$json.ticket_body}}”
- Summarization — “Summarize this document in 3 bullet points: {{$json.document}}”
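Under the hood, the node's settings map onto an OpenAI-style chat request that most of the supported providers accept. The sketch below assembles such a payload for the classification example above; field names follow the common chat-completion convention and are illustrative, not a guarantee of the node's internal wire format.

```python
def build_request(
    system_prompt: str,
    user_prompt: str,
    model: str,
    temperature: float = 0.0,
    max_tokens: int = 256,
) -> dict:
    """Assemble a chat-completion payload from the node's settings.

    A classification task uses temperature 0 so the label is
    deterministic; creative tasks would raise it toward 1.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_prompt})
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

payload = build_request(
    system_prompt="You are a support-ticket classifier. Reply with one word.",
    user_prompt="Classify this support ticket as bug, feature, or question: "
                "App crashes on login.",
    model="gpt-4o",
)
```

The System Prompt, when set, becomes the leading `system` message, which is why it shapes the model's behavior across every call.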
Common Use Cases
- Lead qualification — score inbound leads by analyzing their message, company, and intent using AI with context from your CRM
- Email triage — classify incoming emails by urgency and topic, draft auto-replies, and route to the right team
- Content generation — generate blog outlines, social posts, or product descriptions from structured data in your database
- Document summarization — extract key insights from long documents, contracts, or support threads into structured summaries
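For the email-triage case, the AI Prompt node typically feeds a downstream routing step. A minimal sketch of that step, assuming the model was instructed to reply with a single label (queue names here are hypothetical):

```python
def route_ticket(classification: str) -> str:
    """Map the model's one-word label (stored in ai_response) to a
    destination queue; unrecognized labels fall back to manual triage."""
    routes = {
        "bug": "engineering",
        "feature": "product",
        "question": "support",
    }
    # Normalize, since models may vary capitalization or add whitespace.
    return routes.get(classification.strip().lower(), "triage")
```

Keeping the label set small and normalizing the model's output makes this kind of branching robust to minor formatting drift in responses.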