Give your site or app a smart chat assistant without heavy setup. Incoming messages are answered by an AI model you choose through OpenRouter, and a short chat history is remembered per user. It fits customer support, pre-sales questions, and internal help.
A chat event starts the flow the moment a message arrives. A Settings step stores a default model ID, and you can pass a model value at runtime to override it. The LLM Model node connects to OpenRouter with your API key, while Chat Memory keeps a small window of context keyed by a sessionId, so the bot remembers the last few turns per user. The AI Agent sends the prompt text to the selected model and returns a clear reply. Because the model is just a setting, you can swap models quickly to compare quality, tone, and cost without changing the flow.
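The per-user memory described above can be sketched as a sliding window keyed by sessionId. This is an illustrative model of the behavior, not n8n's actual Chat Memory internals; the `ChatMemory` class and `windowSize` parameter are assumed names.

```javascript
// Illustrative sketch of windowed chat memory: messages are grouped by
// sessionId and only the most recent turns are kept per session.
class ChatMemory {
  constructor(windowSize = 6) {
    this.windowSize = windowSize; // number of messages remembered per session
    this.sessions = new Map();    // sessionId -> array of {role, content}
  }
  add(sessionId, role, content) {
    const history = this.sessions.get(sessionId) ?? [];
    history.push({ role, content });
    // Drop the oldest turns once the window is full
    this.sessions.set(sessionId, history.slice(-this.windowSize));
  }
  get(sessionId) {
    return this.sessions.get(sessionId) ?? [];
  }
}

const memory = new ChatMemory(4);
["hi", "what are your hours?", "and on weekends?", "thanks", "bye"].forEach(
  (msg, i) => memory.add("user-42", i % 2 === 0 ? "user" : "assistant", msg)
);
console.log(memory.get("user-42").length); // 4: only the last four turns remain
```

Keeping the window small bounds token cost per request while still giving the model enough context to resolve follow-up questions like "and on weekends?".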
You will need an OpenRouter account and API key; add the credential in n8n and set your default model string. The bot can shorten first-response times and absorb routine chat volume without extra staff. Use it for website FAQs, campaign hotlines, or an internal concierge that answers routine questions.
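The runtime model override works as a simple fallback: use the model value from the incoming payload if one was passed, otherwise the default from Settings. This is a hedged sketch of that logic; `DEFAULT_MODEL`, `buildRequestBody`, and the example model strings are illustrative assumptions, not values from the workflow itself.

```javascript
// Assumed default model string, as it would be stored in the Settings step
const DEFAULT_MODEL = "openai/gpt-4o-mini";

// Build an OpenAI-style chat request body: runtime override wins, else default
function buildRequestBody(incoming, history) {
  return {
    model: incoming.model ?? DEFAULT_MODEL,
    messages: [...history, { role: "user", content: incoming.text }],
  };
}

const body = buildRequestBody({ text: "What are your hours?" }, []);
console.log(body.model); // no override passed, so the default model is used
```

Passing `{ model: "anthropic/claude-3-haiku", text: "..." }` instead would route that one message to a different model, which is what makes side-by-side comparisons cheap.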