
How to Automate Bitrix24 Chat Support?

Turn your Bitrix24 open chats into a smart support channel that answers from your company documents. It greets new chats, handles messages, and replies with helpful, sourced answers. Great for teams that want faster replies without adding more agents.

Incoming events land on a secure webhook, which checks tokens and routes each event to the right path for message, join, install, or delete. An install event registers the bot and sets its welcome and message handlers. Files are pulled from storage, split into readable chunks, embedded with a local model, and saved in Qdrant. When a user sends a question, the flow retrieves the most relevant chunks and builds a clear answer with a chat model. The reply is posted back to the same chat, and success and error responses keep the integration stable.
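The validate-and-route step can be sketched as a plain function. The event names below are Bitrix24's standard bot events; the token value and handler names are illustrative placeholders, not values from the template:

```python
# Sketch of the webhook's validate-and-route step.
# EXPECTED_TOKEN and the handler names are hypothetical; the event names
# are Bitrix24's standard bot events.

EXPECTED_TOKEN = "my-app-token"  # assumed value; set per your Bitrix24 app

ROUTES = {
    "ONIMBOTMESSAGEADD": "handle_message",   # user sent a chat message
    "ONIMBOTJOINCHAT": "send_welcome",       # bot joined a chat
    "ONAPPINSTALL": "register_bot",          # app installed, register handlers
    "ONIMBOTDELETE": "cleanup",              # bot removed
}

def route_event(payload: dict) -> dict:
    """Validate the application token, then pick the branch for the event."""
    token = payload.get("auth", {}).get("application_token")
    if token != EXPECTED_TOKEN:
        return {"status": "error", "reason": "invalid token"}
    handler = ROUTES.get(payload.get("event", ""))
    if handler is None:
        return {"status": "error", "reason": "unknown event"}
    return {"status": "ok", "handler": handler}
```

In the template, the same logic lives in the Validate Token and Route Event nodes rather than in code.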

Setup needs Bitrix24, Qdrant, Gemini, and an Ollama host. Expect faster first response, lower agent load, and consistent answers from approved content. Ideal for support teams with many FAQs, onboarding guides, or policy docs. Start by pointing Bitrix24 events to the provided webhook URL, connect Qdrant and Gemini credentials, and choose which files to index.

What are the key features?

  • Webhook entry point that validates tokens and routes each event
  • Automatic bot registration for message and welcome events
  • Message processing that retrieves the best matching content from Qdrant
  • Document ingestion that lists folders, downloads files, and moves them after processing
  • Text splitting and default data loading for clean, searchable chunks
  • Embeddings created with an Ollama model and stored in Qdrant
  • Answer generation using a chat model and a question and answer chain
  • Clear success and error webhook responses for stable integrations
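The ingestion features above hinge on splitting each document into overlapping chunks before embedding. A minimal sliding-window splitter, assuming simple character-based chunks (the n8n text-splitter node exposes comparable chunk-size and overlap settings), might look like:

```python
def split_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks that overlap slightly,
    so sentences cut at a boundary still appear whole in one chunk."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    step = chunk_size - overlap  # advance less than a full chunk each time
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break  # last chunk reached the end of the text
    return chunks
```

Each chunk is then embedded with the Ollama model and upserted into the Qdrant collection.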

What are the benefits?

  • Reduce manual replies from hours per day to minutes
  • Handle up to 10 times more chat volume without extra staff
  • Improve answer accuracy by using approved files as the source
  • Cut first response time from minutes to seconds
  • Connect chat, knowledge, and AI in one workflow

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need accounts with Bitrix24, Qdrant, Google Gemini, and Ollama. See the Tools Required section below for links to create accounts with these services.
  3. Open the Webhook node labeled Bitrix24 Handler and copy the Production URL. Keep this URL ready for your Bitrix24 app settings.
  4. In your Bitrix24 admin, create or edit a custom app or bot for open channels. Set the event handler to the Webhook URL you copied so Bitrix24 can send message, join, install, and delete events.
  5. In n8n Cloud, double-click the Google Gemini Chat Model node, choose Create new credential, and follow the on-screen steps to add your API key from your Google AI Studio account.
  6. Double-click the Qdrant Vector Store nodes and create Qdrant credentials. Enter your Qdrant host URL and API key. Pick or create a collection name for your knowledge base.
  7. Double-click the Embeddings Ollama nodes. Set the base URL of your Ollama host and the name of a model you have downloaded. Make sure your Ollama service is reachable from n8n.
  8. Open the Register Bot and Send Message HTTP Request nodes. Confirm that the domain and auth fields are mapped from incoming Bitrix24 events as shown. No extra credentials are needed if Bitrix24 sends auth tokens.
  9. Configure the storage listing and download steps. Check the folder paths or API endpoints in the Get a list of available storages and related nodes so they point to the files you want to index.
  10. Run the workflow once and trigger an install event from Bitrix24. Confirm you see a Success Response. The bot should register and set its event handlers.
  11. Add a few test files to your source folder. Execute the ingestion path to download, split, embed, and store them in Qdrant. Confirm that files move to the processed folder after indexing.
  12. Send a test question in your Bitrix24 open channel. Check that the workflow retrieves relevant chunks from Qdrant and posts a clear answer back to the chat.
  13. If messages are not posting, review the Validate Token and Route Event branches. Verify that the incoming event names match the switch rules and that the auth token is present.
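Once everything is wired up, the message path in steps 12 and 13 boils down to: embed the question, fetch the nearest chunks, and prompt the chat model with them. A toy in-memory version of that retrieve-and-prompt step is sketched below; in the real workflow Qdrant performs the similarity search server-side, and the prompt wording here is illustrative:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_vec: list[float], store: list[tuple[str, list[float]]], k: int = 3) -> list[str]:
    """store holds (chunk_text, embedding) pairs, as Qdrant would."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Assemble a grounded prompt for the chat model from retrieved chunks."""
    context = "\n---\n".join(chunks)
    return (
        "Answer the support question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )
```

If the retrieved chunks look irrelevant during testing, revisit the chunk size and the collection you indexed into before blaming the chat model.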

Tools Required

n8n

n8n Cloud costs $24 / mo, or $20 / mo billed annually. The local, self-hosted n8n Community Edition is free.

Bitrix24

Sign up

Basic: $49 / mo (billed annually, 5 users) — lowest plan with REST API access

Google Gemini

Sign up

Free tier: $0 via Gemini API; e.g., Gemini 2.5 Flash-Lite free limits 1,000 requests/day (15 RPM, 250k TPM). Paid from $0.10/1M input tokens and $0.40/1M output tokens.

Ollama

Sign up

Free tier: $0 (self-hosted local API)

Qdrant

Sign up

Free tier: $0, 1 GB free cluster (no credit card), accessible via REST/GRPC API
