
How to Automate YouTube Research for Marketing Insights?

Turn a chat into fast research on YouTube channels and videos. Teams can ask questions and get answers from real video data, comments, transcripts, and thumbnails. Built for content marketers and creators who need clear audience insights.

A chat message starts an AI agent that picks the right tool for the job. It uses OpenAI as the brain and calls tools to search YouTube, get channel details, list videos, pull descriptions, and collect comments. It can transcribe videos through an external service and review thumbnails with AI. A Postgres memory keeps context so follow-up questions stay on topic. A Switch node also handles direct API calls, returning structured replies when you need exact fields.

You will need API keys for the YouTube Data API, OpenAI, and Apify, plus a PostgreSQL database. Add credentials in n8n, map the input fields on each tool node, and test with one channel handle and one video ID. Expect less manual research and faster content planning, driven by what viewers actually say and do on each video.
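Before mapping fields in n8n, you can sanity-check your YouTube key and a channel handle outside the workflow. A minimal Python sketch (the key and handle values are placeholders) of roughly what the channel-lookup tool does: it resolves a handle to a channel id via the `channels.list` endpoint's `forHandle` parameter.

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://www.googleapis.com/youtube/v3/channels"

def build_channel_lookup_url(handle: str, api_key: str) -> str:
    """Build a channels.list request that resolves a handle to a channel id."""
    params = {"part": "snippet", "forHandle": handle, "key": api_key}
    return API_BASE + "?" + urllib.parse.urlencode(params)

def lookup_channel(handle: str, api_key: str) -> dict:
    """Return id, title, and description for the channel behind a handle."""
    with urllib.request.urlopen(build_channel_lookup_url(handle, api_key)) as resp:
        data = json.load(resp)
    item = data["items"][0]  # forHandle matches at most one channel
    return {
        "channel_id": item["id"],
        "title": item["snippet"]["title"],
        "description": item["snippet"]["description"],
    }
```

If the call returns an empty `items` list, the handle is wrong or the key is not enabled for YouTube Data API v3, which is the first thing to rule out before debugging the workflow itself.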

What are the key features?

  • Chat trigger that starts on incoming messages and routes them to an AI agent
  • OpenAI language model with PostgreSQL memory to keep conversation context
  • Channel lookup by handle to return channel id, title, and description
  • Video listing by channel with controls for sort order and date filters
  • Video detail fetch to get full description, duration, and publish date
  • Comment collection with pagination for deeper audience insights
  • Video transcription via an external service for long-form content analysis
  • Thumbnail review with AI to evaluate clarity and design signals
  • Switch-based routing for direct API calls and structured responses
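The video detail fetch above returns durations in ISO 8601 form (for example `PT1H2M10S` from `videos.list` with `part=contentDetails`). If you post-process results outside n8n, a small helper like this converts them to seconds for sorting or filtering; it is a sketch that assumes videos under 24 hours, so no day component.

```python
import re

# videos.list (part=contentDetails) returns durations as ISO 8601,
# e.g. "PT1H2M10S"; convert them to seconds for sorting/filtering.
_DURATION_RE = re.compile(r"^PT(?:(?P<h>\d+)H)?(?:(?P<m>\d+)M)?(?:(?P<s>\d+)S)?$")

def iso8601_duration_to_seconds(duration: str) -> int:
    match = _DURATION_RE.match(duration)
    if not match:
        raise ValueError(f"unrecognized duration: {duration!r}")
    h = int(match.group("h") or 0)
    m = int(match.group("m") or 0)
    s = int(match.group("s") or 0)
    return h * 3600 + m * 60 + s

# iso8601_duration_to_seconds("PT1H2M10S") -> 3730
```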

What are the benefits?

  • Reduce manual research from 3 hours to 10 minutes per channel
  • Automate about 80% of comment and metadata collection
  • Improve accuracy by pulling data directly from YouTube APIs
  • Handle up to 10 times more videos per week with the same team
  • Unify chat questions, video data, and database context in one place

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need accounts with YouTube Data API, OpenAI, Apify, and PostgreSQL. See the Tools Required section below for links to create accounts with these services.
  3. In Google Cloud Console, enable YouTube Data API v3. Create an API key and copy it. This key will be passed to HTTP Request nodes that call YouTube endpoints.
  4. In your OpenAI account, create an API key. Keep it secure. You will add it to the OpenAI nodes in n8n.
  5. In your Apify account, create a personal API token. This token is required by the transcription node that runs on Apify.
  6. Prepare a PostgreSQL database. Create a user with write access. Note the host, port, database name, user, password, and SSL settings.
  7. In n8n Credentials, create new credentials for each service. If unsure, double-click the relevant node, then in the 'Credential to connect with' dropdown, click 'Create new credential' and follow the on-screen instructions to integrate that service.
  8. Assign credentials: set OpenAI on the OpenAI Chat Model and AI nodes, YouTube API key on the HTTP Request nodes for channel, video, and comments, Apify token on the transcription node, and PostgreSQL on the Postgres Chat Memory node.
  9. Open the tool nodes and confirm required inputs: handle for channel lookup, channel_id for listing, video_id for details and comments, and video_url for transcription. Save each node after mapping fields.
  10. Test the chat trigger by opening the workflow’s chat view and asking: Show the last 5 videos from @handle. Check that results include titles and publish dates.
  11. Validate comments and transcription: query a known video_id, confirm comments return with authors, and run a short video transcription to verify the Apify integration.
  12. Troubleshoot common issues: if you get quota errors, reduce calls or set publishedAfter filters. For invalid handles, paste the full channel URL so the tool can parse the handle. For long videos, expect longer processing time and higher cost.
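The comment-collection step validated above pages through `commentThreads.list` using `nextPageToken`. A sketch of that pagination loop, with the HTTP call injected as `fetch_page` (a hypothetical callable standing in for the HTTP Request node) and a page cap so a single busy video cannot burn the daily quota:

```python
from typing import Callable, Dict, Iterator, Optional

def iterate_comments(
    video_id: str,
    fetch_page: Callable[[str, Optional[str]], Dict],
    max_pages: int = 10,
) -> Iterator[Dict]:
    """Walk commentThreads.list pages until nextPageToken disappears.

    fetch_page(video_id, page_token) performs the actual HTTP call (in n8n
    this is the HTTP Request node); injecting it keeps the pagination logic
    independent of credentials. max_pages caps quota usage per video.
    """
    token: Optional[str] = None
    for _ in range(max_pages):
        page = fetch_page(video_id, token)
        for item in page.get("items", []):
            snippet = item["snippet"]["topLevelComment"]["snippet"]
            yield {
                "author": snippet["authorDisplayName"],
                "text": snippet["textDisplay"],
                "likes": snippet["likeCount"],
            }
        token = page.get("nextPageToken")
        if not token:
            break
```

Each `commentThreads.list` page costs one quota unit, so a cap of 10 pages (up to 100 comments per page) usually gives enough audience signal without threatening the 10,000-unit daily limit.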

Tools Required

n8n

$24 / mo or $20 / mo billed annually to use n8n in the cloud. However, the local or self-hosted n8n Community Edition is free.

Apify

Sign up

Free plan: $0 / mo with $5 monthly platform credits; API access via token

OpenAI

Sign up

Pay-as-you-go: GPT-5 at $1.25 per 1M input tokens and $10 per 1M output tokens

PostgreSQL

Sign up

Free: $0 (open-source PostgreSQL License; self-hosted)

YouTube Data API

Sign up

Free: $0, default 10,000 units/day per project; additional quota via audit request (no paid tier)

Credits:
Made by Mark Shcherbakov from 5minAI. Setup video: YouTube video

Similar Templates

Join Futurise to access 1,200+ automation templates

Get instant access to ready-made automation workflows for n8n, Make.com, AI agents, and more. Download, customise, and deploy in minutes.