
How to Automate Etsy Product Research?

Collect product listings from Etsy search pages, turn them into clean JSON with AI, and send results to a webhook while also saving a file. Great for marketing and ecommerce teams that need fast product research and simple reporting. Useful for trend checks, price tracking, and catalog building without manual copy and paste.

The flow starts on a manual run. A Set node defines the Etsy search URL and the proxy zone. Bright Data Web Unlocker fetches the first page. A Google Gemini model reads that page and extracts the pagination, producing links for the remaining result pages. Split Out and Split In Batches loop through these pages. Bright Data fetches each page, then an Information Extractor uses the model to pull fields such as name, image, price, and link into JSON. A webhook call posts a summary, and a file writer saves the content. There is an optional OpenAI path if you prefer that model.

You need a Bright Data token and zone, a Google AI or OpenAI key, and a Webhook.site URL. Update the search query and zone values, then choose one model path and disable the other. Expect a clear product list in minutes and less manual work. Helpful for keyword testing, competitive checks, and building small research datasets for ads or merchandising.
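To make the fetch step concrete, here is a sketch of the request the Bright Data HTTP Request nodes send. The endpoint and field names follow Bright Data's Web Unlocker API, but the zone name (web_unlocker1) and token are placeholders — verify both against your own Bright Data account:

```javascript
// Sketch of the Web Unlocker call made for each Etsy page.
// Assumptions: api.brightdata.com/request endpoint, zone "web_unlocker1",
// and a placeholder token -- replace all three with your own values.
function buildUnlockerRequest(searchUrl, zone, token) {
  return {
    method: "POST",
    url: "https://api.brightdata.com/request",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    // format: "raw" asks for the page HTML rather than a JSON wrapper
    body: JSON.stringify({ zone, url: searchUrl, format: "raw" }),
  };
}

const req = buildUnlockerRequest(
  "https://www.etsy.com/search?q=ceramic+mug&page=1",
  "web_unlocker1",
  "BRIGHT_DATA_TOKEN"
);
```

The same body shape is what step 4 below refers to when it says the zone value in the request body must match your Bright Data zone.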

What are the key features?

  • Manual run for safe testing before scaling up
  • Search query control using a Set node to define URL and zone
  • Bright Data Web Unlocker calls to fetch HTML reliably
  • AI model extracts pagination to build a list of result pages
  • Split Out and Split In Batches loop through each page
  • LLM based Information Extractor converts HTML to clean JSON items
  • Webhook notification sends a summary field to an external endpoint
  • Create Binary and File Writer store the scraped content to disk
  • Optional switch between Google Gemini and OpenAI for extraction
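The clean JSON items and the summary field mentioned above might look like the sketch below. Only the field names (name, image, price, link) come from this listing; the values and the exact summary format are illustrative, and the real output depends on your Information Extractor schema and prompt:

```javascript
// Illustrative extracted listings -- field names match the listing above,
// values are made up for the example.
const items = [
  {
    name: "Handmade Ceramic Mug",
    image: "https://i.etsystatic.com/il/abc123/mug.jpg",
    price: "USD 24.00",
    link: "https://www.etsy.com/listing/123456789",
  },
  {
    name: "Linen Tote Bag",
    image: "https://i.etsystatic.com/il/def456/tote.jpg",
    price: "USD 18.50",
    link: "https://www.etsy.com/listing/987654321",
  },
];

// A summary field of the kind posted to the webhook endpoint.
function buildSummary(items) {
  return `Extracted ${items.length} Etsy listings; first: ${items[0].name}`;
}
```

A consistent shape like this is what makes the extracted data easy to archive or feed into other tools downstream.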

What are the benefits?

  • Reduce manual product research from 3 hours to 10 minutes
  • Automate 90 percent of copy and paste work for product lists
  • Improve data completeness by extracting consistent fields
  • Handle dozens of result pages per run without extra effort
  • Send a live summary to any webhook endpoint for quick reviews
  • Save a local file you can archive or feed into other tools

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need accounts with Bright Data, Webhook.site, and either Google AI or OpenAI (only one model path is used). See the Tools Required section below for links to create accounts with these services.
  3. In the n8n credentials manager, create a Bright Data credential using HTTP Header Auth. Add an Authorization header with your API token. Name the credential clearly, for example BrightData Web Unlocker.
  4. Open the HTTP Request nodes that call Bright Data and select your Bright Data credential in the Credential to connect with dropdown. Confirm the zone value in the body matches your Bright Data zone.
  5. For Google AI, create an API key in your Google AI account. In n8n, open the Google Gemini nodes, click Create new credential, and paste the API key. Save and test the connection.
  6. For OpenAI, create an API key in your OpenAI account. In n8n, open the OpenAI Chat Model node, click Create new credential, and paste the key. Save and test the connection.
  7. In the Set Etsy Search Query node, change the url field to your Etsy search and page number. If needed, add your Bright Data zone field so the first request uses the correct zone.
  8. Open the HTTP Request node named Initiate a Webhook Notification for the extracted data and replace the URL with your own Webhook.site URL. Keep the summary field in the body.
  9. Choose one AI path. Keep the Google Gemini nodes enabled or switch to the OpenAI path and disable the Gemini path. This prevents double extraction.
  10. Click Test workflow. Watch the execution. You should see pagination extracted, multiple pages fetched, and an item list created. Check Webhook.site for a new request with the summary field.
  11. Open the Write the scraped content to disk node output to confirm a file was written. If it fails, set a valid file path and try again.
  12. Troubleshooting: If Bright Data returns errors, verify the Authorization header and zone. If no items are extracted, review the Information Extractor schema example and adjust the prompt. If rate limits appear, reduce batch size in Split In Batches.
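As a mental model for the pagination step the workflow performs, here is a minimal sketch (plain JavaScript, not taken from the template) that turns the page count extracted from page 1 into the list of URLs the Split Out / Split In Batches loop visits. The "page" query parameter matches Etsy search URLs:

```javascript
// Given the base search URL and the page count the model extracts from
// the first page, build the full list of result-page URLs to loop over.
function buildPageUrls(baseUrl, pageCount) {
  const urls = [];
  for (let page = 1; page <= pageCount; page++) {
    const u = new URL(baseUrl);
    u.searchParams.set("page", String(page));
    urls.push(u.toString());
  }
  return urls;
}

const pages = buildPageUrls("https://www.etsy.com/search?q=ceramic+mug", 3);
// pages[2] ends with "page=3"
```

If rate limits appear (step 12), shrinking the batch size in Split In Batches simply means fewer of these URLs are fetched concurrently.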

Tools Required

n8n

$24 / mo, or $20 / mo billed annually, to use n8n in the cloud. The local or self-hosted n8n Community Edition is free.

Bright Data

Sign up

Pay as you go: $1.5 per 1K records (Web/LinkedIn Scraper API)

Google AI

Sign up

Free tier: $0 (Gemini API with rate limits). Lowest paid option: Gemini 2.5 Flash-Lite at $0.10 per 1M input tokens and $0.40 per 1M output tokens.

OpenAI

Sign up

Pay-as-you-go: GPT-5 at $1.25 per 1M input tokens and $10 per 1M output tokens

Webhook.site

Sign up

Free tier: $0, public API available; free URLs expire after 7 days and accept up to 100 requests
