How to Automate Bright Data Product Data Capture?

Collect structured product data at scale without manual scraping. Ideal for ecommerce teams, data analysts, and engineers who need fresh web data for pricing, catalog checks, or market research.

The flow starts with a manual test run. It sets your dataset id and request URL, then calls Bright Data to create a snapshot. A status loop checks progress every 30 seconds using a Wait node and an If branch. When the snapshot is ready, the workflow downloads it as JSON, combines the items into one output, sends a quick webhook notification with a sample record, and saves the full payload to a file for archiving. Credentials use header auth to keep API access secure.
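The status loop described above can be sketched as a small polling helper. This is a minimal sketch, not the workflow itself: `get_status` stands in for the Check Snapshot Status HTTP call, and the `ready` and `failed` state names are assumptions about the API response, not confirmed values.

```python
import time

def poll_until_ready(get_status, interval_seconds=30, max_attempts=60,
                     sleep=time.sleep):
    """Poll get_status() until the snapshot is ready for download.

    get_status is a callable standing in for the Check Snapshot Status
    request; the 'ready'/'failed' state names are assumed, not confirmed.
    """
    for _ in range(max_attempts):
        status = get_status()
        if status == "ready":
            return True
        if status == "failed":
            raise RuntimeError("snapshot failed")
        sleep(interval_seconds)  # mirrors the workflow's 30-second Wait node
    raise TimeoutError("snapshot not ready after polling limit")
```

In the workflow, the same logic is spread across the Wait node and the If node that routes back to the status check until it passes.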

You will need a Bright Data account and API key, a valid dataset id, and a request URL. Replace the sample webhook with your own endpoint. Expect faster research cycles and less hands-on effort, with clean data delivered in a reusable file and a simple QA signal by webhook. Great for price tracking, catalog audits, and marketplace monitoring across many product pages.

What are the key features?

  • Manual test trigger lets you run and verify the flow safely
  • Pre-set fields for dataset id and request URL for easy setup
  • HTTP requests with header auth to start and track a snapshot
  • Status polling every 30 seconds using a Wait and If loop
  • Error check gate so only valid states move to download
  • Download the snapshot as JSON via the format query parameter
  • Aggregate all items into one payload for downstream use
  • Send a webhook with a sample record for fast quality checks
  • Create binary data and write a file to disk for archiving
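The aggregate and notify features in the list above amount to packaging the downloaded records into one payload and picking a single sample for the webhook. A hedged sketch, assuming the snapshot download yields a list of JSON product records (the field names used are illustrative):

```python
def package_snapshot(records):
    """Combine all items into one payload and pick a sample record.

    Mirrors the aggregate step (one payload for downstream use) and the
    webhook notification (a single sample record for a fast quality check).
    """
    payload = {"total_records": len(records), "items": records}
    sample = records[0] if records else None
    return payload, sample
```

Downstream systems receive the full payload from the saved file, while the webhook carries just enough data to spot-check quality without shipping the whole snapshot.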

What are the benefits?

  • Reduce manual scraping from 4 hours to 10 minutes per run
  • Automate 90 percent of collecting and packaging product data
  • Improve freshness by pulling new snapshots directly from the API
  • Handle large catalogs with a safe polling loop that waits until ready
  • Connect Bright Data output to any system with a simple webhook

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need accounts with Bright Data and Webhook.site. See the Tools Required section below for links to create accounts with these services.
  3. In your Bright Data dashboard, create an API key on the API or tokens page. Copy the key and keep it secure.
  4. In the n8n credentials manager, create a new HTTP Header Auth credential. Name it Bright Data API. Add the header name and token format provided by your Bright Data account, or click Create new credential in a Bright Data HTTP Request node and follow the on-screen steps.
  5. Open the Set Dataset Id, Request URL node. Paste your dataset id from Bright Data and the request URL that defines what you want to scrape.
  6. Open the HTTP Request to the specified URL node. Select the Bright Data API credential you created. Keep the configured method and endpoint unless your Bright Data setup requires changes.
  7. Open the Set Snapshot Id node. Confirm the snapshot_id field maps from the previous HTTP response field that holds the snapshot identifier.
  8. Open the Check Snapshot Status node. Confirm the URL uses the snapshot_id expression and that the same Bright Data API credential is selected.
  9. Open the Wait node if you expect large jobs and adjust the wait time. Longer waits reduce API calls and avoid rate limits.
  10. Open the Initiate a Webhook Notification node and replace the webhook.site URL with your own endpoint if needed. You can keep webhook.site for testing and switch later.
  11. Open the Write the file to disk node. Set a file name and a folder path that your n8n host can write to. Use a path that exists on the server.
  12. Click Execute Workflow to run a test. Watch the status loop. When ready, confirm the JSON file is saved and that your webhook receives the sample record.
  13. If the snapshot_id is empty, verify your dataset id and request URL. If you see 401 or 403 errors, update the API key in credentials. If downloads time out, raise the timeout in the Download Snapshot node options.
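As a sanity check for steps 4 to 6, the request the workflow sends to start a snapshot can be sketched with the standard library. This is a sketch under stated assumptions: the endpoint path, query parameter names, and Bearer-token header format are guesses at Bright Data's Datasets API, so confirm them against the API reference in your Bright Data dashboard.

```python
import json
import urllib.parse
import urllib.request

def build_trigger_request(api_key, dataset_id, request_url):
    """Build (not send) the snapshot-trigger request with header auth.

    The endpoint URL and parameter names below are assumptions for
    illustration; adjust them to match your Bright Data account's docs.
    """
    query = urllib.parse.urlencode({"dataset_id": dataset_id, "format": "json"})
    body = json.dumps([{"url": request_url}]).encode("utf-8")
    return urllib.request.Request(
        f"https://api.brightdata.com/datasets/v3/trigger?{query}",
        data=body,
        headers={"Authorization": f"Bearer {api_key}",
                 "Content-Type": "application/json"},
        method="POST",
    )
```

Inspecting the built request (its headers and full URL) before sending is a quick way to verify the header auth credential and dataset id are wired up the way the HTTP Request node expects.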

Tools Required

n8n

$24 / mo or $20 / mo billed annually to use n8n in the cloud. However, the local or self-hosted n8n Community Edition is free.

Bright Data

Sign up

Pay as you go: $1.50 per 1K records (Web/LinkedIn Scraper API)

Webhook.site

Sign up

Free tier: $0, public API available; free URLs expire after 7 days and accept up to 100 requests

Join Futurise to access 1,200+ automation templates

Get instant access to ready-made automation workflows for n8n, Make.com, AI agents, and more. Download, customise, and deploy in minutes.