How Do You Automate SERP Tracking with ScrapingRobot?

Need a fast way to see where your pages rank on Google? This build collects search results for a list of keywords and turns them into clean, ranked data. It fits SEO teams that want simple weekly tracking and quick competitor checks.

Run it on demand from n8n. Keywords come from a simple list or your own database. Each keyword is sent to the ScrapingRobot API with a POST request using the GoogleScraper module. The flow then maps the organic results, splits each result into its own row, removes rows with empty titles, and assigns a rank number for each keyword. The HTTP Request node batches calls so large keyword sets are handled in smaller chunks.
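As a sketch, the POST payload each keyword produces looks like the following. The field names (`url`, `module`, `params.query`) are taken from the setup steps below; treat the exact response shape as an assumption until you inspect a live run.

```javascript
// Build the ScrapingRobot POST body for one keyword.
// Sent to https://api.scrapingrobot.com with the API key as a
// "token" query parameter and Accept: application/json.
function buildSerpRequest(keyword) {
  return {
    url: "https://www.google.com",   // target search engine
    module: "GoogleScraper",         // ScrapingRobot scraper module
    params: { query: keyword },      // the search query itself
  };
}
```

In the workflow, the HTTP Request node assembles this body per item, so you never write this code by hand; the sketch is just the shape to expect when debugging node output.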

Setup is straightforward. Add your ScrapingRobot API key using HTTP Query Auth with a token field, and keep the Accept header set to application/json. You can keep the built-in keyword array or replace the placeholders with your own source. Expect faster SERP reviews, steady rankings, and ready-to-export data for reports. Many teams cut manual checks from hours to minutes and avoid copy-paste mistakes while keeping a clear view of competitor positions.

What are the key features?

  • On-demand run with a manual trigger for quick tests and ad hoc checks.
  • Keyword input from a simple Array list or your own data source.
  • Batch POST requests to ScrapingRobot using the GoogleScraper module.
  • Maps organic results into a clean structure for analysis.
  • Splits every search result into its own row for easy filtering and export.
  • Filters out rows with empty titles to reduce noise.
  • Assigns rank numbers per search query using a code step.
  • Placeholders ready to connect to your output tool like a database or sheet.
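The map, split, and filter features above can be sketched as one function. This is a hedged sketch, not the workflow's actual node code: the `result.organicResults` path and the `searchQuery`/`title`/`url` field names follow the setup steps, but the real API response may differ.

```javascript
// Turn one ScrapingRobot response into flat rows:
// one row per organic result, rows with empty titles dropped.
function toRows(serpResponse, keyword) {
  const results = serpResponse?.result?.organicResults ?? [];
  return results
    .map((r) => ({ searchQuery: keyword, title: r.title, url: r.url }))
    .filter((row) => row.title && row.title.trim() !== "");
}
```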

What are the benefits?

  • Reduce manual SERP checks from hours to minutes
  • Improve data accuracy by removing copy paste errors
  • Handle hundreds of keywords with safe batch requests
  • Get clear rank numbers for each keyword and page
  • Unify SERP data with your reporting database

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need a ScrapingRobot account. See the Tools Required section below for links to create one.
  3. In your ScrapingRobot account, create an API key from the API or dashboard page.
  4. Open the GET SERP node in n8n and in the Credential to connect with field click Create new credential, choose HTTP Query Auth, add a parameter named token, and paste your ScrapingRobot API key. Save the credential.
  5. In the same GET SERP node, keep the method as POST and the URL as https://api.scrapingrobot.com. Ensure the Accept header is set to application/json.
  6. Confirm the JSON body contains url set to https://www.google.com, module set to GoogleScraper, and params.query mapped to {{$json["Keyword"]}}.
  7. Open the Set Keywords to get SERPs for node and add or edit your keyword list. If using an external source later, keep the column name Keyword to match the mapping.
  8. Check the Split out Keywords node to ensure it iterates one keyword at a time. The HTTP node batching option should be enabled with a batch size that suits your account limits.
  9. Run the workflow with the manual trigger. Inspect the GET SERP node output for a result object that includes organicResults.
  10. Open the SERP results and Separate nodes to verify that organicResults is mapped and split into individual rows. Ensure the filter drops empty titles.
  11. Open the Assign SERP #pos node and confirm that each keyword group starts at rank 1 and increases by one per result.
  12. Replace the output placeholder with your target data sink, such as Airtable, Google Sheets, or a database. Map fields like searchQuery, title, url, and position, then test again.
  13. If you receive empty results or errors, recheck your API key in the credential, confirm the token parameter name is token, and verify you have not exceeded rate limits. Adjust batch size if needed.
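The rank logic checked in step 11 can be sketched in JavaScript (the language n8n Code nodes run). This is a minimal sketch under the assumption that rows carry a searchQuery field; the actual Assign SERP #pos node may be written differently.

```javascript
// Assign a per-keyword position: rank restarts at 1 for each
// distinct searchQuery and increments by one per result.
function assignPositions(rows) {
  const counters = {};
  return rows.map((row) => {
    counters[row.searchQuery] = (counters[row.searchQuery] ?? 0) + 1;
    return { ...row, position: counters[row.searchQuery] };
  });
}
```

If step 11's check fails (a keyword group not starting at 1), the usual cause is rows arriving unsorted or the grouping key not matching the keyword column name.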

Tools Required

n8n Cloud costs $24/mo (or $20/mo billed annually). The self-hosted n8n Community Edition is free.

ScrapingRobot

Free tier: $0, 5,000 scrapes (API access with token)

Credits:
Made by Simon at automake.io

Join Futurise to access 1,200+ automation templates

Get instant access to ready-made automation workflows for n8n, Make.com, AI agents, and more. Download, customise, and deploy in minutes.