Turn long web pages into clean markdown, links included, ready for AI and research workflows. Teams that audit content, build knowledge bases, or study competitors can load a list of URLs and get structured results without manual copy-and-paste.
The flow starts when you click run. A sample list or your database feeds a Page column, and the list is split so each URL becomes one item. A limit of 40 items keeps server memory in check, a batch size of 10 respects Firecrawl's rate limit of 10 requests per minute, and a 45-second wait spaces each round. Each URL is sent to Firecrawl, which returns the page's title, description, markdown, and links. A mapping step formats these fields for storage, and the loop repeats until the list is done.
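The batching logic above can be sketched in plain Python. This is a minimal illustration, not the workflow's actual implementation: the 40-item cap, batch size of 10, and 45-second wait come from the description, while the `scrape` callback and its return shape are assumptions standing in for the real Firecrawl call.

```python
import time

ITEM_LIMIT = 40    # cap per run, to bound server memory
BATCH_SIZE = 10    # matches Firecrawl's 10 requests/minute
WAIT_SECONDS = 45  # pause between rounds

def chunk(urls, size=BATCH_SIZE, limit=ITEM_LIMIT):
    """Cap the URL list at `limit` items and split it into batches of `size`."""
    capped = urls[:limit]
    return [capped[i:i + size] for i in range(0, len(capped), size)]

def run(urls, scrape):
    """Process each batch, pausing between rounds to respect the rate limit.
    `scrape` is a hypothetical per-URL callable standing in for Firecrawl."""
    results = []
    for i, batch in enumerate(chunk(urls)):
        if i > 0:
            time.sleep(WAIT_SECONDS)  # space out rounds
        results.extend(scrape(url) for url in batch)
    return results
```

With 45 input URLs, `chunk` drops the last 5 to stay under the cap and yields four batches of 10, so `run` makes four rounds separated by three waits.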
To set it up, add your Firecrawl API key, keep your URL column named Page, and connect your output store. Expect faster content prep and fewer errors: large lists are processed in cycles that fit your server and the API's limits. Use it for SEO audits, internal link maps, and AI data pipelines that need clean, consistent text.