
How to Automate Supabase Social Profile Enrichment?

Need social links for outreach without manual research? This build scans company websites, finds real social media profiles, and writes clean results into your Supabase tables. It suits marketing and sales teams that want fast lead enrichment without copying links by hand.

The run starts by loading company names and websites from a Supabase input table. An AI crawler powered by OpenAI checks each site using two helper tools that read page text and collect links. A structured parser returns the social media links as an array in a consistent JSON format. A link-cleanup flow then removes empty values, fixes missing protocols, validates URLs, drops duplicates, and aggregates the final list. Page text can also be converted to Markdown to help the model decide whether a link is relevant. The result is merged with the original company fields and saved to a Supabase output table.
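The link-cleanup steps can be sketched as a single function, roughly what an n8n Code node would run. This is an illustrative sketch, not the template's actual node code; the function name and input shape are assumptions:

```javascript
// Illustrative sketch of the link-cleanup flow: filter empties,
// add missing protocols, validate, and deduplicate.
function cleanSocialLinks(rawLinks) {
  const seen = new Set();
  const result = [];
  for (const raw of rawLinks) {
    if (!raw || !raw.trim()) continue;       // drop empty values
    let link = raw.trim();
    if (!/^https?:\/\//i.test(link)) {
      link = "https://" + link;              // fix a missing protocol
    }
    let url;
    try {
      url = new URL(link);                   // validate the URL
    } catch {
      continue;                              // drop links that fail to parse
    }
    const key = url.href.replace(/\/$/, ""); // normalise trailing slash
    if (seen.has(key)) continue;             // drop duplicates
    seen.add(key);
    result.push(key);
  }
  return result;
}

// One validated, deduplicated link survives from this messy input.
console.log(cleanSocialLinks([
  "twitter.com/acme", "", "https://twitter.com/acme", "not a url",
]));
```

The same ordering matters in the workflow: the protocol has to be added before validation, because `new URL("twitter.com/acme")` throws without a scheme.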

Add your Supabase and OpenAI credentials, confirm the table names, and test on a few records. Many teams cut research time from 20 minutes per company to about 2 minutes while keeping data consistent. Use it for lead enrichment, social profile tracking, and keeping CRM fields complete with minimal effort.

What are the key features?

  • Loads company names and websites from a Supabase input table
  • Uses an OpenAI chat model to guide crawling and decide which links are social profiles
  • Two helper tools fetch page text and collect all URLs for better coverage
  • Structured JSON parser returns a clean social_media array for consistent fields
  • URL pipeline removes empty values, adds missing protocols, validates links, and deduplicates results
  • HTML to Markdown conversion helps the model judge relevance from readable text
  • Merges extracted links with original company data before saving
  • Writes final records into a Supabase output table for easy reporting
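For reference, the structured parser's result might look like the following. The exact item shape and platform labels are assumptions; the template only guarantees a `social_media` array reachable at `output.social_media`:

```javascript
// Hypothetical example of the parser output consumed downstream;
// the array the workflow maps lives at output.social_media.
const parserResult = {
  output: {
    social_media: [
      { platform: "linkedin", url: "https://www.linkedin.com/company/example" },
      { platform: "twitter",  url: "https://twitter.com/example" },
    ],
  },
};

// Downstream nodes read this field when building the merged record.
const socialMedia = parserResult.output.social_media;
console.log(socialMedia.length); // 2
```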

What are the benefits?

  • Reduce manual research from 20 minutes to about 2 minutes per company
  • Automate up to 90 percent of repetitive link collection
  • Improve data quality by filtering invalid links and removing duplicates
  • Scale enrichment across hundreds or thousands of domains
  • Keep one source of truth by writing results directly to Supabase

How do you set it up?

  1. Import the template into n8n: Create a new workflow in n8n > Click the three dots menu > Select 'Import from File' > Choose the downloaded JSON file.
  2. You'll need accounts with OpenAI and Supabase. See the Tools Required section below for links to create accounts with these services.
  3. In Supabase, create two tables named companies_input and companies_output. Include at least name and website fields in companies_input. In companies_output, include company_name, website, and a social_media JSON field.
  4. In the n8n credentials manager, create a Supabase credential. Enter your Supabase URL and API key. Use a service role key if you need write access. Name the credential clearly, for example Supabase Main.
  5. In the n8n credentials manager, create an OpenAI credential. Paste your API key from the OpenAI account API page. Name it OpenAI Main.
  6. Open the Get companies node and select your Supabase credential. Set the table to companies_input and confirm the operation is getAll.
  7. Open the Select company name and website node and confirm it only passes name and website. If your column names differ, update the field mapping.
  8. Open the Crawling agent or Crawl website node. Confirm the OpenAI chat model is selected and temperature is zero for consistent output. Review the prompt and adjust the parser schema if you want to capture different data.
  9. Check the Set social media array and Merge nodes to ensure they map to the output of the JSON parser. The field should map to output.social_media.
  10. Open the Insert new row node. Select your Supabase credential and choose the companies_output table. Set it to auto map input data so company_name, website, and social_media are saved.
  11. Click Execute workflow. Confirm a few companies process correctly. Check companies_output in Supabase to verify that social_media contains valid URLs and platforms.
  12. If some sites block crawling, add a proxy in the scraping tools or slow down runs. If links look broken, review the Add protocol and Filter invalid URLs nodes and make sure your website values include a valid domain.
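Putting steps 9 and 10 together, the record written to companies_output can be sketched like this. The helper name and input shapes are illustrative, not the template's actual node code:

```javascript
// Illustrative merge of the original company fields with the
// extracted links, shaped for the companies_output table.
function buildOutputRow(company, socialMedia) {
  return {
    company_name: company.name,
    website: company.website,
    // A Supabase jsonb column accepts the array directly via the API.
    social_media: socialMedia,
  };
}

const row = buildOutputRow(
  { name: "Acme Inc", website: "https://acme.example" },
  ["https://twitter.com/acme", "https://www.linkedin.com/company/acme"]
);
console.log(row.company_name); // "Acme Inc"
```

With auto-mapping enabled in the Insert new row node, these three keys line up with the companies_output columns created in step 3.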

Tools Required

n8n

$24 / mo or $20 / mo billed annually to use n8n in the cloud. However, the local or self-hosted n8n Community Edition is free.

OpenAI


Pay-as-you-go: GPT-5 at $1.25 per 1M input tokens and $10 per 1M output tokens

Supabase


Free: $0 / mo — unlimited API requests; 500 MB database; 5 GB bandwidth; 1 GB storage; 50,000 MAUs.

Join Futurise to access 1,200+ automation templates

Get instant access to ready-made automation workflows for n8n, Make.com, AI agents, and more. Download, customise, and deploy in minutes.