Website Crawler & Extractor with Postgres, Claude Sonnet 4.5, and the native HTTP module
Crawl web pages, extract structured data using Claude Sonnet 4.5, and store the results in Postgres.
Overview
Automatically fetch and parse web pages using the native HTTP module, use Claude Sonnet 4.5 to extract valuable information intelligently, and save the structured data into a Postgres database.
Requirements
- Target URLs to crawl
- Anthropic API key for Claude
- PostgreSQL database connection
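The requirements above boil down to a few configuration values. A minimal sketch of loading them in Node.js, assuming hypothetical environment-variable names (your workflow may name them differently):

```javascript
// Hypothetical configuration loader; all variable names are assumptions.
const config = {
  anthropicApiKey: process.env.ANTHROPIC_API_KEY, // Claude API credential
  databaseUrl: process.env.DATABASE_URL,          // Postgres connection string
  // Comma-separated list of pages to crawl, e.g. "https://a.example,https://b.example"
  targetUrls: (process.env.TARGET_URLS ?? "").split(",").filter(Boolean),
};
```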
Components: HTTP Request, Claude Sonnet, Postgres
How to use
1. Fetch webpage content via HTTP Request
2. Send the raw HTML/text to Claude Sonnet 4.5 for extraction
3. Parse the JSON/structured output from Claude
4. Insert or update records in a Postgres table
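The four steps above can be sketched in plain Node.js. This is a minimal illustration, not the workflow's actual implementation: the prompt, the extracted fields (`title`, `summary`), and the `pages` table with a unique `url` column are all assumptions, and the Postgres step presumes a connected `pg` client.

```javascript
// Step 1: fetch the raw page body (Node.js 18+ global fetch).
async function fetchPage(url) {
  const res = await fetch(url);
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.text();
}

// Step 2: ask Claude Sonnet 4.5 to extract structured fields as JSON.
// The prompt and field names here are illustrative assumptions.
async function extractWithClaude(html) {
  const res = await fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "x-api-key": process.env.ANTHROPIC_API_KEY,
      "anthropic-version": "2023-06-01",
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model: "claude-sonnet-4-5",
      max_tokens: 1024,
      messages: [{
        role: "user",
        content: 'Extract the page title and a one-sentence summary as JSON ' +
                 'with keys "title" and "summary". Reply with JSON only.\n\n' + html,
      }],
    }),
  });
  const data = await res.json();
  return data.content[0].text;
}

// Step 3: parse Claude's reply, tolerating an optional ```json fence.
function parseExtraction(text) {
  const cleaned = text.replace(/^```(?:json)?\s*|\s*```$/g, "").trim();
  return JSON.parse(cleaned);
}

// Step 4: upsert into Postgres (assumes a connected `pg` client and a
// hypothetical `pages` table with a UNIQUE constraint on `url`).
async function saveToPostgres(client, url, record) {
  await client.query(
    `INSERT INTO pages (url, title, summary)
     VALUES ($1, $2, $3)
     ON CONFLICT (url) DO UPDATE SET title = $2, summary = $3`,
    [url, record.title, record.summary]
  );
}
```

Keeping the parse step (step 3) separate from the API call makes the pipeline resilient when the model wraps its answer in a code fence instead of returning bare JSON.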