diff --git a/README.md b/README.md
index 663fd21..b360615 100644
--- a/README.md
+++ b/README.md
@@ -4,6 +4,43 @@ A professional full-stack application for monitoring domain name availability wi
 
 ---
 
+## ⚠️ Important: After Fresh Clone / Database Reset
+
+If you just cloned the repo or reset the database, run these commands to populate data:
+
+```bash
+cd backend
+source venv/bin/activate
+
+# 1. Initialize database (creates tables)
+python scripts/init_db.py
+
+# 2. Scrape TLD prices (886+ TLDs from Porkbun)
+python scripts/seed_tld_prices.py
+
+# 3. (Optional) Scrape auctions immediately
+python3 -c "
+import asyncio
+from app.services.auction_scraper import AuctionScraperService
+from app.database import AsyncSessionLocal
+
+async def scrape():
+    scraper = AuctionScraperService()
+    async with AsyncSessionLocal() as db:
+        await scraper.scrape_all_platforms(db)
+    await scraper.close()
+
+asyncio.run(scrape())
+"
+```
+
+Without these steps:
+- The TLD Pricing page shows only 18 TLDs (instead of 886+)
+- The Auctions page shows 0 auctions
+- (The scheduler will still auto-scrape TLD prices daily at 03:00 UTC and auctions hourly, so data eventually appears without the manual runs)
+
+---
+
 ## ⚡ Quick Start (Local Development)
 
 **Terminal 1 - Backend:**
@@ -321,8 +358,27 @@ cp env.example .env
 python -c "import secrets; print(secrets.token_hex(32))"
 # Copy the output and paste into .env as SECRET_KEY=...
 
-# Initialize database (creates tables + seeds data)
+# Initialize database (creates tables + seeds basic data)
 python scripts/init_db.py
+
+# Scrape TLD prices from Porkbun API (886+ TLDs)
+python scripts/seed_tld_prices.py
+
+# (Optional) Scrape auctions - or let the scheduler do it hourly
+python3 -c "
+import asyncio
+from app.services.auction_scraper import AuctionScraperService
+from app.database import AsyncSessionLocal
+
+async def scrape():
+    scraper = AuctionScraperService()
+    async with AsyncSessionLocal() as db:
+        result = await scraper.scrape_all_platforms(db)
+        print(f'Scraped auctions: {result}')
+    await scraper.close()
+
+asyncio.run(scrape())
+"
 ```
 
 Start backend:
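The inline `python3 -c` runners added above follow one asyncio pattern: construct the service, open an async DB session, scrape inside the session, then close the service afterwards. A minimal self-contained sketch of that pattern, with hypothetical `FakeScraper` and `fake_session` stand-ins for the real `AuctionScraperService` and `AsyncSessionLocal` (which require the app's database to run):

```python
import asyncio
from contextlib import asynccontextmanager

# Hypothetical stand-ins for AuctionScraperService and AsyncSessionLocal;
# the real classes live in app.services.auction_scraper and app.database.
class FakeScraper:
    async def scrape_all_platforms(self, db):
        # Pretend two platforms were scraped; returns a summary dict.
        return {"platform_a": 2, "platform_b": 1}

    async def close(self):
        pass  # the real service would close its HTTP client here

@asynccontextmanager
async def fake_session():
    yield object()  # stands in for an AsyncSession

async def scrape():
    scraper = FakeScraper()
    async with fake_session() as db:  # session is open only for the scrape
        result = await scraper.scrape_all_platforms(db)
    await scraper.close()  # cleanup happens after the session closes
    return result

result = asyncio.run(scrape())
print(f"Scraped auctions: {result}")
```

Keeping `scraper.close()` outside the `async with` block matters: the session should be released as soon as the scrape finishes, while the scraper's own cleanup is independent of the DB.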