Add local development setup instructions to README
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Added important notice about database initialization after a fresh clone
- Added TLD price scraping command (886+ TLDs from Porkbun)
- Added auction scraping command
- Explains what happens without these steps (only 18 TLDs, 0 auctions)
- Notes that the scheduler auto-scrapes daily/hourly
This commit is contained in:

README.md (58 changes)
@@ -4,6 +4,43 @@ A professional full-stack application for monitoring domain name availability wi
---
## ⚠️ Important: After Fresh Clone / Database Reset
If you just cloned the repo or reset the database, run these commands to populate data:
```bash
cd backend
source venv/bin/activate

# 1. Initialize database (creates tables)
python scripts/init_db.py

# 2. Scrape TLD prices (886+ TLDs from Porkbun)
python scripts/seed_tld_prices.py

# 3. (Optional) Scrape auctions immediately
python3 -c "
import asyncio
from app.services.auction_scraper import AuctionScraperService
from app.database import AsyncSessionLocal

async def scrape():
    scraper = AuctionScraperService()
    async with AsyncSessionLocal() as db:
        await scraper.scrape_all_platforms(db)
    await scraper.close()

asyncio.run(scrape())
"
```
Without these steps:

- TLD Pricing page shows only 18 TLDs (instead of 886+)
- Auctions page shows 0 auctions

Note that the scheduler will still auto-scrape TLDs daily at 03:00 UTC and auctions hourly, so the data eventually populates on its own.
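For the curious, the "daily at 03:00 UTC" timing works out as in the sketch below. This is purely an illustration of the scheduling math, not code from this repo: the helper names and the minimal sleep loop are assumptions, and the real scheduler may use a cron-style library instead.

```python
import asyncio
from datetime import datetime, timedelta, timezone

def seconds_until_utc_hour(hour: int) -> float:
    """Seconds from now until the next HH:00 UTC (e.g. 3 for the 03:00 TLD scrape)."""
    now = datetime.now(timezone.utc)
    target = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # today's slot already passed; schedule tomorrow
    return (target - now).total_seconds()

async def run_daily(job, hour: int):
    """Minimal loop: sleep until HH:00 UTC, run the job, repeat."""
    while True:
        await asyncio.sleep(seconds_until_utc_hour(hour))
        await job()
```

The same pattern with a fixed 3600-second sleep would cover the hourly auction scrape.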
---
## ⚡ Quick Start (Local Development)
**Terminal 1 - Backend:**
@@ -321,8 +358,27 @@ cp env.example .env
python -c "import secrets; print(secrets.token_hex(32))"
# Copy the output and paste into .env as SECRET_KEY=...

# Initialize database (creates tables + seeds basic data)
python scripts/init_db.py

# Scrape TLD prices from Porkbun API (886+ TLDs)
python scripts/seed_tld_prices.py

# (Optional) Scrape auctions - or let the scheduler do it hourly
python3 -c "
import asyncio
from app.services.auction_scraper import AuctionScraperService
from app.database import AsyncSessionLocal

async def scrape():
    scraper = AuctionScraperService()
    async with AsyncSessionLocal() as db:
        result = await scraper.scrape_all_platforms(db)
        print(f'Scraped auctions: {result}')
    await scraper.close()

asyncio.run(scrape())
"
```
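As a quick sanity check on the `SECRET_KEY` step above: `secrets.token_hex(32)` draws 32 random bytes and hex-encodes them, so the value you paste into `.env` should always be a 64-character lowercase hex string.

```python
import secrets

# 32 random bytes, hex-encoded -> 64 lowercase hex characters
key = secrets.token_hex(32)
print(len(key))                              # 64
print(set(key) <= set("0123456789abcdef"))   # True
```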
Start backend: