Compare commits

..

100 Commits

Author SHA1 Message Date
0cd72bcc8c docs: Update MARKET_CONCEPT.md - Phase 1 COMPLETE
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-11 12:01:03 +01:00
a4689fb8c7 fix: Public Auctions Page now uses unified Market Feed API
SYNC FIX:
- Public /auctions page now uses api.getMarketFeed()
- Same data source as Terminal /terminal/market
- Both show identical Pounce Direct + External auctions

CONCEPT REVIEW COMPLETED:
All Phase 1 features from concept docs implemented:
-  TLD Pricing with Renewal Price & Trends
-  Auction Aggregator (8+ sources, 542+ auctions)
-  Vanity Filter for public visitors
-  Pounce Direct Listings (0% commission)
-  DNS Verification for ownership
-  RADAR Dashboard
-  MARKET Feed with unified API
-  INTEL (TLD Data with registrar finder)
-  WATCHLIST with monitoring
-  Subscription tiers (Scout/Trader/Tycoon)
-  Stripe integration
-  Pounce Score algorithm
-  Affiliate links for all platforms
-  Sniper Alerts
-  SEO pages per TLD
-  Listing limits (2/10/50 by tier)
-  Featured listings field
2025-12-11 12:00:34 +01:00
d10dc1d942 feat: Complete Market Implementation
 PLAYWRIGHT STEALTH SCRAPER:
- Headless browser with stealth mode
- Cloudflare bypass (partial - needs more work)
- Cookie persistence
- API intercept + DOM extraction

 POUNCE DIRECT LISTINGS:
- 5 test listings created:
  • alpineresort.com - $8,500
  • swisstech.ch - $4,500
  • nftmarket.app - $3,200
  • cryptoflow.io - $2,500
  • dataops.dev - $1,200

 PUBLIC MARKET PAGE:
- Shows 'Pounce Exclusive' section prominently
- 100+ live auctions from Dynadot, GoDaddy, Sedo
- Deal Scores with 'Undervalued' labels
- Tabs: All Auctions, Ending Soon, Hot

📊 CURRENT DATA:
- 537+ active auctions in database
- 5 Pounce Direct listings
- Dynadot JSON API working (100+ auctions)
- ExpiredDomains web scraping (400+ auctions)
2025-12-11 11:54:31 +01:00
e127f1fb52 feat: Add 4 new Hidden API scrapers (6 total)
NEW SCRAPERS:
-  Dynadot REST API: 101 auctions (WORKING!)
- 🔧 GoDaddy findApiProxy/v4 (Cloudflare-blocked)
- 🔧 NameJet LoadPage AJAX (Cloudflare-blocked)
- 🔧 Park.io Backorders (API not public)

CURRENT STATUS:
- 537+ active auctions in database
- ExpiredDomains: 425 (web scraping)
- Dynadot: 101 (JSON API)
- Sedo: 7 (web scraping)

AFFILIATE MONETIZATION:
- All platform URLs include affiliate tracking
- Ready for: Dynadot, GoDaddy, Namecheap, Sedo

NEXT STEPS:
- Cloudflare bypass for GoDaddy/NameJet
- Register actual affiliate IDs
- Create first Pounce Direct listings
2025-12-11 11:43:54 +01:00
9c64f61fb6 fix: Remove estibot_appraisal field - Dynadot now works!
- Fixed: 'estibot_appraisal' is not a DomainAuction field
- Dynadot now saves 100 auctions to DB
- Total active auctions: 511 (was 386)

Sample data:
- embedgooglemap.net: $10,200 (51 bids)
- 9454.com: $2,550 (73 bids)
- swosh.com: $2,550 (31 bids)
2025-12-11 11:37:59 +01:00
de5662ab78 feat: Hidden JSON API Scrapers + Affiliate Monetization
TIER 0: Hidden JSON APIs (Most Reliable!)
- Namecheap GraphQL: aftermarketapi.namecheap.com/graphql
- Dynadot REST: 342k+ auctions with Estibot appraisals!
- Sav.com AJAX: load_domains_ajax endpoint

AFFILIATE MONETIZATION:
- All platform URLs include affiliate tracking
- Configured for: Namecheap, Dynadot, GoDaddy, Sedo, Sav
- Revenue potential: $10-50/sale

TECHNICAL:
- New hidden_api_scrapers.py with 3 platform scrapers
- Updated auction_scraper.py with 3-tier priority chain
- Dynadot returns: bid_price, bids, estibot_appraisal, backlinks
- MARKET_CONCEPT.md completely updated

Tested: Dynadot returns 5 real auctions with prices up to $10k!
2025-12-11 10:38:40 +01:00
0d2cc356b1 fix: Data freshness - only show active auctions
CRITICAL FIXES:
- API: Added end_time > now() filter to all auction queries
- Scheduler: Cleanup expired auctions every 15 minutes
- Scheduler: Scrape auctions every 2 hours (was 1 hour)
- Scheduler: Sniper alert matching every 30 minutes

Affected endpoints:
- GET /auctions (search)
- GET /auctions/feed (unified)
- GET /auctions/hot
- GET /auctions/ending-soon (already had filter)

Updated MARKET_CONCEPT.md with:
- 3 pillars: Pounce Direct, Live Auctions, Drops Tomorrow
- Data freshness architecture
- Unicorn roadmap
2025-12-11 09:55:27 +01:00
389379d8bb feat: DropCatch & Sedo API Clients + MARKET_CONCEPT v2
- DropCatch API Client mit OAuth2 Authentifizierung
- Sedo API Client (bereit für Credentials)
- Tier 1 APIs → Tier 2 Scraping Fallback-Logik
- Admin Endpoints: /test-apis, /trigger-scrape, /scrape-status
- MARKET_CONCEPT.md komplett überarbeitet:
  - Realistische Bestandsaufnahme
  - 3-Säulen-Konzept (Auktionen, Pounce Direct, Drops)
  - API-Realität dokumentiert (DropCatch = nur eigene Aktivitäten)
  - Roadmap und nächste Schritte
2025-12-11 09:36:32 +01:00
783668b015 feat: Unified Market Feed API + Pounce Direct Integration
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
🚀 MARKET CONCEPT IMPLEMENTATION

Backend:
- Added /auctions/feed unified endpoint combining Pounce Direct + external auctions
- Implemented Pounce Score v2.0 with market signals (length, TLD, bids, age)
- Added vanity filter for premium domains (non-auth users)
- Integrated DomainListing model for Pounce Direct

Frontend:
- Refactored terminal/market page with Pounce Direct hierarchy
- Updated public auctions page with Pounce Exclusive section
- Added api.getMarketFeed() to API client
- Converted /market to redirect to /auctions

Documentation:
- Created MARKET_CONCEPT.md with full unicorn roadmap
- Created ZONE_FILE_ACCESS.md with Verisign access guide
- Updated todos and progress tracking

Cleanup:
- Deleted empty legacy folders (dashboard, portfolio, settings, watchlist, careers)
2025-12-11 08:59:50 +01:00
e390a71357 style: Unify Terminal backgrounds & redesign TLD detail page
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Remove all hardcoded 'bg-black' backgrounds from Terminal pages
- Let global background with emerald glow shine through
- Redesign TLD detail page to match Market/Radar/Intel style
- Use consistent StatCards, glassmorphism containers, and spacing
- Improve Quick Check section on TLD detail page
- Unify Watchlist and Listing page backgrounds for consistency
2025-12-11 08:40:18 +01:00
eb8807f469 fix: Tooltip behavior & Links (INTEL, MARKET, RADAR)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Fixed hover/tooltip issue: Tooltips now only trigger on the specific element, not the entire row.
- Added explicit 'Details' link to TLD table in INTEL view.
- Added clickable TLD badges in INTEL view.
- Standardized Tooltip implementation across all views.
- Ensured consistent 'Award-Winning' style for all interactive elements.
2025-12-11 08:08:28 +01:00
2b4f0d8f54 feat: INTEL - Complete Redesign (Award-Winning Style)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Rebuilt INTEL (Analytics) page to match Market/Radar style
- Features:
  - 'Emerald Glow' background effect
  - High-end Stat Cards grid
  - Integrated header with 'Pill' style filters
  - Advanced Data Table with:
    - Renewal Trap warnings (Amber alert if >1.5x reg price)
    - Trend indicators (Sparklines/Arrows)
    - Risk Level meters (Visual bars)
- Mobile Optimization:
  - Elegant Card layout for small screens
  - Touch-friendly controls
2025-12-11 08:01:24 +01:00
60d49272bf feat: RADAR & MARKET - Content Integration & Award-Winning Search
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- TerminalLayout:
  - Added 'hideHeaderSearch' prop to remove top bar elements
  - Header is now non-sticky and transparent when search is hidden for seamless integration
- RADAR (Dashboard):
  - Removed top bar search/shortcuts
  - Implemented 'Hero Style' Universal Search:
    - Floating design with backdrop blur
    - Dynamic emerald glow on focus
    - Animated icons and clean typography
    - Integrated results dropdown
  - Content flows seamlessly from top
- MARKET:
  - Integrated header into content (removed sticky behavior)
  - Removed duplicate search/shortcuts from top bar
2025-12-11 07:41:10 +01:00
eda7a1fa0a feat: RADAR - Complete Redesign (Award-Winning Style)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Rebuilt Dashboard/Radar page from scratch to match Market style
- Features:
  - New 'Ticker' component with clean, borderless design
  - High-end 'StatCard' grid
  - 'Universal Search' command center with emerald glow
  - Split view: Market Pulse vs Watchlist Activity
- Visuals:
  - Dark zinc-950/40 backgrounds
  - Ultra-thin borders (white/5)
  - Consistent tooltips and hover effects
- Mobile optimized layout
2025-12-11 07:33:59 +01:00
855d54f76d feat: MARKET & NAV - High-End Polish
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- MARKET: Re-introduced 'Emerald Glow' background effect (Landing Page style)
- NAVIGATION: Complete redesign of Sidebar to match Award-Winning style
  - Darker, cleaner background (zinc-950)
  - Ultra-thin borders (white/5)
  - New active state: subtle emerald glow line + text color (no blocky backgrounds)
  - Simplified Logo section
  - Modernized User Profile card
2025-12-11 07:27:40 +01:00
15148083c5 feat: MARKET - Award-Winning Polish (Tooltips & UX)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Added custom animated Tooltip component for Low Noise / High Density
- Tooltips added to: Score, Price, Time, Source, Actions, Headers
- Redesigned 'Monitor' button:
  - Changed icon to 'Eye' (Watch)
  - Made it round and distinct from primary action
  - Added significant spacing (mr-4) from 'Buy' button
- Enhanced Mobile UX:
  - Larger touch targets (h-9 to h-10 equivalents)
  - Better button active states (scale-95)
- General Polish:
  - Refined hover states and shadows
  - Improved accessibility with cursor hints
2025-12-11 06:56:12 +01:00
0762d1b23b feat: MARKET - Final Polish & Mobile Optimization
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Implemented responsive layout: Desktop Table vs Mobile Cards
- Desktop: Glassmorphism header, refined spacing, hover effects
- Mobile: Elegant 'Trading Card' layout with optimized touch targets
- Visual: New 'ScoreDisplay' component (Ring for Desktop, Badge for Mobile)
- UX: Sticky search bar on mobile, better empty states
- Polish: Improved typography, consistent borders, micro-interactions
2025-12-11 06:49:15 +01:00
5bab069a75 feat: MARKET - Complete UI Overhaul (Award-Winning Style)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Redesigned Metric Grid with trend indicators
- New separated Control Bar for search & filters
- High-end Data Grid with ultra-thin borders and hover effects
- Custom SVG 'Score Ring' component for Pounce Score
- Modern typography and spacing
- Removed 'clutter' badges, replaced with minimal indicators
2025-12-11 06:43:21 +01:00
6ac6577fb2 feat: MARKET - Add sortable columns + new Live Feed header style
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Changes:
- Sortable columns: Domain, Score, Price, Time Left, Source
- Click column header to sort (asc/desc toggle)
- New header: 'Live Market Feed' with live indicator
- Quick stats pills: total listings, high score count, ending soon
- Visual sort indicators (chevron up/down)
- Default sort: Score descending
2025-12-10 22:45:05 +01:00
ba297c09ca feat: MARKET - Sortable columns + new header design
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- All columns now sortable (Domain, Score, Price, Time, Source)
- Click column header to sort asc/desc
- New professional header with icon, title, and live stats
- Cleaner, more compact design
- Better mobile responsiveness
- Improved filter bar layout
2025-12-10 22:38:53 +01:00
43724837be feat: MARKET page complete rebuild - Trading Dashboard design
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Clean, minimalist Trading Dashboard aesthetic
- Filter Bar: Hide Spam (default ON), Pounce Direct Only, TLD dropdown, Price dropdown
- Master Table with 6 columns as per concept:
  1. Domain (with 💎 icon for Pounce Direct)
  2. Pounce Score (0-100, color-coded: green >80, yellow 40-80, red <40)
  3. Price/Bid (with bid count)
  4. Status/Time ( Instant or ⏱️ countdown with urgency colors)
  5. Source (GoDaddy, Sedo, NameJet, DropCatch, Pounce)
  6. Action (Bid/Buy button + Track to Watchlist)
- Advanced Pounce Score algorithm with spam detection
- Professional dark theme with zinc/emerald color scheme
- Responsive grid layout
2025-12-10 22:30:13 +01:00
328412d130 docs: Update TERMINAL_REBUILD_PLAN.md - Phase 3 completed
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 22:23:58 +01:00
2cc754b04d feat: Sprint 3 - Terminal screens rebuild according to concept
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
RADAR:
- Added Ticker component for live market movements
- Implemented Universal Search (simultaneous Whois + Auctions check)
- Quick Stats: 3 cards (Watching, Market, My Listings)
- Recent Alerts with Activity Feed

MARKET:
- Unified table with Pounce Score (0-100, color-coded)
- Hide Spam toggle (default: ON)
- Pounce Direct Only toggle
- Source badges (GoDaddy, Sedo, Pounce)
- Status/Time column with Instant vs Countdown

INTEL:
- Added Cheapest At column (Best Registrar Finder)
- Renamed to Intel
- Inflation Monitor with renewal trap warnings

WATCHLIST:
- Tabs: Watching / My Portfolio
- Health Status Ampel (🟢🟡🔴)
- Improved status display

LISTING:
- Scout paywall (only Trader/Tycoon can list)
- Tier limits: Trader=5, Tycoon=50
- DNS Verification workflow
2025-12-10 22:21:35 +01:00
1c5ca4ec3e docs: Update TERMINAL_REBUILD_PLAN.md - Sprint 1, 2 & 4 completed
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 22:02:37 +01:00
20ed8a14cb cleanup: Remove old Command Center files and fix all references
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Removed old folders: dashboard, pricing, auctions, marketplace, portfolio, alerts, seo
- Removed CommandCenterLayout.tsx (replaced by TerminalLayout)
- Fixed all internal links to use new terminal routes
- Updated keyboard shortcuts for new module names
- Fixed welcome page next steps
- Fixed landing page feature links
- Fixed radar page stat cards and links
2025-12-10 21:59:56 +01:00
4a4a658a8f refactor: Terminal Module Restructure (Sprint 2)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- RADAR: dashboard → /terminal/radar
- MARKET: auctions + marketplace → /terminal/market
- INTEL: pricing → /terminal/intel
- WATCHLIST: watchlist + portfolio → /terminal/watchlist
- LISTING: listings → /terminal/listing

All redirects configured for backwards compatibility.
Updated sidebar navigation with new module names.
2025-12-10 21:44:36 +01:00
78a8cd39cb refactor: Rename Command Center to Terminal (Sprint 1)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Renamed /command/* routes to /terminal/*
- Renamed CommandCenterLayout to TerminalLayout
- Updated all internal links
- Added permanent redirects from /command/* to /terminal/*
- Updated Sidebar navigation
- Added concept docs (pounce_*.md)
2025-12-10 21:39:53 +01:00
bad7816bb9 docs: Add Terminal rebuild plan with detailed checklists
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 21:23:03 +01:00
b6359b4c3e fix: Cast ItemWrapper to any to fix type error
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:37:02 +01:00
17f809da5f fix: Remove unnecessary condition in PremiumTable SearchInput
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:34:09 +01:00
157ed1a9df fix: Cast sortBy to correct type in tld-pricing page
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:32:04 +01:00
e020eb076a fix: Another AlertTriangle title fix in tld-pricing
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:29:46 +01:00
264c6fc667 fix: Wrap AlertTriangle with span for title in tld-pricing
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:27:05 +01:00
335bfaadbd fix: Correct API method calls in market page
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:25:51 +01:00
02891d582a fix: Wrap handleAddDomain in arrow function
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:24:32 +01:00
6bd289b55a fix: Wrap Badge with span for className
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:23:09 +01:00
e4e85b380c fix: Wrap icon with span for title
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:21:45 +01:00
e7f59b2163 fix: Remove valuation_formula
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:20:18 +01:00
61552f8ec9 fix: Add Bell import
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:18:45 +01:00
ba26fc3713 fix: Use correct addDomain method
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:17:05 +01:00
6385f68fa8 fix: Quick fixes for TypeScript errors (selectable, request)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 20:14:13 +01:00
b9882d6945 docs: Complete documentation overhaul for v2.0
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
README.md - Full rewrite:
- New feature overview (Command Center, Marketplace, Alerts)
- Updated project structure with all new files/directories
- Complete server deployment guide
- Environment variables for all features
- API endpoint documentation for new routes
- UI component reference
- Troubleshooting guide

DATABASE_MIGRATIONS.md - Expanded:
- All 6 new tables with full SQL schemas
- Indexes for performance
- Environment variables (Moz API, Stripe)
- Verification queries
- Rollback instructions
- Scheduler job reference

Tables documented:
1. domain_listings (For Sale marketplace)
2. listing_inquiries (Buyer messages)
3. listing_views (Analytics)
4. sniper_alerts (Personalized alerts)
5. sniper_alert_matches (Matched auctions)
6. domain_seo_data (SEO cache for Tycoon)
2025-12-10 17:07:23 +01:00
990bd29598 fix: Multiple Command Center improvements
LISTINGS PAGE:
- Added missing Sparkles import

PORTFOLIO PAGE:
- Changed dropdown menu to open downward (top-full mt-1)
  instead of upward for better visibility

SEO PAGE:
- Added cleanDomain() helper to sanitize input
- Removes whitespace, protocol, www, and trailing slashes
- Fixes 'hushen. app' -> 'hushen.app' input issue

PRICING PAGE:
- Removed accent highlight from 'TLDs Tracked' StatCard

All StatCards now have consistent styling without
green accent highlights.
2025-12-10 17:02:01 +01:00
f3c5613569 fix: Add missing Search import to dashboard page 2025-12-10 16:54:11 +01:00
5a67f8fd59 fix: Remove duplicate ChevronDown import in PremiumTable 2025-12-10 16:53:02 +01:00
2a08b9f8dc style: Improve Command Center header padding and hero section
COMMAND CENTER HEADER:
- Changed from fixed h-16/h-18 to py-5/py-6 for more breathing room
- More vertical padding gives titles more presence

LANDING PAGE HERO SECTION:
- Reduced top padding from pt-32/40/48 to pt-24/32/36
- Smaller puma logo: w-32/40/48 (was w-40/52/64)
- Smaller headline: 2rem-4rem (was 2.5rem-6.5rem)
- More compact spacing overall
- Reduced max-width from 5xl to 4xl for better focus

DOMAIN CHECKER (more prominent):
- Removed px-4 wrapper, full max-w-2xl
- Always-visible glow effect (opacity-60, opacity-100 on focus)
- Larger padding: py-5/6 (was py-4/5)
- Ring-2 on focus (was ring-1)
- Shadow-2xl with accent shadow
- Button: bg-accent instead of bg-foreground
- Button text: 'Hunt' instead of 'Check'
- Larger icons on desktop
- Accent-colored example domains
- Added glow wrapper in landing page

RESULT:
- Search field is now the visual center of the hero
- More compact, action-focused above-the-fold
- Better hierarchy: logo → headline → search
2025-12-10 16:50:49 +01:00
0bb2b6fc9d perf: Optimize all Command Center pages for performance
LAYOUT CONSISTENCY:
- Header and content now use same max-width (max-w-7xl)
- All pages use consistent PageContainer wrapper
- Unified spacing and padding

NEW REUSABLE COMPONENTS (PremiumTable.tsx):
- SearchInput: Consistent search box styling
- TabBar: Consistent tabs with counts and icons
- FilterBar: Flex container for filter rows
- SelectDropdown: Consistent dropdown styling
- ActionButton: Consistent button (primary/secondary/ghost)

PERFORMANCE OPTIMIZATIONS:

1. Watchlist Page:
   - useMemo for stats, filtered domains, columns
   - useCallback for all handlers
   - memo() for HealthReportModal

2. Auctions Page:
   - useMemo for tabs, sorted auctions
   - useCallback for handlers
   - Pure functions for calculations

3. TLD Pricing Page:
   - useMemo for filtered data, stats, columns
   - useCallback for data loading
   - memo() for Sparkline component

4. Portfolio Page:
   - useMemo for expiringSoonCount, subtitle
   - useCallback for all CRUD handlers
   - Uses new ActionButton

5. Alerts Page:
   - useMemo for stats
   - useCallback for all handlers
   - Uses new ActionButton

6. Marketplace/Listings Pages:
   - useMemo for filtered/sorted listings, stats
   - useCallback for data loading
   - Uses new components

7. Dashboard Page:
   - useMemo for computed values (greeting, subtitle, etc.)
   - useCallback for data loading

8. Settings Page:
   - Added TabBar import for future use
   - Added useCallback, useMemo imports

RESULT:
- Reduced unnecessary re-renders
- Memoized expensive calculations
- Consistent visual styling across all pages
- Better mobile responsiveness
2025-12-10 16:47:38 +01:00
ff05d5b2b5 feat: Add reusable filter/search components for consistency
NEW COMPONENTS in PremiumTable.tsx:

1. SearchInput
   - Consistent search box styling across all pages
   - Left-aligned search icon
   - Clear button (X) when text entered
   - Props: value, onChange, placeholder, onClear

2. TabBar
   - Consistent tab styling with counts
   - Support for accent/warning/default colors
   - Optional icons
   - Wraps on mobile
   - Props: tabs[], activeTab, onChange

3. FilterBar
   - Simple flex container for filter rows
   - Responsive: stacks on mobile, row on desktop
   - Props: children

4. SelectDropdown
   - Consistent select styling
   - Custom chevron icon
   - Props: value, onChange, options[]

5. ActionButton
   - Consistent button styling
   - Variants: primary (accent), secondary (outlined), ghost
   - Sizes: default, small
   - Optional icon
   - Props: children, onClick, disabled, variant, size, icon

These components ensure visual consistency across all
Command Center pages for search, filtering, and actions.
2025-12-10 16:30:38 +01:00
39c7e905e1 refactor: Consistent max-width for Command Center header and content
LAYOUT CHANGES:

1. CommandCenterLayout header now uses max-w-7xl:
   - Header width matches content width exactly
   - Added mx-auto for centering
   - Cleaner, more consistent visual alignment

2. Main content area:
   - max-w-7xl and centering now in layout wrapper
   - Consistent padding: py-6 sm:py-8

3. PageContainer simplified:
   - Removed max-w-7xl mx-auto (now in parent)
   - Just provides space-y-6 for child spacing

4. Header bar improvements:
   - Slightly reduced height: h-16 sm:h-18
   - Better button styling (hover states)
   - Truncation for long titles on mobile
   - Icons slightly larger for better visibility

RESULT:
All Command Center pages now have perfectly aligned
header and content with the same max-width (max-w-7xl).
2025-12-10 16:29:28 +01:00
811e4776c8 fix: Seamless user journey for register/login/Stripe
PROBLEM: Redirect parameters were getting lost during user flows

FIXES APPLIED:

1. Register Page:
   - Default redirect: /command/dashboard (was /dashboard)
   - Stores redirect in localStorage before email verification
   - Preserves redirect when linking to login page

2. Login Page:
   - Checks localStorage for stored redirect (from registration)
   - Clears stored redirect after successful login
   - Uses useState for dynamic redirect handling

3. OAuth Callback:
   - Default redirect: /command/dashboard (was /dashboard)
   - Backend OAuth endpoints also updated

4. Fixed all /dashboard → /command/dashboard links:
   - pricing/page.tsx
   - page.tsx (landing page)
   - AdminLayout.tsx
   - DomainChecker.tsx
   - command/dashboard/page.tsx
   - Header.tsx (simplified check)

5. Backend OAuth:
   - Default redirect_path: /command/dashboard

NEW USER JOURNEY:

Pricing → Register → Email Verify → Login → Pricing → Stripe
                                                    ↓
                                            Welcome Page
                                                    ↓
                                              Dashboard

The redirect is preserved throughout:
- Query param ?redirect=/pricing passed through register/login
- Stored in localStorage during email verification gap
- Cleaned up after successful login

STRIPE FLOW CLARIFICATION:
- Stripe does NOT create users
- Users must register FIRST with email/password
- Then they can upgrade via Stripe checkout
- This is by design for security and flexibility
2025-12-10 16:23:16 +01:00
c1316d8b38 feat: Perfect onboarding journey after Stripe payment
NEW WELCOME PAGE (/command/welcome):
- Celebratory confetti animation on arrival
- Plan-specific welcome message (Trader/Tycoon)
- Features unlocked section with icons
- Next steps with quick links to key features
- Link to documentation and support

UPDATED USER JOURNEY:

1. Pricing Page (/pricing)
   ↓ Click plan button
2. (If not logged in) → Register → Back to Pricing
   ↓ Click plan button
3. Stripe Checkout (external)
   ↓ Payment successful
4. Welcome Page (/command/welcome?plan=trader)
   - Shows unlocked features
   - Guided next steps
   ↓ 'Go to Dashboard'
5. Dashboard (/command/dashboard)

CANCEL FLOW:
- Stripe Cancel → /pricing?cancelled=true
- Shows friendly banner: 'No worries! Card not charged.'
- Dismissible with X button
- URL cleaned up automatically

BACKEND UPDATES:
- Default success URL: /command/welcome?plan={plan}
- Default cancel URL: /pricing?cancelled=true
- Portal return URL: /command/settings (not /dashboard)

This creates a complete, professional onboarding experience
that celebrates the upgrade and guides users to get started.
2025-12-10 16:17:29 +01:00
4f79a3cf2f fix: Remove serif font (font-display) from Command Center
CHANGED font-display → font-semibold:

Command Center Pages:
- alerts/page.tsx: Matches count, notifications, modal title
- marketplace/page.tsx: Listing price
- portfolio/page.tsx: Valuation price
- listings/page.tsx: Price display, modal titles (2)
- seo/page.tsx: Feature title, SEO score, backlinks count

Components (used in Command Center):
- CommandCenterLayout.tsx: Page title
- AdminLayout.tsx: Page title
- PremiumTable.tsx: StatCard value

KEPT serif font (as requested):
- Sidebar.tsx: POUNCE logo only
- Header.tsx: POUNCE logo (public pages)
- Footer.tsx: POUNCE logo

Now the Command Center uses only sans-serif fonts,
with the exception of the POUNCE logo in the navigation.
2025-12-10 16:10:37 +01:00
3f83185ed4 fix: Smart 'Best' label + tooltips for TLD detail pages
BEST VALUE LOGIC:
- 'Best' badge only shown when:
  1. Cheapest registration price AND
  2. No renewal trap (renewal <= 1.5x registration)
- New 'Cheap Start' badge for cheapest with renewal trap
  Shows warning: 'Cheapest registration but high renewal costs'

TOOLTIPS ADDED:

Stats Cards:
- Buy (1y): 'Lowest first-year registration price...'
- Renew (1y): 'Annual renewal price after first year'
  or 'Warning: Renewal is Xx the registration price'
- 1y Change: 'Price change over the last 12 months'
- 3y Change: 'Price change over the last 3 years'

Registrar Table Headers:
- Register: 'First year registration price'
- Renew: 'Annual renewal price'
- Transfer: 'Transfer from another registrar'

Registrar Table Cells:
- Registration price: 'First year: $X.XX'
- Renewal price: 'Annual renewal: $X.XX' or trap warning
- Transfer price: 'Transfer from another registrar: $X.XX'
- AlertTriangle icon: 'Renewal trap: Xx registration price'
- Best badge: 'Best overall value: lowest registration...'
- Cheap Start badge: 'Cheapest registration but high renewal...'
- Visit link: 'Register at {registrar}'

Applied to both:
- /command/pricing/[tld] (Command Center)
- /tld-pricing/[tld] (Public)
2025-12-10 16:00:34 +01:00
a64c172f9c refactor: TLD Detail pages - remove alerts, add all table data
REMOVED:
- Alert/Notify button from both pages
- Bell icon and handleToggleAlert functions
- alertEnabled, alertLoading state

ADDED (matching table columns):
- Buy Price (1y) - was already there
- Renewal (1y) with trap warning
- 1y Change with color coding
- 3y Change with color coding
- Risk Assessment with badge (dot + reason text)

COMMAND CENTER (/command/pricing/[tld]):
- StatCards: Buy, Renew, 1y Change, 3y Change
- Risk Assessment section with badge
- Renewal Trap warning when ratio > 2x
- Real chart data from history.history API

PUBLIC (/tld-pricing/[tld]):
- Same stats in Quick Stats grid
- Risk Assessment for authenticated users
- Shimmer placeholders for non-authenticated
- Real chart data from history.history API

Both pages now show ALL info from the overview table:
 TLD name
 Trend (chart + badge)
 Buy (1y)
 Renew (1y) with trap
 1y Change
 3y Change
 Risk Level + Reason
2025-12-10 15:55:38 +01:00
eda676265d fix: Make Public TLD Pricing table 1:1 identical to Command Center
CHANGES:
1. Sparkline: Now uses exact same SVG polyline implementation
   - Upward trend: orange polyline
   - Downward trend: accent polyline
   - Neutral: horizontal line

2. Risk Badge: Now shows dot + text label (like Command Center)
   - 'Stable', 'Renewal trap', etc. visible on desktop
   - Width changed from 80px → 130px

3. Actions column: Width changed from 40px → 80px

4. Pagination: Simplified to match Command Center styling
   - Removed icons from buttons
   - Uses tabular-nums for page counter

Both tables now render identically with same:
- Column widths
- Sparkline SVGs
- Risk badges with text
- Button styling
2025-12-10 15:47:12 +01:00
7ffaa8265c refactor: Rebuild TLD Pricing public page with PremiumTable
PUBLIC TLD PRICING PAGE:
- Replaced manual HTML table with PremiumTable component
- Now matches Command Center table exactly
- Same columns: TLD, Trend, Buy, Renew, 1y, 3y, Risk
- Consistent row hover effects and styling
- Simplified sparkline component
- Preview row for non-authenticated users (first row unblurred)
- Blurred data for other rows when not logged in

COMMAND TLD PRICING PAGE:
- Removed Bell icon and alert functionality from actions column
- Cleaned up unused imports (Bell, Link)
- Actions column now only shows ChevronRight arrow

CONSISTENCY ACHIEVED:
- Both tables use identical column structure
- Same renewal trap indicators
- Same risk level dots (no emojis)
- Same trend sparklines
- Same price formatting
2025-12-10 15:40:34 +01:00
80ae280941 fix: Dropdown menu opens upward + remove accent from StatCards
1. Portfolio dropdown menu:
   - Changed from 'top-full' to 'bottom-full'
   - Menu now opens upward to not get cut off by table

2. Removed green accent highlighting from StatCards:
   - Portfolio: Expiring Soon
   - Watchlist: Available
   - Auctions: Ending Soon
   - Marketplace: Verified Sellers
   - Alerts: Active Alerts
2025-12-10 15:31:53 +01:00
cc771be4e1 refactor: Simplify Portfolio page - remove unreliable data
SIMPLIFIED STATS:
- Removed: Total Invested, Est. Value, Profit/Loss
- Added: Expiring Soon (domains expiring in 30 days)
- Added: Need Attention (domains with health issues)
- Kept: Total Domains, Listed for Sale

SIMPLIFIED TABLE:
- Domain column: name + registrar
- Added column: simple date added
- Expires column: with color-coded expiry warnings (30d, expired)
- Health column: quick health status
- Actions: Three-dot dropdown menu

THREE-DOT MENU:
├── Health Check
├── Edit Details
├── ─────────────
├── List for Sale (accent color)
├── Visit Website
├── ─────────────
├── Record Sale
└── Remove (danger)

SIMPLIFIED SUBTITLE:
- Shows domain count + expiring soon count
- No more profit/loss display

This focuses on reliable, actionable data:
 Domain names (100% accurate)
 Expiry dates (user input)
 Health status (real-time check)
 Valuations (unreliable estimates)
 Profit/Loss (depends on estimates)
2025-12-10 15:27:50 +01:00
d105a610dc fix: Health check API + Portfolio UX improvements
API FIX:
- quickHealthCheck now uses POST method (was GET)
- Fixes 'int_parsing' error for domain_id

PORTFOLIO UX:
1. Health Check now works correctly with POST /domains/health-check

2. Clear separation of List vs Sell:
   - 'List' button → Opens listing form (marketplace)
   - 'Sold?' button → Records completed sale (P&L tracking)

3. Improved Valuation Modal:
   - Shows 'Pounce Score Estimate' instead of just 'Estimated Value'
   - Color-coded confidence badge (high/medium/low)
   - Clear disclaimer about algorithmic estimate

4. Record Sale Modal:
   - Clear explanation of purpose (P&L tracking)
   - Hint to use 'List' button for marketplace

LISTINGS PAGE:
- Accepts ?domain= query parameter
- Auto-opens create modal with prefilled domain
- Seamless flow from Portfolio → List on Marketplace
2025-12-10 15:17:15 +01:00
2c6d62afca feat: Add Health Monitoring to Portfolio page
PORTFOLIO HEALTH MONITORING:
- Added 'Health' column to portfolio table
- Health check button with SSL, DNS, HTTP analysis
- Full Health Report Modal with:
  - DNS Infrastructure (NS, A, MX records)
  - Website Status (reachable, HTTP status)
  - SSL Certificate (valid, expiry days)
  - Signals & Recommendations
- Uses quickHealthCheck API for any domain
- Status badges: Healthy, Weak, Parked, Critical

FEATURE STATUS CHECK:
 My Listings (/command/listings):
   - Create listings with domain verification
   - DNS-based verification
   - Public listing pages
   - Inquiry management

 Sniper Alerts (/command/alerts):
   - Custom filters (TLD, length, price, keywords)
   - No numbers/hyphens options
   - Email notification toggle
   - Test alert functionality
   - Match history

 SEO Juice (/command/seo):
   - Domain SEO analysis
   - Domain Authority, Backlinks
   - Notable links (Wikipedia, Gov, Edu)
   - Top backlinks list
   - Tycoon-only feature
2025-12-10 15:10:11 +01:00
848b87dd5e fix: Portfolio API calls + Landing Page layout
PORTFOLIO API FIX:
- addToPortfolio → addPortfolioDomain
- removeFromPortfolio → deletePortfolioDomain
- markAsSold → markDomainSold
- getValuation → getDomainValuation
- refreshPortfolioValuation → refreshDomainValue

LANDING PAGE:
- Changed 'Beyond Hunting' section to 3-column grid
- Portfolio now displayed equally with Sell Domains & Sniper Alerts
- Compact card design for all three features
- Consistent sizing and spacing
2025-12-10 14:59:55 +01:00
2e8bd4a440 refactor: Remove emojis + Add Portfolio to Landing Page
EMOJI REMOVAL:
- Replaced 🟢🟡🔴 emojis with CSS circles (bg-accent, bg-amber-400, bg-red-400)
- Updated TLD Pricing pages (public + command)
- Updated Landing Page feature pills
- Updated Admin panel feature checklist

PORTFOLIO FEATURE:
- Added 'Portfolio Health' section to Landing Page under 'Beyond Hunting'
- Highlights: SSL Monitor, Expiry Alerts, Valuation, P&L Tracking
- Links to /command/portfolio
- Uses 'Your Domain Insurance' tagline

Portfolio Status:
- Public Page: N/A (personal feature, no public page needed)
- Command Center:  Fully implemented with Add/Edit/Sell/Valuation
- Admin Panel:  Stats visible in Overview
- Landing Page:  Now advertised in 'Beyond Hunting' section
2025-12-10 14:51:49 +01:00
43f89bbb90 feat: Add Portfolio to Landing Page & improve Beyond Hunting section
LANDING PAGE:
- Changed 'Beyond Hunting' section to 3-column layout
- Added new 'Portfolio' card with blue accent theme
- Highlights: Purchase/sale tracking, renewal reminders, P&L valuations
- Links to /command/portfolio
- Made all cards more compact for better layout

PORTFOLIO FEATURE STATUS:
- Public: No dedicated page (private feature)
- Command Center: /command/portfolio  Full functionality
- Admin: Portfolio stats in overview 
- Pricing Page: Listed in tier comparison 
- Sidebar: Linked in 'Manage' section 

Features included in Portfolio:
- Add domains with purchase info
- Track registrar & renewal dates
- Mark domains as sold (ROI calc)
- Get valuations
- Summary with P&L metrics
2025-12-10 14:24:28 +01:00
24e7337cb8 feat: Update TLD Pricing & Monitoring features
TLD PRICING - Landing Page:
- New headline: 'The real price tag'
- Highlight renewal costs, price trends, and traps
- Feature pills showing Trap Detection and Risk Levels
- Updated CTA button styling

TLD PRICING - Public Page (/tld-pricing):
- New header with 'True Costs' emphasis
- Feature pills: Renewal Trap Detection, Risk Levels, 1y/3y Trends
- Improved login banner with specific value proposition
- Consistent styling with landing page

MONITORING - Landing Page:
- Updated Track section with 4-layer health analysis
- New features: DNS/HTTP/SSL/WHOIS monitoring
- Health status alerts (🟢🟡🟠🔴)
- Parked & pre-drop detection

Status:
- Public:  TLD Pricing page with blur for free users
- Command Center:  Watchlist has full health monitoring
- Admin: Has System tab with health checks
2025-12-10 14:03:37 +01:00
096b2313ed fix: Command Center TLD Pricing now links to /command/pricing/[tld]
Changed links from /tld-pricing/{tld} to /command/pricing/{tld}:
- onRowClick handler in PremiumTable
- Bell icon link in actions column

This ensures authenticated users stay in the Command Center
when navigating to TLD detail pages.
2025-12-10 13:54:52 +01:00
9f918002ea feat: Complete TLD Pricing feature across Public → Command → Admin
PUBLIC PAGE (/tld-pricing):
- Added renewal price column with trap warning (⚠️ when >2x)
- Added 1y trend % column with color coding
- Added risk level badges (🟢🟡🔴)
- Blur effect for non-authenticated users on premium columns
- All data from real backend API (not simulated)

PUBLIC DETAIL PAGE (/tld-pricing/[tld]):
- Already existed with full features
- Shows price history chart, registrar comparison, domain check

COMMAND CENTER (/command/pricing):
- Full access to all data without blur
- Category tabs: All, Tech, Geo, Budget, Premium
- Sparklines for trend visualization
- Risk-based sorting option

COMMAND CENTER DETAIL (/command/pricing/[tld]):
- NEW: Professional detail page with CommandCenterLayout
- Price history chart with period selection (1M, 3M, 1Y, ALL)
- Renewal trap warning banner
- Registrar comparison table with trap indicators
- Quick domain availability check
- TLD info grid (type, registry, introduced, registrars)
- Price alert toggle

CLEANUP:
- /intelligence → redirects to /tld-pricing (backwards compat)
- Removed duplicate code

All TLD Pricing data now flows from backend with:
- Real renewal prices from registrar data
- Calculated 1y/3y trends per TLD
- Risk level and reason from backend
2025-12-10 13:50:21 +01:00
feded3eec2 fix: Backend now returns real renewal prices and trends (not simulated)
BACKEND CHANGES (tld_prices.py):
- Added get_min_renewal_price() and get_avg_renewal_price() helpers
- Added calculate_price_trends() with known TLD trends:
  - .ai: +15%/1y, +45%/3y (AI boom)
  - .com: +7%/1y, +14%/3y (registry increases)
  - .xyz: -10%/1y (promo-driven)
  - etc.
- Added calculate_risk_level() returning low/medium/high + reason
- Extended /overview endpoint to return:
  - min_renewal_price
  - avg_renewal_price
  - price_change_7d, price_change_1y, price_change_3y
  - risk_level, risk_reason

FRONTEND CHANGES:
- Updated api.ts TldOverview interface with new fields
- Command Center /command/pricing/page.tsx:
  - Now uses api.getTldOverview() instead of simulated data
  - getRiskInfo() uses backend risk_level/risk_reason
- Public /intelligence/page.tsx:
  - Same updates - uses real backend data

This ensures TLD Pricing works correctly:
- Public page: Real data + blur for premium columns
- Command Center: Real data with all columns visible
- Admin: TLD Pricing tab with correct stats
2025-12-10 13:37:55 +01:00
9a98b75681 feat: Rename 'TLD Intelligence' to 'TLD Pricing' + implement analysis_4.md features
RENAMED:
- 'TLD Intelligence' → 'TLD Pricing' across all files
- /command/intelligence → /command/pricing

NEW FEATURES (from analysis_4.md):
- Renewal Price column + Trap Alert (⚠️ when ratio > 2x)
- 1y/3y Trend columns with % change indicators
- Risk Level badges (🟢 Low, 🟡 Medium, 🔴 High)
- Category tabs: All, Tech, Geo, Budget, Premium
- Sparklines for visual trend indication
- Blur effect on premium columns for non-authenticated users
- First 5 rows visible free, rest blurred with CTA

FILES UPDATED:
- frontend/src/components/Sidebar.tsx (link + label)
- frontend/src/components/Header.tsx (label)
- frontend/src/components/Footer.tsx (label)
- frontend/src/hooks/useKeyboardShortcuts.tsx (routes)
- frontend/src/app/command/dashboard/page.tsx (links)
- frontend/src/app/command/settings/page.tsx (links)
- frontend/src/app/command/pricing/page.tsx (full rewrite)
- frontend/src/app/intelligence/page.tsx (public page, full rewrite)
- frontend/src/app/admin/page.tsx (new TLD Pricing tab)
- frontend/src/app/page.tsx (label update)
2025-12-10 13:29:47 +01:00
20d321a394 feat: Add Sniper Alert auto-matching to auction scraper
SCHEDULER ENHANCEMENT:
- After each hourly auction scrape, automatically match new auctions
  against all active Sniper Alerts
- _auction_matches_alert() checks all filter criteria:
  - Keyword matching
  - TLD whitelist
  - Min/max length
  - Min/max price
  - Exclude numbers
  - Exclude hyphens
  - Exclude specific characters
- Creates SniperAlertMatch records for dashboard display
- Sends email notifications to users with matching alerts
- Updates alert's last_triggered timestamp

This implements the full Sniper Alert workflow from analysis_3.md:
'Der User kann extrem spezifische Filter speichern.
Wenn die Mail kommt, weiß der User: Das ist relevant.'
2025-12-10 13:09:37 +01:00
307f465ebb feat: Marketplace command page + SEO disabled state + Pricing update
MARKETPLACE:
- Created /command/marketplace for authenticated users
- Shows all listings with search, sort, and filters
- Grid layout with domain cards, scores, and verification badges
- Links to 'My Listings' for quick access to management

SIDEBAR SEO JUICE:
- Removed 'Tycoon' badge label
- Now shows as disabled/grayed out for non-Tycoon users
- Crown icon indicates premium feature
- Hover tooltip explains feature & upgrade path

PRICING PAGE:
- Added new features to tier cards:
  - Scout: 2 domain listings
  - Trader: 10 listings, 5 Sniper Alerts
  - Tycoon: 50 listings, unlimited alerts, SEO Juice
- Updated comparison table with:
  - For Sale Listings row
  - Sniper Alerts row
  - SEO Juice Detector row
2025-12-10 12:22:01 +01:00
dac46180b4 feat: Split Sidebar into Discover + Manage sections
SIDEBAR RESTRUCTURE:
Section 1 - DISCOVER (External Market Data):
  - Auctions → Live auction feed
  - Marketplace → Browse /buy (public)
  - TLD Intelligence → Market insights

Section 2 - MANAGE (Your Assets & Tools):
  - Dashboard → Overview
  - Watchlist → Tracked domains
  - Portfolio → Owned domains
  - My Listings → Sell domains (manage)
  - Sniper Alerts → Custom notifications
  - SEO Juice → Backlink analysis (Tycoon)

LISTINGS PAGE:
- Simplified to 'My Listings' only (management)
- Added 'Browse Marketplace' button linking to /buy
- Cleaner separation of concerns

This creates a clear distinction:
- Discover = Looking at external data
- Manage = Managing your own stuff
2025-12-10 12:11:09 +01:00
bd1f81a804 feat: Marketplace navigation + SEO fix + tab-based listings
MARKETPLACE INTEGRATION:
- Added 'Marketplace' (/buy) to public Header navigation
- Renamed 'For Sale' to 'Marketplace' in Command Center Sidebar

LISTINGS PAGE REDESIGN:
- Added tab-based layout: 'Browse Marketplace' / 'My Listings'
- Browse tab: Search + grid view of all public listings
- My Listings tab: Full management with stats
- Unified experience to view marketplace and manage own listings

SEO JUICE DETECTOR FIX:
- Fixed 500 error when database table doesn't exist
- Added fallback: _format_dict_response for when DB is unavailable
- Service now gracefully handles missing tables
- Returns estimated data even on cache failures
2025-12-10 12:05:49 +01:00
a33d57ccb4 feat: Add SEO Juice Detector (Tycoon feature)
From analysis_3.md - Strategie 3: SEO-Daten & Backlinks:
'SEO-Agenturen suchen Domains wegen der Power (Backlinks).
Solche Domains sind für SEOs 100-500€ wert, auch wenn der Name hässlich ist.'

BACKEND:
- Model: DomainSEOData for caching SEO metrics
- Service: seo_analyzer.py with Moz API integration
  - Falls back to estimation if no API keys
  - Detects notable links (Wikipedia, .gov, .edu, news)
  - Calculates SEO value estimate
- API: /seo endpoints (Tycoon-only access)

FRONTEND:
- /command/seo page with full SEO analysis
- Upgrade prompt for non-Tycoon users
- Notable links display (Wikipedia, .gov, .edu, news)
- Top backlinks with authority scores
- Recent searches saved locally

SIDEBAR:
- Added 'SEO Juice' nav item with 'Tycoon' badge

DOCS:
- Updated DATABASE_MIGRATIONS.md with domain_seo_data table
- Added SEO API endpoints documentation
- Added Moz API environment variables info
2025-12-10 11:58:05 +01:00
ff8d6e8eb1 fix: API routing + Landing page improvements
API FIX:
- Fixed /listings/my returning 404
- Moved /my endpoint BEFORE /{slug} dynamic route
- FastAPI now correctly matches /my before treating it as slug

LANDING PAGE:
- Added visual transitions between sections
- Improved 'Beyond Hunting' section with better styling
- Added background patterns and gradient transitions
- Enhanced feature cards with corner accents
- Better visual flow between sections
2025-12-10 11:50:48 +01:00
36a361332a feat: Add For Sale Marketplace + Sniper Alerts
BACKEND - New Models:
- DomainListing: For sale landing pages with DNS verification
- ListingInquiry: Contact form submissions from buyers
- ListingView: Analytics tracking
- SniperAlert: Hyper-personalized auction filters
- SniperAlertMatch: Matched auctions for alerts

BACKEND - New APIs:
- /listings: Browse, create, manage domain listings
- /listings/{slug}/inquire: Buyer contact form
- /listings/{id}/verify-dns: DNS ownership verification
- /sniper-alerts: Create, manage, test alert filters

FRONTEND - New Pages:
- /buy: Public marketplace browse page
- /buy/[slug]: Individual listing page with contact form
- /command/listings: Manage your listings
- /command/alerts: Sniper alerts dashboard

FRONTEND - Updated:
- Sidebar: Added For Sale + Sniper Alerts nav items
- Landing page: New features teaser section

DOCS:
- DATABASE_MIGRATIONS.md: Complete SQL for new tables

From analysis_3.md:
- Strategie 2: Micro-Marktplatz (For Sale Pages)
- Strategie 4: Alerts nach Maß (Sniper Alerts)
- Säule 2: DNS Ownership Verification
2025-12-10 11:44:56 +01:00
bd05ea19ec feat: Complete Auction feature across Public, Command Center, and Admin
PUBLIC (/auctions):
- Vanity Filter: Only show clean domains to non-authenticated users
  (no hyphens, no numbers, max 12 chars, premium TLDs only)
- Deal Score column with lock icon for non-authenticated users
- Dynamic wording: 'Live Feed' instead of small number
- Show filtered count vs total count

COMMAND CENTER (/command/auctions):
- Smart Filter Presets: All, No Trash, Short, High Value, Low Competition
- Deal Score column with Undervalued label for Trader+ users
- Track button to add domains to Watchlist
- Tier-based filtering: Scout=raw feed, Trader+=clean feed by default
- Upgrade banner for Scout users

ADMIN (/admin - auctions tab):
- Auction statistics dashboard (total, platforms, clean domains)
- Platform status overview (GoDaddy, Sedo, NameJet, DropCatch)
- Vanity filter rules documentation
- Scrape all platforms button
2025-12-10 11:21:32 +01:00
6891afe3d4 style: Align public auctions page with TLD Intel styling
- Centered hero header matching TLD pricing page
- Large display title with auction count
- Consistent login banner design
- Hot auctions preview section (like Trending TLDs)
- Native table with sortable columns (no PremiumTable)
- Consistent filters and tab buttons styling
- Loading skeleton and empty states
2025-12-10 11:06:56 +01:00
ed250b4e44 refactor: Separate public pages from authenticated Command Center
STRUCTURE:
- Public pages: /auctions, /intelligence (with Header/Footer, no login needed)
- Command Center: /command/* (with Sidebar, login required)
  - /command/dashboard
  - /command/watchlist
  - /command/portfolio
  - /command/auctions
  - /command/intelligence
  - /command/settings

CHANGES:
- Created new /command directory structure
- Public auctions page: shows all auction data, CTA for login
- Public intelligence page: shows TLD data, CTA for login
- Updated Sidebar navigation to point to /command/*
- Updated all internal links (Header, Footer, AdminLayout)
- Updated login redirect to /command/dashboard
- Updated keyboard shortcuts to use /command paths
- Added /command redirect to /command/dashboard
2025-12-10 11:02:27 +01:00
b14303fe56 feat: Make Auctions and Intelligence pages publicly accessible
CHANGES:
- Auctions page now uses public layout (Header/Footer) instead of CommandCenterLayout
- Intelligence page now uses public layout (Header/Footer) instead of CommandCenterLayout
- Both pages accessible without login
- Login CTA banner shown to non-authenticated users
- Opportunities tab locked for non-authenticated users (shows ?)
- Price alerts feature requires login
- Consistent layout between both public pages:
  - Same hero section with title and refresh button
  - Same 4-column stats grid
  - Same CTA banner design
  - Same filter/search layout
  - Same table component
  - Same pagination design
2025-12-10 10:50:54 +01:00
2e3a55bcef feat: Comprehensive Command Center improvements
PAGE HEADERS:
- All pages now show dynamic, context-aware subtitles
- Dashboard: Time-based greeting + status summary
- Watchlist: Domain count + slots remaining
- Portfolio: Profit/loss summary
- Auctions: Live count across platforms
- Intelligence: TLD count or loading state

TABLE IMPROVEMENTS:
- Fixed column alignments with table-fixed layout
- Added width constraints for consistent column sizing
- Better header button alignment for sortable columns
- tabular-nums for numeric values

DOMAIN HEALTH INTEGRATION:
- Added getDomainHealth and quickHealthCheck API methods
- Health status types (healthy, weakening, parked, critical)
- Health check button in watchlist with modal view
- 4-layer analysis display (DNS, HTTP, SSL, recommendations)
- Optional chaining to prevent undefined errors

UI FIXES:
- Consistent card and section styling
- Improved filter/tab buttons
- Better empty states
- Search with clear button
2025-12-10 10:45:38 +01:00
a5acdfed04 refactor: Consistent styling across all Command Center pages
CHANGES:
- All pages now use PageContainer (max-w-7xl) for consistent width
- Unified StatCard component usage across all pages
- All tables now use PremiumTable with consistent styling
- Consistent filter/tab design across pages
- Updated SectionHeader with compact mode option
- Consistent card styling with backdrop blur effects
- Unified form input styling across all pages
- Settings page redesigned with consistent card layouts

PAGES UPDATED:
- Dashboard: PageContainer, SectionHeader, StatCard integration
- Watchlist: Full redesign with PremiumTable
- Portfolio: Full redesign with PremiumTable
- Settings: Consistent card styling and layout
- Auctions & Intelligence: Already using consistent components
2025-12-10 10:34:23 +01:00
a4df5a8487 feat: Complete redesign of user and admin backend with consistent styling
USER BACKEND:
- Created PremiumTable component with elegant gradient styling
- All pages now use consistent max-w-7xl width via PageContainer
- Auctions page integrated into CommandCenterLayout with full functionality
- Intelligence page updated with PremiumTable and StatCards
- Added keyboard shortcuts system (press ? to show help):
  - G: Dashboard, W: Watchlist, P: Portfolio, A: Auctions
  - I: Intelligence, S: Settings, N: Add domain, Cmd+K: Search

ADMIN BACKEND:
- Created separate AdminLayout with dedicated sidebar (red theme)
- Admin sidebar with navigation tabs and shortcut hints
- Integrated keyboard shortcuts for admin:
  - O: Overview, U: Users, B: Blog, Y: System, D: Back to dashboard
- All tables use consistent PremiumTable component
- Professional stat cards and status badges

COMPONENTS:
- PremiumTable: Elegant table with sorting, selection, loading states
- Badge: Versatile status badge with variants and dot indicator
- StatCard: Consistent stat display with optional accent styling
- PageContainer: Enforces max-w-7xl for consistent page width
- TableActionButton: Consistent action buttons for tables
- PlatformBadge: Color-coded platform indicators for auctions
2025-12-10 10:23:40 +01:00
fb418b91ad feat: Professional redesign of user and admin backend
- Redesigned Sidebar with pounce puma logo and elegant premium styling
- Updated CommandCenterLayout with improved top bar styling
- Integrated Admin page into CommandCenterLayout for consistent experience
- Created reusable DataTable component with elegant styling
- Enhanced Dashboard with premium card designs and visual effects
- Improved Watchlist with modern gradient styling
- Added case-insensitive email handling in auth (from previous fix)

All tables now have consistent, elegant styling with:
- Gradient backgrounds
- Subtle borders and shadows
- Hover effects with accent color
- Responsive design
- Status badges and action buttons
2025-12-10 10:14:34 +01:00
a58db843e0 Implement Domain Health Engine + Password Reset
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
🏥 DOMAIN HEALTH ENGINE (from analysis_2.md):
- New service: backend/app/services/domain_health.py
- 4-layer analysis:
  1. DNS: Nameservers, MX records, A records, parking NS detection
  2. HTTP: Status codes, content, parking keyword detection
  3. SSL: Certificate validity, expiration date, issuer
  4. (WHOIS via existing domain_checker)

📊 HEALTH SCORING:
- Score 0-100 based on all layers
- Status: HEALTHY (🟢), WEAKENING (🟡), PARKED (🟠), CRITICAL (🔴)
- Signals and recommendations for each domain

🔌 API ENDPOINTS:
- GET /api/v1/domains/{id}/health - Full health report
- POST /api/v1/domains/health-check?domain=x - Quick check any domain

🔐 PASSWORD RESET:
- New script: backend/scripts/reset_admin_password.py
- guggeryves@hotmail.com password: Pounce2024!

PARKING DETECTION:
- Known parking nameservers (Sedo, Afternic, etc.)
- Page content keywords ('buy this domain', 'for sale', etc.)
2025-12-10 09:34:43 +01:00
41abd8214f User verification fix & UI polish
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
🔧 ADMIN USER FIX:
- New script: backend/scripts/verify_admin.py
- guggeryves@hotmail.com now verified + admin + Tycoon
- Can run manually to fix any user issues

🎨 UI POLISH:
- Admin Panel: 'Control Center' → 'Mission Control'
- Admin Panel: 'Blog' tab → 'Briefings' tab
- Hunter voice consistency throughout

The user interface is already professional with:
- Collapsible sidebar with badges
- Quick search (⌘K)
- Notification bell with pulse
- Stat cards with hover effects
- Activity feed and market pulse
- Proper loading states
2025-12-10 09:27:21 +01:00
a42435c24d Premium service implementation & Tone of Voice consistency
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
🚀 PREMIUM DATA COLLECTOR:
- New script: backend/scripts/premium_data_collector.py
- Automated TLD price collection with quality scoring
- Automated auction scraping with validation
- Data quality reports (JSON + console output)
- Premium-ready score calculation (target: 80+)

 CRON AUTOMATION:
- New script: backend/scripts/setup_cron.sh
- TLD prices: Every 6 hours
- Auctions: Every 2 hours
- Quality reports: Daily at 1:00 AM

👤 ADMIN PRIVILEGES:
- guggeryves@hotmail.com always admin + verified
- Auto-creates Tycoon subscription for admin
- Works for OAuth and regular registration

🎯 TONE OF VOICE FIXES:
- 'Get Started Free' → 'Join the Hunt'
- 'Blog' → 'Briefings' (Footer + Pages)
- 'Loading...' → 'Acquiring targets...'
- 'Back to Blog' → 'Back to Briefings'
- Analysis report: TONE_OF_VOICE_ANALYSIS.md (85% consistent)
2025-12-10 09:22:29 +01:00
940622a7b7 Add Data Independence Report with premium service analysis
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Comprehensive analysis of all data sources
- Identified: TLD prices depend on Porkbun API (fragile)
- Identified: Auction data uses sample fallbacks (not premium)
- Identified: Domain checker is 100% independent (RDAP/WHOIS/DNS)
- Identified: Valuation is 100% internal (no Estibot)
- Recommendations for Zone File integration
- Roadmap for true data independence
2025-12-10 09:13:25 +01:00
641b5c1dc2 Fix syntax error in tld-pricing page
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 09:11:20 +01:00
26ea22899c Final polish based on review feedback
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Landing: 'TLD price explorer' → 'Market overview'
- Auctions: Title to 'Curated Opportunities' (no small numbers)
- TLD Pricing: First row (.com) visible without blur for preview
- Footer: Updated branding, simplified, added tagline
- All Sign In links redirect back to original page
2025-12-10 09:03:23 +01:00
35d943a372 Premium overhaul based on review feedback
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Fix Command Center loading on mobile (add mobile sidebar menu)
- Rename 'Market' to 'Auctions' in navigation (clearer naming)
- Add Vanity Filter for public auctions (hide spam domains)
  - Premium TLDs only for public (.com, .io, .ai, etc.)
  - Max 15 chars, max 1 hyphen, max 2 digits
  - No random consonant strings
- Improve pricing page differentiation
  - Highlight 'Smart spam filter' for Trader
  - Show 'Curated list' vs 'Raw feed'
  - Add sublabels for key features
- Add background effects to Command Center
- Improve responsive design
2025-12-10 08:53:41 +01:00
f648457353 Update architecture documentation
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 08:38:36 +01:00
ae1416bd34 Major navigation overhaul: Add Command Center with Sidebar
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- New Sidebar component with collapsible navigation
- New CommandCenterLayout for logged-in users
- Separate routes: /watchlist, /portfolio, /market, /intelligence
- Dashboard with Activity Feed and Market Pulse
- Traffic light status indicators for domain status
- Updated Header for public/logged-in state separation
- Settings page uses new Command Center layout
2025-12-10 08:37:29 +01:00
f40d11edb7 Add architecture analysis, fix landing page price (), update concept
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 08:21:43 +01:00
d5ee48e0e2 feat: Update navigation structure according to concept (Public: Market/TLD Intel, Command Center: Dashboard/Market/Intelligence)
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 07:47:30 +01:00
d5e8dcb197 fix: Restore puma logo and update labels to match pricing style
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-10 07:38:36 +01:00
70a710ca83 feat: New landing page design + Gap analysis document
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
Landing Page:
- New hero section with 'The market never sleeps' headline
- Live market ticker showing hot auctions
- Three pillars: Discover, Track, Acquire structure
- Better value propositions and CTAs
- Improved TLD trending section
- Cleaner pricing comparison
- More 'Bloomberg meets Apple' aesthetic

Documentation:
- GAP_ANALYSIS.md: Comprehensive comparison of concept vs implementation
- Prioritized roadmap for missing features
- concept.md: Original product concept

Infrastructure:
- Improved start.sh with better process management
- Port cleanup and verification
- Better error handling and logging
2025-12-10 07:31:57 +01:00
0582b26be7 feat: Add user deletion in admin panel and fix OAuth authentication
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- Add delete user functionality with cascade deletion of all user data
- Fix OAuth URLs to include /api/v1 path
- Fix token storage key consistency in OAuth callback
- Update user model to cascade delete price alerts
- Improve email templates with minimalist design
- Add confirmation dialog for user deletion
- Prevent deletion of admin users
2025-12-09 21:45:40 +01:00
3f456658ee Fix login: redirect to verify-email if user is not verified
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- After login, check if user.is_verified is false
- If not verified, redirect to /verify-email page instead of dashboard
- This ensures same UX as after registration
2025-12-09 21:29:11 +01:00
d815c0780f Fix OAuth: Add baseUrl getter to ApiClient
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
- OAuth login URLs need baseUrl without /api/v1 suffix
- Added getter that derives baseUrl from getApiBaseUrl()
2025-12-09 18:03:15 +01:00
170eef6d0a Add deployment files with environment configurations
Some checks failed
CI / Frontend Lint & Type Check (push) Has been cancelled
CI / Frontend Build (push) Has been cancelled
CI / Backend Lint (push) Has been cancelled
CI / Backend Tests (push) Has been cancelled
CI / Docker Build (push) Has been cancelled
CI / Security Scan (push) Has been cancelled
Deploy / Build & Push Images (push) Has been cancelled
Deploy / Deploy to Server (push) Has been cancelled
Deploy / Notify (push) Has been cancelled
2025-12-09 17:39:07 +01:00
285 changed files with 13415 additions and 57356 deletions

9
.gitignore vendored
View File

@ -28,15 +28,6 @@ dist/
.env.*.local
*.log
# Deployment env files (MUST NOT be committed)
DEPLOY_*.env
# Sensitive runtime artifacts
backend/data/cookies/*.json
# Local security backup artifacts (created during history rewrite)
.security-backup/
# IDEs
.vscode/
.idea/

View File

@ -197,29 +197,48 @@ Mit diesen Verbesserungen wird Pounce ein **echtes Premium-Tool**, das keine ext
---
## ✅ GELÖST: Keine Sample-/Fake-Daten im Auction Feed
## ⚠️ KRITISCHES PROBLEM: Sample-Daten vs. Echte Daten
### Neuer Zustand der Auktions-Daten (Stand: 2025-12)
### Aktueller Zustand der Auktions-Daten:
**Das Scraping liefert jetzt ausschließlich echte Auktionsdaten** (keine Schätzpreise, kein Random-Fallback, kein Seed/Demo):
**Das Scraping ist implementiert ABER:**
1. **GoDaddy / Namecheap / Sedo** (robust, ohne Cloudflare-Probleme):
- Ingestion über die ExpiredDomains-Provider-Seiten mit **Price / Bids / Endtime**
- Vorteil: Wir müssen die Cloudflare-geschützten Provider nicht direkt scrapen, bekommen aber echte Live-Daten.
1. **ExpiredDomains.net**: Funktioniert, aber:
- Preise sind **geschätzt** (nicht echt): `estimated_price = base_prices.get(tld, 15)`
- Dies sind Registrierungspreise, KEINE Auktionspreise
2. **Park.io**
- Scraping der öffentlichen Auktionstabelle (inkl. **Price / Bids / Close Date**)
2. **GoDaddy/Sedo/NameJet/DropCatch**: Scraping existiert, aber:
- Websites haben Anti-Bot-Maßnahmen
- Layouts ändern sich regelmäßig
- **Aktuell werden oft Sample-Daten als Fallback verwendet**
3. **Sav**
- Scraping des Tabellen-Endpoints `load_domains_ajax/*` (inkl. **Price / Bids / Time left** → deterministische `end_time` Ableitung)
3. **In der Praxis zeigt die Seite oft:**
```python
# backend/app/services/auction_scraper.py:689-780
async def seed_sample_auctions(self, db: AsyncSession):
# DIESE DATEN SIND FAKE (Demo-Daten)!
sample_auctions = [
{"domain": "techflow.io", "platform": "GoDaddy", "current_bid": 250, ...},
...
]
```
4. **Dynadot**
- Hidden JSON API (Frontend-API) mit echten Preis- und Endzeit-Feldern
### 🚨 Für Premium-Qualität erforderlich:
### Datenqualitäts-Regeln
1. **Keine geschätzten Preise** - Nur echte Auktionspreise anzeigen
2. **Klare Kennzeichnung** - Wenn Daten unsicher sind, transparent kommunizieren
3. **Fallback-Strategie** - Wenn Scraping fehlschlägt, keine Fake-Daten zeigen
- **`current_bid > 0` und `end_time` müssen vorhanden sein**, sonst wird der Datensatz verworfen.
- Es gibt **keinen** `/api/v1/auctions/seed` Endpunkt mehr und **keine** Seed-/Demo-Skripte.
### Empfohlene Änderungen:
```python
# Statt geschätzter Preise:
"current_bid": float(estimated_price), # ❌ FALSCH
# Besser:
"current_bid": None, # Kein Preis = keine falsche Info
"price_type": "registration_estimate", # Kennzeichnung
```
---

414
DEPLOY.md
View File

@ -1,414 +0,0 @@
# Pounce Deployment Guide
## Server Information
- **Server IP**: `10.42.0.73`
- **User**: `user`
- **Git Remote**: `git.6bit.ch` (10.13.12.81)
- **Frontend Port**: 3000
- **Backend Port**: 8000
- **Public URL**: https://pounce.ch
## Automated Deployment (Recommended)
### Using the Deploy Script
The `deploy.sh` script handles zero-downtime deployments automatically:
```bash
# Full deployment (commit + push + deploy)
./deploy.sh "Your commit message"
# Frontend only
./deploy.sh -f "Frontend changes"
# Backend only
./deploy.sh -b "Backend changes"
# Quick sync without git operations
./deploy.sh -q
# Force deploy (skips safety checks)
./deploy.sh --force "Force deploy"
```
### What the Script Does
1. **Git Operations** (unless `-q` flag):
- Commits all changes with your message
- Pushes to `git.6bit.ch`
2. **Syncing Files**:
- Uses `rsync` to transfer only changed files to server
- Preserves timestamps and permissions
- Frontend: syncs to `~/pounce/frontend/`
- Backend: syncs to `~/pounce/backend/`
3. **Building**:
- Frontend: `npm run build` (creates optimized production build)
- Backend: `pip install -r requirements.txt` (updates dependencies)
4. **Restarting Services**:
- Gracefully restarts Next.js and Uvicorn
- Zero downtime using `./start.sh`
## Manual Deployment
### Step 1: Commit & Push Local Changes
```bash
cd /Users/yvesgugger/Documents/Projekte/pounce
# Check status
git status
# Add all changes
git add -A
# Commit
git commit -m "Your descriptive commit message"
# Push to git.6bit.ch
git push
```
### Step 2: SSH into Server & Pull Changes
```bash
# Connect to server
sshpass -p "user" ssh user@10.42.0.73
# Navigate to project
cd ~/pounce
# Pull latest changes
git pull
```
### Step 3: Frontend Deployment
```bash
# Navigate to frontend
cd ~/pounce/frontend
# Install dependencies (if package.json changed)
npm install
# Build production version
npm run build
# The build creates a .next folder with optimized static files
```
### Step 4: Backend Deployment
```bash
# Navigate to backend
cd ~/pounce/backend
# Activate virtual environment
source venv/bin/activate
# Install/update dependencies (if requirements.txt changed)
pip install -r requirements.txt
# Deactivate venv
deactivate
```
### Step 5: Restart Services
```bash
# Navigate to project root
cd ~/pounce
# Stop running services
pkill -f 'uvicorn'
pkill -f 'next start'
# Start services using start script
./start.sh
```
## Start Script (`start.sh`)
The `start.sh` script handles:
- Stopping existing processes on ports 8000 and 3000
- Starting the backend (Uvicorn) with proper settings
- Starting the frontend (Next.js) in production mode
- Health checks for both services
- Logging to `backend.log` and `frontend.log`
### Manual Service Management
```bash
# Check running processes
ps aux | grep uvicorn
ps aux | grep next
# View logs
tail -f ~/pounce/backend/backend.log
tail -f ~/pounce/frontend/frontend.log
# Check ports
lsof -i :8000 # Backend
lsof -i :3000 # Frontend
```
## Environment Configuration
### Backend `.env` (~/pounce/backend/.env)
```env
DATABASE_URL=postgresql://user:password@localhost:5432/domainwatch
SECRET_KEY=your-secret-key-here
STRIPE_SECRET_KEY=sk_live_xxx
STRIPE_PUBLISHABLE_KEY=pk_live_xxx
STRIPE_WEBHOOK_SECRET=whsec_xxx
ZOHO_SMTP_USER=noreply@pounce.ch
ZOHO_SMTP_PASSWORD=xxx
GOOGLE_CLIENT_ID=xxx
GOOGLE_CLIENT_SECRET=xxx
GITHUB_CLIENT_ID=xxx
GITHUB_CLIENT_SECRET=xxx
site_url=https://pounce.ch
```
### Frontend `.env.local` (~/pounce/frontend/.env.local)
```env
NEXT_PUBLIC_API_URL=https://pounce.ch/api/v1
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_live_xxx
NEXT_PUBLIC_POSTHOG_KEY=phc_xxx
NEXT_PUBLIC_POSTHOG_HOST=https://eu.i.posthog.com
```
## Nginx Configuration
Nginx acts as reverse proxy on the server:
```nginx
# Frontend (Next.js)
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
# Backend (FastAPI)
location /api {
proxy_pass http://localhost:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
```
## Troubleshooting
### Frontend won't start
```bash
# Check for port conflicts
lsof -i :3000
# Check build errors
cd ~/pounce/frontend
npm run build
# Check logs
tail -f ~/pounce/frontend/frontend.log
```
### Backend won't start
```bash
# Check for port conflicts
lsof -i :8000
# Test backend manually
cd ~/pounce/backend
source venv/bin/activate
uvicorn app.main:app --host 0.0.0.0 --port 8000
# Check logs
tail -f ~/pounce/backend/backend.log
```
### Database issues
```bash
# Check PostgreSQL status
sudo systemctl status postgresql
# Connect to database
psql -U user -d domainwatch
# Check migrations
cd ~/pounce/backend
alembic current
alembic upgrade head
```
### SSL Certificate issues
```bash
# Check certificate expiry
sudo certbot certificates
# Renew certificates
sudo certbot renew
# Restart Nginx
sudo systemctl restart nginx
```
## Health Checks
```bash
# Backend health
curl http://localhost:8000/health
# Frontend health
curl -I http://localhost:3000
# Full stack check via public URL
curl https://pounce.ch
curl https://pounce.ch/api/health
```
## Rollback Procedure
If deployment fails:
```bash
# On server
cd ~/pounce
# See recent commits
git log --oneline -10
# Rollback to previous commit
git reset --hard <commit-hash>
# Rebuild
cd frontend && npm run build
cd ../backend && source venv/bin/activate && pip install -r requirements.txt
# Restart
cd .. && ./start.sh
```
## Monitoring & Maintenance
### Log Rotation
Logs are in:
- `~/pounce/backend/backend.log`
- `~/pounce/frontend/frontend.log`
Set up log rotation to prevent disk space issues:
```bash
# Create logrotate config
sudo nano /etc/logrotate.d/pounce
```
```
/home/user/pounce/backend/backend.log {
daily
rotate 14
compress
delaycompress
notifempty
create 0640 user user
}
/home/user/pounce/frontend/frontend.log {
daily
rotate 14
compress
delaycompress
notifempty
create 0640 user user
}
```
### Cron Jobs
Check scheduled tasks:
```bash
crontab -l
```
Common cron jobs for Pounce:
- Domain scraping
- Health checks
- Database cleanup
- Backup scripts
## Backup & Recovery
### Database Backup
```bash
# Manual backup
pg_dump -U user domainwatch > backup_$(date +%Y%m%d_%H%M%S).sql
# Restore from backup
psql -U user domainwatch < backup_20250101_120000.sql
```
### Code Backup
All code is backed up on `git.6bit.ch`. To clone fresh:
```bash
git clone user@10.13.12.81:yvg/pounce.git
```
## Security Notes
- Server uses SSH key authentication (password: `user` for development)
- SSL certificates via Let's Encrypt (auto-renewal)
- Database credentials in `.env` files (not committed to git)
- Stripe webhooks require signing secret verification
- OAuth secrets must match registered redirect URIs
## Quick Reference
```bash
# Deploy everything
./deploy.sh "message"
# Frontend only
./deploy.sh -f "message"
# Backend only
./deploy.sh -b "message"
# Quick sync (no git)
./deploy.sh -q
# Check logs
ssh user@10.42.0.73 'tail -f ~/pounce/backend/backend.log'
# Restart services
ssh user@10.42.0.73 'cd ~/pounce && ./start.sh'
# Check health
curl https://pounce.ch/api/health
```
## Support
For issues or questions, check:
1. Application logs (`backend.log`, `frontend.log`)
2. Nginx logs (`/var/log/nginx/error.log`)
3. PostgreSQL logs (`/var/log/postgresql/`)
4. System logs (`journalctl -xe`)

View File

@ -48,8 +48,8 @@ python init_db.py
# TLD Preise seeden
python seed_tld_prices.py
# Auctions initial scrapen (echte Daten, keine Demo-Daten)
python scripts/scrape_auctions.py
# Auctions seeden (optional für Demo-Daten)
python seed_auctions.py
# Stripe Produkte erstellen
python -c "

89
DEPLOY_backend.env Normal file
View File

@ -0,0 +1,89 @@
# =================================
# pounce Backend Configuration
# =================================
# DEPLOY FILE - Copy this to backend/.env on the server
# Database
# SQLite (Development)
DATABASE_URL=sqlite+aiosqlite:///./domainwatch.db
# PostgreSQL (Production)
# DATABASE_URL=postgresql+asyncpg://user:password@localhost:5432/pounce
# Security
SECRET_KEY=62003b69b382cd55f32aba6301a81039e74a84914505d1bfbf254a97a5ccfb36
# JWT Settings
ACCESS_TOKEN_EXPIRE_MINUTES=10080
# CORS Origins (comma-separated)
CORS_ORIGINS=http://localhost:3000,http://127.0.0.1:3000,https://pounce.ch,https://www.pounce.ch
# Scheduler Settings
SCHEDULER_CHECK_INTERVAL_HOURS=24
# OAuth - Google
GOOGLE_CLIENT_ID=865146315769-vi7vcu91d3i7huv8ikjun52jo9ob7spk.apps.googleusercontent.com
GOOGLE_CLIENT_SECRET=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
GOOGLE_REDIRECT_URI=https://pounce.ch/api/v1/oauth/google/callback
# OAuth - GitHub
GITHUB_CLIENT_ID=Ov23liBjROk39vYXi3G5
GITHUB_CLIENT_SECRET=fce447621fb9b497b53eef673de15e39b991e21c
GITHUB_REDIRECT_URI=https://pounce.ch/api/v1/oauth/github/callback
# Site URL
SITE_URL=https://pounce.ch
# =================================
# Email (Zoho Mail)
# =================================
SMTP_HOST=smtp.zoho.eu
SMTP_PORT=465
SMTP_USER=hello@pounce.ch
SMTP_PASSWORD=DvYT0MBvSZ0d
SMTP_FROM_EMAIL=hello@pounce.ch
SMTP_FROM_NAME=pounce
SMTP_USE_TLS=false
SMTP_USE_SSL=true
CONTACT_EMAIL=hello@pounce.ch
# =================================
# Stripe Payments
# =================================
STRIPE_SECRET_KEY=sk_test_51ScLbjCtFUamNRpNMtVAN6kIWRauhabZEJz8lmvlfjT5tcntAFsHzvMlXrlD2hE6wQQgsAgLKYzkkYISH7TYprUJ00lIXh6DXb
STRIPE_PUBLISHABLE_KEY=pk_test_51ScLbjCtFUamNRpNpbrN2JnGoCDpR4sq6ny28ao3ircCWcvJjAQi9vclO5bScGMenkmzmZ6FSG2HWWuCOkL2LFjS009lI4QG59
STRIPE_PRICE_TRADER=price_1ScTLKCtFUamNRpNt8s6oVQi
STRIPE_PRICE_TYCOON=price_1ScTLLCtFUamNRpNhQsEIFUx
STRIPE_WEBHOOK_SECRET=whsec_pqWdtvFbQTtBgCfDTgHwgtxxcWl7JbsZ
# Email Verification
REQUIRE_EMAIL_VERIFICATION=false
# =================================
# DropCatch API (Official Partner)
# Docs: https://www.dropcatch.com/hiw/dropcatch-api
# =================================
DROPCATCH_CLIENT_ID=pounce:pounce
DROPCATCH_CLIENT_SECRET=your_dropcatch_secret_here
DROPCATCH_API_BASE=https://api.dropcatch.com
# =================================
# Sedo API (Partner API)
# Docs: https://api.sedo.com/apidocs/v1/
# Find: Sedo.com → Mein Sedo → API-Zugang
# =================================
SEDO_PARTNER_ID=your_sedo_partner_id
SEDO_SIGN_KEY=your_sedo_signkey
SEDO_API_BASE=https://api.sedo.com/api/v1/
# =================================
# Moz API (SEO Data - Optional)
# =================================
MOZ_ACCESS_ID=
MOZ_SECRET_KEY=
# Environment
ENVIRONMENT=production
DEBUG=false

View File

@ -1,55 +0,0 @@
# Deployment environment template (NO SECRETS)
#
# Copy to a *local-only* file and keep it OUT of git:
# cp DEPLOY_backend.env.example DEPLOY_backend.env
#
# Then fill values from your password manager / secret store.
# Never commit DEPLOY_backend.env.
#
# Core
DATABASE_URL=postgresql+asyncpg://pounce:<DB_PASSWORD>@db:5432/pounce
SECRET_KEY=<GENERATE_64_HEX_CHARS>
ACCESS_TOKEN_EXPIRE_MINUTES=1440
DEBUG=false
ENVIRONMENT=production
SITE_URL=https://your-domain.com
# CORS (comma-separated)
ALLOWED_ORIGINS=https://your-domain.com,https://www.your-domain.com
# Email (optional)
SMTP_HOST=
SMTP_PORT=587
SMTP_USER=
SMTP_PASSWORD=
SMTP_FROM_EMAIL=
SMTP_FROM_NAME=pounce
SMTP_USE_TLS=true
SMTP_USE_SSL=false
CONTACT_EMAIL=
# Stripe (optional)
STRIPE_SECRET_KEY=
STRIPE_PUBLISHABLE_KEY=
STRIPE_PRICE_TRADER=
STRIPE_PRICE_TYCOON=
STRIPE_WEBHOOK_SECRET=
# OAuth (optional)
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GOOGLE_REDIRECT_URI=https://api.your-domain.com/api/v1/oauth/google/callback
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
GITHUB_REDIRECT_URI=https://api.your-domain.com/api/v1/oauth/github/callback
# Optional integrations
DROPCATCH_CLIENT_ID=
DROPCATCH_CLIENT_SECRET=
DROPCATCH_API_BASE=https://api.dropcatch.com
SEDO_PARTNER_ID=
SEDO_SIGN_KEY=
SEDO_API_BASE=https://api.sedo.com/api/v1/
MOZ_ACCESS_ID=
MOZ_SECRET_KEY=

View File

@ -1,46 +0,0 @@
# Docker Compose environment (NO SECRETS)
#
# Copy to `.env` (it is gitignored):
# cp DEPLOY_docker_compose.env.example .env
#
# Then set real values (password manager / vault).
# Core (required)
DB_PASSWORD=change-me
SECRET_KEY=GENERATE_A_LONG_RANDOM_SECRET
ENVIRONMENT=production
SITE_URL=https://your-domain.com
# CORS (only needed if frontend and backend are different origins)
ALLOWED_ORIGINS=https://your-domain.com,https://www.your-domain.com
# Cookies (optional)
COOKIE_SECURE=true
# COOKIE_DOMAIN=.your-domain.com
# Email (optional but recommended for alerts)
# SMTP_HOST=smtp.example.com
# SMTP_PORT=587
# SMTP_USER=
# SMTP_PASSWORD=
# SMTP_FROM_EMAIL=
# SMTP_FROM_NAME=pounce
# SMTP_USE_TLS=true
# SMTP_USE_SSL=false
# CONTACT_EMAIL=
# OAuth (optional)
# GOOGLE_CLIENT_ID=
# GOOGLE_CLIENT_SECRET=
# GOOGLE_REDIRECT_URI=https://your-domain.com/api/v1/oauth/google/callback
# GITHUB_CLIENT_ID=
# GITHUB_CLIENT_SECRET=
# GITHUB_REDIRECT_URI=https://your-domain.com/api/v1/oauth/github/callback
# Stripe (optional)
# STRIPE_SECRET_KEY=
# STRIPE_WEBHOOK_SECRET=
# STRIPE_PRICE_TRADER=
# STRIPE_PRICE_TYCOON=

9
DEPLOY_frontend.env Normal file
View File

@ -0,0 +1,9 @@
# =================================
# pounce Frontend Configuration
# =================================
# DEPLOY FILE - Copy this to frontend/.env.local on the server
# Backend API URL
# For production, point to your backend API
NEXT_PUBLIC_API_URL=https://pounce.ch/api/v1

View File

@ -1,7 +0,0 @@
# Deployment environment template (NO SECRETS)
#
# Copy to a *local-only* file and keep it OUT of git:
# cp DEPLOY_frontend.env.example DEPLOY_frontend.env
#
NEXT_PUBLIC_API_URL=https://your-domain.com/api/v1

View File

@ -1,212 +0,0 @@
# Performance & Architektur Report (Pounce)
**Stand (Codebase):** `d08ca33fe3c88b3b2d716f0bdf22b71f989a5eb9`
**Datum:** 2025-12-12
**Scope:** `frontend/` (Next.js 14 App Router) + `backend/` (FastAPI + async SQLAlchemy + APScheduler) + DB + Docker/Deploy.
## Status (umgesetzt)
-**Phase 0**: Scheduler split, Market-Feed bounded paging, Health cache-first, PriceTracker N+1 fix (`2e8ff50`)
-**Phase 1**: DB migrations (indexes + optional columns), persisted `pounce_score`, Admin N+1 removal, Radar summary endpoint (`ee4266d`)
-**Phase 2**: Redis + ARQ worker scaffolding, Prometheus metrics (`/metrics`), load test scaffolding, Docker hardening (`5d23d34`)
---
## Executive Summary (die 5 größten Hebel)
1. **Scheduler aus dem API-Prozess herauslösen**
Aktuell startet der Scheduler in `backend/app/main.py` im App-Lifespan. Bei mehreren Uvicorn/Gunicorn-Workern laufen Jobs **mehrfach parallel** → doppelte Scrapes/Checks, DB-Last, E-Mail-Spam, inkonsistente Zustände.
2. **Market Feed Endpoint (`/api/v1/auctions/feed`) DB-seitig paginieren/sortieren**
`backend/app/api/auctions.py` lädt derzeit **alle aktiven Auktionen + alle aktiven Listings** in Python, berechnet Score, sortiert, und paginiert erst am Ende. Das skaliert schlecht sobald ihr > ein paar hundert Auktionen habt.
3. **Price Tracker N+1 eliminieren**
`backend/app/services/price_tracker.py::detect_price_changes()` macht aktuell: *distinct(tld, registrar) → pro Paar query(limit 2)*. Das ist ein klassischer N+1 und wird bei 800+ TLDs schnell sehr langsam.
4. **Health-Cache wirklich nutzen**
Es gibt `DomainHealthCache`, und der Scheduler schreibt Status/Score. Aber `GET /domains/{id}/health` macht immer einen **Live-Check** (`domain_health.py` mit HTTP/DNS/SSL). Für UI/Performance besser: default **cached**, live nur “Refresh”.
5. **Valuation im Request-Path reduzieren (Auctions)**
`backend/app/api/auctions.py` berechnet pro Auction im Response optional valuation, und `valuation_service` fragt pro Domain auch DB-Daten ab (TLD cost). Das ist pro Request potenziell **sehr teuer**.
---
## Messwerte (Frontend Build)
Aus `frontend/``npm run build` (Next.js 14.0.4):
- **First Load JS (shared):** ~81.9 kB
- **Größte Pages (First Load):**
- `/terminal/watchlist`: ~120 kB
- `/terminal/radar`: ~120 kB
- `/terminal/intel/[tld]`: ~115 kB
- `/terminal/market`: ~113 kB
- **Warnings:** Einige Routen “deopted into client-side rendering” (z.B. `/terminal/radar`, `/terminal/listing`, `/unsubscribe`, `/terminal/welcome`). Das ist nicht zwingend schlimm, aber ein Hinweis: dort wird kein echtes SSR/Static genutzt.
**Interpretation:** Das Frontend ist vom Bundle her bereits recht schlank. Die größten Performance-Risiken liegen aktuell eher im **Backend (Queries, Jobs, N+1, Caching)**.
---
## Backend konkrete Hotspots & Fixes
### 1) Scheduler: Architektur & Skalierung
**Ist-Zustand**
- `backend/app/main.py`: `start_scheduler()` im `lifespan()` → Scheduler läuft im selben Prozess wie die API.
- `backend/app/scheduler.py`: viele Jobs (Domain Checks, Health Checks, TLD Scrape, Auction Scrape, Cleanup, Sniper Matching).
**Probleme**
- Multi-worker Deployment (Gunicorn/Uvicorn) → Scheduler läuft pro Worker → Job-Duplikate.
- Jobs sind teils sequentiell (Domain Checks), teils N+1 (Health Cache, Digests, Sniper Matching).
**Empfehlung (Best Practice)**
- **Scheduler als separater Service/Container** laufen lassen (z.B. eigener Docker Service `scheduler`, oder systemd/cron job, oder Celery Worker + Beat).
- Wenn Scheduler im selben Code bleiben soll: **Leader-Lock** (Redis/DB advisory lock) einbauen, sodass nur ein Prozess Jobs ausführt.
---
### 2) Market Feed (`backend/app/api/auctions.py::get_market_feed`)
**Ist-Zustand**
- Holt Listings und Auktionen ohne DB-Limit/Offset, baut `items` in Python, sortiert in Python, paginiert erst am Ende.
**Warum das weh tut**
- Bei z.B. 10000 aktiven Auktionen ist jeder Request an `/feed` ein “Full table scan + Python sort + JSON build”.
**Fix-Strategie**
- **Score persistieren**: `pounce_score` in `DomainAuction` und `DomainListing` speichern/aktualisieren (beim Scrape bzw. beim Listing Create/Update).
Dann kann man DB-seitig `WHERE pounce_score >= :min_score` und `ORDER BY pounce_score DESC` machen.
- **DB-Pagination**: `LIMIT/OFFSET` in SQL, nicht in Python.
- **Filter DB-seitig**: `keyword`, `tld`, `price range`, `ending_within` in SQL.
- **Response caching**: Für public feed (oder häufige Filterkombos) Redis TTL 1560s.
---
### 3) Auction Search (`backend/app/api/auctions.py::search_auctions`)
**Ist-Zustand**
- Nach Query werden Auktionen in Python gefiltert (Vanity Filter) und dann pro Auction in einer Schleife `valuation_service.estimate_value(...)` aufgerufen.
**Probleme**
- Valuation kann DB-Queries pro Item auslösen (TLD cost avg), und läuft seriell.
**Fix-Strategie**
- Valuations **vorberechnen** (Background Job) und in einer Tabelle/Spalte cachen.
- Alternativ: Valuation nur **für Top-N** (z.B. 20) berechnen und für den Rest weglassen.
- TLD-Cost als **in-memory cache** (LRU/TTL) oder einmal pro Request prefetchen.
---
### 4) Price Tracker (`backend/app/services/price_tracker.py`)
**Ist-Zustand**
- N+1 Queries: distinct(tld, registrar) → pro Paar 1 Query für die letzten 2 Preise.
**Fix-Strategie**
- SQL Window Function (Postgres & SQLite können das):
- `ROW_NUMBER() OVER (PARTITION BY tld, registrar ORDER BY recorded_at DESC)`
- dann self-join oder `LAG()` für vorherigen Preis.
- Zusätzlich DB-Index: `tld_prices(tld, registrar, recorded_at DESC)`
---
### 5) Domain Health (`backend/app/services/domain_health.py` + `backend/app/api/domains.py`)
**Ist-Zustand**
- Live Health Check macht pro Request echte DNS/HTTP/SSL Checks.
- Scheduler schreibt `DomainHealthCache`, aber Endpoint nutzt ihn nicht.
**Fix-Strategie**
- Neue Endpoints:
- `GET /domains/health-cache` → cached health für alle Domains eines Users (1 Request für UI)
- `POST /domains/{id}/health/refresh` → live refresh (asynchron, job queue)
- `DomainHealthCache` auch mit `dns_data/http_data/ssl_data` befüllen (ist im Model vorgesehen).
---
## Datenbank Indexing & Query Patterns
### Empfohlene Indizes (High Impact)
- **Domain Checks**
- `domain_checks(domain_id, checked_at DESC)` für `/domains/{id}/history`
- **TLD Prices**
- `tld_prices(tld, registrar, recorded_at DESC)` für “latest two prices” und history queries
- **Health Cache**
- `domain_health_cache(domain_id)` (unique ist vorhanden), optional `checked_at`
### Query-Patterns (Quick Wins)
- In `backend/app/api/domains.py::add_domain()` wird aktuell `len(current_user.domains)` genutzt → lädt potenziell viele Rows.
Besser: `SELECT COUNT(*) FROM domains WHERE user_id = ...`.
- Admin “Users list”: vermeidet N+1 (Subscription + Domain Count pro User) → `JOIN` + `GROUP BY`.
---
## Frontend Verbesserungen (gezielt, nicht “blind refactor”)
### 1) Reduziere API-Calls pro Screen (Dashboard/Watchlist)
Aktuell holen manche Screens mehrere Endpoints und rechnen Stats client-side:
- `/terminal/radar`: holt u.a. Auctions und `GET /listings/my` nur um Stats zu zählen.
**Empfehlung**
- Ein Endpoint: `GET /dashboard/summary` (counts + small previews) → 1 Request statt 35.
### 2) Tabellen/Listen skalieren
- Für sehr große Listen (Market Feed / TLDs / Admin Users) mittelfristig:
- Pagination + “infinite scroll”
- ggf. Virtualisierung (`react-window`) falls 1000+ Rows.
### 3) Kleine Code-Health Fixes (auch Performance)
- Achtung bei `.sort()` auf State-Arrays: `.sort()` mutiert. Immer vorher kopieren (`[...arr].sort(...)`), sonst entstehen subtile Bugs und unnötige Re-Renders.
---
## Deployment/Infra “Production grade” Performance
### Backend
- **Gunicorn + Uvicorn Workers** (oder Uvicorn `--workers`) ist gut für CPU/IO aber **nur wenn Scheduler separat läuft**.
- **DB Pooling**: `create_async_engine(..., pool_size=..., max_overflow=...)` für Postgres (nicht bei SQLite).
- **slowapi**: in Production Redis storage verwenden (sonst pro Worker eigener limiter state).
### Frontend
- Dockerfile erwartet `.next/standalone`. Dafür in `frontend/next.config.js` `output: 'standalone'` aktivieren (oder Dockerfile anpassen).
---
## Priorisierte Roadmap
### Phase 0 (01 Tag, Quick Wins)
- Scheduler entkoppeln ODER Leader-Lock einbauen
- `/auctions/feed`: DB-limit + offset + order_by (keine full scans)
- `PriceTracker.detect_price_changes`: Window-Query statt N+1
- Cached Health Endpoint für Watchlist
### Phase 1 (12 Wochen)
- Precompute `pounce_score` + valuations (Background Jobs), persistieren & cachen
- Admin N+1 entfernen (Users list)
- DB Indizes ergänzen (DomainCheck, TLDPrice)
- “Dashboard summary” Endpoint + Frontend umstellen
### Phase 2 (26 Wochen)
- Background-Job System (Celery/RQ/Dramatiq) + Redis
- Observability: Request timing, DB query timing, Prometheus metrics, tracing
- Load testing + Performance budgets (API p95, page LCP/TTFB)
---
## Mess-/Monitoring Plan (damit wir nicht im Dunkeln optimieren)
- **Backend**
- Log: Request duration + endpoint + status
- DB: slow query logging / EXPLAIN ANALYZE (prod-like)
- Metrics: p50/p95 latency pro endpoint, queue depth, job runtime
- **Frontend**
- Core Web Vitals Tracking (ist bereits angelegt in `frontend/src/lib/analytics.ts`)
- “API Timing” (TTFB + payload size) für Market/Watchlist

View File

@ -1,361 +0,0 @@
# Public Pages Analyse-Report
## Zielgruppen-Klarheit & Mehrwert-Kommunikation
**Analysedatum:** 12. Dezember 2025
**Zielgruppe:** Domain-Investoren, professionelle Trader, Founder auf Domain-Suche
**Kernbotschaft laut Strategie:** "Don't guess. Know." (Intelligence & Trust)
---
## Executive Summary
| Seite | Klarheit | Mehrwert | CTAs | Trust | Gesamt |
|-------|----------|----------|------|-------|--------|
| **Landing Page** | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | **Exzellent** |
| **Market Page** | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐ | **Sehr gut** |
| **Intel Page** | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | **Sehr gut** |
| **Pricing Page** | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | ⭐⭐⭐⭐⭐ | ⭐⭐⭐⭐ | **Sehr gut** |
**Gesamtbewertung:** Die Public Pages sind **strategisch exzellent aufgebaut** und kommunizieren den Mehrwert klar für die Zielgruppe Domain-Investoren.
---
## 1. Landing Page (Home)
### ✅ Stärken
#### Value Proposition sofort klar
```
Headline: "The market never sleeps. You should."
Subline: "Domain Intelligence for Investors. Scan, track, and trade digital assets."
Tagline: "Don't guess. Know."
```
**Analyse:** Die Headline spricht die "Pain" der Zielgruppe direkt an (ständig monitoren müssen). Die Subline definiert klar WAS Pounce macht (Intelligence) und für WEN (Investors).
#### Trust-Signale
- **886+ TLDs** — Zeigt Datentiefe
- **Live Auctions** — Zeigt Aktualität
- **Instant Alerts** — Zeigt Reaktionsgeschwindigkeit
- **Price Intel** — Zeigt analytischen Mehrwert
#### Three Pillars (Discover → Track → Trade)
| Pillar | Value Proposition |
|--------|-------------------|
| **Discover** | "Not just 'taken' — but WHY, WHEN it expires, and SMARTER alternatives" |
| **Track** | "4-layer health analysis. Know the second it weakens." |
| **Trade** | "Buy & sell directly. 0% Commission. Verified owners." |
**Analyse:** Jeder Pillar adressiert eine konkrete Nutzen-Stufe im Domain-Investing-Workflow.
#### Live Market Teaser (Gatekeeper)
- Zeigt 4 echte Domains mit Preisen
- 5. Zeile ist geblurrt
- CTA: "Sign in to see X+ more domains"
**Analyse:** Perfekte Umsetzung des "Teaser & Gatekeeper"-Prinzips.
### ⚠️ Verbesserungspotential
| Problem | Aktuelle Umsetzung | Empfehlung |
|---------|-------------------|------------|
| **DomainChecker Placeholder** | Statischer Text | Animierter Typing-Effect fehlt noch ("Search crypto.ai...", "Search hotel.zurich...") |
| **Beyond Hunting Section** | "Own. Protect. Monetize." | Guter Text, aber Link zu `/buy` könnte verwirrend sein - besser `/market` oder `/terminal` |
| **Sniper Alerts Link** | `/terminal/watchlist` | Für nicht-eingeloggte User nutzlos - sollte zu `/register` führen |
### 📊 Kennzahlen
- **Sections:** 8 (Hero, Ticker, Market Teaser, Pillars, Beyond, TLDs, Stats, CTA)
- **CTAs zum Registrieren:** 4
- **Trust-Indikatoren:** 7
- **Lock/Blur-Elemente:** 2 (Market Teaser, TLD Preise)
---
## 2. Market Page
### ✅ Stärken
#### Klare Positionierung
```
H1: "Live Domain Market"
Sub: "Aggregated from GoDaddy, Sedo, and Pounce Direct."
```
**Analyse:** Sofort klar: Aggregation mehrerer Quellen an einem Ort = Zeitersparnis.
#### Vanity-Filter für nicht-eingeloggte User
```javascript
// Rules: No numbers (except short domains), no hyphens, length < 12, only premium TLDs
const PREMIUM_TLDS = ['com', 'io', 'ai', 'co', 'de', 'ch', 'net', 'org', 'app', 'dev', 'xyz']
```
**Analyse:** Zeigt nur "Premium-Looking" Domains → Professioneller erster Eindruck.
#### Pounce Score & Valuation geblurrt
- Sichtbar aber geblurrt mit Lock-Icon
- Hover-Text verfügbar
- Motiviert zur Registrierung
#### Bottom CTA
```
"Tired of digging through spam? Our 'Trader' plan filters 99% of junk domains automatically."
[Upgrade Filter]
```
**Analyse:** Adressiert direkten Pain Point (Spam in Auktionen) und bietet Lösung.
### ⚠️ Verbesserungspotential
| Problem | Aktuelle Umsetzung | Empfehlung |
|---------|-------------------|------------|
| **Pounce Direct Section** | Zeigt interne Listings | Gut, aber "0% Commission" sollte prominenter sein |
| **Mobile Darstellung** | Einige Spalten hidden | Ok, aber Deal Score sollte auch mobil geblurrt sichtbar sein |
### 📊 Gatekeeper-Elemente
- ✅ Vanity Filter (nur schöne Domains für Ausgeloggte)
- ✅ Pounce Score geblurrt
- ✅ Valuation geblurrt
- ✅ Bottom CTA für Upgrade
- ✅ Login Banner
---
## 3. Intel Page (TLD Inflation Monitor)
### ✅ Stärken
#### Unique Value Proposition
```
H1: "TLD Market Inflation Monitor"
Sub: "Don't fall for promo prices. See renewal costs, spot traps, and track price trends..."
```
**Analyse:** Adressiert einen echten, wenig bekannten Pain Point: Registrar locken mit günstigen Erstjahr-Preisen, aber Renewals sind teuer ("Renewal Traps").
#### Top Movers Cards
- Zeigt TLDs mit größten Preisänderungen
- Visuell ansprechend mit Trend-Badges
- Sofort sichtbarer Mehrwert
#### Intelligentes Gating
```
.com, .net, .org → Vollständig sichtbar (als Beweis)
Alle anderen → Buy Price + Trend sichtbar, Renewal + Risk geblurrt
```
**Analyse:** Perfekte Umsetzung: Zeigt DASS die Daten existieren (bei .com), versteckt die "Intelligence" (Renewal/Risk) für andere.
#### Trust-Indikatoren
- "Renewal Trap Detection" Badge
- "Risk Levels" Badge mit Farben
- "1y/3y Trends" Badge
### ⚠️ Verbesserungspotential
| Problem | Aktuelle Umsetzung | Empfehlung |
|---------|-------------------|------------|
| **SEO-Titel** | "TLD Market Inflation Monitor" | Exzellent für SEO - bleibt so |
| **Top Movers Links** | Führen zu `/register` für Ausgeloggte | Ok, aber könnte auch zur Intel-Detailseite mit Gating führen |
### 📊 Gatekeeper-Elemente
- ✅ Renewal Price geblurrt (außer .com/.net/.org)
- ✅ Risk Level geblurrt (außer .com/.net/.org)
- ✅ Login Banner prominent
- ✅ "Stop overpaying" Messaging
---
## 4. Pricing Page
### ✅ Stärken
#### Klare Tier-Struktur
```
Scout (Free) → Trader ($9) → Tycoon ($29)
```
#### Feature-Differenzierung mit Emojis
| Feature | Scout | Trader | Tycoon |
|---------|-------|--------|--------|
| Market Feed | 🌪️ Raw | ✨ Curated | ⚡ Priority |
| Alert Speed | 🐢 Daily | 🐇 Hourly | ⚡ 10 min |
| Watchlist | 5 Domains | 50 Domains | 500 Domains |
**Analyse:** Emojis machen die Differenzierung sofort visuell verständlich.
#### FAQ Section
Adressiert echte Fragen:
- "How fast will I know when a domain drops?"
- "What's domain valuation?"
- "Can I track domains I already own?"
#### Best Value Highlight
- Trader-Plan hat "Best Value" Badge
- Visuell hervorgehoben (Rahmen/Farbe)
### ⚠️ Verbesserungspotential
| Problem | Aktuelle Umsetzung | Empfehlung |
|---------|-------------------|------------|
| **Sniper Alerts** | Scout: "—", Trader: "5", Tycoon: "Unlimited" | Könnte klarer erklärt werden was das ist |
| **Portfolio Feature** | Scout: "—", Trader: "25 Domains" | Sollte erklären: "Track YOUR owned domains" |
---
## 5. Header & Navigation
### ✅ Stärken
```
Market | Intel | Pricing | [Sign In] | [Start Hunting]
```
- **Dark Mode durchgängig** — Professioneller Look
- **"Start Hunting" statt "Get Started"** — Spricht die Zielgruppe direkt an
- **Neon-grüner CTA** — Hohe Visibility
- **Minimalistisch** — Keine Überladung
### ⚠️ Verbesserungspotential
| Problem | Aktuelle Umsetzung | Empfehlung |
|---------|-------------------|------------|
| **Mobile Menu** | Funktional | Ok, aber CTA sollte noch prominenter sein |
---
## 6. Footer
### ✅ Stärken
- **"Don't guess. Know."** — Tagline präsent
- **Social Links** — Twitter, LinkedIn, Email
- **Korrekte Links** — Market, Intel, Pricing
---
## Zielgruppen-Analyse
### Primäre Zielgruppe: Domain-Investoren
| Bedürfnis | Wird adressiert? | Wo? |
|-----------|------------------|-----|
| Auktionen monitoren | ✅ | Market Page, Ticker |
| Expiring Domains finden | ✅ | Track Pillar, Alerts |
| TLD-Preise vergleichen | ✅ | Intel Page |
| Portfolio verwalten | ✅ | Beyond Hunting Section |
| Domains verkaufen | ✅ | Trade Pillar, Marketplace |
### Sekundäre Zielgruppe: Founder auf Domain-Suche
| Bedürfnis | Wird adressiert? | Wo? |
|-----------|------------------|-----|
| Domain-Verfügbarkeit prüfen | ✅ | DomainChecker (Hero) |
| Alternativen finden | ✅ | "AI-powered alternatives" |
| Faire Preise kennen | ✅ | Intel Page |
---
## Conversion-Funnel Analyse
```
┌─────────────────────────────────────────────────────────┐
│ LANDING PAGE │
│ "The market never sleeps. You should." │
│ │
│ ┌─────────────┐ ┌─────────────┐ ┌─────────────┐ │
│ │ DISCOVER │ │ TRACK │ │ TRADE │ │
│ └─────────────┘ └─────────────┘ └─────────────┘ │
│ │
│ ┌──────────────────────────────────────────────┐ │
│ │ LIVE MARKET TEASER (Blurred) │ │
│ │ "Sign in to see X+ more domains" │ │
│ └──────────────────────────────────────────────┘ │
│ ↓ │
│ [START HUNTING] │
└─────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ MARKET PAGE │
│ "Aggregated from GoDaddy, Sedo, and Pounce Direct" │
│ │
│ ┌──────────────────────────────────────────────┐ │
│ │ Domain | Price | Score (🔒) | Valuation (🔒) │ │
│ └──────────────────────────────────────────────┘ │
│ │
│ "Tired of digging through spam?" → [UPGRADE FILTER] │
└─────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ INTEL PAGE │
│ "TLD Market Inflation Monitor" │
│ │
│ .com, .net, .org → FULL DATA │
│ Others → Renewal (🔒), Risk (🔒) │
│ │
│ "Stop overpaying. Know the true costs." │
│ ↓ │
│ [START HUNTING] │
└─────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ PRICING PAGE │
│ │
│ Scout (Free) → Trader ($9) → Tycoon ($29) │
│ │
│ "Start with Scout. It's free forever." │
└─────────────────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────┐
│ REGISTER PAGE │
│ │
│ "Track up to 5 domains. Free." │
│ "Daily status scans. Never miss a drop." │
└─────────────────────────────────────────────────────────┘
```
---
## Empfehlungen für Optimierung
### Hohe Priorität
1. **DomainChecker Animation**
- Implementiere den Typing-Effect für Placeholder
- Beispiele: "Search crypto.ai...", "Search hotel.zurich..."
- Macht den Hero interaktiver und zeigt Anwendungsfälle
2. **Links für Ausgeloggte korrigieren**
- `/terminal/watchlist``/register?redirect=/terminal/watchlist`
- `/buy` → Klarstellen, dass dies der Marketplace ist
### Mittlere Priorität
3. **Testimonials/Social Proof hinzufügen**
- Aktuell: Nur Zahlen (886+ TLDs, 24/7)
- Fehlt: User-Testimonials, bekannte Nutzer, Logos
4. **Video/Demo**
- Ein kurzes Video (30s) auf der Landing Page
- Zeigt das Dashboard in Aktion
### Niedrige Priorität
5. **Blog/Briefings SEO**
- Mehr Content für organischen Traffic
- Themen: "Top 10 TLDs 2025", "Domain Investing Guide"
---
## Fazit
Die Public Pages sind **strategisch exzellent umgesetzt** und folgen dem "Teaser & Gatekeeper"-Prinzip konsequent:
1. **✅ Mehrwert ist sofort klar** — "Domain Intelligence for Investors"
2. **✅ Zielgruppe wird direkt angesprochen** — "Hunters", "Investors", "Trade"
3. **✅ Daten werden gezeigt, Intelligenz versteckt** — Blurred Scores, Locked Features
4. **✅ Trust-Signale sind präsent** — 886+ TLDs, Live Data, Dark Mode Pro-Look
5. **✅ CTAs sind konsistent** — "Start Hunting" überall
**Die Pages sind bereit für Launch.**
---
*Report generiert am 12. Dezember 2025*

159
README.md
View File

@ -4,70 +4,6 @@ A professional full-stack platform for domain hunters, investors, and portfolio
---
## Security remediation (required)
This repo previously contained **accidentally committed secrets** (`DEPLOY_backend.env`, `DEPLOY_frontend.env`) and **session cookies** (`backend/data/cookies/session_cookies.json`). The codebase was updated to **Cookie-based auth (HttpOnly)** and the **git history was rewritten** to remove the leaked files.
### Do this now (simple checklist)
1) **Rotate ALL secrets (treat old values as compromised)**
- **Backend secrets**: `SECRET_KEY`
- **Stripe**: `STRIPE_SECRET_KEY`, `STRIPE_WEBHOOK_SECRET`, price IDs if necessary
- **OAuth**: `GOOGLE_CLIENT_SECRET`, `GITHUB_CLIENT_SECRET` (and IDs if you want)
- **Email**: `SMTP_PASSWORD`
- **Other integrations** (if used): `DROPCATCH_CLIENT_SECRET`, `SEDO_SIGN_KEY`, `MOZ_SECRET_KEY`
Generate a new `SECRET_KEY` locally:
```bash
python3 -c "import secrets; print(secrets.token_hex(32))"
```
2) **Force-push the rewritten history to your remote**
```bash
git push --force-with-lease --all
git push --force-with-lease --tags
```
3) **Re-clone on every server/CI machine**
Because history changed, **do not** `git pull` on old clones. The simplest safe path:
```bash
rm -rf pounce
git clone <your-repo> pounce
```
4) **Re-deploy**
- **Backend**: `pip install -r backend/requirements.txt`
- **Frontend**: `npm ci && npm run build`
5) **Quick verification**
- Login now sets an **HttpOnly cookie**:
- `POST /api/v1/auth/login` returns `{ "expires_in": ... }` (no token in JSON)
- `POST /api/v1/auth/logout` clears the cookie
### Deployment note (keep it simple)
For the new cookie-auth to “just work”, the recommended setup is:
- **Serve the frontend on your main domain**
- **Route `/api/v1/*` to the backend via reverse proxy** (nginx/caddy/Next rewrite)
## Server deployment (recommended)
See `SERVER_DEPLOYMENT.md`.
### Env files (important)
- **Never commit** any of these:
- `DEPLOY_backend.env`, `DEPLOY_frontend.env`, `backend/data/cookies/*.json`
- Use templates:
- `DEPLOY_backend.env.example` → copy to `DEPLOY_backend.env` (local only)
- `DEPLOY_frontend.env.example` → copy to `DEPLOY_frontend.env` (local only)
---
## 🚀 What's New (v2.0)
### User Command Center
@ -441,96 +377,11 @@ The backend includes APScheduler that runs automatically:
| Job | Schedule | Description |
|-----|----------|-------------|
| **TLD Price Scrape** | 03:00 & 15:00 UTC | Scrapes 886+ TLDs from Porkbun + 4 registrars |
| **Auction Scrape** | Every 2h at :30 | Scrapes from ExpiredDomains |
| **Domain Check (Scout)** | Daily 06:00 UTC | Checks all watched domains |
| **Domain Check (Trader)** | Hourly :00 | Checks Trader domains |
| **Domain Check (Tycoon)** | Every 10 min | Checks Tycoon domains |
| **Health Checks** | Daily 06:00 UTC | DNS/HTTP/SSL health analysis |
| **Expiry Warnings** | Weekly Mon 08:00 | Warns about domains <30 days |
| **Weekly Digest** | Weekly Sun 10:00 | Summary email to all users |
| **Price Alerts** | 04:00 & 16:00 UTC | Sends email for >5% changes |
| **Sniper Match** | Every 30 min | Matches auctions to alerts |
| **Auction Cleanup** | Every 15 min | Removes expired auctions |
---
## 📧 Email Notifications & Monitoring
### What Gets Monitored
The Watchlist automatically monitors domains and sends alerts:
| Alert Type | Trigger | Email Subject |
|------------|---------|---------------|
| **Domain Available** | Domain becomes free | `🐆 POUNCE NOW: domain.com just dropped` |
| **Expiry Warning** | Domain expires in <30 days | `⏰ 3 domains expiring soon` |
| **Health Critical** | Domain goes offline/critical | `🐆 POUNCE NOW: domain.com` |
| **Price Change** | TLD price changes >5% | `💰 .ai moved down 12%` |
| **Sniper Match** | Auction matches your criteria | `🎯 Sniper Alert: 5 matching domains found!` |
| **Weekly Digest** | Every Sunday | `📊 Your week in domains` |
### Check Frequency by Subscription
| Tier | Frequency | Use Case |
|------|-----------|----------|
| Scout (Free) | Daily | Hobby monitoring |
| Trader ($9) | Hourly | Active domain hunters |
| Tycoon ($29) | Every 10 min | Professional investors |
### ⚠️ Required: Email Configuration
**Email notifications will NOT work without SMTP configuration!**
Add these to your `.env` file:
```env
# SMTP Configuration (Required for email alerts)
SMTP_HOST=smtp.zoho.eu # Your SMTP server
SMTP_PORT=465 # Usually 465 (SSL) or 587 (TLS)
SMTP_USER=hello@pounce.ch # SMTP username
SMTP_PASSWORD=your-password # SMTP password
SMTP_FROM_EMAIL=hello@pounce.ch # Sender address
SMTP_FROM_NAME=pounce # Sender name
SMTP_USE_SSL=true # Use SSL (port 465)
SMTP_USE_TLS=false # Use STARTTLS (port 587)
```
**Recommended SMTP Providers:**
- **Zoho Mail** (Free tier available) - Port 465 SSL
- **Resend** (Developer-friendly) - Port 587 TLS
- **SendGrid** (10k free/month) - Port 587 TLS
- **Amazon SES** (Cheap at scale) - Port 587 TLS
### Verify Email is Working
```bash
cd backend && source venv/bin/activate
python3 -c "
from app.services.email_service import email_service
print('Email configured:', email_service.is_configured())
"
```
### Test Email Manually
```bash
python3 -c "
import asyncio
from app.services.email_service import email_service
async def test():
result = await email_service.send_email(
to_email='your@email.com',
subject='Test from Pounce',
html_content='<h1>It works!</h1>'
)
print('Sent:', result)
asyncio.run(test())
"
```
| TLD Price Scrape | Daily 03:00 UTC | Scrapes 886+ TLDs from Porkbun |
| Auction Scrape | Hourly :30 | Scrapes from ExpiredDomains |
| Domain Check | Daily 06:00 UTC | Checks all watched domains |
| Price Alerts | Daily 04:00 UTC | Sends email for >5% changes |
| Sniper Alert Match | Every 15 min | Matches auctions to alerts |
---

View File

@ -1,403 +0,0 @@
# SEO & Performance Optimization Guide
## ✅ Implemented Features
### 1. **SEO Meta Tags & Structured Data**
#### Global Configuration
- **Root Layout** (`frontend/src/app/layout.tsx`):
- Complete OpenGraph tags
- Twitter Card tags
- Favicon & App Icons
- Organization & WebSite schema (JSON-LD)
- Search box schema for Google
#### Page-Specific Metadata
- **Homepage** (`frontend/src/app/metadata.ts`):
- SoftwareApplication schema
- AggregateRating schema
- Feature list
- **TLD Pages** (`frontend/src/app/intel/[tld]/metadata.ts`):
- Dynamic metadata generation
- Article schema
- Product schema (domain TLD)
- Breadcrumb schema
- Registrar comparison offers
- **Pricing Page** (`frontend/src/app/pricing/metadata.ts`):
- ProductGroup schema
- Multiple offer types (Scout, Trader, Tycoon)
- FAQ schema
- AggregateRating for each plan
- **Market Page** (`frontend/src/app/market/metadata.ts`):
- CollectionPage schema
- ItemList schema
- Individual auction schemas
- **Domain Listings** (`frontend/src/lib/domain-seo.ts`):
- Product schema with Offer
- Price specification
- Aggregate rating
- Breadcrumb
- FAQ schema for buying process
- Domain quality scoring
---
### 2. **Programmatic SEO**
#### Sitemap Generation (`frontend/src/app/sitemap.ts`)
- **Automatic sitemap** for:
- Main pages (Home, Market, Intel, Pricing)
- **120+ TLD landing pages** (programmatic SEO)
- Dynamic priorities & change frequencies
- Proper lastModified timestamps
#### robots.txt (`frontend/public/robots.txt`)
- Allow public pages
- Disallow private areas (/terminal/, /api/, /login, etc.)
- Crawl-delay directive
- Sitemap location
#### TLD Landing Pages
- **120+ indexed TLD pages** for SEO traffic
- Rich snippets for each TLD
- Registrar comparison data
- Price trends & market analysis
- Schema markup for search engines
---
### 3. **Performance Optimizations**
#### Next.js Configuration (`frontend/next.config.js`)
- **Image Optimization**:
- AVIF & WebP formats
- Responsive device sizes
- 1-year cache TTL
- SVG safety
- **Compression**: Gzip enabled
- **Security Headers**:
- HSTS (Strict-Transport-Security)
- X-Frame-Options
- X-Content-Type-Options
- X-XSS-Protection
- CSP for images
- Referrer-Policy
- Permissions-Policy
- **Cache Headers**:
- Static assets: 1 year immutable cache
- **Remove X-Powered-By**: Security improvement
#### Web Performance Monitoring (`frontend/src/lib/analytics.ts`)
- **Core Web Vitals**:
- FCP (First Contentful Paint)
- LCP (Largest Contentful Paint)
- FID (First Input Delay)
- CLS (Cumulative Layout Shift)
- TTFB (Time to First Byte)
- **Analytics Integration**:
- Google Analytics (gtag)
- Plausible Analytics (privacy-friendly)
- Custom endpoint support
- **Event Tracking**:
- Page views
- Search queries
- Domain views
- Inquiries
- Signups
- Subscriptions
- Errors
- A/B tests
---
### 4. **Dynamic OG Images**
#### TLD OG Images (`frontend/src/app/api/og/tld/route.tsx`)
- **Edge Runtime** for fast generation
- Dynamic content:
- TLD name
- Current price
- Trend indicator (up/down)
- Brand colors & logo
#### Domain OG Images (`frontend/src/app/api/og/domain/route.tsx`)
- Dynamic listing images:
- Domain name (SLD + TLD split)
- Price
- Featured badge
- "For Sale" indicator
- Trust signals (Instant Transfer, 0% Commission, Secure Escrow)
---
### 5. **Geo-Targeting & Internationalization**
#### Multi-Language Support (`frontend/src/lib/seo.ts`)
- **13 Supported Locales**:
- en-US, en-GB, en-CA, en-AU
- de-DE, de-CH
- fr-FR, es-ES, it-IT, nl-NL
- pt-BR, ja-JP, zh-CN
- **Hreflang Generation**: Automatic alternate language tags
- **Locale Detection**: From Accept-Language header
- **Price Formatting**: Currency per locale
- **x-default**: Fallback for unsupported regions
#### SEO Utilities
- Canonical URL generation
- Slug generation
- Breadcrumb schema builder
- UTM parameter tracking
- External URL detection
- Lazy loading setup
---
### 6. **PWA Support**
#### Web Manifest (`frontend/public/site.webmanifest`)
- **Installable** as Progressive Web App
- App shortcuts:
- Market
- Intel
- Terminal
- Themed icons (192x192, 512x512)
- Standalone display mode
- Categories: Finance, Business, Productivity
---
## 🎯 SEO Strategy Implementation
### Content Strategy
1. **Programmatic SEO for TLDs**:
- 120+ indexed pages targeting `.com domain price`, `.io domain registration`, etc.
- Each page: 1,200+ words of unique content
- Rich snippets with pricing & registrar data
2. **Domain Marketplace SEO**:
- Each listing: Product schema
- Optimized titles & descriptions
- Quality scoring algorithm
- FAQ schema for common questions
3. **Blog/Content Marketing** (Future):
- Domain investing guides
- TLD market reports
- Success stories
- Industry news
---
## 🚀 Performance Targets
### Core Web Vitals (Google PageSpeed)
- **LCP**: < 2.5s
- **FID**: < 100ms
- **CLS**: < 0.1
### Lighthouse Scores (Target)
- **Performance**: 95+
- **Accessibility**: 100
- **Best Practices**: 100
- **SEO**: 100
### Optimizations Applied
- Image lazy loading
- Code splitting
- Tree shaking
- Compression (gzip/brotli)
- Browser caching
- CDN delivery (static assets)
- Edge functions (OG images)
---
## 📊 Analytics & Tracking
### Implemented Events
- `pageview`: Every page navigation
- `search`: Domain/TLD searches
- `domain_view`: Listing views
- `listing_inquiry`: Contact seller
- `signup`: New user registration
- `subscription`: Tier upgrades
- `error`: Client-side errors
- `ab_test`: A/B test variants
### Privacy
- **GDPR Compliant**: Consent management
- **Cookie-less option**: Plausible Analytics
- **Anonymous tracking**: No PII stored
---
## 🔧 Setup Instructions
### Environment Variables
```bash
# SEO & Analytics
NEXT_PUBLIC_SITE_URL=https://pounce.com
NEXT_PUBLIC_GA_ID=G-XXXXXXXXXX
NEXT_PUBLIC_ANALYTICS_ENDPOINT=https://api.pounce.com/analytics
# Optional: Plausible
NEXT_PUBLIC_PLAUSIBLE_DOMAIN=pounce.com
```
### Google Search Console
1. Verify domain ownership
2. Submit sitemap: `https://pounce.com/sitemap.xml`
3. Request indexing for priority pages
4. Monitor Core Web Vitals
### Google Analytics
1. Create GA4 property
2. Add tracking ID to `.env.local`
3. Configure custom events
4. Set up conversions (signups, subscriptions)
### Bing Webmaster Tools
1. Import from Google Search Console
2. Submit sitemap
3. Monitor crawl stats
---
## 🎨 OG Image Generation
### TLD Pages
```
https://pounce.com/api/og/tld?tld=com&price=9.99&trend=5.2
```
### Domain Listings
```
https://pounce.com/api/og/domain?domain=crypto.io&price=50000&featured=true
```
### Custom Generator
Use `generateOGImageUrl()` from `src/lib/seo.ts` for dynamic generation.
---
## 📱 Mobile Optimization
### Responsive Images
- Automatic srcset generation
- AVIF/WebP fallbacks
- Lazy loading
- Proper aspect ratios
### Touch Optimization
- Minimum 44x44px touch targets
- Swipe gestures
- Mobile-first CSS
### Performance
- Service Worker (PWA)
- Offline fallback
- Cache-first strategy for static assets
---
## 🔍 Search Engine Submission
### Submit to:
1. **Google Search Console**: https://search.google.com/search-console
2. **Bing Webmaster Tools**: https://www.bing.com/webmasters
3. **Yandex Webmaster**: https://webmaster.yandex.com
4. **Baidu Webmaster**: https://ziyuan.baidu.com (for China)
### Sitemap URL
```
https://pounce.com/sitemap.xml
```
---
## 🎯 Next Steps
### Immediate (Week 1)
- [ ] Add GA4 tracking code
- [ ] Submit sitemap to Google
- [ ] Generate OG images for top 50 TLDs
- [ ] Test Core Web Vitals on Lighthouse
### Short-term (Month 1)
- [ ] Content for top 20 TLD pages (1,500+ words each)
- [ ] Internal linking strategy
- [ ] Backlink outreach (domain blogs, forums)
- [ ] Create domain investing guides
### Long-term (Quarter 1)
- [ ] Blog with 2-3 posts/week
- [ ] Video content (YouTube SEO)
- [ ] Domain market reports (monthly)
- [ ] Influencer partnerships
---
## 📈 Expected Results
### Traffic Growth (Conservative)
- **Month 1**: 1,000 organic visitors/month
- **Month 3**: 5,000 organic visitors/month
- **Month 6**: 20,000 organic visitors/month
- **Month 12**: 100,000+ organic visitors/month
### Top Keywords (Target Rankings)
- "domain pricing" (Top 10)
- ".io domain" (Top 5)
- "domain marketplace" (Top 20)
- "buy premium domains" (Top 20)
- "TLD prices" (Top 10)
---
## 🛠️ Maintenance
### Weekly
- Check GSC for crawl errors
- Monitor Core Web Vitals
- Review top queries
- Update sitemap if needed
### Monthly
- Analyze traffic trends
- Update TLD price data
- Refresh OG images for trending TLDs
- Content updates
### Quarterly
- SEO audit
- Competitor analysis
- Backlink review
- Strategy adjustment
---
## 📚 Resources
- [Next.js SEO Guide](https://nextjs.org/learn/seo/introduction-to-seo)
- [Google Search Central](https://developers.google.com/search)
- [Schema.org Documentation](https://schema.org/docs/schemas.html)
- [Core Web Vitals](https://web.dev/vitals/)
- [Open Graph Protocol](https://ogp.me/)
---
**Status**: **Production Ready**
All SEO & performance optimizations are implemented and ready for launch. The platform is configured for maximum visibility and lightning-fast performance.

View File

@ -1,170 +0,0 @@
# Server Deployment (Docker Compose)
## Ziel
Pounce auf einem Server starten mit:
- **Frontend** (Next.js)
- **Backend API** (FastAPI)
- **Postgres**
- **Redis** (Rate-Limit Storage + Job Queue)
- **Scheduler** (APScheduler) **separater Prozess**
- **Worker** (ARQ) **separater Prozess**
Damit laufen Jobs nicht mehrfach bei mehreren API-Workern und die UI bleibt schnell.
---
## Voraussetzungen
- Linux Server (z.B. Ubuntu 22.04+)
- Docker + Docker Compose Plugin
- Domain + HTTPS Reverse Proxy (empfohlen), damit Cookie-Auth zuverlässig funktioniert
---
## 1) Repo auf den Server holen
```bash
cd /opt
git clone <your-repo-url> pounce
cd pounce
```
---
## 2) Server-Environment anlegen
In `/opt/pounce`:
```bash
cp DEPLOY_docker_compose.env.example .env
```
Dann `.env` öffnen und mindestens setzen:
- **DB_PASSWORD**
- **SECRET_KEY**
- **SITE_URL** (z.B. `https://pounce.example.com`)
- **ALLOWED_ORIGINS** (z.B. `https://pounce.example.com`)
Optional (aber empfohlen):
- **SMTP_\*** (für Alerts/Emails)
- **COOKIE_DOMAIN** (wenn du Cookies über Subdomains teilen willst)
---
## 3) Starten
```bash
docker compose up -d --build
```
Services:
- `frontend` (Port 3000)
- `backend` (Port 8000)
- `scheduler` (kein Port)
- `worker` (kein Port)
- `db` (kein Port)
- `redis` (kein Port)
---
## 4) Initial Setup (1× nach erstem Start)
### DB Tabellen + Baseline Seed
```bash
docker compose exec backend python scripts/init_db.py
```
### TLD Price Seed (886+)
```bash
docker compose exec backend python scripts/seed_tld_prices.py
```
---
## 5) Reverse Proxy (empfohlen)
### Warum?
Das Frontend ruft im Browser standardmässig `https://<domain>/api/v1/...` auf (same-origin).
Darum solltest du:
- **HTTPS** terminieren
- `/api/v1/*` an das Backend routen
- `/` an das Frontend routen
### Beispiel: Caddy (sehr simpel)
```caddy
pounce.example.com {
encode zstd gzip
# API
handle_path /api/v1/* {
reverse_proxy 127.0.0.1:8000
}
# Frontend
reverse_proxy 127.0.0.1:3000
# optional: metrics nur intern
@metrics path /metrics
handle @metrics {
respond 403
}
}
```
Wichtig:
- Setze `SITE_URL=https://pounce.example.com`
- Setze `COOKIE_SECURE=true` (oder via `ENVIRONMENT=production`)
---
## 6) Checks (nach Deploy)
```bash
curl -f http://127.0.0.1:8000/health
curl -f http://127.0.0.1:8000/metrics
```
Logs:
```bash
docker compose logs -f backend
docker compose logs -f scheduler
docker compose logs -f worker
```
---
## 7) Updates
```bash
cd /opt/pounce
git pull
docker compose up -d --build
```
---
## Troubleshooting (häufig)
- **Cookies/Login klappt nicht**:
- Prüfe `SITE_URL` und HTTPS (Secure Cookies)
- Prüfe `ALLOWED_ORIGINS` (falls Frontend/Backend nicht same-origin sind)
- **Scheduler läuft doppelt**:
- Stelle sicher, dass nur **ein** `scheduler` Service läuft (keine zweite Instanz)
- **Emails werden nicht gesendet**:
- `docker compose exec scheduler env | grep SMTP_`
- SMTP Vars müssen im Container vorhanden sein (kommen aus `.env`)

View File

@ -1,291 +0,0 @@
## Pounce Unicorn Plan (integriert)
Ziel: Pounce von einem starken Produkt (Trust + Inventory + Lead Capture) zu einem skalierbaren System mit Moat + Flywheel entwickeln.
---
## Umsetzungsstatus (Stand: 2025-12-15)
### Wo wir stehen (kurz, ehrlich)
- **Deal-System (Liquidity Loop)**: **fertig & gehärtet** (Inbox → Threading → Sold/GMV → AntiAbuse).
- **Yield (Moat)**: **Connect + Routing + Tracking + Webhooks + Ledger-Basis** ist da. Wir können Domains verbinden, Traffic routen, Clicks/Conversions tracken und Payouts vorbereiten/abschliessen.
- **Flywheel/Distribution**: teilweise (Public Deal Surface + Login Gate ist da), Programmatic SEO & Viral Loop noch nicht systematisch ausgebaut.
- **Telemetry/Ops**: einzelne Events existieren implizit (Audit/Transactions), aber **kein zentrales Event-Schema + KPIs Dashboard**.
### Fortschritt nach Workstream
#### 1) DealSystem
- [x] 1A Inbox Workflow (Status, Close Reason, Audit)
- [x] 1B Threading/Negotiation (Buyer/Seller Threads + Email + Rate Limits + Content Safety)
- [x] 1C Deal Closure + GMV (Mark as Sold, Close open inquiries)
- [x] 1D AntiAbuse (Limits + Safety Checks an den kritischen Stellen)
#### 2) Yield (Moat)
- [x] 2A Connect/Nameserver Flow (PortfolioOnly + DNS Verified + Connect Wizard + `connected_at`)
- [x] 2B Routing → Tracking (Async, Click Tracking, IPHashing, RateLimit, strict partner config)
- [x] 2B Attribution (Webhook kann `click_id` mitschicken)
- [x] 2C Ledger/PayoutBasics (generate payouts + complete payouts; serversafe keys)
- [x] 2C.2 DashboardKorrektheit (monatliche Stats = confirmed/paid, pending payout = confirmed+unpaid)
#### 3) Flywheel / Distribution
- [~] 3B Public Deal Surface + Login Gate (Pounce Direct gated) — **vorhanden**
- [~] 3A Programmatic SEO maximal (Templates + CTA Pfade + Indexation)
- [~] 3C Viral Loop „Powered by Pounce“ (nur wo intent passt, sauberer Referral Loop)
**3C Stand (Viral Loop)**
- **Invite Codes**: jeder User hat jetzt einen eigenen `invite_code` (unique) + `GET /api/v1/auth/referral` liefert den InviteLink.
- **Attribution**: `ref` wird auf Public Pages in Cookie gespeichert (30 Tage) und bei `/register` mitgeschickt → Backend setzt `referred_by_user_id`.
- **Surfaces (intent-fit)**:
- Terminal Settings: “Invite” Panel mit CopyLink
- Public Buy Listing: “Powered by Pounce” → Register mit `?ref=<seller_invite_code>`
- **Telemetry**: Events `user_registered`, `referral_attributed`, `referral_link_viewed`
- **Admin KPIs (3C.2)**: Telemetry Tab zeigt jetzt ReferralKPIs (Link Views + Signups pro Referrer) via `GET /api/v1/telemetry/referrals?days=...`
- **Rewards/Badges (3C.2)**: Deterministische ReferralRewards (abuseresistent) → `subscriptions.referral_bonus_domains` (+5 Slots pro 3 “qualified referrals”), Badge `verified_referrer` / `elite_referrer` wird im TerminalSettings InvitePanel angezeigt.
- **AntiFraud/Cooldown**: Qualified zählt erst nach **Cooldown** (User+Subscription Age) und wird bei **shared IP / duplicate IP / missing IP** disqualifiziert (Telemetry `ip_hash`).
**3A Stand (Programmatic SEO)**
- **Indexation**: `sitemap.xml` ist jetzt dynamisch (DiscoverTLDs aus DB + Blog Slugs + Public Listings) und `robots.txt` blockt Legacy Pfade.
- **Canonical Cleanup**: Legacy Routen (`/tld/*`, `/tld-pricing/*`) redirecten server-seitig nach `/discover/*`.
- **Templates**: `/discover/[tld]` hat jetzt serverseitiges Metadata + JSONLD (aus echten RegistrarCompare Daten). `/buy/[slug]` ist serverseitig (Metadata + JSONLD).
- **Blog Article SEO**: `/blog/[slug]` hat jetzt serverseitiges `generateMetadata` + Article JSONLD, ohne ViewCount SideEffects (MetaEndpoint).
#### 4) Skalierung / Telemetry
- [x] 4A Events (kanonisches Event-Schema + persistente Events in Deal+Yield Funnel)
- [x] 4A.2 KPI Views (Admin KPIs aus Telemetry Events: Rates + Median Times)
- [x] 4B Ops (Backups + Restore-Verification + Monitoring/Alerts + Deliverability)
**4B Stand (Ops)**
- **Backups**: Admin-Endpoint + Scheduler Daily Backup + Restore-Verification (SQLite integrity_check / Postgres pg_restore --list)
- **Monitoring**: `/metrics` exportiert jetzt zusätzlich Business-KPIs (Deal+Yield aus `telemetry_events`, gecached) + Ops-Metriken (Backup enabled + Backup age)
- **Deliverability**: Newsletter Emails mit `List-Unsubscribe` (One-Click) + neue One-Click Unsubscribe Route
- **Alerting (Vorbereitung)**: `ops/prometheus-alerts.yml` mit Alerts (5xx rate, Backup stale, 24h Funnel-Null)
- **Alerting (ohne Docker)**: Scheduler Job `ops_alerting` + Admin Endpoint `POST /api/v1/admin/system/ops-alerts/run`
- **Alert History + Cooldown (persistiert)**: Table `ops_alert_events` + Admin Endpoint `GET /api/v1/admin/system/ops-alerts/history` + Admin UI History Panel
---
## Absicht & holistisches Konzept
### Absicht (warum es Pounce gibt)
Pounce existiert, um Domains von „toten Namen“ (nur Renewal-Kosten, keine Nutzung) zu **messbaren, handelbaren digitalen Assets** zu machen.
Wir bauen nicht nur einen Feed oder einen Marktplatz, sondern eine **Lifecycle Engine**: entdecken → erwerben → monetarisieren → liquidieren.
### Für wen (Zielgruppe)
- **Domain Investors / Operators**: brauchen sauberes Inventory, schnelle Entscheidungen, klare Workflows.
- **Builders / Entrepreneurs**: wollen gute Assets finden und sofort nutzen/monetarisieren.
- **Portfolio Owner** (ab 10+ Domains): wollen Governance (Health, Renewal, Cashflow) statt Chaos.
### Positionierung (klarer Satz)
**Pounce ist das Operating System für Domains**: ein Clean Market Feed + Verified Direct Deals + Yield Routing mit Messbarkeit vom ersten View bis zum Exit.
### Das Gesamtmodell (4 Module)
1. **Discover (Intelligence)**
Findet Assets: Clean Feed, Scores, TLD Intel, Filter, Alerts.
2. **Acquire (Marketplace / Liquidity)**
Sichert Assets: externe Auktionen + **Pounce Direct** (DNS-verified Owner).
3. **Yield (Intent Routing)**
Monetarisiert Assets: Domain-Traffic → Intent → Partner → Revenue Share.
4. **Trade (Exit / Outcomes)**
Liquidität und Bewertung: Domains werden nach **Cashflow** bepreist (Multiple), nicht nur nach „Vibe“.
### Warum das Unicorn-Potenzial hat (Moat + Flywheel)
- **Moat**: Proprietäre Daten über Intent, Traffic, Conversion und Cashflow auf Domain-Level (schwer kopierbar).
- **Flywheel**: mehr Domains → mehr Routing/Conversions → mehr Daten → bessere Scores/Routing → mehr Deals → mehr Domains.
---
## 0) Leitprinzipien
- **Moat entsteht dort, wo proprietäre Daten entstehen**: Yield/Intent + Deal Outcomes.
- **Trust ist ein Feature**: alles, was Spam/Scam senkt, steigert Conversion.
- **Telemetry ist nicht „später“**: jede neue Funktion erzeugt Events + messbare KPIs.
---
## 1) DealSystem (Liquidity Loop fertig machen)
### 1A — Inbox Workflow (Woche 1)
**Ziel**: Seller können Leads zuverlässig triagieren und messen.
- **Inquiry Status Workflow komplett**: `new → read → replied → closed` + `spam`
- Backend PATCH Endpoint + UI Actions
- „Close“ inkl. Grund (z.B. sold elsewhere / low offer / no fit)
- **Audit Trail (minimal)**
- jede Statusänderung speichert: `who/when/old/new`
**KPIs**
- inquiry→read rate
- inquiry→replied rate
- median reply time
### 1B — Threading/Negotiation (Woche 23)
**Ziel**: Verhandlung im Produkt, nicht off-platform.
- **Threading**: Buyer ↔ Seller Messages als Conversation pro Listing
- **Notifications**: EMail „New message“ + LoginGate
- **Audit Trail (voll)**: message events + status events
- **Security**: rate limits (buyer + seller), keyword checks, link safety
**KPIs**
- inquiry→first message
- messages/thread
- reply rate
### 1C — Deal Closure + GMV (Woche 34)
**Ziel**: echte Conversion/GMV messbar machen.
- **“Mark as Sold”** auf Listing
- Gründe: sold on Pounce / sold offplatform / removed
- optional: **deal_value** + currency
- optional sauberer **Deal-Record**
- `deal_id`, `listing_id`, `buyer_user_id(optional)`, `final_price`, `closed_at`
**KPIs**
- inquiry→sold
- close rate
- time-to-close
- GMV
### 1D — AntiAbuse (laufend ab Woche 1)
- **Rate limit** pro IP + pro User (inquire + message + status flips)
- **Spam flagging** (Heuristiken + manuell)
- **Blocklist** (buyer account/email/domain-level)
**KPIs**
- spam rate
- blocked attempts
- false positive rate
---
## 2) Yield als Burggraben (Moat)
### 2A — Connect/Nameserver Flow (Woche 24)
**Ziel**: Domains „unter Kontrolle“ bringen (Connect Layer).
- **Connect Wizard** (Portfolio → Yield)
- Anleitung: NS/TXT Setup
- Status: pending/verified/active
- **Backend checks** (NS/TXT) + Speicherung: `connected_at`
- **Routing Entry** (Edge/Web): Request → route decision
**KPIs**
- connect attempts→verified
- connected domains
### 2B — Intent → Routing → Tracking (Monat 2)
**Ziel**: Intent Routing MVP für 1 Vertical.
- **Intent detection** (MVP)
- **Routing** zu Partnern + Fallbacks
- **Tracking**: click_id, domain_id, partner_id
- **Attribution**: conversion mapping + payout status
**KPIs**
- clicks/domain
- conversion rate
- revenue/domain
### 2C — Payout + Revenue Share (Monat 23)
- Ledger: pending → confirmed → paid
- payout schedule (monatlich) + export/reports
**KPIs**
- payout accuracy
- disputes
- net margin
### 2D — Portfolio Cashflow Dashboard (Monat 3)
- Portfolio zeigt: **MRR, last 30d revenue, ROI**, top routes
- Domains werden „yield-bearing assets“ → später handelbar nach Multiple
**KPIs**
- MRR
- retention/churn
- expansion
---
## 3) Flywheel / Distribution
### 3A — Programmatic SEO maximal (Monat 12)
- Templates skalieren (TLD/Intel/Price)
- klare CTAPfade: „Track this TLD“, „Enter Terminal“, „View Direct Deals“
**KPIs**
- organic sessions
- signup conversion
### 3B — Public Deal Surface + Login Gate (Monat 1)
- Public Acquire + /buy als ConversionEngine
- “contact requires login” überall konsistent
**KPIs**
- view→login
- login→inquiry
### 3C — Viral Loop „Powered by Pounce“ (Monat 23)
- nur wenn intent passt / low intent fallback
- referral link + revenue share
**KPIs**
- referral signups
- CAC ~0
---
## 4) Skalierung / Telemetry
### 4A — Events (Woche 12)
Definiere & logge Events:
- `listing_view`
- `inquiry_created`
- `inquiry_status_changed`
- `message_sent`
- `listing_marked_sold`
- `yield_connected`
- `yield_click`
- `yield_conversion`
- `payout_paid`
**KPIs**
- funnel conversion
- time metrics
### 4B — Ops (Monat 1)
- Monitoring/alerts (Errors + Business KPIs)
- Backups (DB daily + restore drill)
- Deliverability (SPF/DKIM/DMARC, bounce handling)
- Abuse monitoring dashboards
---
## Empfohlene Reihenfolge (damit es schnell „unfair“ wird)
1. **Deal-System 1A1C** (GMV & close-rate messbar)
2. **Yield 2A** (Connect Layer) parallel starten
3. **Events 4A** sofort mitziehen
4. **Yield 2B2C** (Moat) sobald Connect stabil
5. Flywheel 3A3C kontinuierlich

View File

@ -1,506 +0,0 @@
# Yield / Intent Routing Integrations-Konzept
**Ziel:** Domains von "toten Assets" zu "Yield-Generatoren" machen.
**Kern-Mechanismus:** User verbindet Domain → Pounce erkennt Intent → Routing zu Affiliate-Partnern → Passive Einnahmen.
---
## 1. Public Pages (nicht eingeloggt)
### 1.1 Landing Page 4. Pillar hinzufügen
Aktuell: **DISCOVER → TRACK → TRADE**
Neu: **DISCOVER → TRACK → TRADE → YIELD**
```
┌─────────────────────────────────────────────────────────────────┐
│ │
│ YIELD │
│ "Let your domains work for you." │
│ │
│ ┌─────────────────────────────────────────────────────────┐ │
│ │ 🔌 Connect Point DNS to ns.pounce.io │ │
│ │ 🧠 Analyze We detect: "kredit.ch" → Loan Intent │ │
│ │ 💰 Earn Affiliate routing → CHF 25/lead │ │
│ └─────────────────────────────────────────────────────────┘ │
│ │
│ "Your domains become autonomous agents." │
│ │
│ [Activate My Domains →] │
│ │
└─────────────────────────────────────────────────────────────────┘
```
**Teaser-Statistiken (für Trust):**
- "CHF 45'000+ generated this month"
- "2'400+ domains earning passively"
- "Avg. CHF 18.50/domain/month"
### 1.2 Neue Public Page: `/yield`
Eine eigene Landingpage für das Yield-Feature:
| Section | Inhalt |
|---------|--------|
| **Hero** | "Dead Domains? Make them work." + Animated revenue counter |
| **How it works** | 3-Step Animation: Connect → Analyze → Earn |
| **Use Cases** | Branchen-spezifische Beispiele (zahnarzt.ch, kredit.de, hotel-x.ch) |
| **Revenue Calculator** | "Gib deine Domain ein → geschätzter monatlicher Ertrag" |
| **Trust Signals** | Partner-Logos (Awin, PartnerStack, etc.), Testimonials |
| **CTA** | "Start Earning" → Login/Register |
---
## 2. Terminal (eingeloggt)
### 2.1 Sidebar-Erweiterung
**Neue Struktur der Sidebar:**
```
DISCOVER
├── MARKET (Auktionen)
└── INTEL (TLD Pricing)
MANAGE
├── RADAR (Dashboard)
├── WATCHLIST (Monitoring)
├── SNIPER (Alerts)
├── FOR SALE (Listings)
└── YIELD ✨ ← NEU
SETTINGS
```
### 2.2 Neue Seite: `/terminal/yield`
**Layout:**
```
┌──────────────────────────────────────────────────────────────────────────┐
│ YIELD [?] Help │
├──────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌────────────┐ ┌────────────┐ ┌────────────┐ ┌────────────┐ │
│ │ Active │ │ Monthly │ │ Pending │ │ Total │ │
│ │ Domains │ │ Revenue │ │ Payout │ │ Earned │ │
│ │ 12 │ │ CHF 156 │ │ CHF 89 │ │ CHF 1'245 │ │
│ └────────────┘ └────────────┘ └────────────┘ └────────────┘ │
│ │
├──────────────────────────────────────────────────────────────────────────┤
│ │
│ 🔍 Search domains... [+ Activate Domain] │
│ │
├──────────────────────────────────────────────────────────────────────────┤
│ │
│ ┌──────────────────────────────────────────────────────────────────┐ │
│ │ Domain │ Status │ Intent │ Route │ Yield │ │
│ ├──────────────────────────────────────────────────────────────────┤ │
│ │ zahnarzt-zh.ch │ 🟢 Active │ 🏥 Medical │ Comparis │ CHF 45 │ │
│ │ crm-tool.io │ 🟢 Active │ 💻 SaaS │ HubSpot │ $ 23 │ │
│ │ hotel-davos.ch │ 🟢 Active │ 🏨 Travel │ Booking │ CHF 67 │ │
│ │ mein-blog.de │ ⚪ Idle │ ❓ Unknown │ — │ — │ │
│ │ kredit-ch.com │ 🟡 Pending│ 💰 Finance │ Analyzing │ — │ │
│ └──────────────────────────────────────────────────────────────────┘ │
│ │
└──────────────────────────────────────────────────────────────────────────┘
```
### 2.3 Domain Aktivieren Modal/Wizard
**Schritt 1: Domain eingeben**
```
┌─────────────────────────────────────────────────┐
│ Activate Domain for Yield │
├─────────────────────────────────────────────────┤
│ │
│ Enter your domain: │
│ ┌─────────────────────────────────────────┐ │
│ │ zahnarzt-zuerich.ch │ │
│ └─────────────────────────────────────────┘ │
│ │
│ [Continue →] │
│ │
└─────────────────────────────────────────────────┘
```
**Schritt 2: Intent-Erkennung (automatisch)**
```
┌─────────────────────────────────────────────────┐
│ Intent Detected │
├─────────────────────────────────────────────────┤
│ │
│ Domain: zahnarzt-zuerich.ch │
│ │
│ 🧠 Detected Intent: │
│ ┌─────────────────────────────────────────┐ │
│ │ 🏥 MEDICAL / DENTAL │ │
│ │ │ │
│ │ Keywords: zahnarzt, zuerich │ │
│ │ Confidence: 94% │ │
│ └─────────────────────────────────────────┘ │
│ │
│ 💰 Estimated Revenue: CHF 15-45/month │
│ │
│ Recommended Partners: │
│ • Comparis (Dental Comparison) │
│ • Doctolib (Appointment Booking) │
│ │
│ [Continue →] │
│ │
└─────────────────────────────────────────────────┘
```
**Schritt 3: DNS Setup**
```
┌─────────────────────────────────────────────────┐
│ Connect Your Domain │
├─────────────────────────────────────────────────┤
│ │
│ Change your nameservers to: │
│ │
│ ┌─────────────────────────────────────────┐ │
│ │ ns1.pounce.io [📋] │ │
│ │ ns2.pounce.io [📋] │ │
│ └─────────────────────────────────────────┘ │
│ │
│ ⏳ We're checking your DNS... │
│ │
│ Status: Waiting for propagation (~10 min) │
│ │
│ [I've updated my nameservers] │
│ │
└─────────────────────────────────────────────────┘
```
**Schritt 4: Aktiviert**
```
┌─────────────────────────────────────────────────┐
│ ✅ Domain Activated! │
├─────────────────────────────────────────────────┤
│ │
│ zahnarzt-zuerich.ch is now earning. │
│ │
│ 🏥 Intent: Medical/Dental │
│ ➔ Route: Comparis Dental │
│ 💰 Est. Yield: CHF 15-45/month │
│ │
│ What happens now: │
│ • We host a minimal landing page │
│ • Visitors are routed to partners │
│ • You earn affiliate commissions │
│ • Payouts monthly (min. CHF 50) │
│ │
│ [View My Yield Dashboard] │
│ │
└─────────────────────────────────────────────────┘
```
### 2.4 Portfolio-Tab Integration (Alternative)
Statt einer separaten Seite kann "Yield" auch als **Tab in der Watchlist** integriert werden:
```
┌────────────────────────────────────────────────────────────────┐
│ [Watching] [My Portfolio] [Yield] ✨ │
└────────────────────────────────────────────────────────────────┘
```
**Vorteil:** Weniger Navigation, alles an einem Ort.
**Nachteil:** Watchlist wird komplexer.
**Empfehlung:** Starte mit separater `/terminal/yield` Seite, kann später in Portfolio integriert werden.
---
## 3. Backend-Architektur (High-Level)
### 3.1 Neue Models
```python
# backend/app/models/yield_domain.py
class YieldDomain(Base):
"""Domain activated for yield/intent routing."""
__tablename__ = "yield_domains"
id: int
user_id: int # FK → users
domain: str # "zahnarzt-zuerich.ch"
# Intent
detected_intent: str # "medical_dental"
intent_confidence: float # 0.94
intent_keywords: str # JSON: ["zahnarzt", "zuerich"]
# Routing
active_route: str # "comparis_dental"
partner_id: int # FK → affiliate_partners
# Status
status: str # "pending", "active", "paused", "inactive"
dns_verified: bool
activated_at: datetime
# Revenue
total_clicks: int
total_conversions: int
total_revenue: Decimal
created_at: datetime
updated_at: datetime
class YieldTransaction(Base):
"""Revenue events from affiliate partners."""
__tablename__ = "yield_transactions"
id: int
yield_domain_id: int # FK
event_type: str # "click", "lead", "sale"
partner_id: int
amount: Decimal
currency: str
# Attribution
referrer: str
user_agent: str
geo_country: str
# Status
status: str # "pending", "confirmed", "paid", "rejected"
confirmed_at: datetime
paid_at: datetime
created_at: datetime
class AffiliatePartner(Base):
"""Affiliate network/partner configuration."""
__tablename__ = "affiliate_partners"
id: int
name: str # "Comparis Dental"
network: str # "awin", "partnerstack", "direct"
# Matching
intent_categories: str # JSON: ["medical_dental", "medical_general"]
geo_countries: str # JSON: ["CH", "DE", "AT"]
# Payout
payout_type: str # "cpc", "cpl", "cps"
payout_amount: Decimal
payout_currency: str
# Integration
tracking_url_template: str
api_endpoint: str
api_key_encrypted: str
is_active: bool
created_at: datetime
```
### 3.2 Neue API Endpoints
```python
# backend/app/api/yield.py
@router.get("/domains")
# Liste alle Yield-Domains des Users
@router.post("/domains/activate")
# Neue Domain aktivieren (Step 1-4 Wizard)
@router.get("/domains/{domain}/intent")
# Intent-Detection für eine Domain
@router.get("/domains/{domain}/verify-dns")
# DNS-Verifizierung prüfen
@router.put("/domains/{domain}/pause")
# Routing pausieren
@router.get("/stats")
# Gesamtstatistiken (Revenue, Clicks, etc.)
@router.get("/transactions")
# Transaktions-Historie
@router.get("/payouts")
# Payout-Historie
```
### 3.3 Intent-Detection Service
```python
# backend/app/services/intent_detector.py
class IntentDetector:
"""Erkennt den Intent einer Domain basierend auf Name und TLD."""
INTENT_CATEGORIES = {
"medical_dental": {
"keywords": ["zahnarzt", "dentist", "dental", "zahn"],
"partners": ["comparis_dental", "doctolib"],
"avg_cpl": 25.00
},
"travel_hotel": {
"keywords": ["hotel", "ferien", "vacation", "resort"],
"partners": ["booking", "hotels_com"],
"avg_cpl": 15.00
},
"finance_loan": {
"keywords": ["kredit", "loan", "finanz", "hypothek"],
"partners": ["comparis_finance", "lendico"],
"avg_cpl": 50.00
},
"saas_software": {
"keywords": ["crm", "erp", "software", "tool", "app"],
"partners": ["hubspot", "partnerstack"],
"avg_cpl": 30.00
},
# ... weitere Kategorien
}
def detect(self, domain: str) -> IntentResult:
"""Analysiert Domain und gibt Intent zurück."""
name = domain.rsplit('.', 1)[0].lower()
# ... Matching-Logik
```
### 3.4 DNS/Hosting Service
```python
# backend/app/services/yield_dns.py
class YieldDNSService:
"""Verwaltet DNS und Hosting für Yield-Domains."""
async def verify_nameservers(self, domain: str) -> bool:
"""Prüft ob Domain auf ns1/ns2.pounce.io zeigt."""
async def provision_landing_page(self, domain: str, intent: str) -> str:
"""Erstellt minimale Landing Page für Routing."""
async def get_tracking_url(self, domain: str, partner_id: int) -> str:
"""Generiert Affiliate-Tracking-URL."""
```
---
## 4. Phasen-Plan
### Phase 2.1: MVP (4-6 Wochen)
| Task | Prio | Aufwand |
|------|------|---------|
| Intent-Detection Engine (Keyword-basiert) | 🔴 | 1 Woche |
| Yield-Domain Model + API | 🔴 | 1 Woche |
| `/terminal/yield` UI (Basic) | 🔴 | 1 Woche |
| DNS-Verifizierung | 🔴 | 3 Tage |
| 1 Partner-Integration (z.B. Awin) | 🔴 | 1 Woche |
| Landing Page Generator (Minimal) | 🟡 | 3 Tage |
| Transaction Tracking | 🟡 | 3 Tage |
**Ergebnis:** User können Domains aktivieren, wir routen zu 1 Partner-Netzwerk.
### Phase 2.2: Erweiterung (4 Wochen)
| Task | Prio | Aufwand |
|------|------|---------|
| Weitere Partner (5-10) | 🔴 | 2 Wochen |
| Payout-System | 🔴 | 1 Woche |
| Public Landing `/yield` | 🟡 | 3 Tage |
| Landing Page Customization | 🟡 | 3 Tage |
| Revenue Analytics Dashboard | 🟡 | 3 Tage |
### Phase 2.3: Marktplatz-Integration
| Task | Prio | Aufwand |
|------|------|---------|
| "Yield-Generating Domains" Kategorie | 🟡 | 1 Woche |
| Valuation basierend auf Yield (30x MRR) | 🟡 | 3 Tage |
| Yield-History für Käufer sichtbar | 🟡 | 3 Tage |
---
## 5. Monetarisierung
### Revenue Split
| Party | Anteil |
|-------|--------|
| **Domain Owner** | 70% |
| **Pounce** | 30% |
### Tier-Gating
| Tier | Yield-Domains | Payout Threshold |
|------|---------------|------------------|
| **Scout** | 0 (Feature locked) | — |
| **Trader** | 5 | CHF 100 |
| **Tycoon** | Unlimited | CHF 50 |
---
## 6. UX-Philosophie
### Prinzipien
1. **Zero Config:** User ändert nur Nameserver. Alles andere ist automatisch.
2. **Transparent:** Klare Anzeige was passiert, welcher Partner, welche Einnahmen.
3. **Instant Value:** Zeige geschätzten Revenue VOR Aktivierung.
4. **Trust:** Partner-Logos, echte Zahlen, keine Versprechen.
### Sprache
- ❌ "Domain Parking" (klingt nach 2005)
- ✅ "Domain Yield" / "Intent Routing"
- ❌ "Passive Income" (scammy)
- ✅ "Your domain works for you"
---
## 7. Technische Voraussetzungen
| Komponente | Benötigt | Status |
|------------|----------|--------|
| Eigene Nameserver (ns1/ns2.pounce.io) | ✅ | Neu |
| DNS-Hosting (Cloudflare API oder ähnlich) | ✅ | Neu |
| Landing Page CDN | ✅ | Neu |
| Affiliate-Netzwerk Accounts | ✅ | Neu |
| Payout-System (Stripe Connect?) | ✅ | Teilweise (Stripe existiert) |
---
## 8. Zusammenfassung
### Was ändert sich im UI?
| Bereich | Änderung |
|---------|----------|
| **Landing Page** | Neuer 4. Pillar "YIELD" + Link zu `/yield` |
| **Public `/yield`** | Neue Landingpage mit Calculator |
| **Terminal Sidebar** | Neuer Menüpunkt "YIELD" unter MANAGE |
| **`/terminal/yield`** | Neue Seite: Domain-Liste, Stats, Activate-Wizard |
| **Watchlist** | Optional: "Activate for Yield" Button bei eigenen Domains |
### Backend-Aufwand
- 3 neue Models
- 1 neuer API Router
- 2 neue Services (Intent, DNS)
- Partner-Integrationen (Awin, PartnerStack, etc.)
### Priorität
**Starte mit `/terminal/yield` + Intent-Detection + 1 Partner.**
Public Page und Marktplatz-Integration kommen später.
---
*"Domains werden keine toten Assets mehr. Sie werden autonome Agenten."*

View File

@ -1,256 +0,0 @@
# Pounce Yield - Complete Setup Guide
This guide covers the complete setup of the Yield/Intent Routing feature.
## Overview
Pounce Yield allows users to monetize their parked domains by:
1. Detecting user intent from domain names (e.g., "zahnarzt-zuerich.ch" → Medical/Dental)
2. Routing visitors to relevant affiliate partners
3. Tracking clicks, leads, and sales
4. Splitting revenue 70/30 (user/Pounce)
## Architecture
```
┌─────────────────┐ ┌──────────────────┐ ┌─────────────────┐
│ User Domain │────▶│ Pounce Yield │────▶│ Affiliate │
│ (DNS → Pounce) │ │ Routing Engine │ │ Partner │
└─────────────────┘ └──────────────────┘ └─────────────────┘
┌──────────────────┐
│ Transaction │
│ Tracking │
└──────────────────┘
```
## Setup Steps
### 1. Database Setup
The yield tables are created automatically on startup. To apply migrations to an existing database:
```bash
cd backend
python -c "from app.database import init_db; import asyncio; asyncio.run(init_db())"
```
### 2. Seed Affiliate Partners
Populate the affiliate partners with default Swiss/German partners:
```bash
cd backend
python scripts/seed_yield_partners.py
```
This seeds ~30 partners across categories:
- Medical (Dental, General, Beauty)
- Finance (Insurance, Mortgage, Banking)
- Legal
- Real Estate
- Travel
- Automotive
- Jobs
- Education
- Technology/Hosting
- Shopping
- Food/Delivery
### 3. Configure DNS
For yield domains to work, you need to set up DNS infrastructure:
#### Option A: Dedicated Nameservers (Recommended for Scale)
1. Set up two nameserver instances (e.g., `ns1.pounce.io`, `ns2.pounce.io`)
2. Run PowerDNS or similar with a backend that queries your yield_domains table
3. Return A records pointing to your yield routing service
#### Option B: CNAME Approach (Simpler)
1. Set up a wildcard SSL certificate for `*.yield.pounce.io`
2. Configure Nginx/Caddy to handle all incoming hosts
3. Users add CNAME: `@ → yield.pounce.io`
### 4. Nginx Configuration
For host-based routing, add this to your nginx config:
```nginx
# Yield domain catch-all
server {
listen 443 ssl http2;
server_name ~^(?<domain>.+)$;
# Wildcard cert
ssl_certificate /etc/ssl/yield.pounce.io.crt;
ssl_certificate_key /etc/ssl/yield.pounce.io.key;
location / {
proxy_pass http://backend:8000/api/v1/r/$domain;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
```
### 5. Partner Integration
Each affiliate partner requires:
1. **Tracking URL Template**: How to pass click IDs to the partner
2. **Webhook URL**: Where the partner sends conversion data back
Update partners in the database or via admin panel:
```sql
UPDATE affiliate_partners
SET tracking_url_template = 'https://partner.com/?clickid={click_id}&ref={domain}'
WHERE slug = 'partner_slug';
```
### 6. Webhook Configuration
Partners send conversion data to:
```
POST https://api.pounce.ch/api/v1/yield-webhooks/{partner_slug}
{
"event_type": "lead",
"domain": "zahnarzt-zuerich.ch",
"transaction_id": "abc123",
"amount": 25.00,
"currency": "CHF"
}
```
For Awin network, use the dedicated endpoint:
```
POST https://api.pounce.ch/api/v1/yield-webhooks/awin/postback
```
## API Endpoints
### Public
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/v1/yield/analyze?domain=X` | Analyze domain intent (no auth) |
| GET | `/api/v1/yield/partners` | List available partners |
### Authenticated (User)
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/yield/dashboard` | User yield dashboard |
| GET | `/api/v1/yield/domains` | List user's yield domains |
| POST | `/api/v1/yield/activate` | Activate a domain |
| POST | `/api/v1/yield/domains/{id}/verify` | Verify DNS setup |
| GET | `/api/v1/yield/transactions` | Transaction history |
| GET | `/api/v1/yield/payouts` | Payout history |
### Routing
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/r/{domain}` | Route traffic & track click |
| GET | `/api/v1/r/{domain}?direct=true` | Direct redirect (no landing) |
### Webhooks (Partner → Pounce)
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/v1/yield-webhooks/{partner}` | Generic partner webhook |
| POST | `/api/v1/yield-webhooks/awin/postback` | Awin network postback |
| POST | `/api/v1/yield-webhooks/confirm/{tx_id}` | Manual confirmation (internal) |
| POST | `/api/v1/yield-webhooks/batch-import` | Bulk import (internal) |
## Revenue Model
- **Clicks**: Usually CPC (cost per click), CHF 0.10-0.60
- **Leads**: CPL (cost per lead), CHF 15-120
- **Sales**: CPS (cost per sale), 2-10% of sale value
Revenue split:
- **User**: 70%
- **Pounce**: 30%
## Intent Categories
The IntentDetector recognizes these categories:
| Category | Subcategories | Example Domains |
|----------|---------------|-----------------|
| medical | dental, general, beauty | zahnarzt.ch, arzt-bern.ch |
| finance | insurance, mortgage, banking | versicherung.ch, hypothek.ch |
| legal | general | anwalt-zuerich.ch |
| realestate | buy, rent | wohnung-mieten.ch |
| travel | flights, hotels | flug-buchen.ch |
| auto | buy, service | autokauf.ch |
| jobs | - | stellenmarkt.ch |
| education | - | kurse-online.ch |
| tech | hosting, software | webhosting.ch |
| shopping | general, fashion | mode-shop.ch |
| food | restaurant, delivery | pizza-lieferung.ch |
## Monitoring
### Metrics
Enable Prometheus metrics:
```env
ENABLE_METRICS=true
```
Key yield metrics:
- `yield_clicks_total{domain, partner}`
- `yield_conversions_total{domain, partner, type}`
- `yield_revenue_total{currency}`
### Alerts
Set up alerts for:
- Webhook failures
- Low conversion rates
- DNS verification failures
- Partner API errors
## Troubleshooting
### Domain not routing
1. Check DNS: `dig +short {domain}`
2. Verify domain status: `SELECT status FROM yield_domains WHERE domain = '{domain}'`
3. Check nginx logs for routing errors
### No conversions
1. Verify partner webhook URL is correct
2. Check webhook logs for incoming calls
3. Validate transaction ID format
### Low revenue
1. Check intent detection: Some domains may be classified as "generic"
2. Review partner matching: Higher-priority partners should be assigned
3. Analyze geo distribution: Swiss visitors convert better
## Security Considerations
- All partner webhooks should use HMAC signature verification
- IP addresses are hashed before storage (privacy)
- User revenue data is isolated by user_id
- Rate limiting on routing endpoint
## Support
For issues with:
- Partner integrations: partners@pounce.ch
- Technical issues: dev@pounce.ch
- Payout questions: finance@pounce.ch

View File

@ -1,34 +0,0 @@
"""Add DNS verification fields to portfolio_domains
Revision ID: 006
Revises: 005
Create Date: 2025-12-13
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '006'
down_revision = '005'
branch_labels = None
depends_on = None
def upgrade() -> None:
"""Add DNS verification columns to portfolio_domains table."""
# Add columns with default values (nullable to avoid issues with existing rows)
op.add_column('portfolio_domains', sa.Column('is_dns_verified', sa.Boolean(), nullable=True, server_default='0'))
op.add_column('portfolio_domains', sa.Column('verification_status', sa.String(50), nullable=True, server_default='unverified'))
op.add_column('portfolio_domains', sa.Column('verification_code', sa.String(100), nullable=True))
op.add_column('portfolio_domains', sa.Column('verification_started_at', sa.DateTime(), nullable=True))
op.add_column('portfolio_domains', sa.Column('verified_at', sa.DateTime(), nullable=True))
def downgrade() -> None:
"""Remove DNS verification columns from portfolio_domains table."""
op.drop_column('portfolio_domains', 'verified_at')
op.drop_column('portfolio_domains', 'verification_started_at')
op.drop_column('portfolio_domains', 'verification_code')
op.drop_column('portfolio_domains', 'verification_status')
op.drop_column('portfolio_domains', 'is_dns_verified')

View File

@ -1,74 +0,0 @@
"""Add inquiry close fields + audit trail
Revision ID: 007
Revises: 006
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '007'
down_revision = '006'
branch_labels = None
depends_on = None
def upgrade() -> None:
# listing_inquiries: deal workflow
op.add_column('listing_inquiries', sa.Column('closed_reason', sa.String(200), nullable=True))
op.add_column('listing_inquiries', sa.Column('closed_at', sa.DateTime(), nullable=True))
op.create_index(
'ix_listing_inquiries_listing_created',
'listing_inquiries',
['listing_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiries_listing_status',
'listing_inquiries',
['listing_id', 'status'],
unique=False,
)
# listing_inquiry_events: audit trail
op.create_table(
'listing_inquiry_events',
sa.Column('id', sa.Integer(), primary_key=True),
sa.Column('inquiry_id', sa.Integer(), sa.ForeignKey('listing_inquiries.id'), nullable=False, index=True),
sa.Column('listing_id', sa.Integer(), sa.ForeignKey('domain_listings.id'), nullable=False, index=True),
sa.Column('actor_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=False, index=True),
sa.Column('old_status', sa.String(20), nullable=True),
sa.Column('new_status', sa.String(20), nullable=False),
sa.Column('reason', sa.String(200), nullable=True),
sa.Column('ip_address', sa.String(45), nullable=True),
sa.Column('user_agent', sa.String(500), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=True, index=True),
)
op.create_index(
'ix_listing_inquiry_events_inquiry_created',
'listing_inquiry_events',
['inquiry_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_events_listing_created',
'listing_inquiry_events',
['listing_id', 'created_at'],
unique=False,
)
def downgrade() -> None:
op.drop_index('ix_listing_inquiry_events_listing_created', table_name='listing_inquiry_events')
op.drop_index('ix_listing_inquiry_events_inquiry_created', table_name='listing_inquiry_events')
op.drop_table('listing_inquiry_events')
op.drop_index('ix_listing_inquiries_listing_status', table_name='listing_inquiries')
op.drop_index('ix_listing_inquiries_listing_created', table_name='listing_inquiries')
op.drop_column('listing_inquiries', 'closed_at')
op.drop_column('listing_inquiries', 'closed_reason')

View File

@ -1,61 +0,0 @@
"""Add inquiry threading (buyer link + messages)
Revision ID: 008
Revises: 007
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = '008'
down_revision = '007'
branch_labels = None
depends_on = None
def upgrade() -> None:
# Link inquiry to buyer account
op.add_column('listing_inquiries', sa.Column('buyer_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=True))
op.create_index('ix_listing_inquiries_buyer_user', 'listing_inquiries', ['buyer_user_id'], unique=False)
# Thread messages
op.create_table(
'listing_inquiry_messages',
sa.Column('id', sa.Integer(), primary_key=True),
sa.Column('inquiry_id', sa.Integer(), sa.ForeignKey('listing_inquiries.id'), nullable=False, index=True),
sa.Column('listing_id', sa.Integer(), sa.ForeignKey('domain_listings.id'), nullable=False, index=True),
sa.Column('sender_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=False, index=True),
sa.Column('body', sa.Text(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=True, index=True),
)
op.create_index(
'ix_listing_inquiry_messages_inquiry_created',
'listing_inquiry_messages',
['inquiry_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_messages_listing_created',
'listing_inquiry_messages',
['listing_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_messages_sender_created',
'listing_inquiry_messages',
['sender_user_id', 'created_at'],
unique=False,
)
def downgrade() -> None:
op.drop_index('ix_listing_inquiry_messages_sender_created', table_name='listing_inquiry_messages')
op.drop_index('ix_listing_inquiry_messages_listing_created', table_name='listing_inquiry_messages')
op.drop_index('ix_listing_inquiry_messages_inquiry_created', table_name='listing_inquiry_messages')
op.drop_table('listing_inquiry_messages')
op.drop_index('ix_listing_inquiries_buyer_user', table_name='listing_inquiries')
op.drop_column('listing_inquiries', 'buyer_user_id')

View File

@ -1,31 +0,0 @@
"""Add listing sold fields (GMV tracking)
Revision ID: 009
Revises: 008
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = '009'
down_revision = '008'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column('domain_listings', sa.Column('sold_at', sa.DateTime(), nullable=True))
op.add_column('domain_listings', sa.Column('sold_reason', sa.String(200), nullable=True))
op.add_column('domain_listings', sa.Column('sold_price', sa.Float(), nullable=True))
op.add_column('domain_listings', sa.Column('sold_currency', sa.String(3), nullable=True))
op.create_index('ix_domain_listings_status', 'domain_listings', ['status'], unique=False)
def downgrade() -> None:
op.drop_index('ix_domain_listings_status', table_name='domain_listings')
op.drop_column('domain_listings', 'sold_currency')
op.drop_column('domain_listings', 'sold_price')
op.drop_column('domain_listings', 'sold_reason')
op.drop_column('domain_listings', 'sold_at')

View File

@ -1,25 +0,0 @@
"""Add yield connected_at timestamp.
Revision ID: 010_add_yield_connected_at
Revises: 009_add_listing_sold_fields
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "010_add_yield_connected_at"
down_revision = "009_add_listing_sold_fields"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("yield_domains", sa.Column("connected_at", sa.DateTime(), nullable=True))
def downgrade() -> None:
op.drop_column("yield_domains", "connected_at")

View File

@ -1,28 +0,0 @@
"""Add click_id + destination_url to yield transactions.
Revision ID: 011_add_yield_transaction_click_id
Revises: 010_add_yield_connected_at
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = "011_add_yield_transaction_click_id"
down_revision = "010_add_yield_connected_at"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("yield_transactions", sa.Column("click_id", sa.String(length=64), nullable=True))
op.add_column("yield_transactions", sa.Column("destination_url", sa.Text(), nullable=True))
op.create_index("ix_yield_transactions_click_id", "yield_transactions", ["click_id"], unique=False)
def downgrade() -> None:
op.drop_index("ix_yield_transactions_click_id", table_name="yield_transactions")
op.drop_column("yield_transactions", "destination_url")
op.drop_column("yield_transactions", "click_id")

View File

@ -1,67 +0,0 @@
"""Add telemetry_events table.
Revision ID: 012_add_telemetry_events
Revises: 011_add_yield_transaction_click_id
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = "012_add_telemetry_events"
down_revision = "011_add_yield_transaction_click_id"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"telemetry_events",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("user_id", sa.Integer(), nullable=True),
sa.Column("event_name", sa.String(length=60), nullable=False),
sa.Column("listing_id", sa.Integer(), nullable=True),
sa.Column("inquiry_id", sa.Integer(), nullable=True),
sa.Column("yield_domain_id", sa.Integer(), nullable=True),
sa.Column("click_id", sa.String(length=64), nullable=True),
sa.Column("domain", sa.String(length=255), nullable=True),
sa.Column("source", sa.String(length=30), nullable=True),
sa.Column("ip_hash", sa.String(length=64), nullable=True),
sa.Column("user_agent", sa.String(length=500), nullable=True),
sa.Column("referrer", sa.String(length=500), nullable=True),
sa.Column("metadata_json", sa.Text(), nullable=True),
sa.Column("is_authenticated", sa.Boolean(), nullable=True),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(["user_id"], ["users.id"]),
)
op.create_index("ix_telemetry_events_event_name", "telemetry_events", ["event_name"])
op.create_index("ix_telemetry_events_user_id", "telemetry_events", ["user_id"])
op.create_index("ix_telemetry_events_listing_id", "telemetry_events", ["listing_id"])
op.create_index("ix_telemetry_events_inquiry_id", "telemetry_events", ["inquiry_id"])
op.create_index("ix_telemetry_events_yield_domain_id", "telemetry_events", ["yield_domain_id"])
op.create_index("ix_telemetry_events_click_id", "telemetry_events", ["click_id"])
op.create_index("ix_telemetry_events_domain", "telemetry_events", ["domain"])
op.create_index("ix_telemetry_events_created_at", "telemetry_events", ["created_at"])
op.create_index("ix_telemetry_event_name_created", "telemetry_events", ["event_name", "created_at"])
op.create_index("ix_telemetry_user_created", "telemetry_events", ["user_id", "created_at"])
op.create_index("ix_telemetry_listing_created", "telemetry_events", ["listing_id", "created_at"])
op.create_index("ix_telemetry_yield_created", "telemetry_events", ["yield_domain_id", "created_at"])
def downgrade() -> None:
op.drop_index("ix_telemetry_yield_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_listing_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_user_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_event_name_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_created_at", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_domain", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_click_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_yield_domain_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_inquiry_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_listing_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_user_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_event_name", table_name="telemetry_events")
op.drop_table("telemetry_events")

View File

@ -1,41 +0,0 @@
"""add ops alert events
Revision ID: 013_add_ops_alert_events
Revises: 012_add_telemetry_events
Create Date: 2025-12-15
"""
from __future__ import annotations
from alembic import op
import sqlalchemy as sa
revision = "013_add_ops_alert_events"
down_revision = "012_add_telemetry_events"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"ops_alert_events",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("alert_key", sa.String(length=80), nullable=False),
sa.Column("severity", sa.String(length=10), nullable=False),
sa.Column("title", sa.String(length=200), nullable=False),
sa.Column("detail", sa.Text(), nullable=True),
sa.Column("status", sa.String(length=20), nullable=False),
sa.Column("recipients", sa.Text(), nullable=True),
sa.Column("send_reason", sa.String(length=60), nullable=True),
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=sa.text("now()")),
)
op.create_index("ix_ops_alert_key_created", "ops_alert_events", ["alert_key", "created_at"])
op.create_index("ix_ops_alert_status_created", "ops_alert_events", ["status", "created_at"])
def downgrade() -> None:
op.drop_index("ix_ops_alert_status_created", table_name="ops_alert_events")
op.drop_index("ix_ops_alert_key_created", table_name="ops_alert_events")
op.drop_table("ops_alert_events")

View File

@ -1,28 +0,0 @@
"""add users invite_code
Revision ID: 014_add_user_invite_code
Revises: 013_add_ops_alert_events
Create Date: 2025-12-15
"""
from __future__ import annotations
from alembic import op
import sqlalchemy as sa
revision = "014_add_user_invite_code"
down_revision = "013_add_ops_alert_events"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("users", sa.Column("invite_code", sa.String(length=32), nullable=True))
op.create_index("ix_users_invite_code", "users", ["invite_code"], unique=True)
def downgrade() -> None:
op.drop_index("ix_users_invite_code", table_name="users")
op.drop_column("users", "invite_code")

View File

@ -1,29 +0,0 @@
"""add subscription referral bonus domains
Revision ID: 015_add_subscription_referral_bonus_domains
Revises: 014_add_user_invite_code
Create Date: 2025-12-15
"""
from __future__ import annotations
import sqlalchemy as sa
from alembic import op
revision = "015_add_subscription_referral_bonus_domains"
down_revision = "014_add_user_invite_code"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"subscriptions",
sa.Column("referral_bonus_domains", sa.Integer(), nullable=False, server_default="0"),
)
def downgrade() -> None:
op.drop_column("subscriptions", "referral_bonus_domains")

View File

@ -17,16 +17,6 @@ from app.api.blog import router as blog_router
from app.api.listings import router as listings_router
from app.api.sniper_alerts import router as sniper_alerts_router
from app.api.seo import router as seo_router
from app.api.dashboard import router as dashboard_router
from app.api.yield_domains import router as yield_router
from app.api.yield_webhooks import router as yield_webhooks_router
from app.api.yield_routing import router as yield_routing_router
from app.api.yield_payout_admin import router as yield_payout_admin_router
from app.api.telemetry import router as telemetry_router
from app.api.analyze import router as analyze_router
from app.api.hunt import router as hunt_router
from app.api.cfo import router as cfo_router
from app.api.drops import router as drops_router
api_router = APIRouter()
@ -40,11 +30,6 @@ api_router.include_router(tld_prices_router, prefix="/tld-prices", tags=["TLD Pr
api_router.include_router(price_alerts_router, prefix="/price-alerts", tags=["Price Alerts"])
api_router.include_router(portfolio_router, prefix="/portfolio", tags=["Portfolio"])
api_router.include_router(auctions_router, prefix="/auctions", tags=["Smart Pounce - Auctions"])
api_router.include_router(dashboard_router, prefix="/dashboard", tags=["Dashboard"])
api_router.include_router(analyze_router, prefix="/analyze", tags=["Analyze"])
api_router.include_router(hunt_router, prefix="/hunt", tags=["Hunt"])
api_router.include_router(cfo_router, prefix="/cfo", tags=["CFO"])
api_router.include_router(drops_router, tags=["Drops - Zone Files"])
# Marketplace (For Sale) - from analysis_3.md
api_router.include_router(listings_router, prefix="/listings", tags=["Marketplace - For Sale"])
@ -55,15 +40,6 @@ api_router.include_router(sniper_alerts_router, prefix="/sniper-alerts", tags=["
# SEO Data / Backlinks - from analysis_3.md (Tycoon-only)
api_router.include_router(seo_router, prefix="/seo", tags=["SEO Data - Tycoon"])
# Yield / Intent Routing - Passive income from parked domains
api_router.include_router(yield_router, tags=["Yield - Intent Routing"])
api_router.include_router(yield_webhooks_router, tags=["Yield - Webhooks"])
api_router.include_router(yield_routing_router, tags=["Yield - Routing"])
api_router.include_router(yield_payout_admin_router, tags=["Yield - Admin"])
# Telemetry / KPIs (admin)
api_router.include_router(telemetry_router, tags=["Telemetry"])
# Support & Communication
api_router.include_router(contact_router, prefix="/contact", tags=["Contact & Newsletter"])

View File

@ -9,14 +9,12 @@ Provides admin-only access to:
- Domain/Portfolio overview
"""
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional
from fastapi import APIRouter, HTTPException, status, Depends, BackgroundTasks
from fastapi import APIRouter, HTTPException, status, BackgroundTasks, Depends
from pydantic import BaseModel, EmailStr
from sqlalchemy import select, func, desc
from app.api.deps import Database, get_current_user
from app.config import get_settings
from app.models.user import User
from app.models.subscription import Subscription, SubscriptionTier, SubscriptionStatus, TIER_CONFIG
from app.models.domain import Domain
@ -25,13 +23,10 @@ from app.models.newsletter import NewsletterSubscriber
from app.models.tld_price import TLDPrice, TLDInfo
from app.models.auction import DomainAuction
from app.models.price_alert import PriceAlert
from app.models.listing import DomainListing
from app.services.db_backup import create_backup, list_backups
from app.services.ops_alerts import run_ops_alert_checks
from app.models.ops_alert import OpsAlertEvent
router = APIRouter()
settings = get_settings()
# ============== Admin Authentication ==============
async def require_admin(
@ -46,60 +41,6 @@ async def require_admin(
return current_user
# ============== Scraping Ops (Server-only, free alternative to paid proxies) ==============
class PlaywrightCookiesUpload(BaseModel):
"""Upload Playwright cookies JSON used by protected scrapers (e.g. NameJet)."""
cookies: list[dict]
@router.post("/scraping/playwright-cookies")
async def upload_playwright_cookies(
payload: PlaywrightCookiesUpload,
admin: User = Depends(require_admin),
):
"""Replace the server's Playwright cookie jar file."""
cookie_dir = Path(__file__).parent.parent / "data" / "cookies"
cookie_dir.mkdir(parents=True, exist_ok=True)
cookie_file = cookie_dir / "session_cookies.json"
if not payload.cookies:
raise HTTPException(status_code=400, detail="cookies must not be empty")
try:
import json
cookie_file.write_text(json.dumps(payload.cookies, indent=2))
except Exception as e:
raise HTTPException(status_code=500, detail=f"Failed to write cookie file: {e}")
return {
"status": "ok",
"cookies_count": len(payload.cookies),
"updated_at": datetime.utcnow().isoformat(),
"note": "Enable protected scraping with POUNCE_ENABLE_PROTECTED_SCRAPERS=true",
}
@router.get("/scraping/playwright-cookies")
async def get_playwright_cookie_status(
admin: User = Depends(require_admin),
):
"""Return Playwright cookie jar status (no contents)."""
cookie_dir = Path(__file__).parent.parent / "data" / "cookies"
cookie_file = cookie_dir / "session_cookies.json"
if not cookie_file.exists():
return {"exists": False}
stat = cookie_file.stat()
return {
"exists": True,
"path": str(cookie_file),
"size_bytes": stat.st_size,
"modified_at": datetime.utcfromtimestamp(stat.st_mtime).isoformat() + "Z",
}
# ============== Dashboard Stats ==============
@router.get("/stats")
@ -193,213 +134,6 @@ async def get_admin_stats(
}
# ============== Earnings / Revenue ==============
@router.get("/earnings")
async def get_admin_earnings(
db: Database,
admin: User = Depends(require_admin)
):
"""
Get earnings and revenue metrics for admin dashboard.
Calculates MRR, ARR, and subscription breakdown.
"""
# Tier prices (from TIER_CONFIG)
tier_prices = {
SubscriptionTier.SCOUT: 0,
SubscriptionTier.TRADER: 9,
SubscriptionTier.TYCOON: 29,
}
# Get all active subscriptions
result = await db.execute(
select(Subscription).where(
Subscription.status == SubscriptionStatus.ACTIVE
)
)
active_subs = result.scalars().all()
# Calculate MRR
mrr = 0.0
tier_breakdown = {
"scout": {"count": 0, "revenue": 0},
"trader": {"count": 0, "revenue": 0},
"tycoon": {"count": 0, "revenue": 0},
}
for sub in active_subs:
price = tier_prices.get(sub.tier, 0)
mrr += price
tier_key = sub.tier.value
if tier_key in tier_breakdown:
tier_breakdown[tier_key]["count"] += 1
tier_breakdown[tier_key]["revenue"] += price
arr = mrr * 12
# New subscriptions this week
week_ago = datetime.utcnow() - timedelta(days=7)
new_subs_week = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.started_at >= week_ago,
Subscription.tier != SubscriptionTier.SCOUT
)
)
new_subs_week = new_subs_week.scalar() or 0
# New subscriptions this month
month_ago = datetime.utcnow() - timedelta(days=30)
new_subs_month = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.started_at >= month_ago,
Subscription.tier != SubscriptionTier.SCOUT
)
)
new_subs_month = new_subs_month.scalar() or 0
# Cancelled subscriptions this month (churn)
cancelled_month = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.cancelled_at >= month_ago,
Subscription.cancelled_at.isnot(None)
)
)
cancelled_month = cancelled_month.scalar() or 0
# Total paying customers
paying_customers = tier_breakdown["trader"]["count"] + tier_breakdown["tycoon"]["count"]
# Revenue from Yield (platform's 30% cut)
try:
from app.models.yield_domain import YieldTransaction
yield_revenue = await db.execute(
select(func.sum(YieldTransaction.net_amount)).where(
YieldTransaction.created_at >= month_ago,
YieldTransaction.status == "confirmed"
)
)
yield_revenue_month = float(yield_revenue.scalar() or 0) * 0.30 / 0.70 # Platform's cut
except Exception:
yield_revenue_month = 0
return {
"mrr": round(mrr, 2),
"arr": round(arr, 2),
"paying_customers": paying_customers,
"tier_breakdown": tier_breakdown,
"new_subscriptions": {
"week": new_subs_week,
"month": new_subs_month,
},
"churn": {
"month": cancelled_month,
},
"yield_revenue_month": round(yield_revenue_month, 2),
"total_revenue_month": round(mrr + yield_revenue_month, 2),
"timestamp": datetime.utcnow().isoformat(),
}
# ============== Earnings History ==============
@router.get("/earnings/history")
async def get_admin_earnings_history(
db: Database,
admin: User = Depends(require_admin),
months: int = 12
):
"""
Get historical earnings data for charts.
Calculates MRR for each month based on subscription start dates.
"""
tier_prices = {
SubscriptionTier.SCOUT: 0,
SubscriptionTier.TRADER: 9,
SubscriptionTier.TYCOON: 29,
}
# Get all subscriptions
result = await db.execute(select(Subscription))
all_subs = result.scalars().all()
# Generate monthly data for the last N months
monthly_data = []
now = datetime.utcnow()
for i in range(months - 1, -1, -1):
# Calculate the start of each month
month_start = datetime(now.year, now.month, 1) - timedelta(days=i * 30)
month_end = month_start + timedelta(days=30)
month_name = month_start.strftime("%b %Y")
# Calculate MRR for this month
mrr = 0.0
tier_counts = {"scout": 0, "trader": 0, "tycoon": 0}
new_subs = 0
churned = 0
for sub in all_subs:
# Was this subscription active during this month?
started_before_month_end = sub.started_at <= month_end
cancelled_after_month_start = (sub.cancelled_at is None or sub.cancelled_at >= month_start)
if started_before_month_end and cancelled_after_month_start:
price = tier_prices.get(sub.tier, 0)
mrr += price
tier_key = sub.tier.value
if tier_key in tier_counts:
tier_counts[tier_key] += 1
# New subscriptions in this month
if month_start <= sub.started_at < month_end and sub.tier != SubscriptionTier.SCOUT:
new_subs += 1
# Churned in this month
if sub.cancelled_at and month_start <= sub.cancelled_at < month_end:
churned += 1
monthly_data.append({
"month": month_name,
"mrr": round(mrr, 2),
"arr": round(mrr * 12, 2),
"paying_customers": tier_counts["trader"] + tier_counts["tycoon"],
"scout": tier_counts["scout"],
"trader": tier_counts["trader"],
"tycoon": tier_counts["tycoon"],
"new_subscriptions": new_subs,
"churn": churned,
})
# Calculate growth metrics
if len(monthly_data) >= 2:
current_mrr = monthly_data[-1]["mrr"]
prev_mrr = monthly_data[-2]["mrr"] if monthly_data[-2]["mrr"] > 0 else 1
mrr_growth = ((current_mrr - prev_mrr) / prev_mrr) * 100
else:
mrr_growth = 0
# Calculate average revenue per user (ARPU)
current_paying = monthly_data[-1]["paying_customers"] if monthly_data else 0
current_mrr = monthly_data[-1]["mrr"] if monthly_data else 0
arpu = current_mrr / current_paying if current_paying > 0 else 0
# Calculate LTV (assuming 12 month average retention)
ltv = arpu * 12
return {
"monthly_data": monthly_data,
"metrics": {
"mrr_growth_percent": round(mrr_growth, 1),
"arpu": round(arpu, 2),
"ltv": round(ltv, 2),
"total_customers": sum(m["paying_customers"] for m in monthly_data[-1:]),
},
"timestamp": datetime.utcnow().isoformat(),
}
# ============== User Management ==============
class UpdateUserRequest(BaseModel):
@ -425,48 +159,43 @@ async def list_users(
search: Optional[str] = None,
):
"""List all users with pagination and search."""
# PERF: Avoid N+1 queries (subscription + domain_count per user).
domain_counts = (
select(
Domain.user_id.label("user_id"),
func.count(Domain.id).label("domain_count"),
)
.group_by(Domain.user_id)
.subquery()
)
base = (
select(
User,
Subscription,
func.coalesce(domain_counts.c.domain_count, 0).label("domain_count"),
)
.outerjoin(Subscription, Subscription.user_id == User.id)
.outerjoin(domain_counts, domain_counts.c.user_id == User.id)
)
query = select(User).order_by(desc(User.created_at))
if search:
base = base.where(
User.email.ilike(f"%{search}%") | User.name.ilike(f"%{search}%")
query = query.where(
User.email.ilike(f"%{search}%") |
User.name.ilike(f"%{search}%")
)
# Total count (for pagination UI)
query = query.offset(offset).limit(limit)
result = await db.execute(query)
users = result.scalars().all()
# Get total count
count_query = select(func.count(User.id))
if search:
count_query = count_query.where(
User.email.ilike(f"%{search}%") | User.name.ilike(f"%{search}%")
User.email.ilike(f"%{search}%") |
User.name.ilike(f"%{search}%")
)
total = (await db.execute(count_query)).scalar() or 0
result = await db.execute(
base.order_by(desc(User.created_at)).offset(offset).limit(limit)
)
rows = result.all()
total = await db.execute(count_query)
total = total.scalar()
user_list = []
for user, subscription, domain_count in rows:
user_list.append(
{
for user in users:
# Get subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
# Get domain count
domain_count = await db.execute(
select(func.count(Domain.id)).where(Domain.user_id == user.id)
)
domain_count = domain_count.scalar()
user_list.append({
"id": user.id,
"email": user.email,
"name": user.name,
@ -475,7 +204,7 @@ async def list_users(
"is_admin": user.is_admin,
"created_at": user.created_at.isoformat(),
"last_login": user.last_login.isoformat() if user.last_login else None,
"domain_count": int(domain_count or 0),
"domain_count": domain_count,
"subscription": {
"tier": subscription.tier.value if subscription else "scout",
"tier_name": TIER_CONFIG.get(subscription.tier, {}).get("name", "Scout") if subscription else "Scout",
@ -487,10 +216,14 @@ async def list_users(
"status": None,
"domain_limit": 5,
},
}
)
})
return {"users": user_list, "total": total, "limit": limit, "offset": offset}
return {
"users": user_list,
"total": total,
"limit": limit,
"offset": offset,
}
# ============== User Export ==============
@ -505,26 +238,8 @@ async def export_users_csv(
import csv
import io
domain_counts = (
select(
Domain.user_id.label("user_id"),
func.count(Domain.id).label("domain_count"),
)
.group_by(Domain.user_id)
.subquery()
)
result = await db.execute(
select(
User,
Subscription,
func.coalesce(domain_counts.c.domain_count, 0).label("domain_count"),
)
.outerjoin(Subscription, Subscription.user_id == User.id)
.outerjoin(domain_counts, domain_counts.c.user_id == User.id)
.order_by(User.created_at)
)
users_list = result.all()
result = await db.execute(select(User).order_by(User.created_at))
users_list = result.scalars().all()
# Create CSV
output = io.StringIO()
@ -536,7 +251,19 @@ async def export_users_csv(
"Created At", "Last Login", "Tier", "Domain Limit", "Domains Used"
])
for user, subscription, domain_count in users_list:
for user in users_list:
# Get subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
# Get domain count
domain_count = await db.execute(
select(func.count(Domain.id)).where(Domain.user_id == user.id)
)
domain_count = domain_count.scalar()
writer.writerow([
user.id,
user.email,
@ -548,7 +275,7 @@ async def export_users_csv(
user.last_login.strftime("%Y-%m-%d %H:%M") if user.last_login else "",
subscription.tier.value if subscription else "scout",
subscription.domain_limit if subscription else 5,
int(domain_count or 0),
domain_count,
])
return {
@ -736,12 +463,12 @@ async def upgrade_user(
user_id=user.id,
tier=new_tier,
status=SubscriptionStatus.ACTIVE,
max_domains=config.get("domain_limit", 5),
domain_limit=config.get("domain_limit", 5),
)
db.add(subscription)
else:
subscription.tier = new_tier
subscription.max_domains = config.get("domain_limit", 5)
subscription.domain_limit = config.get("domain_limit", 5)
subscription.status = SubscriptionStatus.ACTIVE
await db.commit()
@ -819,14 +546,6 @@ async def trigger_tld_scrape(
admin: User = Depends(require_admin),
):
"""Manually trigger a TLD price scrape."""
# Prefer job queue in production (non-blocking)
if settings.enable_job_queue and settings.redis_url:
from app.jobs.client import enqueue_job
job_id = await enqueue_job("scrape_tld_prices")
return {"message": "TLD price scrape enqueued", "job_id": job_id}
# Fallback: run inline
from app.services.tld_scraper.aggregator import tld_aggregator
result = await tld_aggregator.run_scrape(db)
@ -1022,7 +741,7 @@ async def test_email(
"""Send a test email to the admin user."""
from app.services.email_service import email_service
if not email_service.is_configured():
if not email_service.is_configured:
raise HTTPException(
status_code=400,
detail="Email service is not configured. Check SMTP settings."
@ -1108,83 +827,6 @@ async def get_scheduler_status(
}
# ============== Ops: Backups (4B) ==============
@router.get("/system/backups")
async def get_backups(
admin: User = Depends(require_admin),
limit: int = 20,
):
"""List recent DB backups on the server."""
return {"backups": list_backups(limit=limit)}
@router.post("/system/backups")
async def create_db_backup(
admin: User = Depends(require_admin),
verify: bool = True,
):
"""Create a DB backup on the server (and verify it)."""
if not settings.enable_db_backups:
raise HTTPException(status_code=403, detail="DB backups are disabled (ENABLE_DB_BACKUPS=false).")
try:
result = create_backup(verify=verify)
return {
"status": "ok",
"backup": {
"path": result.path,
"size_bytes": result.size_bytes,
"created_at": result.created_at,
"verified": result.verified,
"verification_detail": result.verification_detail,
},
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Backup failed: {e}")
@router.post("/system/ops-alerts/run")
async def run_ops_alerts_now(
admin: User = Depends(require_admin),
):
"""
Run ops alert checks immediately (and send alerts if enabled).
Useful for server validation without Docker.
"""
return await run_ops_alert_checks()
@router.get("/system/ops-alerts/history")
async def get_ops_alert_history(
db: Database,
admin: User = Depends(require_admin),
limit: int = 100,
):
"""Return recent persisted ops alert events."""
limit = max(1, min(int(limit), 500))
rows = (
await db.execute(
select(OpsAlertEvent).order_by(OpsAlertEvent.created_at.desc()).limit(limit)
)
).scalars().all()
return {
"events": [
{
"id": e.id,
"alert_key": e.alert_key,
"severity": e.severity,
"title": e.title,
"detail": e.detail,
"status": e.status,
"send_reason": e.send_reason,
"recipients": e.recipients,
"created_at": e.created_at.isoformat(),
}
for e in rows
]
}
# ============== Bulk Operations ==============
class BulkUpgradeRequest(BaseModel):
@ -1397,6 +1039,7 @@ async def test_external_apis(
@router.post("/trigger-scrape")
async def trigger_auction_scrape(
background_tasks: BackgroundTasks,
db: Database,
admin: User = Depends(require_admin),
):
@ -1407,27 +1050,20 @@ async def trigger_auction_scrape(
1. Try Tier 1 APIs (DropCatch, Sedo) first
2. Fall back to web scraping for others
"""
# Prefer job queue in production (non-blocking)
if settings.enable_job_queue and settings.redis_url:
from app.jobs.client import enqueue_job
job_id = await enqueue_job("scrape_auctions")
return {
"message": "Auction scraping enqueued",
"job_id": job_id,
"note": "Check /admin/scrape-status for results",
}
# Fallback: run inline
from app.services.auction_scraper import AuctionScraperService
scraper = AuctionScraperService()
result = await scraper.scrape_all_platforms(db)
# Run scraping in background
async def run_scrape():
async with db.begin():
return await scraper.scrape_all_platforms(db)
background_tasks.add_task(run_scrape)
return {
"message": "Auction scraping completed",
"result": result,
"note": "Check /admin/scrape-status for results",
"message": "Auction scraping started in background",
"note": "Check /admin/scrape-status for results"
}
@ -1468,261 +1104,3 @@ async def get_scrape_status(
for log in logs
]
}
# ============== Subscription Management ==============
class SubscriptionUpdate(BaseModel):
"""Manual subscription update request."""
tier: str # "scout", "trader", "tycoon"
@router.post("/users/{user_id}/sync-subscription")
async def sync_user_subscription_from_stripe(
user_id: int,
db: Database,
admin: User = Depends(require_admin),
):
"""
Sync a user's subscription status from Stripe.
Use this if the webhook failed to update the subscription.
"""
import stripe
import os
stripe.api_key = os.getenv("STRIPE_SECRET_KEY")
if not stripe.api_key:
raise HTTPException(status_code=503, detail="Stripe not configured")
# Get user
result = await db.execute(select(User).where(User.id == user_id))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=404, detail="User not found")
if not user.stripe_customer_id:
raise HTTPException(status_code=400, detail="User has no Stripe customer ID")
# Get subscriptions from Stripe
try:
subscriptions = stripe.Subscription.list(
customer=user.stripe_customer_id,
status="active",
limit=1
)
except stripe.error.StripeError as e:
raise HTTPException(status_code=500, detail=f"Stripe error: {e}")
if not subscriptions.data:
return {
"status": "no_active_subscription",
"message": "No active subscription found in Stripe",
"user_email": user.email,
"stripe_customer_id": user.stripe_customer_id
}
stripe_sub = subscriptions.data[0]
# Access items via dict notation (Stripe returns StripeObject)
items_data = stripe_sub.get("items", {}).get("data", [])
price_id = items_data[0].get("price", {}).get("id") if items_data else None
# Map price_id to tier
trader_price = os.getenv("STRIPE_PRICE_TRADER")
tycoon_price = os.getenv("STRIPE_PRICE_TYCOON")
if price_id == trader_price:
tier = SubscriptionTier.TRADER
tier_name = "trader"
elif price_id == tycoon_price:
tier = SubscriptionTier.TYCOON
tier_name = "tycoon"
else:
return {
"status": "unknown_price",
"message": f"Unknown price ID: {price_id}",
"stripe_subscription_id": stripe_sub.id
}
# Update subscription in database
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
tier_config = TIER_CONFIG[tier]
if subscription:
old_tier = subscription.tier
subscription.tier = tier
subscription.status = SubscriptionStatus.ACTIVE
subscription.stripe_subscription_id = stripe_sub.id
subscription.max_domains = tier_config["domain_limit"]
subscription.check_frequency = tier_config["check_frequency"]
else:
subscription = Subscription(
user_id=user.id,
tier=tier,
status=SubscriptionStatus.ACTIVE,
stripe_subscription_id=stripe_sub.id,
max_domains=tier_config["domain_limit"],
check_frequency=tier_config["check_frequency"],
)
db.add(subscription)
old_tier = None
await db.commit()
return {
"status": "synced",
"user_email": user.email,
"stripe_customer_id": user.stripe_customer_id,
"stripe_subscription_id": stripe_sub.id,
"old_tier": old_tier.value if old_tier else None,
"new_tier": tier.value,
"tier_config": {
"domain_limit": tier_config["domain_limit"],
"check_frequency": tier_config["check_frequency"],
}
}
@router.post("/users/{user_id}/set-subscription")
async def set_user_subscription(
user_id: int,
update: SubscriptionUpdate,
db: Database,
admin: User = Depends(require_admin),
):
"""
Manually set a user's subscription tier.
Use this to manually upgrade/downgrade users (e.g., for refunds or promotions).
"""
tier_map = {
"scout": SubscriptionTier.SCOUT,
"trader": SubscriptionTier.TRADER,
"tycoon": SubscriptionTier.TYCOON,
}
if update.tier.lower() not in tier_map:
raise HTTPException(status_code=400, detail=f"Invalid tier: {update.tier}")
tier = tier_map[update.tier.lower()]
# Get user
result = await db.execute(select(User).where(User.id == user_id))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=404, detail="User not found")
# Get/create subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
tier_config = TIER_CONFIG[tier]
if subscription:
old_tier = subscription.tier
subscription.tier = tier
subscription.status = SubscriptionStatus.ACTIVE
subscription.max_domains = tier_config["domain_limit"]
subscription.check_frequency = tier_config["check_frequency"]
else:
subscription = Subscription(
user_id=user.id,
tier=tier,
status=SubscriptionStatus.ACTIVE,
max_domains=tier_config["domain_limit"],
check_frequency=tier_config["check_frequency"],
)
db.add(subscription)
old_tier = None
await db.commit()
return {
"status": "updated",
"user_email": user.email,
"user_id": user.id,
"old_tier": old_tier.value if old_tier else None,
"new_tier": tier.value,
}
# ============== Listing Debug Endpoints ==============
@router.get("/listings/debug")
async def debug_listings(
domain: Optional[str] = None,
slug: Optional[str] = None,
db: Database = None,
_: User = Depends(require_admin),
):
"""Debug listings - search by domain or slug (ignores status)."""
query = select(DomainListing)
if domain:
query = query.where(DomainListing.domain.ilike(f"%{domain}%"))
if slug:
query = query.where(DomainListing.slug.ilike(f"%{slug}%"))
query = query.order_by(desc(DomainListing.created_at)).limit(20)
result = await db.execute(query)
listings = list(result.scalars().all())
return {
"count": len(listings),
"listings": [
{
"id": l.id,
"domain": l.domain,
"slug": l.slug,
"status": l.status,
"is_verified": l.is_verified,
"verification_status": l.verification_status,
"public_url": l.public_url,
"created_at": str(l.created_at) if l.created_at else None,
"published_at": str(l.published_at) if l.published_at else None,
"user_id": l.user_id,
}
for l in listings
]
}
@router.post("/listings/{listing_id}/force-activate")
async def force_activate_listing(
listing_id: int,
db: Database = None,
_: User = Depends(require_admin),
):
"""Force-activate a listing (bypass DNS verification)."""
result = await db.execute(
select(DomainListing).where(DomainListing.id == listing_id)
)
listing = result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
listing.status = "active"
listing.is_verified = True
listing.verification_status = "verified"
listing.published_at = datetime.utcnow()
await db.commit()
return {
"status": "activated",
"listing_id": listing.id,
"domain": listing.domain,
"slug": listing.slug,
"public_url": listing.public_url,
}

View File

@ -1,36 +0,0 @@
"""Analyze API endpoints (Alpha Terminal - Diligence)."""
from __future__ import annotations
from fastapi import APIRouter, Query, Request
from slowapi import Limiter
from slowapi.util import get_remote_address
from app.api.deps import CurrentUser, Database
from app.schemas.analyze import AnalyzeResponse
from app.services.analyze.service import get_domain_analysis
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
@router.get("/{domain}", response_model=AnalyzeResponse)
@limiter.limit("60/minute")
async def analyze_domain(
request: Request,
domain: str,
current_user: CurrentUser,
db: Database,
fast: bool = Query(False, description="Skip slower HTTP/SSL checks"),
refresh: bool = Query(False, description="Bypass cache and recompute"),
):
"""
Analyze a domain with open-data-first signals.
Requires authentication (Terminal feature).
"""
_ = current_user # enforce auth
res = await get_domain_analysis(db, domain, fast=fast, refresh=refresh)
await db.commit() # persist cache upsert
return res

View File

@ -161,14 +161,11 @@ class MarketFeedResponse(BaseModel):
# ============== Helper Functions ==============
def _format_time_remaining(end_time: datetime, now: Optional[datetime] = None) -> str:
def _format_time_remaining(end_time: datetime) -> str:
"""Format time remaining in human-readable format."""
ref = now or datetime.utcnow()
delta = end_time - ref
delta = end_time - datetime.utcnow()
# Small grace window to avoid displaying "Ended" due to request processing time.
# If an auction ends within the next ~2 seconds, we show "0m".
if delta.total_seconds() <= -2:
if delta.total_seconds() <= 0:
return "Ended"
hours = int(delta.total_seconds() // 3600)
@ -180,31 +177,19 @@ def _format_time_remaining(end_time: datetime, now: Optional[datetime] = None) -
elif hours > 0:
return f"{hours}h {minutes}m"
else:
return f"{max(minutes, 0)}m"
return f"{minutes}m"
def _get_affiliate_url(platform: str, domain: str, auction_url: str) -> str:
"""Get affiliate URL for a platform - links directly to the auction page with affiliate tracking."""
# SEDO SPECIAL CASE: Always use direct Sedo link with partner ID
# This ensures we get affiliate revenue even from scraped data
if platform == "Sedo":
return f"https://sedo.com/search/details/?domain={domain}&partnerid=335830"
# Import here to avoid circular imports
from app.services.hidden_api_scrapers import build_affiliate_url
# Try to build affiliate URL first (includes partner IDs)
affiliate_url = build_affiliate_url(platform, domain, auction_url)
if affiliate_url:
return affiliate_url
"""Get affiliate URL for a platform - links directly to the auction page."""
# Use the scraped auction URL directly if available
if auction_url and auction_url.startswith("http"):
return auction_url
# Fallback to platform-specific search/listing pages (without affiliate tracking)
# Fallback to platform-specific search/listing pages
platform_urls = {
"GoDaddy": f"https://auctions.godaddy.com/trpItemListing.aspx?domain={domain}",
"Sedo": f"https://sedo.com/search/?keyword={domain}",
"NameJet": f"https://www.namejet.com/Pages/Auctions/BackorderSearch.aspx?q={domain}",
"DropCatch": f"https://www.dropcatch.com/domain/{domain}",
"ExpiredDomains": f"https://www.expireddomains.net/domain-name-search/?q={domain}",
@ -218,8 +203,7 @@ def _get_affiliate_url(platform: str, domain: str, auction_url: str) -> str:
async def _convert_to_listing(
auction: DomainAuction,
db: AsyncSession,
include_valuation: bool = True,
now: Optional[datetime] = None,
include_valuation: bool = True
) -> AuctionListing:
"""Convert database auction to API response."""
valuation_data = None
@ -250,7 +234,7 @@ async def _convert_to_listing(
currency=auction.currency,
num_bids=auction.num_bids,
end_time=auction.end_time,
time_remaining=_format_time_remaining(auction.end_time, now=now),
time_remaining=_format_time_remaining(auction.end_time),
buy_now_price=auction.buy_now_price,
reserve_met=auction.reserve_met,
traffic=auction.traffic,
@ -399,7 +383,7 @@ async def search_auctions(
# Convert to response with valuations
listings = []
for auction in auctions:
listing = await _convert_to_listing(auction, db, include_valuation=True, now=now)
listing = await _convert_to_listing(auction, db, include_valuation=True)
listings.append(listing)
# Sort by value_ratio if requested (after valuation)
@ -442,8 +426,7 @@ async def get_ending_soon(
Data is scraped from public auction sites - no mock data.
"""
now = datetime.utcnow()
cutoff = now + timedelta(hours=hours)
cutoff = datetime.utcnow() + timedelta(hours=hours)
query = (
select(DomainAuction)
@ -451,7 +434,7 @@ async def get_ending_soon(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time <= cutoff,
DomainAuction.end_time > now,
DomainAuction.end_time > datetime.utcnow(),
)
)
.order_by(DomainAuction.end_time.asc())
@ -463,7 +446,7 @@ async def get_ending_soon(
listings = []
for auction in auctions:
listing = await _convert_to_listing(auction, db, include_valuation=True, now=now)
listing = await _convert_to_listing(auction, db, include_valuation=True)
listings.append(listing)
return listings
@ -498,7 +481,7 @@ async def get_hot_auctions(
listings = []
for auction in auctions:
listing = await _convert_to_listing(auction, db, include_valuation=True, now=now)
listing = await _convert_to_listing(auction, db, include_valuation=True)
listings.append(listing)
return listings
@ -514,7 +497,6 @@ async def get_platform_stats(
Data is scraped from public auction sites - no mock data.
"""
now = datetime.utcnow()
# Get stats per platform
stats_query = (
select(
@ -522,12 +504,7 @@ async def get_platform_stats(
func.count(DomainAuction.id).label("count"),
func.avg(DomainAuction.current_bid).label("avg_bid"),
)
.where(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now,
)
)
.where(DomainAuction.is_active == True)
.group_by(DomainAuction.platform)
)
@ -535,7 +512,7 @@ async def get_platform_stats(
platform_data = result.all()
# Get ending soon counts
cutoff = now + timedelta(hours=1)
cutoff = datetime.utcnow() + timedelta(hours=1)
ending_query = (
select(
DomainAuction.platform,
@ -544,7 +521,6 @@ async def get_platform_stats(
.where(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now,
DomainAuction.end_time <= cutoff,
)
)
@ -572,7 +548,6 @@ async def get_scrape_status(
db: AsyncSession = Depends(get_db),
):
"""Get status of auction scraping."""
now = datetime.utcnow()
# Get last successful scrape
last_scrape_query = (
select(AuctionScrapeLog)
@ -584,12 +559,7 @@ async def get_scrape_status(
last_log = result.scalar_one_or_none()
# Get total auctions
total_query = select(func.count(DomainAuction.id)).where(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now,
)
)
total_query = select(func.count(DomainAuction.id)).where(DomainAuction.is_active == True)
total_result = await db.execute(total_query)
total = total_result.scalar() or 0
@ -629,48 +599,25 @@ async def trigger_scrape(
raise HTTPException(status_code=500, detail=f"Scrape failed: {str(e)}")
@router.get("/sedo")
async def get_sedo_listings(
keyword: Optional[str] = Query(None, description="Search keyword"),
tld: Optional[str] = Query(None, description="Filter by TLD"),
limit: int = Query(50, le=100),
current_user: Optional[User] = Depends(get_current_user_optional),
@router.post("/seed")
async def seed_auctions(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Get live domain listings from Sedo marketplace.
Returns real-time data from Sedo API with affiliate tracking.
All links include Pounce partner ID for commission tracking.
Seed the database with realistic sample auction data.
Useful for development and demo purposes.
"""
from app.services.sedo_api import sedo_client
if not sedo_client.is_configured:
return {
"items": [],
"error": "Sedo API not configured",
"source": "sedo"
}
try:
listings = await sedo_client.get_listings_for_display(
keyword=keyword,
tld=tld,
page_size=limit
)
result = await auction_scraper.seed_sample_auctions(db)
return {
"items": listings,
"count": len(listings),
"source": "sedo",
"affiliate_note": "All links include Pounce partner ID for commission tracking"
"status": "success",
"message": "Sample auctions seeded",
"result": result,
}
except Exception as e:
logger.error(f"Sedo API error: {e}")
return {
"items": [],
"error": str(e),
"source": "sedo"
}
logger.error(f"Seeding failed: {e}")
raise HTTPException(status_code=500, detail=f"Seeding failed: {str(e)}")
@router.get("/opportunities")
@ -689,15 +636,9 @@ async def get_smart_opportunities(
Opportunity Score = time_urgency × competition_factor × price_factor
"""
# Get active auctions
now = datetime.utcnow()
query = (
select(DomainAuction)
.where(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now,
)
)
.where(DomainAuction.is_active == True)
.order_by(DomainAuction.end_time.asc())
.limit(100)
)
@ -777,7 +718,7 @@ async def get_smart_opportunities(
if opportunity_score < 3:
continue
listing = await _convert_to_listing(auction, db, include_valuation=False, now=now)
listing = await _convert_to_listing(auction, db, include_valuation=False)
recommendation = (
"🔥 Hot" if opportunity_score >= 10 else
@ -841,16 +782,76 @@ def _get_opportunity_reasoning(value_ratio: float, hours_left: float, num_bids:
def _calculate_pounce_score_v2(domain: str, tld: str, num_bids: int = 0, age_years: int = 0, is_pounce: bool = False) -> int:
# Backward-compatible wrapper (shared implementation lives in services)
from app.services.pounce_score import calculate_pounce_score_v2
return calculate_pounce_score_v2(
domain,
tld,
num_bids=num_bids,
age_years=age_years,
is_pounce=is_pounce,
)
"""
Pounce Score v2.0 - Enhanced scoring algorithm.
Factors:
- Length (shorter = more valuable)
- TLD premium
- Market activity (bids)
- Age bonus
- Pounce Direct bonus (verified listings)
- Penalties (hyphens, numbers, etc.)
"""
score = 50 # Baseline
name = domain.rsplit('.', 1)[0] if '.' in domain else domain
# A) LENGTH BONUS (exponential for short domains)
length_scores = {1: 50, 2: 45, 3: 40, 4: 30, 5: 20, 6: 15, 7: 10}
score += length_scores.get(len(name), max(0, 15 - len(name)))
# B) TLD PREMIUM
tld_scores = {
'com': 20, 'ai': 25, 'io': 18, 'co': 12,
'ch': 15, 'de': 10, 'net': 8, 'org': 8,
'app': 10, 'dev': 10, 'xyz': 5
}
score += tld_scores.get(tld.lower(), 0)
# C) MARKET ACTIVITY (bids = demand signal)
if num_bids >= 20:
score += 15
elif num_bids >= 10:
score += 10
elif num_bids >= 5:
score += 5
elif num_bids >= 2:
score += 2
# D) AGE BONUS (established domains)
if age_years and age_years > 15:
score += 10
elif age_years and age_years > 10:
score += 7
elif age_years and age_years > 5:
score += 3
# E) POUNCE DIRECT BONUS (verified = trustworthy)
if is_pounce:
score += 10
# F) PENALTIES
if '-' in name:
score -= 25
if any(c.isdigit() for c in name) and len(name) > 3:
score -= 20
if len(name) > 15:
score -= 15
# G) CONSONANT CHECK (no gibberish like "xkqzfgh")
consonants = 'bcdfghjklmnpqrstvwxyz'
max_streak = 0
current_streak = 0
for c in name.lower():
if c in consonants:
current_streak += 1
max_streak = max(max_streak, current_streak)
else:
current_streak = 0
if max_streak > 4:
score -= 15
return max(0, min(100, score))
def _is_premium_domain(domain_name: str) -> bool:
@ -939,101 +940,52 @@ async def get_market_feed(
POUNCE EXCLUSIVE domains are highlighted and appear first.
"""
# NOTE: This endpoint is called frequently by the Market UI.
# Avoid loading *all* auctions/listings into Python. Instead, we:
# - Apply filters + ordering in SQL where possible
# - Over-fetch a bounded window for combined feeds ("all") and score-sorting
now = datetime.utcnow()
tld_clean = tld.lower().lstrip(".") if tld else None
requested = offset + limit
fetch_window = min(max(requested * 3, 200), 2000) # bounded overfetch for merge/sort
built: list[dict] = [] # {"item": MarketFeedItem, "newest_ts": datetime}
# -----------------------------
# Build base filters (SQL-side)
# -----------------------------
listing_filters = [DomainListing.status == ListingStatus.ACTIVE.value]
if keyword:
listing_filters.append(DomainListing.domain.ilike(f"%{keyword}%"))
if verified_only:
listing_filters.append(DomainListing.verification_status == VerificationStatus.VERIFIED.value)
if min_price is not None:
listing_filters.append(DomainListing.asking_price >= min_price)
if max_price is not None:
listing_filters.append(DomainListing.asking_price <= max_price)
if tld_clean:
listing_filters.append(DomainListing.domain.ilike(f"%.{tld_clean}"))
auction_filters = [
DomainAuction.is_active == True,
DomainAuction.end_time > now,
]
if keyword:
auction_filters.append(DomainAuction.domain.ilike(f"%{keyword}%"))
if tld_clean:
auction_filters.append(DomainAuction.tld == tld_clean)
if min_price is not None:
auction_filters.append(DomainAuction.current_bid >= min_price)
if max_price is not None:
auction_filters.append(DomainAuction.current_bid <= max_price)
if ending_within:
cutoff = now + timedelta(hours=ending_within)
auction_filters.append(DomainAuction.end_time <= cutoff)
# -----------------------------
# Counts (used for UI stats)
# -----------------------------
pounce_total = 0
auction_total = 0
items: List[MarketFeedItem] = []
pounce_count = 0
auction_count = 0
# ═══════════════════════════════════════════════════════════════
# 1. POUNCE DIRECT LISTINGS (Our USP!)
# ═══════════════════════════════════════════════════════════════
if source in ["all", "pounce"]:
pounce_total = (await db.execute(select(func.count(DomainListing.id)).where(and_(*listing_filters)))).scalar() or 0
if source in ["all", "external"]:
auction_total = (await db.execute(select(func.count(DomainAuction.id)).where(and_(*auction_filters)))).scalar() or 0
# -----------------------------
# Fetch + build items (bounded)
# -----------------------------
# For "all": fetch a bounded window from each source and then merge/sort in Python.
# For single-source: fetch offset/limit directly when sort can be pushed to SQL.
listing_offset = 0
listing_limit = fetch_window
auction_offset = 0
auction_limit = fetch_window
if source == "pounce":
listing_offset = offset
listing_limit = limit
if source == "external":
auction_offset = offset
auction_limit = limit
# Pounce Direct listings
if source in ["all", "pounce"]:
listing_query = select(DomainListing).where(and_(*listing_filters))
# SQL ordering for listings (best-effort)
if sort_by == "price_asc":
listing_query = listing_query.order_by(func.coalesce(DomainListing.asking_price, 0).asc())
elif sort_by == "price_desc":
listing_query = listing_query.order_by(func.coalesce(DomainListing.asking_price, 0).desc())
elif sort_by == "newest":
listing_query = listing_query.order_by(DomainListing.updated_at.desc())
else:
# score/time: prefer higher score first for listings
listing_query = listing_query.order_by(DomainListing.pounce_score.desc(), DomainListing.updated_at.desc())
listing_query = listing_query.offset(listing_offset).limit(listing_limit)
listings = (await db.execute(listing_query)).scalars().all()
listing_query = select(DomainListing).where(
DomainListing.status == ListingStatus.ACTIVE.value
)
if keyword:
listing_query = listing_query.where(
DomainListing.domain.ilike(f"%{keyword}%")
)
if verified_only:
listing_query = listing_query.where(
DomainListing.verification_status == VerificationStatus.VERIFIED.value
)
if min_price is not None:
listing_query = listing_query.where(DomainListing.asking_price >= min_price)
if max_price is not None:
listing_query = listing_query.where(DomainListing.asking_price <= max_price)
result = await db.execute(listing_query)
listings = result.scalars().all()
for listing in listings:
domain_tld = listing.domain.rsplit(".", 1)[1] if "." in listing.domain else ""
pounce_score = listing.pounce_score or _calculate_pounce_score_v2(listing.domain, domain_tld, is_pounce=True)
domain_tld = listing.domain.rsplit('.', 1)[1] if '.' in listing.domain else ""
# Apply TLD filter
if tld and domain_tld.lower() != tld.lower().lstrip('.'):
continue
pounce_score = listing.pounce_score or _calculate_pounce_score_v2(
listing.domain, domain_tld, is_pounce=True
)
# Apply score filter
if pounce_score < min_score:
continue
item = MarketFeedItem(
items.append(MarketFeedItem(
id=f"pounce-{listing.id}",
domain=listing.domain,
tld=domain_tld,
@ -1049,50 +1001,61 @@ async def get_market_feed(
url=f"/buy/{listing.slug}",
is_external=False,
pounce_score=pounce_score,
)
built.append({"item": item, "newest_ts": listing.updated_at or listing.created_at or datetime.min})
))
pounce_count += 1
# External auctions (from DB)
# ═══════════════════════════════════════════════════════════════
# 2. EXTERNAL AUCTIONS (Scraped from platforms)
# ═══════════════════════════════════════════════════════════════
if source in ["all", "external"]:
auction_query = select(DomainAuction).where(and_(*auction_filters))
# SQL ordering for auctions when possible
if sort_by == "time":
auction_query = auction_query.order_by(DomainAuction.end_time.asc())
elif sort_by == "price_asc":
auction_query = auction_query.order_by(DomainAuction.current_bid.asc())
elif sort_by == "price_desc":
auction_query = auction_query.order_by(DomainAuction.current_bid.desc())
elif sort_by == "newest":
auction_query = auction_query.order_by(DomainAuction.updated_at.desc())
else:
# score: prefer persisted score for DB-level sorting
auction_query = auction_query.order_by(
func.coalesce(DomainAuction.pounce_score, 0).desc(),
DomainAuction.updated_at.desc(),
now = datetime.utcnow()
auction_query = select(DomainAuction).where(
and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now # ← KRITISCH: Nur laufende Auktionen!
)
auction_query = auction_query.offset(auction_offset).limit(auction_limit)
auctions = (await db.execute(auction_query)).scalars().all()
)
if keyword:
auction_query = auction_query.where(
DomainAuction.domain.ilike(f"%{keyword}%")
)
if tld:
auction_query = auction_query.where(
DomainAuction.tld == tld.lower().lstrip('.')
)
if min_price is not None:
auction_query = auction_query.where(DomainAuction.current_bid >= min_price)
if max_price is not None:
auction_query = auction_query.where(DomainAuction.current_bid <= max_price)
if ending_within:
cutoff = datetime.utcnow() + timedelta(hours=ending_within)
auction_query = auction_query.where(DomainAuction.end_time <= cutoff)
result = await db.execute(auction_query)
auctions = result.scalars().all()
for auction in auctions:
# Vanity filter for anonymous users
# Apply vanity filter for non-authenticated users
if current_user is None and not _is_premium_domain(auction.domain):
continue
pounce_score = auction.pounce_score
if pounce_score is None:
pounce_score = _calculate_pounce_score_v2(
auction.domain,
auction.tld,
num_bids=auction.num_bids,
age_years=auction.age_years or 0,
is_pounce=False,
)
pounce_score = _calculate_pounce_score_v2(
auction.domain,
auction.tld,
num_bids=auction.num_bids,
age_years=auction.age_years or 0,
is_pounce=False
)
# Apply score filter
if pounce_score < min_score:
continue
item = MarketFeedItem(
items.append(MarketFeedItem(
id=f"auction-{auction.id}",
domain=auction.domain,
tld=auction.tld,
@ -1103,139 +1066,53 @@ async def get_market_feed(
source=auction.platform,
is_pounce=False,
verified=False,
time_remaining=_format_time_remaining(auction.end_time, now=now),
time_remaining=_format_time_remaining(auction.end_time),
end_time=auction.end_time,
num_bids=auction.num_bids,
url=_get_affiliate_url(auction.platform, auction.domain, auction.auction_url),
is_external=True,
pounce_score=pounce_score,
)
built.append({"item": item, "newest_ts": auction.updated_at or auction.scraped_at or datetime.min})
# =========================================================================
# LIVE SEDO DATA - Fetch and merge real-time listings from Sedo API
# =========================================================================
try:
from app.services.sedo_api import sedo_client
if sedo_client.is_configured:
# Use search keyword or fall back to popular terms for discovery
sedo_keyword = keyword
if not sedo_keyword:
# Fetch popular domains when no specific search
import random
popular_terms = ["ai", "tech", "crypto", "app", "cloud", "digital", "smart", "pro"]
sedo_keyword = random.choice(popular_terms)
# Fetch live Sedo listings (limit to avoid slow responses)
sedo_listings = await sedo_client.get_listings_for_display(
keyword=sedo_keyword,
tld=tld_clean,
page_size=min(30, limit) # Cap at 30 to avoid slow API calls
)
# Track domains already in results to avoid duplicates
existing_domains = {item["item"].domain.lower() for item in built}
for sedo_item in sedo_listings:
domain = sedo_item.get("domain", "").lower()
# Skip if already have this domain from scraped data
if domain in existing_domains:
continue
# Apply vanity filter for anonymous users
if current_user is None and not _is_premium_domain(domain):
continue
# Apply price filters
price = sedo_item.get("price", 0)
if min_price is not None and price < min_price and price > 0:
continue
if max_price is not None and price > max_price:
continue
domain_tld = sedo_item.get("tld", "")
pounce_score = _calculate_pounce_score_v2(
domain,
domain_tld,
num_bids=0,
age_years=0,
is_pounce=False,
)
if pounce_score < min_score:
continue
# Determine price type
price_type = "bid" if sedo_item.get("is_auction") else (
"negotiable" if price == 0 else "fixed"
)
item = MarketFeedItem(
id=f"sedo-live-{hash(domain) % 1000000}",
domain=domain,
tld=domain_tld,
price=price,
currency="USD",
price_type=price_type,
status="auction" if sedo_item.get("is_auction") else "instant",
source="Sedo",
is_pounce=False,
verified=False,
time_remaining=None,
end_time=None,
num_bids=None,
url=sedo_item.get("url", ""),
is_external=True,
pounce_score=pounce_score,
)
built.append({"item": item, "newest_ts": now})
existing_domains.add(domain)
# Update auction count
auction_total += 1
except Exception as e:
logger.warning(f"Failed to fetch live Sedo data: {e}")
))
auction_count += 1
# -----------------------------
# Merge sort (Python) + paginate
# -----------------------------
# ═══════════════════════════════════════════════════════════════
# 3. SORT (Pounce Direct always appears first within same score)
# ═══════════════════════════════════════════════════════════════
if sort_by == "score":
built.sort(key=lambda x: (x["item"].pounce_score, int(x["item"].is_pounce), x["item"].domain), reverse=True)
items.sort(key=lambda x: (-x.pounce_score, -int(x.is_pounce), x.domain))
elif sort_by == "price_asc":
built.sort(key=lambda x: (x["item"].price, -int(x["item"].is_pounce), x["item"].domain))
items.sort(key=lambda x: (x.price, -int(x.is_pounce), x.domain))
elif sort_by == "price_desc":
built.sort(key=lambda x: (-x["item"].price, -int(x["item"].is_pounce), x["item"].domain))
items.sort(key=lambda x: (-x.price, -int(x.is_pounce), x.domain))
elif sort_by == "time":
# Pounce Direct first (no time limit), then by end time
built.sort(
key=lambda x: (0 if x["item"].is_pounce else 1, x["item"].end_time or datetime.max)
)
def time_sort_key(x):
if x.is_pounce:
return (0, datetime.max)
return (1, x.end_time or datetime.max)
items.sort(key=time_sort_key)
elif sort_by == "newest":
built.sort(key=lambda x: (int(x["item"].is_pounce), x["newest_ts"]), reverse=True)
items.sort(key=lambda x: (-int(x.is_pounce), x.domain))
total = pounce_total + auction_total if source == "all" else (pounce_total if source == "pounce" else auction_total)
total = len(items)
page_slice = built[offset:offset + limit]
items = [x["item"] for x in page_slice]
# Pagination
items = items[offset:offset + limit]
# Unique sources (after pagination)
# Get unique sources
sources = list(set(item.source for item in items))
# Last update time (auctions)
if source in ["all", "external"]:
last_update_result = await db.execute(select(func.max(DomainAuction.updated_at)))
last_updated = last_update_result.scalar() or now
else:
last_updated = now
# Last update time
last_update_result = await db.execute(
select(func.max(DomainAuction.updated_at))
)
last_updated = last_update_result.scalar() or datetime.utcnow()
return MarketFeedResponse(
items=items,
total=total,
pounce_direct_count=pounce_total,
auction_count=auction_total,
pounce_direct_count=pounce_count,
auction_count=auction_count,
sources=sources,
last_updated=last_updated,
filters_applied={
@ -1248,5 +1125,5 @@ async def get_market_feed(
"ending_within": ending_within,
"verified_only": verified_only,
"sort_by": sort_by,
},
}
)

View File

@ -14,11 +14,10 @@ Endpoints:
import os
import secrets
import logging
import re
from datetime import datetime, timedelta
from typing import Optional
from fastapi import APIRouter, HTTPException, status, BackgroundTasks, Request, Response
from fastapi import APIRouter, HTTPException, status, BackgroundTasks, Request
from pydantic import BaseModel, EmailStr
from sqlalchemy import select
from slowapi import Limiter
@ -26,24 +25,10 @@ from slowapi.util import get_remote_address
from app.api.deps import Database, CurrentUser
from app.config import get_settings
from app.schemas.auth import (
LoginResponse,
ReferralLinkResponse,
ReferralStats,
UserCreate,
UserLogin,
UserResponse,
)
from app.schemas.auth import UserCreate, UserLogin, UserResponse, Token
from app.services.auth import AuthService
from app.services.email_service import email_service
from app.models.user import User
from app.security import set_auth_cookie, clear_auth_cookie
from app.services.telemetry import track_event
from app.services.referral_rewards import (
QUALIFIED_REFERRAL_BATCH_SIZE,
apply_referral_rewards_for_user,
compute_badge,
)
logger = logging.getLogger(__name__)
@ -86,9 +71,7 @@ class UpdateUserRequest(BaseModel):
# ============== Endpoints ==============
@router.post("/register", response_model=UserResponse, status_code=status.HTTP_201_CREATED)
@limiter.limit("5/minute")
async def register(
request: Request,
user_data: UserCreate,
db: Database,
background_tasks: BackgroundTasks,
@ -116,62 +99,6 @@ async def register(
name=user_data.name,
)
# Process referral if present.
# Supported formats:
# - yield_{user_id}_{domain_id}
# - invite code (12 hex chars)
referral_applied = False
referrer_user_id: Optional[int] = None
referral_type: Optional[str] = None
if user_data.ref:
ref_raw = user_data.ref.strip()
# Yield referral: yield_{user_id}_{domain_id}
if ref_raw.startswith("yield_"):
try:
parts = ref_raw.split("_")
if len(parts) >= 3:
referrer_user_id = int(parts[1])
user.referred_by_user_id = referrer_user_id
user.referral_code = ref_raw
referral_type = "yield"
# Try to map the yield_domain_id to a domain string
try:
from app.models.yield_domain import YieldDomain
yield_domain_id = int(parts[2])
yd_res = await db.execute(select(YieldDomain).where(YieldDomain.id == yield_domain_id))
yd = yd_res.scalar_one_or_none()
if yd:
user.referred_by_domain = yd.domain
except Exception:
pass
await db.commit()
referral_applied = True
logger.info("User %s referred via yield by user %s", user.email, referrer_user_id)
except Exception as e:
logger.warning("Failed to process yield referral code: %s, error: %s", ref_raw, e)
else:
# Invite code referral (viral loop)
code = ref_raw.lower()
if re.fullmatch(r"[0-9a-f]{12}", code):
try:
ref_user_res = await db.execute(select(User).where(User.invite_code == code))
ref_user = ref_user_res.scalar_one_or_none()
if ref_user and ref_user.id != user.id:
referrer_user_id = ref_user.id
user.referred_by_user_id = ref_user.id
user.referral_code = code
referral_type = "invite"
await db.commit()
referral_applied = True
logger.info("User %s referred via invite_code by user %s", user.email, ref_user.id)
except Exception as e:
logger.warning("Failed to process invite referral code: %s, error: %s", code, e)
# Auto-admin for specific email
ADMIN_EMAILS = ["guggeryves@hotmail.com"]
if user.email.lower() in [e.lower() for e in ADMIN_EMAILS]:
@ -179,64 +106,27 @@ async def register(
user.is_verified = True # Auto-verify admins
await db.commit()
# Give admin Tycoon subscription (only if no subscription exists)
# Give admin Tycoon subscription
from app.models.subscription import Subscription, SubscriptionTier, SubscriptionStatus, TIER_CONFIG
from sqlalchemy import select
# Check if subscription already exists
existing_sub = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
tycoon_config = TIER_CONFIG.get(SubscriptionTier.TYCOON, {})
subscription = Subscription(
user_id=user.id,
tier=SubscriptionTier.TYCOON,
status=SubscriptionStatus.ACTIVE,
max_domains=tycoon_config.get("domain_limit", 500),
)
if not existing_sub.scalar_one_or_none():
tycoon_config = TIER_CONFIG.get(SubscriptionTier.TYCOON, {})
subscription = Subscription(
user_id=user.id,
tier=SubscriptionTier.TYCOON,
status=SubscriptionStatus.ACTIVE,
max_domains=tycoon_config.get("domain_limit", 500),
)
db.add(subscription)
await db.commit()
db.add(subscription)
await db.commit()
# Generate verification token
verification_token = secrets.token_urlsafe(32)
user.email_verification_token = verification_token
user.email_verification_expires = datetime.utcnow() + timedelta(hours=24)
await db.commit()
# Telemetry: registration + referral attribution
try:
await track_event(
db,
event_name="user_registered",
request=request,
user_id=user.id,
is_authenticated=False,
source="public",
metadata={"ref": bool(user_data.ref)},
)
if referral_applied:
await track_event(
db,
event_name="referral_attributed",
request=request,
user_id=user.id,
is_authenticated=False,
source="public",
metadata={
"referral_type": referral_type,
"referrer_user_id": referrer_user_id,
"ref": user_data.ref,
},
)
await db.commit()
except Exception:
# never block registration
pass
# Send verification email in background
if email_service.is_configured():
site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
if email_service.is_configured:
site_url = os.getenv("SITE_URL", "http://localhost:3000")
verify_url = f"{site_url}/verify-email?token={verification_token}"
background_tasks.add_task(
@ -249,104 +139,8 @@ async def register(
return user
@router.get("/referral", response_model=ReferralLinkResponse)
async def get_referral_link(
request: Request,
current_user: CurrentUser,
db: Database,
days: int = 30,
):
"""Return the authenticated user's invite link."""
if not current_user.invite_code:
# Generate on demand for older users
for _ in range(12):
code = secrets.token_hex(6)
exists = await db.execute(select(User.id).where(User.invite_code == code))
if exists.scalar_one_or_none() is None:
current_user.invite_code = code
await db.commit()
break
if not current_user.invite_code:
raise HTTPException(status_code=500, detail="Failed to generate invite code")
# Apply rewards (idempotent) so UI reflects current state even without scheduler
snapshot = await apply_referral_rewards_for_user(db, current_user.id)
await db.commit()
base = (settings.site_url or "http://localhost:3000").rstrip("/")
url = f"{base}/register?ref={current_user.invite_code}"
try:
await track_event(
db,
event_name="referral_link_viewed",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
metadata={"invite_code": current_user.invite_code},
)
await db.commit()
except Exception:
pass
# Count link views in the chosen window
try:
from datetime import timedelta
from sqlalchemy import and_, func
from app.models.telemetry import TelemetryEvent
window_days = max(1, min(int(days), 365))
end = datetime.utcnow()
start = end - timedelta(days=window_days)
views = (
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.user_id == current_user.id,
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
)
)
)
).scalar()
referral_link_views_window = int(views or 0)
except Exception:
window_days = 30
referral_link_views_window = 0
qualified = int(snapshot.qualified_referrals_total)
if qualified < QUALIFIED_REFERRAL_BATCH_SIZE:
next_reward_at = QUALIFIED_REFERRAL_BATCH_SIZE
else:
remainder = qualified % QUALIFIED_REFERRAL_BATCH_SIZE
next_reward_at = qualified + (QUALIFIED_REFERRAL_BATCH_SIZE - remainder) if remainder else qualified + QUALIFIED_REFERRAL_BATCH_SIZE
return ReferralLinkResponse(
invite_code=current_user.invite_code,
url=url,
stats=ReferralStats(
window_days=int(window_days),
referred_users_total=int(snapshot.referred_users_total),
qualified_referrals_total=qualified,
referral_link_views_window=int(referral_link_views_window),
bonus_domains=int(snapshot.bonus_domains),
next_reward_at=int(next_reward_at),
badge=compute_badge(qualified),
cooldown_days=int(getattr(snapshot, "cooldown_days", 7) or 7),
disqualified_cooldown_total=int(getattr(snapshot, "disqualified_cooldown_total", 0) or 0),
disqualified_missing_ip_total=int(getattr(snapshot, "disqualified_missing_ip_total", 0) or 0),
disqualified_shared_ip_total=int(getattr(snapshot, "disqualified_shared_ip_total", 0) or 0),
disqualified_duplicate_ip_total=int(getattr(snapshot, "disqualified_duplicate_ip_total", 0) or 0),
),
)
@router.post("/login", response_model=LoginResponse)
@limiter.limit("10/minute")
async def login(request: Request, user_data: UserLogin, db: Database, response: Response):
@router.post("/login", response_model=Token)
async def login(user_data: UserLogin, db: Database):
"""
Authenticate user and return JWT token.
@ -410,23 +204,12 @@ async def login(request: Request, user_data: UserLogin, db: Database, response:
expires_delta=access_token_expires,
)
# Set HttpOnly cookie (preferred for browser clients)
set_auth_cookie(
response=response,
token=access_token,
max_age_seconds=settings.access_token_expire_minutes * 60,
return Token(
access_token=access_token,
token_type="bearer",
expires_in=settings.access_token_expire_minutes * 60,
)
# Do NOT return the token in the response body (prevents leaks via logs/JS storage)
return LoginResponse(expires_in=settings.access_token_expire_minutes * 60)
@router.post("/logout", response_model=MessageResponse)
async def logout(response: Response):
"""Clear auth cookie."""
clear_auth_cookie(response)
return MessageResponse(message="Logged out")
@router.get("/me", response_model=UserResponse)
async def get_current_user_info(current_user: CurrentUser):
@ -451,10 +234,8 @@ async def update_current_user(
@router.post("/forgot-password", response_model=MessageResponse)
@limiter.limit("3/minute")
async def forgot_password(
request: Request,
payload: ForgotPasswordRequest,
request: ForgotPasswordRequest,
db: Database,
background_tasks: BackgroundTasks,
):
@ -469,7 +250,9 @@ async def forgot_password(
success_message = "If an account with this email exists, a password reset link has been sent."
# Look up user
result = await db.execute(select(User).where(User.email == payload.email.lower()))
result = await db.execute(
select(User).where(User.email == request.email.lower())
)
user = result.scalar_one_or_none()
if not user:
@ -483,8 +266,8 @@ async def forgot_password(
await db.commit()
# Send reset email in background
if email_service.is_configured():
site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
if email_service.is_configured:
site_url = os.getenv("SITE_URL", "http://localhost:3000")
reset_url = f"{site_url}/reset-password?token={reset_token}"
background_tasks.add_task(
@ -582,10 +365,8 @@ async def verify_email(
@router.post("/resend-verification", response_model=MessageResponse)
@limiter.limit("3/minute")
async def resend_verification(
request: Request,
payload: ForgotPasswordRequest, # Reuse schema - just needs email
request: ForgotPasswordRequest, # Reuse schema - just needs email
db: Database,
background_tasks: BackgroundTasks,
):
@ -599,7 +380,7 @@ async def resend_verification(
# Look up user
result = await db.execute(
select(User).where(User.email == payload.email.lower())
select(User).where(User.email == request.email.lower())
)
user = result.scalar_one_or_none()
@ -613,8 +394,8 @@ async def resend_verification(
await db.commit()
# Send verification email
if email_service.is_configured():
site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
if email_service.is_configured:
site_url = os.getenv("SITE_URL", "http://localhost:3000")
verify_url = f"{site_url}/verify-email?token={verification_token}"
background_tasks.add_task(

View File

@ -15,7 +15,6 @@ from sqlalchemy.orm import selectinload
from app.api.deps import Database, get_current_user, get_current_user_optional
from app.models.user import User
from app.models.blog import BlogPost
from app.services.html_sanitizer import sanitize_html
router = APIRouter()
@ -195,39 +194,7 @@ async def get_blog_post(
post.view_count += 1
await db.commit()
data = post.to_dict(include_content=True)
data["content"] = sanitize_html(data.get("content") or "")
return data
@router.get("/posts/{slug}/meta")
async def get_blog_post_meta(
slug: str,
db: Database,
):
"""
Get blog post metadata by slug (public).
IMPORTANT: This endpoint does NOT increment view_count.
It's intended for SEO metadata generation (generateMetadata, JSON-LD).
"""
result = await db.execute(
select(BlogPost)
.options(selectinload(BlogPost.author))
.where(
BlogPost.slug == slug,
BlogPost.is_published == True,
)
)
post = result.scalar_one_or_none()
if not post:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Blog post not found",
)
return post.to_dict(include_content=False)
return post.to_dict(include_content=True)
# ============== Admin Endpoints ==============
@ -288,7 +255,7 @@ async def create_blog_post(
post = BlogPost(
title=data.title,
slug=slug,
content=sanitize_html(data.content),
content=data.content,
excerpt=data.excerpt,
cover_image=data.cover_image,
category=data.category,
@ -355,7 +322,7 @@ async def update_blog_post(
# Optionally update slug if title changes
# post.slug = generate_slug(data.title)
if data.content is not None:
post.content = sanitize_html(data.content)
post.content = data.content
if data.excerpt is not None:
post.excerpt = data.excerpt
if data.cover_image is not None:

View File

@ -1,197 +0,0 @@
"""CFO (Management) endpoints."""
from __future__ import annotations
from datetime import datetime, timedelta, timezone
from fastapi import APIRouter, Depends, HTTPException, Request, status
from slowapi import Limiter
from slowapi.util import get_remote_address
from sqlalchemy import and_, case, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user
from app.database import get_db
from app.models.portfolio import PortfolioDomain
from app.models.user import User
from app.models.yield_domain import YieldDomain, YieldTransaction
from app.schemas.cfo import (
CfoKillListRow,
CfoMonthlyBucket,
CfoSummaryResponse,
CfoUpcomingCostRow,
SetToDropResponse,
)
from app.services.analyze.renewal_cost import get_tld_price_snapshot
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
def _month_key(dt: datetime) -> str:
return f"{dt.year:04d}-{dt.month:02d}"
async def _estimate_renewal_cost_usd(db: AsyncSession, domain: str) -> tuple[float | None, str]:
# If the user stored renewal_cost, we treat it as the source of truth.
# Else we estimate using our own collected `tld_prices` DB.
tld = domain.split(".")[-1].lower()
snap = await get_tld_price_snapshot(db, tld)
if snap.min_renew_usd is None:
return None, "unknown"
return float(snap.min_renew_usd), "tld_prices"
@router.get("/summary", response_model=CfoSummaryResponse)
@limiter.limit("30/minute")
async def cfo_summary(
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
CFO dashboard summary:
- Burn rate timeline (renewal costs)
- Upcoming costs (30d)
- Kill list (renewal soon + no yield signals)
"""
now = _utcnow()
now_naive = now.replace(tzinfo=None)
domains = (
await db.execute(select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id))
).scalars().all()
# Yield stats (last 60d) by domain
since_60d = now_naive - timedelta(days=60)
yd_rows = (
await db.execute(
select(
YieldDomain.domain,
func.coalesce(func.sum(YieldTransaction.net_amount), 0).label("net_sum"),
func.coalesce(func.sum(case((YieldTransaction.event_type == "click", 1), else_=0)), 0).label("clicks"),
)
.join(
YieldTransaction,
and_(YieldTransaction.yield_domain_id == YieldDomain.id, YieldTransaction.created_at >= since_60d),
isouter=True,
)
.where(YieldDomain.user_id == current_user.id)
.group_by(YieldDomain.domain)
)
).all()
yield_by_domain = {str(d).lower(): {"net": float(n or 0), "clicks": int(c or 0)} for d, n, c in yd_rows}
# Monthly buckets next 12 months
buckets: dict[str, CfoMonthlyBucket] = {}
for i in range(0, 12):
d = (now + timedelta(days=30 * i)).replace(day=1)
buckets[_month_key(d)] = CfoMonthlyBucket(month=_month_key(d), total_cost_usd=0.0, domains=0)
upcoming_rows: list[CfoUpcomingCostRow] = []
kill_list: list[CfoKillListRow] = []
cutoff_30d = now_naive + timedelta(days=30)
for pd in domains:
if pd.is_sold:
continue
renewal_dt = pd.renewal_date
if not renewal_dt:
continue
if renewal_dt.tzinfo is not None:
renewal_dt_naive = renewal_dt.astimezone(timezone.utc).replace(tzinfo=None)
else:
renewal_dt_naive = renewal_dt
# cost source: portfolio overrides
if pd.renewal_cost is not None:
cost = float(pd.renewal_cost)
source = "portfolio"
else:
cost, source = await _estimate_renewal_cost_usd(db, pd.domain)
# Monthly burn timeline
month = _month_key(renewal_dt_naive)
if month not in buckets:
buckets[month] = CfoMonthlyBucket(month=month, total_cost_usd=0.0, domains=0)
if cost is not None:
buckets[month].total_cost_usd = float(buckets[month].total_cost_usd) + float(cost)
buckets[month].domains = int(buckets[month].domains) + 1
# Upcoming 30d
if now_naive <= renewal_dt_naive <= cutoff_30d:
upcoming_rows.append(
CfoUpcomingCostRow(
domain_id=pd.id,
domain=pd.domain,
renewal_date=renewal_dt,
renewal_cost_usd=cost,
cost_source=source,
is_sold=bool(pd.is_sold),
)
)
y = yield_by_domain.get(pd.domain.lower(), {"net": 0.0, "clicks": 0})
if float(y["net"]) <= 0.0 and int(y["clicks"]) <= 0:
kill_list.append(
CfoKillListRow(
domain_id=pd.id,
domain=pd.domain,
renewal_date=renewal_dt,
renewal_cost_usd=cost,
cost_source=source,
auto_renew=bool(pd.auto_renew),
is_dns_verified=bool(getattr(pd, "is_dns_verified", False) or False),
yield_net_60d=float(y["net"]),
yield_clicks_60d=int(y["clicks"]),
reason="No yield signals tracked in the last 60 days and renewal is due within 30 days.",
)
)
# Sort rows
upcoming_rows.sort(key=lambda r: (r.renewal_date or now_naive))
kill_list.sort(key=lambda r: (r.renewal_date or now_naive))
upcoming_total = sum((r.renewal_cost_usd or 0) for r in upcoming_rows)
monthly_sorted = [buckets[k] for k in sorted(buckets.keys())]
return CfoSummaryResponse(
computed_at=now,
upcoming_30d_total_usd=float(round(upcoming_total, 2)),
upcoming_30d_rows=upcoming_rows,
monthly=monthly_sorted,
kill_list=kill_list[:50],
)
@router.post("/domains/{domain_id}/set-to-drop", response_model=SetToDropResponse)
@limiter.limit("30/minute")
async def set_to_drop(
request: Request,
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Mark portfolio domain as 'to drop' by turning off local auto-renew flag.
(We cannot disable auto-renew at the registrar automatically.)
"""
pd = (
await db.execute(
select(PortfolioDomain).where(and_(PortfolioDomain.id == domain_id, PortfolioDomain.user_id == current_user.id))
)
).scalar_one_or_none()
if not pd:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Portfolio domain not found")
pd.auto_renew = False
pd.updated_at = datetime.utcnow()
await db.commit()
return SetToDropResponse(domain_id=pd.id, auto_renew=bool(pd.auto_renew), updated_at=pd.updated_at.replace(tzinfo=timezone.utc))

View File

@ -16,12 +16,10 @@ from datetime import datetime
from typing import Optional
from fastapi import APIRouter, HTTPException, status, BackgroundTasks, Request
from fastapi.responses import HTMLResponse
from pydantic import BaseModel, EmailStr, Field
from sqlalchemy import select, delete
from slowapi import Limiter
from slowapi.util import get_remote_address
from urllib.parse import urlencode
from app.api.deps import Database
from app.services.email_service import email_service
@ -34,11 +32,6 @@ router = APIRouter()
# Rate limiter for contact endpoints
limiter = Limiter(key_func=get_remote_address)
def _build_unsubscribe_url(email: str, token: str) -> str:
base = os.getenv("SITE_URL", "https://pounce.ch").rstrip("/")
query = urlencode({"email": email, "token": token})
return f"{base}/api/v1/contact/newsletter/unsubscribe?{query}"
# ============== Schemas ==============
@ -146,7 +139,6 @@ async def subscribe_newsletter(
background_tasks.add_task(
email_service.send_newsletter_welcome,
to_email=email_lower,
unsubscribe_url=_build_unsubscribe_url(email_lower, existing.unsubscribe_token),
)
return MessageResponse(
@ -168,7 +160,6 @@ async def subscribe_newsletter(
background_tasks.add_task(
email_service.send_newsletter_welcome,
to_email=email_lower,
unsubscribe_url=_build_unsubscribe_url(email_lower, subscriber.unsubscribe_token),
)
logger.info(f"Newsletter subscription: {email_lower}")
@ -225,50 +216,6 @@ async def unsubscribe_newsletter(
)
@router.get("/newsletter/unsubscribe")
async def unsubscribe_newsletter_one_click(
email: EmailStr,
token: str,
db: Database,
):
"""
One-click unsubscribe endpoint (for List-Unsubscribe header).
Always returns 200 with a human-readable HTML response.
"""
email_lower = email.lower()
result = await db.execute(
select(NewsletterSubscriber).where(
NewsletterSubscriber.email == email_lower,
NewsletterSubscriber.unsubscribe_token == token,
)
)
subscriber = result.scalar_one_or_none()
if subscriber and subscriber.is_active:
subscriber.is_active = False
subscriber.unsubscribed_at = datetime.utcnow()
await db.commit()
return HTMLResponse(
content="""
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<title>Unsubscribed</title>
</head>
<body style="font-family: system-ui, -apple-system, Segoe UI, Roboto, Helvetica, Arial, sans-serif; padding: 32px;">
<h1 style="margin: 0 0 12px 0;">You are unsubscribed.</h1>
<p style="margin: 0; color: #555;">
If you were subscribed, you will no longer receive pounce insights emails.
</p>
</body>
</html>
""".strip(),
status_code=200,
)
@router.get("/newsletter/status")
async def check_newsletter_status(
email: EmailStr,

View File

@ -1,105 +0,0 @@
"""Dashboard summary endpoints (reduce frontend API round-trips)."""
from datetime import datetime, timedelta
from fastapi import APIRouter, Depends
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user
from app.database import get_db
from app.models.auction import DomainAuction
from app.models.listing import DomainListing, ListingStatus
from app.models.user import User
# Reuse helpers for consistent formatting
from app.api.auctions import _format_time_remaining, _get_affiliate_url
from app.api.tld_prices import get_trending_tlds
router = APIRouter()
@router.get("/summary")
async def get_dashboard_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Return a compact dashboard payload used by `/terminal/radar`.
Goal: 1 request instead of multiple heavy round-trips.
"""
now = datetime.utcnow()
# -------------------------
# Market stats + preview
# -------------------------
active_auctions_filter = and_(DomainAuction.is_active == True, DomainAuction.end_time > now)
total_auctions = (await db.execute(select(func.count(DomainAuction.id)).where(active_auctions_filter))).scalar() or 0
cutoff = now + timedelta(hours=24)
ending_soon_filter = and_(
DomainAuction.is_active == True,
DomainAuction.end_time > now,
DomainAuction.end_time <= cutoff,
)
ending_soon_count = (await db.execute(select(func.count(DomainAuction.id)).where(ending_soon_filter))).scalar() or 0
ending_soon = (
await db.execute(
select(DomainAuction)
.where(ending_soon_filter)
.order_by(DomainAuction.end_time.asc())
.limit(5)
)
).scalars().all()
ending_soon_preview = [
{
"domain": a.domain,
"current_bid": a.current_bid,
"time_remaining": _format_time_remaining(a.end_time, now=now),
"platform": a.platform,
"affiliate_url": _get_affiliate_url(a.platform, a.domain, a.auction_url),
}
for a in ending_soon
]
# -------------------------
# Listings stats (user)
# -------------------------
listing_counts = (
await db.execute(
select(DomainListing.status, func.count(DomainListing.id))
.where(DomainListing.user_id == current_user.id)
.group_by(DomainListing.status)
)
).all()
by_status = {status: int(count) for status, count in listing_counts}
listing_stats = {
"active": by_status.get(ListingStatus.ACTIVE.value, 0),
"sold": by_status.get(ListingStatus.SOLD.value, 0),
"draft": by_status.get(ListingStatus.DRAFT.value, 0),
"total": sum(by_status.values()),
}
# -------------------------
# Trending TLDs (public data)
# -------------------------
trending = await get_trending_tlds(db)
return {
"market": {
"total_auctions": total_auctions,
"ending_soon": ending_soon_count,
"ending_soon_preview": ending_soon_preview,
},
"listings": listing_stats,
"tlds": trending,
"timestamp": now.isoformat(),
}

View File

@ -1,22 +1,21 @@
"""API dependencies."""
from typing import Annotated, Optional
from fastapi import Depends, HTTPException, Request, status
from fastapi import Depends, HTTPException, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.services.auth import AuthService
from app.models.user import User
from app.security import AUTH_COOKIE_NAME
# Security scheme
security = HTTPBearer()
security_optional = HTTPBearer(auto_error=False)
async def get_current_user(
request: Request,
credentials: Annotated[Optional[HTTPAuthorizationCredentials], Depends(security_optional)],
credentials: Annotated[HTTPAuthorizationCredentials, Depends(security)],
db: Annotated[AsyncSession, Depends(get_db)],
) -> User:
"""Get current authenticated user from JWT token."""
@ -26,15 +25,7 @@ async def get_current_user(
headers={"WWW-Authenticate": "Bearer"},
)
token: Optional[str] = None
if credentials is not None:
token = credentials.credentials
if not token:
token = request.cookies.get(AUTH_COOKIE_NAME)
if not token:
raise credentials_exception
token = credentials.credentials
payload = AuthService.decode_token(token)
if payload is None:
@ -76,7 +67,6 @@ async def get_current_active_user(
async def get_current_user_optional(
request: Request,
credentials: Annotated[Optional[HTTPAuthorizationCredentials], Depends(security_optional)],
db: Annotated[AsyncSession, Depends(get_db)],
) -> Optional[User]:
@ -85,15 +75,10 @@ async def get_current_user_optional(
This allows endpoints to work for both authenticated and anonymous users,
potentially showing different content based on auth status.
"""
token: Optional[str] = None
if credentials is not None:
token = credentials.credentials
if not token:
token = request.cookies.get(AUTH_COOKIE_NAME)
if not token:
if credentials is None:
return None
token = credentials.credentials
payload = AuthService.decode_token(token)
if payload is None:

View File

@ -1,14 +1,13 @@
"""Domain management API (requires authentication)."""
import json
from datetime import datetime
from math import ceil
from fastapi import APIRouter, HTTPException, status, Query
from pydantic import BaseModel
from sqlalchemy import select, func, and_
from sqlalchemy import select, func
from app.api.deps import Database, CurrentUser
from app.models.domain import Domain, DomainCheck, DomainStatus, DomainHealthCache
from app.models.domain import Domain, DomainCheck, DomainStatus
from app.models.subscription import TIER_CONFIG, SubscriptionTier
from app.schemas.domain import DomainCreate, DomainResponse, DomainListResponse
from app.services.domain_checker import domain_checker
@ -16,38 +15,6 @@ from app.services.domain_health import get_health_checker, HealthStatus
router = APIRouter()
def _safe_json_loads(value: str | None, default):
if not value:
return default
try:
return json.loads(value)
except Exception:
return default
def _health_cache_to_report(domain: Domain, cache: DomainHealthCache) -> dict:
"""Convert DomainHealthCache row into the same shape as DomainHealthReport.to_dict()."""
return {
"domain": domain.name,
"status": cache.status or "unknown",
"score": cache.score or 0,
"signals": _safe_json_loads(cache.signals, []),
"recommendations": [], # not stored in cache (yet)
"checked_at": cache.checked_at.isoformat() if cache.checked_at else datetime.utcnow().isoformat(),
"dns": _safe_json_loads(
cache.dns_data,
{"has_ns": False, "has_a": False, "has_mx": False, "nameservers": [], "is_parked": False, "error": None},
),
"http": _safe_json_loads(
cache.http_data,
{"is_reachable": False, "status_code": None, "is_parked": False, "parking_keywords": [], "content_length": 0, "error": None},
),
"ssl": _safe_json_loads(
cache.ssl_data,
{"has_certificate": False, "is_valid": False, "expires_at": None, "days_until_expiry": None, "issuer": None, "error": None},
),
}
@router.get("", response_model=DomainListResponse)
async def list_domains(
@ -82,40 +49,6 @@ async def list_domains(
)
@router.get("/health-cache")
async def get_domains_health_cache(
current_user: CurrentUser,
db: Database,
):
"""
Get cached domain health reports for the current user (bulk).
This avoids N requests from the frontend and returns the cached health
data written by the scheduler job.
"""
result = await db.execute(
select(Domain, DomainHealthCache)
.outerjoin(DomainHealthCache, DomainHealthCache.domain_id == Domain.id)
.where(Domain.user_id == current_user.id)
)
rows = result.all()
reports: dict[str, dict] = {}
cached = 0
for domain, cache in rows:
if cache is None:
continue
reports[str(domain.id)] = _health_cache_to_report(domain, cache)
cached += 1
return {
"reports": reports,
"total_domains": len(rows),
"cached_domains": cached,
"timestamp": datetime.utcnow().isoformat(),
}
@router.post("", response_model=DomainResponse, status_code=status.HTTP_201_CREATED)
async def add_domain(
domain_data: DomainCreate,
@ -127,7 +60,7 @@ async def add_domain(
await db.refresh(current_user, ["subscription", "domains"])
if current_user.subscription:
limit = current_user.subscription.domain_limit
limit = current_user.subscription.max_domains
else:
limit = TIER_CONFIG[SubscriptionTier.SCOUT]["domain_limit"]
@ -315,59 +248,6 @@ async def update_notification_settings(
return domain
@router.patch("/{domain_id}/expiry", response_model=DomainResponse)
async def update_expiration_date(
domain_id: int,
data: dict,
current_user: CurrentUser,
db: Database,
):
"""
Manually set the expiration date for a domain.
Useful for TLDs like .ch, .de that don't expose expiration via public WHOIS/RDAP.
The date can be found in your registrar's control panel.
"""
from datetime import datetime
result = await db.execute(
select(Domain).where(
Domain.id == domain_id,
Domain.user_id == current_user.id,
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found",
)
# Parse and set expiration date
expiration_str = data.get('expiration_date')
if expiration_str:
try:
if isinstance(expiration_str, str):
# Parse ISO format
expiration_str = expiration_str.replace('Z', '+00:00')
domain.expiration_date = datetime.fromisoformat(expiration_str)
else:
domain.expiration_date = expiration_str
except Exception as e:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=f"Invalid date format: {e}",
)
else:
domain.expiration_date = None
await db.commit()
await db.refresh(domain)
return domain
@router.get("/{domain_id}/history")
async def get_domain_history(
domain_id: int,
@ -439,7 +319,6 @@ async def get_domain_health(
domain_id: int,
current_user: CurrentUser,
db: Database,
refresh: bool = Query(False, description="Force a live health check instead of using cache"),
):
"""
Get comprehensive health report for a domain.
@ -468,44 +347,11 @@ async def get_domain_health(
detail="Domain not found",
)
# Prefer cached report for UI performance
if not refresh:
cache_result = await db.execute(
select(DomainHealthCache).where(DomainHealthCache.domain_id == domain.id)
)
cache = cache_result.scalar_one_or_none()
if cache is not None:
return _health_cache_to_report(domain, cache)
# Live health check (slow) + update cache
# Run health check
health_checker = get_health_checker()
report = await health_checker.check_domain(domain.name)
report_dict = report.to_dict()
signals_json = json.dumps(report_dict.get("signals") or [])
dns_json = json.dumps(report_dict.get("dns") or {})
http_json = json.dumps(report_dict.get("http") or {})
ssl_json = json.dumps(report_dict.get("ssl") or {})
cache_result = await db.execute(
select(DomainHealthCache).where(DomainHealthCache.domain_id == domain.id)
)
cache = cache_result.scalar_one_or_none()
if cache is None:
cache = DomainHealthCache(domain_id=domain.id)
db.add(cache)
cache.status = report_dict.get("status") or "unknown"
cache.score = int(report_dict.get("score") or 0)
cache.signals = signals_json
cache.dns_data = dns_json
cache.http_data = http_json
cache.ssl_data = ssl_json
cache.checked_at = datetime.utcnow()
await db.commit()
return report_dict
return report.to_dict()
@router.post("/health-check")

View File

@ -1,177 +0,0 @@
"""
Drops API - Zone File Analysis Endpoints
=========================================
API endpoints for accessing freshly dropped domains from:
- Switch.ch zone files (.ch, .li)
- ICANN CZDS zone files (.com, .net, .org, .xyz, .info, .dev, .app, .online)
"""
from datetime import datetime
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.api.deps import get_current_user
from app.services.zone_file import (
ZoneFileService,
get_dropped_domains,
get_zone_stats,
)
router = APIRouter(prefix="/drops", tags=["drops"])
# All supported TLDs
SWITCH_TLDS = ["ch", "li"]
CZDS_TLDS = ["xyz", "org", "online", "info", "dev", "app"] # Approved
CZDS_PENDING = ["com", "net", "club", "biz"] # Pending approval
ALL_TLDS = SWITCH_TLDS + CZDS_TLDS
# ============================================================================
# PUBLIC ENDPOINTS (for stats)
# ============================================================================
@router.get("/stats")
async def api_get_zone_stats(
db: AsyncSession = Depends(get_db)
):
"""
Get zone file statistics.
Returns domain counts and last sync times for .ch and .li.
"""
try:
stats = await get_zone_stats(db)
return stats
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
# ============================================================================
# AUTHENTICATED ENDPOINTS
# ============================================================================
@router.get("")
async def api_get_drops(
tld: Optional[str] = Query(None, description="Filter by TLD"),
hours: int = Query(24, ge=1, le=48, description="Hours to look back (max 48h, we only store 48h)"),
min_length: Optional[int] = Query(None, ge=1, le=63, description="Minimum domain length"),
max_length: Optional[int] = Query(None, ge=1, le=63, description="Maximum domain length"),
exclude_numeric: bool = Query(False, description="Exclude numeric-only domains"),
exclude_hyphen: bool = Query(False, description="Exclude domains with hyphens"),
keyword: Optional[str] = Query(None, description="Search keyword"),
limit: int = Query(50, ge=1, le=200, description="Results per page"),
offset: int = Query(0, ge=0, description="Offset for pagination"),
db: AsyncSession = Depends(get_db),
current_user = Depends(get_current_user)
):
"""
Get recently dropped domains from zone files.
Supports:
- Switch.ch zones: .ch, .li
- ICANN CZDS zones: .xyz, .org, .online, .info, .dev, .app
Domains are detected by comparing daily zone file snapshots.
Only available for authenticated users.
"""
if tld and tld not in ALL_TLDS:
raise HTTPException(
status_code=400,
detail=f"Unsupported TLD. Supported: {', '.join(ALL_TLDS)}"
)
try:
result = await get_dropped_domains(
db=db,
tld=tld,
hours=hours,
min_length=min_length,
max_length=max_length,
exclude_numeric=exclude_numeric,
exclude_hyphen=exclude_hyphen,
keyword=keyword,
limit=limit,
offset=offset
)
return result
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.post("/sync/{tld}")
async def api_trigger_sync(
tld: str,
background_tasks: BackgroundTasks,
db: AsyncSession = Depends(get_db),
current_user = Depends(get_current_user)
):
"""
Trigger a manual zone file sync for a specific TLD.
Only available for admin users.
This is normally run automatically by the scheduler.
"""
# Check if user is admin
if not getattr(current_user, 'is_admin', False):
raise HTTPException(status_code=403, detail="Admin access required")
if tld not in ALL_TLDS:
raise HTTPException(
status_code=400,
detail=f"Unsupported TLD. Supported: {', '.join(ALL_TLDS)}"
)
async def run_sync():
from app.database import AsyncSessionLocal
async with AsyncSessionLocal() as session:
try:
if tld in SWITCH_TLDS:
# Use Switch.ch zone transfer
service = ZoneFileService()
await service.run_daily_sync(session, tld)
else:
# Use ICANN CZDS
from app.services.czds_client import CZDSClient
client = CZDSClient()
await client.sync_zone(session, tld)
except Exception as e:
print(f"Zone sync failed for .{tld}: {e}")
background_tasks.add_task(run_sync)
return {"status": "sync_started", "tld": tld}
# ============================================================================
# HELPER ENDPOINTS
# ============================================================================
@router.get("/tlds")
async def api_get_supported_tlds():
"""
Get list of supported TLDs for zone file analysis.
"""
return {
"tlds": [
# Switch.ch zones
{"tld": "ch", "name": "Switzerland", "flag": "🇨🇭", "registry": "Switch", "source": "switch"},
{"tld": "li", "name": "Liechtenstein", "flag": "🇱🇮", "registry": "Switch", "source": "switch"},
# ICANN CZDS zones (approved)
{"tld": "xyz", "name": "XYZ", "flag": "🌐", "registry": "XYZ.COM LLC", "source": "czds"},
{"tld": "org", "name": "Organization", "flag": "🏛️", "registry": "PIR", "source": "czds"},
{"tld": "online", "name": "Online", "flag": "💻", "registry": "Radix", "source": "czds"},
{"tld": "info", "name": "Information", "flag": "", "registry": "Afilias", "source": "czds"},
{"tld": "dev", "name": "Developer", "flag": "👨‍💻", "registry": "Google", "source": "czds"},
{"tld": "app", "name": "Application", "flag": "📱", "registry": "Google", "source": "czds"},
],
"pending": [
# CZDS pending approval
{"tld": "com", "name": "Commercial", "flag": "🏢", "registry": "Verisign", "source": "czds"},
{"tld": "net", "name": "Network", "flag": "🌐", "registry": "Verisign", "source": "czds"},
{"tld": "club", "name": "Club", "flag": "🎉", "registry": "GoDaddy", "source": "czds"},
{"tld": "biz", "name": "Business", "flag": "💼", "registry": "GoDaddy", "source": "czds"},
]
}

View File

@ -1,247 +0,0 @@
"""HUNT (Discovery) endpoints."""
from __future__ import annotations
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, Query, Request
from slowapi import Limiter
from slowapi.util import get_remote_address
from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user
from app.database import get_db
from app.models.auction import DomainAuction
from app.models.user import User
from app.schemas.hunt import (
BrandableRequest,
BrandableCandidate,
BrandableResponse,
HuntSniperItem,
HuntSniperResponse,
KeywordAvailabilityRequest,
KeywordAvailabilityResponse,
KeywordAvailabilityRow,
TrendsResponse,
TrendItem,
TypoCheckRequest,
TypoCheckResponse,
TypoCandidate,
)
from app.services.domain_checker import domain_checker
from app.services.hunt.brandables import check_domains, generate_cvcvc, generate_cvccv, generate_human
from app.services.hunt.trends import fetch_google_trends_daily_rss
from app.services.hunt.typos import generate_typos
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
@router.get("/bargain-bin", response_model=HuntSniperResponse)
@limiter.limit("60/minute")
async def bargain_bin(
request: Request,
_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
limit: int = Query(100, ge=1, le=500),
):
"""
Closeout Sniper (Chris logic):
price < $10 AND age_years >= 5 AND backlinks > 0
Uses ONLY real scraped auction data (DomainAuction.age_years/backlinks).
Items without required fields are excluded.
"""
now = _utcnow().replace(tzinfo=None)
base = and_(DomainAuction.is_active == True, DomainAuction.end_time > now) # noqa: E712
rows = (
await db.execute(
select(DomainAuction)
.where(base)
.where(DomainAuction.current_bid < 10)
.order_by(DomainAuction.end_time.asc())
.limit(limit * 3) # allow filtering
)
).scalars().all()
filtered_out = 0
items: list[HuntSniperItem] = []
for a in rows:
if a.age_years is None or a.backlinks is None:
filtered_out += 1
continue
if int(a.age_years) < 5 or int(a.backlinks) <= 0:
continue
items.append(
HuntSniperItem(
domain=a.domain,
platform=a.platform,
auction_url=a.auction_url,
current_bid=float(a.current_bid),
currency=a.currency,
end_time=a.end_time.replace(tzinfo=timezone.utc) if a.end_time and a.end_time.tzinfo is None else a.end_time,
age_years=int(a.age_years) if a.age_years is not None else None,
backlinks=int(a.backlinks) if a.backlinks is not None else None,
pounce_score=int(a.pounce_score) if a.pounce_score is not None else None,
)
)
if len(items) >= limit:
break
last_updated = (
await db.execute(select(func.max(DomainAuction.updated_at)).where(DomainAuction.is_active == True)) # noqa: E712
).scalar()
return HuntSniperResponse(
items=items,
total=len(items),
filtered_out_missing_data=int(filtered_out),
last_updated=last_updated.replace(tzinfo=timezone.utc) if last_updated and last_updated.tzinfo is None else last_updated,
)
@router.get("/trends", response_model=TrendsResponse)
@limiter.limit("30/minute")
async def trends(
request: Request,
_user: User = Depends(get_current_user),
geo: str = Query("US", min_length=2, max_length=2),
):
try:
items_raw = await fetch_google_trends_daily_rss(geo=geo)
except Exception:
# Don't 500 the whole UI when the public feed is temporarily unavailable.
raise HTTPException(status_code=502, detail="Google Trends feed unavailable")
items = [
TrendItem(
title=i["title"],
approx_traffic=i.get("approx_traffic"),
published_at=i.get("published_at"),
link=i.get("link"),
)
for i in items_raw[:50]
]
return TrendsResponse(geo=geo.upper(), items=items, fetched_at=_utcnow())
@router.post("/keywords", response_model=KeywordAvailabilityResponse)
@limiter.limit("30/minute")
async def keyword_availability(
request: Request,
payload: KeywordAvailabilityRequest,
_user: User = Depends(get_current_user),
):
# Normalize + cap work for UX/perf
keywords = []
for kw in payload.keywords[:25]:
k = kw.strip().lower().replace(" ", "")
if k:
keywords.append(kw)
tlds = [t.lower().lstrip(".") for t in payload.tlds[:20] if t.strip()]
if not tlds:
tlds = ["com"]
# Build candidate domains
candidates: list[tuple[str, str, str]] = []
domain_list: list[str] = []
for kw in keywords:
k = kw.strip().lower().replace(" ", "")
if not k:
continue
for t in tlds:
d = f"{k}.{t}"
candidates.append((kw, t, d))
domain_list.append(d)
checked = await check_domains(domain_list, concurrency=40)
by_domain = {c.domain: c for c in checked}
rows: list[KeywordAvailabilityRow] = []
for kw, t, d in candidates:
c = by_domain.get(d)
if not c:
rows.append(KeywordAvailabilityRow(keyword=kw, domain=d, tld=t, is_available=None, status="unknown"))
else:
rows.append(KeywordAvailabilityRow(keyword=kw, domain=d, tld=t, is_available=c.is_available, status=c.status))
return KeywordAvailabilityResponse(items=rows)
@router.post("/typos", response_model=TypoCheckResponse)
@limiter.limit("20/minute")
async def typo_check(
request: Request,
payload: TypoCheckRequest,
_user: User = Depends(get_current_user),
):
brand = payload.brand.strip()
typos = generate_typos(brand, limit=min(int(payload.limit) * 4, 400))
# Build domain list (dedup)
tlds = [t.lower().lstrip(".") for t in payload.tlds if t.strip()]
candidates: list[str] = []
seen = set()
for typo in typos:
for t in tlds:
d = f"{typo}.{t}"
if d not in seen:
candidates.append(d)
seen.add(d)
if len(candidates) >= payload.limit * 4:
break
if len(candidates) >= payload.limit * 4:
break
checked = await check_domains(candidates, concurrency=30)
available = [c for c in checked if c.status == "available"]
items = [TypoCandidate(domain=c.domain, is_available=c.is_available, status=c.status) for c in available[: payload.limit]]
return TypoCheckResponse(brand=brand, items=items)
@router.post("/brandables", response_model=BrandableResponse)
@limiter.limit("15/minute")
async def brandables(
request: Request,
payload: BrandableRequest,
_user: User = Depends(get_current_user),
):
pattern = payload.pattern.strip().lower()
if pattern not in ("cvcvc", "cvccv", "human"):
pattern = "cvcvc"
tlds = [t.lower().lstrip(".") for t in payload.tlds if t.strip()]
if not tlds:
tlds = ["com"]
# Generate + check up to max_checks; return only available
candidates: list[str] = []
for _ in range(int(payload.max_checks)):
if pattern == "cvcvc":
sld = generate_cvcvc()
elif pattern == "cvccv":
sld = generate_cvccv()
else:
sld = generate_human()
for t in tlds:
candidates.append(f"{sld}.{t}")
checked = await check_domains(candidates, concurrency=40)
available = [c for c in checked if c.status == "available"]
# De-dup by domain
seen = set()
out = []
for c in available:
if c.domain not in seen:
seen.add(c.domain)
out.append(BrandableCandidate(domain=c.domain, is_available=c.is_available, status=c.status))
if len(out) >= payload.limit:
break
return BrandableResponse(pattern=payload.pattern, items=out)

View File

@ -27,63 +27,12 @@ from fastapi import APIRouter, Depends, Query, HTTPException, Request
from pydantic import BaseModel, Field, EmailStr
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
from app.database import get_db
from app.api.deps import get_current_user, get_current_user_optional
from app.models.user import User
from app.models.listing import (
DomainListing,
ListingInquiry,
ListingInquiryEvent,
ListingInquiryMessage,
ListingView,
ListingStatus,
VerificationStatus,
)
from app.models.listing import DomainListing, ListingInquiry, ListingView, ListingStatus, VerificationStatus
from app.services.valuation import valuation_service
from app.services.telemetry import track_event
def _calculate_pounce_score(domain: str, is_pounce: bool = True) -> int:
"""
Calculate Pounce Score for a domain.
Uses the same algorithm as Market Feed (_calculate_pounce_score_v2 in auctions.py).
"""
# Parse domain
parts = domain.lower().rsplit(".", 1)
if len(parts) != 2:
return 50
name, tld = parts
score = 50 # Baseline
# A) LENGTH BONUS (exponential for short domains)
length_scores = {1: 50, 2: 45, 3: 40, 4: 30, 5: 20, 6: 15, 7: 10}
score += length_scores.get(len(name), max(0, 15 - len(name)))
# B) TLD PREMIUM
tld_scores = {
'com': 20, 'ai': 25, 'io': 18, 'co': 12,
'ch': 15, 'de': 10, 'net': 8, 'org': 8,
'app': 10, 'dev': 10, 'xyz': 5
}
score += tld_scores.get(tld.lower(), 0)
# C) POUNCE DIRECT BONUS (listings are always Pounce Direct)
if is_pounce:
score += 10
# D) PENALTIES
if '-' in name:
score -= 25
if any(c.isdigit() for c in name) and len(name) > 3:
score -= 20
if len(name) > 15:
score -= 15
# Clamp to 0-100
return max(0, min(100, score))
logger = logging.getLogger(__name__)
router = APIRouter()
@ -114,9 +63,6 @@ class ListingUpdate(BaseModel):
show_valuation: Optional[bool] = None
allow_offers: Optional[bool] = None
status: Optional[str] = None
sold_reason: Optional[str] = Field(None, max_length=200)
sold_price: Optional[float] = Field(None, ge=0)
sold_currency: Optional[str] = Field(None, max_length=3)
class ListingResponse(BaseModel):
@ -142,10 +88,6 @@ class ListingResponse(BaseModel):
public_url: str
created_at: datetime
published_at: Optional[datetime]
sold_at: Optional[datetime] = None
sold_reason: Optional[str] = None
sold_price: Optional[float] = None
sold_currency: Optional[str] = None
# Seller info (minimal for privacy)
seller_verified: bool = False
@ -173,7 +115,6 @@ class ListingPublicResponse(BaseModel):
# Seller trust indicators
seller_verified: bool
seller_member_since: Optional[datetime]
seller_invite_code: Optional[str] = None
class Config:
from_attributes = True
@ -201,37 +142,11 @@ class InquiryResponse(BaseModel):
status: str
created_at: datetime
read_at: Optional[datetime]
replied_at: Optional[datetime] = None
closed_at: Optional[datetime] = None
closed_reason: Optional[str] = None
buyer_user_id: Optional[int] = None
class Config:
from_attributes = True
class InquiryUpdate(BaseModel):
"""Update inquiry status for listing owner."""
status: str = Field(..., min_length=1, max_length=20) # new, read, replied, spam
reason: Optional[str] = Field(None, max_length=200)
class InquiryMessageCreate(BaseModel):
body: str = Field(..., min_length=1, max_length=4000)
class InquiryMessageResponse(BaseModel):
id: int
inquiry_id: int
listing_id: int
sender_user_id: int
body: str
created_at: datetime
class Config:
from_attributes = True
class VerificationResponse(BaseModel):
"""DNS verification response."""
verification_code: str
@ -279,14 +194,13 @@ async def browse_listings(
min_price: Optional[float] = Query(None, ge=0),
max_price: Optional[float] = Query(None, ge=0),
verified_only: bool = Query(False),
clean_only: bool = Query(True, description="Hide low-quality/spam listings"),
sort_by: str = Query("newest", enum=["newest", "price_asc", "price_desc", "popular"]),
limit: int = Query(20, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
):
"""Browse active domain listings (public)."""
query = select(DomainListing).options(selectinload(DomainListing.user)).where(
query = select(DomainListing).where(
DomainListing.status == ListingStatus.ACTIVE.value
)
@ -321,18 +235,6 @@ async def browse_listings(
responses = []
for listing in listings:
# Calculate pounce_score dynamically if not stored
pounce_score = listing.pounce_score
if pounce_score is None:
pounce_score = _calculate_pounce_score(listing.domain)
# Save it for future requests
listing.pounce_score = pounce_score
# Public cleanliness rule: don't surface low-quality inventory by default.
# (Still accessible in Terminal for authenticated power users.)
if clean_only and (pounce_score or 0) < 50:
continue
responses.append(ListingPublicResponse(
domain=listing.domain,
slug=listing.slug,
@ -341,17 +243,15 @@ async def browse_listings(
asking_price=listing.asking_price,
currency=listing.currency,
price_type=listing.price_type,
pounce_score=pounce_score, # Always return the score
pounce_score=listing.pounce_score if listing.show_valuation else None,
estimated_value=listing.estimated_value if listing.show_valuation else None,
is_verified=listing.is_verified,
allow_offers=listing.allow_offers,
public_url=listing.public_url,
seller_verified=listing.is_verified,
seller_member_since=listing.user.created_at if listing.user else None,
seller_invite_code=getattr(listing.user, "invite_code", None) if listing.user else None,
))
await db.commit() # Save any updated pounce_scores
return responses
@ -393,10 +293,6 @@ async def get_my_listings(
public_url=listing.public_url,
created_at=listing.created_at,
published_at=listing.published_at,
sold_at=getattr(listing, "sold_at", None),
sold_reason=getattr(listing, "sold_reason", None),
sold_price=getattr(listing, "sold_price", None),
sold_currency=getattr(listing, "sold_currency", None),
seller_verified=current_user.is_verified,
seller_member_since=current_user.created_at,
)
@ -415,9 +311,7 @@ async def get_listing_by_slug(
):
"""Get listing details by slug (public)."""
result = await db.execute(
select(DomainListing)
.options(selectinload(DomainListing.user))
.where(
select(DomainListing).where(
and_(
DomainListing.slug == slug,
DomainListing.status == ListingStatus.ACTIVE.value,
@ -441,26 +335,6 @@ async def get_listing_by_slug(
# Increment view count
listing.view_count += 1
await track_event(
db,
event_name="listing_view",
request=request,
user_id=current_user.id if current_user else None,
is_authenticated=bool(current_user),
source="public",
domain=listing.domain,
listing_id=listing.id,
metadata={"slug": listing.slug},
)
# Calculate pounce_score dynamically if not stored (same as Market Feed)
pounce_score = listing.pounce_score
if pounce_score is None:
pounce_score = _calculate_pounce_score(listing.domain)
# Save it for future requests
listing.pounce_score = pounce_score
await db.commit()
return ListingPublicResponse(
@ -471,14 +345,13 @@ async def get_listing_by_slug(
asking_price=listing.asking_price,
currency=listing.currency,
price_type=listing.price_type,
pounce_score=pounce_score, # Always return the score
pounce_score=listing.pounce_score if listing.show_valuation else None,
estimated_value=listing.estimated_value if listing.show_valuation else None,
is_verified=listing.is_verified,
allow_offers=listing.allow_offers,
public_url=listing.public_url,
seller_verified=listing.is_verified,
seller_member_since=listing.user.created_at if listing.user else None,
seller_invite_code=getattr(listing.user, "invite_code", None) if listing.user else None,
)
@ -487,10 +360,9 @@ async def submit_inquiry(
slug: str,
inquiry: InquiryCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Submit an inquiry for a listing (requires authentication)."""
"""Submit an inquiry for a listing (public)."""
# Find listing
result = await db.execute(
select(DomainListing).where(
@ -504,14 +376,6 @@ async def submit_inquiry(
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
# Require that inquiries are sent from the authenticated account email.
# This prevents anonymous spam and makes the buyer identity consistent.
if inquiry.email.lower() != (current_user.email or "").lower():
raise HTTPException(
status_code=400,
detail="Inquiry email must match your account email.",
)
# Security: Check for phishing keywords
if not _check_content_safety(inquiry.message):
@ -520,13 +384,13 @@ async def submit_inquiry(
detail="Message contains blocked content. Please revise."
)
# Rate limiting check (simple: max 3 inquiries per user per listing per day)
# Rate limiting check (simple: max 3 inquiries per email per listing per day)
today_start = datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
existing_count = await db.execute(
select(func.count(ListingInquiry.id)).where(
and_(
ListingInquiry.listing_id == listing.id,
ListingInquiry.buyer_user_id == current_user.id,
ListingInquiry.email == inquiry.email.lower(),
ListingInquiry.created_at >= today_start,
)
)
@ -540,7 +404,6 @@ async def submit_inquiry(
# Create inquiry
new_inquiry = ListingInquiry(
listing_id=listing.id,
buyer_user_id=current_user.id,
name=inquiry.name,
email=inquiry.email.lower(),
phone=inquiry.phone,
@ -551,64 +414,13 @@ async def submit_inquiry(
user_agent=request.headers.get("user-agent", "")[:500],
)
db.add(new_inquiry)
await db.flush()
await track_event(
db,
event_name="inquiry_created",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="public",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=new_inquiry.id,
metadata={
"offer_amount": inquiry.offer_amount,
"has_phone": bool(inquiry.phone),
"has_company": bool(inquiry.company),
},
)
# Seed thread with the initial message
db.add(
ListingInquiryMessage(
inquiry_id=new_inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=inquiry.message,
)
)
# Increment inquiry count
listing.inquiry_count += 1
await db.commit()
# Send email notification to seller
try:
from app.services.email_service import email_service
from app.models.user import User
# Get seller's email
seller_result = await db.execute(
select(User).where(User.id == listing.user_id)
)
seller = seller_result.scalar_one_or_none()
if seller and seller.email and email_service.is_configured():
await email_service.send_listing_inquiry(
to_email=seller.email,
domain=listing.domain,
name=inquiry.name,
email=inquiry.email,
message=inquiry.message,
company=inquiry.company,
offer_amount=inquiry.offer_amount,
)
logger.info(f"📧 Inquiry notification sent to {seller.email} for {listing.domain}")
except Exception as e:
logger.error(f"Failed to send inquiry notification: {e}")
# TODO: Send email notification to seller
return {
"success": True,
@ -624,45 +436,10 @@ async def create_listing(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Create a new domain listing.
SECURITY: Domain must be in user's portfolio before listing for sale.
DNS verification happens in the verification step (separate endpoint).
"""
from app.models.portfolio import PortfolioDomain
domain_lower = data.domain.lower()
# SECURITY CHECK: Domain must be in user's portfolio
portfolio_result = await db.execute(
select(PortfolioDomain).where(
PortfolioDomain.domain == domain_lower,
PortfolioDomain.user_id == current_user.id,
)
)
portfolio_domain = portfolio_result.scalar_one_or_none()
if not portfolio_domain:
raise HTTPException(
status_code=403,
detail="Domain must be in your portfolio before listing for sale. Add it to your portfolio first.",
)
# Check if domain is sold
if portfolio_domain.is_sold:
raise HTTPException(
status_code=400,
detail="Cannot list a sold domain for sale.",
)
# Check if domain is DNS verified in portfolio
# If verified in portfolio, listing inherits verification
is_portfolio_verified = getattr(portfolio_domain, 'is_dns_verified', False) or False
"""Create a new domain listing."""
# Check if domain is already listed
existing = await db.execute(
select(DomainListing).where(DomainListing.domain == domain_lower)
select(DomainListing).where(DomainListing.domain == data.domain.lower())
)
if existing.scalar_one_or_none():
raise HTTPException(status_code=400, detail="This domain is already listed")
@ -675,16 +452,10 @@ async def create_listing(
)
listing_count = user_listings.scalar() or 0
# Listing limits by tier (from pounce_pricing.md)
# Load subscription separately to avoid async lazy loading issues
from app.models.subscription import Subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == current_user.id)
)
subscription = sub_result.scalar_one_or_none()
tier = subscription.tier if subscription else "scout"
limits = {"scout": 0, "trader": 5, "tycoon": 50}
max_listings = limits.get(tier, 0)
# Listing limits by tier
tier = current_user.subscription.tier if current_user.subscription else "scout"
limits = {"scout": 2, "trader": 10, "tycoon": 50}
max_listings = limits.get(tier, 2)
if listing_count >= max_listings:
raise HTTPException(
@ -693,7 +464,7 @@ async def create_listing(
)
# Generate slug
slug = _generate_slug(domain_lower)
slug = _generate_slug(data.domain)
# Check slug uniqueness
slug_check = await db.execute(
@ -704,18 +475,17 @@ async def create_listing(
# Get valuation
try:
valuation = await valuation_service.estimate_value(domain_lower, db, save_result=False)
valuation = await valuation_service.estimate_value(data.domain, db, save_result=False)
pounce_score = min(100, int(valuation.get("score", 50)))
estimated_value = valuation.get("value", 0) # Fixed: was 'estimated_value', service returns 'value'
estimated_value = valuation.get("estimated_value", 0)
except Exception:
pounce_score = 50
estimated_value = None
# Create listing
# If portfolio domain is already DNS verified, listing is auto-verified
listing = DomainListing(
user_id=current_user.id,
domain=domain_lower,
domain=data.domain.lower(),
slug=slug,
title=data.title,
description=data.description,
@ -727,9 +497,7 @@ async def create_listing(
allow_offers=data.allow_offers,
pounce_score=pounce_score,
estimated_value=estimated_value,
verification_code=portfolio_domain.verification_code if is_portfolio_verified else _generate_verification_code(),
verification_status=VerificationStatus.VERIFIED.value if is_portfolio_verified else VerificationStatus.PENDING.value,
verified_at=portfolio_domain.verified_at if is_portfolio_verified else None,
verification_code=_generate_verification_code(),
status=ListingStatus.DRAFT.value,
)
@ -804,387 +572,15 @@ async def get_listing_inquiries(
status=inq.status,
created_at=inq.created_at,
read_at=inq.read_at,
replied_at=getattr(inq, "replied_at", None),
closed_at=getattr(inq, "closed_at", None),
closed_reason=getattr(inq, "closed_reason", None),
buyer_user_id=getattr(inq, "buyer_user_id", None),
)
for inq in inquiries
]
@router.patch("/{id}/inquiries/{inquiry_id}", response_model=InquiryResponse)
async def update_listing_inquiry(
id: int,
inquiry_id: int,
data: InquiryUpdate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Update an inquiry status (listing owner only)."""
allowed = {"new", "read", "replied", "closed", "spam"}
status_clean = (data.status or "").strip().lower()
if status_clean not in allowed:
raise HTTPException(status_code=400, detail="Invalid status")
# Verify listing ownership
listing_result = await db.execute(
select(DomainListing).where(
and_(
DomainListing.id == id,
DomainListing.user_id == current_user.id,
)
)
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(
and_(
ListingInquiry.id == inquiry_id,
ListingInquiry.listing_id == id,
)
)
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
now = datetime.utcnow()
old_status = getattr(inquiry, "status", None)
inquiry.status = status_clean
if status_clean == "read" and inquiry.read_at is None:
inquiry.read_at = now
if status_clean == "replied":
inquiry.replied_at = now
if status_clean == "closed":
inquiry.closed_at = now
inquiry.closed_reason = (data.reason or "").strip() or None
if status_clean == "spam":
inquiry.closed_reason = (data.reason or "").strip() or inquiry.closed_reason
# Audit trail
event = ListingInquiryEvent(
inquiry_id=inquiry.id,
listing_id=listing.id,
actor_user_id=current_user.id,
old_status=old_status,
new_status=status_clean,
reason=(data.reason or "").strip() or None,
ip_address=request.client.host if request.client else None,
user_agent=request.headers.get("user-agent", "")[:500],
)
db.add(event)
await track_event(
db,
event_name="inquiry_status_changed",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=inquiry.id,
metadata={"old_status": old_status, "new_status": status_clean, "reason": (data.reason or "").strip() or None},
)
await db.commit()
await db.refresh(inquiry)
return InquiryResponse(
id=inquiry.id,
name=inquiry.name,
email=inquiry.email,
phone=inquiry.phone,
company=inquiry.company,
message=inquiry.message,
offer_amount=inquiry.offer_amount,
status=inquiry.status,
created_at=inquiry.created_at,
read_at=inquiry.read_at,
replied_at=getattr(inquiry, "replied_at", None),
closed_at=getattr(inquiry, "closed_at", None),
closed_reason=getattr(inquiry, "closed_reason", None),
buyer_user_id=getattr(inquiry, "buyer_user_id", None),
)
@router.get("/{id}/inquiries/{inquiry_id}/messages", response_model=List[InquiryMessageResponse])
async def get_inquiry_messages_for_seller(
id: int,
inquiry_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Seller: fetch thread messages for an inquiry."""
listing_result = await db.execute(
select(DomainListing).where(and_(DomainListing.id == id, DomainListing.user_id == current_user.id))
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(and_(ListingInquiry.id == inquiry_id, ListingInquiry.listing_id == id))
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
msgs = (
await db.execute(
select(ListingInquiryMessage)
.where(and_(ListingInquiryMessage.inquiry_id == inquiry_id, ListingInquiryMessage.listing_id == id))
.order_by(ListingInquiryMessage.created_at.asc())
)
).scalars().all()
return [InquiryMessageResponse.model_validate(m) for m in msgs]
@router.post("/{id}/inquiries/{inquiry_id}/messages", response_model=InquiryMessageResponse)
async def post_inquiry_message_as_seller(
id: int,
inquiry_id: int,
payload: InquiryMessageCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Seller: post a message into an inquiry thread."""
listing_result = await db.execute(
select(DomainListing).where(and_(DomainListing.id == id, DomainListing.user_id == current_user.id))
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(and_(ListingInquiry.id == inquiry_id, ListingInquiry.listing_id == id))
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
if inquiry.status in ["closed", "spam"]:
raise HTTPException(status_code=400, detail="Inquiry is closed")
# Content safety (phishing keywords)
if not _check_content_safety(payload.body):
raise HTTPException(status_code=400, detail="Message contains blocked content. Please revise.")
# Simple rate limit: max 30 messages per hour per inquiry
hour_start = datetime.utcnow() - timedelta(hours=1)
msg_count = (
await db.execute(
select(func.count(ListingInquiryMessage.id)).where(
and_(
ListingInquiryMessage.inquiry_id == inquiry.id,
ListingInquiryMessage.sender_user_id == current_user.id,
ListingInquiryMessage.created_at >= hour_start,
)
)
)
).scalar() or 0
if msg_count >= 30:
raise HTTPException(status_code=429, detail="Too many messages. Please slow down.")
msg = ListingInquiryMessage(
inquiry_id=inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=payload.body,
)
db.add(msg)
await db.flush()
await track_event(
db,
event_name="message_sent",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=inquiry.id,
metadata={"role": "buyer"},
)
await track_event(
db,
event_name="message_sent",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=inquiry.id,
metadata={"role": "seller"},
)
# Email buyer (if configured)
try:
from app.services.email_service import email_service
if inquiry.buyer_user_id:
buyer = (
await db.execute(select(User).where(User.id == inquiry.buyer_user_id))
).scalar_one_or_none()
else:
buyer = None
if buyer and buyer.email and email_service.is_configured():
thread_url = f"https://pounce.ch/terminal/inbox?inquiry={inquiry.id}"
await email_service.send_listing_message(
to_email=buyer.email,
domain=listing.domain,
sender_name=current_user.name or current_user.email,
message=payload.body,
thread_url=thread_url,
)
except Exception as e:
logger.error(f"Failed to send listing message notification: {e}")
await db.commit()
await db.refresh(msg)
return InquiryMessageResponse.model_validate(msg)
@router.get("/inquiries/my")
async def get_my_inquiries_as_buyer(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: list inquiries created from this account."""
result = await db.execute(
select(ListingInquiry, DomainListing)
.join(DomainListing, DomainListing.id == ListingInquiry.listing_id)
.where(ListingInquiry.buyer_user_id == current_user.id)
.order_by(ListingInquiry.created_at.desc())
)
rows = result.all()
return [
{
"id": inq.id,
"listing_id": listing.id,
"domain": listing.domain,
"slug": listing.slug,
"status": inq.status,
"created_at": inq.created_at.isoformat(),
"closed_at": inq.closed_at.isoformat() if getattr(inq, "closed_at", None) else None,
"closed_reason": getattr(inq, "closed_reason", None),
}
for inq, listing in rows
]
@router.get("/inquiries/{inquiry_id}/messages", response_model=List[InquiryMessageResponse])
async def get_inquiry_messages_for_buyer(
inquiry_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: fetch thread messages for one inquiry."""
inquiry = (
await db.execute(select(ListingInquiry).where(ListingInquiry.id == inquiry_id))
).scalar_one_or_none()
if not inquiry or inquiry.buyer_user_id != current_user.id:
raise HTTPException(status_code=404, detail="Inquiry not found")
msgs = (
await db.execute(
select(ListingInquiryMessage)
.where(ListingInquiryMessage.inquiry_id == inquiry_id)
.order_by(ListingInquiryMessage.created_at.asc())
)
).scalars().all()
return [InquiryMessageResponse.model_validate(m) for m in msgs]
@router.post("/inquiries/{inquiry_id}/messages", response_model=InquiryMessageResponse)
async def post_inquiry_message_as_buyer(
inquiry_id: int,
payload: InquiryMessageCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: post a message into an inquiry thread."""
inquiry = (
await db.execute(select(ListingInquiry).where(ListingInquiry.id == inquiry_id))
).scalar_one_or_none()
if not inquiry or inquiry.buyer_user_id != current_user.id:
raise HTTPException(status_code=404, detail="Inquiry not found")
if inquiry.status in ["closed", "spam"]:
raise HTTPException(status_code=400, detail="Inquiry is closed")
# Content safety (phishing keywords)
if not _check_content_safety(payload.body):
raise HTTPException(status_code=400, detail="Message contains blocked content. Please revise.")
# Simple rate limit: max 20 messages per hour per inquiry
hour_start = datetime.utcnow() - timedelta(hours=1)
msg_count = (
await db.execute(
select(func.count(ListingInquiryMessage.id)).where(
and_(
ListingInquiryMessage.inquiry_id == inquiry.id,
ListingInquiryMessage.sender_user_id == current_user.id,
ListingInquiryMessage.created_at >= hour_start,
)
)
)
).scalar() or 0
if msg_count >= 20:
raise HTTPException(status_code=429, detail="Too many messages. Please slow down.")
listing = (
await db.execute(select(DomainListing).where(DomainListing.id == inquiry.listing_id))
).scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
msg = ListingInquiryMessage(
inquiry_id=inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=payload.body,
)
db.add(msg)
await db.flush()
# Email seller (if configured)
try:
from app.services.email_service import email_service
seller = (
await db.execute(select(User).where(User.id == listing.user_id))
).scalar_one_or_none()
if seller and seller.email and email_service.is_configured():
thread_url = f"https://pounce.ch/terminal/listing"
await email_service.send_listing_message(
to_email=seller.email,
domain=listing.domain,
sender_name=current_user.name or current_user.email,
message=payload.body,
thread_url=thread_url,
)
except Exception as e:
logger.error(f"Failed to send listing message notification: {e}")
await db.commit()
await db.refresh(msg)
return InquiryMessageResponse.model_validate(msg)
@router.put("/{id}", response_model=ListingResponse)
async def update_listing(
id: int,
data: ListingUpdate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
@ -1229,58 +625,8 @@ async def update_listing(
)
listing.status = ListingStatus.ACTIVE.value
listing.published_at = datetime.utcnow()
elif data.status in ["draft", "expired"]:
elif data.status in ["draft", "sold", "expired"]:
listing.status = data.status
elif data.status == "sold":
if listing.status != ListingStatus.ACTIVE.value:
raise HTTPException(status_code=400, detail="Only active listings can be marked as sold.")
listing.status = ListingStatus.SOLD.value
listing.sold_at = datetime.utcnow()
listing.sold_reason = (data.sold_reason or "").strip() or listing.sold_reason
listing.sold_price = data.sold_price if data.sold_price is not None else listing.sold_price
listing.sold_currency = (data.sold_currency or listing.currency or "USD").upper()
# Close all open inquiries on this listing (deal is done).
inqs = (
await db.execute(
select(ListingInquiry).where(ListingInquiry.listing_id == listing.id)
)
).scalars().all()
for inq in inqs:
if inq.status in ["closed", "spam"]:
continue
old = inq.status
inq.status = "closed"
inq.closed_at = datetime.utcnow()
inq.closed_reason = inq.closed_reason or "sold"
db.add(
ListingInquiryEvent(
inquiry_id=inq.id,
listing_id=listing.id,
actor_user_id=current_user.id,
old_status=old,
new_status="closed",
reason="sold",
ip_address=request.client.host if request.client else None,
user_agent=request.headers.get("user-agent", "")[:500],
)
)
await track_event(
db,
event_name="listing_marked_sold",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
metadata={
"sold_reason": listing.sold_reason,
"sold_price": float(listing.sold_price) if listing.sold_price is not None else None,
"sold_currency": listing.sold_currency,
},
)
await db.commit()
await db.refresh(listing)
@ -1307,10 +653,6 @@ async def update_listing(
public_url=listing.public_url,
created_at=listing.created_at,
published_at=listing.published_at,
sold_at=getattr(listing, "sold_at", None),
sold_reason=getattr(listing, "sold_reason", None),
sold_price=getattr(listing, "sold_price", None),
sold_currency=getattr(listing, "sold_currency", None),
seller_verified=current_user.is_verified,
seller_member_since=current_user.created_at,
)

View File

@ -5,20 +5,15 @@ Supports:
- Google OAuth 2.0
- GitHub OAuth
"""
import base64
import hashlib
import hmac
import json
import os
import secrets
import logging
import time
from datetime import datetime, timedelta
from typing import Optional
from urllib.parse import urlencode
import httpx
from fastapi import APIRouter, HTTPException, status, Query, Request
from fastapi import APIRouter, HTTPException, status, Query
from fastapi.responses import RedirectResponse
from pydantic import BaseModel
from sqlalchemy import select
@ -28,7 +23,6 @@ from app.config import get_settings
from app.models.user import User
from app.models.subscription import Subscription, SubscriptionTier, SubscriptionStatus, TIER_CONFIG
from app.services.auth import AuthService
from app.security import set_auth_cookie, should_use_secure_cookies
logger = logging.getLogger(__name__)
router = APIRouter()
@ -47,123 +41,6 @@ GITHUB_REDIRECT_URI = os.getenv("GITHUB_REDIRECT_URI", "http://localhost:8000/ap
FRONTEND_URL = os.getenv("SITE_URL", "http://localhost:3000")
OAUTH_STATE_TTL_SECONDS = 600 # 10 minutes
def _sanitize_redirect_path(redirect: Optional[str]) -> str:
"""
Only allow internal (relative) redirects.
Prevents open-redirect and token/referrer exfil paths.
"""
default = "/terminal/radar"
if not redirect:
return default
r = redirect.strip()
if not r.startswith("/"):
return default
if r.startswith("//"):
return default
if "://" in r:
return default
if "\\" in r:
return default
if len(r) > 2048:
return default
return r
def _b64url_encode(data: bytes) -> str:
return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")
def _b64url_decode(data: str) -> bytes:
pad = "=" * (-len(data) % 4)
return base64.urlsafe_b64decode(data + pad)
def _oauth_nonce_cookie_name(provider: str) -> str:
return f"pounce_oauth_nonce_{provider}"
def _set_oauth_nonce_cookie(response: RedirectResponse, provider: str, nonce: str) -> None:
response.set_cookie(
key=_oauth_nonce_cookie_name(provider),
value=nonce,
httponly=True,
secure=should_use_secure_cookies(),
samesite="lax",
max_age=OAUTH_STATE_TTL_SECONDS,
path="/api/v1/oauth",
)
def _clear_oauth_nonce_cookie(response: RedirectResponse, provider: str) -> None:
response.delete_cookie(
key=_oauth_nonce_cookie_name(provider),
path="/api/v1/oauth",
)
def _create_oauth_state(provider: str, nonce: str, redirect_path: str) -> str:
"""
Signed, short-lived state payload.
Also protects the redirect_path against tampering.
"""
if not settings.secret_key:
raise RuntimeError("SECRET_KEY is required for OAuth state signing")
payload = {
"p": provider,
"n": nonce,
"r": redirect_path,
"ts": int(time.time()),
}
payload_b64 = _b64url_encode(
json.dumps(payload, separators=(",", ":"), ensure_ascii=False).encode("utf-8")
)
sig = hmac.new(
settings.secret_key.encode("utf-8"),
payload_b64.encode("utf-8"),
hashlib.sha256,
).digest()
return f"{payload_b64}.{_b64url_encode(sig)}"
def _verify_oauth_state(state: str, provider: str) -> tuple[str, str]:
if not settings.secret_key:
raise ValueError("OAuth state verification not available (missing SECRET_KEY)")
if not state or "." not in state:
raise ValueError("Invalid state format")
payload_b64, sig_b64 = state.split(".", 1)
expected_sig = _b64url_encode(
hmac.new(
settings.secret_key.encode("utf-8"),
payload_b64.encode("utf-8"),
hashlib.sha256,
).digest()
)
if not hmac.compare_digest(expected_sig, sig_b64):
raise ValueError("Invalid state signature")
payload = json.loads(_b64url_decode(payload_b64).decode("utf-8"))
if payload.get("p") != provider:
raise ValueError("State provider mismatch")
ts = int(payload.get("ts") or 0)
if ts <= 0 or (int(time.time()) - ts) > OAUTH_STATE_TTL_SECONDS:
raise ValueError("State expired")
nonce = str(payload.get("n") or "")
redirect_path = _sanitize_redirect_path(payload.get("r"))
if not nonce:
raise ValueError("Missing nonce")
return nonce, redirect_path
# ============== Schemas ==============
@ -225,8 +102,7 @@ async def get_or_create_oauth_user(
# Create new user
user = User(
email=email.lower(),
# Random password (won't be used), but keep it a valid bcrypt hash.
hashed_password=AuthService.hash_password(secrets.token_urlsafe(32)),
hashed_password=secrets.token_urlsafe(32), # Random password (won't be used)
name=name,
oauth_provider=provider,
oauth_id=oauth_id,
@ -294,9 +170,10 @@ async def google_login(redirect: Optional[str] = Query(None)):
detail="Google OAuth not configured",
)
redirect_path = _sanitize_redirect_path(redirect)
nonce = secrets.token_urlsafe(16)
state = _create_oauth_state("google", nonce, redirect_path)
# Store redirect URL in state
state = secrets.token_urlsafe(16)
if redirect:
state = f"{state}:{redirect}"
params = {
"client_id": GOOGLE_CLIENT_ID,
@ -309,14 +186,11 @@ async def google_login(redirect: Optional[str] = Query(None)):
}
url = f"https://accounts.google.com/o/oauth2/v2/auth?{urlencode(params)}"
response = RedirectResponse(url=url)
_set_oauth_nonce_cookie(response, "google", nonce)
return response
return RedirectResponse(url=url)
@router.get("/google/callback")
async def google_callback(
request: Request,
code: str = Query(...),
state: str = Query(""),
db: Database = None,
@ -328,16 +202,10 @@ async def google_callback(
detail="Google OAuth not configured",
)
try:
nonce, redirect_path = _verify_oauth_state(state, "google")
except Exception as e:
logger.warning(f"Invalid OAuth state (google): {e}")
return RedirectResponse(url=f"{FRONTEND_URL}/login?error=oauth_state_invalid")
cookie_nonce = request.cookies.get(_oauth_nonce_cookie_name("google"))
if not cookie_nonce or not hmac.compare_digest(cookie_nonce, nonce):
logger.warning("OAuth nonce mismatch (google)")
return RedirectResponse(url=f"{FRONTEND_URL}/login?error=oauth_state_invalid")
# Parse redirect from state
redirect_path = "/command/dashboard"
if ":" in state:
_, redirect_path = state.split(":", 1)
try:
# Exchange code for tokens
@ -389,20 +257,12 @@ async def google_callback(
# Create JWT
jwt_token, _ = create_jwt_for_user(user)
# Redirect to frontend WITHOUT token in URL; set auth cookie instead.
query = {"redirect": redirect_path}
# Redirect to frontend with token
redirect_url = f"{FRONTEND_URL}/oauth/callback?token={jwt_token}&redirect={redirect_path}"
if is_new:
query["new"] = "true"
redirect_url = f"{FRONTEND_URL}/oauth/callback?{urlencode(query)}"
redirect_url += "&new=true"
response = RedirectResponse(url=redirect_url)
_clear_oauth_nonce_cookie(response, "google")
set_auth_cookie(
response=response,
token=jwt_token,
max_age_seconds=settings.access_token_expire_minutes * 60,
)
return response
return RedirectResponse(url=redirect_url)
except Exception as e:
logger.exception(f"Google OAuth error: {e}")
@ -422,9 +282,10 @@ async def github_login(redirect: Optional[str] = Query(None)):
detail="GitHub OAuth not configured",
)
redirect_path = _sanitize_redirect_path(redirect)
nonce = secrets.token_urlsafe(16)
state = _create_oauth_state("github", nonce, redirect_path)
# Store redirect URL in state
state = secrets.token_urlsafe(16)
if redirect:
state = f"{state}:{redirect}"
params = {
"client_id": GITHUB_CLIENT_ID,
@ -434,14 +295,11 @@ async def github_login(redirect: Optional[str] = Query(None)):
}
url = f"https://github.com/login/oauth/authorize?{urlencode(params)}"
response = RedirectResponse(url=url)
_set_oauth_nonce_cookie(response, "github", nonce)
return response
return RedirectResponse(url=url)
@router.get("/github/callback")
async def github_callback(
request: Request,
code: str = Query(...),
state: str = Query(""),
db: Database = None,
@ -453,16 +311,10 @@ async def github_callback(
detail="GitHub OAuth not configured",
)
try:
nonce, redirect_path = _verify_oauth_state(state, "github")
except Exception as e:
logger.warning(f"Invalid OAuth state (github): {e}")
return RedirectResponse(url=f"{FRONTEND_URL}/login?error=oauth_state_invalid")
cookie_nonce = request.cookies.get(_oauth_nonce_cookie_name("github"))
if not cookie_nonce or not hmac.compare_digest(cookie_nonce, nonce):
logger.warning("OAuth nonce mismatch (github)")
return RedirectResponse(url=f"{FRONTEND_URL}/login?error=oauth_state_invalid")
# Parse redirect from state
redirect_path = "/command/dashboard"
if ":" in state:
_, redirect_path = state.split(":", 1)
try:
async with httpx.AsyncClient() as client:
@ -547,19 +399,12 @@ async def github_callback(
# Create JWT
jwt_token, _ = create_jwt_for_user(user)
query = {"redirect": redirect_path}
# Redirect to frontend with token
redirect_url = f"{FRONTEND_URL}/oauth/callback?token={jwt_token}&redirect={redirect_path}"
if is_new:
query["new"] = "true"
redirect_url = f"{FRONTEND_URL}/oauth/callback?{urlencode(query)}"
redirect_url += "&new=true"
response = RedirectResponse(url=redirect_url)
_clear_oauth_nonce_cookie(response, "github")
set_auth_cookie(
response=response,
token=jwt_token,
max_age_seconds=settings.access_token_expire_minutes * 60,
)
return response
return RedirectResponse(url=redirect_url)
except Exception as e:
logger.exception(f"GitHub OAuth error: {e}")

View File

@ -1,12 +1,10 @@
"""Portfolio API routes."""
import secrets
from datetime import datetime
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, status, Query
from pydantic import BaseModel, Field
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
import dns.resolver
from app.database import get_db
from app.api.deps import get_current_user
@ -73,11 +71,6 @@ class PortfolioDomainResponse(BaseModel):
notes: Optional[str]
tags: Optional[str]
roi: Optional[float]
# DNS Verification fields
is_dns_verified: bool = False
verification_status: str = "unverified"
verification_code: Optional[str] = None
verified_at: Optional[datetime] = None
created_at: datetime
updated_at: datetime
@ -85,25 +78,6 @@ class PortfolioDomainResponse(BaseModel):
from_attributes = True
class DNSVerificationStartResponse(BaseModel):
"""Response when starting DNS verification."""
domain_id: int
domain: str
verification_code: str
dns_record_type: str
dns_record_name: str
dns_record_value: str
instructions: str
status: str
class DNSVerificationCheckResponse(BaseModel):
"""Response when checking DNS verification."""
verified: bool
status: str
message: str
class PortfolioSummary(BaseModel):
"""Summary of user's portfolio."""
total_domains: int
@ -176,112 +150,7 @@ class ValuationResponse(BaseModel):
disclaimer: str
# ============== Helper Functions ==============
def _generate_verification_code() -> str:
"""Generate a unique verification code."""
return f"pounce-verify-{secrets.token_hex(8)}"
def _domain_to_response(domain: PortfolioDomain) -> PortfolioDomainResponse:
"""Convert PortfolioDomain to response schema."""
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
# ============== Portfolio Endpoints ==============
# IMPORTANT: Static routes must come BEFORE dynamic routes like /{domain_id}
@router.get("/verified", response_model=List[PortfolioDomainResponse])
async def get_verified_domains(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Get only DNS-verified portfolio domains.
These domains can be used for Yield or For Sale listings.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.user_id == current_user.id,
PortfolioDomain.is_dns_verified == True,
PortfolioDomain.is_sold == False,
)
).order_by(PortfolioDomain.domain.asc())
)
domains = result.scalars().all()
return [_domain_to_response(d) for d in domains]
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
@router.get("", response_model=List[PortfolioDomainResponse])
async def get_portfolio(
@ -335,10 +204,6 @@ async def get_portfolio(
notes=d.notes,
tags=d.tags,
roi=d.roi,
is_dns_verified=getattr(d, 'is_dns_verified', False) or False,
verification_status=getattr(d, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(d, 'verification_code', None),
verified_at=getattr(d, 'verified_at', None),
created_at=d.created_at,
updated_at=d.updated_at,
)
@ -347,6 +212,49 @@ async def get_portfolio(
return responses
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
@router.post("", response_model=PortfolioDomainResponse, status_code=status.HTTP_201_CREATED)
async def add_portfolio_domain(
data: PortfolioDomainCreate,
@ -443,10 +351,6 @@ async def add_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -494,10 +398,6 @@ async def get_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -554,10 +454,6 @@ async def update_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -614,10 +510,6 @@ async def mark_domain_sold(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -701,10 +593,6 @@ async def refresh_domain_value(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -729,160 +617,3 @@ async def get_domain_valuation(
return ValuationResponse(**valuation)
# ============== DNS Verification Endpoints ==============
@router.post("/{domain_id}/verify-dns", response_model=DNSVerificationStartResponse)
async def start_dns_verification(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Start DNS verification for a portfolio domain.
Returns a verification code that must be added as a TXT record.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
if domain.is_dns_verified:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Domain is already verified",
)
# Generate or reuse existing verification code
if not domain.verification_code:
domain.verification_code = _generate_verification_code()
domain.verification_status = "pending"
domain.verification_started_at = datetime.utcnow()
await db.commit()
await db.refresh(domain)
return DNSVerificationStartResponse(
domain_id=domain.id,
domain=domain.domain,
verification_code=domain.verification_code,
dns_record_type="TXT",
dns_record_name=f"_pounce.{domain.domain}",
dns_record_value=domain.verification_code,
instructions=f"Add a TXT record to your DNS settings:\n\nHost/Name: _pounce\nType: TXT\nValue: {domain.verification_code}\n\nDNS changes can take up to 48 hours to propagate, but usually complete within minutes.",
status=domain.verification_status,
)
@router.get("/{domain_id}/verify-dns/check", response_model=DNSVerificationCheckResponse)
async def check_dns_verification(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Check if DNS verification is complete.
Looks for the TXT record and verifies it matches the expected code.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
if domain.is_dns_verified:
return DNSVerificationCheckResponse(
verified=True,
status="verified",
message="Domain ownership already verified",
)
if not domain.verification_code:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Verification not started. Call POST /verify-dns first.",
)
# Check DNS TXT record
txt_record_name = f"_pounce.{domain.domain}"
verified = False
try:
resolver = dns.resolver.Resolver()
resolver.timeout = 5
resolver.lifetime = 10
answers = resolver.resolve(txt_record_name, 'TXT')
for rdata in answers:
txt_value = rdata.to_text().strip('"')
if txt_value == domain.verification_code:
verified = True
break
except dns.resolver.NXDOMAIN:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message=f"TXT record not found. Please add a TXT record at _pounce.{domain.domain}",
)
except dns.resolver.NoAnswer:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message="TXT record exists but has no value. Check your DNS configuration.",
)
except dns.resolver.Timeout:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message="DNS query timed out. Please try again.",
)
except Exception as e:
return DNSVerificationCheckResponse(
verified=False,
status="error",
message=f"DNS lookup error: {str(e)}",
)
if verified:
domain.is_dns_verified = True
domain.verification_status = "verified"
domain.verified_at = datetime.utcnow()
await db.commit()
return DNSVerificationCheckResponse(
verified=True,
status="verified",
message="Domain ownership verified successfully! You can now list this domain for sale or activate Yield.",
)
else:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message=f"TXT record found but value doesn't match. Expected: {domain.verification_code}",
)

View File

@ -84,7 +84,7 @@ async def get_subscription(
tier=subscription.tier.value,
tier_name=config["name"],
status=subscription.status.value,
domain_limit=subscription.domain_limit,
domain_limit=subscription.max_domains,
domains_used=domains_used,
portfolio_limit=config.get("portfolio_limit", 0),
check_frequency=config["check_frequency"],

View File

@ -1,365 +0,0 @@
"""
Telemetry KPIs (4A.2).
Admin-only endpoint to compute funnel KPIs from telemetry_events.
"""
from __future__ import annotations
import json
import statistics
from datetime import datetime, timedelta
from typing import Any, Optional
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy import and_, case, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user, get_db
from app.models.telemetry import TelemetryEvent
from app.models.user import User
from app.schemas.referrals import ReferralKpiWindow, ReferralKpisResponse, ReferralReferrerRow
from app.schemas.telemetry import (
DealFunnelKpis,
TelemetryKpiWindow,
TelemetryKpisResponse,
YieldFunnelKpis,
)
router = APIRouter(prefix="/telemetry", tags=["telemetry"])
def _require_admin(user: User) -> None:
if not user.is_admin:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required")
def _safe_json(metadata_json: Optional[str]) -> dict[str, Any]:
if not metadata_json:
return {}
try:
value = json.loads(metadata_json)
return value if isinstance(value, dict) else {}
except Exception:
return {}
def _median(values: list[float]) -> Optional[float]:
if not values:
return None
return float(statistics.median(values))
@router.get("/kpis", response_model=TelemetryKpisResponse)
async def get_kpis(
days: int = Query(30, ge=1, le=365),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
_require_admin(current_user)
end = datetime.utcnow()
start = end - timedelta(days=days)
event_names = [
# Deal funnel
"listing_view",
"inquiry_created",
"inquiry_status_changed",
"message_sent",
"listing_marked_sold",
# Yield funnel
"yield_connected",
"yield_click",
"yield_conversion",
"payout_paid",
]
rows = (
await db.execute(
select(
TelemetryEvent.event_name,
TelemetryEvent.created_at,
TelemetryEvent.listing_id,
TelemetryEvent.inquiry_id,
TelemetryEvent.yield_domain_id,
TelemetryEvent.click_id,
TelemetryEvent.metadata_json,
).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name.in_(event_names),
)
)
)
).all()
# -----------------------------
# Deal KPIs
# -----------------------------
listing_views = 0
inquiries_created = 0
inquiry_created_at: dict[int, datetime] = {}
first_seller_reply_at: dict[int, datetime] = {}
listings_with_inquiries: set[int] = set()
sold_listings: set[int] = set()
sold_at_by_listing: dict[int, datetime] = {}
first_inquiry_at_by_listing: dict[int, datetime] = {}
# -----------------------------
# Yield KPIs
# -----------------------------
connected_domains = 0
clicks = 0
conversions = 0
payouts_paid = 0
payouts_paid_amount_total = 0.0
for event_name, created_at, listing_id, inquiry_id, yield_domain_id, click_id, metadata_json in rows:
created_at = created_at # already datetime
if event_name == "listing_view":
listing_views += 1
continue
if event_name == "inquiry_created":
inquiries_created += 1
if inquiry_id:
inquiry_created_at[inquiry_id] = created_at
if listing_id:
listings_with_inquiries.add(listing_id)
prev = first_inquiry_at_by_listing.get(listing_id)
if prev is None or created_at < prev:
first_inquiry_at_by_listing[listing_id] = created_at
continue
if event_name == "message_sent":
if not inquiry_id:
continue
meta = _safe_json(metadata_json)
if meta.get("role") == "seller":
prev = first_seller_reply_at.get(inquiry_id)
if prev is None or created_at < prev:
first_seller_reply_at[inquiry_id] = created_at
continue
if event_name == "listing_marked_sold":
if listing_id:
sold_listings.add(listing_id)
sold_at_by_listing[listing_id] = created_at
continue
if event_name == "yield_connected":
connected_domains += 1
continue
if event_name == "yield_click":
clicks += 1
continue
if event_name == "yield_conversion":
conversions += 1
continue
if event_name == "payout_paid":
payouts_paid += 1
meta = _safe_json(metadata_json)
amount = meta.get("amount")
if isinstance(amount, (int, float)):
payouts_paid_amount_total += float(amount)
continue
seller_replied_inquiries = len(first_seller_reply_at.keys())
inquiry_reply_rate = (seller_replied_inquiries / inquiries_created) if inquiries_created else 0.0
# Inquiry → Sold rate (on listing-level intersection)
sold_from_inquiry = sold_listings.intersection(listings_with_inquiries)
inquiry_to_sold_listing_rate = (len(sold_from_inquiry) / len(listings_with_inquiries)) if listings_with_inquiries else 0.0
# Median reply time (seconds): inquiry_created → first seller message
reply_deltas: list[float] = []
for inq_id, created in inquiry_created_at.items():
replied = first_seller_reply_at.get(inq_id)
if replied:
reply_deltas.append((replied - created).total_seconds())
# Median time-to-sold (seconds): first inquiry on listing → listing sold
sold_deltas: list[float] = []
for listing in sold_from_inquiry:
inq_at = first_inquiry_at_by_listing.get(listing)
sold_at = sold_at_by_listing.get(listing)
if inq_at and sold_at and sold_at >= inq_at:
sold_deltas.append((sold_at - inq_at).total_seconds())
deal = DealFunnelKpis(
listing_views=listing_views,
inquiries_created=inquiries_created,
seller_replied_inquiries=seller_replied_inquiries,
inquiry_reply_rate=float(inquiry_reply_rate),
listings_with_inquiries=len(listings_with_inquiries),
listings_sold=len(sold_listings),
inquiry_to_sold_listing_rate=float(inquiry_to_sold_listing_rate),
median_reply_seconds=_median(reply_deltas),
median_time_to_sold_seconds=_median(sold_deltas),
)
yield_kpis = YieldFunnelKpis(
connected_domains=connected_domains,
clicks=clicks,
conversions=conversions,
conversion_rate=float(conversions / clicks) if clicks else 0.0,
payouts_paid=payouts_paid,
payouts_paid_amount_total=float(payouts_paid_amount_total),
)
return TelemetryKpisResponse(
window=TelemetryKpiWindow(days=days, start=start, end=end),
deal=deal,
yield_=yield_kpis,
)
@router.get("/referrals", response_model=ReferralKpisResponse)
async def get_referral_kpis(
days: int = Query(30, ge=1, le=365),
limit: int = Query(200, ge=1, le=1000),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Admin-only referral KPIs for the viral loop (3C.2).
This is intentionally user-based (users.referred_by_user_id) + telemetry-based (referral_link_viewed),
so it stays robust even if ref codes evolve.
"""
_require_admin(current_user)
end = datetime.utcnow()
start = end - timedelta(days=days)
# Referred user counts per referrer (all-time + window)
referred_counts_subq = (
select(
User.referred_by_user_id.label("referrer_user_id"),
func.count(User.id).label("referred_users_total"),
func.coalesce(
func.sum(case((User.created_at >= start, 1), else_=0)),
0,
).label("referred_users_window"),
)
.where(User.referred_by_user_id.isnot(None))
.group_by(User.referred_by_user_id)
.subquery()
)
# Referral link views in window (telemetry)
link_views_subq = (
select(
TelemetryEvent.user_id.label("referrer_user_id"),
func.count(TelemetryEvent.id).label("referral_link_views_window"),
)
.where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.user_id.isnot(None),
)
)
.group_by(TelemetryEvent.user_id)
.subquery()
)
# Referrers: anyone with an invite_code (we still show even if counts are zero)
rows = (
await db.execute(
select(
User.id,
User.email,
User.invite_code,
User.created_at,
func.coalesce(referred_counts_subq.c.referred_users_total, 0),
func.coalesce(referred_counts_subq.c.referred_users_window, 0),
func.coalesce(link_views_subq.c.referral_link_views_window, 0),
)
.where(User.invite_code.isnot(None))
.outerjoin(referred_counts_subq, referred_counts_subq.c.referrer_user_id == User.id)
.outerjoin(link_views_subq, link_views_subq.c.referrer_user_id == User.id)
.order_by(
func.coalesce(referred_counts_subq.c.referred_users_window, 0).desc(),
func.coalesce(referred_counts_subq.c.referred_users_total, 0).desc(),
User.created_at.desc(),
)
.offset(offset)
.limit(limit)
)
).all()
referrers = [
ReferralReferrerRow(
user_id=int(user_id),
email=str(email),
invite_code=str(invite_code) if invite_code else None,
created_at=created_at,
referred_users_total=int(referred_total or 0),
referred_users_window=int(referred_window or 0),
referral_link_views_window=int(link_views or 0),
)
for user_id, email, invite_code, created_at, referred_total, referred_window, link_views in rows
]
totals = {}
totals["referrers_with_invite_code"] = int(
(
await db.execute(
select(func.count(User.id)).where(User.invite_code.isnot(None))
)
).scalar()
or 0
)
totals["referred_users_total"] = int(
(
await db.execute(
select(func.count(User.id)).where(User.referred_by_user_id.isnot(None))
)
).scalar()
or 0
)
totals["referred_users_window"] = int(
(
await db.execute(
select(func.count(User.id)).where(
and_(
User.referred_by_user_id.isnot(None),
User.created_at >= start,
User.created_at <= end,
)
)
)
).scalar()
or 0
)
totals["referral_link_views_window"] = int(
(
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
)
)
)
).scalar()
or 0
)
return ReferralKpisResponse(
window=ReferralKpiWindow(days=days, start=start, end=end),
totals=totals,
referrers=referrers,
)

View File

@ -64,38 +64,6 @@ async def get_db_price_count(db) -> int:
return result.scalar() or 0
@router.get("/tlds")
async def list_tracked_tlds(
db: Database,
limit: int = Query(5000, ge=1, le=20000),
offset: int = Query(0, ge=0),
):
"""
List distinct TLDs tracked in the database (DB-driven).
This endpoint is intentionally database-only (no static fallback),
so callers (e.g. sitemap generation) can rely on real tracked inventory.
"""
rows = (
await db.execute(
select(TLDPrice.tld)
.distinct()
.order_by(TLDPrice.tld)
.offset(offset)
.limit(limit)
)
).scalars().all()
total = (await db.execute(select(func.count(func.distinct(TLDPrice.tld))))).scalar() or 0
latest = (await db.execute(select(func.max(TLDPrice.recorded_at)))).scalar()
return {
"tlds": [str(t).lstrip(".").lower() for t in rows if t],
"total": int(total),
"limit": int(limit),
"offset": int(offset),
"latest_recorded_at": latest.isoformat() if latest else None,
}
# Real TLD price data based on current market research (December 2024)
# Prices in USD, sourced from major registrars: Namecheap, Cloudflare, Porkbun, Google Domains
TLD_DATA = {
@ -628,57 +596,6 @@ async def get_trending_tlds(db: Database):
return {"trending": trending[:6]}
async def get_real_price_history(db, tld: str, days: int) -> list[dict]:
"""
Fetch real historical price data from the database.
Returns daily average prices for the TLD, grouped by date.
Works with both SQLite (dev) and PostgreSQL (prod).
"""
from sqlalchemy import literal_column
cutoff = datetime.utcnow() - timedelta(days=days)
# SQLite-compatible: use date() function or extract date from datetime
# We'll select the raw datetime and group by date string
result = await db.execute(
select(
TLDPrice.recorded_at,
TLDPrice.registration_price,
)
.where(TLDPrice.tld == tld)
.where(TLDPrice.recorded_at >= cutoff)
.order_by(TLDPrice.recorded_at)
)
rows = result.all()
if not rows:
return []
# Group by date in Python (SQLite-safe approach)
daily_prices: dict[str, list[float]] = {}
for row in rows:
# Handle both datetime objects and strings
if hasattr(row.recorded_at, 'strftime'):
date_str = row.recorded_at.strftime("%Y-%m-%d")
else:
date_str = str(row.recorded_at)[:10] # Take first 10 chars (YYYY-MM-DD)
if date_str not in daily_prices:
daily_prices[date_str] = []
daily_prices[date_str].append(row.registration_price)
# Calculate daily averages
return [
{
"date": date_str,
"price": round(sum(prices) / len(prices), 2),
}
for date_str, prices in sorted(daily_prices.items())
]
@router.get("/{tld}/history")
async def get_tld_price_history(
tld: str,
@ -687,8 +604,10 @@ async def get_tld_price_history(
):
"""Get price history for a specific TLD.
Returns REAL historical data from database (no simulation).
Returns real historical data from database if available,
otherwise generates simulated data based on current price.
"""
import math
tld_clean = tld.lower().lstrip(".")
@ -714,40 +633,40 @@ async def get_tld_price_history(
trend = static_data.get("trend", "stable")
trend_reason = static_data.get("trend_reason", "Price tracking available")
real_history = await get_real_price_history(db, tld_clean, days)
# Generate historical data (simulated for now, real when we have more scrapes)
history = []
current_date = datetime.utcnow()
if not real_history:
raise HTTPException(status_code=404, detail=f"No historical data for '.{tld_clean}' yet")
history = real_history
data_source = "database"
# Use the most recent daily average as current_price when available
if history:
current_price = float(history[-1]["price"])
def _price_at_or_before(days_ago_target: int) -> float:
"""Get the closest historical price at or before the target age."""
target_date = (datetime.utcnow() - timedelta(days=days_ago_target)).date()
best = float(history[0]["price"])
for h in reversed(history):
try:
h_date = datetime.strptime(h["date"], "%Y-%m-%d").date()
except Exception:
continue
if h_date <= target_date:
best = float(h["price"])
break
return best
price_7d_ago = _price_at_or_before(7)
price_30d_ago = _price_at_or_before(30)
price_90d_ago = _price_at_or_before(90)
# Calculate trend factor based on known trends
trend_factor = 1.0
if trend == "up":
trend_factor = 0.92 # Prices were ~8% lower
elif trend == "down":
trend_factor = 1.05 # Prices were ~5% higher
# Calculate percentage changes safely
change_7d = round((current_price - price_7d_ago) / price_7d_ago * 100, 2) if price_7d_ago and price_7d_ago > 0 else 0
change_30d = round((current_price - price_30d_ago) / price_30d_ago * 100, 2) if price_30d_ago and price_30d_ago > 0 else 0
change_90d = round((current_price - price_90d_ago) / price_90d_ago * 100, 2) if price_90d_ago and price_90d_ago > 0 else 0
# Generate weekly data points
for i in range(days, -1, -7):
date = current_date - timedelta(days=i)
progress = 1 - (i / days)
if trend == "up":
price = current_price * (trend_factor + (1 - trend_factor) * progress)
elif trend == "down":
price = current_price * (trend_factor - (trend_factor - 1) * progress)
else:
# Add small fluctuation for stable prices
fluctuation = math.sin(i * 0.1) * 0.02
price = current_price * (1 + fluctuation)
history.append({
"date": date.strftime("%Y-%m-%d"),
"price": round(price, 2),
})
# Calculate price changes
price_7d_ago = history[-2]["price"] if len(history) >= 2 else current_price
price_30d_ago = history[-5]["price"] if len(history) >= 5 else current_price
price_90d_ago = history[0]["price"] if history else current_price
return {
"tld": tld_clean,
@ -755,14 +674,13 @@ async def get_tld_price_history(
"description": static_data.get("description", f".{tld_clean} domain extension"),
"registry": static_data.get("registry", "Unknown"),
"current_price": current_price,
"price_change_7d": change_7d,
"price_change_30d": change_30d,
"price_change_90d": change_90d,
"price_change_7d": round((current_price - price_7d_ago) / price_7d_ago * 100, 2) if price_7d_ago else 0,
"price_change_30d": round((current_price - price_30d_ago) / price_30d_ago * 100, 2) if price_30d_ago else 0,
"price_change_90d": round((current_price - price_90d_ago) / price_90d_ago * 100, 2) if price_90d_ago else 0,
"trend": trend,
"trend_reason": trend_reason,
"history": history,
"source": data_source,
"data_points": len(history),
"source": "simulated" if not static_data else "static",
}
@ -791,111 +709,76 @@ async def compare_tld_prices(
tld: str,
db: Database,
):
"""Compare prices across different registrars for a TLD.
COMBINES static data AND database data for complete registrar coverage.
This ensures all scraped registrars (Porkbun, GoDaddy, Namecheap, etc.) appear.
"""
"""Compare prices across different registrars for a TLD."""
tld_clean = tld.lower().lstrip(".")
# Collect registrars from ALL sources
registrars_map: dict[str, dict] = {}
metadata = {
"type": "generic",
"description": f".{tld_clean} domain extension",
"registry": "Unknown",
"introduced": None,
}
# 1. Add static data (curated, high-quality)
# Try static data first
if tld_clean in TLD_DATA:
data = TLD_DATA[tld_clean]
metadata = {
"type": data["type"],
"description": data["description"],
"registry": data.get("registry", "Unknown"),
"introduced": data.get("introduced"),
}
registrars = []
for name, prices in data["registrars"].items():
registrars_map[name.lower()] = {
registrars.append({
"name": name,
"registration_price": prices["register"],
"renewal_price": prices["renew"],
"transfer_price": prices["transfer"],
"source": "static",
}
# 2. Add/update with database data (scraped from multiple registrars)
db_prices = await get_db_prices(db, tld_clean)
if db_prices and tld_clean in db_prices:
for registrar_name, prices in db_prices[tld_clean]["registrars"].items():
key = registrar_name.lower()
# Add if not exists, or update with fresher DB data
if key not in registrars_map:
registrars_map[key] = {
"name": registrar_name.title(),
"registration_price": prices["register"],
"renewal_price": prices["renew"],
"transfer_price": prices.get("transfer"),
"source": "database",
})
registrars.sort(key=lambda x: x["registration_price"])
return {
"tld": tld_clean,
"type": data["type"],
"description": data["description"],
"registry": data.get("registry", "Unknown"),
"introduced": data.get("introduced"),
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"],
"cheapest_price": registrars[0]["registration_price"],
"price_range": {
"min": get_min_price(data),
"max": get_max_price(data),
"avg": get_avg_price(data),
},
}
if not registrars_map:
# Fall back to database
db_prices = await get_db_prices(db, tld_clean)
if not db_prices:
raise HTTPException(status_code=404, detail=f"TLD '.{tld_clean}' not found")
# Convert to list and sort by price
registrars = list(registrars_map.values())
tld_data = db_prices[tld_clean]
registrars = [
{
"name": name,
"registration_price": prices["register"],
"renewal_price": prices["renew"],
"transfer_price": prices["transfer"],
}
for name, prices in tld_data["registrars"].items()
]
registrars.sort(key=lambda x: x["registration_price"])
# Calculate price range from all registrars
all_prices = [r["registration_price"] for r in registrars]
prices = tld_data["prices"]
return {
"tld": tld_clean,
"type": metadata["type"],
"description": metadata["description"],
"registry": metadata["registry"],
"introduced": metadata["introduced"],
"type": guess_tld_type(tld_clean),
"description": f".{tld_clean} domain extension",
"registry": "Unknown",
"introduced": None,
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"],
"cheapest_price": registrars[0]["registration_price"],
"cheapest_registrar": registrars[0]["name"] if registrars else "N/A",
"cheapest_price": min(prices) if prices else 0,
"price_range": {
"min": min(all_prices),
"max": max(all_prices),
"avg": round(sum(all_prices) / len(all_prices), 2),
"min": min(prices) if prices else 0,
"max": max(prices) if prices else 0,
"avg": round(sum(prices) / len(prices), 2) if prices else 0,
},
"registrar_count": len(registrars),
}
def get_marketplace_links(tld: str) -> list:
"""Get marketplace links for buying existing domains on this TLD."""
# Sedo partner ID for affiliate tracking
SEDO_PARTNER_ID = "335830"
return [
{
"name": "Sedo",
"description": "World's largest domain marketplace",
"url": f"https://sedo.com/search/?keyword=.{tld}&partnerid={SEDO_PARTNER_ID}",
"type": "marketplace",
},
{
"name": "Afternic",
"description": "GoDaddy's premium marketplace",
"url": f"https://www.afternic.com/search?k=.{tld}",
"type": "marketplace",
},
{
"name": "Dan.com",
"description": "Fast domain transfers",
"url": f"https://dan.com/search?query=.{tld}",
"type": "marketplace",
},
]
@router.get("/{tld}")
async def get_tld_details(
tld: str,
@ -904,9 +787,6 @@ async def get_tld_details(
"""Get complete details for a specific TLD."""
tld_clean = tld.lower().lstrip(".")
# Marketplace links (same for all TLDs)
marketplace_links = get_marketplace_links(tld_clean)
# Try static data first
if tld_clean in TLD_DATA:
data = TLD_DATA[tld_clean]
@ -936,7 +816,6 @@ async def get_tld_details(
},
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"],
"marketplace_links": marketplace_links,
}
# Fall back to database
@ -973,159 +852,4 @@ async def get_tld_details(
},
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"] if registrars else "N/A",
"marketplace_links": marketplace_links,
}
# =============================================================================
# DIAGNOSTIC ENDPOINTS - Data Quality & Historical Stats
# =============================================================================
@router.get("/stats/data-quality")
async def get_data_quality_stats(db: Database):
"""
Get statistics about historical data quality.
Useful for monitoring:
- How many TLDs have real historical data
- Date range of collected data
- Scraping frequency and gaps
"""
from sqlalchemy import cast, Date as SQLDate
# Total TLDs tracked
tld_count = await db.execute(select(func.count(func.distinct(TLDPrice.tld))))
total_tlds = tld_count.scalar() or 0
# Total price records
record_count = await db.execute(select(func.count(TLDPrice.id)))
total_records = record_count.scalar() or 0
# Date range
date_range = await db.execute(
select(
func.min(TLDPrice.recorded_at).label("first_record"),
func.max(TLDPrice.recorded_at).label("last_record"),
)
)
dates = date_range.one()
# Unique scrape days (how many days we have data)
# SQLite-compatible: count distinct date strings
all_dates = await db.execute(select(TLDPrice.recorded_at))
date_rows = all_dates.all()
unique_date_strs = set()
for row in date_rows:
if hasattr(row.recorded_at, 'strftime'):
unique_date_strs.add(row.recorded_at.strftime("%Y-%m-%d"))
elif row.recorded_at:
unique_date_strs.add(str(row.recorded_at)[:10])
scrape_days = len(unique_date_strs)
# TLDs with 5+ historical data points (enough for real charts)
tlds_with_history = await db.execute(
select(func.count())
.select_from(
select(TLDPrice.tld)
.group_by(TLDPrice.tld)
.having(func.count(TLDPrice.id) >= 5)
.subquery()
)
)
chartable_tlds = tlds_with_history.scalar() or 0
# Registrars in database
registrar_count = await db.execute(
select(func.count(func.distinct(TLDPrice.registrar)))
)
total_registrars = registrar_count.scalar() or 0
# Calculate coverage
days_of_data = 0
if dates.first_record and dates.last_record:
days_of_data = (dates.last_record - dates.first_record).days + 1
coverage_percent = round((scrape_days / days_of_data * 100), 1) if days_of_data > 0 else 0
return {
"summary": {
"total_tlds_tracked": total_tlds,
"total_price_records": total_records,
"tlds_with_real_history": chartable_tlds,
"unique_registrars": total_registrars,
},
"time_range": {
"first_record": dates.first_record.isoformat() if dates.first_record else None,
"last_record": dates.last_record.isoformat() if dates.last_record else None,
"days_of_data": days_of_data,
"days_with_scrapes": scrape_days,
"coverage_percent": coverage_percent,
},
"chart_readiness": {
"tlds_ready_for_charts": chartable_tlds,
"tlds_with_insufficient_history": total_tlds - chartable_tlds,
"recommendation": "Run daily scrapes for 7+ days to enable richer charts" if chartable_tlds < 10 else "Good coverage!",
},
"data_sources": {
"static_tlds": len(TLD_DATA),
"database_tlds": total_tlds,
"combined_coverage": len(TLD_DATA) + max(0, total_tlds - len(TLD_DATA)),
}
}
@router.get("/stats/scrape-history")
async def get_scrape_history(
db: Database,
days: int = Query(30, ge=1, le=365),
):
"""
Get scraping history - shows when scrapes ran and how many records were collected.
Useful for:
- Identifying gaps in data collection
- Verifying scheduler is working
- Troubleshooting data issues
"""
cutoff = datetime.utcnow() - timedelta(days=days)
# SQLite-compatible: fetch all and group in Python
result = await db.execute(
select(TLDPrice.recorded_at, TLDPrice.tld)
.where(TLDPrice.recorded_at >= cutoff)
)
rows = result.all()
# Group by date in Python
daily_data: dict[str, dict] = {}
for row in rows:
if hasattr(row.recorded_at, 'strftime'):
date_str = row.recorded_at.strftime("%Y-%m-%d")
elif row.recorded_at:
date_str = str(row.recorded_at)[:10]
else:
continue
if date_str not in daily_data:
daily_data[date_str] = {"records": 0, "tlds": set()}
daily_data[date_str]["records"] += 1
daily_data[date_str]["tlds"].add(row.tld)
# Convert to list and sort by date descending
scrape_history = [
{
"date": date_str,
"records_collected": data["records"],
"tlds_scraped": len(data["tlds"]),
}
for date_str, data in sorted(daily_data.items(), reverse=True)
]
total_records = sum(h["records_collected"] for h in scrape_history)
return {
"period_days": days,
"total_scrape_days": len(scrape_history),
"history": scrape_history,
"avg_records_per_day": round(total_records / len(scrape_history), 0) if scrape_history else 0,
}

View File

@ -5,8 +5,6 @@ Webhook endpoints for external service integrations.
- Future: Other payment providers, notification services, etc.
"""
import logging
import os
from datetime import datetime
from fastapi import APIRouter, HTTPException, Request, Header, status
from app.database import get_db
@ -17,25 +15,6 @@ logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/stripe/test")
async def test_stripe_webhook():
"""
Test endpoint to verify webhook route is accessible.
Use this to verify the webhook URL is correct.
The actual Stripe webhook should POST to /api/v1/webhooks/stripe
"""
return {
"status": "ok",
"message": "Stripe webhook endpoint is accessible",
"endpoint": "/api/v1/webhooks/stripe",
"method": "POST",
"stripe_configured": StripeService.is_configured(),
"webhook_secret_set": bool(os.getenv("STRIPE_WEBHOOK_SECRET")),
"timestamp": datetime.utcnow().isoformat(),
}
@router.post("/stripe")
async def stripe_webhook(
request: Request,
@ -50,22 +29,18 @@ async def stripe_webhook(
- Invoice is created or paid
The webhook must be configured in Stripe Dashboard to point to:
https://pounce.ch/api/v1/webhooks/stripe
https://your-domain.com/api/webhooks/stripe
Required Header:
- Stripe-Signature: Stripe's webhook signature for verification
"""
logger.info("🔔 Stripe webhook received")
if not stripe_signature:
logger.error("❌ Missing Stripe-Signature header")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Missing Stripe-Signature header",
)
if not StripeService.is_configured():
logger.error("❌ Stripe not configured")
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Stripe not configured",
@ -74,9 +49,6 @@ async def stripe_webhook(
# Get raw body for signature verification
payload = await request.body()
logger.info(f" Payload size: {len(payload)} bytes")
logger.info(f" Signature: {stripe_signature[:50]}...")
try:
async for db in get_db():
result = await StripeService.handle_webhook(
@ -84,17 +56,16 @@ async def stripe_webhook(
sig_header=stripe_signature,
db=db,
)
logger.info(f"✅ Webhook processed successfully: {result}")
return result
except ValueError as e:
logger.error(f"Webhook validation error: {e}")
logger.error(f"Webhook validation error: {e}")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=str(e),
)
except Exception as e:
logger.exception(f"Webhook processing error: {e}")
logger.error(f"Webhook processing error: {e}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Webhook processing failed",

View File

@ -1,802 +0,0 @@
"""
Yield Domain API endpoints.
Manages domain activation for yield/intent routing and revenue tracking.
"""
import json
from datetime import datetime, timedelta
from decimal import Decimal
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, status, Query
from sqlalchemy import func, and_, or_, Integer, case, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db, get_current_user
from app.models.user import User
from app.models.yield_domain import YieldDomain, YieldTransaction, YieldPayout, AffiliatePartner
from app.config import get_settings
settings = get_settings()
from app.schemas.yield_domain import (
YieldDomainCreate,
YieldDomainUpdate,
YieldDomainResponse,
YieldDomainListResponse,
YieldTransactionResponse,
YieldTransactionListResponse,
YieldPayoutResponse,
YieldPayoutListResponse,
YieldDashboardStats,
YieldDashboardResponse,
DomainYieldAnalysis,
IntentAnalysis,
YieldValueEstimate,
AffiliatePartnerResponse,
DNSVerificationResult,
DNSSetupInstructions,
ActivateYieldRequest,
ActivateYieldResponse,
)
from app.services.intent_detector import (
detect_domain_intent,
estimate_domain_yield,
get_intent_detector,
)
from app.services.yield_dns import verify_yield_dns
from app.services.telemetry import track_event
router = APIRouter(prefix="/yield", tags=["yield"])
# ============================================================================
# Intent Analysis (Public)
# ============================================================================
@router.post("/analyze", response_model=DomainYieldAnalysis)
async def analyze_domain_intent(
domain: str = Query(..., min_length=3, description="Domain to analyze"),
):
"""
Analyze a domain's intent and estimate yield potential.
This endpoint is public - no authentication required.
"""
analysis = estimate_domain_yield(domain)
intent_result = detect_domain_intent(domain)
return DomainYieldAnalysis(
domain=domain,
intent=IntentAnalysis(
category=intent_result.category,
subcategory=intent_result.subcategory,
confidence=intent_result.confidence,
keywords_matched=intent_result.keywords_matched,
suggested_partners=intent_result.suggested_partners,
monetization_potential=intent_result.monetization_potential,
),
value=YieldValueEstimate(
estimated_monthly_min=analysis["value"]["estimated_monthly_min"],
estimated_monthly_max=analysis["value"]["estimated_monthly_max"],
currency=analysis["value"]["currency"],
potential=analysis["value"]["potential"],
confidence=analysis["value"]["confidence"],
geo=analysis["value"]["geo"],
),
partners=analysis["partners"],
monetization_potential=analysis["monetization_potential"],
)
# ============================================================================
# Dashboard
# ============================================================================
@router.get("/dashboard", response_model=YieldDashboardResponse)
async def get_yield_dashboard(
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Get yield dashboard with stats, domains, and recent transactions.
"""
# Get user's yield domains
result = await db.execute(
select(YieldDomain)
.where(YieldDomain.user_id == current_user.id)
.order_by(YieldDomain.total_revenue.desc())
)
domains = list(result.scalars().all())
# Calculate stats
now = datetime.utcnow()
month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
# Monthly stats from transactions (simplified for async)
monthly_revenue = Decimal("0")
monthly_clicks = 0
monthly_conversions = 0
if domains:
domain_ids = [d.id for d in domains]
monthly_result = await db.execute(
select(
func.coalesce(
func.sum(
case(
(YieldTransaction.status.in_(["confirmed", "paid"]), YieldTransaction.net_amount),
else_=0,
)
),
0,
).label("revenue"),
func.sum(
case(
(YieldTransaction.event_type == "click", 1),
else_=0,
)
).label("clicks"),
func.sum(
case(
(
and_(
YieldTransaction.event_type.in_(["lead", "sale"]),
YieldTransaction.status.in_(["confirmed", "paid"]),
),
1,
),
else_=0,
)
).label("conversions"),
).where(
YieldTransaction.yield_domain_id.in_(domain_ids),
YieldTransaction.created_at >= month_start,
)
)
monthly_stats = monthly_result.first()
if monthly_stats:
monthly_revenue = monthly_stats.revenue or Decimal("0")
monthly_clicks = monthly_stats.clicks or 0
monthly_conversions = monthly_stats.conversions or 0
# Aggregate domain stats
total_active = sum(1 for d in domains if d.status == "active")
total_pending = sum(1 for d in domains if d.status in ["pending", "verifying"])
lifetime_revenue = sum(d.total_revenue for d in domains)
lifetime_clicks = sum(d.total_clicks for d in domains)
lifetime_conversions = sum(d.total_conversions for d in domains)
# Pending payout
pending_payout = Decimal("0")
if domains:
domain_ids = [d.id for d in domains]
pending_result = await db.execute(
select(func.coalesce(func.sum(YieldTransaction.net_amount), 0)).where(
YieldTransaction.yield_domain_id.in_(domain_ids),
YieldTransaction.status == "confirmed",
YieldTransaction.paid_at.is_(None),
)
)
pending_payout = pending_result.scalar() or Decimal("0")
# Get recent transactions
recent_txs = []
if domains:
domain_ids = [d.id for d in domains]
recent_result = await db.execute(
select(YieldTransaction)
.where(YieldTransaction.yield_domain_id.in_(domain_ids))
.order_by(YieldTransaction.created_at.desc())
.limit(10)
)
recent_txs = list(recent_result.scalars().all())
# Top performing domains
top_domains = sorted(domains, key=lambda d: d.total_revenue, reverse=True)[:5]
stats = YieldDashboardStats(
total_domains=len(domains),
active_domains=total_active,
pending_domains=total_pending,
monthly_revenue=monthly_revenue,
monthly_clicks=monthly_clicks,
monthly_conversions=monthly_conversions,
lifetime_revenue=lifetime_revenue,
lifetime_clicks=lifetime_clicks,
lifetime_conversions=lifetime_conversions,
pending_payout=pending_payout,
next_payout_date=(month_start + timedelta(days=32)).replace(day=1),
currency="CHF",
)
return YieldDashboardResponse(
stats=stats,
domains=[_domain_to_response(d) for d in domains],
recent_transactions=[_tx_to_response(tx) for tx in recent_txs],
top_domains=[_domain_to_response(d) for d in top_domains],
)
# ============================================================================
# Domain Management
# ============================================================================
@router.get("/domains", response_model=YieldDomainListResponse)
async def list_yield_domains(
status: Optional[str] = Query(None, description="Filter by status"),
limit: int = Query(50, le=100),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List user's yield domains.
"""
query = select(YieldDomain).where(YieldDomain.user_id == current_user.id)
if status:
query = query.where(YieldDomain.status == status)
# Get total count
count_result = await db.execute(
select(func.count(YieldDomain.id)).where(YieldDomain.user_id == current_user.id)
)
total = count_result.scalar() or 0
# Get domains
result = await db.execute(
query.order_by(YieldDomain.created_at.desc()).offset(offset).limit(limit)
)
domains = list(result.scalars().all())
# Aggregates from all domains
all_result = await db.execute(
select(YieldDomain).where(YieldDomain.user_id == current_user.id)
)
all_domains = list(all_result.scalars().all())
total_active = sum(1 for d in all_domains if d.status == "active")
total_revenue = sum(d.total_revenue for d in all_domains)
total_clicks = sum(d.total_clicks for d in all_domains)
return YieldDomainListResponse(
domains=[_domain_to_response(d) for d in domains],
total=total,
total_active=total_active,
total_revenue=total_revenue,
total_clicks=total_clicks,
)
@router.get("/domains/{domain_id}", response_model=YieldDomainResponse)
async def get_yield_domain(
domain_id: int,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Get details of a specific yield domain.
"""
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
return _domain_to_response(domain)
@router.post("/activate", response_model=ActivateYieldResponse)
async def activate_domain_for_yield(
request: ActivateYieldRequest,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Activate a domain for yield/intent routing.
SECURITY: Domain must be in user's portfolio AND DNS-verified.
This creates the yield domain record and returns DNS setup instructions.
"""
from app.models.portfolio import PortfolioDomain
from app.models.subscription import Subscription, SubscriptionTier
domain = request.domain.lower().strip()
# SECURITY CHECK 1: Domain must be in user's portfolio
portfolio_result = await db.execute(
select(PortfolioDomain).where(
PortfolioDomain.domain == domain,
PortfolioDomain.user_id == current_user.id,
)
)
portfolio_domain = portfolio_result.scalar_one_or_none()
if not portfolio_domain:
raise HTTPException(
status_code=403,
detail="Domain must be in your portfolio before activating Yield. Add it to your portfolio first.",
)
# SECURITY CHECK 2: Domain must be DNS-verified
if not portfolio_domain.is_dns_verified:
raise HTTPException(
status_code=403,
detail="Domain must be DNS-verified before activating Yield. Verify ownership in your portfolio first.",
)
# SECURITY CHECK 3: Domain must not be sold
if portfolio_domain.is_sold:
raise HTTPException(
status_code=400,
detail="Cannot activate Yield for a sold domain.",
)
# SECURITY CHECK 4: Tier gating + limits
sub_result = await db.execute(select(Subscription).where(Subscription.user_id == current_user.id))
subscription = sub_result.scalar_one_or_none()
tier = subscription.tier if subscription else SubscriptionTier.SCOUT
tier_value = tier.value if hasattr(tier, "value") else str(tier)
if tier_value == "scout":
raise HTTPException(
status_code=403,
detail="Yield is not available on Scout plan. Upgrade to Trader or Tycoon.",
)
max_yield_domains = 5 if tier_value == "trader" else 10_000_000
user_domain_count = (
await db.execute(
select(func.count(YieldDomain.id)).where(YieldDomain.user_id == current_user.id)
)
).scalar() or 0
if user_domain_count >= max_yield_domains:
raise HTTPException(
status_code=403,
detail=f"Yield domain limit reached for your plan ({max_yield_domains}).",
)
# Check if domain already exists in yield system
existing_result = await db.execute(
select(YieldDomain).where(YieldDomain.domain == domain)
)
existing = existing_result.scalar_one_or_none()
if existing:
if existing.user_id == current_user.id:
raise HTTPException(
status_code=400,
detail="Domain already activated for yield"
)
else:
raise HTTPException(
status_code=400,
detail="Domain is already registered by another user"
)
# Analyze domain intent
intent_result = detect_domain_intent(domain)
value_estimate = get_intent_detector().estimate_value(domain)
# Create yield domain record
yield_domain = YieldDomain(
user_id=current_user.id,
domain=domain,
detected_intent=f"{intent_result.category}_{intent_result.subcategory}" if intent_result.subcategory else intent_result.category,
intent_confidence=intent_result.confidence,
intent_keywords=json.dumps(intent_result.keywords_matched),
status="pending",
)
# Find best matching partner
if intent_result.suggested_partners:
partner_result = await db.execute(
select(AffiliatePartner).where(
AffiliatePartner.slug == intent_result.suggested_partners[0],
AffiliatePartner.is_active == True,
)
)
partner = partner_result.scalar_one_or_none()
if partner:
yield_domain.partner_id = partner.id
yield_domain.active_route = partner.slug
db.add(yield_domain)
await db.commit()
await db.refresh(yield_domain)
# Create DNS instructions
yield_nameservers = settings.yield_nameserver_list
if not yield_nameservers:
raise HTTPException(status_code=500, detail="Yield nameservers are not configured on server.")
dns_instructions = DNSSetupInstructions(
domain=domain,
nameservers=yield_nameservers,
cname_host="@",
cname_target=settings.yield_cname_target,
verification_url=f"{settings.site_url}/api/v1/yield/domains/{yield_domain.id}/verify",
)
return ActivateYieldResponse(
domain_id=yield_domain.id,
domain=domain,
status=yield_domain.status,
intent=IntentAnalysis(
category=intent_result.category,
subcategory=intent_result.subcategory,
confidence=intent_result.confidence,
keywords_matched=intent_result.keywords_matched,
suggested_partners=intent_result.suggested_partners,
monetization_potential=intent_result.monetization_potential,
),
value_estimate=YieldValueEstimate(
estimated_monthly_min=value_estimate["estimated_monthly_min"],
estimated_monthly_max=value_estimate["estimated_monthly_max"],
currency=value_estimate["currency"],
potential=value_estimate["potential"],
confidence=value_estimate["confidence"],
geo=value_estimate["geo"],
),
dns_instructions=dns_instructions,
message="Domain registered! Point your DNS to our nameservers to complete activation.",
)
@router.post("/domains/{domain_id}/verify", response_model=DNSVerificationResult)
async def verify_domain_dns(
domain_id: int,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Verify DNS configuration for a yield domain.
"""
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
# Production-grade DNS check
check = verify_yield_dns(
domain=domain.domain,
expected_nameservers=settings.yield_nameserver_list,
cname_target=settings.yield_cname_target,
)
verified = check.verified
actual_ns = check.actual_ns
error = check.error
# Update domain status
if verified and not domain.dns_verified:
domain.dns_verified = True
domain.dns_verified_at = datetime.utcnow()
domain.connected_at = domain.dns_verified_at
domain.status = "active"
domain.activated_at = datetime.utcnow()
await track_event(
db,
event_name="yield_connected",
request=None,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=domain.domain,
yield_domain_id=domain.id,
metadata={"method": check.method, "cname_ok": check.cname_ok, "actual_ns": check.actual_ns},
)
await db.commit()
return DNSVerificationResult(
domain=domain.domain,
verified=verified,
expected_ns=settings.yield_nameserver_list,
actual_ns=actual_ns,
cname_ok=check.cname_ok if verified else False,
error=error,
checked_at=datetime.utcnow(),
)
@router.patch("/domains/{domain_id}", response_model=YieldDomainResponse)
async def update_yield_domain(
domain_id: int,
update: YieldDomainUpdate,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Update yield domain settings.
"""
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
# Apply updates
if update.active_route is not None:
# Validate partner exists
partner_result = await db.execute(
select(AffiliatePartner).where(
AffiliatePartner.slug == update.active_route,
AffiliatePartner.is_active == True,
)
)
partner = partner_result.scalar_one_or_none()
if not partner:
raise HTTPException(status_code=400, detail="Invalid partner route")
domain.active_route = update.active_route
domain.partner_id = partner.id
if update.landing_page_url is not None:
domain.landing_page_url = update.landing_page_url
if update.status is not None:
if update.status == "paused":
domain.status = "paused"
domain.paused_at = datetime.utcnow()
elif update.status == "active" and domain.dns_verified:
domain.status = "active"
domain.paused_at = None
await db.commit()
await db.refresh(domain)
return _domain_to_response(domain)
@router.delete("/domains/{domain_id}")
async def delete_yield_domain(
domain_id: int,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Remove a domain from yield program.
"""
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
await db.delete(domain)
await db.commit()
return {"message": "Yield domain removed"}
# ============================================================================
# Transactions
# ============================================================================
@router.get("/transactions", response_model=YieldTransactionListResponse)
async def list_transactions(
domain_id: Optional[int] = Query(None),
status: Optional[str] = Query(None),
limit: int = Query(50, le=100),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List yield transactions for user's domains.
"""
# Get user's domain IDs
domain_ids_result = await db.execute(
select(YieldDomain.id).where(YieldDomain.user_id == current_user.id)
)
domain_ids = [row[0] for row in domain_ids_result.all()]
if not domain_ids:
return YieldTransactionListResponse(
transactions=[],
total=0,
total_gross=Decimal("0"),
total_net=Decimal("0"),
)
query = select(YieldTransaction).where(
YieldTransaction.yield_domain_id.in_(domain_ids)
)
if domain_id:
query = query.where(YieldTransaction.yield_domain_id == domain_id)
if status:
query = query.where(YieldTransaction.status == status)
# Get count
count_query = select(func.count(YieldTransaction.id)).where(
YieldTransaction.yield_domain_id.in_(domain_ids)
)
if domain_id:
count_query = count_query.where(YieldTransaction.yield_domain_id == domain_id)
if status:
count_query = count_query.where(YieldTransaction.status == status)
count_result = await db.execute(count_query)
total = count_result.scalar() or 0
# Get transactions
result = await db.execute(
query.order_by(YieldTransaction.created_at.desc()).offset(offset).limit(limit)
)
transactions = list(result.scalars().all())
# Aggregates
total_gross = sum(tx.gross_amount for tx in transactions)
total_net = sum(tx.net_amount for tx in transactions)
return YieldTransactionListResponse(
transactions=[_tx_to_response(tx) for tx in transactions],
total=total,
total_gross=total_gross,
total_net=total_net,
)
# ============================================================================
# Payouts
# ============================================================================
@router.get("/payouts", response_model=YieldPayoutListResponse)
async def list_payouts(
status: Optional[str] = Query(None),
limit: int = Query(20, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List user's yield payouts.
"""
query = select(YieldPayout).where(YieldPayout.user_id == current_user.id)
if status:
query = query.where(YieldPayout.status == status)
# Get count
count_result = await db.execute(
select(func.count(YieldPayout.id)).where(YieldPayout.user_id == current_user.id)
)
total = count_result.scalar() or 0
# Get payouts
result = await db.execute(
query.order_by(YieldPayout.created_at.desc()).offset(offset).limit(limit)
)
payouts = list(result.scalars().all())
# Aggregates
total_paid = sum(p.amount for p in payouts if p.status == "completed")
total_pending = sum(p.amount for p in payouts if p.status in ["pending", "processing"])
return YieldPayoutListResponse(
payouts=[_payout_to_response(p) for p in payouts],
total=total,
total_paid=total_paid,
total_pending=total_pending,
)
# ============================================================================
# Partners (Public info)
# ============================================================================
@router.get("/partners", response_model=list[AffiliatePartnerResponse])
async def list_partners(
category: Optional[str] = Query(None, description="Filter by intent category"),
db: AsyncSession = Depends(get_db),
):
"""
List available affiliate partners.
"""
result = await db.execute(
select(AffiliatePartner)
.where(AffiliatePartner.is_active == True)
.order_by(AffiliatePartner.priority.desc())
)
partners = list(result.scalars().all())
# Filter by category if specified
if category:
partners = [p for p in partners if category in p.intent_list]
return [
AffiliatePartnerResponse(
slug=p.slug,
name=p.name,
network=p.network,
intent_categories=p.intent_list,
geo_countries=p.country_list,
payout_type=p.payout_type,
description=p.description,
logo_url=p.logo_url,
)
for p in partners
]
# ============================================================================
# Helpers
# ============================================================================
def _domain_to_response(domain: YieldDomain) -> YieldDomainResponse:
"""Convert YieldDomain model to response schema."""
return YieldDomainResponse(
id=domain.id,
domain=domain.domain,
status=domain.status,
detected_intent=domain.detected_intent,
intent_confidence=domain.intent_confidence,
active_route=domain.active_route,
partner_name=domain.partner.name if domain.partner else None,
dns_verified=domain.dns_verified,
dns_verified_at=domain.dns_verified_at,
connected_at=getattr(domain, "connected_at", None),
total_clicks=domain.total_clicks,
total_conversions=domain.total_conversions,
total_revenue=domain.total_revenue,
currency=domain.currency,
activated_at=domain.activated_at,
created_at=domain.created_at,
)
def _tx_to_response(tx: YieldTransaction) -> YieldTransactionResponse:
"""Convert YieldTransaction model to response schema."""
return YieldTransactionResponse(
id=tx.id,
event_type=tx.event_type,
partner_slug=tx.partner_slug,
click_id=getattr(tx, "click_id", None),
gross_amount=tx.gross_amount,
net_amount=tx.net_amount,
currency=tx.currency,
status=tx.status,
geo_country=tx.geo_country,
created_at=tx.created_at,
confirmed_at=tx.confirmed_at,
)
def _payout_to_response(payout: YieldPayout) -> YieldPayoutResponse:
"""Convert YieldPayout model to response schema."""
return YieldPayoutResponse(
id=payout.id,
amount=payout.amount,
currency=payout.currency,
period_start=payout.period_start,
period_end=payout.period_end,
transaction_count=payout.transaction_count,
status=payout.status,
payment_method=payout.payment_method,
payment_reference=payout.payment_reference,
created_at=payout.created_at,
completed_at=payout.completed_at,
)

View File

@ -1,188 +0,0 @@
"""
Admin endpoints for Yield payouts (ledger).
Premium constraints:
- No placeholder payouts
- No currency mixing
- Idempotent generation per (user, currency, period)
"""
from __future__ import annotations
from datetime import datetime
from decimal import Decimal
from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel, Field
from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user, get_db
from app.models.user import User
from app.models.yield_domain import YieldPayout, YieldTransaction
from app.services.telemetry import track_event
from app.services.yield_payouts import generate_payouts_for_period
router = APIRouter(prefix="/yield", tags=["yield-admin"])
class PayoutGenerateRequest(BaseModel):
period_start: datetime
period_end: datetime
class GeneratedPayout(BaseModel):
id: int
user_id: int
amount: Decimal
currency: str
period_start: datetime
period_end: datetime
transaction_count: int
status: str
created_at: datetime
class PayoutGenerateResponse(BaseModel):
created: list[GeneratedPayout]
skipped_existing: int = 0
class PayoutCompleteRequest(BaseModel):
payment_method: str | None = Field(default=None, max_length=50)
payment_reference: str | None = Field(default=None, max_length=200)
class PayoutCompleteResponse(BaseModel):
payout_id: int
transactions_marked_paid: int
completed_at: datetime
def _require_admin(current_user: User) -> None:
if not current_user.is_admin:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required")
@router.post("/payouts/generate", response_model=PayoutGenerateResponse)
async def generate_payouts(
payload: PayoutGenerateRequest,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Create YieldPayout rows for confirmed, unpaid transactions in the period.
This does NOT mark payouts as completed. It only assigns transactions to a payout via payout_id.
Completion is a separate step once payment is executed.
"""
_require_admin(current_user)
if payload.period_end <= payload.period_start:
raise HTTPException(status_code=400, detail="period_end must be after period_start")
created_count, skipped_existing = await generate_payouts_for_period(
db,
period_start=payload.period_start,
period_end=payload.period_end,
)
payouts = (
await db.execute(
select(YieldPayout)
.where(
and_(
YieldPayout.period_start == payload.period_start,
YieldPayout.period_end == payload.period_end,
)
)
.order_by(YieldPayout.created_at.desc())
)
).scalars().all()
created = [
GeneratedPayout(
id=p.id,
user_id=p.user_id,
amount=p.amount,
currency=p.currency,
period_start=p.period_start,
period_end=p.period_end,
transaction_count=p.transaction_count,
status=p.status,
created_at=p.created_at,
)
for p in payouts
]
# created_count is still returned implicitly via list length; we keep it for logs later
_ = created_count
return PayoutGenerateResponse(created=created, skipped_existing=skipped_existing)
@router.post("/payouts/{payout_id}/complete", response_model=PayoutCompleteResponse)
async def complete_payout(
payout_id: int,
payload: PayoutCompleteRequest,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Mark a payout as completed and mark assigned transactions as paid.
"""
_require_admin(current_user)
payout = (
await db.execute(select(YieldPayout).where(YieldPayout.id == payout_id))
).scalar_one_or_none()
if not payout:
raise HTTPException(status_code=404, detail="Payout not found")
if payout.status == "completed":
raise HTTPException(status_code=400, detail="Payout already completed")
payout.status = "completed"
payout.completed_at = datetime.utcnow()
payout.payment_method = payload.payment_method
payout.payment_reference = payload.payment_reference
txs = (
await db.execute(
select(YieldTransaction).where(YieldTransaction.payout_id == payout.id)
)
).scalars().all()
marked = 0
for tx in txs:
if tx.status != "paid":
tx.status = "paid"
tx.paid_at = payout.completed_at
marked += 1
await track_event(
db,
event_name="payout_paid",
request=None,
user_id=payout.user_id,
is_authenticated=None,
source="admin",
domain=None,
yield_domain_id=None,
metadata={
"payout_id": payout.id,
"currency": payout.currency,
"amount": float(payout.amount),
"transaction_count": payout.transaction_count,
"payment_method": payout.payment_method,
},
)
await db.commit()
return PayoutCompleteResponse(
payout_id=payout.id,
transactions_marked_paid=marked,
completed_at=payout.completed_at,
)

View File

@ -1,308 +0,0 @@
"""
Yield Domain Routing API.
This handles incoming HTTP requests to yield domains:
1. Detect the domain from the Host header
2. Look up the yield configuration
3. Track the click
4. Redirect to the appropriate affiliate landing page
In production, this runs on a separate subdomain or IP (yield.pounce.io)
that yield domains CNAME to.
"""
import logging
from datetime import datetime, timedelta
from decimal import Decimal
from typing import Optional
from uuid import uuid4
from fastapi import APIRouter, Depends, HTTPException, Query, Request
from fastapi.responses import RedirectResponse
from sqlalchemy import and_, func, or_, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db
from app.config import get_settings
from app.models.yield_domain import YieldDomain, YieldTransaction, AffiliatePartner
from app.services.intent_detector import detect_domain_intent
from app.services.telemetry import track_event
logger = logging.getLogger(__name__)
settings = get_settings()
router = APIRouter(prefix="/r", tags=["yield-routing"])
# Revenue split
USER_REVENUE_SHARE = Decimal("0.70")
def hash_ip(ip: str) -> str:
"""Hash IP for privacy-compliant storage."""
import hashlib
# Salt to prevent trivial rainbow table lookups.
return hashlib.sha256(f"{ip}|{settings.secret_key}".encode()).hexdigest()[:32]
def _get_client_ip(request: Request) -> Optional[str]:
# Prefer proxy headers when behind nginx
xff = request.headers.get("x-forwarded-for")
if xff:
# first IP in list
ip = xff.split(",")[0].strip()
if ip:
return ip
cf_ip = request.headers.get("cf-connecting-ip")
if cf_ip:
return cf_ip.strip()
return request.client.host if request.client else None
def _safe_tracking_url(template: str, *, click_id: str, domain: str, domain_id: int, partner: str) -> str:
try:
return template.format(
click_id=click_id,
domain=domain,
domain_id=domain_id,
partner=partner,
)
except KeyError as e:
raise HTTPException(
status_code=500,
detail=f"Partner tracking_url_template uses unsupported placeholder: {str(e)}",
)
def generate_tracking_url(
partner: AffiliatePartner,
yield_domain: YieldDomain,
click_id: str,
) -> str:
"""
Generate the tracking URL for a partner.
Most affiliate networks expect parameters like:
- clickid / subid: Our click tracking ID
- ref: Domain name or user reference
"""
if not partner.tracking_url_template:
raise HTTPException(
status_code=503,
detail=f"Partner routing not configured for {partner.slug}. Missing tracking_url_template.",
)
return _safe_tracking_url(
partner.tracking_url_template,
click_id=click_id,
domain=yield_domain.domain,
domain_id=yield_domain.id,
partner=partner.slug,
)
@router.get("/{domain}")
async def route_yield_domain(
domain: str,
request: Request,
db: AsyncSession = Depends(get_db),
direct: bool = Query(True, description="Direct redirect without landing page"),
):
"""
Route traffic for a yield domain.
This is the main entry point for yield domain traffic.
Query params:
- direct: If true, redirect immediately without landing page
"""
domain = domain.lower().strip()
# Find yield domain (must be connected + active)
yield_domain = (
await db.execute(
select(YieldDomain).where(
and_(
YieldDomain.domain == domain,
YieldDomain.status == "active",
YieldDomain.dns_verified == True,
or_(YieldDomain.connected_at.is_not(None), YieldDomain.dns_verified_at.is_not(None)),
)
)
)
).scalar_one_or_none()
if not yield_domain:
logger.warning(f"Route request for unknown/inactive/unconnected domain: {domain}")
raise HTTPException(status_code=404, detail="Domain not active for yield routing.")
# Resolve partner
partner: Optional[AffiliatePartner] = None
if yield_domain.partner_id:
partner = (
await db.execute(
select(AffiliatePartner).where(
and_(
AffiliatePartner.id == yield_domain.partner_id,
AffiliatePartner.is_active == True,
)
)
)
).scalar_one_or_none()
if not partner and yield_domain.detected_intent:
# Match full detected intent first (e.g. medical_dental)
partner = (
await db.execute(
select(AffiliatePartner)
.where(
and_(
AffiliatePartner.is_active == True,
AffiliatePartner.intent_categories.ilike(f"%{yield_domain.detected_intent}%"),
)
)
.order_by(AffiliatePartner.priority.desc())
)
).scalar_one_or_none()
if not partner:
raise HTTPException(status_code=503, detail="No active partner available for this domain intent.")
# Rate limit: max 120 clicks/10min per IP per domain
client_ip = _get_client_ip(request)
ip_hash = hash_ip(client_ip) if client_ip else None
if ip_hash:
cutoff = datetime.utcnow() - timedelta(minutes=10)
recent = (
await db.execute(
select(func.count(YieldTransaction.id)).where(
and_(
YieldTransaction.yield_domain_id == yield_domain.id,
YieldTransaction.event_type == "click",
YieldTransaction.ip_hash == ip_hash,
YieldTransaction.created_at >= cutoff,
)
)
)
).scalar() or 0
if recent >= 120:
raise HTTPException(status_code=429, detail="Too many requests. Please slow down.")
# Compute click economics (only CPC can be accounted immediately)
gross = Decimal("0")
net = Decimal("0")
currency = (partner.payout_currency or "CHF").upper()
if (partner.payout_type or "").lower() == "cpc":
gross = partner.payout_amount or Decimal("0")
net = (gross * USER_REVENUE_SHARE).quantize(Decimal("0.01"))
click_id = uuid4().hex
destination_url = generate_tracking_url(partner, yield_domain, click_id)
user_agent = request.headers.get("user-agent")
referrer = request.headers.get("referer")
geo_country = request.headers.get("cf-ipcountry") or request.headers.get("x-country")
geo_country = geo_country.strip().upper() if geo_country else None
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type="click",
partner_slug=partner.slug,
click_id=click_id,
destination_url=destination_url[:2000],
gross_amount=gross,
net_amount=net,
currency=currency,
referrer=referrer[:500] if referrer else None,
user_agent=user_agent[:500] if user_agent else None,
geo_country=geo_country[:2] if geo_country else None,
ip_hash=ip_hash,
status="confirmed",
confirmed_at=datetime.utcnow(),
)
db.add(transaction)
yield_domain.total_clicks += 1
yield_domain.last_click_at = datetime.utcnow()
if net > 0:
yield_domain.total_revenue += net
await track_event(
db,
event_name="yield_click",
request=request,
user_id=yield_domain.user_id,
is_authenticated=None,
source="routing",
domain=yield_domain.domain,
yield_domain_id=yield_domain.id,
click_id=click_id,
metadata={"partner": partner.slug, "currency": currency, "net_amount": float(net)},
)
await db.commit()
# Only direct redirect for MVP
return RedirectResponse(url=destination_url, status_code=302)
@router.get("/")
async def yield_routing_info():
"""Info endpoint for yield routing service."""
return {
"service": "Pounce Yield Routing",
"version": "2.0.0",
"docs": f"{settings.site_url}/docs#/yield-routing",
"status": "active",
}
# ============================================================================
# Host-based routing (for production deployment)
# ============================================================================
@router.api_route("/catch-all", methods=["GET", "HEAD"])
async def catch_all_route(
request: Request,
db: AsyncSession = Depends(get_db),
):
"""
Catch-all route for host-based routing.
In production, this endpoint handles requests where the Host header
is the yield domain itself (e.g., zahnarzt-zuerich.ch).
This requires:
1. Yield domains to CNAME to yield.pounce.io
2. Nginx/Caddy to route all hosts to this backend
3. This endpoint to parse the Host header
"""
host = request.headers.get("host", "").lower()
# Remove port if present
if ":" in host:
host = host.split(":")[0]
# Skip our own domains
our_domains = ["pounce.ch", "pounce.io", "localhost", "127.0.0.1"]
if any(host.endswith(d) for d in our_domains):
return {"status": "not a yield domain", "host": host}
# If host matches a connected yield domain, route it
_ = (
await db.execute(
select(YieldDomain.id).where(
and_(
YieldDomain.domain == host,
YieldDomain.status == "active",
YieldDomain.dns_verified == True,
or_(YieldDomain.connected_at.is_not(None), YieldDomain.dns_verified_at.is_not(None)),
)
)
)
).scalar_one_or_none()
if not _:
raise HTTPException(status_code=404, detail="Host not configured for yield routing.")
return RedirectResponse(url=f"/api/v1/r/{host}?direct=true", status_code=302)

View File

@ -1,563 +0,0 @@
"""
Webhook endpoints for Yield affiliate partner callbacks.
Partners call these endpoints to report:
- Clicks (redirect happened)
- Leads (form submitted, signup, etc.)
- Sales (purchase completed)
Each partner may have different authentication methods:
- HMAC signature verification
- API key in header
- IP whitelist
"""
import hashlib
import hmac
import json
import logging
from datetime import datetime
from decimal import Decimal
from typing import Optional
from fastapi import APIRouter, BackgroundTasks, Depends, Header, HTTPException, Request
from pydantic import BaseModel, Field
from sqlalchemy import and_, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db
from app.config import get_settings
from app.models.yield_domain import YieldDomain, YieldTransaction, AffiliatePartner
from app.services.telemetry import track_event
logger = logging.getLogger(__name__)
settings = get_settings()
router = APIRouter(prefix="/yield-webhooks", tags=["yield-webhooks"])
# Revenue split: User gets 70%, Pounce keeps 30%
USER_REVENUE_SHARE = Decimal("0.70")
# ============================================================================
# Schemas
# ============================================================================
class PartnerEvent(BaseModel):
"""Generic partner event payload."""
event_type: str = Field(..., description="click, lead, or sale")
domain: str = Field(..., description="The yield domain that generated this event")
transaction_id: Optional[str] = Field(None, description="Partner's transaction ID")
click_id: Optional[str] = Field(None, description="Pounce click_id for attribution (UUID hex)")
amount: Optional[float] = Field(None, description="Gross commission amount")
currency: Optional[str] = Field("CHF", description="Currency code")
# Optional attribution data
geo_country: Optional[str] = None
referrer: Optional[str] = None
user_agent: Optional[str] = None
# Optional metadata
metadata: Optional[dict] = None
class WebhookResponse(BaseModel):
"""Response for webhook calls."""
success: bool
transaction_id: Optional[int] = None
message: str
# ============================================================================
# Signature Verification Helpers
# ============================================================================
def verify_hmac_signature(
payload: bytes,
signature: str,
secret: str,
algorithm: str = "sha256"
) -> bool:
"""Verify HMAC signature for webhook payload."""
expected = hmac.new(
secret.encode(),
payload,
hashlib.sha256 if algorithm == "sha256" else hashlib.sha1
).hexdigest()
return hmac.compare_digest(signature, expected)
def hash_ip(ip: str) -> str:
"""Hash IP address for privacy-compliant storage."""
return hashlib.sha256(f"{ip}|{settings.secret_key}".encode()).hexdigest()[:32]
def _get_webhook_secret(partner_slug: str) -> Optional[str]:
"""
Webhook secrets are configured via environment:
- YIELD_WEBHOOK_SECRET (global default)
- YIELD_WEBHOOK_SECRET_<PARTNER_SLUG_UPPER> (partner-specific override)
"""
import os
specific = os.getenv(f"YIELD_WEBHOOK_SECRET_{partner_slug.upper()}")
if specific:
return specific
return os.getenv("YIELD_WEBHOOK_SECRET") or None
# ============================================================================
# Generic Webhook Endpoint
# ============================================================================
@router.post("/{partner_slug}", response_model=WebhookResponse)
async def receive_partner_webhook(
partner_slug: str,
event: PartnerEvent,
request: Request,
background_tasks: BackgroundTasks,
db: AsyncSession = Depends(get_db),
x_webhook_signature: Optional[str] = Header(None),
x_api_key: Optional[str] = Header(None),
):
"""
Receive webhook callback from affiliate partner.
Partners POST events here when clicks, leads, or sales occur.
"""
# 1. Find partner
partner = (
await db.execute(
select(AffiliatePartner).where(
and_(
AffiliatePartner.slug == partner_slug,
AffiliatePartner.is_active == True,
)
)
)
).scalar_one_or_none()
if not partner:
logger.warning(f"Webhook from unknown partner: {partner_slug}")
raise HTTPException(status_code=404, detail="Unknown partner")
# 2. Verify authentication (strict)
secret = _get_webhook_secret(partner_slug)
if not secret:
raise HTTPException(status_code=503, detail="Webhook secret not configured on server.")
if not x_webhook_signature:
raise HTTPException(status_code=401, detail="Missing webhook signature.")
raw = await request.body()
if not verify_hmac_signature(raw, x_webhook_signature, secret):
raise HTTPException(status_code=401, detail="Invalid webhook signature.")
# 3. Find yield domain (must be active)
yield_domain = (
await db.execute(
select(YieldDomain).where(
and_(
YieldDomain.domain == event.domain.lower(),
YieldDomain.status == "active",
)
)
)
).scalar_one_or_none()
if not yield_domain:
logger.warning(f"Webhook for unknown/inactive domain: {event.domain}")
raise HTTPException(status_code=404, detail="Domain not found or inactive")
# 4. Calculate amounts
gross_amount = Decimal(str(event.amount)) if event.amount else Decimal("0")
net_amount = gross_amount * USER_REVENUE_SHARE
# 5. Get client IP for hashing
client_ip = request.client.host if request.client else None
ip_hash = hash_ip(client_ip) if client_ip else None
# 6. Create transaction
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type=event.event_type,
partner_slug=partner_slug,
partner_transaction_id=event.transaction_id,
click_id=(event.click_id[:64] if event.click_id else None),
gross_amount=gross_amount,
net_amount=net_amount,
currency=event.currency or "CHF",
referrer=event.referrer,
user_agent=event.user_agent,
geo_country=event.geo_country,
ip_hash=ip_hash,
status="pending" if event.event_type in ["lead", "sale"] else "confirmed",
confirmed_at=datetime.utcnow() if event.event_type == "click" else None,
)
db.add(transaction)
# Optional: attribute to an existing click transaction (same yield_domain + click_id)
if event.click_id:
click_tx = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.yield_domain_id == yield_domain.id,
YieldTransaction.event_type == "click",
YieldTransaction.click_id == event.click_id[:64],
)
)
)
).scalar_one_or_none()
if not click_tx:
logger.warning(
f"Webhook received click_id but no matching click found: partner={partner_slug} "
f"domain={yield_domain.domain} click_id={event.click_id[:64]}"
)
# 7. Update domain aggregates
if event.event_type == "click":
yield_domain.total_clicks += 1
yield_domain.last_click_at = datetime.utcnow()
elif event.event_type in ["lead", "sale"]:
yield_domain.total_conversions += 1
yield_domain.last_conversion_at = datetime.utcnow()
# Add revenue when confirmed
if transaction.status == "confirmed":
yield_domain.total_revenue += net_amount
await track_event(
db,
event_name="yield_conversion",
request=request,
user_id=yield_domain.user_id,
is_authenticated=None,
source="webhook",
domain=yield_domain.domain,
yield_domain_id=yield_domain.id,
click_id=event.click_id,
metadata={
"partner": partner_slug,
"event_type": event.event_type,
"status": transaction.status,
"currency": transaction.currency,
"net_amount": float(net_amount),
"partner_transaction_id": event.transaction_id,
},
)
await db.commit()
await db.refresh(transaction)
logger.info(
f"Webhook processed: {partner_slug} -> {event.domain} "
f"({event.event_type}, gross={gross_amount}, net={net_amount})"
)
return WebhookResponse(
success=True,
transaction_id=transaction.id,
message=f"Event {event.event_type} recorded successfully"
)
# ============================================================================
# Awin-Specific Webhook
# ============================================================================
class AwinEvent(BaseModel):
"""Awin network postback format."""
clickRef: str # Our yield domain ID or domain name
transactionId: str
commission: float
commissionCurrency: str = "CHF"
status: str # "pending", "approved", "declined"
transactionType: str # "sale", "lead"
@router.post("/awin/postback", response_model=WebhookResponse)
async def receive_awin_postback(
event: AwinEvent,
request: Request,
db: AsyncSession = Depends(get_db),
x_awin_signature: Optional[str] = Header(None),
):
"""
Receive postback from Awin affiliate network.
Awin sends postbacks for tracked conversions.
"""
# Verify authentication (strict)
secret = _get_webhook_secret("awin")
if not secret:
raise HTTPException(status_code=503, detail="Webhook secret not configured on server.")
if not x_awin_signature:
raise HTTPException(status_code=401, detail="Missing webhook signature.")
raw = await request.body()
if not verify_hmac_signature(raw, x_awin_signature, secret):
raise HTTPException(status_code=401, detail="Invalid webhook signature.")
# Find domain by click reference
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.domain == event.clickRef.lower()))
).scalar_one_or_none()
if not yield_domain:
# Try to find by ID if clickRef is numeric
try:
domain_id = int(event.clickRef)
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.id == domain_id))
).scalar_one_or_none()
except ValueError:
pass
if not yield_domain:
logger.warning(f"Awin postback for unknown domain: {event.clickRef}")
raise HTTPException(status_code=404, detail="Domain not found")
# Calculate amounts
gross_amount = Decimal(str(event.commission))
net_amount = gross_amount * USER_REVENUE_SHARE
# Map Awin status to our status
status_map = {
"pending": "pending",
"approved": "confirmed",
"declined": "rejected",
}
status = status_map.get(event.status.lower(), "pending")
# Create or update transaction
existing_tx = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.partner_transaction_id == event.transactionId,
YieldTransaction.partner_slug.ilike("awin%"),
)
)
)
).scalar_one_or_none()
if existing_tx:
# Update existing transaction
existing_tx.status = status
if status == "confirmed":
existing_tx.confirmed_at = datetime.utcnow()
yield_domain.total_revenue += net_amount
transaction_id = existing_tx.id
else:
# Create new transaction
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type="lead" if event.transactionType.lower() == "lead" else "sale",
partner_slug=f"awin_{yield_domain.active_route or 'unknown'}",
partner_transaction_id=event.transactionId,
gross_amount=gross_amount,
net_amount=net_amount,
currency=event.commissionCurrency,
status=status,
confirmed_at=datetime.utcnow() if status == "confirmed" else None,
)
db.add(transaction)
# Update domain stats
yield_domain.total_conversions += 1
yield_domain.last_conversion_at = datetime.utcnow()
if status == "confirmed":
yield_domain.total_revenue += net_amount
await db.flush()
transaction_id = transaction.id
await db.commit()
logger.info(f"Awin postback processed: {event.transactionId} -> {status}")
return WebhookResponse(
success=True,
transaction_id=transaction_id,
message=f"Awin event processed ({status})"
)
# ============================================================================
# Transaction Confirmation Endpoint (Admin/Internal)
# ============================================================================
@router.post("/confirm/{transaction_id}", response_model=WebhookResponse)
async def confirm_transaction(
transaction_id: int,
db: AsyncSession = Depends(get_db),
x_internal_key: Optional[str] = Header(None),
):
"""
Manually confirm a pending transaction.
Internal endpoint for admin use or automated confirmation.
"""
internal_key = (settings.internal_api_key or "").strip()
if not internal_key:
raise HTTPException(status_code=503, detail="internal_api_key is not configured on server.")
if x_internal_key != internal_key:
raise HTTPException(status_code=401, detail="Unauthorized")
transaction = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.id == transaction_id,
YieldTransaction.status == "pending",
)
)
)
).scalar_one_or_none()
if not transaction:
raise HTTPException(status_code=404, detail="Transaction not found or not pending")
# Confirm transaction
transaction.status = "confirmed"
transaction.confirmed_at = datetime.utcnow()
# Update domain revenue
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.id == transaction.yield_domain_id))
).scalar_one_or_none()
if yield_domain:
yield_domain.total_revenue += transaction.net_amount
await db.commit()
logger.info(f"Transaction {transaction_id} confirmed manually")
return WebhookResponse(
success=True,
transaction_id=transaction_id,
message="Transaction confirmed"
)
# ============================================================================
# Batch Transaction Import (for reconciliation)
# ============================================================================
class BatchTransactionItem(BaseModel):
"""Single transaction in batch import."""
domain: str
event_type: str
partner_slug: str
transaction_id: str
click_id: Optional[str] = None
gross_amount: float
currency: str = "CHF"
status: str = "confirmed"
created_at: Optional[str] = None
class BatchImportRequest(BaseModel):
"""Batch transaction import request."""
transactions: list[BatchTransactionItem]
class BatchImportResponse(BaseModel):
"""Batch import response."""
success: bool
imported: int
skipped: int
errors: list[str]
@router.post("/batch-import", response_model=BatchImportResponse)
async def batch_import_transactions(
request_data: BatchImportRequest,
db: AsyncSession = Depends(get_db),
x_internal_key: Optional[str] = Header(None),
):
"""
Batch import transactions for reconciliation.
Internal endpoint for importing partner reports.
"""
internal_key = (settings.internal_api_key or "").strip()
if not internal_key:
raise HTTPException(status_code=503, detail="internal_api_key is not configured on server.")
if x_internal_key != internal_key:
raise HTTPException(status_code=401, detail="Unauthorized")
imported = 0
skipped = 0
errors = []
for item in request_data.transactions:
try:
# Find domain
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.domain == item.domain.lower()))
).scalar_one_or_none()
if not yield_domain:
errors.append(f"Domain not found: {item.domain}")
skipped += 1
continue
# Check for duplicate
existing = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.partner_transaction_id == item.transaction_id,
YieldTransaction.partner_slug == item.partner_slug,
)
)
)
).scalar_one_or_none()
if existing:
skipped += 1
continue
# Create transaction
gross = Decimal(str(item.gross_amount))
net = gross * USER_REVENUE_SHARE
tx = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type=item.event_type,
partner_slug=item.partner_slug,
partner_transaction_id=item.transaction_id,
click_id=(item.click_id[:64] if item.click_id else None),
gross_amount=gross,
net_amount=net,
currency=item.currency,
status=item.status,
confirmed_at=datetime.utcnow() if item.status == "confirmed" else None,
)
db.add(tx)
# Update domain stats
if item.event_type == "click":
yield_domain.total_clicks += 1
else:
yield_domain.total_conversions += 1
if item.status == "confirmed":
yield_domain.total_revenue += net
imported += 1
except Exception as e:
errors.append(f"Error importing {item.domain}/{item.transaction_id}: {str(e)}")
skipped += 1
await db.commit()
return BatchImportResponse(
success=len(errors) == 0,
imported=imported,
skipped=skipped,
errors=errors[:10] # Limit error messages
)

View File

@ -17,11 +17,6 @@ class Settings(BaseSettings):
# App Settings
app_name: str = "DomainWatch"
debug: bool = True
site_url: str = "https://pounce.ch" # Base URL for links in emails/API responses
# Internal admin operations (server-to-server / cron)
# MUST be set in production; used for protected internal endpoints.
internal_api_key: str = ""
# Email Settings (optional)
smtp_host: str = ""
@ -37,64 +32,6 @@ class Settings(BaseSettings):
check_hour: int = 6
check_minute: int = 0
scheduler_check_interval_hours: int = 24
enable_scheduler: bool = False # Run APScheduler jobs in this process (recommend: separate scheduler process)
# Job Queue / Redis (Phase 2)
redis_url: str = "" # e.g. redis://redis:6379/0
enable_job_queue: bool = False
# Observability (Phase 2)
enable_metrics: bool = True
metrics_path: str = "/metrics"
enable_db_query_metrics: bool = False
enable_business_metrics: bool = True
business_metrics_days: int = 30
business_metrics_cache_seconds: int = 60
# Ops / Backups (4B)
enable_db_backups: bool = False
backup_dir: str = "backups"
backup_retention_days: int = 14
# Ops / Alerting (4B) - no Docker required
ops_alerts_enabled: bool = False
ops_alert_recipients: str = "" # comma-separated emails; if empty -> CONTACT_EMAIL env fallback
ops_alert_cooldown_minutes: int = 180
ops_alert_backup_stale_seconds: int = 93600 # ~26h
# Rate limiting storage (SlowAPI / limits). Use Redis in production.
rate_limit_storage_uri: str = "memory://"
# =================================
# Referral rewards / Anti-fraud (3C.2)
# =================================
referral_rewards_enabled: bool = True
referral_rewards_cooldown_days: int = 7
referral_rewards_ip_window_days: int = 30
referral_rewards_require_ip_hash: bool = True
# =================================
# Yield / Intent Routing
# =================================
# Comma-separated list of nameservers the user must delegate to for Yield.
# Example: "ns1.pounce.io,ns2.pounce.io"
yield_nameservers: str = "ns1.pounce.io,ns2.pounce.io"
# CNAME/ALIAS target for simpler DNS setup (provider-dependent).
# Example: "yield.pounce.io"
yield_cname_target: str = "yield.pounce.io"
@property
def yield_nameserver_list(self) -> list[str]:
return [
ns.strip().lower()
for ns in (self.yield_nameservers or "").split(",")
if ns.strip()
]
# Database pooling (PostgreSQL)
db_pool_size: int = 5
db_max_overflow: int = 10
db_pool_timeout: int = 30
# =================================
# External API Credentials
@ -117,13 +54,6 @@ class Settings(BaseSettings):
moz_access_id: str = ""
moz_secret_key: str = ""
# ICANN CZDS (Centralized Zone Data Service)
# For downloading gTLD zone files (.com, .net, .org, etc.)
# Register at: https://czds.icann.org/
czds_username: str = ""
czds_password: str = ""
czds_data_dir: str = "/tmp/pounce_czds"
class Config:
env_file = ".env"
env_file_encoding = "utf-8"

View File

@ -7,22 +7,11 @@ from app.config import get_settings
settings = get_settings()
# Create async engine
engine_kwargs = {
"echo": settings.debug,
"future": True,
}
# Production hardening: enable connection pooling for Postgres
if settings.database_url.startswith("postgresql"):
engine_kwargs.update(
{
"pool_size": settings.db_pool_size,
"max_overflow": settings.db_max_overflow,
"pool_timeout": settings.db_pool_timeout,
"pool_pre_ping": True,
}
)
engine = create_async_engine(settings.database_url, **engine_kwargs)
engine = create_async_engine(
settings.database_url,
echo=settings.debug,
future=True,
)
# Create async session factory
AsyncSessionLocal = async_sessionmaker(
@ -56,7 +45,4 @@ async def init_db():
"""Initialize database tables."""
async with engine.begin() as conn:
await conn.run_sync(Base.metadata.create_all)
# Apply additive migrations (indexes / optional columns) for existing DBs
from app.db_migrations import apply_migrations
await apply_migrations(conn)

View File

@ -1,371 +0,0 @@
"""
Lightweight, idempotent DB migrations.
This project historically used `Base.metadata.create_all()` for bootstrapping new installs.
That does NOT handle schema evolution on existing databases. For performance-related changes
(indexes, new optional columns), we apply additive migrations on startup.
Important:
- Only additive changes (ADD COLUMN / CREATE INDEX) should live here.
- Operations must be idempotent (safe to run on every startup).
"""
from __future__ import annotations
import logging
from typing import Any
from sqlalchemy import text
from sqlalchemy.ext.asyncio import AsyncConnection
logger = logging.getLogger(__name__)
async def _sqlite_table_exists(conn: AsyncConnection, table: str) -> bool:
res = await conn.execute(
text("SELECT 1 FROM sqlite_master WHERE type='table' AND name=:name LIMIT 1"),
{"name": table},
)
return res.scalar() is not None
async def _sqlite_has_column(conn: AsyncConnection, table: str, column: str) -> bool:
res = await conn.execute(text(f"PRAGMA table_info({table})"))
rows = res.fetchall()
# PRAGMA table_info: (cid, name, type, notnull, dflt_value, pk)
return any(r[1] == column for r in rows)
async def _postgres_table_exists(conn: AsyncConnection, table: str) -> bool:
# to_regclass returns NULL if the relation does not exist
res = await conn.execute(text("SELECT to_regclass(:name)"), {"name": table})
return res.scalar() is not None
async def _postgres_has_column(conn: AsyncConnection, table: str, column: str) -> bool:
res = await conn.execute(
text(
"""
SELECT 1
FROM information_schema.columns
WHERE table_schema = current_schema()
AND table_name = :table
AND column_name = :column
LIMIT 1
"""
),
{"table": table, "column": column},
)
return res.scalar() is not None
async def _table_exists(conn: AsyncConnection, table: str) -> bool:
dialect = conn.engine.dialect.name
if dialect == "sqlite":
return await _sqlite_table_exists(conn, table)
return await _postgres_table_exists(conn, table)
async def _has_column(conn: AsyncConnection, table: str, column: str) -> bool:
dialect = conn.engine.dialect.name
if dialect == "sqlite":
return await _sqlite_has_column(conn, table, column)
return await _postgres_has_column(conn, table, column)
async def apply_migrations(conn: AsyncConnection) -> None:
"""
Apply idempotent migrations.
Called on startup after `create_all()` to keep existing DBs up-to-date.
"""
dialect = conn.engine.dialect.name
logger.info("DB migrations: starting (dialect=%s)", dialect)
# ------------------------------------------------------------------
# 1) domain_auctions.pounce_score (enables DB-level sorting/pagination)
# ------------------------------------------------------------------
if await _table_exists(conn, "domain_auctions"):
if not await _has_column(conn, "domain_auctions", "pounce_score"):
logger.info("DB migrations: adding column domain_auctions.pounce_score")
await conn.execute(text("ALTER TABLE domain_auctions ADD COLUMN pounce_score INTEGER"))
# Index for feed ordering
await conn.execute(
text("CREATE INDEX IF NOT EXISTS ix_domain_auctions_pounce_score ON domain_auctions(pounce_score)")
)
# ---------------------------------------------------------
# 2) domain_checks index for history queries (watchlist UI)
# ---------------------------------------------------------
if await _table_exists(conn, "domain_checks"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_domain_checks_domain_id_checked_at "
"ON domain_checks(domain_id, checked_at)"
)
)
# ---------------------------------------------------
# 3) tld_prices composite index for trend computations
# ---------------------------------------------------
if await _table_exists(conn, "tld_prices"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_tld_prices_tld_registrar_recorded_at "
"ON tld_prices(tld, registrar, recorded_at)"
)
)
# ----------------------------------------------------
# 4) domain_listings pounce_score index (market sorting)
# ----------------------------------------------------
if await _table_exists(conn, "domain_listings"):
if not await _has_column(conn, "domain_listings", "sold_at"):
logger.info("DB migrations: adding column domain_listings.sold_at")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_at DATETIME"))
if not await _has_column(conn, "domain_listings", "sold_reason"):
logger.info("DB migrations: adding column domain_listings.sold_reason")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_reason VARCHAR(200)"))
if not await _has_column(conn, "domain_listings", "sold_price"):
logger.info("DB migrations: adding column domain_listings.sold_price")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_price FLOAT"))
if not await _has_column(conn, "domain_listings", "sold_currency"):
logger.info("DB migrations: adding column domain_listings.sold_currency")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_currency VARCHAR(3)"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_domain_listings_pounce_score "
"ON domain_listings(pounce_score)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_domain_listings_status "
"ON domain_listings(status)"
)
)
# ----------------------------------------------------
# 4b) listing_inquiries: deal workflow + audit trail
# ----------------------------------------------------
if await _table_exists(conn, "listing_inquiries"):
if not await _has_column(conn, "listing_inquiries", "buyer_user_id"):
logger.info("DB migrations: adding column listing_inquiries.buyer_user_id")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN buyer_user_id INTEGER"))
if not await _has_column(conn, "listing_inquiries", "closed_at"):
logger.info("DB migrations: adding column listing_inquiries.closed_at")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN closed_at DATETIME"))
if not await _has_column(conn, "listing_inquiries", "closed_reason"):
logger.info("DB migrations: adding column listing_inquiries.closed_reason")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN closed_reason VARCHAR(200)"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_listing_created "
"ON listing_inquiries(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_listing_status "
"ON listing_inquiries(listing_id, status)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_buyer_user "
"ON listing_inquiries(buyer_user_id)"
)
)
# The table itself is created by `Base.metadata.create_all()` on startup.
# Here we only add indexes (idempotent) for existing DBs.
if await _table_exists(conn, "listing_inquiry_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_events_inquiry_created "
"ON listing_inquiry_events(inquiry_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_events_listing_created "
"ON listing_inquiry_events(listing_id, created_at)"
)
)
if await _table_exists(conn, "listing_inquiry_messages"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_inquiry_created "
"ON listing_inquiry_messages(inquiry_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_listing_created "
"ON listing_inquiry_messages(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_sender_created "
"ON listing_inquiry_messages(sender_user_id, created_at)"
)
)
# ----------------------------------------------------
# 5) Yield tables indexes
# ----------------------------------------------------
if await _table_exists(conn, "yield_domains"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_domains_user_status "
"ON yield_domains(user_id, status)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_domains_domain "
"ON yield_domains(domain)"
)
)
if not await _has_column(conn, "yield_domains", "connected_at"):
logger.info("DB migrations: adding column yield_domains.connected_at")
await conn.execute(text("ALTER TABLE yield_domains ADD COLUMN connected_at DATETIME"))
if await _table_exists(conn, "yield_transactions"):
if not await _has_column(conn, "yield_transactions", "click_id"):
logger.info("DB migrations: adding column yield_transactions.click_id")
await conn.execute(text("ALTER TABLE yield_transactions ADD COLUMN click_id VARCHAR(64)"))
await conn.execute(text("CREATE INDEX IF NOT EXISTS ix_yield_transactions_click_id ON yield_transactions(click_id)"))
if not await _has_column(conn, "yield_transactions", "destination_url"):
logger.info("DB migrations: adding column yield_transactions.destination_url")
await conn.execute(text("ALTER TABLE yield_transactions ADD COLUMN destination_url TEXT"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_tx_domain_created "
"ON yield_transactions(yield_domain_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_tx_status_created "
"ON yield_transactions(status, created_at)"
)
)
if await _table_exists(conn, "yield_payouts"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_payouts_user_status "
"ON yield_payouts(user_id, status)"
)
)
# ----------------------------------------------------
# 6) Referral rewards: subscriptions.referral_bonus_domains (3C.2)
# ----------------------------------------------------
if await _table_exists(conn, "subscriptions"):
if not await _has_column(conn, "subscriptions", "referral_bonus_domains"):
logger.info("DB migrations: adding column subscriptions.referral_bonus_domains")
await conn.execute(
text(
"ALTER TABLE subscriptions "
"ADD COLUMN referral_bonus_domains INTEGER NOT NULL DEFAULT 0"
)
)
# ----------------------------------------------------
# 6) Telemetry events indexes
# ----------------------------------------------------
if await _table_exists(conn, "telemetry_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_event_name_created "
"ON telemetry_events(event_name, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_user_created "
"ON telemetry_events(user_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_listing_created "
"ON telemetry_events(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_yield_created "
"ON telemetry_events(yield_domain_id, created_at)"
)
)
# ----------------------------------------------------
# 6b) Ops alert events (persisted cooldown + history)
# ----------------------------------------------------
# NOTE: Table is created by Base.metadata.create_all() for new installs.
# Here we ensure indexes exist for older DBs.
if await _table_exists(conn, "ops_alert_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_ops_alert_key_created "
"ON ops_alert_events(alert_key, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_ops_alert_status_created "
"ON ops_alert_events(status, created_at)"
)
)
# ----------------------------------------------------
# 7) User referral tracking columns
# ----------------------------------------------------
if await _table_exists(conn, "users"):
if not await _has_column(conn, "users", "referred_by_user_id"):
logger.info("DB migrations: adding column users.referred_by_user_id")
await conn.execute(text("ALTER TABLE users ADD COLUMN referred_by_user_id INTEGER"))
if not await _has_column(conn, "users", "referred_by_domain"):
logger.info("DB migrations: adding column users.referred_by_domain")
await conn.execute(text("ALTER TABLE users ADD COLUMN referred_by_domain VARCHAR(255)"))
if not await _has_column(conn, "users", "referral_code"):
logger.info("DB migrations: adding column users.referral_code")
await conn.execute(text("ALTER TABLE users ADD COLUMN referral_code VARCHAR(100)"))
if not await _has_column(conn, "users", "invite_code"):
logger.info("DB migrations: adding column users.invite_code")
await conn.execute(text("ALTER TABLE users ADD COLUMN invite_code VARCHAR(32)"))
# Unique index for invite_code (SQLite + Postgres)
await conn.execute(text("CREATE UNIQUE INDEX IF NOT EXISTS ix_users_invite_code ON users(invite_code)"))
# ----------------------------------------------------
# 7) Portfolio DNS verification columns
# ----------------------------------------------------
if await _table_exists(conn, "portfolio_domains"):
if not await _has_column(conn, "portfolio_domains", "is_dns_verified"):
logger.info("DB migrations: adding column portfolio_domains.is_dns_verified")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN is_dns_verified BOOLEAN DEFAULT 0"))
if not await _has_column(conn, "portfolio_domains", "verification_status"):
logger.info("DB migrations: adding column portfolio_domains.verification_status")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_status VARCHAR(50) DEFAULT 'unverified'"))
if not await _has_column(conn, "portfolio_domains", "verification_code"):
logger.info("DB migrations: adding column portfolio_domains.verification_code")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_code VARCHAR(100)"))
if not await _has_column(conn, "portfolio_domains", "verification_started_at"):
logger.info("DB migrations: adding column portfolio_domains.verification_started_at")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_started_at DATETIME"))
if not await _has_column(conn, "portfolio_domains", "verified_at"):
logger.info("DB migrations: adding column portfolio_domains.verified_at")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verified_at DATETIME"))
logger.info("DB migrations: done")

View File

@ -1,3 +0,0 @@
"""Async job queue (ARQ / Redis)."""

View File

@ -1,38 +0,0 @@
"""ARQ client helper to enqueue jobs."""
from __future__ import annotations
from typing import Any
from arq.connections import RedisSettings, create_pool
from app.config import get_settings
_pool = None
async def _get_pool():
global _pool
if _pool is not None:
return _pool
settings = get_settings()
if not settings.redis_url:
raise RuntimeError("redis_url is not configured (set REDIS_URL)")
_pool = await create_pool(RedisSettings.from_dsn(settings.redis_url))
return _pool
async def enqueue_job(name: str, *args: Any, **kwargs: Any) -> str:
"""
Enqueue a job by name. Returns the job id.
"""
pool = await _get_pool()
job = await pool.enqueue_job(name, *args, **kwargs)
# job may be None if enqueue failed
if job is None:
raise RuntimeError(f"Failed to enqueue job: {name}")
return job.job_id

View File

@ -1,72 +0,0 @@
"""Job functions executed by the ARQ worker."""
from __future__ import annotations
from datetime import datetime
from sqlalchemy import select
from app.database import AsyncSessionLocal, init_db
from app.models.auction import DomainAuction
from app.services.auction_scraper import auction_scraper
from app.services.pounce_score import calculate_pounce_score_v2
from app.services.tld_scraper.aggregator import tld_aggregator
async def scrape_auctions(ctx) -> dict: # arq passes ctx
"""Scrape auctions from all platforms and store results."""
await init_db()
async with AsyncSessionLocal() as db:
result = await auction_scraper.scrape_all_platforms(db)
await db.commit()
return {"status": "ok", "result": result, "timestamp": datetime.utcnow().isoformat()}
async def scrape_tld_prices(ctx) -> dict:
"""Scrape TLD prices from all sources and store results."""
await init_db()
async with AsyncSessionLocal() as db:
result = await tld_aggregator.run_scrape(db)
await db.commit()
return {
"status": "ok",
"tlds_scraped": result.tlds_scraped,
"prices_saved": result.prices_saved,
"sources_succeeded": result.sources_succeeded,
"sources_attempted": result.sources_attempted,
"timestamp": datetime.utcnow().isoformat(),
}
async def backfill_auction_scores(ctx, *, limit: int = 5000) -> dict:
"""
Backfill DomainAuction.pounce_score for legacy rows.
Safe to run multiple times; only fills NULL scores.
"""
await init_db()
updated = 0
async with AsyncSessionLocal() as db:
rows = (
await db.execute(
select(DomainAuction)
.where(DomainAuction.pounce_score == None) # noqa: E711
.limit(limit)
)
).scalars().all()
for auction in rows:
auction.pounce_score = calculate_pounce_score_v2(
auction.domain,
auction.tld,
num_bids=auction.num_bids or 0,
age_years=auction.age_years or 0,
is_pounce=False,
)
updated += 1
await db.commit()
return {"status": "ok", "updated": updated, "timestamp": datetime.utcnow().isoformat()}

View File

@ -1,26 +0,0 @@
"""ARQ worker configuration."""
from __future__ import annotations
from arq.connections import RedisSettings
from app.config import get_settings
from app.jobs import tasks
class WorkerSettings:
"""
Run with:
arq app.jobs.worker.WorkerSettings
"""
settings = get_settings()
redis_settings = RedisSettings.from_dsn(settings.redis_url or "redis://localhost:6379/0")
functions = [
tasks.scrape_auctions,
tasks.scrape_tld_prices,
tasks.backfill_auction_scores,
]

View File

@ -18,7 +18,6 @@ from app.api import api_router
from app.config import get_settings
from app.database import init_db
from app.scheduler import start_scheduler, stop_scheduler
from app.observability.metrics import instrument_app
# Configure logging
logging.basicConfig(
@ -33,7 +32,7 @@ settings = get_settings()
limiter = Limiter(
key_func=get_remote_address,
default_limits=["200/minute"], # Global default
storage_uri=settings.rate_limit_storage_uri, # Use Redis in production
storage_uri="memory://", # In-memory storage (use Redis in production)
)
@ -47,18 +46,14 @@ async def lifespan(app: FastAPI):
await init_db()
logger.info("Database initialized")
# Start scheduler (optional - recommended: run in separate process/container)
if settings.enable_scheduler:
start_scheduler()
logger.info("Scheduler started")
else:
logger.info("Scheduler disabled (ENABLE_SCHEDULER=false)")
# Start scheduler
start_scheduler()
logger.info("Scheduler started")
yield
# Shutdown
if settings.enable_scheduler:
stop_scheduler()
stop_scheduler()
logger.info("Application shutdown complete")
@ -79,8 +74,8 @@ Domain availability monitoring and portfolio management service.
## Authentication
Most endpoints require authentication via HttpOnly session cookie (recommended).
Login: POST /api/v1/auth/login
Most endpoints require authentication via Bearer token.
Get a token via POST /api/v1/auth/login
## Rate Limits
@ -90,7 +85,7 @@ Login: POST /api/v1/auth/login
## Support
For API issues, contact hello@pounce.ch
For API issues, contact support@pounce.ch
""",
version="1.0.0",
lifespan=lifespan,
@ -99,10 +94,6 @@ For API issues, contact hello@pounce.ch
redoc_url="/redoc",
)
# Observability (Prometheus metrics)
if settings.enable_metrics:
instrument_app(app, metrics_path=settings.metrics_path, enable_db_metrics=settings.enable_db_query_metrics)
# Add rate limiter to app state
app.state.limiter = limiter
@ -118,15 +109,14 @@ async def rate_limit_handler(request: Request, exc: RateLimitExceeded):
},
)
# Get allowed origins (env overrides settings)
origins_raw = (
os.getenv("ALLOWED_ORIGINS", "").strip()
or os.getenv("CORS_ORIGINS", "").strip()
or (settings.cors_origins or "").strip()
)
ALLOWED_ORIGINS = [o.strip() for o in origins_raw.split(",") if o.strip()]
if not ALLOWED_ORIGINS:
ALLOWED_ORIGINS = ["http://localhost:3000", "http://127.0.0.1:3000"]
# Get allowed origins from environment
ALLOWED_ORIGINS = os.getenv("ALLOWED_ORIGINS", "").split(",")
if not ALLOWED_ORIGINS or ALLOWED_ORIGINS == [""]:
ALLOWED_ORIGINS = [
"http://localhost:3000",
"http://127.0.0.1:3000",
"http://10.42.0.73:3000",
]
# Add production origins
SITE_URL = os.getenv("SITE_URL", "")
@ -168,22 +158,6 @@ async def health_check():
}
@app.get("/api/health")
async def health_check_api():
"""
Health check behind Nginx `/api` proxy.
Nginx routes `/api/*` to the backend, so `https://pounce.ch/api/health` must exist.
"""
return await health_check()
@app.get("/api/v1/health")
async def health_check_api_v1():
"""Health check behind `/api/v1` prefix (convenience)."""
return await health_check()
# Rate-limited endpoints - apply specific limits to sensitive routes
from fastapi import Depends

View File

@ -12,11 +12,6 @@ from app.models.blog import BlogPost
from app.models.listing import DomainListing, ListingInquiry, ListingView
from app.models.sniper_alert import SniperAlert, SniperAlertMatch
from app.models.seo_data import DomainSEOData
from app.models.yield_domain import YieldDomain, YieldTransaction, YieldPayout, AffiliatePartner
from app.models.telemetry import TelemetryEvent
from app.models.ops_alert import OpsAlertEvent
from app.models.domain_analysis_cache import DomainAnalysisCache
from app.models.zone_file import ZoneSnapshot, DroppedDomain
__all__ = [
"User",
@ -42,17 +37,4 @@ __all__ = [
"SniperAlertMatch",
# New: SEO Data (Tycoon feature)
"DomainSEOData",
# New: Yield / Intent Routing
"YieldDomain",
"YieldTransaction",
"YieldPayout",
"AffiliatePartner",
# New: Telemetry (events)
"TelemetryEvent",
"OpsAlertEvent",
# New: Analyze cache
"DomainAnalysisCache",
# New: Zone file drops
"ZoneSnapshot",
"DroppedDomain",
]

View File

@ -53,7 +53,6 @@ class DomainAuction(Base):
age_years: Mapped[Optional[int]] = mapped_column(Integer, nullable=True)
backlinks: Mapped[Optional[int]] = mapped_column(Integer, nullable=True)
domain_authority: Mapped[Optional[int]] = mapped_column(Integer, nullable=True)
pounce_score: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
# Scraping metadata
scraped_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
@ -63,8 +62,7 @@ class DomainAuction(Base):
# Indexes for common queries
__table_args__ = (
# Enforce de-duplication at the database level.
Index('ux_auctions_platform_domain', 'platform', 'domain', unique=True),
Index('ix_auctions_platform_domain', 'platform', 'domain'),
Index('ix_auctions_end_time_active', 'end_time', 'is_active'),
Index('ix_auctions_tld_bid', 'tld', 'current_bid'),
)

View File

@ -2,7 +2,7 @@
from datetime import datetime
from enum import Enum
from sqlalchemy import String, Boolean, DateTime, ForeignKey, Text, Enum as SQLEnum
from sqlalchemy.orm import Mapped, mapped_column, relationship, backref
from sqlalchemy.orm import Mapped, mapped_column, relationship
from app.database import Base
@ -78,50 +78,3 @@ class DomainCheck(Base):
def __repr__(self) -> str:
return f"<DomainCheck {self.domain_id} at {self.checked_at}>"
class HealthStatus(str, Enum):
"""Domain health status levels."""
HEALTHY = "healthy"
WEAKENING = "weakening"
PARKED = "parked"
CRITICAL = "critical"
UNKNOWN = "unknown"
class DomainHealthCache(Base):
"""
Cached health check results for domains.
Updated daily by the scheduler to provide instant health status
without needing manual checks.
"""
__tablename__ = "domain_health_cache"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
domain_id: Mapped[int] = mapped_column(ForeignKey("domains.id"), unique=True, nullable=False)
# Health status
status: Mapped[str] = mapped_column(String(20), default="unknown")
score: Mapped[int] = mapped_column(default=0)
# Signals (JSON array as text)
signals: Mapped[str | None] = mapped_column(Text, nullable=True)
# Layer data (JSON as text for flexibility)
dns_data: Mapped[str | None] = mapped_column(Text, nullable=True)
http_data: Mapped[str | None] = mapped_column(Text, nullable=True)
ssl_data: Mapped[str | None] = mapped_column(Text, nullable=True)
# Timestamp
checked_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
# Relationship - cascade delete when domain is deleted
domain: Mapped["Domain"] = relationship(
"Domain",
backref=backref("health_cache", cascade="all, delete-orphan", uselist=False)
)
def __repr__(self) -> str:
return f"<DomainHealthCache {self.domain_id} status={self.status}>"

View File

@ -1,25 +0,0 @@
"""
Domain analysis cache (Phase 2 Diligence).
We store computed JSON to avoid repeated RDAP/DNS/HTTP checks on each click.
"""
from __future__ import annotations
from datetime import datetime
from sqlalchemy import DateTime, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class DomainAnalysisCache(Base):
__tablename__ = "domain_analysis_cache"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
domain: Mapped[str] = mapped_column(String(255), unique=True, index=True, nullable=False)
payload_json: Mapped[str] = mapped_column(Text, nullable=False)
computed_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
ttl_seconds: Mapped[int] = mapped_column(Integer, default=3600)

View File

@ -91,10 +91,6 @@ class DomainListing(Base):
# Status
status: Mapped[str] = mapped_column(String(30), default=ListingStatus.DRAFT.value, index=True)
sold_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
sold_reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
sold_price: Mapped[Optional[float]] = mapped_column(Float, nullable=True)
sold_currency: Mapped[Optional[str]] = mapped_column(String(3), nullable=True)
# Features
show_valuation: Mapped[bool] = mapped_column(Boolean, default=True)
@ -151,7 +147,6 @@ class ListingInquiry(Base):
id: Mapped[int] = mapped_column(primary_key=True, index=True)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
buyer_user_id: Mapped[Optional[int]] = mapped_column(ForeignKey("users.id"), index=True, nullable=True)
# Inquirer info
name: Mapped[str] = mapped_column(String(100), nullable=False)
@ -164,8 +159,7 @@ class ListingInquiry(Base):
offer_amount: Mapped[Optional[float]] = mapped_column(Float, nullable=True)
# Status
status: Mapped[str] = mapped_column(String(20), default="new") # new, read, replied, closed, spam
closed_reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
status: Mapped[str] = mapped_column(String(20), default="new") # new, read, replied, spam
# Tracking
ip_address: Mapped[Optional[str]] = mapped_column(String(45), nullable=True)
@ -175,72 +169,14 @@ class ListingInquiry(Base):
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
read_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
replied_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
closed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Relationships
listing: Mapped["DomainListing"] = relationship("DomainListing", back_populates="inquiries")
messages: Mapped[List["ListingInquiryMessage"]] = relationship(
"ListingInquiryMessage", back_populates="inquiry", cascade="all, delete-orphan"
)
events: Mapped[List["ListingInquiryEvent"]] = relationship(
"ListingInquiryEvent", back_populates="inquiry", cascade="all, delete-orphan"
)
def __repr__(self) -> str:
return f"<ListingInquiry from {self.email} for listing #{self.listing_id}>"
class ListingInquiryEvent(Base):
"""
Audit trail for inquiry status changes.
This is the minimal “deal system” log:
- who changed what status
- when it happened
- optional reason (close/spam)
"""
__tablename__ = "listing_inquiry_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
inquiry_id: Mapped[int] = mapped_column(ForeignKey("listing_inquiries.id"), index=True, nullable=False)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
actor_user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
old_status: Mapped[Optional[str]] = mapped_column(String(20), nullable=True)
new_status: Mapped[str] = mapped_column(String(20), nullable=False)
reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
ip_address: Mapped[Optional[str]] = mapped_column(String(45), nullable=True)
user_agent: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
inquiry: Mapped["ListingInquiry"] = relationship("ListingInquiry", back_populates="events")
class ListingInquiryMessage(Base):
"""
Thread messages for listing inquiries (in-product negotiation).
- Buyer sends messages from their account
- Seller replies from Terminal
"""
__tablename__ = "listing_inquiry_messages"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
inquiry_id: Mapped[int] = mapped_column(ForeignKey("listing_inquiries.id"), index=True, nullable=False)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
sender_user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
body: Mapped[str] = mapped_column(Text, nullable=False)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
inquiry: Mapped["ListingInquiry"] = relationship("ListingInquiry", back_populates="messages")
class ListingView(Base):
"""
Track listing page views for analytics.

View File

@ -1,40 +0,0 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import DateTime, Index, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class OpsAlertEvent(Base):
"""
Persisted ops alert events.
Used for:
- cooldown across process restarts
- audit/history in admin UI
"""
__tablename__ = "ops_alert_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
alert_key: Mapped[str] = mapped_column(String(80), nullable=False, index=True)
severity: Mapped[str] = mapped_column(String(10), nullable=False, index=True) # "warn" | "page"
title: Mapped[str] = mapped_column(String(200), nullable=False)
detail: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
# "sent" | "skipped" | "error"
status: Mapped[str] = mapped_column(String(20), nullable=False, index=True)
recipients: Mapped[Optional[str]] = mapped_column(Text, nullable=True) # comma-separated
send_reason: Mapped[Optional[str]] = mapped_column(String(60), nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
__table_args__ = (
Index("ix_ops_alert_key_created", "alert_key", "created_at"),
Index("ix_ops_alert_status_created", "status", "created_at"),
)

View File

@ -45,14 +45,6 @@ class PortfolioDomain(Base):
# Status
status: Mapped[str] = mapped_column(String(50), default="active") # active, expired, sold, parked
# DNS Verification (required for Yield and For Sale)
# All fields nullable=True to avoid migration issues on existing databases
is_dns_verified: Mapped[Optional[bool]] = mapped_column(Boolean, default=False, nullable=True)
verification_status: Mapped[Optional[str]] = mapped_column(String(50), default="unverified", nullable=True)
verification_code: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
verification_started_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
verified_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Notes
notes: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
tags: Mapped[Optional[str]] = mapped_column(String(500), nullable=True) # Comma-separated

View File

@ -123,8 +123,6 @@ class Subscription(Base):
# Limits (can be overridden)
max_domains: Mapped[int] = mapped_column(Integer, default=5)
# Referral reward bonus (3C.2): additive, computed deterministically from qualified referrals
referral_bonus_domains: Mapped[int] = mapped_column(Integer, default=0)
check_frequency: Mapped[str] = mapped_column(String(50), default="daily")
# Stripe integration
@ -169,9 +167,7 @@ class Subscription(Base):
@property
def domain_limit(self) -> int:
"""Get maximum allowed domains for this subscription."""
base = int(self.max_domains or self.config["domain_limit"] or 0)
bonus = int(self.referral_bonus_domains or 0)
return max(0, base + bonus)
return self.max_domains or self.config["domain_limit"]
@property
def portfolio_limit(self) -> int:

View File

@ -1,56 +0,0 @@
"""
Telemetry events (4A).
Store canonical product events for funnel KPIs:
- Deal funnel: listing_view → inquiry_created → message_sent → listing_marked_sold
- Yield funnel: yield_connected → yield_click → yield_conversion → payout_paid
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import Boolean, DateTime, ForeignKey, Index, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class TelemetryEvent(Base):
__tablename__ = "telemetry_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
# Who
user_id: Mapped[Optional[int]] = mapped_column(ForeignKey("users.id"), nullable=True, index=True)
# What
event_name: Mapped[str] = mapped_column(String(60), nullable=False, index=True)
# Entity links (optional)
listing_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
inquiry_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
yield_domain_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
click_id: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
domain: Mapped[Optional[str]] = mapped_column(String(255), nullable=True, index=True)
# Context
source: Mapped[Optional[str]] = mapped_column(String(30), nullable=True) # "public" | "terminal" | "webhook" | "scheduler" | "admin"
ip_hash: Mapped[Optional[str]] = mapped_column(String(64), nullable=True)
user_agent: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
referrer: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
metadata_json: Mapped[Optional[str]] = mapped_column(Text, nullable=True) # JSON string
# Flags
is_authenticated: Mapped[Optional[bool]] = mapped_column(Boolean, nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
__table_args__ = (
Index("ix_telemetry_event_name_created", "event_name", "created_at"),
Index("ix_telemetry_user_created", "user_id", "created_at"),
Index("ix_telemetry_listing_created", "listing_id", "created_at"),
Index("ix_telemetry_yield_created", "yield_domain_id", "created_at"),
)

View File

@ -1,7 +1,7 @@
"""User model."""
from datetime import datetime
from typing import Optional, List
from sqlalchemy import String, Boolean, DateTime, Integer
from sqlalchemy import String, Boolean, DateTime
from sqlalchemy.orm import Mapped, mapped_column, relationship
from app.database import Base
@ -40,12 +40,6 @@ class User(Base):
oauth_id: Mapped[Optional[str]] = mapped_column(String(255), nullable=True)
oauth_avatar: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
# Yield Referral Tracking (for viral growth)
referred_by_user_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True) # User who referred this user
referred_by_domain: Mapped[Optional[str]] = mapped_column(String(255), nullable=True) # Domain that referred
referral_code: Mapped[Optional[str]] = mapped_column(String(100), nullable=True) # Original referral code
invite_code: Mapped[Optional[str]] = mapped_column(String(32), nullable=True, unique=True, index=True) # user's own code
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
updated_at: Mapped[datetime] = mapped_column(
@ -74,13 +68,6 @@ class User(Base):
sniper_alerts: Mapped[List["SniperAlert"]] = relationship(
"SniperAlert", back_populates="user", cascade="all, delete-orphan"
)
# Yield Domains
yield_domains: Mapped[List["YieldDomain"]] = relationship(
"YieldDomain", back_populates="user", cascade="all, delete-orphan"
)
yield_payouts: Mapped[List["YieldPayout"]] = relationship(
"YieldPayout", back_populates="user", cascade="all, delete-orphan"
)
def __repr__(self) -> str:
return f"<User {self.email}>"

View File

@ -1,248 +0,0 @@
"""
Yield Domain models for Intent Routing feature.
Domains activated for yield generate passive income by routing
visitor intent to affiliate partners.
"""
from datetime import datetime
from decimal import Decimal
from typing import Optional
from sqlalchemy import String, DateTime, Float, Integer, Text, ForeignKey, Boolean, Numeric, Index
from sqlalchemy.orm import Mapped, mapped_column, relationship
from app.database import Base
class AffiliatePartner(Base):
"""
Affiliate network/partner configuration.
Partners are matched to domains based on detected intent category.
"""
__tablename__ = "affiliate_partners"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
# Identity
name: Mapped[str] = mapped_column(String(100), nullable=False) # "Comparis Dental"
slug: Mapped[str] = mapped_column(String(50), unique=True, nullable=False) # "comparis_dental"
network: Mapped[str] = mapped_column(String(50), nullable=False) # "awin", "partnerstack", "direct"
# Matching criteria (JSON arrays stored as comma-separated for simplicity)
intent_categories: Mapped[str] = mapped_column(Text, nullable=False) # "medical_dental,medical_general"
geo_countries: Mapped[str] = mapped_column(String(200), default="CH,DE,AT") # ISO codes
# Payout configuration
payout_type: Mapped[str] = mapped_column(String(20), default="cpl") # "cpc", "cpl", "cps"
payout_amount: Mapped[Decimal] = mapped_column(Numeric(10, 2), default=0)
payout_currency: Mapped[str] = mapped_column(String(3), default="CHF")
# Integration
tracking_url_template: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
api_endpoint: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
# Note: API keys should be stored encrypted or in env vars, not here
# Display
logo_url: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
description: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
# Status
is_active: Mapped[bool] = mapped_column(Boolean, default=True)
priority: Mapped[int] = mapped_column(Integer, default=0) # Higher = preferred
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
updated_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
yield_domains: Mapped[list["YieldDomain"]] = relationship("YieldDomain", back_populates="partner")
def __repr__(self) -> str:
return f"<AffiliatePartner {self.slug}>"
@property
def intent_list(self) -> list[str]:
"""Parse intent categories as list."""
return [c.strip() for c in self.intent_categories.split(",") if c.strip()]
@property
def country_list(self) -> list[str]:
"""Parse geo countries as list."""
return [c.strip() for c in self.geo_countries.split(",") if c.strip()]
class YieldDomain(Base):
"""
Domain activated for yield/intent routing.
When a user activates a domain for yield:
1. They point DNS to our nameservers
2. We detect the intent (e.g., "zahnarzt.ch" → medical/dental)
3. We route traffic to affiliate partners
4. User earns commission split
"""
__tablename__ = "yield_domains"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
# Domain info
domain: Mapped[str] = mapped_column(String(255), unique=True, nullable=False, index=True)
# Intent detection
detected_intent: Mapped[Optional[str]] = mapped_column(String(100), nullable=True) # "medical_dental"
intent_confidence: Mapped[float] = mapped_column(Float, default=0.0) # 0.0 - 1.0
intent_keywords: Mapped[Optional[str]] = mapped_column(Text, nullable=True) # JSON: ["zahnarzt", "zuerich"]
# Routing
partner_id: Mapped[Optional[int]] = mapped_column(ForeignKey("affiliate_partners.id"), nullable=True)
active_route: Mapped[Optional[str]] = mapped_column(String(100), nullable=True) # Partner slug
landing_page_url: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
# Status
status: Mapped[str] = mapped_column(String(30), default="pending", index=True)
# pending, verifying, active, paused, inactive, error
dns_verified: Mapped[bool] = mapped_column(Boolean, default=False)
dns_verified_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# "Connect" timestamp for Yield (nameserver/CNAME verified)
connected_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
activated_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
paused_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Revenue tracking (aggregates, updated periodically)
total_clicks: Mapped[int] = mapped_column(Integer, default=0)
total_conversions: Mapped[int] = mapped_column(Integer, default=0)
total_revenue: Mapped[Decimal] = mapped_column(Numeric(12, 2), default=0)
currency: Mapped[str] = mapped_column(String(3), default="CHF")
# Last activity
last_click_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
last_conversion_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
updated_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow)
# Relationships
user: Mapped["User"] = relationship("User", back_populates="yield_domains")
partner: Mapped[Optional["AffiliatePartner"]] = relationship("AffiliatePartner", back_populates="yield_domains")
transactions: Mapped[list["YieldTransaction"]] = relationship(
"YieldTransaction", back_populates="yield_domain", cascade="all, delete-orphan"
)
# Indexes
__table_args__ = (
Index("ix_yield_domains_user_status", "user_id", "status"),
)
def __repr__(self) -> str:
return f"<YieldDomain {self.domain} ({self.status})>"
@property
def is_earning(self) -> bool:
"""Check if domain is actively earning."""
return self.status == "active" and self.dns_verified
class YieldTransaction(Base):
"""
Revenue events from affiliate partners.
Tracks clicks, leads, and sales for each yield domain.
"""
__tablename__ = "yield_transactions"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
yield_domain_id: Mapped[int] = mapped_column(
ForeignKey("yield_domains.id", ondelete="CASCADE"),
index=True,
nullable=False
)
# Event type
event_type: Mapped[str] = mapped_column(String(20), nullable=False) # "click", "lead", "sale"
# Partner info
partner_slug: Mapped[str] = mapped_column(String(50), nullable=False)
partner_transaction_id: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
# Our click id for attribution across systems (UUID string)
click_id: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
destination_url: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
# Amount
gross_amount: Mapped[Decimal] = mapped_column(Numeric(10, 2), default=0) # Full commission
net_amount: Mapped[Decimal] = mapped_column(Numeric(10, 2), default=0) # After Pounce cut (70%)
currency: Mapped[str] = mapped_column(String(3), default="CHF")
# Attribution
referrer: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
user_agent: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
geo_country: Mapped[Optional[str]] = mapped_column(String(2), nullable=True)
ip_hash: Mapped[Optional[str]] = mapped_column(String(64), nullable=True) # Hashed for privacy
# Status
status: Mapped[str] = mapped_column(String(20), default="pending", index=True)
# pending, confirmed, paid, rejected
confirmed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
paid_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
payout_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True) # FK to future payouts table
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
# Relationships
yield_domain: Mapped["YieldDomain"] = relationship("YieldDomain", back_populates="transactions")
# Indexes
__table_args__ = (
Index("ix_yield_tx_domain_created", "yield_domain_id", "created_at"),
Index("ix_yield_tx_status_created", "status", "created_at"),
Index("ix_yield_tx_click_id", "click_id"),
)
def __repr__(self) -> str:
return f"<YieldTransaction {self.event_type} {self.net_amount} {self.currency}>"
class YieldPayout(Base):
"""
Payout records for user earnings.
Aggregates confirmed transactions into periodic payouts.
"""
__tablename__ = "yield_payouts"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
# Amount
amount: Mapped[Decimal] = mapped_column(Numeric(12, 2), nullable=False)
currency: Mapped[str] = mapped_column(String(3), default="CHF")
# Period
period_start: Mapped[datetime] = mapped_column(DateTime, nullable=False)
period_end: Mapped[datetime] = mapped_column(DateTime, nullable=False)
# Transaction count
transaction_count: Mapped[int] = mapped_column(Integer, default=0)
# Status
status: Mapped[str] = mapped_column(String(20), default="pending", index=True)
# pending, processing, completed, failed
# Payment details
payment_method: Mapped[Optional[str]] = mapped_column(String(50), nullable=True) # "stripe", "bank"
payment_reference: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
processed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
completed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Relationship
user: Mapped["User"] = relationship("User", back_populates="yield_payouts")
def __repr__(self) -> str:
return f"<YieldPayout {self.amount} {self.currency} ({self.status})>"

View File

@ -1,43 +0,0 @@
"""
Zone File Models for .ch and .li domain drops
"""
from datetime import datetime
from sqlalchemy import Column, Integer, String, DateTime, Boolean, Index
from app.database import Base
class ZoneSnapshot(Base):
"""Stores metadata about zone file snapshots (not the full data)"""
__tablename__ = "zone_snapshots"
id = Column(Integer, primary_key=True)
tld = Column(String(10), nullable=False, index=True) # 'ch' or 'li'
snapshot_date = Column(DateTime, nullable=False, index=True)
domain_count = Column(Integer, nullable=False)
checksum = Column(String(64), nullable=False) # SHA256 of sorted domain list
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('ix_zone_snapshots_tld_date', 'tld', 'snapshot_date'),
)
class DroppedDomain(Base):
"""Stores domains that were dropped (found in previous snapshot but not current)"""
__tablename__ = "dropped_domains"
id = Column(Integer, primary_key=True)
domain = Column(String(255), nullable=False, index=True)
tld = Column(String(10), nullable=False, index=True)
dropped_date = Column(DateTime, nullable=False, index=True)
length = Column(Integer, nullable=False)
is_numeric = Column(Boolean, default=False)
has_hyphen = Column(Boolean, default=False)
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('ix_dropped_domains_tld_date', 'tld', 'dropped_date'),
Index('ix_dropped_domains_length', 'length'),
)

View File

@ -1,3 +0,0 @@
"""Observability helpers (metrics, tracing)."""

View File

@ -1,304 +0,0 @@
"""
Business KPIs exported as Prometheus metrics (4B Ops).
These KPIs are derived from real telemetry events in the database.
We cache computations to avoid putting load on the DB on every scrape.
"""
from __future__ import annotations
import json
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Any, Optional
from sqlalchemy import and_, func, select
from app.config import get_settings
from app.database import AsyncSessionLocal
from app.models.telemetry import TelemetryEvent
settings = get_settings()
try:
from prometheus_client import Gauge
except Exception: # pragma: no cover
Gauge = None # type: ignore
@dataclass(frozen=True)
class TelemetryWindowKpis:
window_days: int
start: datetime
end: datetime
# Deal
listing_views: int
inquiries_created: int
seller_replied_inquiries: int
inquiry_reply_rate: float
listings_with_inquiries: int
listings_sold: int
inquiry_to_sold_listing_rate: float
# Yield
connected_domains: int
clicks: int
conversions: int
conversion_rate: float
payouts_paid: int
payouts_paid_amount_total: float
_cache_until_by_days: dict[int, datetime] = {}
_cache_value_by_days: dict[int, TelemetryWindowKpis] = {}
def _safe_json(metadata_json: Optional[str]) -> dict[str, Any]:
if not metadata_json:
return {}
try:
value = json.loads(metadata_json)
return value if isinstance(value, dict) else {}
except Exception:
return {}
async def _compute_window_kpis(days: int) -> TelemetryWindowKpis:
end = datetime.utcnow()
start = end - timedelta(days=days)
async with AsyncSessionLocal() as db:
# Fast path: grouped counts for pure counter events
count_events = [
"listing_view",
"inquiry_created",
"yield_connected",
"yield_click",
"yield_conversion",
"payout_paid",
]
grouped = (
await db.execute(
select(TelemetryEvent.event_name, func.count(TelemetryEvent.id))
.where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name.in_(count_events),
)
)
.group_by(TelemetryEvent.event_name)
)
).all()
counts = {name: int(cnt) for name, cnt in grouped}
listing_views = counts.get("listing_view", 0)
inquiries_created = counts.get("inquiry_created", 0)
connected_domains = counts.get("yield_connected", 0)
clicks = counts.get("yield_click", 0)
conversions = counts.get("yield_conversion", 0)
payouts_paid = counts.get("payout_paid", 0)
# Distinct listing counts (deal)
listings_with_inquiries = (
await db.execute(
select(func.count(func.distinct(TelemetryEvent.listing_id))).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "inquiry_created",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalar() or 0
listings_sold = (
await db.execute(
select(func.count(func.distinct(TelemetryEvent.listing_id))).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "listing_marked_sold",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalar() or 0
# For rates we need intersections/uniques; keep it exact via minimal event fetch
inquiry_listing_ids = (
await db.execute(
select(func.distinct(TelemetryEvent.listing_id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "inquiry_created",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalars().all()
sold_listing_ids = (
await db.execute(
select(func.distinct(TelemetryEvent.listing_id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "listing_marked_sold",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalars().all()
inquiry_set = {int(x) for x in inquiry_listing_ids if x is not None}
sold_set = {int(x) for x in sold_listing_ids if x is not None}
sold_from_inquiry = inquiry_set.intersection(sold_set)
inquiry_to_sold_listing_rate = (len(sold_from_inquiry) / len(inquiry_set)) if inquiry_set else 0.0
# Seller reply rate: unique inquiries with at least one seller message
msg_rows = (
await db.execute(
select(TelemetryEvent.inquiry_id, TelemetryEvent.metadata_json).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "message_sent",
TelemetryEvent.inquiry_id.isnot(None),
)
)
)
).all()
seller_replied_inquiries_set: set[int] = set()
for inquiry_id, metadata_json in msg_rows:
if inquiry_id is None:
continue
meta = _safe_json(metadata_json)
if meta.get("role") == "seller":
seller_replied_inquiries_set.add(int(inquiry_id))
seller_replied_inquiries = len(seller_replied_inquiries_set)
inquiry_reply_rate = (seller_replied_inquiries / inquiries_created) if inquiries_created else 0.0
# Payout amounts (sum of metadata amounts)
payout_rows = (
await db.execute(
select(TelemetryEvent.metadata_json).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "payout_paid",
TelemetryEvent.metadata_json.isnot(None),
)
)
)
).scalars().all()
payouts_paid_amount_total = 0.0
for metadata_json in payout_rows:
meta = _safe_json(metadata_json)
amount = meta.get("amount")
if isinstance(amount, (int, float)):
payouts_paid_amount_total += float(amount)
conversion_rate = (conversions / clicks) if clicks else 0.0
return TelemetryWindowKpis(
window_days=days,
start=start,
end=end,
listing_views=int(listing_views),
inquiries_created=int(inquiries_created),
seller_replied_inquiries=int(seller_replied_inquiries),
inquiry_reply_rate=float(inquiry_reply_rate),
listings_with_inquiries=int(listings_with_inquiries),
listings_sold=int(listings_sold),
inquiry_to_sold_listing_rate=float(inquiry_to_sold_listing_rate),
connected_domains=int(connected_domains),
clicks=int(clicks),
conversions=int(conversions),
conversion_rate=float(conversion_rate),
payouts_paid=int(payouts_paid),
payouts_paid_amount_total=float(payouts_paid_amount_total),
)
async def get_cached_window_kpis(days: int) -> Optional[TelemetryWindowKpis]:
"""Return cached KPIs for a window (recompute if TTL expired)."""
if not settings.enable_business_metrics:
return None
now = datetime.utcnow()
until = _cache_until_by_days.get(days)
cached = _cache_value_by_days.get(days)
if until is not None and cached is not None and now < until:
return cached
value = await _compute_window_kpis(int(days))
ttl_seconds = max(5, int(settings.business_metrics_cache_seconds))
_cache_until_by_days[int(days)] = now + timedelta(seconds=ttl_seconds)
_cache_value_by_days[int(days)] = value
return value
# -----------------------------
# Prometheus Gauges
# -----------------------------
if Gauge is not None:
_g = {
"deal_listing_views": Gauge("pounce_deal_listing_views", "Deal: listing views in window", ["window_days"]),
"deal_inquiries_created": Gauge("pounce_deal_inquiries_created", "Deal: inquiries created in window", ["window_days"]),
"deal_seller_replied_inquiries": Gauge(
"pounce_deal_seller_replied_inquiries", "Deal: inquiries with seller reply in window", ["window_days"]
),
"deal_inquiry_reply_rate": Gauge("pounce_deal_inquiry_reply_rate", "Deal: inquiry reply rate in window", ["window_days"]),
"deal_listings_with_inquiries": Gauge(
"pounce_deal_listings_with_inquiries", "Deal: distinct listings with inquiries in window", ["window_days"]
),
"deal_listings_sold": Gauge("pounce_deal_listings_sold", "Deal: distinct listings marked sold in window", ["window_days"]),
"deal_inquiry_to_sold_listing_rate": Gauge(
"pounce_deal_inquiry_to_sold_listing_rate", "Deal: (listings with inquiry) -> sold rate in window", ["window_days"]
),
"yield_connected_domains": Gauge("pounce_yield_connected_domains", "Yield: connected domains in window", ["window_days"]),
"yield_clicks": Gauge("pounce_yield_clicks", "Yield: clicks in window", ["window_days"]),
"yield_conversions": Gauge("pounce_yield_conversions", "Yield: conversions in window", ["window_days"]),
"yield_conversion_rate": Gauge("pounce_yield_conversion_rate", "Yield: conversion rate in window", ["window_days"]),
"yield_payouts_paid": Gauge("pounce_yield_payouts_paid", "Yield: payouts paid in window", ["window_days"]),
"yield_payouts_paid_amount_total": Gauge(
"pounce_yield_payouts_paid_amount_total", "Yield: total amount paid out in window", ["window_days"]
),
}
else: # pragma: no cover
_g = {}
async def update_prometheus_business_metrics() -> None:
"""Compute KPIs and set Prometheus gauges (no-op when disabled)."""
if Gauge is None or not _g:
return
if not settings.enable_business_metrics:
return
windows = {1, int(settings.business_metrics_days)}
for days in sorted(windows):
kpis = await get_cached_window_kpis(days)
if kpis is None:
continue
w = str(int(kpis.window_days))
_g["deal_listing_views"].labels(window_days=w).set(kpis.listing_views)
_g["deal_inquiries_created"].labels(window_days=w).set(kpis.inquiries_created)
_g["deal_seller_replied_inquiries"].labels(window_days=w).set(kpis.seller_replied_inquiries)
_g["deal_inquiry_reply_rate"].labels(window_days=w).set(kpis.inquiry_reply_rate)
_g["deal_listings_with_inquiries"].labels(window_days=w).set(kpis.listings_with_inquiries)
_g["deal_listings_sold"].labels(window_days=w).set(kpis.listings_sold)
_g["deal_inquiry_to_sold_listing_rate"].labels(window_days=w).set(kpis.inquiry_to_sold_listing_rate)
_g["yield_connected_domains"].labels(window_days=w).set(kpis.connected_domains)
_g["yield_clicks"].labels(window_days=w).set(kpis.clicks)
_g["yield_conversions"].labels(window_days=w).set(kpis.conversions)
_g["yield_conversion_rate"].labels(window_days=w).set(kpis.conversion_rate)
_g["yield_payouts_paid"].labels(window_days=w).set(kpis.payouts_paid)
_g["yield_payouts_paid_amount_total"].labels(window_days=w).set(kpis.payouts_paid_amount_total)

View File

@ -1,137 +0,0 @@
"""Prometheus metrics for FastAPI + optional DB query metrics."""
from __future__ import annotations
import time
from typing import Optional
from fastapi import FastAPI, Request, Response
try:
from prometheus_client import Counter, Histogram, generate_latest, CONTENT_TYPE_LATEST
except Exception: # pragma: no cover
Counter = None # type: ignore
Histogram = None # type: ignore
generate_latest = None # type: ignore
CONTENT_TYPE_LATEST = "text/plain; version=0.0.4" # type: ignore
_instrumented = False
_db_instrumented = False
def _get_route_template(request: Request) -> str:
route = request.scope.get("route")
if route is not None and hasattr(route, "path"):
return str(route.path)
return request.url.path
def instrument_app(app: FastAPI, *, metrics_path: str = "/metrics", enable_db_metrics: bool = False) -> None:
"""
Add Prometheus request metrics and a `/metrics` endpoint.
- Low-cardinality path labels by using FastAPI route templates.
- Optional SQLAlchemy query timing metrics (off by default).
"""
global _instrumented
if _instrumented:
return
_instrumented = True
if Counter is None or Histogram is None:
# Dependency not installed; keep app working without metrics.
return
http_requests_total = Counter(
"http_requests_total",
"Total HTTP requests",
["method", "path", "status"],
)
http_request_duration_seconds = Histogram(
"http_request_duration_seconds",
"HTTP request duration (seconds)",
["method", "path"],
buckets=(0.01, 0.025, 0.05, 0.1, 0.25, 0.5, 1, 2.5, 5, 10),
)
@app.middleware("http")
async def _metrics_middleware(request: Request, call_next):
start = time.perf_counter()
response: Optional[Response] = None
try:
response = await call_next(request)
return response
finally:
duration = time.perf_counter() - start
path = _get_route_template(request)
method = request.method
status = str(getattr(response, "status_code", 500))
http_requests_total.labels(method=method, path=path, status=status).inc()
http_request_duration_seconds.labels(method=method, path=path).observe(duration)
@app.get(metrics_path, include_in_schema=False)
async def _metrics_endpoint():
# Optional: export business KPIs derived from telemetry (cached).
try:
from app.observability.business_metrics import update_prometheus_business_metrics
await update_prometheus_business_metrics()
except Exception:
# Never break metrics scrape due to KPI computation issues.
pass
# Optional: export ops metrics (e.g. backup age).
try:
from app.observability.ops_metrics import update_prometheus_ops_metrics
await update_prometheus_ops_metrics()
except Exception:
pass
return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)
if enable_db_metrics:
_instrument_db_metrics()
def _instrument_db_metrics() -> None:
"""Attach SQLAlchemy event listeners to track query latencies."""
global _db_instrumented
if _db_instrumented:
return
_db_instrumented = True
if Counter is None or Histogram is None:
return
from sqlalchemy import event
from app.database import engine
db_queries_total = Counter(
"db_queries_total",
"Total DB queries executed",
["dialect"],
)
db_query_duration_seconds = Histogram(
"db_query_duration_seconds",
"DB query duration (seconds)",
["dialect"],
buckets=(0.001, 0.0025, 0.005, 0.01, 0.025, 0.05, 0.1, 0.25, 0.5, 1, 2.5),
)
dialect = engine.sync_engine.dialect.name
@event.listens_for(engine.sync_engine, "before_cursor_execute")
def _before_cursor_execute(conn, cursor, statement, parameters, context, executemany): # type: ignore[no-untyped-def]
conn.info.setdefault("_query_start_time", []).append(time.perf_counter())
@event.listens_for(engine.sync_engine, "after_cursor_execute")
def _after_cursor_execute(conn, cursor, statement, parameters, context, executemany): # type: ignore[no-untyped-def]
start_list = conn.info.get("_query_start_time") or []
if not start_list:
return
start = start_list.pop()
duration = time.perf_counter() - start
db_queries_total.labels(dialect=dialect).inc()
db_query_duration_seconds.labels(dialect=dialect).observe(duration)

View File

@ -1,65 +0,0 @@
"""
Ops/health metrics exported as Prometheus metrics (4B Ops).
These are low-frequency filesystem-based metrics (safe on scrape).
"""
from __future__ import annotations
from datetime import datetime
from pathlib import Path
from app.config import get_settings
settings = get_settings()
try:
from prometheus_client import Gauge
except Exception: # pragma: no cover
Gauge = None # type: ignore
if Gauge is not None:
db_backups_enabled = Gauge("pounce_db_backups_enabled", "DB backups enabled (1/0)")
db_backup_latest_unixtime = Gauge("pounce_db_backup_latest_unixtime", "Unix time of latest backup file (0 if none)")
db_backup_latest_age_seconds = Gauge("pounce_db_backup_latest_age_seconds", "Age of latest backup file (seconds)")
else: # pragma: no cover
db_backups_enabled = None # type: ignore
db_backup_latest_unixtime = None # type: ignore
db_backup_latest_age_seconds = None # type: ignore
def _backup_root() -> Path:
root = Path(settings.backup_dir)
if not root.is_absolute():
root = (Path.cwd() / root).resolve()
return root
async def update_prometheus_ops_metrics() -> None:
if Gauge is None:
return
db_backups_enabled.set(1 if settings.enable_db_backups else 0)
root = _backup_root()
if not root.exists() or not root.is_dir():
db_backup_latest_unixtime.set(0)
db_backup_latest_age_seconds.set(0)
return
files = [p for p in root.glob("*") if p.is_file()]
if not files:
db_backup_latest_unixtime.set(0)
db_backup_latest_age_seconds.set(0)
return
latest = max(files, key=lambda p: p.stat().st_mtime)
mtime = float(latest.stat().st_mtime)
now = datetime.utcnow().timestamp()
age = max(0.0, now - mtime)
db_backup_latest_unixtime.set(mtime)
db_backup_latest_age_seconds.set(age)

View File

@ -1,542 +0,0 @@
"""Portfolio API routes."""
from datetime import datetime
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, status, Query
from pydantic import BaseModel, Field
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.routes.auth import get_current_user
from app.models.user import User
from app.models.portfolio import PortfolioDomain, DomainValuation
from app.services.valuation import valuation_service
router = APIRouter(prefix="/portfolio", tags=["portfolio"])
# ============== Schemas ==============
class PortfolioDomainCreate(BaseModel):
"""Schema for creating a portfolio domain."""
domain: str = Field(..., min_length=3, max_length=255)
purchase_date: Optional[datetime] = None
purchase_price: Optional[float] = Field(None, ge=0)
purchase_registrar: Optional[str] = None
registrar: Optional[str] = None
renewal_date: Optional[datetime] = None
renewal_cost: Optional[float] = Field(None, ge=0)
auto_renew: bool = True
notes: Optional[str] = None
tags: Optional[str] = None
class PortfolioDomainUpdate(BaseModel):
"""Schema for updating a portfolio domain."""
purchase_date: Optional[datetime] = None
purchase_price: Optional[float] = Field(None, ge=0)
purchase_registrar: Optional[str] = None
registrar: Optional[str] = None
renewal_date: Optional[datetime] = None
renewal_cost: Optional[float] = Field(None, ge=0)
auto_renew: Optional[bool] = None
status: Optional[str] = None
notes: Optional[str] = None
tags: Optional[str] = None
class PortfolioDomainSell(BaseModel):
"""Schema for marking a domain as sold."""
sale_date: datetime
sale_price: float = Field(..., ge=0)
class PortfolioDomainResponse(BaseModel):
"""Response schema for portfolio domain."""
id: int
domain: str
purchase_date: Optional[datetime]
purchase_price: Optional[float]
purchase_registrar: Optional[str]
registrar: Optional[str]
renewal_date: Optional[datetime]
renewal_cost: Optional[float]
auto_renew: bool
estimated_value: Optional[float]
value_updated_at: Optional[datetime]
is_sold: bool
sale_date: Optional[datetime]
sale_price: Optional[float]
status: str
notes: Optional[str]
tags: Optional[str]
roi: Optional[float]
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True
class PortfolioSummary(BaseModel):
"""Summary of user's portfolio."""
total_domains: int
active_domains: int
sold_domains: int
total_invested: float
total_value: float
total_sold_value: float
unrealized_profit: float
realized_profit: float
overall_roi: float
class ValuationResponse(BaseModel):
"""Response schema for domain valuation."""
domain: str
estimated_value: float
currency: str
scores: dict
factors: dict
confidence: str
source: str
calculated_at: str
# ============== Portfolio Endpoints ==============
@router.get("", response_model=List[PortfolioDomainResponse])
async def get_portfolio(
status: Optional[str] = Query(None, description="Filter by status"),
sort_by: str = Query("created_at", description="Sort field"),
sort_order: str = Query("desc", description="Sort order (asc/desc)"),
limit: int = Query(100, le=500),
offset: int = Query(0, ge=0),
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get user's portfolio domains."""
query = select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
# Filter by status
if status:
query = query.where(PortfolioDomain.status == status)
# Sorting
sort_column = getattr(PortfolioDomain, sort_by, PortfolioDomain.created_at)
if sort_order == "asc":
query = query.order_by(sort_column.asc())
else:
query = query.order_by(sort_column.desc())
# Pagination
query = query.offset(offset).limit(limit)
result = await db.execute(query)
domains = result.scalars().all()
# Calculate ROI for each domain
responses = []
for d in domains:
response = PortfolioDomainResponse(
id=d.id,
domain=d.domain,
purchase_date=d.purchase_date,
purchase_price=d.purchase_price,
purchase_registrar=d.purchase_registrar,
registrar=d.registrar,
renewal_date=d.renewal_date,
renewal_cost=d.renewal_cost,
auto_renew=d.auto_renew,
estimated_value=d.estimated_value,
value_updated_at=d.value_updated_at,
is_sold=d.is_sold,
sale_date=d.sale_date,
sale_price=d.sale_price,
status=d.status,
notes=d.notes,
tags=d.tags,
roi=d.roi,
created_at=d.created_at,
updated_at=d.updated_at,
)
responses.append(response)
return responses
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
@router.post("", response_model=PortfolioDomainResponse, status_code=status.HTTP_201_CREATED)
async def add_portfolio_domain(
data: PortfolioDomainCreate,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Add a domain to portfolio."""
# Check if domain already exists in user's portfolio
existing = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.user_id == current_user.id,
PortfolioDomain.domain == data.domain.lower(),
)
)
)
if existing.scalar_one_or_none():
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Domain already in portfolio",
)
# Get initial valuation
valuation = await valuation_service.estimate_value(data.domain, db, save_result=True)
estimated_value = valuation.get("estimated_value") if "error" not in valuation else None
# Create portfolio entry
domain = PortfolioDomain(
user_id=current_user.id,
domain=data.domain.lower(),
purchase_date=data.purchase_date,
purchase_price=data.purchase_price,
purchase_registrar=data.purchase_registrar,
registrar=data.registrar or data.purchase_registrar,
renewal_date=data.renewal_date,
renewal_cost=data.renewal_cost,
auto_renew=data.auto_renew,
estimated_value=estimated_value,
value_updated_at=datetime.utcnow() if estimated_value else None,
notes=data.notes,
tags=data.tags,
)
db.add(domain)
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.get("/{domain_id}", response_model=PortfolioDomainResponse)
async def get_portfolio_domain(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get a specific portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.put("/{domain_id}", response_model=PortfolioDomainResponse)
async def update_portfolio_domain(
domain_id: int,
data: PortfolioDomainUpdate,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Update a portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
# Update fields
update_data = data.model_dump(exclude_unset=True)
for field, value in update_data.items():
setattr(domain, field, value)
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.post("/{domain_id}/sell", response_model=PortfolioDomainResponse)
async def mark_domain_sold(
domain_id: int,
data: PortfolioDomainSell,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Mark a domain as sold."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
domain.is_sold = True
domain.sale_date = data.sale_date
domain.sale_price = data.sale_price
domain.status = "sold"
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.delete("/{domain_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_portfolio_domain(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Delete a domain from portfolio."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
await db.delete(domain)
await db.commit()
@router.post("/{domain_id}/refresh-value", response_model=PortfolioDomainResponse)
async def refresh_domain_value(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Refresh the estimated value of a portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
# Get new valuation
valuation = await valuation_service.estimate_value(domain.domain, db, save_result=True)
if "error" not in valuation:
domain.estimated_value = valuation["estimated_value"]
domain.value_updated_at = datetime.utcnow()
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
# ============== Valuation Endpoints ==============
@router.get("/valuation/{domain}", response_model=ValuationResponse)
async def get_domain_valuation(
domain: str,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get estimated value for any domain."""
valuation = await valuation_service.estimate_value(domain, db, save_result=True)
if "error" in valuation:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=valuation["error"],
)
return ValuationResponse(**valuation)

View File

@ -16,10 +16,6 @@ from app.models.subscription import Subscription, SubscriptionTier, TIER_CONFIG
from app.services.domain_checker import domain_checker
from app.services.email_service import email_service
from app.services.price_tracker import price_tracker
from app.services.yield_payouts import generate_payouts_for_previous_month
from app.services.db_backup import create_backup
from app.services.ops_alerts import run_ops_alert_checks
from app.services.referral_rewards import apply_referral_rewards_all
if TYPE_CHECKING:
from app.models.sniper_alert import SniperAlert
@ -161,346 +157,6 @@ async def check_realtime_domains():
await check_domains_by_frequency('realtime')
async def send_weekly_digests():
"""
Send weekly summary emails to all users.
Includes: domains tracked, status changes, available domains, etc.
"""
logger.info("📊 Sending weekly digest emails...")
try:
async with AsyncSessionLocal() as db:
# Get all users with domains
users_result = await db.execute(
select(User).where(User.is_verified == True)
)
users = users_result.scalars().all()
sent = 0
for user in users:
try:
# Get user's domains
domains_result = await db.execute(
select(Domain).where(Domain.user_id == user.id)
)
domains = domains_result.scalars().all()
if not domains:
continue
# Calculate stats
total_domains = len(domains)
available_domains = [d.name for d in domains if d.is_available]
# Get status changes from last week
week_ago = datetime.utcnow() - timedelta(days=7)
checks_result = await db.execute(
select(DomainCheck)
.join(Domain, DomainCheck.domain_id == Domain.id)
.where(
and_(
Domain.user_id == user.id,
DomainCheck.checked_at >= week_ago,
)
)
)
checks = checks_result.scalars().all()
# Count status changes (simplified - just count checks)
status_changes = len(set(c.domain_id for c in checks))
if email_service.is_configured():
await email_service.send_weekly_digest(
to_email=user.email,
total_domains=total_domains,
status_changes=status_changes,
price_alerts=0, # Could track this separately
available_domains=available_domains[:5], # Limit to 5
)
sent += 1
except Exception as e:
logger.error(f"Failed to send digest to {user.email}: {e}")
logger.info(f"📧 Sent {sent} weekly digest emails")
except Exception as e:
logger.exception(f"Weekly digest failed: {e}")
async def check_expiring_domains():
"""
Check for domains expiring soon and send warnings.
Sends alerts for domains expiring within 30 days.
"""
logger.info("📅 Checking for expiring domains...")
try:
async with AsyncSessionLocal() as db:
# Get domains expiring within 30 days
cutoff = datetime.utcnow() + timedelta(days=30)
result = await db.execute(
select(Domain)
.where(
and_(
Domain.is_available == False,
Domain.expiration_date != None,
Domain.expiration_date <= cutoff,
Domain.expiration_date > datetime.utcnow(), # Not yet expired
Domain.notify_on_available == True, # User wants notifications
)
)
)
expiring = result.scalars().all()
if not expiring:
logger.info("No domains expiring soon")
return
logger.info(f"Found {len(expiring)} domains expiring within 30 days")
# Group by user and send alerts
user_domains = {}
for domain in expiring:
if domain.user_id not in user_domains:
user_domains[domain.user_id] = []
days_left = (domain.expiration_date - datetime.utcnow()).days
user_domains[domain.user_id].append({
'name': domain.name,
'days_left': days_left,
'expiration_date': domain.expiration_date,
})
alerts_sent = 0
for user_id, domains_list in user_domains.items():
try:
user_result = await db.execute(
select(User).where(User.id == user_id)
)
user = user_result.scalar_one_or_none()
if user and user.email and email_service.is_configured():
# Build email content
domain_lines = "\n".join([
f"{d['name']} - {d['days_left']} days left"
for d in sorted(domains_list, key=lambda x: x['days_left'])
])
await email_service.send_email(
to_email=user.email,
subject=f"{len(domains_list)} domain{'s' if len(domains_list) > 1 else ''} expiring soon",
html_content=f"""
<h2 style="margin: 0 0 24px 0; font-size: 20px; font-weight: 600; color: #000000;">
Domains expiring soon
</h2>
<p style="margin: 0 0 24px 0; font-size: 15px; color: #333333; line-height: 1.6;">
The following domains on your watchlist are expiring within 30 days:
</p>
<div style="margin: 24px 0; padding: 20px; background: #fafafa; border-radius: 6px; border-left: 3px solid #f59e0b;">
{"".join(f'<p style="margin: 8px 0; font-family: monospace;"><strong>{d["name"]}</strong> — <span style="color: {"#ef4444" if d["days_left"] <= 7 else "#f59e0b"};">{d["days_left"]} days left</span></p>' for d in sorted(domains_list, key=lambda x: x["days_left"]))}
</div>
<p style="margin: 24px 0 0 0; font-size: 14px; color: #666666;">
Keep an eye on these domains — they may become available soon.
</p>
""",
text_content=f"Domains expiring soon:\n{domain_lines}",
)
alerts_sent += 1
except Exception as e:
logger.error(f"Failed to send expiry alert to user {user_id}: {e}")
logger.info(f"📧 Sent {alerts_sent} expiry warning emails")
except Exception as e:
logger.exception(f"Expiry check failed: {e}")
async def run_health_checks():
"""
Run automated health checks on all watched domains.
This runs 1x daily to update domain health status (DNS, HTTP, SSL).
Health data is cached and used to detect weakening domains.
"""
from app.services.domain_health import get_health_checker
from app.models.domain import DomainHealthCache
logger.info("🏥 Starting automated health checks...")
start_time = datetime.utcnow()
try:
async with AsyncSessionLocal() as db:
# Get all watched domains (registered, not available)
result = await db.execute(
select(Domain).where(Domain.is_available == False)
)
domains = result.scalars().all()
logger.info(f"Running health checks on {len(domains)} domains...")
health_checker = get_health_checker()
checked = 0
errors = 0
status_changes = []
for domain in domains:
try:
# Run health check
report = await health_checker.check_domain(domain.name)
# Check for status changes (if we have previous data)
# Get existing cache
cache_result = await db.execute(
select(DomainHealthCache).where(DomainHealthCache.domain_id == domain.id)
)
existing_cache = cache_result.scalar_one_or_none()
old_status = existing_cache.status if existing_cache else None
new_status = report.status.value
# Detect significant changes
if old_status and old_status != new_status:
status_changes.append({
'domain': domain.name,
'old_status': old_status,
'new_status': new_status,
'user_id': domain.user_id,
})
logger.info(f"⚠️ Status change: {domain.name} {old_status}{new_status}")
# Serialize data to JSON strings (cache is used by the UI)
import json
report_dict = report.to_dict()
signals_json = json.dumps(report_dict.get("signals") or [])
dns_json = json.dumps(report_dict.get("dns") or {})
http_json = json.dumps(report_dict.get("http") or {})
ssl_json = json.dumps(report_dict.get("ssl") or {})
# Update or create cache
if existing_cache:
existing_cache.status = new_status
existing_cache.score = report.score
existing_cache.signals = signals_json
existing_cache.dns_data = dns_json
existing_cache.http_data = http_json
existing_cache.ssl_data = ssl_json
existing_cache.checked_at = datetime.utcnow()
else:
# Create new cache entry
new_cache = DomainHealthCache(
domain_id=domain.id,
status=new_status,
score=report.score,
signals=signals_json,
dns_data=dns_json,
http_data=http_json,
ssl_data=ssl_json,
checked_at=datetime.utcnow(),
)
db.add(new_cache)
checked += 1
# Small delay to avoid overwhelming DNS servers
await asyncio.sleep(0.3)
except Exception as e:
logger.error(f"Health check failed for {domain.name}: {e}")
errors += 1
await db.commit()
elapsed = (datetime.utcnow() - start_time).total_seconds()
logger.info(
f"✅ Health checks complete. Checked: {checked}, Errors: {errors}, "
f"Status changes: {len(status_changes)}, Time: {elapsed:.1f}s"
)
# Send alerts for critical status changes (domains becoming critical)
if status_changes:
await send_health_change_alerts(db, status_changes)
except Exception as e:
logger.exception(f"Health check job failed: {e}")
async def send_health_change_alerts(db, changes: list):
"""Send alerts when domains have significant health changes."""
if not email_service.is_configured():
return
for change in changes:
# Only alert on critical changes
if change['new_status'] == 'critical':
try:
result = await db.execute(
select(User).where(User.id == change['user_id'])
)
user = result.scalar_one_or_none()
if user and user.email:
# Use domain available template as fallback (domain might be dropping)
await email_service.send_domain_available(
to_email=user.email,
domain=change['domain'],
register_url=f"https://pounce.ch/terminal/watchlist",
)
logger.info(f"📧 Critical health alert sent for {change['domain']}")
except Exception as e:
logger.error(f"Failed to send health alert: {e}")
async def prepare_monthly_yield_payouts():
"""
Prepare Yield payouts for previous month (admin automation).
Safety:
- Only runs when `internal_api_key` is configured.
- Idempotent: generation skips existing payouts for the same period.
"""
if not (settings.internal_api_key or "").strip():
return
try:
async with AsyncSessionLocal() as db:
await generate_payouts_for_previous_month(db)
except Exception as e:
logger.exception(f"Yield payout preparation failed: {e}")
async def run_db_backup():
"""Create a verified DB backup (4B Ops)."""
if not settings.enable_db_backups:
return
try:
# backup is filesystem / subprocess based; no DB session needed here
create_backup(verify=True)
except Exception as e:
logger.exception(f"DB backup failed: {e}")
async def run_ops_alerting():
"""Evaluate and (optionally) send ops alerts (4B)."""
try:
await run_ops_alert_checks()
except Exception as e:
logger.exception(f"Ops alerting failed: {e}")
async def run_referral_rewards():
"""Recompute and apply referral reward bonuses (3C.2)."""
try:
async with AsyncSessionLocal() as db:
res = await apply_referral_rewards_all(db)
await db.commit()
logger.info("Referral rewards applied: processed=%s updated=%s", res.get("processed"), res.get("updated"))
except Exception as e:
logger.exception(f"Referral rewards job failed: {e}")
def setup_scheduler():
"""Configure and start the scheduler."""
# Daily domain check for Scout users at configured hour
@ -530,103 +186,21 @@ def setup_scheduler():
replace_existing=True,
)
# Automated health checks 1x daily at 06:00 UTC
scheduler.add_job(
run_health_checks,
CronTrigger(hour=6, minute=0),
id="daily_health_check",
name="Daily Health Check (All Domains)",
replace_existing=True,
)
# Expiry warnings 1x weekly (Mondays at 08:00 UTC)
scheduler.add_job(
check_expiring_domains,
CronTrigger(day_of_week='mon', hour=8, minute=0),
id="weekly_expiry_check",
name="Weekly Expiry Warning",
replace_existing=True,
)
# Weekly digest (Sundays at 10:00 UTC)
scheduler.add_job(
send_weekly_digests,
CronTrigger(day_of_week='sun', hour=10, minute=0),
id="weekly_digest",
name="Weekly Digest Email",
replace_existing=True,
)
# Yield payout preparation: run on 2nd day of month at 02:10 UTC
scheduler.add_job(
prepare_monthly_yield_payouts,
CronTrigger(day=2, hour=2, minute=10),
id="yield_payout_prepare",
name="Yield Payout Preparation (Monthly)",
replace_existing=True,
)
# DB backup: daily at 01:30 UTC
scheduler.add_job(
run_db_backup,
CronTrigger(hour=1, minute=30),
id="db_backup",
name="DB Backup (Daily)",
replace_existing=True,
)
# Ops alerting: hourly at :12 (staggered)
scheduler.add_job(
run_ops_alerting,
CronTrigger(minute=12),
id="ops_alerting",
name="Ops Alerting (Hourly)",
replace_existing=True,
)
# Referral rewards: daily at 00:22 UTC (staggered)
scheduler.add_job(
run_referral_rewards,
CronTrigger(hour=0, minute=22),
id="referral_rewards",
name="Referral Rewards (Daily)",
replace_existing=True,
)
# TLD price scrape 2x daily for better historical data
# Morning scrape at 03:00 UTC
# Daily TLD price scrape at 03:00 UTC
scheduler.add_job(
scrape_tld_prices,
CronTrigger(hour=3, minute=0),
id="morning_tld_scrape",
name="TLD Price Scrape (Morning 03:00 UTC)",
id="daily_tld_scrape",
name="Daily TLD Price Scrape",
replace_existing=True,
)
# Afternoon scrape at 15:00 UTC (captures price changes during US business hours)
scheduler.add_job(
scrape_tld_prices,
CronTrigger(hour=15, minute=0),
id="afternoon_tld_scrape",
name="TLD Price Scrape (Afternoon 15:00 UTC)",
replace_existing=True,
)
# Price change check at 04:00 UTC (after morning scrape completes)
# Price change check at 04:00 UTC (after scrape completes)
scheduler.add_job(
check_price_changes,
CronTrigger(hour=4, minute=0),
id="morning_price_check",
name="Price Change Check (Morning)",
replace_existing=True,
)
# Price change check at 16:00 UTC (after afternoon scrape)
scheduler.add_job(
check_price_changes,
CronTrigger(hour=16, minute=0),
id="afternoon_price_check",
name="Price Change Check (Afternoon)",
id="daily_price_check",
name="Daily Price Change Check",
replace_existing=True,
)
@ -639,12 +213,12 @@ def setup_scheduler():
replace_existing=True,
)
# Cleanup expired auctions every 5 minutes (CRITICAL for data freshness!)
# Cleanup expired auctions every 15 minutes (CRITICAL for data freshness!)
scheduler.add_job(
cleanup_expired_auctions,
CronTrigger(minute='*/5'), # Every 5 minutes
CronTrigger(minute='*/15'), # Every 15 minutes
id="auction_cleanup",
name="Expired Auction Cleanup (5m)",
name="Expired Auction Cleanup (15m)",
replace_existing=True,
)
@ -657,44 +231,16 @@ def setup_scheduler():
replace_existing=True,
)
# Zone file sync for .ch and .li domains (daily at 05:00 UTC)
scheduler.add_job(
sync_zone_files,
CronTrigger(hour=5, minute=0), # Daily at 05:00 UTC
id="zone_file_sync",
name="Zone File Sync - Switch.ch (daily)",
replace_existing=True,
)
# CZDS zone file sync for gTLDs (daily at 06:00 UTC, after Switch sync)
scheduler.add_job(
sync_czds_zones,
CronTrigger(hour=6, minute=0), # Daily at 06:00 UTC
id="czds_zone_sync",
name="Zone File Sync - ICANN CZDS (daily)",
replace_existing=True,
)
# Zone data cleanup (hourly - delete drops older than 48h)
scheduler.add_job(
cleanup_zone_data,
CronTrigger(minute=45), # Every hour at :45
id="zone_cleanup",
name="Zone Data Cleanup (hourly)",
replace_existing=True,
)
logger.info(
f"Scheduler configured:"
f"\n - Scout domain check at {settings.check_hour:02d}:{settings.check_minute:02d} (daily)"
f"\n - Trader domain check every hour at :00"
f"\n - Tycoon domain check every 10 minutes"
f"\n - TLD price scrape 2x daily at 03:00 & 15:00 UTC"
f"\n - Price change alerts at 04:00 & 16:00 UTC"
f"\n - TLD price scrape at 03:00 UTC"
f"\n - Price change alerts at 04:00 UTC"
f"\n - Auction scrape every 2 hours at :30"
f"\n - Expired auction cleanup every 15 minutes"
f"\n - Sniper alert matching every 30 minutes"
f"\n - Zone file sync daily at 05:00 UTC"
)
@ -725,7 +271,7 @@ async def run_manual_tld_scrape():
async def send_domain_availability_alerts(db, domains: list[Domain]):
"""Send email alerts for newly available domains."""
if not email_service.is_configured():
if not email_service.is_enabled:
logger.info("Email service not configured, skipping domain alerts")
return
@ -739,18 +285,14 @@ async def send_domain_availability_alerts(db, domains: list[Domain]):
)
user = result.scalar_one_or_none()
if user and user.email and domain.notify_on_available:
# Create registration URL
register_url = f"https://www.namecheap.com/domains/registration/results/?domain={domain.name}"
success = await email_service.send_domain_available(
if user and user.email:
success = await email_service.send_domain_available_alert(
to_email=user.email,
domain=domain.name,
register_url=register_url,
user_name=user.name,
)
if success:
alerts_sent += 1
logger.info(f"📧 Alert sent for {domain.name} to {user.email}")
except Exception as e:
logger.error(f"Failed to send alert for {domain.name}: {e}")
@ -798,12 +340,12 @@ async def cleanup_expired_auctions():
async with AsyncSessionLocal() as db:
now = datetime.utcnow()
# 1. Mark ended auctions as inactive (<= now to avoid "0m" linger)
# 1. Mark ended auctions as inactive
stmt = (
update(DomainAuction)
.where(
and_(
DomainAuction.end_time <= now,
DomainAuction.end_time < now,
DomainAuction.is_active == True
)
)
@ -859,86 +401,12 @@ async def scrape_auctions():
logger.exception(f"Auction scrape failed: {e}")
async def cleanup_zone_data():
"""Clean up old zone file data to save storage."""
logger.info("Starting zone data cleanup...")
try:
from app.services.zone_file import cleanup_old_drops, cleanup_old_snapshots
async with AsyncSessionLocal() as db:
# Delete drops older than 48h
drops_deleted = await cleanup_old_drops(db, hours=48)
# Delete snapshots older than 7 days
snapshots_deleted = await cleanup_old_snapshots(db, keep_days=7)
logger.info(f"Zone cleanup: {drops_deleted} drops, {snapshots_deleted} snapshots deleted")
except Exception as e:
logger.exception(f"Zone data cleanup failed: {e}")
async def sync_zone_files():
"""Sync zone files from Switch.ch (.ch, .li) and ICANN CZDS (gTLDs)."""
logger.info("Starting zone file sync...")
try:
from app.services.zone_file import ZoneFileService
service = ZoneFileService()
async with AsyncSessionLocal() as db:
# Sync Switch.ch zones (.ch, .li)
for tld in ["ch", "li"]:
try:
result = await service.run_daily_sync(db, tld)
logger.info(f".{tld} zone sync: {len(result.get('dropped', []))} dropped, {result.get('new_count', 0)} new")
except Exception as e:
logger.error(f".{tld} zone sync failed: {e}")
logger.info("Switch.ch zone file sync completed")
except Exception as e:
logger.exception(f"Zone file sync failed: {e}")
async def sync_czds_zones():
"""Sync zone files from ICANN CZDS (gTLDs like .xyz, .org, .dev, .app)."""
logger.info("Starting CZDS zone file sync...")
try:
from app.services.czds_client import CZDSClient, APPROVED_TLDS
from app.config import get_settings
settings = get_settings()
# Skip if credentials not configured
if not settings.czds_username or not settings.czds_password:
logger.info("CZDS credentials not configured, skipping CZDS sync")
return
client = CZDSClient()
async with AsyncSessionLocal() as db:
results = await client.sync_all_zones(db, APPROVED_TLDS)
success_count = sum(1 for r in results if r["status"] == "success")
total_dropped = sum(r["dropped_count"] for r in results)
logger.info(f"CZDS sync complete: {success_count}/{len(APPROVED_TLDS)} zones, {total_dropped:,} dropped")
except Exception as e:
logger.exception(f"CZDS zone file sync failed: {e}")
async def match_sniper_alerts():
"""Match active sniper alerts against auctions AND drops and notify users."""
"""Match active sniper alerts against current auctions and notify users."""
from app.models.sniper_alert import SniperAlert, SniperAlertMatch
from app.models.auction import DomainAuction
from app.models.zone_file import DroppedDomain
logger.info("Matching sniper alerts against auctions and drops...")
logger.info("Matching sniper alerts against new auctions...")
try:
async with AsyncSessionLocal() as db:
@ -953,65 +421,39 @@ async def match_sniper_alerts():
return
# Get recent auctions (added in last 2 hours)
auction_cutoff = datetime.utcnow() - timedelta(hours=2)
cutoff = datetime.utcnow() - timedelta(hours=2)
auctions_result = await db.execute(
select(DomainAuction).where(
and_(
DomainAuction.is_active == True,
DomainAuction.scraped_at >= auction_cutoff,
DomainAuction.scraped_at >= cutoff,
)
)
)
auctions = auctions_result.scalars().all()
# Get recent drops (last 24 hours)
drop_cutoff = datetime.utcnow() - timedelta(hours=24)
drops_result = await db.execute(
select(DroppedDomain).where(DroppedDomain.dropped_date >= drop_cutoff)
)
drops = drops_result.scalars().all()
logger.info(f"Checking {len(alerts)} alerts against {len(auctions)} auctions and {len(drops)} drops")
if not auctions:
logger.info("No recent auctions to match against")
return
matches_created = 0
notifications_sent = 0
for alert in alerts:
matching_items = []
matching_auctions = []
# Match against auctions
for auction in auctions:
if _auction_matches_alert(auction, alert):
matching_items.append({
'domain': auction.domain,
'source': 'auction',
'platform': auction.platform,
'price': auction.current_bid,
'end_time': auction.end_time,
'url': auction.auction_url,
})
matching_auctions.append(auction)
# Match against drops
for drop in drops:
if _drop_matches_alert(drop, alert):
full_domain = f"{drop.domain}.{drop.tld}"
matching_items.append({
'domain': full_domain,
'source': 'drop',
'platform': f'.{drop.tld} zone',
'price': 0,
'end_time': None,
'url': f"https://pounce.ch/terminal/hunt?tab=drops&search={drop.domain}",
})
if matching_items:
for item in matching_items:
if matching_auctions:
for auction in matching_auctions:
# Check if this match already exists
existing = await db.execute(
select(SniperAlertMatch).where(
and_(
SniperAlertMatch.alert_id == alert.id,
SniperAlertMatch.domain == item['domain'],
SniperAlertMatch.domain == auction.domain,
)
)
)
@ -1021,61 +463,48 @@ async def match_sniper_alerts():
# Create new match
match = SniperAlertMatch(
alert_id=alert.id,
domain=item['domain'],
platform=item['platform'],
current_bid=item['price'],
end_time=item['end_time'] or datetime.utcnow(),
auction_url=item['url'],
domain=auction.domain,
platform=auction.platform,
current_bid=auction.current_bid,
end_time=auction.end_time,
auction_url=auction.auction_url,
matched_at=datetime.utcnow(),
)
db.add(match)
matches_created += 1
# Update alert stats
alert.matches_count = (alert.matches_count or 0) + 1
alert.last_matched_at = datetime.utcnow()
# Send notification if enabled (batch notification)
if alert.notify_email and matching_items:
# Update alert last_triggered
alert.last_triggered = datetime.utcnow()
# Send notification if enabled
if alert.notify_email:
try:
user_result = await db.execute(
select(User).where(User.id == alert.user_id)
)
user = user_result.scalar_one_or_none()
if user and email_service.is_configured():
auction_matches = [m for m in matching_items if m['source'] == 'auction']
drop_matches = [m for m in matching_items if m['source'] == 'drop']
# Build HTML content
html_parts = [f'<h2>Your Sniper Alert "{alert.name}" matched!</h2>']
if auction_matches:
html_parts.append(f'<h3>🎯 {len(auction_matches)} Auction Match{"es" if len(auction_matches) > 1 else ""}</h3><ul>')
for m in auction_matches[:10]:
html_parts.append(f'<li><strong>{m["domain"]}</strong> - ${m["price"]:.0f} on {m["platform"]}</li>')
html_parts.append('</ul>')
if drop_matches:
html_parts.append(f'<h3>🔥 {len(drop_matches)} Fresh Drop{"s" if len(drop_matches) > 1 else ""}</h3><ul>')
for m in drop_matches[:10]:
html_parts.append(f'<li><strong>{m["domain"]}</strong> - Just dropped!</li>')
html_parts.append('</ul>')
html_parts.append('<p><a href="https://pounce.ch/terminal/sniper">View all matches in Pounce</a></p>')
if user and email_service.is_enabled:
# Send email with matching domains
domains_list = ", ".join([a.domain for a in matching_auctions[:5]])
await email_service.send_email(
to_email=user.email,
subject=f"🎯 Sniper Alert: {len(matching_items)} matching domains found!",
html_content=''.join(html_parts),
subject=f"🎯 Sniper Alert: {len(matching_auctions)} matching domains found!",
html_content=f"""
<h2>Your Sniper Alert "{alert.name}" matched!</h2>
<p>We found {len(matching_auctions)} domains matching your criteria:</p>
<ul>
{"".join(f"<li><strong>{a.domain}</strong> - ${a.current_bid:.0f} on {a.platform}</li>" for a in matching_auctions[:10])}
</ul>
<p><a href="https://pounce.ch/command/alerts">View all matches in your Command Center</a></p>
"""
)
notifications_sent += 1
alert.notifications_sent = (alert.notifications_sent or 0) + 1
except Exception as e:
logger.error(f"Failed to send sniper alert notification: {e}")
await db.commit()
logger.info(f"Sniper alert matching complete: {matches_created} matches, {notifications_sent} notifications")
logger.info(f"Sniper alert matching complete: {matches_created} matches created, {notifications_sent} notifications sent")
except Exception as e:
logger.exception(f"Sniper alert matching failed: {e}")
@ -1085,16 +514,9 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
"""Check if an auction matches the criteria of a sniper alert."""
domain_name = auction.domain.rsplit('.', 1)[0] if '.' in auction.domain else auction.domain
# Check keyword filter (must contain any of the keywords)
if alert.keywords:
required = [k.strip().lower() for k in alert.keywords.split(',')]
if not any(kw in domain_name.lower() for kw in required):
return False
# Check exclude keywords
if alert.exclude_keywords:
excluded = [k.strip().lower() for k in alert.exclude_keywords.split(',')]
if any(kw in domain_name.lower() for kw in excluded):
# Check keyword filter
if alert.keyword:
if alert.keyword.lower() not in domain_name.lower():
return False
# Check TLD filter
@ -1103,12 +525,6 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
if auction.tld.lower() not in allowed_tlds:
return False
# Check platform filter
if alert.platforms:
allowed_platforms = [p.strip().lower() for p in alert.platforms.split(',')]
if auction.platform.lower() not in allowed_platforms:
return False
# Check length filters
if alert.min_length and len(domain_name) < alert.min_length:
return False
@ -1121,68 +537,17 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
if alert.max_price and auction.current_bid > alert.max_price:
return False
# Check bids filter (low competition)
if alert.max_bids and auction.num_bids and auction.num_bids > alert.max_bids:
return False
# Check no_numbers filter
if alert.no_numbers:
# Check exclusion filters
if alert.exclude_numbers:
if any(c.isdigit() for c in domain_name):
return False
# Check no_hyphens filter
if alert.no_hyphens:
if alert.exclude_hyphens:
if '-' in domain_name:
return False
# Check exclude_chars
if alert.exclude_chars:
excluded = set(c.strip().lower() for c in alert.exclude_chars.split(','))
if any(c in excluded for c in domain_name.lower()):
return False
return True
def _drop_matches_alert(drop, alert: "SniperAlert") -> bool:
"""Check if a dropped domain matches the criteria of a sniper alert."""
domain_name = drop.domain # Already just the name without TLD
# Check keyword filter
if alert.keywords:
required = [k.strip().lower() for k in alert.keywords.split(',')]
if not any(kw in domain_name.lower() for kw in required):
return False
# Check exclude keywords
if alert.exclude_keywords:
excluded = [k.strip().lower() for k in alert.exclude_keywords.split(',')]
if any(kw in domain_name.lower() for kw in excluded):
return False
# Check TLD filter
if alert.tlds:
allowed_tlds = [t.strip().lower() for t in alert.tlds.split(',')]
if drop.tld.lower() not in allowed_tlds:
return False
# Check length filters
if alert.min_length and len(domain_name) < alert.min_length:
return False
if alert.max_length and len(domain_name) > alert.max_length:
return False
# Check no_numbers filter (use drop.is_numeric)
if alert.no_numbers and drop.is_numeric:
return False
# Check no_hyphens filter (use drop.has_hyphen)
if alert.no_hyphens and drop.has_hyphen:
return False
# Check exclude_chars
if alert.exclude_chars:
excluded = set(c.strip().lower() for c in alert.exclude_chars.split(','))
excluded = set(alert.exclude_chars.lower())
if any(c in excluded for c in domain_name.lower()):
return False

View File

@ -1,35 +0,0 @@
"""
Analyze schemas (Alpha Terminal - Phase 2 Diligence).
Open-data-first: we return null + reason when data isn't available.
"""
from __future__ import annotations
from datetime import datetime
from typing import Any, Optional
from pydantic import BaseModel, Field
class AnalyzeItem(BaseModel):
key: str
label: str
value: Optional[Any] = None
status: str = Field(default="info", description="pass|warn|fail|info|na")
source: str = Field(default="internal", description="internal|rdap|whois|dns|http|ssl|db|open_data")
details: dict[str, Any] = Field(default_factory=dict)
class AnalyzeSection(BaseModel):
key: str
title: str
items: list[AnalyzeItem] = Field(default_factory=list)
class AnalyzeResponse(BaseModel):
domain: str
computed_at: datetime
cached: bool = False
sections: list[AnalyzeSection]

View File

@ -10,8 +10,6 @@ class UserCreate(BaseModel):
email: EmailStr
password: str = Field(..., min_length=8, max_length=100)
name: Optional[str] = Field(None, max_length=100)
# Yield referral tracking
ref: Optional[str] = Field(None, max_length=100, description="Referral code from yield domain")
class UserLogin(BaseModel):
@ -41,36 +39,8 @@ class Token(BaseModel):
expires_in: int
class LoginResponse(BaseModel):
"""Login response when using HttpOnly cookie authentication."""
expires_in: int
class TokenData(BaseModel):
"""Schema for token payload data."""
user_id: Optional[int] = None
email: Optional[str] = None
class ReferralStats(BaseModel):
"""Referral reward snapshot for the current user (3C.2)."""
window_days: int = 30
referred_users_total: int = 0
qualified_referrals_total: int = 0
referral_link_views_window: int = 0
bonus_domains: int = 0
next_reward_at: int = 0
badge: Optional[str] = None # "verified_referrer" | "elite_referrer"
cooldown_days: int = 7
disqualified_cooldown_total: int = 0
disqualified_missing_ip_total: int = 0
disqualified_shared_ip_total: int = 0
disqualified_duplicate_ip_total: int = 0
class ReferralLinkResponse(BaseModel):
invite_code: str
url: str
stats: ReferralStats

View File

@ -1,51 +0,0 @@
"""CFO (Management) schemas."""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class CfoMonthlyBucket(BaseModel):
month: str # YYYY-MM
total_cost_usd: float = 0.0
domains: int = 0
class CfoUpcomingCostRow(BaseModel):
domain_id: int
domain: str
renewal_date: Optional[datetime] = None
renewal_cost_usd: Optional[float] = None
cost_source: str = Field(default="unknown", description="portfolio|tld_prices|unknown")
is_sold: bool = False
class CfoKillListRow(BaseModel):
domain_id: int
domain: str
renewal_date: Optional[datetime] = None
renewal_cost_usd: Optional[float] = None
cost_source: str = "unknown"
auto_renew: bool = True
is_dns_verified: bool = False
yield_net_60d: float = 0.0
yield_clicks_60d: int = 0
reason: str
class CfoSummaryResponse(BaseModel):
computed_at: datetime
upcoming_30d_total_usd: float = 0.0
upcoming_30d_rows: list[CfoUpcomingCostRow] = []
monthly: list[CfoMonthlyBucket] = []
kill_list: list[CfoKillListRow] = []
class SetToDropResponse(BaseModel):
domain_id: int
auto_renew: bool
updated_at: datetime

View File

@ -88,15 +88,3 @@ class DomainListResponse(BaseModel):
per_page: int
pages: int
class ExpiryUpdate(BaseModel):
"""Schema for manually setting domain expiration date."""
expiration_date: Optional[datetime] = None
class Config:
json_schema_extra = {
"example": {
"expiration_date": "2025-12-31T00:00:00Z"
}
}

View File

@ -1,93 +0,0 @@
"""HUNT (Discovery) schemas."""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class HuntSniperItem(BaseModel):
domain: str
platform: str
auction_url: str
current_bid: float
currency: str
end_time: datetime
age_years: Optional[int] = None
backlinks: Optional[int] = None
pounce_score: Optional[int] = None
class HuntSniperResponse(BaseModel):
items: list[HuntSniperItem]
total: int
filtered_out_missing_data: int = 0
last_updated: Optional[datetime] = None
class TrendItem(BaseModel):
title: str
approx_traffic: Optional[str] = None
published_at: Optional[datetime] = None
link: Optional[str] = None
class TrendsResponse(BaseModel):
geo: str = "US"
items: list[TrendItem]
fetched_at: datetime
class KeywordAvailabilityRequest(BaseModel):
keywords: list[str] = Field(min_length=1, max_length=25)
tlds: list[str] = Field(default_factory=lambda: ["com", "io", "ai", "net", "org"], max_length=20)
class KeywordAvailabilityRow(BaseModel):
keyword: str
domain: str
tld: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class KeywordAvailabilityResponse(BaseModel):
items: list[KeywordAvailabilityRow]
class TypoCheckRequest(BaseModel):
brand: str = Field(min_length=2, max_length=50)
tlds: list[str] = Field(default_factory=lambda: ["com"], max_length=10)
limit: int = Field(default=50, ge=1, le=200)
class TypoCandidate(BaseModel):
domain: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class TypoCheckResponse(BaseModel):
brand: str
items: list[TypoCandidate]
class BrandableRequest(BaseModel):
pattern: str = Field(description="cvcvc|cvccv|human", examples=["cvcvc"])
tlds: list[str] = Field(default_factory=lambda: ["com"], max_length=10)
limit: int = Field(default=30, ge=1, le=100)
max_checks: int = Field(default=400, ge=50, le=2000)
class BrandableCandidate(BaseModel):
domain: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class BrandableResponse(BaseModel):
pattern: str
items: list[BrandableCandidate]

View File

@ -1,33 +0,0 @@
"""
Referral schemas (3C.2).
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class ReferralKpiWindow(BaseModel):
days: int = Field(ge=1, le=365)
start: datetime
end: datetime
class ReferralReferrerRow(BaseModel):
user_id: int
email: str
invite_code: Optional[str] = None
created_at: datetime
referred_users_total: int = 0
referred_users_window: int = 0
referral_link_views_window: int = 0
class ReferralKpisResponse(BaseModel):
window: ReferralKpiWindow
totals: dict[str, int]
referrers: list[ReferralReferrerRow]

View File

@ -1,47 +0,0 @@
"""
Telemetry schemas (4A.2).
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class TelemetryKpiWindow(BaseModel):
days: int = Field(ge=1, le=365)
start: datetime
end: datetime
class DealFunnelKpis(BaseModel):
listing_views: int = 0
inquiries_created: int = 0
seller_replied_inquiries: int = 0
inquiry_reply_rate: float = 0.0
listings_with_inquiries: int = 0
listings_sold: int = 0
inquiry_to_sold_listing_rate: float = 0.0
median_reply_seconds: Optional[float] = None
median_time_to_sold_seconds: Optional[float] = None
class YieldFunnelKpis(BaseModel):
connected_domains: int = 0
clicks: int = 0
conversions: int = 0
conversion_rate: float = 0.0
payouts_paid: int = 0
payouts_paid_amount_total: float = 0.0
class TelemetryKpisResponse(BaseModel):
window: TelemetryKpiWindow
deal: DealFunnelKpis
yield_: YieldFunnelKpis = Field(alias="yield")

View File

@ -1,286 +0,0 @@
"""
Pydantic schemas for Yield/Intent Routing feature.
"""
from datetime import datetime
from decimal import Decimal
from typing import Optional
from pydantic import BaseModel, Field
# ============================================================================
# Intent Detection
# ============================================================================
class IntentAnalysis(BaseModel):
"""Intent detection result for a domain."""
category: str
subcategory: Optional[str] = None
confidence: float = Field(ge=0.0, le=1.0)
keywords_matched: list[str] = []
suggested_partners: list[str] = []
monetization_potential: str # "high", "medium", "low"
class YieldValueEstimate(BaseModel):
"""Estimated yield value for a domain."""
estimated_monthly_min: int
estimated_monthly_max: int
currency: str = "CHF"
potential: str
confidence: float
geo: Optional[str] = None
class DomainYieldAnalysis(BaseModel):
"""Complete yield analysis for a domain."""
domain: str
intent: IntentAnalysis
value: YieldValueEstimate
partners: list[str] = []
monetization_potential: str
# ============================================================================
# Yield Domain CRUD
# ============================================================================
class YieldDomainCreate(BaseModel):
"""Create a new yield domain."""
domain: str = Field(..., min_length=3, max_length=255)
class YieldDomainUpdate(BaseModel):
"""Update yield domain settings."""
active_route: Optional[str] = None
landing_page_url: Optional[str] = None
status: Optional[str] = None
class YieldDomainResponse(BaseModel):
"""Yield domain response."""
id: int
domain: str
status: str
# Intent
detected_intent: Optional[str] = None
intent_confidence: float = 0.0
# Routing
active_route: Optional[str] = None
partner_name: Optional[str] = None
# DNS
dns_verified: bool = False
dns_verified_at: Optional[datetime] = None
connected_at: Optional[datetime] = None
# Stats
total_clicks: int = 0
total_conversions: int = 0
total_revenue: Decimal = Decimal("0")
currency: str = "CHF"
# Timestamps
activated_at: Optional[datetime] = None
created_at: datetime
class Config:
from_attributes = True
class YieldDomainListResponse(BaseModel):
"""List of yield domains with summary stats."""
domains: list[YieldDomainResponse]
total: int
# Aggregates
total_active: int = 0
total_revenue: Decimal = Decimal("0")
total_clicks: int = 0
# ============================================================================
# Transactions
# ============================================================================
class YieldTransactionResponse(BaseModel):
"""Single transaction record."""
id: int
event_type: str
partner_slug: str
click_id: Optional[str] = None
gross_amount: Decimal
net_amount: Decimal
currency: str
status: str
geo_country: Optional[str] = None
created_at: datetime
confirmed_at: Optional[datetime] = None
class Config:
from_attributes = True
class YieldTransactionListResponse(BaseModel):
"""List of transactions."""
transactions: list[YieldTransactionResponse]
total: int
# Aggregates
total_gross: Decimal = Decimal("0")
total_net: Decimal = Decimal("0")
# ============================================================================
# Payouts
# ============================================================================
class YieldPayoutResponse(BaseModel):
"""Payout record."""
id: int
amount: Decimal
currency: str
period_start: datetime
period_end: datetime
transaction_count: int
status: str
payment_method: Optional[str] = None
payment_reference: Optional[str] = None
created_at: datetime
completed_at: Optional[datetime] = None
class Config:
from_attributes = True
class YieldPayoutListResponse(BaseModel):
"""List of payouts."""
payouts: list[YieldPayoutResponse]
total: int
total_paid: Decimal = Decimal("0")
total_pending: Decimal = Decimal("0")
# ============================================================================
# Dashboard
# ============================================================================
class YieldDashboardStats(BaseModel):
"""Yield dashboard statistics."""
# Domain counts
total_domains: int = 0
active_domains: int = 0
pending_domains: int = 0
# Revenue (current month)
monthly_revenue: Decimal = Decimal("0")
monthly_clicks: int = 0
monthly_conversions: int = 0
# Lifetime
lifetime_revenue: Decimal = Decimal("0")
lifetime_clicks: int = 0
lifetime_conversions: int = 0
# Pending payout
pending_payout: Decimal = Decimal("0")
next_payout_date: Optional[datetime] = None
currency: str = "CHF"
class YieldDashboardResponse(BaseModel):
"""Complete yield dashboard data."""
stats: YieldDashboardStats
domains: list[YieldDomainResponse]
recent_transactions: list[YieldTransactionResponse]
top_domains: list[YieldDomainResponse]
# ============================================================================
# Partners
# ============================================================================
class AffiliatePartnerResponse(BaseModel):
"""Affiliate partner info (public view)."""
slug: str
name: str
network: str
intent_categories: list[str]
geo_countries: list[str]
payout_type: str
description: Optional[str] = None
logo_url: Optional[str] = None
class Config:
from_attributes = True
# ============================================================================
# DNS Verification
# ============================================================================
class DNSVerificationResult(BaseModel):
"""Result of DNS verification check."""
domain: str
verified: bool
expected_ns: list[str]
actual_ns: list[str]
cname_ok: bool = False
error: Optional[str] = None
checked_at: datetime
class DNSSetupInstructions(BaseModel):
"""DNS setup instructions for a domain."""
domain: str
# Option 1: Nameserver delegation
nameservers: list[str]
# Option 2: CNAME
cname_host: str
cname_target: str
# Verification
verification_url: str
# ============================================================================
# Activation Flow
# ============================================================================
class ActivateYieldRequest(BaseModel):
"""Request to activate a domain for yield."""
domain: str = Field(..., min_length=3, max_length=255)
accept_terms: bool = False
class ActivateYieldResponse(BaseModel):
"""Response after initiating yield activation."""
domain_id: int
domain: str
status: str
# Analysis
intent: IntentAnalysis
value_estimate: YieldValueEstimate
# Setup
dns_instructions: DNSSetupInstructions
message: str

View File

@ -1,64 +0,0 @@
"""
Security helpers (cookies, environment checks).
We use HttpOnly cookies for browser auth to avoid storing JWTs in localStorage/URLs.
"""
from __future__ import annotations
import os
from fastapi import Response
AUTH_COOKIE_NAME = "pounce_access_token"
def cookie_domain() -> str | None:
"""
Optional cookie domain override.
Use with care. Example (share across subdomains): COOKIE_DOMAIN=.pounce.ch
Leave empty in local development (localhost).
"""
value = os.getenv("COOKIE_DOMAIN", "").strip()
return value or None
def should_use_secure_cookies() -> bool:
"""
Determine whether cookies should be marked Secure.
Prefer explicit config via COOKIE_SECURE=true. Otherwise infer from SITE_URL / ENVIRONMENT.
"""
if os.getenv("COOKIE_SECURE", "").lower() == "true":
return True
site_url = os.getenv("SITE_URL", "")
if site_url.startswith("https://"):
return True
env = os.getenv("ENVIRONMENT", "").lower()
return env in {"prod", "production"}
def set_auth_cookie(response: Response, token: str, max_age_seconds: int) -> None:
response.set_cookie(
key=AUTH_COOKIE_NAME,
value=token,
httponly=True,
secure=should_use_secure_cookies(),
samesite="lax",
max_age=max_age_seconds,
path="/",
domain=cookie_domain(),
)
def clear_auth_cookie(response: Response) -> None:
response.delete_cookie(
key=AUTH_COOKIE_NAME,
path="/",
domain=cookie_domain(),
)

View File

@ -1,6 +0,0 @@
"""Database seed scripts."""
from app.seeds.yield_partners import seed_partners
__all__ = ["seed_partners"]

View File

@ -1,474 +0,0 @@
"""
Seed data for Yield affiliate partners.
Run via: python -m app.seeds.yield_partners
Or: from app.seeds.yield_partners import seed_partners; await seed_partners(db)
"""
import asyncio
import logging
from decimal import Decimal
from typing import Any
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import AsyncSessionLocal
from app.models.yield_domain import AffiliatePartner
logger = logging.getLogger(__name__)
# Partner configurations grouped by category
PARTNER_SEED_DATA: list[dict[str, Any]] = [
# =========================================================================
# MEDICAL / HEALTH
# =========================================================================
{
"name": "Comparis Dental",
"slug": "comparis_dental",
"network": "direct",
"intent_categories": "medical_dental",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Dental treatment comparison platform. High conversion for Swiss dental searches.",
"priority": 100,
},
{
"name": "Swisssmile",
"slug": "swisssmile",
"network": "awin",
"intent_categories": "medical_dental",
"geo_countries": "CH,DE,AT",
"payout_type": "cpl",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "Premium dental clinics network.",
"priority": 90,
},
{
"name": "Comparis Health",
"slug": "comparis_health",
"network": "direct",
"intent_categories": "medical_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("20.00"),
"payout_currency": "CHF",
"description": "Health insurance comparison.",
"priority": 100,
},
{
"name": "Sanitas",
"slug": "sanitas",
"network": "awin",
"intent_categories": "medical_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("35.00"),
"payout_currency": "CHF",
"description": "Swiss health insurance provider.",
"priority": 80,
},
{
"name": "Swiss Esthetic",
"slug": "swissesthetic",
"network": "direct",
"intent_categories": "medical_beauty",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("40.00"),
"payout_currency": "CHF",
"description": "Aesthetic treatments and beauty clinics.",
"priority": 90,
},
# =========================================================================
# FINANCE / INSURANCE
# =========================================================================
{
"name": "Comparis Insurance",
"slug": "comparis_insurance",
"network": "direct",
"intent_categories": "finance_insurance",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "All-in-one insurance comparison.",
"priority": 100,
},
{
"name": "Bonus.ch",
"slug": "bonus_ch",
"network": "awin",
"intent_categories": "finance_insurance",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Swiss insurance comparison portal.",
"priority": 80,
},
{
"name": "Comparis Hypo",
"slug": "comparis_hypo",
"network": "direct",
"intent_categories": "finance_mortgage",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("100.00"),
"payout_currency": "CHF",
"description": "Mortgage comparison - high value leads.",
"priority": 100,
},
{
"name": "MoneyPark",
"slug": "moneypark",
"network": "awin",
"intent_categories": "finance_mortgage",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("120.00"),
"payout_currency": "CHF",
"description": "Independent mortgage broker.",
"priority": 90,
},
{
"name": "Neon Bank",
"slug": "neon_bank",
"network": "partnerstack",
"intent_categories": "finance_banking",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("50.00"),
"payout_currency": "CHF",
"description": "Swiss mobile banking app.",
"priority": 80,
},
# =========================================================================
# LEGAL
# =========================================================================
{
"name": "Legal CH",
"slug": "legal_ch",
"network": "direct",
"intent_categories": "legal_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("50.00"),
"payout_currency": "CHF",
"description": "Lawyer matching service.",
"priority": 100,
},
{
"name": "Anwalt24",
"slug": "anwalt24",
"network": "awin",
"intent_categories": "legal_general",
"geo_countries": "DE,AT",
"payout_type": "cpl",
"payout_amount": Decimal("35.00"),
"payout_currency": "EUR",
"description": "German lawyer directory.",
"priority": 80,
},
# =========================================================================
# REAL ESTATE
# =========================================================================
{
"name": "Homegate",
"slug": "homegate",
"network": "awin",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH",
"payout_type": "cpc",
"payout_amount": Decimal("0.50"),
"payout_currency": "CHF",
"description": "Switzerland's #1 real estate platform.",
"priority": 100,
},
{
"name": "ImmoScout24",
"slug": "immoscout",
"network": "awin",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH,DE",
"payout_type": "cpc",
"payout_amount": Decimal("0.40"),
"payout_currency": "CHF",
"description": "Real estate marketplace.",
"priority": 90,
},
{
"name": "Comparis Immo",
"slug": "comparis_immo",
"network": "direct",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("15.00"),
"payout_currency": "CHF",
"description": "Property valuation and search.",
"priority": 85,
},
# =========================================================================
# TRAVEL
# =========================================================================
{
"name": "Skyscanner",
"slug": "skyscanner",
"network": "awin",
"intent_categories": "travel_flights",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.30"),
"payout_currency": "CHF",
"description": "Flight comparison engine.",
"priority": 90,
},
{
"name": "Booking.com",
"slug": "booking_com",
"network": "awin",
"intent_categories": "travel_hotels",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("4.00"), # 4% commission
"payout_currency": "CHF",
"description": "World's leading accommodation site.",
"priority": 100,
},
# =========================================================================
# AUTOMOTIVE
# =========================================================================
{
"name": "AutoScout24",
"slug": "autoscout",
"network": "awin",
"intent_categories": "auto_buy",
"geo_countries": "CH,DE",
"payout_type": "cpc",
"payout_amount": Decimal("0.60"),
"payout_currency": "CHF",
"description": "Auto marketplace.",
"priority": 100,
},
{
"name": "Comparis Auto",
"slug": "comparis_auto",
"network": "direct",
"intent_categories": "auto_buy,auto_service",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Car insurance & leasing comparison.",
"priority": 90,
},
# =========================================================================
# JOBS
# =========================================================================
{
"name": "Jobs.ch",
"slug": "jobs_ch",
"network": "awin",
"intent_categories": "jobs",
"geo_countries": "CH",
"payout_type": "cpc",
"payout_amount": Decimal("0.40"),
"payout_currency": "CHF",
"description": "Swiss job board.",
"priority": 100,
},
{
"name": "Indeed",
"slug": "indeed",
"network": "awin",
"intent_categories": "jobs",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.25"),
"payout_currency": "CHF",
"description": "Global job search engine.",
"priority": 80,
},
# =========================================================================
# EDUCATION
# =========================================================================
{
"name": "Udemy",
"slug": "udemy",
"network": "awin",
"intent_categories": "education",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("10.00"), # Per sale
"payout_currency": "USD",
"description": "Online courses platform.",
"priority": 80,
},
# =========================================================================
# TECHNOLOGY / HOSTING
# =========================================================================
{
"name": "Hostpoint",
"slug": "hostpoint",
"network": "partnerstack",
"intent_categories": "tech_hosting",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "Swiss web hosting leader.",
"priority": 100,
},
{
"name": "Infomaniak",
"slug": "infomaniak",
"network": "direct",
"intent_categories": "tech_hosting",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Eco-friendly Swiss hosting.",
"priority": 90,
},
# =========================================================================
# SHOPPING
# =========================================================================
{
"name": "Galaxus",
"slug": "galaxus",
"network": "awin",
"intent_categories": "shopping_general",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("2.00"), # 2% commission
"payout_currency": "CHF",
"description": "Switzerland's largest online shop.",
"priority": 100,
},
{
"name": "Zalando",
"slug": "zalando",
"network": "awin",
"intent_categories": "shopping_fashion",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("8.00"), # 8% commission
"payout_currency": "CHF",
"description": "Fashion & lifestyle.",
"priority": 100,
},
# =========================================================================
# FOOD / DELIVERY
# =========================================================================
{
"name": "Uber Eats",
"slug": "uber_eats",
"network": "awin",
"intent_categories": "food_restaurant,food_delivery",
"geo_countries": "CH,DE",
"payout_type": "cps",
"payout_amount": Decimal("5.00"),
"payout_currency": "CHF",
"description": "Food delivery service.",
"priority": 90,
},
# =========================================================================
# POUNCE SELF-PROMOTION (Viral Growth)
# =========================================================================
{
"name": "Pounce Promo",
"slug": "pounce_promo",
"network": "internal",
"intent_categories": "investment_domains,tech_dev,generic",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("0"), # 30% lifetime commission handled separately
"payout_currency": "CHF",
"description": "Pounce self-promotion. Domain owners earn 30% lifetime commission on referrals.",
"priority": 50, # Higher than generic but lower than high-value partners
},
# =========================================================================
# GENERIC FALLBACK
# =========================================================================
{
"name": "Generic Affiliate",
"slug": "generic_affiliate",
"network": "internal",
"intent_categories": "generic",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.10"),
"payout_currency": "CHF",
"description": "Fallback for unclassified domains - shows Pounce marketplace.",
"priority": 1,
},
]
async def seed_partners(db: AsyncSession) -> int:
"""
Seed affiliate partners into database.
Idempotent: updates existing partners by slug, creates new ones.
Returns:
Number of partners created/updated.
"""
count = 0
for data in PARTNER_SEED_DATA:
slug = data["slug"]
# Check if partner exists
result = await db.execute(
select(AffiliatePartner).where(AffiliatePartner.slug == slug)
)
existing = result.scalar_one_or_none()
if existing:
# Update existing partner
for key, value in data.items():
setattr(existing, key, value)
logger.info(f"Updated partner: {slug}")
else:
# Create new partner
partner = AffiliatePartner(**data)
db.add(partner)
logger.info(f"Created partner: {slug}")
count += 1
await db.commit()
logger.info(f"Seeded {count} affiliate partners")
return count
async def main():
"""Run seed script standalone."""
logging.basicConfig(level=logging.INFO)
async with AsyncSessionLocal() as db:
count = await seed_partners(db)
print(f"✅ Seeded {count} affiliate partners")
if __name__ == "__main__":
asyncio.run(main())

View File

@ -1,2 +0,0 @@
"""Analyze services package (Alpha Terminal)."""

View File

@ -1,2 +0,0 @@
"""Analyzer implementations."""

View File

@ -1,102 +0,0 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.domain_health import get_health_checker
class BasicRiskAnalyzer:
key = "basic_risk"
ttl_seconds = 60 * 10 # 10m (HTTP/SSL/DNS can change quickly)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
if ctx.fast:
return [
AnalyzerContribution(
quadrant="risk",
items=[
AnalyzeItem(
key="risk_skipped_fast_mode",
label="Risk Signals",
value=None,
status="na",
source="internal",
details={"reason": "Fast mode enabled: skip HTTP/SSL checks."},
)
],
)
]
health = ctx.health
if health is None:
health = await get_health_checker().check_domain(ctx.domain)
# health object has attributes; keep access defensive
score = int(getattr(health, "score", 0) or 0)
status = getattr(getattr(health, "status", None), "value", None) or str(getattr(health, "status", "unknown"))
signals = getattr(health, "signals", []) or []
http = getattr(health, "http", None)
ssl = getattr(health, "ssl", None)
dns = getattr(health, "dns", None)
http_status_code = getattr(http, "status_code", None) if http else None
http_reachable = bool(getattr(http, "is_reachable", False)) if http else False
http_parked = bool(getattr(http, "is_parked", False)) if http else False
redirect_url = getattr(http, "redirect_url", None) if http else None
parking_signals = getattr(http, "parking_signals", []) if http else []
http_error = getattr(http, "error", None) if http else None
ssl_has = bool(getattr(ssl, "has_ssl", False)) if ssl else False
ssl_valid = bool(getattr(ssl, "is_valid", False)) if ssl else False
ssl_days = getattr(ssl, "days_until_expiry", None) if ssl else None
ssl_issuer = getattr(ssl, "issuer", None) if ssl else None
ssl_error = getattr(ssl, "error", None) if ssl else None
dns_has_ns = bool(getattr(dns, "has_nameservers", False)) if dns else False
dns_has_a = bool(getattr(dns, "has_a_record", False)) if dns else False
dns_parking_ns = bool(getattr(dns, "is_parking_ns", False)) if dns else False
items = [
AnalyzeItem(
key="health_score",
label="Health Score",
value=score,
status="pass" if score >= 80 else "warn" if score >= 50 else "fail",
source="internal",
details={"status": status, "signals": signals},
),
AnalyzeItem(
key="dns_infra",
label="DNS Infra",
value={"has_ns": dns_has_ns, "has_a": dns_has_a},
status="pass" if (dns_has_ns and dns_has_a and not dns_parking_ns) else "warn",
source="dns",
details={"parking_ns": dns_parking_ns},
),
AnalyzeItem(
key="http",
label="HTTP",
value=http_status_code,
status="pass" if http_reachable and (http_status_code or 0) < 400 else "warn",
source="http",
details={
"reachable": http_reachable,
"is_parked": http_parked,
"redirect_url": redirect_url,
"parking_signals": parking_signals,
"error": http_error,
},
),
AnalyzeItem(
key="ssl",
label="SSL",
value=ssl_days if ssl_has else None,
status="pass" if ssl_has and ssl_valid else "warn",
source="ssl",
details={"has_certificate": ssl_has, "is_valid": ssl_valid, "issuer": ssl_issuer, "error": ssl_error},
),
]
return [AnalyzerContribution(quadrant="risk", items=items)]

View File

@ -1,56 +0,0 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
class DomainFactsAnalyzer:
key = "domain_facts"
ttl_seconds = 60 * 60 * 12 # 12h (RDAP/WHOIS changes slowly)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
check = ctx.check
items = [
AnalyzeItem(
key="availability",
label="Availability",
value="available" if check.is_available else "taken",
status="pass" if check.is_available else "warn",
source=str(check.check_method or "internal"),
details={"status": str(check.status.value)},
),
AnalyzeItem(
key="created_at",
label="Creation Date",
value=check.creation_date.isoformat() if check.creation_date else None,
status="pass" if check.creation_date else "na",
source=str(check.check_method or "internal"),
details={"reason": None if check.creation_date else "Not provided by RDAP/WHOIS for this TLD."},
),
AnalyzeItem(
key="expires_at",
label="Expiry Date",
value=check.expiration_date.isoformat() if check.expiration_date else None,
status="pass" if check.expiration_date else "na",
source=str(check.check_method or "internal"),
details={"reason": None if check.expiration_date else "Not provided by RDAP/WHOIS for this TLD."},
),
AnalyzeItem(
key="registrar",
label="Registrar",
value=check.registrar,
status="info" if check.registrar else "na",
source=str(check.check_method or "internal"),
details={},
),
AnalyzeItem(
key="nameservers",
label="Nameservers",
value=check.name_servers or [],
status="info" if check.name_servers else "na",
source="dns",
details={},
),
]
return [AnalyzerContribution(quadrant="authority", items=items)]

View File

@ -1,30 +0,0 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.radio_test import run_radio_test
class RadioTestAnalyzer:
key = "radio_test"
ttl_seconds = 60 * 60 * 24 * 7 # deterministic, effectively stable
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
radio = run_radio_test(ctx.domain)
item = AnalyzeItem(
key="radio_test",
label="Radio Test",
value=radio.status,
status=radio.status,
source="internal",
details={
"sld": radio.sld,
"syllables": radio.syllables,
"length": radio.length,
"has_hyphen": radio.has_hyphen,
"has_digits": radio.has_digits,
"rationale": radio.rationale,
},
)
return [AnalyzerContribution(quadrant="authority", items=[item])]

View File

@ -1,23 +0,0 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.tld_matrix import run_tld_matrix
class TldMatrixAnalyzer:
key = "tld_matrix"
ttl_seconds = 60 * 30 # 30m (availability can change)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
rows = await run_tld_matrix(ctx.domain)
item = AnalyzeItem(
key="tld_matrix",
label="TLD Matrix",
value=[row.__dict__ for row in rows],
status="info",
source="dns",
details={"tlds": [r.tld for r in rows]},
)
return [AnalyzerContribution(quadrant="market", items=[item])]

View File

@ -1,73 +0,0 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.renewal_cost import get_tld_price_snapshot
class TldPricingAnalyzer:
key = "tld_pricing"
ttl_seconds = 60 * 60 * 6 # 6h (DB updates periodically)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
tld = ctx.domain.split(".")[-1].lower()
snap = await get_tld_price_snapshot(ctx.db, tld)
market_items = [
AnalyzeItem(
key="tld_cheapest_register_usd",
label="Cheapest registration (USD)",
value=snap.min_register_usd,
status="info" if snap.min_register_usd is not None else "na",
source="db",
details={
"registrar": snap.min_register_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_register_usd is not None else "No TLD price data collected yet.",
},
),
AnalyzeItem(
key="tld_cheapest_renew_usd",
label="Cheapest renewal (USD)",
value=snap.min_renew_usd,
status="info" if snap.min_renew_usd is not None else "na",
source="db",
details={
"registrar": snap.min_renew_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_renew_usd is not None else "No TLD price data collected yet.",
},
),
AnalyzeItem(
key="tld_cheapest_transfer_usd",
label="Cheapest transfer (USD)",
value=snap.min_transfer_usd,
status="info" if snap.min_transfer_usd is not None else "na",
source="db",
details={
"registrar": snap.min_transfer_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_transfer_usd is not None else "No TLD price data collected yet.",
},
),
]
value_items = [
AnalyzeItem(
key="renewal_burn_usd_per_year",
label="Renewal burn (USD/year)",
value=snap.min_renew_usd,
status="info" if snap.min_renew_usd is not None else "na",
source="db",
details={
"assumption": "Cheapest renewal among tracked registrars (your DB).",
"registrar": snap.min_renew_registrar,
},
)
]
return [
AnalyzerContribution(quadrant="market", items=market_items),
AnalyzerContribution(quadrant="value", items=value_items),
]

View File

@ -1,41 +0,0 @@
"""
Analyzer plugin interface (Alpha Terminal - Diligence).
Each analyzer contributes items to one or more quadrants:
authority | market | risk | value
"""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime
from typing import Protocol
from sqlalchemy.ext.asyncio import AsyncSession
from app.schemas.analyze import AnalyzeItem
from app.services.domain_checker import DomainCheckResult
@dataclass(frozen=True)
class AnalyzeContext:
db: AsyncSession
domain: str
computed_at: datetime
fast: bool
check: DomainCheckResult
health: object | None # DomainHealthReport or None (kept as object to avoid import cycles)
@dataclass(frozen=True)
class AnalyzerContribution:
quadrant: str # authority|market|risk|value
items: list[AnalyzeItem]
class Analyzer(Protocol):
key: str
ttl_seconds: int
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]: ...

View File

@ -1,91 +0,0 @@
"""
Radio Test analyzer (open-data, deterministic).
No external API, no LLM. This is a heuristic that explains its decision.
"""
from __future__ import annotations
import re
from dataclasses import dataclass
VOWELS = set("aeiou")
def _count_syllables(word: str) -> int:
"""
Approximate syllable count for brandability heuristics.
Uses vowel-group counting with a few basic adjustments.
"""
w = re.sub(r"[^a-z]", "", word.lower())
if not w:
return 0
groups = 0
prev_vowel = False
for i, ch in enumerate(w):
is_vowel = ch in VOWELS or (ch == "y" and i > 0)
if is_vowel and not prev_vowel:
groups += 1
prev_vowel = is_vowel
# common silent-e adjustment
if w.endswith("e") and groups > 1:
groups -= 1
return max(1, groups)
@dataclass(frozen=True)
class RadioTestResult:
sld: str
syllables: int
length: int
has_hyphen: bool
has_digits: bool
status: str # pass|warn|fail
rationale: str
def run_radio_test(domain: str) -> RadioTestResult:
sld = (domain or "").split(".")[0].lower()
length = len(sld)
has_hyphen = "-" in sld
has_digits = bool(re.search(r"\d", sld))
syllables = _count_syllables(sld)
# Hard fails: ugly for radio + high typo risk
if length >= 20 or has_hyphen or (has_digits and length > 4):
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="fail",
rationale="Hard to say/spell on radio (length/hyphen/digits).",
)
# Ideal: 23 syllables, 412 chars, no digits
if 2 <= syllables <= 3 and 4 <= length <= 12 and not has_digits:
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="pass",
rationale="Short, pronounceable, low spelling friction.",
)
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="warn",
rationale="Usable, but not ideal (syllables/length/digits).",
)

View File

@ -1,21 +0,0 @@
"""Analyzer registry (Alpha Terminal - Diligence)."""
from __future__ import annotations
from app.services.analyze.analyzers.domain_facts import DomainFactsAnalyzer
from app.services.analyze.analyzers.radio_test import RadioTestAnalyzer
from app.services.analyze.analyzers.tld_matrix import TldMatrixAnalyzer
from app.services.analyze.analyzers.tld_pricing import TldPricingAnalyzer
from app.services.analyze.analyzers.basic_risk import BasicRiskAnalyzer
def get_default_analyzers():
# Order matters (UX)
return [
DomainFactsAnalyzer(),
RadioTestAnalyzer(),
TldMatrixAnalyzer(),
TldPricingAnalyzer(),
BasicRiskAnalyzer(),
]

View File

@ -1,93 +0,0 @@
"""
TLD pricing snapshot (open-data via internal DB).
Uses our own collected TLD price history (no external API calls here).
"""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime
from typing import Optional
from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.tld_price import TLDPrice
@dataclass(frozen=True)
class TldPriceSnapshot:
tld: str
min_register_usd: Optional[float]
min_register_registrar: Optional[str]
min_renew_usd: Optional[float]
min_renew_registrar: Optional[str]
min_transfer_usd: Optional[float]
min_transfer_registrar: Optional[str]
latest_recorded_at: Optional[datetime]
async def get_tld_price_snapshot(db: AsyncSession, tld: str) -> TldPriceSnapshot:
tld = (tld or "").lower().lstrip(".")
# Latest record per registrar for this TLD, then take min renew.
subq = (
select(
TLDPrice.registrar,
func.max(TLDPrice.recorded_at).label("max_date"),
)
.where(TLDPrice.tld == tld)
.group_by(TLDPrice.registrar)
.subquery()
)
rows = (
await db.execute(
select(TLDPrice)
.join(
subq,
(TLDPrice.registrar == subq.c.registrar)
& (TLDPrice.recorded_at == subq.c.max_date),
)
.where(TLDPrice.tld == tld)
)
).scalars().all()
if not rows:
return TldPriceSnapshot(
tld=tld,
min_register_usd=None,
min_register_registrar=None,
min_renew_usd=None,
min_renew_registrar=None,
min_transfer_usd=None,
min_transfer_registrar=None,
latest_recorded_at=None,
)
def _reg_price(r) -> float:
return float(r.registration_price or 1e12)
def _renew_price(r) -> float:
return float(r.renewal_price or r.registration_price or 1e12)
def _transfer_price(r) -> float:
return float(r.transfer_price or r.registration_price or 1e12)
best_reg = min(rows, key=_reg_price)
best_renew = min(rows, key=_renew_price)
best_transfer = min(rows, key=_transfer_price)
latest = max((r.recorded_at for r in rows if r.recorded_at), default=None)
return TldPriceSnapshot(
tld=tld,
min_register_usd=float(best_reg.registration_price),
min_register_registrar=str(best_reg.registrar),
min_renew_usd=float(best_renew.renewal_price or best_renew.registration_price),
min_renew_registrar=str(best_renew.registrar),
min_transfer_usd=float(best_transfer.transfer_price or best_transfer.registration_price),
min_transfer_registrar=str(best_transfer.registrar),
latest_recorded_at=latest,
)

View File

@ -1,128 +0,0 @@
"""
Analyze service orchestrator (Alpha Terminal).
Implements the plan:
- Quadrants: authority | market | risk | value
- Analyzer registry (plugin-like)
- Open-data-first (null + reason)
"""
from __future__ import annotations
import json
from datetime import datetime, timedelta, timezone
from fastapi import HTTPException, status
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.domain_analysis_cache import DomainAnalysisCache
from app.schemas.analyze import AnalyzeResponse, AnalyzeSection
from app.services.analyze.base import AnalyzeContext
from app.services.analyze.registry import get_default_analyzers
from app.services.domain_checker import domain_checker
from app.services.domain_health import get_health_checker
DEFAULT_CACHE_TTL_SECONDS = 60 * 10 # conservative fallback (10m)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
def _is_cache_valid(row: DomainAnalysisCache) -> bool:
ttl = int(row.ttl_seconds or 0)
if ttl <= 0:
return False
computed = row.computed_at
if computed is None:
return False
if computed.tzinfo is None:
# stored as naive UTC typically
computed = computed.replace(tzinfo=timezone.utc)
return computed + timedelta(seconds=ttl) > _utcnow()
async def get_domain_analysis(
db: AsyncSession,
domain: str,
*,
fast: bool = False,
refresh: bool = False,
) -> AnalyzeResponse:
is_valid, error = domain_checker.validate_domain(domain)
if not is_valid:
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=error)
norm = domain_checker._normalize_domain(domain) # internal normalize
# Cache lookup
if not refresh:
row = (
await db.execute(select(DomainAnalysisCache).where(DomainAnalysisCache.domain == norm))
).scalar_one_or_none()
if row and _is_cache_valid(row):
payload = json.loads(row.payload_json)
payload["cached"] = True
return AnalyzeResponse.model_validate(payload)
computed_at = _utcnow()
# Core domain facts via RDAP/DNS/WHOIS (shared input for analyzers)
check = await domain_checker.check_domain(norm, quick=False)
# Health is expensive; compute once only when needed
health = None
if not fast:
health = await get_health_checker().check_domain(norm)
ctx = AnalyzeContext(db=db, domain=norm, computed_at=computed_at, fast=fast, check=check, health=health)
analyzers = get_default_analyzers()
ttl = DEFAULT_CACHE_TTL_SECONDS
# Quadrants per plan (stable ordering)
quadrants: dict[str, AnalyzeSection] = {
"authority": AnalyzeSection(key="authority", title="Authority", items=[]),
"market": AnalyzeSection(key="market", title="Market", items=[]),
"risk": AnalyzeSection(key="risk", title="Risk", items=[]),
"value": AnalyzeSection(key="value", title="Value", items=[]),
}
for a in analyzers:
ttl = min(ttl, int(getattr(a, "ttl_seconds", DEFAULT_CACHE_TTL_SECONDS) or DEFAULT_CACHE_TTL_SECONDS))
contributions = await a.analyze(ctx)
for c in contributions:
if c.quadrant in quadrants:
quadrants[c.quadrant].items.extend(c.items)
resp = AnalyzeResponse(
domain=norm,
computed_at=computed_at,
cached=False,
sections=[quadrants["authority"], quadrants["market"], quadrants["risk"], quadrants["value"]],
)
# Upsert cache (best-effort)
payload = resp.model_dump(mode="json")
payload_json = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
existing = (
await db.execute(select(DomainAnalysisCache).where(DomainAnalysisCache.domain == norm))
).scalar_one_or_none()
if existing:
existing.payload_json = payload_json
existing.computed_at = computed_at.replace(tzinfo=None)
existing.ttl_seconds = int(ttl or DEFAULT_CACHE_TTL_SECONDS)
else:
db.add(
DomainAnalysisCache(
domain=norm,
payload_json=payload_json,
computed_at=computed_at.replace(tzinfo=None),
ttl_seconds=int(ttl or DEFAULT_CACHE_TTL_SECONDS),
)
)
return resp

Some files were not shown because too many files have changed in this diff Show More