Compare commits

...

114 Commits

SHA1 Message Date
5fc7b33b72 Replace partner names with TBA on yield page
2025-12-16 17:27:31 +01:00
b18cc63d19 Earnings Dashboard: removed duplicates, cleaner layout
2025-12-16 17:15:34 +01:00
5de6b3d58b Update package-lock.json for recharts
2025-12-16 16:51:45 +01:00
90504bba2e Earnings Dashboard: Recharts Integration mit MRR Trend, Tier Breakdown Charts, Customer Growth, Pie Charts und mehr
2025-12-16 16:51:06 +01:00
6f53780fd9 Fix: Correct template literal syntax in Terms and Privacy pages
2025-12-16 15:47:46 +01:00
2cf5a5d00d Update legal pages: new address, comprehensive Terms & Privacy
- Change address to Holzmoosrütisteig 1b, 8820 Wädenswil everywhere
- Remove phone numbers and commercial register numbers
- Create comprehensive Terms of Service (15 sections)
- Create comprehensive Privacy Policy (15 sections, GDPR compliant)
- Update Footer: ZURICH → WÄDENSWIL
2025-12-16 15:46:20 +01:00
485a5a0fdc Admin panel completely reworked: Earnings tab with MRR/ARR, modernized design, improved UX
2025-12-16 15:35:20 +01:00
3586066e28 Fix: Add BACKEND_URL env var for server-side fetching in Next.js
2025-12-16 15:23:13 +01:00
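
Context for the BACKEND_URL fix above: in a containerized Next.js deployment, server-rendered code cannot rely on the browser origin and must reach the backend directly. A minimal sketch of the pattern, where only BACKEND_URL comes from the commit message and every other name and default is an illustrative assumption:

```ts
// lib/api-base.ts (hypothetical module name)
// Resolve the API base URL depending on where the code runs.
export function apiBase(): string {
  if (typeof window === "undefined") {
    // Server side (SSR / route handlers): no browser origin exists here,
    // so talk to the backend directly, e.g. a Docker-internal hostname.
    // BACKEND_URL is the env var named in the commit; the default is assumed.
    return process.env.BACKEND_URL ?? "http://backend:8000";
  }
  // Client side: go through the public origin / reverse proxy.
  // NEXT_PUBLIC_API_URL is an assumed variable name, inlined at build time.
  return process.env.NEXT_PUBLIC_API_URL ?? "";
}

// Usage (hypothetical endpoint):
// const res = await fetch(`${apiBase()}/api/v1/listings`, { cache: "no-store" });
```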
d9cc83c054 Fix: Import track_event in listings API
2025-12-16 15:17:59 +01:00
114fc3d9d6 Fix: Load user relationship in listings API to prevent 500 error
2025-12-16 15:16:36 +01:00
dfdee7512a Add admin debug endpoints for listings
2025-12-16 15:11:08 +01:00
82619f5506 Improve Portfolio UX and redesign AnalyzePanel
- Move Portfolio tabs directly under subtitle (above filters line)
- Add Health Detail Overlay when clicking health score (like Watchlist)
- Redesign AnalyzePanel: wider (600-680px), larger text, better contrast
- Improved section headers with colored backgrounds
- Larger status indicators and score display
- Better spacing and readability throughout
2025-12-16 15:02:47 +01:00
7d68266745 Fix server-side API URL construction and improve UX
- Fix double /api/v1 bug in buy/blog/discover pages causing 500 errors (see the sketch after this commit)
- Add auto-load health checks on Portfolio page (like Watchlist)
- Add subscription cancellation UI in Settings with trust-building design
- Remove SMS notifications from Sniper alerts
- Fix Sniper alert matching for drops and auctions
- Improve Trend Surfer and Brandable Forge UI/UX
- Match Portfolio tabs to Hunt page design
- Update Hunt page header style consistency
2025-12-16 14:44:48 +01:00
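
The double /api/v1 bug above is the classic failure mode of joining a base URL that already carries the API prefix with a path that repeats it. A sketch of the kind of guard that prevents it; the function and constant names are assumptions, not the repo's actual helper:

```ts
const API_PREFIX = "/api/v1";

// Join a base URL and a path without duplicating the API prefix.
export function apiUrl(base: string, path: string): string {
  const cleanBase = base.replace(/\/+$/, "");
  const cleanPath = path.startsWith("/") ? path : `/${path}`;
  // If both sides carry the prefix, strip one copy:
  // ".../api/v1" + "/api/v1/blog" -> ".../api/v1/blog"
  if (cleanBase.endsWith(API_PREFIX) && cleanPath.startsWith(API_PREFIX)) {
    return cleanBase + cleanPath.slice(API_PREFIX.length);
  }
  return cleanBase + cleanPath;
}

// apiUrl("https://example.com/api/v1", "/api/v1/blog")
//   === "https://example.com/api/v1/blog"
```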
5b99145fb2 fix: banner position and Sedo affiliate links
2025-12-16 09:02:00 +01:00
7a9d7703ca feat: optimize drops to 24h only, award-winning analyze panel
2025-12-15 22:46:29 +01:00
90256e6049 feat: add ICANN CZDS zone file integration for gTLDs
2025-12-15 22:07:23 +01:00
c5abba5d2f feat: award-winning hunt tabs + .ch/.li zone file drops
2025-12-15 21:45:42 +01:00
fccd88da46 feat: merge hunt/market pages, integrate cfo into portfolio
2025-12-15 21:16:09 +01:00
b4954bf695 feat: rebuild HUNT page with radar-style mobile layout
2025-12-15 17:30:32 +01:00
35877dd329 fix: stabilize HUNT on production
Fix brandables API NameError, switch Trend Surfer to a working Google Trends RSS endpoint, and harden the HUNT UI against failed requests. Also add sharp for Next.js standalone image optimization and remove PostHog script crossOrigin to reduce CORS breakage.
2025-12-15 16:54:39 +01:00
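
"Harden the HUNT UI against failed requests" usually comes down to one rule: a failed fetch must degrade to a fallback value instead of throwing into the render path. A minimal sketch under that assumption (function name, timeout, and endpoint are illustrative):

```ts
// Fetch JSON, returning a fallback on any HTTP error, network failure,
// or timeout, so the UI can render an empty state instead of crashing.
export async function safeJson<T>(url: string, fallback: T): Promise<T> {
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(8_000) });
    if (!res.ok) return fallback; // 4xx/5xx: degrade, do not throw
    return (await res.json()) as T;
  } catch {
    return fallback; // network error or timeout
  }
}

// Hypothetical usage in the HUNT tab:
// const drops = await safeJson<unknown[]>("/api/v1/hunt/drops", []);
```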
342bebc483 fix: unblock production frontend build
Remove unsupported Toast isVisible prop and harden deploy script to reliably rebuild/restart Next.js standalone on the server.
2025-12-15 16:41:24 +01:00
61cd40be6a fix: reorder hunt endpoint params for SlowAPI
2025-12-15 16:23:46 +01:00
3485668b5e feat: add Alpha Terminal HUNT/CFO modules and Analyze framework
Adds HUNT (Sniper/Trend/Forge), CFO dashboard (burn rate + kill list), and a plugin-based Analyze side panel with caching and SSRF hardening.
2025-12-15 16:15:58 +01:00
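
On the "SSRF hardening" mentioned above: an Analyze panel that fetches user-supplied URLs must refuse requests that resolve to internal addresses. The service itself is backend code, so treat the following TypeScript as a sketch of the idea only, with every name assumed:

```ts
import { isIP } from "node:net";
import { promises as dns } from "node:dns";

// IPv4 ranges that should never be reachable from a URL analyzer.
const PRIVATE = [
  /^10\./, /^127\./, /^169\.254\./, /^192\.168\./,
  /^172\.(1[6-9]|2\d|3[01])\./, /^0\./,
];

// Validate a user-supplied URL before fetching it server-side.
export async function assertSafeUrl(raw: string): Promise<URL> {
  const url = new URL(raw); // throws on malformed input
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error("only http(s) URLs are allowed");
  }
  // Resolve the hostname and reject private, loopback, and link-local ranges.
  const addrs = isIP(url.hostname)
    ? [url.hostname]
    : (await dns.lookup(url.hostname, { all: true })).map((a) => a.address);
  for (const addr of addrs) {
    // Keeping the sketch IPv4-only; a real guard must also cover IPv6.
    if (addr.includes(":") || PRIVATE.some((re) => re.test(addr))) {
      throw new Error(`blocked address: ${addr}`);
    }
  }
  return url;
}
```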
49732fb649 Deploy: use Next standalone start + robust ssh tty
2025-12-15 14:19:02 +01:00
1cb4b64646 Deploy: add /api/health route + fix deploy git detection
2025-12-15 14:14:02 +01:00
c16afe468f Deploy: fix /api/health + deploy script Next.js detection
2025-12-15 14:09:38 +01:00
bb7ce97330 Deploy: referral rewards antifraud + legal contact updates
2025-12-15 13:56:43 +01:00
ca8929a916 docs: Add comprehensive deployment guide
2025-12-15 10:30:19 +01:00
6d7db54021 feat: Portfolio improvements, listing/yield integration, UNICORN_PLAN
2025-12-15 08:04:03 +01:00
acfcab682d fix: Remove infinite loop in loadData
2025-12-15 07:27:22 +01:00
988e1645c5 fix: Toast component props
2025-12-14 22:46:04 +01:00
4efe1fdd4f feat: Complete Portfolio redesign with edit modal, full domain details, health checks
2025-12-14 22:44:11 +01:00
684541deb8 feat: Portfolio tooltips, listed status badge, beautiful icons
2025-12-14 22:24:56 +01:00
f963b33b32 feat: Pounce listings in acquire table, yield remove button, portfolio shows yield status
2025-12-14 22:09:51 +01:00
99ccfbd23f fix: Add site_url to config for yield activation
2025-12-14 21:52:59 +01:00
147f0454f1 fix: Type annotations for nav items
2025-12-14 21:47:38 +01:00
1f72f2664d fix: Portfolio page redesign, unified DNS verification, fix API route conflicts
2025-12-14 21:44:40 +01:00
8051b2ac51 fix: Unify Radar to use same getMarketFeed API as Market & Acquire
2025-12-14 21:22:01 +01:00
3995c2d675 fix: Filter expired auctions on Acquire page, add live updates
2025-12-14 21:15:21 +01:00
71a94a765a fix: Add is_valid to SSL in simulated health reports
2025-12-14 09:20:58 +01:00
e027e51288 fix: TypeScript error in Portfolio health reports
2025-12-14 09:16:09 +01:00
909cf21d6e feat: Portfolio monitoring, page descriptions, mobile nav cleanup
2025-12-14 09:12:05 +01:00
a56e4c7a8a fix: Remove build-time API calls to prevent timeouts
2025-12-13 21:49:20 +01:00
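
For the build-time timeout fix above: in the Next.js App Router, a page that fetches during static generation runs that fetch inside `next build`, so a slow backend stalls the build. Marking the route dynamic defers the fetch to request time. A sketch under that assumption (route and endpoint are illustrative):

```tsx
// app/discover/page.tsx (hypothetical route)
export const dynamic = "force-dynamic"; // opt this page out of prerendering

export default async function DiscoverPage() {
  // Runs per request now, never during `next build`.
  const res = await fetch(`${process.env.BACKEND_URL}/api/v1/tlds`, {
    cache: "no-store",
  });
  const tlds: unknown = await res.json();
  return <pre>{JSON.stringify(tlds, null, 2)}</pre>;
}
```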
2236908701 fix: is_available property name
2025-12-13 21:36:41 +01:00
e064486582 fix: TldData interface types
2025-12-13 21:35:02 +01:00
a5600ee13c feat: Ultra SEO optimization - sitemap, robots, structured data, 800+ TLD pages
2025-12-13 21:32:54 +01:00
92b309e766 fix: Re-add PostHog analytics tracking
2025-12-13 20:17:38 +01:00
f8f168d063 deploy: Sync all changes
2025-12-13 18:38:44 +01:00
6074506539 feat: Pricing page with Yield feature, Stripe LIVE keys
2025-12-13 18:26:17 +01:00
ddeb25446e fix: Yield slider fixed, pricing features aligned with docs
2025-12-13 18:16:17 +01:00
e3250baaf7 fix: getTldOverview API call params
2025-12-13 18:05:51 +01:00
31b02e6790 fix: Conservative yield calculator, real TLD data on discover, fix acquire/pricing
2025-12-13 18:04:09 +01:00
356db5afee feat: Complete mobile redesign for Acquire page - terminal style
2025-12-13 17:54:28 +01:00
26daad68cf feat: Mobile-optimized + left-aligned public pages (acquire, discover, yield, pricing, landing)
2025-12-13 17:46:07 +01:00
4a1ebf0024 feat: Mobile-optimized landing page + Header drawer + Footer improvements
2025-12-13 17:33:08 +01:00
ce961aa03d fix: Radar mobile auctions layout + import 886 TLDs
2025-12-13 17:21:49 +01:00
8996929174 fix: Settings page matching Intel page pattern exactly
2025-12-13 17:12:44 +01:00
89e8e64a45 feat: Settings page with mobile-first techy design
2025-12-13 17:04:45 +01:00
3e067a9792 fix: rsync compatibility for macOS
2025-12-13 17:00:48 +01:00
78736ab7bf feat: Zero-downtime deploy + mobile Settings design
2025-12-13 16:55:50 +01:00
3601af7ec0 fix: Add status to Subscription interface
2025-12-13 16:45:19 +01:00
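
The Subscription interface fix above is a one-field change; the shape is roughly this, where every field except `status` is an assumption and the status values mirror Stripe's standard subscription statuses:

```ts
interface Subscription {
  tier: string;                 // assumed existing field
  current_period_end?: string;  // assumed existing field
  // Newly added so the settings UI can branch on subscription state.
  status: "active" | "trialing" | "past_due" | "canceled";
}
```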
017b4ce1f9 feat: Ultimate plan switcher in settings
2025-12-13 16:43:32 +01:00
83aaca0721 fix: Stripe USD prices + tier limits alignment
2025-12-13 16:29:06 +01:00
f4e4595cde Terminal: Auth guard + logout redirects to landing page
2025-12-13 16:11:41 +01:00
6a063bfe89 Deploy script: protect .env and .db files from being overwritten
2025-12-13 16:04:07 +01:00
267fdd0d39 Fix all Python indentation errors
2025-12-13 15:52:41 +01:00
9f3539ae09 Fix auction_scraper indentation
2025-12-13 15:47:45 +01:00
41790d359c Fix domain_checker WHOIS indentation
2025-12-13 15:45:02 +01:00
e6ce5eaaeb Fix API methods + domain_checker indentation
2025-12-13 15:42:26 +01:00
fde66af049 Watchlist layout fix + TLD detail + Sniper/Yield/Listing redesign + deploy script
2025-12-13 15:39:51 +01:00
6a56360f56 Watchlist: Fixed layout order + TLD Detail: Full techy mobile design
2025-12-13 15:22:45 +01:00
d56081aca0 All tables: Unified sortable headers for Radar, Market, Watchlist, Intel
2025-12-13 14:40:21 +01:00
09fb4e2931 Watchlist: Buy button + Intel: Unified techy mobile design
2025-12-13 14:28:45 +01:00
f293df3e23 Watchlist + Market: Unified techy angular mobile design
2025-12-13 14:16:21 +01:00
a5a9f40866 Market: Techy angular mobile design matching Radar
2025-12-13 14:03:43 +01:00
a88719e02d Radar: Techy angular mobile design with more info
2025-12-13 13:56:40 +01:00
4568679f01 Fix: Radar auth check and data loading
2025-12-13 13:48:15 +01:00
21b06449ec Fix: domain.name instead of domain.domain
2025-12-13 13:38:30 +01:00
7c536b32ce Radar: Added watchlist preview, Market: Native app mobile design
2025-12-13 13:36:42 +01:00
1a75802baa Radar: Perfected mobile design - Native App Feel
2025-12-13 13:26:10 +01:00
c1db9727c7 Mobile: Beautiful slide-in navigation drawer
2025-12-13 13:19:54 +01:00
155280a84e Radar: Fix hydration, award-winning mobile PWA design
2025-12-13 13:12:50 +01:00
964a85412d Radar page: Native PWA-like mobile experience
2025-12-13 13:00:42 +01:00
bfb5eabfc2 Fix: TypeScript null/undefined in health modal
2025-12-13 10:13:58 +01:00
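
Null/undefined errors like the one fixed above typically come from an API field typed as `T | null` flowing into a prop typed `T | undefined`. The usual repair is to widen the type and narrow before use; a sketch reusing the `is_valid` field mentioned in an earlier commit, with all other names assumed:

```ts
interface SslInfo {
  is_valid: boolean;            // field name taken from an earlier commit
  expires_at?: string | null;   // assumed field
}

function sslLabel(ssl: SslInfo | null | undefined): string {
  if (!ssl) return "No SSL data"; // narrows away both null and undefined
  if (!ssl.is_valid) return "Invalid certificate";
  return ssl.expires_at ? `Valid until ${ssl.expires_at}` : "Valid";
}
```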
02545ffe76 PWA: Radar fullscreen nav + contrast, Watchlist native app experience
2025-12-13 10:08:05 +01:00
2ba38a13e7 PWA: Better contrast, 5-item nav, black theme-color, Watchlist native mobile
2025-12-13 09:52:07 +01:00
6b6ed51466 Radar page: Native PWA mobile experience
2025-12-13 09:39:45 +01:00
969c825f91 TLD detail page redesign - techy chic
2025-12-13 09:31:20 +01:00
d2fa162d44 Fix: API response property names for domains
2025-12-12 23:50:16 +01:00
da6e39e83d Fix: Intel TLD pagination, Market page pagination + toggle tracking
2025-12-12 23:45:18 +01:00
eedd61cd79 Fix: Intel TLD pagination, Market page pagination + toggle tracking
2025-12-12 23:29:24 +01:00
2e507f5484 Navigation: Radar first, For Sale under Monetize; Settings page redesign
2025-12-12 23:19:38 +01:00
2d5a36ea98 Fix Intel page, Listing page redesign
2025-12-12 23:12:30 +01:00
b7fa3632bf Intel, Sniper, Yield pages redesign
2025-12-12 22:45:55 +01:00
b820690478 Fix health check auto-trigger
2025-12-12 22:36:44 +01:00
718a7d64e5 Auto health check on add, Market page redesign
2025-12-12 22:34:39 +01:00
fd66a86408 Health check: show errors and network warnings
2025-12-12 22:28:53 +01:00
f7c60fc667 Health modal: clearer check display with explicit status
2025-12-12 22:20:24 +01:00
7c47c49fc9 Radar: track taken domains, Watchlist: fix health modal
2025-12-12 22:14:31 +01:00
e737de6ff5 Radar search: track taken domains, Watchlist: radar style cards, fixed health check
2025-12-12 22:08:07 +01:00
7b0b6a3669 Cleaner design: smaller titles, no caps, better search UX
2025-12-12 22:01:11 +01:00
0916ad6c27 Less top padding, more horizontal padding
2025-12-12 21:49:33 +01:00
8201367da3 Radar & Watchlist: cleaner UI, more horizontal padding, less top spacing, better readability
2025-12-12 21:41:46 +01:00
8347611ad2 Fix health report property names
2025-12-12 21:33:11 +01:00
1a4b0eb984 Radar & Watchlist: smaller fonts, no header, laptop optimized
2025-12-12 21:30:04 +01:00
dc5090a5b2 feat: RADAR & WATCHLIST completely redesigned - cinematic high-end UI
RADAR PAGE:
- Animated radar background with blips and a sweeping line
- Hero section with huge typography (6rem headlines)
- Search terminal in a 'Target Acquisition' design
- Live ticker with animation
- Market feed grid with tech corners
- Quick-access navigation cards

WATCHLIST PAGE:
- Dramatic hero section with a big-numbers grid
- Command-line style domain input
- Pill filters with hover animations
- Data table with status badges
- Health-check integration
- Accent glows and tech corners everywhere

Both pages now use exactly the landing page style:
- #020202 background
- font-display for headlines
- font-mono for labels
- text-[10px] uppercase tracking-widest
- border-white/[0.08] for lines
- Tech corners on important boxes
- Accent glow effects
2025-12-12 21:21:16 +01:00
ae7e257524 feat: Radar Page Redesign (Techy Chic)
- TechCard component for metrics
- Large command-line search interface
- Grid layout for feeds (Live Ops & Intel)
- Techy design elements (crosshairs, mono fonts, uppercase)
- Integrated into CommandCenterLayout
2025-12-12 21:09:21 +01:00
545df1bcba feat: align Terminal design with the landing page style
- CommandCenterLayout: brutalist/technical look (#020202 bg, noise overlay, sharp corners)
- Sidebar: mono fonts, left border for active links, more angular UI
- Top bar: more minimal, darker colors, less blur
- Notifications & search modal: technical look with sharper edges
- User card: more angular, less rounded
- Consistent use of white/[0.08] for borders, matching the landing page
2025-12-12 21:00:18 +01:00
5a1d3f2847 fix: TypeScript build errors & CORS config
Frontend Fixes:
- command/auctions: Fix addDomainToWatchlist → addDomain, Set iteration
- command/dashboard: Add missing Bell icon import
- command/portfolio: Optional chaining for valuation_formula
- command/pricing: Wrap AlertTriangle in span for title prop
- command/watchlist: Fix Badge className, optional form event
- legal/privacy: Escape >> in JSX
- api.ts: Add optional valuation_formula to DomainValuation

Server:
- CORS: Added http://10.42.0.73:3000 to allowed origins
- Email verification enabled
2025-12-12 20:47:41 +01:00
6cb985fa8b feat: Complete public frontend redesign - Award Winning UI
- Redesigned all public pages (Landing, Acquire, Discover, Yield, Pricing) to "Award Winning" tech-chic style
- Implemented auth pages (Login, Register, Forgot Password, Verify Email) with animated glow backgrounds
- Renamed routes: /market -> /acquire, /intel -> /discover
- Created footer pages: About, Contact, Briefings
- Created legal pages: Privacy Policy, Terms of Service, Imprint
- Updated Header & Footer components to match new design system
- Enhanced DomainChecker with typing animation and angular tech design
- All pages now feature: deep dark backgrounds, noise overlays, tech borders, glassmorphism effects
- Responsive design optimized for all screen sizes
- Consistent typography using font-display and font-mono throughout
2025-12-12 17:44:59 +01:00
b5c456af1c refactor: Rename Intel to Discover and apply Landing Page style
- Renamed /intel to /discover
- Updated styles to match dark/cinematic landing page theme
- Updated Header, Footer, and Sitemap
- Added redirects from /intel and /tld-pricing to /discover
- Optimized SEO metadata for new paths
2025-12-12 16:35:34 +01:00
58228e3d33 feat: integrate Pounce self-promotion & viral growth system
Pounce self-promotion (from pounce_endgame.md):
- Add 'pounce_promo' as fallback partner for generic/unclear intent domains
- Create dedicated Pounce promo landing page with CTA to register
- Update footer on all yield pages: 'Monetized by Pounce • Own a domain? Start yielding'

Tech/Investment Domain Detection:
- Add 'investment_domains' category (invest, crypto, trading, domain, startup)
- Add 'tech_dev' category (developer, web3, fintech, proptech)
- Both categories have 'pounce_affinity' flag for higher Pounce conversion

Referral Tracking for Domain Owners:
- Add user fields: referred_by_user_id, referred_by_domain, referral_code
- Parse yield referral codes (yield_{user_id}_{domain_id}) on registration
- Domain owners earn lifetime commission when visitors sign up via their domain

DB Migrations:
- Add referral tracking columns to users table
2025-12-12 15:27:53 +01:00
dc12f14638 fix: resolve indentation and import errors in backend
- Fix indentation in main.py (scheduler if/else blocks)
- Fix indentation in deps.py (credentials check)
- Fix indentation in auctions.py (filter blocks)
- Add BackgroundTasks import to admin.py
- Fix settings import in yield_domains.py (use get_settings())
2025-12-12 15:06:47 +01:00
1705b5cc6e feat: complete Yield feature setup
Backend:
- Add yield_webhooks.py for partner callbacks (generic, Awin, batch import)
- Add yield_routing.py for domain traffic routing with landing pages
- Add DB migrations for yield table indexes
- Add seed script with 30+ Swiss/German affiliate partners
- Register all new routers in API

Frontend:
- Add public /yield landing page with live analyzer demo
- Add Yield to header navigation

Documentation:
- Complete YIELD_SETUP.md with setup guide, API reference, and troubleshooting
2025-12-12 14:52:49 +01:00
240 changed files with 45483 additions and 13265 deletions

DEPLOY.md (new file, +414)
@@ -0,0 +1,414 @@
# Pounce Deployment Guide
## Server Information
- **Server IP**: `10.42.0.73`
- **User**: `user`
- **Git Remote**: `git.6bit.ch` (10.13.12.81)
- **Frontend Port**: 3000
- **Backend Port**: 8000
- **Public URL**: https://pounce.ch
## Automated Deployment (Recommended)
### Using the Deploy Script
The `deploy.sh` script handles zero-downtime deployments automatically:
```bash
# Full deployment (commit + push + deploy)
./deploy.sh "Your commit message"
# Frontend only
./deploy.sh -f "Frontend changes"
# Backend only
./deploy.sh -b "Backend changes"
# Quick sync without git operations
./deploy.sh -q
# Force deploy (skips safety checks)
./deploy.sh --force "Force deploy"
```
### What the Script Does
1. **Git Operations** (unless `-q` flag):
- Commits all changes with your message
- Pushes to `git.6bit.ch`
2. **Syncing Files**:
- Uses `rsync` to transfer only changed files to server
- Preserves timestamps and permissions
- Frontend: syncs to `~/pounce/frontend/`
- Backend: syncs to `~/pounce/backend/`
3. **Building**:
- Frontend: `npm run build` (creates optimized production build)
- Backend: `pip install -r requirements.txt` (updates dependencies)
4. **Restarting Services**:
- Gracefully restarts Next.js and Uvicorn
- Zero downtime using `./start.sh`
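As a minimal sketch, assuming the documented flags, paths, and build commands (the real `deploy.sh` also implements `-f`/`-b`/`-q`/`--force` and safety checks):
```bash
#!/usr/bin/env bash
# Minimal sketch of the full deploy flow -- illustrative, not the real script
set -euo pipefail

MSG="${1:-deploy}"
SERVER="user@10.42.0.73"

# 1) Git operations (skipped by the real script's -q flag)
git add -A && git commit -m "$MSG" && git push

# 2) Sync only changed files, preserving timestamps and permissions
rsync -az frontend/ "$SERVER:~/pounce/frontend/"
rsync -az backend/ "$SERVER:~/pounce/backend/"

# 3) Build and 4) restart on the server
ssh "$SERVER" 'cd ~/pounce/frontend && npm run build \
  && cd ../backend && source venv/bin/activate && pip install -r requirements.txt \
  && cd ~/pounce && ./start.sh'
```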
## Manual Deployment
### Step 1: Commit & Push Local Changes
```bash
cd /Users/yvesgugger/Documents/Projekte/pounce
# Check status
git status
# Add all changes
git add -A
# Commit
git commit -m "Your descriptive commit message"
# Push to git.6bit.ch
git push
```
### Step 2: SSH into Server & Pull Changes
```bash
# Connect to server
sshpass -p "user" ssh user@10.42.0.73
# Navigate to project
cd ~/pounce
# Pull latest changes
git pull
```
### Step 3: Frontend Deployment
```bash
# Navigate to frontend
cd ~/pounce/frontend
# Install dependencies (if package.json changed)
npm install
# Build production version
npm run build
# The build creates a .next folder with optimized static files
```
### Step 4: Backend Deployment
```bash
# Navigate to backend
cd ~/pounce/backend
# Activate virtual environment
source venv/bin/activate
# Install/update dependencies (if requirements.txt changed)
pip install -r requirements.txt
# Deactivate venv
deactivate
```
### Step 5: Restart Services
```bash
# Navigate to project root
cd ~/pounce
# Stop running services
pkill -f 'uvicorn'
pkill -f 'next start'
# Start services using start script
./start.sh
```
## Start Script (`start.sh`)
The `start.sh` script handles:
- Stopping existing processes on ports 8000 and 3000
- Starting the backend (Uvicorn) with proper settings
- Starting the frontend (Next.js) in production mode
- Health checks for both services
- Logging to `backend.log` and `frontend.log`
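A hypothetical sketch matching that behavior (the actual script may differ in detail):
```bash
#!/usr/bin/env bash
# Hypothetical start.sh sketch -- mirrors the documented behavior
set -euo pipefail
cd ~/pounce

# Stop whatever currently holds the service ports
fuser -k 8000/tcp 2>/dev/null || true
fuser -k 3000/tcp 2>/dev/null || true

# Backend: Uvicorn, logging to backend/backend.log
(cd backend && source venv/bin/activate \
  && nohup uvicorn app.main:app --host 0.0.0.0 --port 8000 >> backend.log 2>&1 &)

# Frontend: Next.js in production mode, logging to frontend/frontend.log
(cd frontend && nohup npm run start >> frontend.log 2>&1 &)

# Basic health checks
sleep 5
curl -fsS http://localhost:8000/health > /dev/null && echo "backend up"
curl -fsSI http://localhost:3000 > /dev/null && echo "frontend up"
```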
### Manual Service Management
```bash
# Check running processes
ps aux | grep uvicorn
ps aux | grep next
# View logs
tail -f ~/pounce/backend/backend.log
tail -f ~/pounce/frontend/frontend.log
# Check ports
lsof -i :8000 # Backend
lsof -i :3000 # Frontend
```
## Environment Configuration
### Backend `.env` (~/pounce/backend/.env)
```env
DATABASE_URL=postgresql://user:password@localhost:5432/domainwatch
SECRET_KEY=your-secret-key-here
STRIPE_SECRET_KEY=sk_live_xxx
STRIPE_PUBLISHABLE_KEY=pk_live_xxx
STRIPE_WEBHOOK_SECRET=whsec_xxx
ZOHO_SMTP_USER=noreply@pounce.ch
ZOHO_SMTP_PASSWORD=xxx
GOOGLE_CLIENT_ID=xxx
GOOGLE_CLIENT_SECRET=xxx
GITHUB_CLIENT_ID=xxx
GITHUB_CLIENT_SECRET=xxx
site_url=https://pounce.ch
```
### Frontend `.env.local` (~/pounce/frontend/.env.local)
```env
NEXT_PUBLIC_API_URL=https://pounce.ch/api/v1
NEXT_PUBLIC_STRIPE_PUBLISHABLE_KEY=pk_live_xxx
NEXT_PUBLIC_POSTHOG_KEY=phc_xxx
NEXT_PUBLIC_POSTHOG_HOST=https://eu.i.posthog.com
```
## Nginx Configuration
Nginx acts as reverse proxy on the server:
```nginx
# Frontend (Next.js)
location / {
proxy_pass http://localhost:3000;
proxy_http_version 1.1;
proxy_set_header Upgrade $http_upgrade;
proxy_set_header Connection 'upgrade';
proxy_set_header Host $host;
proxy_cache_bypass $http_upgrade;
}
# Backend (FastAPI)
location /api {
proxy_pass http://localhost:8000;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
proxy_set_header X-Forwarded-Proto $scheme;
}
```
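After changing this config, validate it and reload without dropping connections:
```bash
# Test the configuration, then reload gracefully
sudo nginx -t
sudo systemctl reload nginx
```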
## Troubleshooting
### Frontend won't start
```bash
# Check for port conflicts
lsof -i :3000
# Check build errors
cd ~/pounce/frontend
npm run build
# Check logs
tail -f ~/pounce/frontend/frontend.log
```
### Backend won't start
```bash
# Check for port conflicts
lsof -i :8000
# Test backend manually
cd ~/pounce/backend
source venv/bin/activate
uvicorn app.main:app --host 0.0.0.0 --port 8000
# Check logs
tail -f ~/pounce/backend/backend.log
```
### Database issues
```bash
# Check PostgreSQL status
sudo systemctl status postgresql
# Connect to database
psql -U user -d domainwatch
# Check migrations
cd ~/pounce/backend
alembic current
alembic upgrade head
```
### SSL Certificate issues
```bash
# Check certificate expiry
sudo certbot certificates
# Renew certificates
sudo certbot renew
# Restart Nginx
sudo systemctl restart nginx
```
## Health Checks
```bash
# Backend health
curl http://localhost:8000/health
# Frontend health
curl -I http://localhost:3000
# Full stack check via public URL
curl https://pounce.ch
curl https://pounce.ch/api/health
```
## Rollback Procedure
If deployment fails:
```bash
# On server
cd ~/pounce
# See recent commits
git log --oneline -10
# Rollback to previous commit
git reset --hard <commit-hash>
# Rebuild
cd frontend && npm run build
cd ../backend && source venv/bin/activate && pip install -r requirements.txt
# Restart
cd .. && ./start.sh
```
## Monitoring & Maintenance
### Log Rotation
Logs are in:
- `~/pounce/backend/backend.log`
- `~/pounce/frontend/frontend.log`
Set up log rotation to prevent disk space issues:
```bash
# Create logrotate config
sudo nano /etc/logrotate.d/pounce
```
```
/home/user/pounce/backend/backend.log {
daily
rotate 14
compress
delaycompress
notifempty
create 0640 user user
}
/home/user/pounce/frontend/frontend.log {
daily
rotate 14
compress
delaycompress
notifempty
create 0640 user user
}
```
### Cron Jobs
Check scheduled tasks:
```bash
crontab -l
```
Common cron jobs for Pounce:
- Domain scraping
- Health checks
- Database cleanup
- Backup scripts
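Two illustrative crontab entries (schedules and paths are assumptions, not the live configuration):
```bash
# crontab -e -- example entries only
# Daily database backup at 03:00 (see Backup & Recovery below)
0 3 * * * pg_dump -U user domainwatch > /home/user/backups/domainwatch_$(date +\%Y\%m\%d).sql
# Hourly health check against the public API
0 * * * * curl -fsS https://pounce.ch/api/health > /dev/null || echo "$(date) health check failed" >> /home/user/pounce/health-cron.log
```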
## Backup & Recovery
### Database Backup
```bash
# Manual backup
pg_dump -U user domainwatch > backup_$(date +%Y%m%d_%H%M%S).sql
# Restore from backup
psql -U user domainwatch < backup_20250101_120000.sql
```
### Code Backup
All code is backed up on `git.6bit.ch`. To clone fresh:
```bash
git clone user@10.13.12.81:yvg/pounce.git
```
## Security Notes
- Server login: password authentication (`user`) is used in development (see the `sshpass` example above); use SSH key authentication for production
- SSL certificates via Let's Encrypt (auto-renewal)
- Database credentials in `.env` files (not committed to git)
- Stripe webhooks require signing secret verification
- OAuth secrets must match registered redirect URIs
## Quick Reference
```bash
# Deploy everything
./deploy.sh "message"
# Frontend only
./deploy.sh -f "message"
# Backend only
./deploy.sh -b "message"
# Quick sync (no git)
./deploy.sh -q
# Check logs
ssh user@10.42.0.73 'tail -f ~/pounce/backend/backend.log'
# Restart services
ssh user@10.42.0.73 'cd ~/pounce && ./start.sh'
# Check health
curl https://pounce.ch/api/health
```
## Support
For issues or questions, check:
1. Application logs (`backend.log`, `frontend.log`)
2. Nginx logs (`/var/log/nginx/error.log`)
3. PostgreSQL logs (`/var/log/postgresql/`)
4. System logs (`journalctl -xe`)

UNICORN_PLAN.md (new file, +291)
@@ -0,0 +1,291 @@
## Pounce Unicorn Plan (integrated)
Goal: evolve Pounce from a strong product (trust + inventory + lead capture) into a scalable system with a moat + flywheel.
---
## Implementation Status (as of 2025-12-15)
### Where we stand (short, honest)
- **Deal system (liquidity loop)**: **done & hardened** (inbox → threading → sold/GMV → anti-abuse).
- **Yield (moat)**: **connect + routing + tracking + webhooks + ledger basics** are in place. We can connect domains, route traffic, track clicks/conversions, and prepare/complete payouts.
- **Flywheel/distribution**: partial (public deal surface + login gate exist); programmatic SEO & viral loop not yet built out systematically.
- **Telemetry/ops**: individual events exist implicitly (audit/transactions), but **no central event schema + KPI dashboard** yet.
### Progress by Workstream
#### 1) Deal System
- [x] 1A Inbox workflow (status, close reason, audit)
- [x] 1B Threading/negotiation (buyer/seller threads + email + rate limits + content safety)
- [x] 1C Deal closure + GMV (mark as sold, close open inquiries)
- [x] 1D Anti-abuse (limits + safety checks at the critical spots)
#### 2) Yield (Moat)
- [x] 2A Connect/nameserver flow (portfolio-only + DNS verified + connect wizard + `connected_at`)
- [x] 2B Routing → tracking (async, click tracking, IP hashing, rate limiting, strict partner config)
- [x] 2B Attribution (webhooks can pass a `click_id`)
- [x] 2C Ledger/payout basics (generate payouts + complete payouts; server-safe keys)
- [x] 2C.2 Dashboard correctness (monthly stats = confirmed/paid, pending payout = confirmed + unpaid)
#### 3) Flywheel / Distribution
- [~] 3B Public deal surface + login gate (Pounce Direct gated): **in place**
- [~] 3A Programmatic SEO maxed out (templates + CTA paths + indexation)
- [~] 3C Viral loop "Powered by Pounce" (only where intent fits, a clean referral loop)
**3C status (viral loop)**
- **Invite codes**: every user now has a unique `invite_code`, and `GET /api/v1/auth/referral` returns the invite link.
- **Attribution**: `ref` is stored in a cookie on public pages (30 days) and sent along with `/register` → the backend sets `referred_by_user_id`.
- **Surfaces (intent fit)**:
  - Terminal settings: "Invite" panel with a copy-link button
  - Public buy listing: "Powered by Pounce" → register with `?ref=<seller_invite_code>`
- **Telemetry**: events `user_registered`, `referral_attributed`, `referral_link_viewed`
- **Admin KPIs (3C.2)**: the Telemetry tab now shows referral KPIs (link views + signups per referrer) via `GET /api/v1/telemetry/referrals?days=...`
- **Rewards/badges (3C.2)**: deterministic referral rewards (abuse-resistant) → `subscriptions.referral_bonus_domains` (+5 slots per 3 "qualified referrals"); the `verified_referrer` / `elite_referrer` badge is shown in the terminal settings invite panel.
- **Anti-fraud/cooldown**: a referral only counts as qualified after a **cooldown** (user + subscription age) and is disqualified on a **shared IP / duplicate IP / missing IP** (telemetry `ip_hash`).
**3A status (programmatic SEO)**
- **Indexation**: `sitemap.xml` is now dynamic (Discover TLDs from the DB + blog slugs + public listings) and `robots.txt` blocks legacy paths.
- **Canonical cleanup**: legacy routes (`/tld/*`, `/tld-pricing/*`) redirect server-side to `/discover/*`.
- **Templates**: `/discover/[tld]` now has server-side metadata + JSON-LD (from real registrar comparison data). `/buy/[slug]` is server-side (metadata + JSON-LD).
- **Blog article SEO**: `/blog/[slug]` now has server-side `generateMetadata` + Article JSON-LD, without view-count side effects (meta endpoint).
#### 4) Scaling / Telemetry
- [x] 4A Events (canonical event schema + persisted events in the deal + yield funnel)
- [x] 4A.2 KPI views (admin KPIs from telemetry events: rates + median times)
- [x] 4B Ops (backups + restore verification + monitoring/alerts + deliverability)
**4B status (ops)**
- **Backups**: admin endpoint + scheduled daily backup + restore verification (SQLite integrity_check / Postgres pg_restore --list)
- **Monitoring**: `/metrics` now also exports business KPIs (deal + yield from `telemetry_events`, cached) + ops metrics (backups enabled + backup age)
- **Deliverability**: newsletter emails carry `List-Unsubscribe` (one-click) + a new one-click unsubscribe route
- **Alerting (preparation)**: `ops/prometheus-alerts.yml` with alerts (5xx rate, stale backup, 24h funnel flatline)
- **Alerting (without Docker)**: scheduler job `ops_alerting` + admin endpoint `POST /api/v1/admin/system/ops-alerts/run`
- **Alert history + cooldown (persisted)**: table `ops_alert_events` + admin endpoint `GET /api/v1/admin/system/ops-alerts/history` + an admin UI history panel
---
## Intent & Holistic Concept
### Intent (why Pounce exists)
Pounce exists to turn domains from "dead names" (renewal costs only, no usage) into **measurable, tradable digital assets**.
We are not building just a feed or a marketplace, but a **lifecycle engine**: discover → acquire → monetize → liquidate.
### For whom (target audience)
- **Domain investors / operators**: need clean inventory, fast decisions, clear workflows.
- **Builders / entrepreneurs**: want to find good assets and use/monetize them immediately.
- **Portfolio owners** (10+ domains): want governance (health, renewals, cash flow) instead of chaos.
### Positioning (in one sentence)
**Pounce is the operating system for domains**: a clean market feed + verified direct deals + yield routing, measurable from the first view to the exit.
### The overall model (4 modules)
1. **Discover (Intelligence)**
Finds assets: clean feed, scores, TLD intel, filters, alerts.
2. **Acquire (Marketplace / Liquidity)**
Secures assets: external auctions + **Pounce Direct** (DNS-verified owners).
3. **Yield (Intent Routing)**
Monetizes assets: domain traffic → intent → partner → revenue share.
4. **Trade (Exit / Outcomes)**
Liquidity and valuation: domains get priced on **cash flow** (a multiple), not just on "vibe".
### Why this has unicorn potential (moat + flywheel)
- **Moat**: proprietary data on intent, traffic, conversion, and cash flow at the domain level (hard to copy).
- **Flywheel**: more domains → more routing/conversions → more data → better scores/routing → more deals → more domains.
---
## 0) Guiding Principles
- **The moat forms where proprietary data is created**: yield/intent + deal outcomes.
- **Trust is a feature**: everything that cuts spam/scams lifts conversion.
- **Telemetry is not "later"**: every new feature emits events + measurable KPIs.
---
## 1) Deal System (finish the liquidity loop)
### 1A – Inbox Workflow (Week 1)
**Goal**: sellers can reliably triage and measure leads.
- **Complete inquiry status workflow**: `new → read → replied → closed` + `spam`
- backend PATCH endpoint + UI actions
- "Close" including a reason (e.g. sold elsewhere / low offer / no fit)
- **Audit trail (minimal)**
- every status change records: `who/when/old/new`
**KPIs**
- inquiry→read rate
- inquiry→replied rate
- median reply time
### 1B – Threading/Negotiation (Weeks 2-3)
**Goal**: negotiation happens in the product, not off-platform.
- **Threading**: buyer ↔ seller messages as one conversation per listing
- **Notifications**: "New message" email + login gate
- **Audit trail (full)**: message events + status events
- **Security**: rate limits (buyer + seller), keyword checks, link safety
**KPIs**
- inquiry→first message
- messages/thread
- reply rate
### 1C – Deal Closure + GMV (Weeks 3-4)
**Goal**: make real conversion/GMV measurable.
- **"Mark as Sold"** on the listing
- reasons: sold on Pounce / sold off-platform / removed
- optional: **deal_value** + currency
- optionally a clean **deal record**
- `deal_id`, `listing_id`, `buyer_user_id (optional)`, `final_price`, `closed_at`
**KPIs**
- inquiry→sold
- close rate
- time-to-close
- GMV
### 1D – Anti-Abuse (ongoing from Week 1)
- **Rate limit** per IP + per user (inquire + message + status flips)
- **Spam flagging** (heuristics + manual)
- **Blocklist** (buyer account/email/domain level)
**KPIs**
- spam rate
- blocked attempts
- false positive rate
---
## 2) Yield as the Moat
### 2A – Connect/Nameserver Flow (Weeks 2-4)
**Goal**: bring domains "under control" (connect layer).
- **Connect wizard** (Portfolio → Yield)
- instructions: NS/TXT setup
- status: pending/verified/active
- **Backend checks** (NS/TXT) + persisted `connected_at`
- **Routing entry** (edge/web): request → route decision
**KPIs**
- connect attempts→verified
- connected domains
### 2B – Intent → Routing → Tracking (Month 2)
**Goal**: intent-routing MVP for one vertical.
- **Intent detection** (MVP)
- **Routing** to partners + fallbacks
- **Tracking**: click_id, domain_id, partner_id
- **Attribution**: conversion mapping + payout status
**KPIs**
- clicks/domain
- conversion rate
- revenue/domain
### 2C – Payout + Revenue Share (Months 2-3)
- ledger: pending → confirmed → paid
- payout schedule (monthly) + export/reports
**KPIs**
- payout accuracy
- disputes
- net margin
### 2D – Portfolio Cashflow Dashboard (Month 3)
- the portfolio shows: **MRR, last 30d revenue, ROI**, top routes
- domains become "yield-bearing assets" → later tradable at a multiple
**KPIs**
- MRR
- retention/churn
- expansion
---
## 3) Flywheel / Distribution
### 3A – Programmatic SEO, maxed out (Months 1-2)
- scale the templates (TLD/intel/price)
- clear CTA paths: "Track this TLD", "Enter Terminal", "View Direct Deals"
**KPIs**
- organic sessions
- signup conversion
### 3B – Public Deal Surface + Login Gate (Month 1)
- public Acquire + /buy as the conversion engine
- "contact requires login" consistent everywhere
**KPIs**
- view→login
- login→inquiry
### 3C – Viral Loop "Powered by Pounce" (Months 2-3)
- only where intent fits / low-intent fallback
- referral link + revenue share
**KPIs**
- referral signups
- CAC ~0
---
## 4) Scaling / Telemetry
### 4A – Events (Weeks 1-2)
Define & log events:
- `listing_view`
- `inquiry_created`
- `inquiry_status_changed`
- `message_sent`
- `listing_marked_sold`
- `yield_connected`
- `yield_click`
- `yield_conversion`
- `payout_paid`
**KPIs**
- funnel conversion
- time metrics
### 4B – Ops (Month 1)
- Monitoring/alerts (errors + business KPIs)
- Backups (daily DB + restore drill)
- Deliverability (SPF/DKIM/DMARC, bounce handling)
- Abuse monitoring dashboards
---
## Recommended Sequence (so it gets "unfair" fast)
1. **Deal system 1A-1C** (make GMV & close rate measurable)
2. start **Yield 2A** (connect layer) in parallel
3. pull **Events 4A** along immediately
4. **Yield 2B-2C** (the moat) once connect is stable
5. Flywheel 3A-3C continuously

YIELD_SETUP.md (new file, +256)
@@ -0,0 +1,256 @@
# Pounce Yield - Complete Setup Guide
This guide covers the complete setup of the Yield/Intent Routing feature.
## Overview
Pounce Yield allows users to monetize their parked domains by:
1. Detecting user intent from domain names (e.g., "zahnarzt-zuerich.ch" → Medical/Dental)
2. Routing visitors to relevant affiliate partners
3. Tracking clicks, leads, and sales
4. Splitting revenue 70/30 (user/Pounce)
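For a quick look at step 1, the public analyzer endpoint (see API Endpoints below) can be called without authentication:
```bash
# Classify a domain's intent; the exact response fields may vary
curl -s -X POST "https://api.pounce.ch/api/v1/yield/analyze?domain=zahnarzt-zuerich.ch"
```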
## Architecture
```
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│   User Domain   │────▶│   Pounce Yield   │────▶│    Affiliate    │
│ (DNS → Pounce)  │     │  Routing Engine  │     │     Partner     │
└─────────────────┘     └──────────────────┘     └─────────────────┘
                                 │
                                 ▼
                        ┌──────────────────┐
                        │   Transaction    │
                        │     Tracking     │
                        └──────────────────┘
```
## Setup Steps
### 1. Database Setup
The yield tables are created automatically on startup. To apply migrations to an existing database:
```bash
cd backend
python -c "from app.database import init_db; import asyncio; asyncio.run(init_db())"
```
### 2. Seed Affiliate Partners
Populate the affiliate partners with default Swiss/German partners:
```bash
cd backend
python scripts/seed_yield_partners.py
```
This seeds ~30 partners across categories:
- Medical (Dental, General, Beauty)
- Finance (Insurance, Mortgage, Banking)
- Legal
- Real Estate
- Travel
- Automotive
- Jobs
- Education
- Technology/Hosting
- Shopping
- Food/Delivery
### 3. Configure DNS
For yield domains to work, you need to set up DNS infrastructure:
#### Option A: Dedicated Nameservers (Recommended for Scale)
1. Set up two nameserver instances (e.g., `ns1.pounce.io`, `ns2.pounce.io`)
2. Run PowerDNS or similar with a backend that queries your yield_domains table
3. Return A records pointing to your yield routing service
#### Option B: CNAME Approach (Simpler)
1. Set up a wildcard SSL certificate for `*.yield.pounce.io`
2. Configure Nginx/Caddy to handle all incoming hosts
3. Users add a CNAME: `@ → yield.pounce.io` (at the zone apex this typically requires the provider's ALIAS/ANAME flattening, since a true apex CNAME is not allowed)
### 4. Nginx Configuration
For host-based routing, add this to your nginx config:
```nginx
# Yield domain catch-all
server {
listen 443 ssl http2;
server_name ~^(?<domain>.+)$;
# Wildcard cert
ssl_certificate /etc/ssl/yield.pounce.io.crt;
ssl_certificate_key /etc/ssl/yield.pounce.io.key;
location / {
proxy_pass http://backend:8000/api/v1/r/$domain;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
```
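To smoke-test the path end to end, the routing endpoint documented below can be hit directly (example domain is illustrative):
```bash
# Inspect where a yield domain would route; shows the redirect Location header
curl -sI "https://api.pounce.ch/api/v1/r/zahnarzt-zuerich.ch" | grep -i '^location:'
```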
### 5. Partner Integration
Each affiliate partner requires:
1. **Tracking URL Template**: How to pass click IDs to the partner
2. **Webhook URL**: Where the partner sends conversion data back
Update partners in the database or via admin panel:
```sql
UPDATE affiliate_partners
SET tracking_url_template = 'https://partner.com/?clickid={click_id}&ref={domain}'
WHERE slug = 'partner_slug';
```
### 6. Webhook Configuration
Partners send conversion data to:
```
POST https://api.pounce.ch/api/v1/yield-webhooks/{partner_slug}
{
"event_type": "lead",
"domain": "zahnarzt-zuerich.ch",
"transaction_id": "abc123",
"amount": 25.00,
"currency": "CHF"
}
```
For Awin network, use the dedicated endpoint:
```
POST https://api.pounce.ch/api/v1/yield-webhooks/awin/postback
```
## API Endpoints
### Public
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/v1/yield/analyze?domain=X` | Analyze domain intent (no auth) |
| GET | `/api/v1/yield/partners` | List available partners |
### Authenticated (User)
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/yield/dashboard` | User yield dashboard |
| GET | `/api/v1/yield/domains` | List user's yield domains |
| POST | `/api/v1/yield/activate` | Activate a domain |
| POST | `/api/v1/yield/domains/{id}/verify` | Verify DNS setup |
| GET | `/api/v1/yield/transactions` | Transaction history |
| GET | `/api/v1/yield/payouts` | Payout history |
### Routing
| Method | Endpoint | Description |
|--------|----------|-------------|
| GET | `/api/v1/r/{domain}` | Route traffic & track click |
| GET | `/api/v1/r/{domain}?direct=true` | Direct redirect (no landing) |
### Webhooks (Partner → Pounce)
| Method | Endpoint | Description |
|--------|----------|-------------|
| POST | `/api/v1/yield-webhooks/{partner}` | Generic partner webhook |
| POST | `/api/v1/yield-webhooks/awin/postback` | Awin network postback |
| POST | `/api/v1/yield-webhooks/confirm/{tx_id}` | Manual confirmation (internal) |
| POST | `/api/v1/yield-webhooks/batch-import` | Bulk import (internal) |
## Revenue Model
- **Clicks**: Usually CPC (cost per click), CHF 0.10-0.60
- **Leads**: CPL (cost per lead), CHF 15-120
- **Sales**: CPS (cost per sale), 2-10% of sale value
Revenue split:
- **User**: 70%
- **Pounce**: 30%
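For example, a CHF 25.00 lead payout is split into CHF 17.50 for the domain owner and CHF 7.50 for Pounce.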
## Intent Categories
The IntentDetector recognizes these categories:
| Category | Subcategories | Example Domains |
|----------|---------------|-----------------|
| medical | dental, general, beauty | zahnarzt.ch, arzt-bern.ch |
| finance | insurance, mortgage, banking | versicherung.ch, hypothek.ch |
| legal | general | anwalt-zuerich.ch |
| realestate | buy, rent | wohnung-mieten.ch |
| travel | flights, hotels | flug-buchen.ch |
| auto | buy, service | autokauf.ch |
| jobs | - | stellenmarkt.ch |
| education | - | kurse-online.ch |
| tech | hosting, software | webhosting.ch |
| shopping | general, fashion | mode-shop.ch |
| food | restaurant, delivery | pizza-lieferung.ch |
## Monitoring
### Metrics
Enable Prometheus metrics:
```env
ENABLE_METRICS=true
```
Key yield metrics:
- `yield_clicks_total{domain, partner}`
- `yield_conversions_total{domain, partner, type}`
- `yield_revenue_total{currency}`
### Alerts
Set up alerts for:
- Webhook failures
- Low conversion rates
- DNS verification failures
- Partner API errors
## Troubleshooting
### Domain not routing
1. Check DNS: `dig +short {domain}`
2. Verify domain status: `SELECT status FROM yield_domains WHERE domain = '{domain}'`
3. Check nginx logs for routing errors
### No conversions
1. Verify partner webhook URL is correct
2. Check webhook logs for incoming calls
3. Validate transaction ID format
### Low revenue
1. Check intent detection: Some domains may be classified as "generic"
2. Review partner matching: Higher-priority partners should be assigned
3. Analyze geo distribution: Swiss visitors convert better
## Security Considerations
- All partner webhooks should use HMAC signature verification
- IP addresses are hashed before storage (privacy)
- User revenue data is isolated by user_id
- Rate limiting on routing endpoint
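As a sketch of what signed delivery could look like from the partner side (the `X-Signature` header name and secret handling here are assumptions, not a documented contract):
```bash
# Hypothetical signed webhook call -- header name and secret are assumptions
PARTNER_SECRET="shared-secret-from-onboarding"
BODY='{"event_type":"lead","domain":"zahnarzt-zuerich.ch","transaction_id":"abc123","amount":25.00,"currency":"CHF"}'

# Hex-encoded HMAC-SHA256 over the raw request body
SIG=$(printf '%s' "$BODY" | openssl dgst -sha256 -hmac "$PARTNER_SECRET" | awk '{print $NF}')

curl -X POST "https://api.pounce.ch/api/v1/yield-webhooks/partner_slug" \
  -H "Content-Type: application/json" \
  -H "X-Signature: $SIG" \
  -d "$BODY"
```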
## Support
For issues with:
- Partner integrations: partners@pounce.ch
- Technical issues: dev@pounce.ch
- Payout questions: finance@pounce.ch

@@ -0,0 +1,34 @@
"""Add DNS verification fields to portfolio_domains
Revision ID: 006
Revises: 005
Create Date: 2025-12-13
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '006'
down_revision = '005'
branch_labels = None
depends_on = None
def upgrade() -> None:
"""Add DNS verification columns to portfolio_domains table."""
# Add columns with default values (nullable to avoid issues with existing rows)
op.add_column('portfolio_domains', sa.Column('is_dns_verified', sa.Boolean(), nullable=True, server_default='0'))
op.add_column('portfolio_domains', sa.Column('verification_status', sa.String(50), nullable=True, server_default='unverified'))
op.add_column('portfolio_domains', sa.Column('verification_code', sa.String(100), nullable=True))
op.add_column('portfolio_domains', sa.Column('verification_started_at', sa.DateTime(), nullable=True))
op.add_column('portfolio_domains', sa.Column('verified_at', sa.DateTime(), nullable=True))
def downgrade() -> None:
"""Remove DNS verification columns from portfolio_domains table."""
op.drop_column('portfolio_domains', 'verified_at')
op.drop_column('portfolio_domains', 'verification_started_at')
op.drop_column('portfolio_domains', 'verification_code')
op.drop_column('portfolio_domains', 'verification_status')
op.drop_column('portfolio_domains', 'is_dns_verified')

@@ -0,0 +1,74 @@
"""Add inquiry close fields + audit trail
Revision ID: 007
Revises: 006
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '007'
down_revision = '006'
branch_labels = None
depends_on = None
def upgrade() -> None:
# listing_inquiries: deal workflow
op.add_column('listing_inquiries', sa.Column('closed_reason', sa.String(200), nullable=True))
op.add_column('listing_inquiries', sa.Column('closed_at', sa.DateTime(), nullable=True))
op.create_index(
'ix_listing_inquiries_listing_created',
'listing_inquiries',
['listing_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiries_listing_status',
'listing_inquiries',
['listing_id', 'status'],
unique=False,
)
# listing_inquiry_events: audit trail
op.create_table(
'listing_inquiry_events',
sa.Column('id', sa.Integer(), primary_key=True),
sa.Column('inquiry_id', sa.Integer(), sa.ForeignKey('listing_inquiries.id'), nullable=False, index=True),
sa.Column('listing_id', sa.Integer(), sa.ForeignKey('domain_listings.id'), nullable=False, index=True),
sa.Column('actor_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=False, index=True),
sa.Column('old_status', sa.String(20), nullable=True),
sa.Column('new_status', sa.String(20), nullable=False),
sa.Column('reason', sa.String(200), nullable=True),
sa.Column('ip_address', sa.String(45), nullable=True),
sa.Column('user_agent', sa.String(500), nullable=True),
sa.Column('created_at', sa.DateTime(), nullable=True, index=True),
)
op.create_index(
'ix_listing_inquiry_events_inquiry_created',
'listing_inquiry_events',
['inquiry_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_events_listing_created',
'listing_inquiry_events',
['listing_id', 'created_at'],
unique=False,
)
def downgrade() -> None:
op.drop_index('ix_listing_inquiry_events_listing_created', table_name='listing_inquiry_events')
op.drop_index('ix_listing_inquiry_events_inquiry_created', table_name='listing_inquiry_events')
op.drop_table('listing_inquiry_events')
op.drop_index('ix_listing_inquiries_listing_status', table_name='listing_inquiries')
op.drop_index('ix_listing_inquiries_listing_created', table_name='listing_inquiries')
op.drop_column('listing_inquiries', 'closed_at')
op.drop_column('listing_inquiries', 'closed_reason')

@@ -0,0 +1,61 @@
"""Add inquiry threading (buyer link + messages)
Revision ID: 008
Revises: 007
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = '008'
down_revision = '007'
branch_labels = None
depends_on = None
def upgrade() -> None:
# Link inquiry to buyer account
op.add_column('listing_inquiries', sa.Column('buyer_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=True))
op.create_index('ix_listing_inquiries_buyer_user', 'listing_inquiries', ['buyer_user_id'], unique=False)
# Thread messages
op.create_table(
'listing_inquiry_messages',
sa.Column('id', sa.Integer(), primary_key=True),
sa.Column('inquiry_id', sa.Integer(), sa.ForeignKey('listing_inquiries.id'), nullable=False, index=True),
sa.Column('listing_id', sa.Integer(), sa.ForeignKey('domain_listings.id'), nullable=False, index=True),
sa.Column('sender_user_id', sa.Integer(), sa.ForeignKey('users.id'), nullable=False, index=True),
sa.Column('body', sa.Text(), nullable=False),
sa.Column('created_at', sa.DateTime(), nullable=True, index=True),
)
op.create_index(
'ix_listing_inquiry_messages_inquiry_created',
'listing_inquiry_messages',
['inquiry_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_messages_listing_created',
'listing_inquiry_messages',
['listing_id', 'created_at'],
unique=False,
)
op.create_index(
'ix_listing_inquiry_messages_sender_created',
'listing_inquiry_messages',
['sender_user_id', 'created_at'],
unique=False,
)
def downgrade() -> None:
op.drop_index('ix_listing_inquiry_messages_sender_created', table_name='listing_inquiry_messages')
op.drop_index('ix_listing_inquiry_messages_listing_created', table_name='listing_inquiry_messages')
op.drop_index('ix_listing_inquiry_messages_inquiry_created', table_name='listing_inquiry_messages')
op.drop_table('listing_inquiry_messages')
op.drop_index('ix_listing_inquiries_buyer_user', table_name='listing_inquiries')
op.drop_column('listing_inquiries', 'buyer_user_id')

@@ -0,0 +1,31 @@
"""Add listing sold fields (GMV tracking)
Revision ID: 009
Revises: 008
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = '009'
down_revision = '008'
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column('domain_listings', sa.Column('sold_at', sa.DateTime(), nullable=True))
op.add_column('domain_listings', sa.Column('sold_reason', sa.String(200), nullable=True))
op.add_column('domain_listings', sa.Column('sold_price', sa.Float(), nullable=True))
op.add_column('domain_listings', sa.Column('sold_currency', sa.String(3), nullable=True))
op.create_index('ix_domain_listings_status', 'domain_listings', ['status'], unique=False)
def downgrade() -> None:
op.drop_index('ix_domain_listings_status', table_name='domain_listings')
op.drop_column('domain_listings', 'sold_currency')
op.drop_column('domain_listings', 'sold_price')
op.drop_column('domain_listings', 'sold_reason')
op.drop_column('domain_listings', 'sold_at')

@@ -0,0 +1,25 @@
"""Add yield connected_at timestamp.
Revision ID: 010_add_yield_connected_at
Revises: 009
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = "010_add_yield_connected_at"
down_revision = "009"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("yield_domains", sa.Column("connected_at", sa.DateTime(), nullable=True))
def downgrade() -> None:
op.drop_column("yield_domains", "connected_at")

@@ -0,0 +1,28 @@
"""Add click_id + destination_url to yield transactions.
Revision ID: 011_add_yield_transaction_click_id
Revises: 010_add_yield_connected_at
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = "011_add_yield_transaction_click_id"
down_revision = "010_add_yield_connected_at"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("yield_transactions", sa.Column("click_id", sa.String(length=64), nullable=True))
op.add_column("yield_transactions", sa.Column("destination_url", sa.Text(), nullable=True))
op.create_index("ix_yield_transactions_click_id", "yield_transactions", ["click_id"], unique=False)
def downgrade() -> None:
op.drop_index("ix_yield_transactions_click_id", table_name="yield_transactions")
op.drop_column("yield_transactions", "destination_url")
op.drop_column("yield_transactions", "click_id")

@@ -0,0 +1,67 @@
"""Add telemetry_events table.
Revision ID: 012_add_telemetry_events
Revises: 011_add_yield_transaction_click_id
Create Date: 2025-12-15
"""
from alembic import op
import sqlalchemy as sa
revision = "012_add_telemetry_events"
down_revision = "011_add_yield_transaction_click_id"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"telemetry_events",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("user_id", sa.Integer(), nullable=True),
sa.Column("event_name", sa.String(length=60), nullable=False),
sa.Column("listing_id", sa.Integer(), nullable=True),
sa.Column("inquiry_id", sa.Integer(), nullable=True),
sa.Column("yield_domain_id", sa.Integer(), nullable=True),
sa.Column("click_id", sa.String(length=64), nullable=True),
sa.Column("domain", sa.String(length=255), nullable=True),
sa.Column("source", sa.String(length=30), nullable=True),
sa.Column("ip_hash", sa.String(length=64), nullable=True),
sa.Column("user_agent", sa.String(length=500), nullable=True),
sa.Column("referrer", sa.String(length=500), nullable=True),
sa.Column("metadata_json", sa.Text(), nullable=True),
sa.Column("is_authenticated", sa.Boolean(), nullable=True),
sa.Column("created_at", sa.DateTime(), nullable=False),
sa.ForeignKeyConstraint(["user_id"], ["users.id"]),
)
op.create_index("ix_telemetry_events_event_name", "telemetry_events", ["event_name"])
op.create_index("ix_telemetry_events_user_id", "telemetry_events", ["user_id"])
op.create_index("ix_telemetry_events_listing_id", "telemetry_events", ["listing_id"])
op.create_index("ix_telemetry_events_inquiry_id", "telemetry_events", ["inquiry_id"])
op.create_index("ix_telemetry_events_yield_domain_id", "telemetry_events", ["yield_domain_id"])
op.create_index("ix_telemetry_events_click_id", "telemetry_events", ["click_id"])
op.create_index("ix_telemetry_events_domain", "telemetry_events", ["domain"])
op.create_index("ix_telemetry_events_created_at", "telemetry_events", ["created_at"])
op.create_index("ix_telemetry_event_name_created", "telemetry_events", ["event_name", "created_at"])
op.create_index("ix_telemetry_user_created", "telemetry_events", ["user_id", "created_at"])
op.create_index("ix_telemetry_listing_created", "telemetry_events", ["listing_id", "created_at"])
op.create_index("ix_telemetry_yield_created", "telemetry_events", ["yield_domain_id", "created_at"])
def downgrade() -> None:
op.drop_index("ix_telemetry_yield_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_listing_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_user_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_event_name_created", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_created_at", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_domain", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_click_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_yield_domain_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_inquiry_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_listing_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_user_id", table_name="telemetry_events")
op.drop_index("ix_telemetry_events_event_name", table_name="telemetry_events")
op.drop_table("telemetry_events")

@@ -0,0 +1,41 @@
"""add ops alert events
Revision ID: 013_add_ops_alert_events
Revises: 012_add_telemetry_events
Create Date: 2025-12-15
"""
from __future__ import annotations
from alembic import op
import sqlalchemy as sa
revision = "013_add_ops_alert_events"
down_revision = "012_add_telemetry_events"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.create_table(
"ops_alert_events",
sa.Column("id", sa.Integer(), primary_key=True),
sa.Column("alert_key", sa.String(length=80), nullable=False),
sa.Column("severity", sa.String(length=10), nullable=False),
sa.Column("title", sa.String(length=200), nullable=False),
sa.Column("detail", sa.Text(), nullable=True),
sa.Column("status", sa.String(length=20), nullable=False),
sa.Column("recipients", sa.Text(), nullable=True),
sa.Column("send_reason", sa.String(length=60), nullable=True),
sa.Column("created_at", sa.DateTime(), nullable=False, server_default=sa.text("now()")),
)
op.create_index("ix_ops_alert_key_created", "ops_alert_events", ["alert_key", "created_at"])
op.create_index("ix_ops_alert_status_created", "ops_alert_events", ["status", "created_at"])
def downgrade() -> None:
op.drop_index("ix_ops_alert_status_created", table_name="ops_alert_events")
op.drop_index("ix_ops_alert_key_created", table_name="ops_alert_events")
op.drop_table("ops_alert_events")

@@ -0,0 +1,28 @@
"""add users invite_code
Revision ID: 014_add_user_invite_code
Revises: 013_add_ops_alert_events
Create Date: 2025-12-15
"""
from __future__ import annotations
from alembic import op
import sqlalchemy as sa
revision = "014_add_user_invite_code"
down_revision = "013_add_ops_alert_events"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column("users", sa.Column("invite_code", sa.String(length=32), nullable=True))
op.create_index("ix_users_invite_code", "users", ["invite_code"], unique=True)
def downgrade() -> None:
op.drop_index("ix_users_invite_code", table_name="users")
op.drop_column("users", "invite_code")

@@ -0,0 +1,29 @@
"""add subscription referral bonus domains
Revision ID: 015_add_subscription_referral_bonus_domains
Revises: 014_add_user_invite_code
Create Date: 2025-12-15
"""
from __future__ import annotations
import sqlalchemy as sa
from alembic import op
revision = "015_add_subscription_referral_bonus_domains"
down_revision = "014_add_user_invite_code"
branch_labels = None
depends_on = None
def upgrade() -> None:
op.add_column(
"subscriptions",
sa.Column("referral_bonus_domains", sa.Integer(), nullable=False, server_default="0"),
)
def downgrade() -> None:
op.drop_column("subscriptions", "referral_bonus_domains")

@@ -19,6 +19,14 @@ from app.api.sniper_alerts import router as sniper_alerts_router
from app.api.seo import router as seo_router
from app.api.dashboard import router as dashboard_router
from app.api.yield_domains import router as yield_router
from app.api.yield_webhooks import router as yield_webhooks_router
from app.api.yield_routing import router as yield_routing_router
from app.api.yield_payout_admin import router as yield_payout_admin_router
from app.api.telemetry import router as telemetry_router
from app.api.analyze import router as analyze_router
from app.api.hunt import router as hunt_router
from app.api.cfo import router as cfo_router
from app.api.drops import router as drops_router
api_router = APIRouter()
@@ -33,6 +41,10 @@ api_router.include_router(price_alerts_router, prefix="/price-alerts", tags=["Pr
api_router.include_router(portfolio_router, prefix="/portfolio", tags=["Portfolio"])
api_router.include_router(auctions_router, prefix="/auctions", tags=["Smart Pounce - Auctions"])
api_router.include_router(dashboard_router, prefix="/dashboard", tags=["Dashboard"])
api_router.include_router(analyze_router, prefix="/analyze", tags=["Analyze"])
api_router.include_router(hunt_router, prefix="/hunt", tags=["Hunt"])
api_router.include_router(cfo_router, prefix="/cfo", tags=["CFO"])
api_router.include_router(drops_router, tags=["Drops - Zone Files"])
# Marketplace (For Sale) - from analysis_3.md
api_router.include_router(listings_router, prefix="/listings", tags=["Marketplace - For Sale"])
@@ -45,6 +57,12 @@ api_router.include_router(seo_router, prefix="/seo", tags=["SEO Data - Tycoon"])
# Yield / Intent Routing - Passive income from parked domains
api_router.include_router(yield_router, tags=["Yield - Intent Routing"])
api_router.include_router(yield_webhooks_router, tags=["Yield - Webhooks"])
api_router.include_router(yield_routing_router, tags=["Yield - Routing"])
api_router.include_router(yield_payout_admin_router, tags=["Yield - Admin"])
# Telemetry / KPIs (admin)
api_router.include_router(telemetry_router, tags=["Telemetry"])
# Support & Communication
api_router.include_router(contact_router, prefix="/contact", tags=["Contact & Newsletter"])

@@ -11,7 +11,7 @@ Provides admin-only access to:
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional
from fastapi import APIRouter, HTTPException, status, Depends
from fastapi import APIRouter, HTTPException, status, Depends, BackgroundTasks
from pydantic import BaseModel, EmailStr
from sqlalchemy import select, func, desc
@@ -25,6 +25,10 @@ from app.models.newsletter import NewsletterSubscriber
from app.models.tld_price import TLDPrice, TLDInfo
from app.models.auction import DomainAuction
from app.models.price_alert import PriceAlert
from app.models.listing import DomainListing
from app.services.db_backup import create_backup, list_backups
from app.services.ops_alerts import run_ops_alert_checks
from app.models.ops_alert import OpsAlertEvent
router = APIRouter()
settings = get_settings()
@@ -189,6 +193,213 @@ async def get_admin_stats(
}
# ============== Earnings / Revenue ==============
@router.get("/earnings")
async def get_admin_earnings(
db: Database,
admin: User = Depends(require_admin)
):
"""
Get earnings and revenue metrics for admin dashboard.
Calculates MRR, ARR, and subscription breakdown.
"""
# Tier prices (from TIER_CONFIG)
tier_prices = {
SubscriptionTier.SCOUT: 0,
SubscriptionTier.TRADER: 9,
SubscriptionTier.TYCOON: 29,
}
# Get all active subscriptions
result = await db.execute(
select(Subscription).where(
Subscription.status == SubscriptionStatus.ACTIVE
)
)
active_subs = result.scalars().all()
# Calculate MRR
mrr = 0.0
tier_breakdown = {
"scout": {"count": 0, "revenue": 0},
"trader": {"count": 0, "revenue": 0},
"tycoon": {"count": 0, "revenue": 0},
}
for sub in active_subs:
price = tier_prices.get(sub.tier, 0)
mrr += price
tier_key = sub.tier.value
if tier_key in tier_breakdown:
tier_breakdown[tier_key]["count"] += 1
tier_breakdown[tier_key]["revenue"] += price
arr = mrr * 12
# New subscriptions this week
week_ago = datetime.utcnow() - timedelta(days=7)
new_subs_week = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.started_at >= week_ago,
Subscription.tier != SubscriptionTier.SCOUT
)
)
new_subs_week = new_subs_week.scalar() or 0
# New subscriptions this month
month_ago = datetime.utcnow() - timedelta(days=30)
new_subs_month = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.started_at >= month_ago,
Subscription.tier != SubscriptionTier.SCOUT
)
)
new_subs_month = new_subs_month.scalar() or 0
# Cancelled subscriptions this month (churn)
cancelled_month = await db.execute(
select(func.count(Subscription.id)).where(
Subscription.cancelled_at >= month_ago,
Subscription.cancelled_at.isnot(None)
)
)
cancelled_month = cancelled_month.scalar() or 0
# Total paying customers
paying_customers = tier_breakdown["trader"]["count"] + tier_breakdown["tycoon"]["count"]
# Revenue from Yield (platform's 30% cut)
try:
from app.models.yield_domain import YieldTransaction
yield_revenue = await db.execute(
select(func.sum(YieldTransaction.net_amount)).where(
YieldTransaction.created_at >= month_ago,
YieldTransaction.status == "confirmed"
)
)
yield_revenue_month = float(yield_revenue.scalar() or 0) * 0.30 / 0.70  # net_amount is the owner's 70% share; platform cut = net * (0.30 / 0.70)
except Exception:
yield_revenue_month = 0
return {
"mrr": round(mrr, 2),
"arr": round(arr, 2),
"paying_customers": paying_customers,
"tier_breakdown": tier_breakdown,
"new_subscriptions": {
"week": new_subs_week,
"month": new_subs_month,
},
"churn": {
"month": cancelled_month,
},
"yield_revenue_month": round(yield_revenue_month, 2),
"total_revenue_month": round(mrr + yield_revenue_month, 2),
"timestamp": datetime.utcnow().isoformat(),
}
# ============== Earnings History ==============
@router.get("/earnings/history")
async def get_admin_earnings_history(
db: Database,
admin: User = Depends(require_admin),
months: int = 12
):
"""
Get historical earnings data for charts.
Calculates MRR for each month based on subscription start dates.
"""
tier_prices = {
SubscriptionTier.SCOUT: 0,
SubscriptionTier.TRADER: 9,
SubscriptionTier.TYCOON: 29,
}
# Get all subscriptions
result = await db.execute(select(Subscription))
all_subs = result.scalars().all()
# Generate monthly data for the last N months
monthly_data = []
now = datetime.utcnow()
for i in range(months - 1, -1, -1):
# Calculate the start of each month
month_start = datetime(now.year, now.month, 1) - timedelta(days=i * 30)
month_end = month_start + timedelta(days=30)
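# NOTE: fixed 30-day steps only approximate calendar months; buckets drift slightly over a 12-month window.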
month_name = month_start.strftime("%b %Y")
# Calculate MRR for this month
mrr = 0.0
tier_counts = {"scout": 0, "trader": 0, "tycoon": 0}
new_subs = 0
churned = 0
for sub in all_subs:
# Was this subscription active during this month?
started_before_month_end = sub.started_at <= month_end
cancelled_after_month_start = (sub.cancelled_at is None or sub.cancelled_at >= month_start)
if started_before_month_end and cancelled_after_month_start:
price = tier_prices.get(sub.tier, 0)
mrr += price
tier_key = sub.tier.value
if tier_key in tier_counts:
tier_counts[tier_key] += 1
# New subscriptions in this month
if month_start <= sub.started_at < month_end and sub.tier != SubscriptionTier.SCOUT:
new_subs += 1
# Churned in this month
if sub.cancelled_at and month_start <= sub.cancelled_at < month_end:
churned += 1
monthly_data.append({
"month": month_name,
"mrr": round(mrr, 2),
"arr": round(mrr * 12, 2),
"paying_customers": tier_counts["trader"] + tier_counts["tycoon"],
"scout": tier_counts["scout"],
"trader": tier_counts["trader"],
"tycoon": tier_counts["tycoon"],
"new_subscriptions": new_subs,
"churn": churned,
})
# Calculate growth metrics
if len(monthly_data) >= 2:
current_mrr = monthly_data[-1]["mrr"]
prev_mrr = monthly_data[-2]["mrr"] if monthly_data[-2]["mrr"] > 0 else 1
mrr_growth = ((current_mrr - prev_mrr) / prev_mrr) * 100
else:
mrr_growth = 0
# Calculate average revenue per user (ARPU)
current_paying = monthly_data[-1]["paying_customers"] if monthly_data else 0
current_mrr = monthly_data[-1]["mrr"] if monthly_data else 0
arpu = current_mrr / current_paying if current_paying > 0 else 0
# Calculate LTV (assuming 12 month average retention)
ltv = arpu * 12
return {
"monthly_data": monthly_data,
"metrics": {
"mrr_growth_percent": round(mrr_growth, 1),
"arpu": round(arpu, 2),
"ltv": round(ltv, 2),
"total_customers": sum(m["paying_customers"] for m in monthly_data[-1:]),
},
"timestamp": datetime.utcnow().isoformat(),
}
# ============== User Management ==============
class UpdateUserRequest(BaseModel):
@ -525,12 +736,12 @@ async def upgrade_user(
user_id=user.id,
tier=new_tier,
status=SubscriptionStatus.ACTIVE,
- domain_limit=config.get("domain_limit", 5),
+ max_domains=config.get("domain_limit", 5),
)
db.add(subscription)
else:
subscription.tier = new_tier
- subscription.domain_limit = config.get("domain_limit", 5)
+ subscription.max_domains = config.get("domain_limit", 5)
subscription.status = SubscriptionStatus.ACTIVE
await db.commit()
@ -811,7 +1022,7 @@ async def test_email(
"""Send a test email to the admin user."""
from app.services.email_service import email_service
- if not email_service.is_configured:
+ if not email_service.is_configured():
raise HTTPException(
status_code=400,
detail="Email service is not configured. Check SMTP settings."
@ -897,6 +1108,83 @@ async def get_scheduler_status(
}
# ============== Ops: Backups (4B) ==============
@router.get("/system/backups")
async def get_backups(
admin: User = Depends(require_admin),
limit: int = 20,
):
"""List recent DB backups on the server."""
return {"backups": list_backups(limit=limit)}
@router.post("/system/backups")
async def create_db_backup(
admin: User = Depends(require_admin),
verify: bool = True,
):
"""Create a DB backup on the server (and verify it)."""
if not settings.enable_db_backups:
raise HTTPException(status_code=403, detail="DB backups are disabled (ENABLE_DB_BACKUPS=false).")
try:
result = create_backup(verify=verify)
return {
"status": "ok",
"backup": {
"path": result.path,
"size_bytes": result.size_bytes,
"created_at": result.created_at,
"verified": result.verified,
"verification_detail": result.verification_detail,
},
}
except Exception as e:
raise HTTPException(status_code=500, detail=f"Backup failed: {e}")
@router.post("/system/ops-alerts/run")
async def run_ops_alerts_now(
admin: User = Depends(require_admin),
):
"""
Run ops alert checks immediately (and send alerts if enabled).
Useful for server validation without Docker.
"""
return await run_ops_alert_checks()
@router.get("/system/ops-alerts/history")
async def get_ops_alert_history(
db: Database,
admin: User = Depends(require_admin),
limit: int = 100,
):
"""Return recent persisted ops alert events."""
limit = max(1, min(int(limit), 500))
rows = (
await db.execute(
select(OpsAlertEvent).order_by(OpsAlertEvent.created_at.desc()).limit(limit)
)
).scalars().all()
return {
"events": [
{
"id": e.id,
"alert_key": e.alert_key,
"severity": e.severity,
"title": e.title,
"detail": e.detail,
"status": e.status,
"send_reason": e.send_reason,
"recipients": e.recipients,
"created_at": e.created_at.isoformat(),
}
for e in rows
]
}
# ============== Bulk Operations ==============
class BulkUpgradeRequest(BaseModel):
@ -1180,3 +1468,261 @@ async def get_scrape_status(
for log in logs
]
}
# ============== Subscription Management ==============
class SubscriptionUpdate(BaseModel):
"""Manual subscription update request."""
tier: str # "scout", "trader", "tycoon"
@router.post("/users/{user_id}/sync-subscription")
async def sync_user_subscription_from_stripe(
user_id: int,
db: Database,
admin: User = Depends(require_admin),
):
"""
Sync a user's subscription status from Stripe.
Use this if the webhook failed to update the subscription.
"""
import stripe
import os
stripe.api_key = os.getenv("STRIPE_SECRET_KEY")
if not stripe.api_key:
raise HTTPException(status_code=503, detail="Stripe not configured")
# Get user
result = await db.execute(select(User).where(User.id == user_id))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=404, detail="User not found")
if not user.stripe_customer_id:
raise HTTPException(status_code=400, detail="User has no Stripe customer ID")
# Get subscriptions from Stripe
try:
subscriptions = stripe.Subscription.list(
customer=user.stripe_customer_id,
status="active",
limit=1
)
except stripe.error.StripeError as e:
raise HTTPException(status_code=500, detail=f"Stripe error: {e}")
if not subscriptions.data:
return {
"status": "no_active_subscription",
"message": "No active subscription found in Stripe",
"user_email": user.email,
"stripe_customer_id": user.stripe_customer_id
}
stripe_sub = subscriptions.data[0]
# Access items via dict notation (Stripe returns StripeObject)
items_data = stripe_sub.get("items", {}).get("data", [])
price_id = items_data[0].get("price", {}).get("id") if items_data else None
# Map price_id to tier
trader_price = os.getenv("STRIPE_PRICE_TRADER")
tycoon_price = os.getenv("STRIPE_PRICE_TYCOON")
if price_id == trader_price:
tier = SubscriptionTier.TRADER
tier_name = "trader"
elif price_id == tycoon_price:
tier = SubscriptionTier.TYCOON
tier_name = "tycoon"
else:
return {
"status": "unknown_price",
"message": f"Unknown price ID: {price_id}",
"stripe_subscription_id": stripe_sub.id
}
# Update subscription in database
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
tier_config = TIER_CONFIG[tier]
if subscription:
old_tier = subscription.tier
subscription.tier = tier
subscription.status = SubscriptionStatus.ACTIVE
subscription.stripe_subscription_id = stripe_sub.id
subscription.max_domains = tier_config["domain_limit"]
subscription.check_frequency = tier_config["check_frequency"]
else:
subscription = Subscription(
user_id=user.id,
tier=tier,
status=SubscriptionStatus.ACTIVE,
stripe_subscription_id=stripe_sub.id,
max_domains=tier_config["domain_limit"],
check_frequency=tier_config["check_frequency"],
)
db.add(subscription)
old_tier = None
await db.commit()
return {
"status": "synced",
"user_email": user.email,
"stripe_customer_id": user.stripe_customer_id,
"stripe_subscription_id": stripe_sub.id,
"old_tier": old_tier.value if old_tier else None,
"new_tier": tier.value,
"tier_config": {
"domain_limit": tier_config["domain_limit"],
"check_frequency": tier_config["check_frequency"],
}
}
@router.post("/users/{user_id}/set-subscription")
async def set_user_subscription(
user_id: int,
update: SubscriptionUpdate,
db: Database,
admin: User = Depends(require_admin),
):
"""
Manually set a user's subscription tier.
Use this to manually upgrade/downgrade users (e.g., for refunds or promotions).
"""
tier_map = {
"scout": SubscriptionTier.SCOUT,
"trader": SubscriptionTier.TRADER,
"tycoon": SubscriptionTier.TYCOON,
}
if update.tier.lower() not in tier_map:
raise HTTPException(status_code=400, detail=f"Invalid tier: {update.tier}")
tier = tier_map[update.tier.lower()]
# Get user
result = await db.execute(select(User).where(User.id == user_id))
user = result.scalar_one_or_none()
if not user:
raise HTTPException(status_code=404, detail="User not found")
# Get/create subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
tier_config = TIER_CONFIG[tier]
if subscription:
old_tier = subscription.tier
subscription.tier = tier
subscription.status = SubscriptionStatus.ACTIVE
subscription.max_domains = tier_config["domain_limit"]
subscription.check_frequency = tier_config["check_frequency"]
else:
subscription = Subscription(
user_id=user.id,
tier=tier,
status=SubscriptionStatus.ACTIVE,
max_domains=tier_config["domain_limit"],
check_frequency=tier_config["check_frequency"],
)
db.add(subscription)
old_tier = None
await db.commit()
return {
"status": "updated",
"user_email": user.email,
"user_id": user.id,
"old_tier": old_tier.value if old_tier else None,
"new_tier": tier.value,
}
# ============== Listing Debug Endpoints ==============
@router.get("/listings/debug")
async def debug_listings(
domain: Optional[str] = None,
slug: Optional[str] = None,
db: Database = None,
_: User = Depends(require_admin),
):
"""Debug listings - search by domain or slug (ignores status)."""
query = select(DomainListing)
if domain:
query = query.where(DomainListing.domain.ilike(f"%{domain}%"))
if slug:
query = query.where(DomainListing.slug.ilike(f"%{slug}%"))
query = query.order_by(desc(DomainListing.created_at)).limit(20)
result = await db.execute(query)
listings = list(result.scalars().all())
return {
"count": len(listings),
"listings": [
{
"id": l.id,
"domain": l.domain,
"slug": l.slug,
"status": l.status,
"is_verified": l.is_verified,
"verification_status": l.verification_status,
"public_url": l.public_url,
"created_at": str(l.created_at) if l.created_at else None,
"published_at": str(l.published_at) if l.published_at else None,
"user_id": l.user_id,
}
for l in listings
]
}
@router.post("/listings/{listing_id}/force-activate")
async def force_activate_listing(
listing_id: int,
db: Database = None,
_: User = Depends(require_admin),
):
"""Force-activate a listing (bypass DNS verification)."""
result = await db.execute(
select(DomainListing).where(DomainListing.id == listing_id)
)
listing = result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
listing.status = "active"
listing.is_verified = True
listing.verification_status = "verified"
listing.published_at = datetime.utcnow()
await db.commit()
return {
"status": "activated",
"listing_id": listing.id,
"domain": listing.domain,
"slug": listing.slug,
"public_url": listing.public_url,
}

View File

@ -0,0 +1,36 @@
"""Analyze API endpoints (Alpha Terminal - Diligence)."""
from __future__ import annotations
from fastapi import APIRouter, Query, Request
from slowapi import Limiter
from slowapi.util import get_remote_address
from app.api.deps import CurrentUser, Database
from app.schemas.analyze import AnalyzeResponse
from app.services.analyze.service import get_domain_analysis
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
@router.get("/{domain}", response_model=AnalyzeResponse)
@limiter.limit("60/minute")
async def analyze_domain(
request: Request,
domain: str,
current_user: CurrentUser,
db: Database,
fast: bool = Query(False, description="Skip slower HTTP/SSL checks"),
refresh: bool = Query(False, description="Bypass cache and recompute"),
):
"""
Analyze a domain with open-data-first signals.
Requires authentication (Terminal feature).
"""
_ = current_user # enforce auth
res = await get_domain_analysis(db, domain, fast=fast, refresh=refresh)
await db.commit() # persist cache upsert
return res

View File

@ -184,15 +184,27 @@ def _format_time_remaining(end_time: datetime, now: Optional[datetime] = None) -
def _get_affiliate_url(platform: str, domain: str, auction_url: str) -> str:
"""Get affiliate URL for a platform - links directly to the auction page."""
"""Get affiliate URL for a platform - links directly to the auction page with affiliate tracking."""
# SEDO SPECIAL CASE: Always use direct Sedo link with partner ID
# This ensures we get affiliate revenue even from scraped data
if platform == "Sedo":
return f"https://sedo.com/search/details/?domain={domain}&partnerid=335830"
# Import here to avoid circular imports
from app.services.hidden_api_scrapers import build_affiliate_url
# Try to build affiliate URL first (includes partner IDs)
affiliate_url = build_affiliate_url(platform, domain, auction_url)
if affiliate_url:
return affiliate_url
# Use the scraped auction URL directly if available
if auction_url and auction_url.startswith("http"):
return auction_url
- # Fallback to platform-specific search/listing pages
+ # Fallback to platform-specific search/listing pages (without affiliate tracking)
platform_urls = {
"GoDaddy": f"https://auctions.godaddy.com/trpItemListing.aspx?domain={domain}",
"Sedo": f"https://sedo.com/search/?keyword={domain}",
"NameJet": f"https://www.namejet.com/Pages/Auctions/BackorderSearch.aspx?q={domain}",
"DropCatch": f"https://www.dropcatch.com/domain/{domain}",
"ExpiredDomains": f"https://www.expireddomains.net/domain-name-search/?q={domain}",
@ -617,6 +629,50 @@ async def trigger_scrape(
raise HTTPException(status_code=500, detail=f"Scrape failed: {str(e)}")
@router.get("/sedo")
async def get_sedo_listings(
keyword: Optional[str] = Query(None, description="Search keyword"),
tld: Optional[str] = Query(None, description="Filter by TLD"),
limit: int = Query(50, le=100),
current_user: Optional[User] = Depends(get_current_user_optional),
):
"""
Get live domain listings from Sedo marketplace.
Returns real-time data from Sedo API with affiliate tracking.
All links include Pounce partner ID for commission tracking.
"""
from app.services.sedo_api import sedo_client
if not sedo_client.is_configured:
return {
"items": [],
"error": "Sedo API not configured",
"source": "sedo"
}
try:
listings = await sedo_client.get_listings_for_display(
keyword=keyword,
tld=tld,
page_size=limit
)
return {
"items": listings,
"count": len(listings),
"source": "sedo",
"affiliate_note": "All links include Pounce partner ID for commission tracking"
}
except Exception as e:
logger.error(f"Sedo API error: {e}")
return {
"items": [],
"error": str(e),
"source": "sedo"
}
@router.get("/opportunities")
async def get_smart_opportunities(
current_user: User = Depends(get_current_user),
@ -899,9 +955,9 @@ async def get_market_feed(
# Build base filters (SQL-side)
# -----------------------------
listing_filters = [DomainListing.status == ListingStatus.ACTIVE.value]
if keyword:
listing_filters.append(DomainListing.domain.ilike(f"%{keyword}%"))
if verified_only:
listing_filters.append(DomainListing.verification_status == VerificationStatus.VERIFIED.value)
if min_price is not None:
listing_filters.append(DomainListing.asking_price >= min_price)
@ -918,9 +974,9 @@ async def get_market_feed(
auction_filters.append(DomainAuction.domain.ilike(f"%{keyword}%"))
if tld_clean:
auction_filters.append(DomainAuction.tld == tld_clean)
if min_price is not None:
auction_filters.append(DomainAuction.current_bid >= min_price)
if max_price is not None:
auction_filters.append(DomainAuction.current_bid <= max_price)
if ending_within:
cutoff = now + timedelta(hours=ending_within)
@ -996,7 +1052,7 @@ async def get_market_feed(
)
built.append({"item": item, "newest_ts": listing.updated_at or listing.created_at or datetime.min})
- # External auctions
+ # External auctions (from DB)
if source in ["all", "external"]:
auction_query = select(DomainAuction).where(and_(*auction_filters))
@ -1026,13 +1082,13 @@ async def get_market_feed(
pounce_score = auction.pounce_score
if pounce_score is None:
pounce_score = _calculate_pounce_score_v2(
auction.domain,
auction.tld,
num_bids=auction.num_bids,
age_years=auction.age_years or 0,
is_pounce=False,
)
if pounce_score < min_score:
continue
@ -1055,6 +1111,93 @@ async def get_market_feed(
pounce_score=pounce_score,
)
built.append({"item": item, "newest_ts": auction.updated_at or auction.scraped_at or datetime.min})
# =========================================================================
# LIVE SEDO DATA - Fetch and merge real-time listings from Sedo API
# =========================================================================
try:
from app.services.sedo_api import sedo_client
if sedo_client.is_configured:
# Use search keyword or fall back to popular terms for discovery
sedo_keyword = keyword
if not sedo_keyword:
# Fetch popular domains when no specific search
import random
popular_terms = ["ai", "tech", "crypto", "app", "cloud", "digital", "smart", "pro"]
sedo_keyword = random.choice(popular_terms)
# Fetch live Sedo listings (limit to avoid slow responses)
sedo_listings = await sedo_client.get_listings_for_display(
keyword=sedo_keyword,
tld=tld_clean,
page_size=min(30, limit) # Cap at 30 to avoid slow API calls
)
# Track domains already in results to avoid duplicates
existing_domains = {item["item"].domain.lower() for item in built}
for sedo_item in sedo_listings:
domain = sedo_item.get("domain", "").lower()
# Skip if already have this domain from scraped data
if domain in existing_domains:
continue
# Apply vanity filter for anonymous users
if current_user is None and not _is_premium_domain(domain):
continue
# Apply price filters
price = sedo_item.get("price", 0)
if min_price is not None and price < min_price and price > 0:
continue
if max_price is not None and price > max_price:
continue
domain_tld = sedo_item.get("tld", "")
pounce_score = _calculate_pounce_score_v2(
domain,
domain_tld,
num_bids=0,
age_years=0,
is_pounce=False,
)
if pounce_score < min_score:
continue
# Determine price type
price_type = "bid" if sedo_item.get("is_auction") else (
"negotiable" if price == 0 else "fixed"
)
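# NOTE: Python's hash() is salted per process (PYTHONHASHSEED), so these synthetic IDs are not stable across restarts or workers.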
item = MarketFeedItem(
id=f"sedo-live-{hash(domain) % 1000000}",
domain=domain,
tld=domain_tld,
price=price,
currency="USD",
price_type=price_type,
status="auction" if sedo_item.get("is_auction") else "instant",
source="Sedo",
is_pounce=False,
verified=False,
time_remaining=None,
end_time=None,
num_bids=None,
url=sedo_item.get("url", ""),
is_external=True,
pounce_score=pounce_score,
)
built.append({"item": item, "newest_ts": now})
existing_domains.add(domain)
# Update auction count
auction_total += 1
except Exception as e:
logger.warning(f"Failed to fetch live Sedo data: {e}")
# -----------------------------
# Merge sort (Python) + paginate

View File

@ -14,6 +14,7 @@ Endpoints:
import os
import secrets
import logging
import re
from datetime import datetime, timedelta
from typing import Optional
@ -25,11 +26,24 @@ from slowapi.util import get_remote_address
from app.api.deps import Database, CurrentUser
from app.config import get_settings
- from app.schemas.auth import UserCreate, UserLogin, UserResponse, LoginResponse
+ from app.schemas.auth import (
LoginResponse,
ReferralLinkResponse,
ReferralStats,
UserCreate,
UserLogin,
UserResponse,
)
from app.services.auth import AuthService
from app.services.email_service import email_service
from app.models.user import User
from app.security import set_auth_cookie, clear_auth_cookie
from app.services.telemetry import track_event
from app.services.referral_rewards import (
QUALIFIED_REFERRAL_BATCH_SIZE,
apply_referral_rewards_for_user,
compute_badge,
)
logger = logging.getLogger(__name__)
@ -72,7 +86,9 @@ class UpdateUserRequest(BaseModel):
# ============== Endpoints ==============
@router.post("/register", response_model=UserResponse, status_code=status.HTTP_201_CREATED)
@limiter.limit("5/minute")
async def register(
request: Request,
user_data: UserCreate,
db: Database,
background_tasks: BackgroundTasks,
@ -100,6 +116,62 @@ async def register(
name=user_data.name,
)
# Process referral if present.
# Supported formats:
# - yield_{user_id}_{domain_id}
# - invite code (12 hex chars)
referral_applied = False
referrer_user_id: Optional[int] = None
referral_type: Optional[str] = None
if user_data.ref:
ref_raw = user_data.ref.strip()
# Yield referral: yield_{user_id}_{domain_id}
if ref_raw.startswith("yield_"):
try:
parts = ref_raw.split("_")
if len(parts) >= 3:
referrer_user_id = int(parts[1])
user.referred_by_user_id = referrer_user_id
user.referral_code = ref_raw
referral_type = "yield"
# Try to map the yield_domain_id to a domain string
try:
from app.models.yield_domain import YieldDomain
yield_domain_id = int(parts[2])
yd_res = await db.execute(select(YieldDomain).where(YieldDomain.id == yield_domain_id))
yd = yd_res.scalar_one_or_none()
if yd:
user.referred_by_domain = yd.domain
except Exception:
pass
await db.commit()
referral_applied = True
logger.info("User %s referred via yield by user %s", user.email, referrer_user_id)
except Exception as e:
logger.warning("Failed to process yield referral code: %s, error: %s", ref_raw, e)
else:
# Invite code referral (viral loop)
code = ref_raw.lower()
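# 12 lowercase hex chars matches secrets.token_hex(6), the format used when invite codes are generated (see get_referral_link).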
if re.fullmatch(r"[0-9a-f]{12}", code):
try:
ref_user_res = await db.execute(select(User).where(User.invite_code == code))
ref_user = ref_user_res.scalar_one_or_none()
if ref_user and ref_user.id != user.id:
referrer_user_id = ref_user.id
user.referred_by_user_id = ref_user.id
user.referral_code = code
referral_type = "invite"
await db.commit()
referral_applied = True
logger.info("User %s referred via invite_code by user %s", user.email, ref_user.id)
except Exception as e:
logger.warning("Failed to process invite referral code: %s, error: %s", code, e)
# Auto-admin for specific email
ADMIN_EMAILS = ["guggeryves@hotmail.com"]
if user.email.lower() in [e.lower() for e in ADMIN_EMAILS]:
@ -131,10 +203,40 @@ async def register(
user.email_verification_token = verification_token
user.email_verification_expires = datetime.utcnow() + timedelta(hours=24)
await db.commit()
# Telemetry: registration + referral attribution
try:
await track_event(
db,
event_name="user_registered",
request=request,
user_id=user.id,
is_authenticated=False,
source="public",
metadata={"ref": bool(user_data.ref)},
)
if referral_applied:
await track_event(
db,
event_name="referral_attributed",
request=request,
user_id=user.id,
is_authenticated=False,
source="public",
metadata={
"referral_type": referral_type,
"referrer_user_id": referrer_user_id,
"ref": user_data.ref,
},
)
await db.commit()
except Exception:
# never block registration
pass
# Send verification email in background
- if email_service.is_configured:
- site_url = os.getenv("SITE_URL", "http://localhost:3000")
+ if email_service.is_configured():
+ site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
verify_url = f"{site_url}/verify-email?token={verification_token}"
background_tasks.add_task(
@ -147,8 +249,104 @@ async def register(
return user
@router.get("/referral", response_model=ReferralLinkResponse)
async def get_referral_link(
request: Request,
current_user: CurrentUser,
db: Database,
days: int = 30,
):
"""Return the authenticated user's invite link."""
if not current_user.invite_code:
# Generate on demand for older users
for _ in range(12):
code = secrets.token_hex(6)
exists = await db.execute(select(User.id).where(User.invite_code == code))
if exists.scalar_one_or_none() is None:
current_user.invite_code = code
await db.commit()
break
if not current_user.invite_code:
raise HTTPException(status_code=500, detail="Failed to generate invite code")
# Apply rewards (idempotent) so UI reflects current state even without scheduler
snapshot = await apply_referral_rewards_for_user(db, current_user.id)
await db.commit()
base = (settings.site_url or "http://localhost:3000").rstrip("/")
url = f"{base}/register?ref={current_user.invite_code}"
try:
await track_event(
db,
event_name="referral_link_viewed",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
metadata={"invite_code": current_user.invite_code},
)
await db.commit()
except Exception:
pass
# Count link views in the chosen window
try:
from datetime import timedelta
from sqlalchemy import and_, func
from app.models.telemetry import TelemetryEvent
window_days = max(1, min(int(days), 365))
end = datetime.utcnow()
start = end - timedelta(days=window_days)
views = (
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.user_id == current_user.id,
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
)
)
)
).scalar()
referral_link_views_window = int(views or 0)
except Exception:
window_days = 30
referral_link_views_window = 0
qualified = int(snapshot.qualified_referrals_total)
if qualified < QUALIFIED_REFERRAL_BATCH_SIZE:
next_reward_at = QUALIFIED_REFERRAL_BATCH_SIZE
else:
remainder = qualified % QUALIFIED_REFERRAL_BATCH_SIZE
next_reward_at = qualified + (QUALIFIED_REFERRAL_BATCH_SIZE - remainder) if remainder else qualified + QUALIFIED_REFERRAL_BATCH_SIZE
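# Worked example, if QUALIFIED_REFERRAL_BATCH_SIZE were 3: qualified=2 -> next at 3; qualified=4 -> next at 6; qualified=6 -> next at 9.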
return ReferralLinkResponse(
invite_code=current_user.invite_code,
url=url,
stats=ReferralStats(
window_days=int(window_days),
referred_users_total=int(snapshot.referred_users_total),
qualified_referrals_total=qualified,
referral_link_views_window=int(referral_link_views_window),
bonus_domains=int(snapshot.bonus_domains),
next_reward_at=int(next_reward_at),
badge=compute_badge(qualified),
cooldown_days=int(getattr(snapshot, "cooldown_days", 7) or 7),
disqualified_cooldown_total=int(getattr(snapshot, "disqualified_cooldown_total", 0) or 0),
disqualified_missing_ip_total=int(getattr(snapshot, "disqualified_missing_ip_total", 0) or 0),
disqualified_shared_ip_total=int(getattr(snapshot, "disqualified_shared_ip_total", 0) or 0),
disqualified_duplicate_ip_total=int(getattr(snapshot, "disqualified_duplicate_ip_total", 0) or 0),
),
)
@router.post("/login", response_model=LoginResponse)
- async def login(user_data: UserLogin, db: Database, response: Response):
+ @limiter.limit("10/minute")
+ async def login(request: Request, user_data: UserLogin, db: Database, response: Response):
"""
Authenticate user and return JWT token.
@ -253,8 +451,10 @@ async def update_current_user(
@router.post("/forgot-password", response_model=MessageResponse)
@limiter.limit("3/minute")
async def forgot_password(
- request: ForgotPasswordRequest,
+ request: Request,
+ payload: ForgotPasswordRequest,
db: Database,
background_tasks: BackgroundTasks,
):
@ -269,9 +469,7 @@ async def forgot_password(
success_message = "If an account with this email exists, a password reset link has been sent."
# Look up user
- result = await db.execute(
- select(User).where(User.email == request.email.lower())
- )
+ result = await db.execute(select(User).where(User.email == payload.email.lower()))
user = result.scalar_one_or_none()
if not user:
@ -285,8 +483,8 @@ async def forgot_password(
await db.commit()
# Send reset email in background
- if email_service.is_configured:
- site_url = os.getenv("SITE_URL", "http://localhost:3000")
+ if email_service.is_configured():
+ site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
reset_url = f"{site_url}/reset-password?token={reset_token}"
background_tasks.add_task(
@ -384,8 +582,10 @@ async def verify_email(
@router.post("/resend-verification", response_model=MessageResponse)
@limiter.limit("3/minute")
async def resend_verification(
- request: ForgotPasswordRequest, # Reuse schema - just needs email
+ request: Request,
+ payload: ForgotPasswordRequest, # Reuse schema - just needs email
db: Database,
background_tasks: BackgroundTasks,
):
@ -399,7 +599,7 @@ async def resend_verification(
# Look up user
result = await db.execute(
- select(User).where(User.email == request.email.lower())
+ select(User).where(User.email == payload.email.lower())
)
user = result.scalar_one_or_none()
@ -413,8 +613,8 @@ async def resend_verification(
await db.commit()
# Send verification email
- if email_service.is_configured:
- site_url = os.getenv("SITE_URL", "http://localhost:3000")
+ if email_service.is_configured():
+ site_url = (settings.site_url or "http://localhost:3000").rstrip("/")
verify_url = f"{site_url}/verify-email?token={verification_token}"
background_tasks.add_task(

View File

@ -200,6 +200,36 @@ async def get_blog_post(
return data
@router.get("/posts/{slug}/meta")
async def get_blog_post_meta(
slug: str,
db: Database,
):
"""
Get blog post metadata by slug (public).
IMPORTANT: This endpoint does NOT increment view_count.
It's intended for SEO metadata generation (generateMetadata, JSON-LD).
"""
result = await db.execute(
select(BlogPost)
.options(selectinload(BlogPost.author))
.where(
BlogPost.slug == slug,
BlogPost.is_published == True,
)
)
post = result.scalar_one_or_none()
if not post:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Blog post not found",
)
return post.to_dict(include_content=False)
# ============== Admin Endpoints ==============
@router.get("/admin/posts")

backend/app/api/cfo.py Normal file
View File

@ -0,0 +1,197 @@
"""CFO (Management) endpoints."""
from __future__ import annotations
from datetime import datetime, timedelta, timezone
from fastapi import APIRouter, Depends, HTTPException, Request, status
from slowapi import Limiter
from slowapi.util import get_remote_address
from sqlalchemy import and_, case, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user
from app.database import get_db
from app.models.portfolio import PortfolioDomain
from app.models.user import User
from app.models.yield_domain import YieldDomain, YieldTransaction
from app.schemas.cfo import (
CfoKillListRow,
CfoMonthlyBucket,
CfoSummaryResponse,
CfoUpcomingCostRow,
SetToDropResponse,
)
from app.services.analyze.renewal_cost import get_tld_price_snapshot
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
def _month_key(dt: datetime) -> str:
return f"{dt.year:04d}-{dt.month:02d}"
async def _estimate_renewal_cost_usd(db: AsyncSession, domain: str) -> tuple[float | None, str]:
# If the user stored renewal_cost, we treat it as the source of truth.
# Else we estimate using our own collected `tld_prices` DB.
tld = domain.split(".")[-1].lower()
snap = await get_tld_price_snapshot(db, tld)
if snap.min_renew_usd is None:
return None, "unknown"
return float(snap.min_renew_usd), "tld_prices"
@router.get("/summary", response_model=CfoSummaryResponse)
@limiter.limit("30/minute")
async def cfo_summary(
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
CFO dashboard summary:
- Burn rate timeline (renewal costs)
- Upcoming costs (30d)
- Kill list (renewal soon + no yield signals)
"""
now = _utcnow()
now_naive = now.replace(tzinfo=None)
domains = (
await db.execute(select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id))
).scalars().all()
# Yield stats (last 60d) by domain
since_60d = now_naive - timedelta(days=60)
yd_rows = (
await db.execute(
select(
YieldDomain.domain,
func.coalesce(func.sum(YieldTransaction.net_amount), 0).label("net_sum"),
func.coalesce(func.sum(case((YieldTransaction.event_type == "click", 1), else_=0)), 0).label("clicks"),
)
.join(
YieldTransaction,
and_(YieldTransaction.yield_domain_id == YieldDomain.id, YieldTransaction.created_at >= since_60d),
isouter=True,
)
.where(YieldDomain.user_id == current_user.id)
.group_by(YieldDomain.domain)
)
).all()
yield_by_domain = {str(d).lower(): {"net": float(n or 0), "clicks": int(c or 0)} for d, n, c in yd_rows}
# Monthly buckets next 12 months
buckets: dict[str, CfoMonthlyBucket] = {}
for i in range(0, 12):
d = (now + timedelta(days=30 * i)).replace(day=1)
buckets[_month_key(d)] = CfoMonthlyBucket(month=_month_key(d), total_cost_usd=0.0, domains=0)
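# NOTE: stepping 30 days and snapping to day 1 approximates calendar months; a long month can repeat a key (harmlessly overwritten) while another gets skipped.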
upcoming_rows: list[CfoUpcomingCostRow] = []
kill_list: list[CfoKillListRow] = []
cutoff_30d = now_naive + timedelta(days=30)
for pd in domains:
if pd.is_sold:
continue
renewal_dt = pd.renewal_date
if not renewal_dt:
continue
if renewal_dt.tzinfo is not None:
renewal_dt_naive = renewal_dt.astimezone(timezone.utc).replace(tzinfo=None)
else:
renewal_dt_naive = renewal_dt
# cost source: portfolio overrides
if pd.renewal_cost is not None:
cost = float(pd.renewal_cost)
source = "portfolio"
else:
cost, source = await _estimate_renewal_cost_usd(db, pd.domain)
# Monthly burn timeline
month = _month_key(renewal_dt_naive)
if month not in buckets:
buckets[month] = CfoMonthlyBucket(month=month, total_cost_usd=0.0, domains=0)
if cost is not None:
buckets[month].total_cost_usd = float(buckets[month].total_cost_usd) + float(cost)
buckets[month].domains = int(buckets[month].domains) + 1
# Upcoming 30d
if now_naive <= renewal_dt_naive <= cutoff_30d:
upcoming_rows.append(
CfoUpcomingCostRow(
domain_id=pd.id,
domain=pd.domain,
renewal_date=renewal_dt,
renewal_cost_usd=cost,
cost_source=source,
is_sold=bool(pd.is_sold),
)
)
y = yield_by_domain.get(pd.domain.lower(), {"net": 0.0, "clicks": 0})
if float(y["net"]) <= 0.0 and int(y["clicks"]) <= 0:
kill_list.append(
CfoKillListRow(
domain_id=pd.id,
domain=pd.domain,
renewal_date=renewal_dt,
renewal_cost_usd=cost,
cost_source=source,
auto_renew=bool(pd.auto_renew),
is_dns_verified=bool(getattr(pd, "is_dns_verified", False) or False),
yield_net_60d=float(y["net"]),
yield_clicks_60d=int(y["clicks"]),
reason="No yield signals tracked in the last 60 days and renewal is due within 30 days.",
)
)
# Sort rows
upcoming_rows.sort(key=lambda r: (r.renewal_date or now_naive))
kill_list.sort(key=lambda r: (r.renewal_date or now_naive))
upcoming_total = sum((r.renewal_cost_usd or 0) for r in upcoming_rows)
monthly_sorted = [buckets[k] for k in sorted(buckets.keys())]
return CfoSummaryResponse(
computed_at=now,
upcoming_30d_total_usd=float(round(upcoming_total, 2)),
upcoming_30d_rows=upcoming_rows,
monthly=monthly_sorted,
kill_list=kill_list[:50],
)
@router.post("/domains/{domain_id}/set-to-drop", response_model=SetToDropResponse)
@limiter.limit("30/minute")
async def set_to_drop(
request: Request,
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Mark portfolio domain as 'to drop' by turning off local auto-renew flag.
(We cannot disable auto-renew at the registrar automatically.)
"""
pd = (
await db.execute(
select(PortfolioDomain).where(and_(PortfolioDomain.id == domain_id, PortfolioDomain.user_id == current_user.id))
)
).scalar_one_or_none()
if not pd:
raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail="Portfolio domain not found")
pd.auto_renew = False
pd.updated_at = datetime.utcnow()
await db.commit()
return SetToDropResponse(domain_id=pd.id, auto_renew=bool(pd.auto_renew), updated_at=pd.updated_at.replace(tzinfo=timezone.utc))

View File

@ -16,10 +16,12 @@ from datetime import datetime
from typing import Optional
from fastapi import APIRouter, HTTPException, status, BackgroundTasks, Request
from fastapi.responses import HTMLResponse
from pydantic import BaseModel, EmailStr, Field
from sqlalchemy import select, delete
from slowapi import Limiter
from slowapi.util import get_remote_address
from urllib.parse import urlencode
from app.api.deps import Database
from app.services.email_service import email_service
@ -32,6 +34,11 @@ router = APIRouter()
# Rate limiter for contact endpoints
limiter = Limiter(key_func=get_remote_address)
def _build_unsubscribe_url(email: str, token: str) -> str:
base = os.getenv("SITE_URL", "https://pounce.ch").rstrip("/")
query = urlencode({"email": email, "token": token})
return f"{base}/api/v1/contact/newsletter/unsubscribe?{query}"
# ============== Schemas ==============
@ -139,6 +146,7 @@ async def subscribe_newsletter(
background_tasks.add_task(
email_service.send_newsletter_welcome,
to_email=email_lower,
unsubscribe_url=_build_unsubscribe_url(email_lower, existing.unsubscribe_token),
)
return MessageResponse(
@ -160,6 +168,7 @@ async def subscribe_newsletter(
background_tasks.add_task(
email_service.send_newsletter_welcome,
to_email=email_lower,
unsubscribe_url=_build_unsubscribe_url(email_lower, subscriber.unsubscribe_token),
)
logger.info(f"Newsletter subscription: {email_lower}")
@ -216,6 +225,50 @@ async def unsubscribe_newsletter(
)
@router.get("/newsletter/unsubscribe")
async def unsubscribe_newsletter_one_click(
email: EmailStr,
token: str,
db: Database,
):
"""
One-click unsubscribe endpoint (for List-Unsubscribe header).
Always returns 200 with a human-readable HTML response.
"""
email_lower = email.lower()
result = await db.execute(
select(NewsletterSubscriber).where(
NewsletterSubscriber.email == email_lower,
NewsletterSubscriber.unsubscribe_token == token,
)
)
subscriber = result.scalar_one_or_none()
if subscriber and subscriber.is_active:
subscriber.is_active = False
subscriber.unsubscribed_at = datetime.utcnow()
await db.commit()
return HTMLResponse(
content="""
<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8" />
<meta name="viewport" content="width=device-width, initial-scale=1" />
<title>Unsubscribed</title>
</head>
<body style="font-family: system-ui, -apple-system, Segoe UI, Roboto, Helvetica, Arial, sans-serif; padding: 32px;">
<h1 style="margin: 0 0 12px 0;">You are unsubscribed.</h1>
<p style="margin: 0; color: #555;">
If you were subscribed, you will no longer receive pounce insights emails.
</p>
</body>
</html>
""".strip(),
status_code=200,
)
@router.get("/newsletter/status")
async def check_newsletter_status(
email: EmailStr,

View File

@ -28,7 +28,7 @@ async def get_current_user(
token: Optional[str] = None
if credentials is not None:
token = credentials.credentials
if not token:
token = request.cookies.get(AUTH_COOKIE_NAME)

View File

@ -127,7 +127,7 @@ async def add_domain(
await db.refresh(current_user, ["subscription", "domains"])
if current_user.subscription:
- limit = current_user.subscription.max_domains
+ limit = current_user.subscription.domain_limit
else:
limit = TIER_CONFIG[SubscriptionTier.SCOUT]["domain_limit"]

backend/app/api/drops.py Normal file
View File

@ -0,0 +1,177 @@
"""
Drops API - Zone File Analysis Endpoints
=========================================
API endpoints for accessing freshly dropped domains from:
- Switch.ch zone files (.ch, .li)
- ICANN CZDS zone files (.com, .net, .org, .xyz, .info, .dev, .app, .online)
"""
from datetime import datetime
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, Query, BackgroundTasks
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.api.deps import get_current_user
from app.services.zone_file import (
ZoneFileService,
get_dropped_domains,
get_zone_stats,
)
router = APIRouter(prefix="/drops", tags=["drops"])
# All supported TLDs
SWITCH_TLDS = ["ch", "li"]
CZDS_TLDS = ["xyz", "org", "online", "info", "dev", "app"] # Approved
CZDS_PENDING = ["com", "net", "club", "biz"] # Pending approval
ALL_TLDS = SWITCH_TLDS + CZDS_TLDS
# ============================================================================
# PUBLIC ENDPOINTS (for stats)
# ============================================================================
@router.get("/stats")
async def api_get_zone_stats(
db: AsyncSession = Depends(get_db)
):
"""
Get zone file statistics.
Returns domain counts and last sync times for .ch and .li.
"""
try:
stats = await get_zone_stats(db)
return stats
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
# ============================================================================
# AUTHENTICATED ENDPOINTS
# ============================================================================
@router.get("")
async def api_get_drops(
tld: Optional[str] = Query(None, description="Filter by TLD"),
hours: int = Query(24, ge=1, le=48, description="Hours to look back (max 48h, we only store 48h)"),
min_length: Optional[int] = Query(None, ge=1, le=63, description="Minimum domain length"),
max_length: Optional[int] = Query(None, ge=1, le=63, description="Maximum domain length"),
exclude_numeric: bool = Query(False, description="Exclude numeric-only domains"),
exclude_hyphen: bool = Query(False, description="Exclude domains with hyphens"),
keyword: Optional[str] = Query(None, description="Search keyword"),
limit: int = Query(50, ge=1, le=200, description="Results per page"),
offset: int = Query(0, ge=0, description="Offset for pagination"),
db: AsyncSession = Depends(get_db),
current_user = Depends(get_current_user)
):
"""
Get recently dropped domains from zone files.
Supports:
- Switch.ch zones: .ch, .li
- ICANN CZDS zones: .xyz, .org, .online, .info, .dev, .app
Domains are detected by comparing daily zone file snapshots.
Only available for authenticated users.
"""
if tld and tld not in ALL_TLDS:
raise HTTPException(
status_code=400,
detail=f"Unsupported TLD. Supported: {', '.join(ALL_TLDS)}"
)
try:
result = await get_dropped_domains(
db=db,
tld=tld,
hours=hours,
min_length=min_length,
max_length=max_length,
exclude_numeric=exclude_numeric,
exclude_hyphen=exclude_hyphen,
keyword=keyword,
limit=limit,
offset=offset
)
return result
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
@router.post("/sync/{tld}")
async def api_trigger_sync(
tld: str,
background_tasks: BackgroundTasks,
db: AsyncSession = Depends(get_db),
current_user = Depends(get_current_user)
):
"""
Trigger a manual zone file sync for a specific TLD.
Only available for admin users.
This is normally run automatically by the scheduler.
"""
# Check if user is admin
if not getattr(current_user, 'is_admin', False):
raise HTTPException(status_code=403, detail="Admin access required")
if tld not in ALL_TLDS:
raise HTTPException(
status_code=400,
detail=f"Unsupported TLD. Supported: {', '.join(ALL_TLDS)}"
)
async def run_sync():
from app.database import AsyncSessionLocal
async with AsyncSessionLocal() as session:
try:
if tld in SWITCH_TLDS:
# Use Switch.ch zone transfer
service = ZoneFileService()
await service.run_daily_sync(session, tld)
else:
# Use ICANN CZDS
from app.services.czds_client import CZDSClient
client = CZDSClient()
await client.sync_zone(session, tld)
except Exception as e:
print(f"Zone sync failed for .{tld}: {e}")
background_tasks.add_task(run_sync)
return {"status": "sync_started", "tld": tld}
# ============================================================================
# HELPER ENDPOINTS
# ============================================================================
@router.get("/tlds")
async def api_get_supported_tlds():
"""
Get list of supported TLDs for zone file analysis.
"""
return {
"tlds": [
# Switch.ch zones
{"tld": "ch", "name": "Switzerland", "flag": "🇨🇭", "registry": "Switch", "source": "switch"},
{"tld": "li", "name": "Liechtenstein", "flag": "🇱🇮", "registry": "Switch", "source": "switch"},
# ICANN CZDS zones (approved)
{"tld": "xyz", "name": "XYZ", "flag": "🌐", "registry": "XYZ.COM LLC", "source": "czds"},
{"tld": "org", "name": "Organization", "flag": "🏛️", "registry": "PIR", "source": "czds"},
{"tld": "online", "name": "Online", "flag": "💻", "registry": "Radix", "source": "czds"},
{"tld": "info", "name": "Information", "flag": "", "registry": "Afilias", "source": "czds"},
{"tld": "dev", "name": "Developer", "flag": "👨‍💻", "registry": "Google", "source": "czds"},
{"tld": "app", "name": "Application", "flag": "📱", "registry": "Google", "source": "czds"},
],
"pending": [
# CZDS pending approval
{"tld": "com", "name": "Commercial", "flag": "🏢", "registry": "Verisign", "source": "czds"},
{"tld": "net", "name": "Network", "flag": "🌐", "registry": "Verisign", "source": "czds"},
{"tld": "club", "name": "Club", "flag": "🎉", "registry": "GoDaddy", "source": "czds"},
{"tld": "biz", "name": "Business", "flag": "💼", "registry": "GoDaddy", "source": "czds"},
]
}

backend/app/api/hunt.py Normal file
View File

@ -0,0 +1,247 @@
"""HUNT (Discovery) endpoints."""
from __future__ import annotations
from datetime import datetime, timezone
from fastapi import APIRouter, Depends, HTTPException, Query, Request
from slowapi import Limiter
from slowapi.util import get_remote_address
from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user
from app.database import get_db
from app.models.auction import DomainAuction
from app.models.user import User
from app.schemas.hunt import (
BrandableRequest,
BrandableCandidate,
BrandableResponse,
HuntSniperItem,
HuntSniperResponse,
KeywordAvailabilityRequest,
KeywordAvailabilityResponse,
KeywordAvailabilityRow,
TrendsResponse,
TrendItem,
TypoCheckRequest,
TypoCheckResponse,
TypoCandidate,
)
from app.services.domain_checker import domain_checker
from app.services.hunt.brandables import check_domains, generate_cvcvc, generate_cvccv, generate_human
from app.services.hunt.trends import fetch_google_trends_daily_rss
from app.services.hunt.typos import generate_typos
router = APIRouter()
limiter = Limiter(key_func=get_remote_address)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
@router.get("/bargain-bin", response_model=HuntSniperResponse)
@limiter.limit("60/minute")
async def bargain_bin(
request: Request,
_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
limit: int = Query(100, ge=1, le=500),
):
"""
Closeout Sniper (Chris logic):
price < $10 AND age_years >= 5 AND backlinks > 0
Uses ONLY real scraped auction data (DomainAuction.age_years/backlinks).
Items without required fields are excluded.
"""
now = _utcnow().replace(tzinfo=None)
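# DB datetimes are assumed naive UTC here, so comparisons use a naive "now"; tzinfo is re-attached as UTC when building response items below.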
base = and_(DomainAuction.is_active == True, DomainAuction.end_time > now) # noqa: E712
rows = (
await db.execute(
select(DomainAuction)
.where(base)
.where(DomainAuction.current_bid < 10)
.order_by(DomainAuction.end_time.asc())
.limit(limit * 3) # allow filtering
)
).scalars().all()
filtered_out = 0
items: list[HuntSniperItem] = []
for a in rows:
if a.age_years is None or a.backlinks is None:
filtered_out += 1
continue
if int(a.age_years) < 5 or int(a.backlinks) <= 0:
continue
items.append(
HuntSniperItem(
domain=a.domain,
platform=a.platform,
auction_url=a.auction_url,
current_bid=float(a.current_bid),
currency=a.currency,
end_time=a.end_time.replace(tzinfo=timezone.utc) if a.end_time and a.end_time.tzinfo is None else a.end_time,
age_years=int(a.age_years) if a.age_years is not None else None,
backlinks=int(a.backlinks) if a.backlinks is not None else None,
pounce_score=int(a.pounce_score) if a.pounce_score is not None else None,
)
)
if len(items) >= limit:
break
last_updated = (
await db.execute(select(func.max(DomainAuction.updated_at)).where(DomainAuction.is_active == True)) # noqa: E712
).scalar()
return HuntSniperResponse(
items=items,
total=len(items),
filtered_out_missing_data=int(filtered_out),
last_updated=last_updated.replace(tzinfo=timezone.utc) if last_updated and last_updated.tzinfo is None else last_updated,
)
@router.get("/trends", response_model=TrendsResponse)
@limiter.limit("30/minute")
async def trends(
request: Request,
_user: User = Depends(get_current_user),
geo: str = Query("US", min_length=2, max_length=2),
):
try:
items_raw = await fetch_google_trends_daily_rss(geo=geo)
except Exception:
# Don't 500 the whole UI when the public feed is temporarily unavailable.
raise HTTPException(status_code=502, detail="Google Trends feed unavailable")
items = [
TrendItem(
title=i["title"],
approx_traffic=i.get("approx_traffic"),
published_at=i.get("published_at"),
link=i.get("link"),
)
for i in items_raw[:50]
]
return TrendsResponse(geo=geo.upper(), items=items, fetched_at=_utcnow())
@router.post("/keywords", response_model=KeywordAvailabilityResponse)
@limiter.limit("30/minute")
async def keyword_availability(
request: Request,
payload: KeywordAvailabilityRequest,
_user: User = Depends(get_current_user),
):
# Normalize + cap work for UX/perf
keywords = []
for kw in payload.keywords[:25]:
k = kw.strip().lower().replace(" ", "")
if k:
keywords.append(kw)
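# This first pass only drops keywords that normalize to empty; the original string is kept so responses echo the user's input (normalization runs again below).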
tlds = [t.lower().lstrip(".") for t in payload.tlds[:20] if t.strip()]
if not tlds:
tlds = ["com"]
# Build candidate domains
candidates: list[tuple[str, str, str]] = []
domain_list: list[str] = []
for kw in keywords:
k = kw.strip().lower().replace(" ", "")
if not k:
continue
for t in tlds:
d = f"{k}.{t}"
candidates.append((kw, t, d))
domain_list.append(d)
checked = await check_domains(domain_list, concurrency=40)
by_domain = {c.domain: c for c in checked}
rows: list[KeywordAvailabilityRow] = []
for kw, t, d in candidates:
c = by_domain.get(d)
if not c:
rows.append(KeywordAvailabilityRow(keyword=kw, domain=d, tld=t, is_available=None, status="unknown"))
else:
rows.append(KeywordAvailabilityRow(keyword=kw, domain=d, tld=t, is_available=c.is_available, status=c.status))
return KeywordAvailabilityResponse(items=rows)
@router.post("/typos", response_model=TypoCheckResponse)
@limiter.limit("20/minute")
async def typo_check(
request: Request,
payload: TypoCheckRequest,
_user: User = Depends(get_current_user),
):
brand = payload.brand.strip()
typos = generate_typos(brand, limit=min(int(payload.limit) * 4, 400))
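# Oversample typos (4x the requested limit, capped at 400) so enough candidates survive the availability filter.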
# Build domain list (dedup)
tlds = [t.lower().lstrip(".") for t in payload.tlds if t.strip()]
candidates: list[str] = []
seen = set()
for typo in typos:
for t in tlds:
d = f"{typo}.{t}"
if d not in seen:
candidates.append(d)
seen.add(d)
if len(candidates) >= payload.limit * 4:
break
if len(candidates) >= payload.limit * 4:
break
checked = await check_domains(candidates, concurrency=30)
available = [c for c in checked if c.status == "available"]
items = [TypoCandidate(domain=c.domain, is_available=c.is_available, status=c.status) for c in available[: payload.limit]]
return TypoCheckResponse(brand=brand, items=items)
@router.post("/brandables", response_model=BrandableResponse)
@limiter.limit("15/minute")
async def brandables(
request: Request,
payload: BrandableRequest,
_user: User = Depends(get_current_user),
):
pattern = payload.pattern.strip().lower()
if pattern not in ("cvcvc", "cvccv", "human"):
pattern = "cvcvc"
tlds = [t.lower().lstrip(".") for t in payload.tlds if t.strip()]
if not tlds:
tlds = ["com"]
# Generate + check up to max_checks; return only available
candidates: list[str] = []
for _ in range(int(payload.max_checks)):
if pattern == "cvcvc":
sld = generate_cvcvc()
elif pattern == "cvccv":
sld = generate_cvccv()
else:
sld = generate_human()
for t in tlds:
candidates.append(f"{sld}.{t}")
checked = await check_domains(candidates, concurrency=40)
available = [c for c in checked if c.status == "available"]
# De-dup by domain
seen = set()
out = []
for c in available:
if c.domain not in seen:
seen.add(c.domain)
out.append(BrandableCandidate(domain=c.domain, is_available=c.is_available, status=c.status))
if len(out) >= payload.limit:
break
return BrandableResponse(pattern=payload.pattern, items=out)

View File

@ -27,12 +27,22 @@ from fastapi import APIRouter, Depends, Query, HTTPException, Request
from pydantic import BaseModel, Field, EmailStr
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload
from app.database import get_db
from app.api.deps import get_current_user, get_current_user_optional
from app.models.user import User
- from app.models.listing import DomainListing, ListingInquiry, ListingView, ListingStatus, VerificationStatus
+ from app.models.listing import (
DomainListing,
ListingInquiry,
ListingInquiryEvent,
ListingInquiryMessage,
ListingView,
ListingStatus,
VerificationStatus,
)
from app.services.valuation import valuation_service
from app.services.telemetry import track_event
def _calculate_pounce_score(domain: str, is_pounce: bool = True) -> int:
@ -104,6 +114,9 @@ class ListingUpdate(BaseModel):
show_valuation: Optional[bool] = None
allow_offers: Optional[bool] = None
status: Optional[str] = None
sold_reason: Optional[str] = Field(None, max_length=200)
sold_price: Optional[float] = Field(None, ge=0)
sold_currency: Optional[str] = Field(None, max_length=3)
class ListingResponse(BaseModel):
@ -129,6 +142,10 @@ class ListingResponse(BaseModel):
public_url: str
created_at: datetime
published_at: Optional[datetime]
sold_at: Optional[datetime] = None
sold_reason: Optional[str] = None
sold_price: Optional[float] = None
sold_currency: Optional[str] = None
# Seller info (minimal for privacy)
seller_verified: bool = False
@ -156,6 +173,7 @@ class ListingPublicResponse(BaseModel):
# Seller trust indicators
seller_verified: bool
seller_member_since: Optional[datetime]
seller_invite_code: Optional[str] = None
class Config:
from_attributes = True
@ -183,11 +201,37 @@ class InquiryResponse(BaseModel):
status: str
created_at: datetime
read_at: Optional[datetime]
replied_at: Optional[datetime] = None
closed_at: Optional[datetime] = None
closed_reason: Optional[str] = None
buyer_user_id: Optional[int] = None
class Config:
from_attributes = True
class InquiryUpdate(BaseModel):
"""Update inquiry status for listing owner."""
status: str = Field(..., min_length=1, max_length=20) # new, read, replied, spam
reason: Optional[str] = Field(None, max_length=200)
class InquiryMessageCreate(BaseModel):
body: str = Field(..., min_length=1, max_length=4000)
class InquiryMessageResponse(BaseModel):
id: int
inquiry_id: int
listing_id: int
sender_user_id: int
body: str
created_at: datetime
class Config:
from_attributes = True
class VerificationResponse(BaseModel):
"""DNS verification response."""
verification_code: str
@ -235,13 +279,14 @@ async def browse_listings(
min_price: Optional[float] = Query(None, ge=0),
max_price: Optional[float] = Query(None, ge=0),
verified_only: bool = Query(False),
clean_only: bool = Query(True, description="Hide low-quality/spam listings"),
sort_by: str = Query("newest", enum=["newest", "price_asc", "price_desc", "popular"]),
limit: int = Query(20, le=50),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
):
"""Browse active domain listings (public)."""
- query = select(DomainListing).where(
+ query = select(DomainListing).options(selectinload(DomainListing.user)).where(
DomainListing.status == ListingStatus.ACTIVE.value
)
@ -282,6 +327,11 @@ async def browse_listings(
pounce_score = _calculate_pounce_score(listing.domain)
# Save it for future requests
listing.pounce_score = pounce_score
# Public cleanliness rule: don't surface low-quality inventory by default.
# (Still accessible in Terminal for authenticated power users.)
if clean_only and (pounce_score or 0) < 50:
continue
responses.append(ListingPublicResponse(
domain=listing.domain,
@ -298,6 +348,7 @@ async def browse_listings(
public_url=listing.public_url,
seller_verified=listing.is_verified,
seller_member_since=listing.user.created_at if listing.user else None,
seller_invite_code=getattr(listing.user, "invite_code", None) if listing.user else None,
))
await db.commit() # Save any updated pounce_scores
@ -342,6 +393,10 @@ async def get_my_listings(
public_url=listing.public_url,
created_at=listing.created_at,
published_at=listing.published_at,
sold_at=getattr(listing, "sold_at", None),
sold_reason=getattr(listing, "sold_reason", None),
sold_price=getattr(listing, "sold_price", None),
sold_currency=getattr(listing, "sold_currency", None),
seller_verified=current_user.is_verified,
seller_member_since=current_user.created_at,
)
@ -360,7 +415,9 @@ async def get_listing_by_slug(
):
"""Get listing details by slug (public)."""
result = await db.execute(
select(DomainListing).where(
select(DomainListing)
.options(selectinload(DomainListing.user))
.where(
and_(
DomainListing.slug == slug,
DomainListing.status == ListingStatus.ACTIVE.value,
@ -384,6 +441,18 @@ async def get_listing_by_slug(
# Increment view count
listing.view_count += 1
await track_event(
db,
event_name="listing_view",
request=request,
user_id=current_user.id if current_user else None,
is_authenticated=bool(current_user),
source="public",
domain=listing.domain,
listing_id=listing.id,
metadata={"slug": listing.slug},
)
# Calculate pounce_score dynamically if not stored (same as Market Feed)
pounce_score = listing.pounce_score
@ -409,6 +478,7 @@ async def get_listing_by_slug(
public_url=listing.public_url,
seller_verified=listing.is_verified,
seller_member_since=listing.user.created_at if listing.user else None,
seller_invite_code=getattr(listing.user, "invite_code", None) if listing.user else None,
)
@ -417,9 +487,10 @@ async def submit_inquiry(
slug: str,
inquiry: InquiryCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Submit an inquiry for a listing (public)."""
"""Submit an inquiry for a listing (requires authentication)."""
# Find listing
result = await db.execute(
select(DomainListing).where(
@ -433,6 +504,14 @@ async def submit_inquiry(
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
# Require that inquiries are sent from the authenticated account email.
# This prevents anonymous spam and makes the buyer identity consistent.
if inquiry.email.lower() != (current_user.email or "").lower():
raise HTTPException(
status_code=400,
detail="Inquiry email must match your account email.",
)
# Security: Check for phishing keywords
if not _check_content_safety(inquiry.message):
@ -441,13 +520,13 @@ async def submit_inquiry(
detail="Message contains blocked content. Please revise."
)
# Rate limiting check (simple: max 3 inquiries per email per listing per day)
# Rate limiting check (simple: max 3 inquiries per user per listing per day)
today_start = datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
existing_count = await db.execute(
select(func.count(ListingInquiry.id)).where(
and_(
ListingInquiry.listing_id == listing.id,
ListingInquiry.email == inquiry.email.lower(),
ListingInquiry.buyer_user_id == current_user.id,
ListingInquiry.created_at >= today_start,
)
)
@ -461,6 +540,7 @@ async def submit_inquiry(
# Create inquiry
new_inquiry = ListingInquiry(
listing_id=listing.id,
buyer_user_id=current_user.id,
name=inquiry.name,
email=inquiry.email.lower(),
phone=inquiry.phone,
@ -471,6 +551,34 @@ async def submit_inquiry(
user_agent=request.headers.get("user-agent", "")[:500],
)
db.add(new_inquiry)
await db.flush()
await track_event(
db,
event_name="inquiry_created",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="public",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=new_inquiry.id,
metadata={
"offer_amount": inquiry.offer_amount,
"has_phone": bool(inquiry.phone),
"has_company": bool(inquiry.company),
},
)
# Seed thread with the initial message
db.add(
ListingInquiryMessage(
inquiry_id=new_inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=inquiry.message,
)
)
# Increment inquiry count
listing.inquiry_count += 1
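A hedged usage sketch of the now-authenticated inquiry flow (route shape, auth scheme, and token are assumptions): the email field must match the logged-in account or the API returns 400, and a fourth inquiry on the same listing within a day hits the rate limit.

import httpx

headers = {"Authorization": "Bearer <access-token>"}  # hypothetical bearer auth
payload = {
    "name": "Jane Buyer",
    "email": "jane@example.com",  # must equal the account email (case-insensitive)
    "message": "Interested in this domain. Is the price negotiable?",
    "offer_amount": 500,
}
r = httpx.post(
    "https://pounce.ch/api/v1/listings/example-com/inquiries",  # assumed path
    json=payload,
    headers=headers,
    timeout=10,
)
print(r.status_code, r.json())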
@ -516,10 +624,45 @@ async def create_listing(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Create a new domain listing."""
"""
Create a new domain listing.
SECURITY: Domain must be in user's portfolio before listing for sale.
DNS verification happens in the verification step (separate endpoint).
"""
from app.models.portfolio import PortfolioDomain
domain_lower = data.domain.lower()
# SECURITY CHECK: Domain must be in user's portfolio
portfolio_result = await db.execute(
select(PortfolioDomain).where(
PortfolioDomain.domain == domain_lower,
PortfolioDomain.user_id == current_user.id,
)
)
portfolio_domain = portfolio_result.scalar_one_or_none()
if not portfolio_domain:
raise HTTPException(
status_code=403,
detail="Domain must be in your portfolio before listing for sale. Add it to your portfolio first.",
)
# Check if domain is sold
if portfolio_domain.is_sold:
raise HTTPException(
status_code=400,
detail="Cannot list a sold domain for sale.",
)
# Check if domain is DNS verified in portfolio
# If verified in portfolio, listing inherits verification
is_portfolio_verified = getattr(portfolio_domain, 'is_dns_verified', False) or False
# Check if domain is already listed
existing = await db.execute(
select(DomainListing).where(DomainListing.domain == data.domain.lower())
select(DomainListing).where(DomainListing.domain == domain_lower)
)
if existing.scalar_one_or_none():
raise HTTPException(status_code=400, detail="This domain is already listed")
@ -533,7 +676,13 @@ async def create_listing(
listing_count = user_listings.scalar() or 0
# Listing limits by tier (from pounce_pricing.md)
tier = current_user.subscription.tier if current_user.subscription else "scout"
# Load subscription separately to avoid async lazy loading issues
from app.models.subscription import Subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == current_user.id)
)
subscription = sub_result.scalar_one_or_none()
tier = subscription.tier if subscription else "scout"
limits = {"scout": 0, "trader": 5, "tycoon": 50}
max_listings = limits.get(tier, 0)
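The tier gate above in isolation, as a small sketch (tier names and limits taken from the diff):

# Listing limits per subscription tier; unknown tiers fall back to 0.
limits = {"scout": 0, "trader": 5, "tycoon": 50}

def can_create_listing(tier: str, current_count: int) -> bool:
    return current_count < limits.get(tier, 0)

print(can_create_listing("trader", 4))  # True
print(can_create_listing("scout", 0))   # False: scout cannot list at all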
@ -544,7 +693,7 @@ async def create_listing(
)
# Generate slug
slug = _generate_slug(data.domain)
slug = _generate_slug(domain_lower)
# Check slug uniqueness
slug_check = await db.execute(
@ -555,7 +704,7 @@ async def create_listing(
# Get valuation
try:
valuation = await valuation_service.estimate_value(data.domain, db, save_result=False)
valuation = await valuation_service.estimate_value(domain_lower, db, save_result=False)
pounce_score = min(100, int(valuation.get("score", 50)))
estimated_value = valuation.get("value", 0) # Fixed: was 'estimated_value', service returns 'value'
except Exception:
@ -563,9 +712,10 @@ async def create_listing(
estimated_value = None
# Create listing
# If portfolio domain is already DNS verified, listing is auto-verified
listing = DomainListing(
user_id=current_user.id,
domain=data.domain.lower(),
domain=domain_lower,
slug=slug,
title=data.title,
description=data.description,
@ -577,7 +727,9 @@ async def create_listing(
allow_offers=data.allow_offers,
pounce_score=pounce_score,
estimated_value=estimated_value,
verification_code=_generate_verification_code(),
verification_code=portfolio_domain.verification_code if is_portfolio_verified else _generate_verification_code(),
verification_status=VerificationStatus.VERIFIED.value if is_portfolio_verified else VerificationStatus.PENDING.value,
verified_at=portfolio_domain.verified_at if is_portfolio_verified else None,
status=ListingStatus.DRAFT.value,
)
@ -652,15 +804,387 @@ async def get_listing_inquiries(
status=inq.status,
created_at=inq.created_at,
read_at=inq.read_at,
replied_at=getattr(inq, "replied_at", None),
closed_at=getattr(inq, "closed_at", None),
closed_reason=getattr(inq, "closed_reason", None),
buyer_user_id=getattr(inq, "buyer_user_id", None),
)
for inq in inquiries
]
@router.patch("/{id}/inquiries/{inquiry_id}", response_model=InquiryResponse)
async def update_listing_inquiry(
id: int,
inquiry_id: int,
data: InquiryUpdate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Update an inquiry status (listing owner only)."""
allowed = {"new", "read", "replied", "closed", "spam"}
status_clean = (data.status or "").strip().lower()
if status_clean not in allowed:
raise HTTPException(status_code=400, detail="Invalid status")
# Verify listing ownership
listing_result = await db.execute(
select(DomainListing).where(
and_(
DomainListing.id == id,
DomainListing.user_id == current_user.id,
)
)
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(
and_(
ListingInquiry.id == inquiry_id,
ListingInquiry.listing_id == id,
)
)
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
now = datetime.utcnow()
old_status = getattr(inquiry, "status", None)
inquiry.status = status_clean
if status_clean == "read" and inquiry.read_at is None:
inquiry.read_at = now
if status_clean == "replied":
inquiry.replied_at = now
if status_clean == "closed":
inquiry.closed_at = now
inquiry.closed_reason = (data.reason or "").strip() or None
if status_clean == "spam":
inquiry.closed_reason = (data.reason or "").strip() or inquiry.closed_reason
# Audit trail
event = ListingInquiryEvent(
inquiry_id=inquiry.id,
listing_id=listing.id,
actor_user_id=current_user.id,
old_status=old_status,
new_status=status_clean,
reason=(data.reason or "").strip() or None,
ip_address=request.client.host if request.client else None,
user_agent=request.headers.get("user-agent", "")[:500],
)
db.add(event)
await track_event(
db,
event_name="inquiry_status_changed",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=inquiry.id,
metadata={"old_status": old_status, "new_status": status_clean, "reason": (data.reason or "").strip() or None},
)
await db.commit()
await db.refresh(inquiry)
return InquiryResponse(
id=inquiry.id,
name=inquiry.name,
email=inquiry.email,
phone=inquiry.phone,
company=inquiry.company,
message=inquiry.message,
offer_amount=inquiry.offer_amount,
status=inquiry.status,
created_at=inquiry.created_at,
read_at=inquiry.read_at,
replied_at=getattr(inquiry, "replied_at", None),
closed_at=getattr(inquiry, "closed_at", None),
closed_reason=getattr(inquiry, "closed_reason", None),
buyer_user_id=getattr(inquiry, "buyer_user_id", None),
)
@router.get("/{id}/inquiries/{inquiry_id}/messages", response_model=List[InquiryMessageResponse])
async def get_inquiry_messages_for_seller(
id: int,
inquiry_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Seller: fetch thread messages for an inquiry."""
listing_result = await db.execute(
select(DomainListing).where(and_(DomainListing.id == id, DomainListing.user_id == current_user.id))
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(and_(ListingInquiry.id == inquiry_id, ListingInquiry.listing_id == id))
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
msgs = (
await db.execute(
select(ListingInquiryMessage)
.where(and_(ListingInquiryMessage.inquiry_id == inquiry_id, ListingInquiryMessage.listing_id == id))
.order_by(ListingInquiryMessage.created_at.asc())
)
).scalars().all()
return [InquiryMessageResponse.model_validate(m) for m in msgs]
@router.post("/{id}/inquiries/{inquiry_id}/messages", response_model=InquiryMessageResponse)
async def post_inquiry_message_as_seller(
id: int,
inquiry_id: int,
payload: InquiryMessageCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Seller: post a message into an inquiry thread."""
listing_result = await db.execute(
select(DomainListing).where(and_(DomainListing.id == id, DomainListing.user_id == current_user.id))
)
listing = listing_result.scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
inquiry_result = await db.execute(
select(ListingInquiry).where(and_(ListingInquiry.id == inquiry_id, ListingInquiry.listing_id == id))
)
inquiry = inquiry_result.scalar_one_or_none()
if not inquiry:
raise HTTPException(status_code=404, detail="Inquiry not found")
if inquiry.status in ["closed", "spam"]:
raise HTTPException(status_code=400, detail="Inquiry is closed")
# Content safety (phishing keywords)
if not _check_content_safety(payload.body):
raise HTTPException(status_code=400, detail="Message contains blocked content. Please revise.")
# Simple rate limit: max 30 messages per hour per inquiry
hour_start = datetime.utcnow() - timedelta(hours=1)
msg_count = (
await db.execute(
select(func.count(ListingInquiryMessage.id)).where(
and_(
ListingInquiryMessage.inquiry_id == inquiry.id,
ListingInquiryMessage.sender_user_id == current_user.id,
ListingInquiryMessage.created_at >= hour_start,
)
)
)
).scalar() or 0
if msg_count >= 30:
raise HTTPException(status_code=429, detail="Too many messages. Please slow down.")
msg = ListingInquiryMessage(
inquiry_id=inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=payload.body,
)
db.add(msg)
await db.flush()
await track_event(
db,
event_name="message_sent",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
inquiry_id=inquiry.id,
metadata={"role": "seller"},
)
# Email buyer (if configured)
try:
from app.services.email_service import email_service
if inquiry.buyer_user_id:
buyer = (
await db.execute(select(User).where(User.id == inquiry.buyer_user_id))
).scalar_one_or_none()
else:
buyer = None
if buyer and buyer.email and email_service.is_configured():
thread_url = f"https://pounce.ch/terminal/inbox?inquiry={inquiry.id}"
await email_service.send_listing_message(
to_email=buyer.email,
domain=listing.domain,
sender_name=current_user.name or current_user.email,
message=payload.body,
thread_url=thread_url,
)
except Exception as e:
logger.error(f"Failed to send listing message notification: {e}")
await db.commit()
await db.refresh(msg)
return InquiryMessageResponse.model_validate(msg)
@router.get("/inquiries/my")
async def get_my_inquiries_as_buyer(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: list inquiries created from this account."""
result = await db.execute(
select(ListingInquiry, DomainListing)
.join(DomainListing, DomainListing.id == ListingInquiry.listing_id)
.where(ListingInquiry.buyer_user_id == current_user.id)
.order_by(ListingInquiry.created_at.desc())
)
rows = result.all()
return [
{
"id": inq.id,
"listing_id": listing.id,
"domain": listing.domain,
"slug": listing.slug,
"status": inq.status,
"created_at": inq.created_at.isoformat(),
"closed_at": inq.closed_at.isoformat() if getattr(inq, "closed_at", None) else None,
"closed_reason": getattr(inq, "closed_reason", None),
}
for inq, listing in rows
]
@router.get("/inquiries/{inquiry_id}/messages", response_model=List[InquiryMessageResponse])
async def get_inquiry_messages_for_buyer(
inquiry_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: fetch thread messages for one inquiry."""
inquiry = (
await db.execute(select(ListingInquiry).where(ListingInquiry.id == inquiry_id))
).scalar_one_or_none()
if not inquiry or inquiry.buyer_user_id != current_user.id:
raise HTTPException(status_code=404, detail="Inquiry not found")
msgs = (
await db.execute(
select(ListingInquiryMessage)
.where(ListingInquiryMessage.inquiry_id == inquiry_id)
.order_by(ListingInquiryMessage.created_at.asc())
)
).scalars().all()
return [InquiryMessageResponse.model_validate(m) for m in msgs]
@router.post("/inquiries/{inquiry_id}/messages", response_model=InquiryMessageResponse)
async def post_inquiry_message_as_buyer(
inquiry_id: int,
payload: InquiryMessageCreate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Buyer: post a message into an inquiry thread."""
inquiry = (
await db.execute(select(ListingInquiry).where(ListingInquiry.id == inquiry_id))
).scalar_one_or_none()
if not inquiry or inquiry.buyer_user_id != current_user.id:
raise HTTPException(status_code=404, detail="Inquiry not found")
if inquiry.status in ["closed", "spam"]:
raise HTTPException(status_code=400, detail="Inquiry is closed")
# Content safety (phishing keywords)
if not _check_content_safety(payload.body):
raise HTTPException(status_code=400, detail="Message contains blocked content. Please revise.")
# Simple rate limit: max 20 messages per hour per inquiry
hour_start = datetime.utcnow() - timedelta(hours=1)
msg_count = (
await db.execute(
select(func.count(ListingInquiryMessage.id)).where(
and_(
ListingInquiryMessage.inquiry_id == inquiry.id,
ListingInquiryMessage.sender_user_id == current_user.id,
ListingInquiryMessage.created_at >= hour_start,
)
)
)
).scalar() or 0
if msg_count >= 20:
raise HTTPException(status_code=429, detail="Too many messages. Please slow down.")
listing = (
await db.execute(select(DomainListing).where(DomainListing.id == inquiry.listing_id))
).scalar_one_or_none()
if not listing:
raise HTTPException(status_code=404, detail="Listing not found")
msg = ListingInquiryMessage(
inquiry_id=inquiry.id,
listing_id=listing.id,
sender_user_id=current_user.id,
body=payload.body,
)
db.add(msg)
await db.flush()
# Email seller (if configured)
try:
from app.services.email_service import email_service
seller = (
await db.execute(select(User).where(User.id == listing.user_id))
).scalar_one_or_none()
if seller and seller.email and email_service.is_configured():
thread_url = "https://pounce.ch/terminal/listing"
await email_service.send_listing_message(
to_email=seller.email,
domain=listing.domain,
sender_name=current_user.name or current_user.email,
message=payload.body,
thread_url=thread_url,
)
except Exception as e:
logger.error(f"Failed to send listing message notification: {e}")
await db.commit()
await db.refresh(msg)
return InquiryMessageResponse.model_validate(msg)
@router.put("/{id}", response_model=ListingResponse)
async def update_listing(
id: int,
data: ListingUpdate,
request: Request,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
@ -705,8 +1229,58 @@ async def update_listing(
)
listing.status = ListingStatus.ACTIVE.value
listing.published_at = datetime.utcnow()
elif data.status in ["draft", "sold", "expired"]:
elif data.status in ["draft", "expired"]:
listing.status = data.status
elif data.status == "sold":
if listing.status != ListingStatus.ACTIVE.value:
raise HTTPException(status_code=400, detail="Only active listings can be marked as sold.")
listing.status = ListingStatus.SOLD.value
listing.sold_at = datetime.utcnow()
listing.sold_reason = (data.sold_reason or "").strip() or listing.sold_reason
listing.sold_price = data.sold_price if data.sold_price is not None else listing.sold_price
listing.sold_currency = (data.sold_currency or listing.currency or "USD").upper()
# Close all open inquiries on this listing (deal is done).
inqs = (
await db.execute(
select(ListingInquiry).where(ListingInquiry.listing_id == listing.id)
)
).scalars().all()
for inq in inqs:
if inq.status in ["closed", "spam"]:
continue
old = inq.status
inq.status = "closed"
inq.closed_at = datetime.utcnow()
inq.closed_reason = inq.closed_reason or "sold"
db.add(
ListingInquiryEvent(
inquiry_id=inq.id,
listing_id=listing.id,
actor_user_id=current_user.id,
old_status=old,
new_status="closed",
reason="sold",
ip_address=request.client.host if request.client else None,
user_agent=request.headers.get("user-agent", "")[:500],
)
)
await track_event(
db,
event_name="listing_marked_sold",
request=request,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=listing.domain,
listing_id=listing.id,
metadata={
"sold_reason": listing.sold_reason,
"sold_price": float(listing.sold_price) if listing.sold_price is not None else None,
"sold_currency": listing.sold_currency,
},
)
await db.commit()
await db.refresh(listing)
@ -733,6 +1307,10 @@ async def update_listing(
public_url=listing.public_url,
created_at=listing.created_at,
published_at=listing.published_at,
sold_at=getattr(listing, "sold_at", None),
sold_reason=getattr(listing, "sold_reason", None),
sold_price=getattr(listing, "sold_price", None),
sold_currency=getattr(listing, "sold_currency", None),
seller_verified=current_user.is_verified,
seller_member_since=current_user.created_at,
)
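To exercise the new sold transition end to end, a sketch (listing id, base URL, and auth header are assumptions; only ACTIVE listings may be marked sold, and all open inquiries are then closed with reason "sold"):

import httpx

r = httpx.put(
    "https://pounce.ch/api/v1/listings/42",  # assumed mount point and listing id
    json={
        "status": "sold",
        "sold_reason": "direct buyer",
        "sold_price": 1500,
        "sold_currency": "usd",  # normalized to "USD" server-side
    },
    headers={"Authorization": "Bearer <access-token>"},  # hypothetical auth
    timeout=10,
)
r.raise_for_status()
print(r.json()["sold_at"])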

View File

@ -1,10 +1,12 @@
"""Portfolio API routes."""
import secrets
from datetime import datetime
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, status, Query
from pydantic import BaseModel, Field
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
import dns.resolver
from app.database import get_db
from app.api.deps import get_current_user
@ -71,6 +73,11 @@ class PortfolioDomainResponse(BaseModel):
notes: Optional[str]
tags: Optional[str]
roi: Optional[float]
# DNS Verification fields
is_dns_verified: bool = False
verification_status: str = "unverified"
verification_code: Optional[str] = None
verified_at: Optional[datetime] = None
created_at: datetime
updated_at: datetime
@ -78,6 +85,25 @@ class PortfolioDomainResponse(BaseModel):
from_attributes = True
class DNSVerificationStartResponse(BaseModel):
"""Response when starting DNS verification."""
domain_id: int
domain: str
verification_code: str
dns_record_type: str
dns_record_name: str
dns_record_value: str
instructions: str
status: str
class DNSVerificationCheckResponse(BaseModel):
"""Response when checking DNS verification."""
verified: bool
status: str
message: str
class PortfolioSummary(BaseModel):
"""Summary of user's portfolio."""
total_domains: int
@ -150,7 +176,112 @@ class ValuationResponse(BaseModel):
disclaimer: str
# ============== Helper Functions ==============
def _generate_verification_code() -> str:
"""Generate a unique verification code."""
return f"pounce-verify-{secrets.token_hex(8)}"
def _domain_to_response(domain: PortfolioDomain) -> PortfolioDomainResponse:
"""Convert PortfolioDomain to response schema."""
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
# ============== Portfolio Endpoints ==============
# IMPORTANT: Static routes must come BEFORE dynamic routes like /{domain_id}
@router.get("/verified", response_model=List[PortfolioDomainResponse])
async def get_verified_domains(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Get only DNS-verified portfolio domains.
These domains can be used for Yield or For Sale listings.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.user_id == current_user.id,
PortfolioDomain.is_dns_verified == True,
PortfolioDomain.is_sold == False,
)
).order_by(PortfolioDomain.domain.asc())
)
domains = result.scalars().all()
return [_domain_to_response(d) for d in domains]
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
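A worked example of the ROI arithmetic above (made-up numbers):

# Three domains bought for 100/250/80; the 80 one sold for 300.
total_invested = 100.0 + 250.0 + 80.0        # 430.0
total_value = 400.0 + 900.0                  # estimated values of unsold domains
total_sold_value = 300.0
overall_roi = (total_value + total_sold_value - total_invested) / total_invested * 100
print(round(overall_roi, 2))  # 272.09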
@router.get("", response_model=List[PortfolioDomainResponse])
async def get_portfolio(
@ -204,6 +335,10 @@ async def get_portfolio(
notes=d.notes,
tags=d.tags,
roi=d.roi,
is_dns_verified=getattr(d, 'is_dns_verified', False) or False,
verification_status=getattr(d, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(d, 'verification_code', None),
verified_at=getattr(d, 'verified_at', None),
created_at=d.created_at,
updated_at=d.updated_at,
)
@ -212,49 +347,6 @@ async def get_portfolio(
return responses
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
@router.post("", response_model=PortfolioDomainResponse, status_code=status.HTTP_201_CREATED)
async def add_portfolio_domain(
data: PortfolioDomainCreate,
@ -351,6 +443,10 @@ async def add_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -398,6 +494,10 @@ async def get_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -454,6 +554,10 @@ async def update_portfolio_domain(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -510,6 +614,10 @@ async def mark_domain_sold(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -593,6 +701,10 @@ async def refresh_domain_value(
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
is_dns_verified=getattr(domain, 'is_dns_verified', False) or False,
verification_status=getattr(domain, 'verification_status', 'unverified') or 'unverified',
verification_code=getattr(domain, 'verification_code', None),
verified_at=getattr(domain, 'verified_at', None),
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@ -617,3 +729,160 @@ async def get_domain_valuation(
return ValuationResponse(**valuation)
# ============== DNS Verification Endpoints ==============
@router.post("/{domain_id}/verify-dns", response_model=DNSVerificationStartResponse)
async def start_dns_verification(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Start DNS verification for a portfolio domain.
Returns a verification code that must be added as a TXT record.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
if domain.is_dns_verified:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Domain is already verified",
)
# Generate or reuse existing verification code
if not domain.verification_code:
domain.verification_code = _generate_verification_code()
domain.verification_status = "pending"
domain.verification_started_at = datetime.utcnow()
await db.commit()
await db.refresh(domain)
return DNSVerificationStartResponse(
domain_id=domain.id,
domain=domain.domain,
verification_code=domain.verification_code,
dns_record_type="TXT",
dns_record_name=f"_pounce.{domain.domain}",
dns_record_value=domain.verification_code,
instructions=f"Add a TXT record to your DNS settings:\n\nHost/Name: _pounce\nType: TXT\nValue: {domain.verification_code}\n\nDNS changes can take up to 48 hours to propagate, but usually complete within minutes.",
status=domain.verification_status,
)
@router.get("/{domain_id}/verify-dns/check", response_model=DNSVerificationCheckResponse)
async def check_dns_verification(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""
Check if DNS verification is complete.
Looks for the TXT record and verifies it matches the expected code.
"""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
if domain.is_dns_verified:
return DNSVerificationCheckResponse(
verified=True,
status="verified",
message="Domain ownership already verified",
)
if not domain.verification_code:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Verification not started. Call POST /verify-dns first.",
)
# Check DNS TXT record
txt_record_name = f"_pounce.{domain.domain}"
verified = False
try:
resolver = dns.resolver.Resolver()
resolver.timeout = 5
resolver.lifetime = 10
answers = resolver.resolve(txt_record_name, 'TXT')
for rdata in answers:
txt_value = rdata.to_text().strip('"')
if txt_value == domain.verification_code:
verified = True
break
except dns.resolver.NXDOMAIN:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message=f"TXT record not found. Please add a TXT record at _pounce.{domain.domain}",
)
except dns.resolver.NoAnswer:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message="TXT record exists but has no value. Check your DNS configuration.",
)
except dns.resolver.Timeout:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message="DNS query timed out. Please try again.",
)
except Exception as e:
return DNSVerificationCheckResponse(
verified=False,
status="error",
message=f"DNS lookup error: {str(e)}",
)
if verified:
domain.is_dns_verified = True
domain.verification_status = "verified"
domain.verified_at = datetime.utcnow()
await db.commit()
return DNSVerificationCheckResponse(
verified=True,
status="verified",
message="Domain ownership verified successfully! You can now list this domain for sale or activate Yield.",
)
else:
return DNSVerificationCheckResponse(
verified=False,
status="pending",
message=f"TXT record found but value doesn't match. Expected: {domain.verification_code}",
)
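A standalone pre-check mirroring the endpoint's lookup, useful for confirming propagation before polling the API (requires dnspython; the domain and code below are placeholders):

import dns.resolver

def txt_matches(domain: str, expected_code: str) -> bool:
    resolver = dns.resolver.Resolver()
    resolver.timeout = 5
    resolver.lifetime = 10
    try:
        answers = resolver.resolve(f"_pounce.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.resolver.Timeout):
        return False
    # TXT rdata is quoted; strip the quotes before comparing, as the endpoint does.
    return any(r.to_text().strip('"') == expected_code for r in answers)

print(txt_matches("example.com", "pounce-verify-0123456789abcdef"))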

View File

@ -84,7 +84,7 @@ async def get_subscription(
tier=subscription.tier.value,
tier_name=config["name"],
status=subscription.status.value,
domain_limit=subscription.max_domains,
domain_limit=subscription.domain_limit,
domains_used=domains_used,
portfolio_limit=config.get("portfolio_limit", 0),
check_frequency=config["check_frequency"],

View File

@ -0,0 +1,365 @@
"""
Telemetry KPIs (4A.2).
Admin-only endpoint to compute funnel KPIs from telemetry_events.
"""
from __future__ import annotations
import json
import statistics
from datetime import datetime, timedelta
from typing import Any, Optional
from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy import and_, case, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user, get_db
from app.models.telemetry import TelemetryEvent
from app.models.user import User
from app.schemas.referrals import ReferralKpiWindow, ReferralKpisResponse, ReferralReferrerRow
from app.schemas.telemetry import (
DealFunnelKpis,
TelemetryKpiWindow,
TelemetryKpisResponse,
YieldFunnelKpis,
)
router = APIRouter(prefix="/telemetry", tags=["telemetry"])
def _require_admin(user: User) -> None:
if not user.is_admin:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required")
def _safe_json(metadata_json: Optional[str]) -> dict[str, Any]:
if not metadata_json:
return {}
try:
value = json.loads(metadata_json)
return value if isinstance(value, dict) else {}
except Exception:
return {}
def _median(values: list[float]) -> Optional[float]:
if not values:
return None
return float(statistics.median(values))
@router.get("/kpis", response_model=TelemetryKpisResponse)
async def get_kpis(
days: int = Query(30, ge=1, le=365),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
_require_admin(current_user)
end = datetime.utcnow()
start = end - timedelta(days=days)
event_names = [
# Deal funnel
"listing_view",
"inquiry_created",
"inquiry_status_changed",
"message_sent",
"listing_marked_sold",
# Yield funnel
"yield_connected",
"yield_click",
"yield_conversion",
"payout_paid",
]
rows = (
await db.execute(
select(
TelemetryEvent.event_name,
TelemetryEvent.created_at,
TelemetryEvent.listing_id,
TelemetryEvent.inquiry_id,
TelemetryEvent.yield_domain_id,
TelemetryEvent.click_id,
TelemetryEvent.metadata_json,
).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name.in_(event_names),
)
)
)
).all()
# -----------------------------
# Deal KPIs
# -----------------------------
listing_views = 0
inquiries_created = 0
inquiry_created_at: dict[int, datetime] = {}
first_seller_reply_at: dict[int, datetime] = {}
listings_with_inquiries: set[int] = set()
sold_listings: set[int] = set()
sold_at_by_listing: dict[int, datetime] = {}
first_inquiry_at_by_listing: dict[int, datetime] = {}
# -----------------------------
# Yield KPIs
# -----------------------------
connected_domains = 0
clicks = 0
conversions = 0
payouts_paid = 0
payouts_paid_amount_total = 0.0
for event_name, created_at, listing_id, inquiry_id, yield_domain_id, click_id, metadata_json in rows:
if event_name == "listing_view":
listing_views += 1
continue
if event_name == "inquiry_created":
inquiries_created += 1
if inquiry_id:
inquiry_created_at[inquiry_id] = created_at
if listing_id:
listings_with_inquiries.add(listing_id)
prev = first_inquiry_at_by_listing.get(listing_id)
if prev is None or created_at < prev:
first_inquiry_at_by_listing[listing_id] = created_at
continue
if event_name == "message_sent":
if not inquiry_id:
continue
meta = _safe_json(metadata_json)
if meta.get("role") == "seller":
prev = first_seller_reply_at.get(inquiry_id)
if prev is None or created_at < prev:
first_seller_reply_at[inquiry_id] = created_at
continue
if event_name == "listing_marked_sold":
if listing_id:
sold_listings.add(listing_id)
sold_at_by_listing[listing_id] = created_at
continue
if event_name == "yield_connected":
connected_domains += 1
continue
if event_name == "yield_click":
clicks += 1
continue
if event_name == "yield_conversion":
conversions += 1
continue
if event_name == "payout_paid":
payouts_paid += 1
meta = _safe_json(metadata_json)
amount = meta.get("amount")
if isinstance(amount, (int, float)):
payouts_paid_amount_total += float(amount)
continue
seller_replied_inquiries = len(first_seller_reply_at)
inquiry_reply_rate = (seller_replied_inquiries / inquiries_created) if inquiries_created else 0.0
# Inquiry → Sold rate (on listing-level intersection)
sold_from_inquiry = sold_listings.intersection(listings_with_inquiries)
inquiry_to_sold_listing_rate = (len(sold_from_inquiry) / len(listings_with_inquiries)) if listings_with_inquiries else 0.0
# Median reply time (seconds): inquiry_created → first seller message
reply_deltas: list[float] = []
for inq_id, created in inquiry_created_at.items():
replied = first_seller_reply_at.get(inq_id)
if replied:
reply_deltas.append((replied - created).total_seconds())
# Median time-to-sold (seconds): first inquiry on listing → listing sold
sold_deltas: list[float] = []
for listing in sold_from_inquiry:
inq_at = first_inquiry_at_by_listing.get(listing)
sold_at = sold_at_by_listing.get(listing)
if inq_at and sold_at and sold_at >= inq_at:
sold_deltas.append((sold_at - inq_at).total_seconds())
deal = DealFunnelKpis(
listing_views=listing_views,
inquiries_created=inquiries_created,
seller_replied_inquiries=seller_replied_inquiries,
inquiry_reply_rate=float(inquiry_reply_rate),
listings_with_inquiries=len(listings_with_inquiries),
listings_sold=len(sold_listings),
inquiry_to_sold_listing_rate=float(inquiry_to_sold_listing_rate),
median_reply_seconds=_median(reply_deltas),
median_time_to_sold_seconds=_median(sold_deltas),
)
yield_kpis = YieldFunnelKpis(
connected_domains=connected_domains,
clicks=clicks,
conversions=conversions,
conversion_rate=float(conversions / clicks) if clicks else 0.0,
payouts_paid=payouts_paid,
payouts_paid_amount_total=float(payouts_paid_amount_total),
)
return TelemetryKpisResponse(
window=TelemetryKpiWindow(days=days, start=start, end=end),
deal=deal,
yield_=yield_kpis,
)
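The reply-time median in miniature, on synthetic timestamps (pure Python, mirroring the pairing logic above):

import statistics
from datetime import datetime

inquiry_created_at = {1: datetime(2025, 1, 1, 10, 0), 2: datetime(2025, 1, 1, 11, 0)}
first_seller_reply_at = {1: datetime(2025, 1, 1, 10, 30)}  # inquiry 2 never answered

deltas = [
    (first_seller_reply_at[i] - created).total_seconds()
    for i, created in inquiry_created_at.items()
    if i in first_seller_reply_at
]
print(statistics.median(deltas) if deltas else None)  # 1800.0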
@router.get("/referrals", response_model=ReferralKpisResponse)
async def get_referral_kpis(
days: int = Query(30, ge=1, le=365),
limit: int = Query(200, ge=1, le=1000),
offset: int = Query(0, ge=0),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Admin-only referral KPIs for the viral loop (3C.2).
This is intentionally user-based (users.referred_by_user_id) + telemetry-based (referral_link_viewed),
so it stays robust even if ref codes evolve.
"""
_require_admin(current_user)
end = datetime.utcnow()
start = end - timedelta(days=days)
# Referred user counts per referrer (all-time + window)
referred_counts_subq = (
select(
User.referred_by_user_id.label("referrer_user_id"),
func.count(User.id).label("referred_users_total"),
func.coalesce(
func.sum(case((User.created_at >= start, 1), else_=0)),
0,
).label("referred_users_window"),
)
.where(User.referred_by_user_id.isnot(None))
.group_by(User.referred_by_user_id)
.subquery()
)
# Referral link views in window (telemetry)
link_views_subq = (
select(
TelemetryEvent.user_id.label("referrer_user_id"),
func.count(TelemetryEvent.id).label("referral_link_views_window"),
)
.where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.user_id.isnot(None),
)
)
.group_by(TelemetryEvent.user_id)
.subquery()
)
# Referrers: anyone with an invite_code (we still show even if counts are zero)
rows = (
await db.execute(
select(
User.id,
User.email,
User.invite_code,
User.created_at,
func.coalesce(referred_counts_subq.c.referred_users_total, 0),
func.coalesce(referred_counts_subq.c.referred_users_window, 0),
func.coalesce(link_views_subq.c.referral_link_views_window, 0),
)
.where(User.invite_code.isnot(None))
.outerjoin(referred_counts_subq, referred_counts_subq.c.referrer_user_id == User.id)
.outerjoin(link_views_subq, link_views_subq.c.referrer_user_id == User.id)
.order_by(
func.coalesce(referred_counts_subq.c.referred_users_window, 0).desc(),
func.coalesce(referred_counts_subq.c.referred_users_total, 0).desc(),
User.created_at.desc(),
)
.offset(offset)
.limit(limit)
)
).all()
referrers = [
ReferralReferrerRow(
user_id=int(user_id),
email=str(email),
invite_code=str(invite_code) if invite_code else None,
created_at=created_at,
referred_users_total=int(referred_total or 0),
referred_users_window=int(referred_window or 0),
referral_link_views_window=int(link_views or 0),
)
for user_id, email, invite_code, created_at, referred_total, referred_window, link_views in rows
]
totals = {}
totals["referrers_with_invite_code"] = int(
(
await db.execute(
select(func.count(User.id)).where(User.invite_code.isnot(None))
)
).scalar()
or 0
)
totals["referred_users_total"] = int(
(
await db.execute(
select(func.count(User.id)).where(User.referred_by_user_id.isnot(None))
)
).scalar()
or 0
)
totals["referred_users_window"] = int(
(
await db.execute(
select(func.count(User.id)).where(
and_(
User.referred_by_user_id.isnot(None),
User.created_at >= start,
User.created_at <= end,
)
)
)
).scalar()
or 0
)
totals["referral_link_views_window"] = int(
(
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.event_name == "referral_link_viewed",
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
)
)
)
).scalar()
or 0
)
return ReferralKpisResponse(
window=ReferralKpiWindow(days=days, start=start, end=end),
totals=totals,
referrers=referrers,
)
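The all-time vs. window split above boils down to a COUNT plus a CASE-flag SUM; a pure-Python miniature on synthetic signup dates:

from datetime import datetime

start = datetime(2025, 1, 1)
signups = [datetime(2024, 12, 20), datetime(2025, 1, 5), datetime(2025, 1, 9)]

referred_users_total = len(signups)
referred_users_window = sum(1 for s in signups if s >= start)
print(referred_users_total, referred_users_window)  # 3 2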

View File

@ -64,6 +64,38 @@ async def get_db_price_count(db) -> int:
return result.scalar() or 0
@router.get("/tlds")
async def list_tracked_tlds(
db: Database,
limit: int = Query(5000, ge=1, le=20000),
offset: int = Query(0, ge=0),
):
"""
List distinct TLDs tracked in the database (DB-driven).
This endpoint is intentionally database-only (no static fallback),
so callers (e.g. sitemap generation) can rely on real tracked inventory.
"""
rows = (
await db.execute(
select(TLDPrice.tld)
.distinct()
.order_by(TLDPrice.tld)
.offset(offset)
.limit(limit)
)
).scalars().all()
total = (await db.execute(select(func.count(func.distinct(TLDPrice.tld))))).scalar() or 0
latest = (await db.execute(select(func.max(TLDPrice.recorded_at)))).scalar()
return {
"tlds": [str(t).lstrip(".").lower() for t in rows if t],
"total": int(total),
"limit": int(limit),
"offset": int(offset),
"latest_recorded_at": latest.isoformat() if latest else None,
}
# Real TLD price data based on current market research (December 2024)
# Prices in USD, sourced from major registrars: Namecheap, Cloudflare, Porkbun, Google Domains
TLD_DATA = {
@ -655,14 +687,8 @@ async def get_tld_price_history(
):
"""Get price history for a specific TLD.
Returns REAL historical data from database if available (5+ data points),
otherwise generates simulated data based on current price and known trends.
Data Source Priority:
1. Real DB data (from daily scrapes) - marked as source: "database"
2. Simulated data based on trend - marked as source: "simulated"
Returns REAL historical data from database (no simulation).
"""
import math
tld_clean = tld.lower().lstrip(".")
@ -688,81 +714,35 @@ async def get_tld_price_history(
trend = static_data.get("trend", "stable")
trend_reason = static_data.get("trend_reason", "Price tracking available")
# ==========================================================================
# TRY REAL HISTORICAL DATA FROM DATABASE FIRST
# ==========================================================================
real_history = await get_real_price_history(db, tld_clean, days)
# Use real data if we have enough points (at least 5 data points)
if len(real_history) >= 5:
history = real_history
data_source = "database"
# Calculate price changes from real data
price_7d_ago = None
price_30d_ago = None
price_90d_ago = None
now = datetime.utcnow().date()
for h in history:
if not real_history:
raise HTTPException(status_code=404, detail=f"No historical data for '.{tld_clean}' yet")
history = real_history
data_source = "database"
# Use the most recent daily average as current_price when available
if history:
current_price = float(history[-1]["price"])
def _price_at_or_before(days_ago_target: int) -> float:
"""Get the closest historical price at or before the target age."""
target_date = (datetime.utcnow() - timedelta(days=days_ago_target)).date()
best = float(history[0]["price"])
for h in reversed(history):
try:
h_date = datetime.strptime(h["date"], "%Y-%m-%d").date()
days_ago = (now - h_date).days
if days_ago <= 7 and price_7d_ago is None:
price_7d_ago = h["price"]
if days_ago <= 30 and price_30d_ago is None:
price_30d_ago = h["price"]
if days_ago <= 90 and price_90d_ago is None:
price_90d_ago = h["price"]
except (ValueError, TypeError):
except Exception:
continue
# Fallback to earliest available
if price_7d_ago is None and history:
price_7d_ago = history[-1]["price"]
if price_30d_ago is None and history:
price_30d_ago = history[0]["price"]
if price_90d_ago is None and history:
price_90d_ago = history[0]["price"]
else:
# ==========================================================================
# FALLBACK: SIMULATED DATA BASED ON TREND
# ==========================================================================
data_source = "simulated"
history = []
current_date = datetime.utcnow()
# Calculate trend factor based on known trends
trend_factor = 1.0
if trend == "up":
trend_factor = 0.92 # Prices were ~8% lower
elif trend == "down":
trend_factor = 1.05 # Prices were ~5% higher
# Generate weekly data points
for i in range(days, -1, -7):
date = current_date - timedelta(days=i)
progress = 1 - (i / days)
if trend == "up":
price = current_price * (trend_factor + (1 - trend_factor) * progress)
elif trend == "down":
price = current_price * (trend_factor - (trend_factor - 1) * progress)
else:
# Add small fluctuation for stable prices
fluctuation = math.sin(i * 0.1) * 0.02
price = current_price * (1 + fluctuation)
history.append({
"date": date.strftime("%Y-%m-%d"),
"price": round(price, 2),
})
# Calculate price changes from simulated data
price_7d_ago = history[-2]["price"] if len(history) >= 2 else current_price
price_30d_ago = history[-5]["price"] if len(history) >= 5 else current_price
price_90d_ago = history[0]["price"] if history else current_price
if h_date <= target_date:
best = float(h["price"])
break
return best
price_7d_ago = _price_at_or_before(7)
price_30d_ago = _price_at_or_before(30)
price_90d_ago = _price_at_or_before(90)
# Calculate percentage changes safely
change_7d = round((current_price - price_7d_ago) / price_7d_ago * 100, 2) if price_7d_ago and price_7d_ago > 0 else 0
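How _price_at_or_before behaves on synthetic history (oldest first; the fallback to the earliest price covers targets older than the data):

from datetime import datetime, timedelta

history = [
    {"date": "2025-01-01", "price": 9.50},
    {"date": "2025-01-15", "price": 9.80},
    {"date": "2025-02-01", "price": 10.20},
]

def price_at_or_before(days_ago: int, now: datetime) -> float:
    target = (now - timedelta(days=days_ago)).date()
    best = float(history[0]["price"])  # fallback: earliest available
    for h in reversed(history):
        if datetime.strptime(h["date"], "%Y-%m-%d").date() <= target:
            best = float(h["price"])
            break
    return best

print(price_at_or_before(7, datetime(2025, 2, 5)))   # 9.8 (latest at/before Jan 29)
print(price_at_or_before(90, datetime(2025, 2, 5)))  # 9.5 (fallback to earliest)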
@ -889,6 +869,33 @@ async def compare_tld_prices(
}
def get_marketplace_links(tld: str) -> list:
"""Get marketplace links for buying existing domains on this TLD."""
# Sedo partner ID for affiliate tracking
SEDO_PARTNER_ID = "335830"
return [
{
"name": "Sedo",
"description": "World's largest domain marketplace",
"url": f"https://sedo.com/search/?keyword=.{tld}&partnerid={SEDO_PARTNER_ID}",
"type": "marketplace",
},
{
"name": "Afternic",
"description": "GoDaddy's premium marketplace",
"url": f"https://www.afternic.com/search?k=.{tld}",
"type": "marketplace",
},
{
"name": "Dan.com",
"description": "Fast domain transfers",
"url": f"https://dan.com/search?query=.{tld}",
"type": "marketplace",
},
]
@router.get("/{tld}")
async def get_tld_details(
tld: str,
@ -897,6 +904,9 @@ async def get_tld_details(
"""Get complete details for a specific TLD."""
tld_clean = tld.lower().lstrip(".")
# Marketplace links (same for all TLDs)
marketplace_links = get_marketplace_links(tld_clean)
# Try static data first
if tld_clean in TLD_DATA:
data = TLD_DATA[tld_clean]
@ -926,6 +936,7 @@ async def get_tld_details(
},
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"],
"marketplace_links": marketplace_links,
}
# Fall back to database
@ -962,6 +973,7 @@ async def get_tld_details(
},
"registrars": registrars,
"cheapest_registrar": registrars[0]["name"] if registrars else "N/A",
"marketplace_links": marketplace_links,
}
@ -1051,8 +1063,8 @@ async def get_data_quality_stats(db: Database):
},
"chart_readiness": {
"tlds_ready_for_charts": chartable_tlds,
"tlds_using_simulation": total_tlds - chartable_tlds,
"recommendation": "Run daily scrapes for 7+ days to enable real charts" if chartable_tlds < 10 else "Good coverage!",
"tlds_with_insufficient_history": total_tlds - chartable_tlds,
"recommendation": "Run daily scrapes for 7+ days to enable richer charts" if chartable_tlds < 10 else "Good coverage!",
},
"data_sources": {
"static_tlds": len(TLD_DATA),

View File

@ -5,6 +5,8 @@ Webhook endpoints for external service integrations.
- Future: Other payment providers, notification services, etc.
"""
import logging
import os
from datetime import datetime
from fastapi import APIRouter, HTTPException, Request, Header, status
from app.database import get_db
@ -15,6 +17,25 @@ logger = logging.getLogger(__name__)
router = APIRouter()
@router.get("/stripe/test")
async def test_stripe_webhook():
"""
Test endpoint to verify webhook route is accessible.
Use this to verify the webhook URL is correct.
The actual Stripe webhook should POST to /api/v1/webhooks/stripe
"""
return {
"status": "ok",
"message": "Stripe webhook endpoint is accessible",
"endpoint": "/api/v1/webhooks/stripe",
"method": "POST",
"stripe_configured": StripeService.is_configured(),
"webhook_secret_set": bool(os.getenv("STRIPE_WEBHOOK_SECRET")),
"timestamp": datetime.utcnow().isoformat(),
}
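A quick connectivity probe against the test route (GET only; real Stripe events must POST to /api/v1/webhooks/stripe with a Stripe-Signature header):

import httpx

r = httpx.get("https://pounce.ch/api/v1/webhooks/stripe/test", timeout=10)
print(r.json())
# Expected keys: status, message, endpoint, method, stripe_configured,
# webhook_secret_set, timestamp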
@router.post("/stripe")
async def stripe_webhook(
request: Request,
@ -29,18 +50,22 @@ async def stripe_webhook(
- Invoice is created or paid
The webhook must be configured in Stripe Dashboard to point to:
https://your-domain.com/api/webhooks/stripe
https://pounce.ch/api/v1/webhooks/stripe
Required Header:
- Stripe-Signature: Stripe's webhook signature for verification
"""
logger.info("🔔 Stripe webhook received")
if not stripe_signature:
logger.error("❌ Missing Stripe-Signature header")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Missing Stripe-Signature header",
)
if not StripeService.is_configured():
logger.error("❌ Stripe not configured")
raise HTTPException(
status_code=status.HTTP_503_SERVICE_UNAVAILABLE,
detail="Stripe not configured",
@ -49,6 +74,9 @@ async def stripe_webhook(
# Get raw body for signature verification
payload = await request.body()
logger.info(f" Payload size: {len(payload)} bytes")
logger.info(f" Signature: {stripe_signature[:50]}...")
try:
async for db in get_db():
result = await StripeService.handle_webhook(
@ -56,16 +84,17 @@ async def stripe_webhook(
sig_header=stripe_signature,
db=db,
)
logger.info(f"✅ Webhook processed successfully: {result}")
return result
except ValueError as e:
logger.error(f"Webhook validation error: {e}")
logger.error(f"Webhook validation error: {e}")
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=str(e),
)
except Exception as e:
logger.error(f"Webhook processing error: {e}")
logger.exception(f"Webhook processing error: {e}")
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail="Webhook processing failed",

View File

@ -9,13 +9,15 @@ from decimal import Decimal
from typing import Optional
from fastapi import APIRouter, Depends, HTTPException, status, Query
from sqlalchemy import func, and_, or_
from sqlalchemy.orm import Session
from sqlalchemy import func, and_, or_, Integer, case, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db, get_current_user
from app.models.user import User
from app.models.yield_domain import YieldDomain, YieldTransaction, YieldPayout, AffiliatePartner
from app.config import settings
from app.config import get_settings
settings = get_settings()
from app.schemas.yield_domain import (
YieldDomainCreate,
YieldDomainUpdate,
@ -41,13 +43,11 @@ from app.services.intent_detector import (
estimate_domain_yield,
get_intent_detector,
)
from app.services.yield_dns import verify_yield_dns
from app.services.telemetry import track_event
router = APIRouter(prefix="/yield", tags=["yield"])
# DNS Configuration (would be in config in production)
YIELD_NAMESERVERS = ["ns1.pounce.io", "ns2.pounce.io"]
YIELD_CNAME_TARGET = "yield.pounce.io"
# ============================================================================
# Intent Analysis (Public)
@ -95,31 +95,70 @@ async def analyze_domain_intent(
@router.get("/dashboard", response_model=YieldDashboardResponse)
async def get_yield_dashboard(
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Get yield dashboard with stats, domains, and recent transactions.
"""
# Get user's yield domains
domains = db.query(YieldDomain).filter(
YieldDomain.user_id == current_user.id
).order_by(YieldDomain.total_revenue.desc()).all()
result = await db.execute(
select(YieldDomain)
.where(YieldDomain.user_id == current_user.id)
.order_by(YieldDomain.total_revenue.desc())
)
domains = list(result.scalars().all())
# Calculate stats
now = datetime.utcnow()
month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
# Monthly stats from transactions
monthly_stats = db.query(
func.count(YieldTransaction.id).label("count"),
func.sum(YieldTransaction.net_amount).label("revenue"),
func.sum(func.cast(YieldTransaction.event_type == "click", Integer)).label("clicks"),
func.sum(func.cast(YieldTransaction.event_type.in_(["lead", "sale"]), Integer)).label("conversions"),
).join(YieldDomain).filter(
YieldDomain.user_id == current_user.id,
YieldTransaction.created_at >= month_start,
).first()
# Monthly stats from transactions (simplified for async)
monthly_revenue = Decimal("0")
monthly_clicks = 0
monthly_conversions = 0
if domains:
domain_ids = [d.id for d in domains]
monthly_result = await db.execute(
select(
func.coalesce(
func.sum(
case(
(YieldTransaction.status.in_(["confirmed", "paid"]), YieldTransaction.net_amount),
else_=0,
)
),
0,
).label("revenue"),
func.sum(
case(
(YieldTransaction.event_type == "click", 1),
else_=0,
)
).label("clicks"),
func.sum(
case(
(
and_(
YieldTransaction.event_type.in_(["lead", "sale"]),
YieldTransaction.status.in_(["confirmed", "paid"]),
),
1,
),
else_=0,
)
).label("conversions"),
).where(
YieldTransaction.yield_domain_id.in_(domain_ids),
YieldTransaction.created_at >= month_start,
)
)
monthly_stats = monthly_result.first()
if monthly_stats:
monthly_revenue = monthly_stats.revenue or Decimal("0")
monthly_clicks = monthly_stats.clicks or 0
monthly_conversions = monthly_stats.conversions or 0
# Aggregate domain stats
total_active = sum(1 for d in domains if d.status == "active")
@ -129,16 +168,29 @@ async def get_yield_dashboard(
lifetime_conversions = sum(d.total_conversions for d in domains)
# Pending payout
pending_payout = db.query(func.sum(YieldTransaction.net_amount)).filter(
YieldTransaction.yield_domain_id.in_([d.id for d in domains]),
pending_payout = Decimal("0")
if domains:
domain_ids = [d.id for d in domains]
pending_result = await db.execute(
select(func.coalesce(func.sum(YieldTransaction.net_amount), 0)).where(
YieldTransaction.yield_domain_id.in_(domain_ids),
YieldTransaction.status == "confirmed",
YieldTransaction.paid_at.is_(None),
).scalar() or Decimal("0")
)
)
pending_payout = pending_result.scalar() or Decimal("0")
# Get recent transactions
recent_txs = db.query(YieldTransaction).join(YieldDomain).filter(
YieldDomain.user_id == current_user.id,
).order_by(YieldTransaction.created_at.desc()).limit(10).all()
recent_txs = []
if domains:
domain_ids = [d.id for d in domains]
recent_result = await db.execute(
select(YieldTransaction)
.where(YieldTransaction.yield_domain_id.in_(domain_ids))
.order_by(YieldTransaction.created_at.desc())
.limit(10)
)
recent_txs = list(recent_result.scalars().all())
# Top performing domains
top_domains = sorted(domains, key=lambda d: d.total_revenue, reverse=True)[:5]
@ -147,14 +199,14 @@ async def get_yield_dashboard(
total_domains=len(domains),
active_domains=total_active,
pending_domains=total_pending,
monthly_revenue=monthly_stats.revenue or Decimal("0"),
monthly_clicks=monthly_stats.clicks or 0,
monthly_conversions=monthly_stats.conversions or 0,
monthly_revenue=monthly_revenue,
monthly_clicks=monthly_clicks,
monthly_conversions=monthly_conversions,
lifetime_revenue=lifetime_revenue,
lifetime_clicks=lifetime_clicks,
lifetime_conversions=lifetime_conversions,
pending_payout=pending_payout,
next_payout_date=month_start + timedelta(days=32), # Approx next month
next_payout_date=(month_start + timedelta(days=32)).replace(day=1),
currency="CHF",
)
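The payout date arithmetic is worth a second look: adding 32 days to the first of any month lands between the 2nd and the 5th of the following month (months have 28 to 31 days), so `.replace(day=1)` reliably snaps back to the start of the next month. A quick check:

from datetime import datetime, timedelta

for month in (1, 2, 12):
    month_start = datetime(2025, month, 1)
    print(month_start.date(), "->", (month_start + timedelta(days=32)).replace(day=1).date())
# 2025-01-01 -> 2025-02-01
# 2025-02-01 -> 2025-03-01
# 2025-12-01 -> 2026-01-01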
@ -175,22 +227,34 @@ async def list_yield_domains(
status: Optional[str] = Query(None, description="Filter by status"),
limit: int = Query(50, le=100),
offset: int = Query(0, ge=0),
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List user's yield domains.
"""
query = db.query(YieldDomain).filter(YieldDomain.user_id == current_user.id)
query = select(YieldDomain).where(YieldDomain.user_id == current_user.id)
if status:
query = query.filter(YieldDomain.status == status)
query = query.where(YieldDomain.status == status)
total = query.count()
domains = query.order_by(YieldDomain.created_at.desc()).offset(offset).limit(limit).all()
# Get total count
count_result = await db.execute(
select(func.count(YieldDomain.id)).where(YieldDomain.user_id == current_user.id)
)
total = count_result.scalar() or 0
# Aggregates
all_domains = db.query(YieldDomain).filter(YieldDomain.user_id == current_user.id).all()
# Get domains
result = await db.execute(
query.order_by(YieldDomain.created_at.desc()).offset(offset).limit(limit)
)
domains = list(result.scalars().all())
# Aggregates from all domains
all_result = await db.execute(
select(YieldDomain).where(YieldDomain.user_id == current_user.id)
)
all_domains = list(all_result.scalars().all())
total_active = sum(1 for d in all_domains if d.status == "active")
total_revenue = sum(d.total_revenue for d in all_domains)
total_clicks = sum(d.total_clicks for d in all_domains)
@ -207,16 +271,19 @@ async def list_yield_domains(
@router.get("/domains/{domain_id}", response_model=YieldDomainResponse)
async def get_yield_domain(
domain_id: int,
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Get details of a specific yield domain.
"""
domain = db.query(YieldDomain).filter(
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
).first()
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
@ -227,18 +294,78 @@ async def get_yield_domain(
@router.post("/activate", response_model=ActivateYieldResponse)
async def activate_domain_for_yield(
request: ActivateYieldRequest,
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Activate a domain for yield/intent routing.
SECURITY: Domain must be in user's portfolio AND DNS-verified.
This creates the yield domain record and returns DNS setup instructions.
"""
from app.models.portfolio import PortfolioDomain
from app.models.subscription import Subscription, SubscriptionTier
domain = request.domain.lower().strip()
# Check if domain already exists
existing = db.query(YieldDomain).filter(YieldDomain.domain == domain).first()
# SECURITY CHECK 1: Domain must be in user's portfolio
portfolio_result = await db.execute(
select(PortfolioDomain).where(
PortfolioDomain.domain == domain,
PortfolioDomain.user_id == current_user.id,
)
)
portfolio_domain = portfolio_result.scalar_one_or_none()
if not portfolio_domain:
raise HTTPException(
status_code=403,
detail="Domain must be in your portfolio before activating Yield. Add it to your portfolio first.",
)
# SECURITY CHECK 2: Domain must be DNS-verified
if not portfolio_domain.is_dns_verified:
raise HTTPException(
status_code=403,
detail="Domain must be DNS-verified before activating Yield. Verify ownership in your portfolio first.",
)
# SECURITY CHECK 3: Domain must not be sold
if portfolio_domain.is_sold:
raise HTTPException(
status_code=400,
detail="Cannot activate Yield for a sold domain.",
)
# SECURITY CHECK 4: Tier gating + limits
sub_result = await db.execute(select(Subscription).where(Subscription.user_id == current_user.id))
subscription = sub_result.scalar_one_or_none()
tier = subscription.tier if subscription else SubscriptionTier.SCOUT
tier_value = tier.value if hasattr(tier, "value") else str(tier)
if tier_value == "scout":
raise HTTPException(
status_code=403,
detail="Yield is not available on Scout plan. Upgrade to Trader or Tycoon.",
)
max_yield_domains = 5 if tier_value == "trader" else 10_000_000  # Tycoon: effectively unlimited
user_domain_count = (
await db.execute(
select(func.count(YieldDomain.id)).where(YieldDomain.user_id == current_user.id)
)
).scalar() or 0
if user_domain_count >= max_yield_domains:
raise HTTPException(
status_code=403,
detail=f"Yield domain limit reached for your plan ({max_yield_domains}).",
)
# Check if domain already exists in yield system
existing_result = await db.execute(
select(YieldDomain).where(YieldDomain.domain == domain)
)
existing = existing_result.scalar_one_or_none()
if existing:
if existing.user_id == current_user.id:
raise HTTPException(
@ -267,25 +394,31 @@ async def activate_domain_for_yield(
# Find best matching partner
if intent_result.suggested_partners:
partner = db.query(AffiliatePartner).filter(
partner_result = await db.execute(
select(AffiliatePartner).where(
AffiliatePartner.slug == intent_result.suggested_partners[0],
AffiliatePartner.is_active == True,
).first()
)
)
partner = partner_result.scalar_one_or_none()
if partner:
yield_domain.partner_id = partner.id
yield_domain.active_route = partner.slug
db.add(yield_domain)
db.commit()
db.refresh(yield_domain)
await db.commit()
await db.refresh(yield_domain)
# Create DNS instructions
yield_nameservers = settings.yield_nameserver_list
if not yield_nameservers:
raise HTTPException(status_code=500, detail="Yield nameservers are not configured on server.")
dns_instructions = DNSSetupInstructions(
domain=domain,
nameservers=YIELD_NAMESERVERS,
nameservers=yield_nameservers,
cname_host="@",
cname_target=YIELD_CNAME_TARGET,
verification_url=f"{settings.site_url}/api/v1/yield/verify/{yield_domain.id}",
cname_target=settings.yield_cname_target,
verification_url=f"{settings.site_url}/api/v1/yield/domains/{yield_domain.id}/verify",
)
return ActivateYieldResponse(
@ -316,73 +449,60 @@ async def activate_domain_for_yield(
@router.post("/domains/{domain_id}/verify", response_model=DNSVerificationResult)
async def verify_domain_dns(
domain_id: int,
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Verify DNS configuration for a yield domain.
"""
domain = db.query(YieldDomain).filter(
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
).first()
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
# Perform DNS check (simplified - in production use dnspython)
verified = False
actual_ns = []
error = None
try:
import dns.resolver
# Check nameservers
try:
answers = dns.resolver.resolve(domain.domain, 'NS')
actual_ns = [str(rr.target).rstrip('.') for rr in answers]
# Check if our nameservers are set
our_ns_set = set(ns.lower() for ns in YIELD_NAMESERVERS)
actual_ns_set = set(ns.lower() for ns in actual_ns)
if our_ns_set.issubset(actual_ns_set):
verified = True
except dns.resolver.NXDOMAIN:
error = "Domain does not exist"
except dns.resolver.NoAnswer:
# Try CNAME instead
try:
cname_answers = dns.resolver.resolve(domain.domain, 'CNAME')
for rr in cname_answers:
if str(rr.target).rstrip('.').lower() == YIELD_CNAME_TARGET.lower():
verified = True
break
except Exception:
error = "No NS or CNAME records found"
except Exception as e:
error = str(e)
except ImportError:
# dnspython not installed - simulate for development
verified = True # Auto-verify in dev
actual_ns = YIELD_NAMESERVERS
# Production-grade DNS check
check = verify_yield_dns(
domain=domain.domain,
expected_nameservers=settings.yield_nameserver_list,
cname_target=settings.yield_cname_target,
)
verified = check.verified
actual_ns = check.actual_ns
error = check.error
# Update domain status
if verified and not domain.dns_verified:
domain.dns_verified = True
domain.dns_verified_at = datetime.utcnow()
domain.connected_at = domain.dns_verified_at
domain.status = "active"
domain.activated_at = datetime.utcnow()
db.commit()
await track_event(
db,
event_name="yield_connected",
request=None,
user_id=current_user.id,
is_authenticated=True,
source="terminal",
domain=domain.domain,
yield_domain_id=domain.id,
metadata={"method": check.method, "cname_ok": check.cname_ok, "actual_ns": check.actual_ns},
)
await db.commit()
return DNSVerificationResult(
domain=domain.domain,
verified=verified,
expected_ns=YIELD_NAMESERVERS,
expected_ns=settings.yield_nameserver_list,
actual_ns=actual_ns,
cname_ok=verified and not actual_ns,
cname_ok=check.cname_ok if verified else False,
error=error,
checked_at=datetime.utcnow(),
)
@ -392,16 +512,19 @@ async def verify_domain_dns(
async def update_yield_domain(
domain_id: int,
update: YieldDomainUpdate,
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Update yield domain settings.
"""
domain = db.query(YieldDomain).filter(
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
).first()
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
@ -409,10 +532,13 @@ async def update_yield_domain(
# Apply updates
if update.active_route is not None:
# Validate partner exists
partner = db.query(AffiliatePartner).filter(
partner_result = await db.execute(
select(AffiliatePartner).where(
AffiliatePartner.slug == update.active_route,
AffiliatePartner.is_active == True,
).first()
)
)
partner = partner_result.scalar_one_or_none()
if not partner:
raise HTTPException(status_code=400, detail="Invalid partner route")
domain.active_route = update.active_route
@ -429,8 +555,8 @@ async def update_yield_domain(
domain.status = "active"
domain.paused_at = None
db.commit()
db.refresh(domain)
await db.commit()
await db.refresh(domain)
return _domain_to_response(domain)
@ -438,22 +564,25 @@ async def update_yield_domain(
@router.delete("/domains/{domain_id}")
async def delete_yield_domain(
domain_id: int,
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Remove a domain from yield program.
"""
domain = db.query(YieldDomain).filter(
result = await db.execute(
select(YieldDomain).where(
YieldDomain.id == domain_id,
YieldDomain.user_id == current_user.id,
).first()
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(status_code=404, detail="Yield domain not found")
db.delete(domain)
db.commit()
await db.delete(domain)
await db.commit()
return {"message": "Yield domain removed"}
@ -468,29 +597,53 @@ async def list_transactions(
status: Optional[str] = Query(None),
limit: int = Query(50, le=100),
offset: int = Query(0, ge=0),
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List yield transactions for user's domains.
"""
# Get user's domain IDs
domain_ids = db.query(YieldDomain.id).filter(
YieldDomain.user_id == current_user.id
).subquery()
domain_ids_result = await db.execute(
select(YieldDomain.id).where(YieldDomain.user_id == current_user.id)
)
domain_ids = [row[0] for row in domain_ids_result.all()]
query = db.query(YieldTransaction).filter(
if not domain_ids:
return YieldTransactionListResponse(
transactions=[],
total=0,
total_gross=Decimal("0"),
total_net=Decimal("0"),
)
query = select(YieldTransaction).where(
YieldTransaction.yield_domain_id.in_(domain_ids)
)
if domain_id:
query = query.filter(YieldTransaction.yield_domain_id == domain_id)
query = query.where(YieldTransaction.yield_domain_id == domain_id)
if status:
query = query.filter(YieldTransaction.status == status)
query = query.where(YieldTransaction.status == status)
total = query.count()
transactions = query.order_by(YieldTransaction.created_at.desc()).offset(offset).limit(limit).all()
# Get count
count_query = select(func.count(YieldTransaction.id)).where(
YieldTransaction.yield_domain_id.in_(domain_ids)
)
if domain_id:
count_query = count_query.where(YieldTransaction.yield_domain_id == domain_id)
if status:
count_query = count_query.where(YieldTransaction.status == status)
count_result = await db.execute(count_query)
total = count_result.scalar() or 0
# Get transactions
result = await db.execute(
query.order_by(YieldTransaction.created_at.desc()).offset(offset).limit(limit)
)
transactions = list(result.scalars().all())
# Aggregates (over the returned page only)
total_gross = sum(tx.gross_amount for tx in transactions)
@ -513,19 +666,28 @@ async def list_payouts(
status: Optional[str] = Query(None),
limit: int = Query(20, le=50),
offset: int = Query(0, ge=0),
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
List user's yield payouts.
"""
query = db.query(YieldPayout).filter(YieldPayout.user_id == current_user.id)
query = select(YieldPayout).where(YieldPayout.user_id == current_user.id)
if status:
query = query.filter(YieldPayout.status == status)
query = query.where(YieldPayout.status == status)
total = query.count()
payouts = query.order_by(YieldPayout.created_at.desc()).offset(offset).limit(limit).all()
# Get count
count_result = await db.execute(
select(func.count(YieldPayout.id)).where(YieldPayout.user_id == current_user.id)
)
total = count_result.scalar() or 0
# Get payouts
result = await db.execute(
query.order_by(YieldPayout.created_at.desc()).offset(offset).limit(limit)
)
payouts = list(result.scalars().all())
# Aggregates (over the returned page only)
total_paid = sum(p.amount for p in payouts if p.status == "completed")
@ -546,14 +708,17 @@ async def list_payouts(
@router.get("/partners", response_model=list[AffiliatePartnerResponse])
async def list_partners(
category: Optional[str] = Query(None, description="Filter by intent category"),
db: Session = Depends(get_db),
db: AsyncSession = Depends(get_db),
):
"""
List available affiliate partners.
"""
query = db.query(AffiliatePartner).filter(AffiliatePartner.is_active == True)
partners = query.order_by(AffiliatePartner.priority.desc()).all()
result = await db.execute(
select(AffiliatePartner)
.where(AffiliatePartner.is_active == True)
.order_by(AffiliatePartner.priority.desc())
)
partners = list(result.scalars().all())
# Filter by category if specified
if category:
@ -590,6 +755,7 @@ def _domain_to_response(domain: YieldDomain) -> YieldDomainResponse:
partner_name=domain.partner.name if domain.partner else None,
dns_verified=domain.dns_verified,
dns_verified_at=domain.dns_verified_at,
connected_at=getattr(domain, "connected_at", None),
total_clicks=domain.total_clicks,
total_conversions=domain.total_conversions,
total_revenue=domain.total_revenue,
@ -605,6 +771,7 @@ def _tx_to_response(tx: YieldTransaction) -> YieldTransactionResponse:
id=tx.id,
event_type=tx.event_type,
partner_slug=tx.partner_slug,
click_id=getattr(tx, "click_id", None),
gross_amount=tx.gross_amount,
net_amount=tx.net_amount,
currency=tx.currency,
@ -632,6 +799,4 @@ def _payout_to_response(payout: YieldPayout) -> YieldPayoutResponse:
)
# Missing import
from sqlalchemy import Integer
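`verify_yield_dns` lives in app.services.yield_dns and is not part of this hunk. A minimal sketch of what it could look like, assuming dnspython and inferring the result fields (verified, actual_ns, cname_ok, method, error) from the call sites above:

from dataclasses import dataclass, field
from typing import Optional

import dns.resolver  # dnspython

@dataclass
class YieldDNSCheck:
    verified: bool = False
    actual_ns: list[str] = field(default_factory=list)
    cname_ok: bool = False
    method: str = "none"  # "ns" or "cname"
    error: Optional[str] = None

def verify_yield_dns(domain: str, expected_nameservers: list[str], cname_target: str) -> YieldDNSCheck:
    check = YieldDNSCheck()
    try:
        answers = dns.resolver.resolve(domain, "NS")
        check.actual_ns = [str(rr.target).rstrip(".").lower() for rr in answers]
        if {ns.lower() for ns in expected_nameservers} <= set(check.actual_ns):
            check.verified, check.method = True, "ns"
            return check
    except dns.resolver.NXDOMAIN:
        check.error = "Domain does not exist"
        return check
    except (dns.resolver.NoAnswer, dns.resolver.NoNameservers):
        pass
    try:
        # Fall back to the CNAME/ALIAS setup path.
        for rr in dns.resolver.resolve(domain, "CNAME"):
            if str(rr.target).rstrip(".").lower() == cname_target.lower():
                check.verified, check.cname_ok, check.method = True, True, "cname"
                break
    except Exception as e:
        check.error = check.error or str(e)
    return check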

View File

@ -0,0 +1,188 @@
"""
Admin endpoints for Yield payouts (ledger).
Premium constraints:
- No placeholder payouts
- No currency mixing
- Idempotent generation per (user, currency, period)
"""
from __future__ import annotations
from datetime import datetime
from decimal import Decimal
from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel, Field
from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_current_user, get_db
from app.models.user import User
from app.models.yield_domain import YieldPayout, YieldTransaction
from app.services.telemetry import track_event
from app.services.yield_payouts import generate_payouts_for_period
router = APIRouter(prefix="/yield", tags=["yield-admin"])
class PayoutGenerateRequest(BaseModel):
period_start: datetime
period_end: datetime
class GeneratedPayout(BaseModel):
id: int
user_id: int
amount: Decimal
currency: str
period_start: datetime
period_end: datetime
transaction_count: int
status: str
created_at: datetime
class PayoutGenerateResponse(BaseModel):
created: list[GeneratedPayout]
skipped_existing: int = 0
class PayoutCompleteRequest(BaseModel):
payment_method: str | None = Field(default=None, max_length=50)
payment_reference: str | None = Field(default=None, max_length=200)
class PayoutCompleteResponse(BaseModel):
payout_id: int
transactions_marked_paid: int
completed_at: datetime
def _require_admin(current_user: User) -> None:
if not current_user.is_admin:
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required")
@router.post("/payouts/generate", response_model=PayoutGenerateResponse)
async def generate_payouts(
payload: PayoutGenerateRequest,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Create YieldPayout rows for confirmed, unpaid transactions in the period.
This does NOT mark payouts as completed. It only assigns transactions to a payout via payout_id.
Completion is a separate step once payment is executed.
"""
_require_admin(current_user)
if payload.period_end <= payload.period_start:
raise HTTPException(status_code=400, detail="period_end must be after period_start")
created_count, skipped_existing = await generate_payouts_for_period(
db,
period_start=payload.period_start,
period_end=payload.period_end,
)
payouts = (
await db.execute(
select(YieldPayout)
.where(
and_(
YieldPayout.period_start == payload.period_start,
YieldPayout.period_end == payload.period_end,
)
)
.order_by(YieldPayout.created_at.desc())
)
).scalars().all()
created = [
GeneratedPayout(
id=p.id,
user_id=p.user_id,
amount=p.amount,
currency=p.currency,
period_start=p.period_start,
period_end=p.period_end,
transaction_count=p.transaction_count,
status=p.status,
created_at=p.created_at,
)
for p in payouts
]
# NOTE: the query above returns every payout for the period, so `created` can
# include rows that already existed; created_count from the service is the
# authoritative count of new rows.
_ = created_count
return PayoutGenerateResponse(created=created, skipped_existing=skipped_existing)
@router.post("/payouts/{payout_id}/complete", response_model=PayoutCompleteResponse)
async def complete_payout(
payout_id: int,
payload: PayoutCompleteRequest,
db: AsyncSession = Depends(get_db),
current_user: User = Depends(get_current_user),
):
"""
Mark a payout as completed and mark assigned transactions as paid.
"""
_require_admin(current_user)
payout = (
await db.execute(select(YieldPayout).where(YieldPayout.id == payout_id))
).scalar_one_or_none()
if not payout:
raise HTTPException(status_code=404, detail="Payout not found")
if payout.status == "completed":
raise HTTPException(status_code=400, detail="Payout already completed")
payout.status = "completed"
payout.completed_at = datetime.utcnow()
payout.payment_method = payload.payment_method
payout.payment_reference = payload.payment_reference
txs = (
await db.execute(
select(YieldTransaction).where(YieldTransaction.payout_id == payout.id)
)
).scalars().all()
marked = 0
for tx in txs:
if tx.status != "paid":
tx.status = "paid"
tx.paid_at = payout.completed_at
marked += 1
await track_event(
db,
event_name="payout_paid",
request=None,
user_id=payout.user_id,
is_authenticated=None,
source="admin",
domain=None,
yield_domain_id=None,
metadata={
"payout_id": payout.id,
"currency": payout.currency,
"amount": float(payout.amount),
"transaction_count": payout.transaction_count,
"payment_method": payout.payment_method,
},
)
await db.commit()
return PayoutCompleteResponse(
payout_id=payout.id,
transactions_marked_paid=marked,
completed_at=payout.completed_at,
)
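`generate_payouts_for_period` (app.services.yield_payouts) is also outside this hunk. Under the constraints in the module docstring, a sketch could group confirmed, unpaid, unassigned transactions per (user, currency) and skip pairs that already have a payout for the period; the YieldPayout fields are taken from the GeneratedPayout mapping above, everything else is an assumption:

from datetime import datetime

from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession

from app.models.yield_domain import YieldDomain, YieldPayout, YieldTransaction

async def generate_payouts_for_period(
    db: AsyncSession, *, period_start: datetime, period_end: datetime
) -> tuple[int, int]:
    created, skipped = 0, 0
    # One row per (user, currency): no currency mixing by construction.
    rows = (
        await db.execute(
            select(
                YieldDomain.user_id,
                YieldTransaction.currency,
                func.sum(YieldTransaction.net_amount).label("amount"),
                func.count(YieldTransaction.id).label("tx_count"),
            )
            .join(YieldDomain, YieldTransaction.yield_domain_id == YieldDomain.id)
            .where(
                and_(
                    YieldTransaction.status == "confirmed",
                    YieldTransaction.paid_at.is_(None),
                    YieldTransaction.payout_id.is_(None),
                    YieldTransaction.created_at >= period_start,
                    YieldTransaction.created_at < period_end,
                )
            )
            .group_by(YieldDomain.user_id, YieldTransaction.currency)
        )
    ).all()
    for user_id, currency, amount, tx_count in rows:
        exists = (
            await db.execute(
                select(YieldPayout.id).where(
                    and_(
                        YieldPayout.user_id == user_id,
                        YieldPayout.currency == currency,
                        YieldPayout.period_start == period_start,
                        YieldPayout.period_end == period_end,
                    )
                )
            )
        ).scalar_one_or_none()
        if exists:
            skipped += 1  # idempotent per (user, currency, period)
            continue
        db.add(YieldPayout(
            user_id=user_id, amount=amount, currency=currency,
            period_start=period_start, period_end=period_end,
            transaction_count=tx_count, status="pending",
        ))
        created += 1
        # The real service would also stamp payout_id onto the matched
        # transactions here so completion can mark them paid later.
    await db.flush()
    return created, skipped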

View File

@ -0,0 +1,308 @@
"""
Yield Domain Routing API.
This handles incoming HTTP requests to yield domains:
1. Detect the domain from the Host header
2. Look up the yield configuration
3. Track the click
4. Redirect to the appropriate affiliate landing page
In production, this runs on a separate subdomain or IP (yield.pounce.io)
that yield domains CNAME to.
"""
import logging
from datetime import datetime, timedelta
from decimal import Decimal
from typing import Optional
from uuid import uuid4
from fastapi import APIRouter, Depends, HTTPException, Query, Request
from fastapi.responses import RedirectResponse
from sqlalchemy import and_, func, or_, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db
from app.config import get_settings
from app.models.yield_domain import YieldDomain, YieldTransaction, AffiliatePartner
from app.services.intent_detector import detect_domain_intent
from app.services.telemetry import track_event
logger = logging.getLogger(__name__)
settings = get_settings()
router = APIRouter(prefix="/r", tags=["yield-routing"])
# Revenue split: user gets 70%, Pounce keeps 30%
USER_REVENUE_SHARE = Decimal("0.70")
def hash_ip(ip: str) -> str:
"""Hash IP for privacy-compliant storage."""
import hashlib
# Salt to prevent trivial rainbow table lookups.
return hashlib.sha256(f"{ip}|{settings.secret_key}".encode()).hexdigest()[:32]
def _get_client_ip(request: Request) -> Optional[str]:
# Prefer proxy headers when behind nginx
xff = request.headers.get("x-forwarded-for")
if xff:
# first IP in list
ip = xff.split(",")[0].strip()
if ip:
return ip
cf_ip = request.headers.get("cf-connecting-ip")
if cf_ip:
return cf_ip.strip()
return request.client.host if request.client else None
def _safe_tracking_url(template: str, *, click_id: str, domain: str, domain_id: int, partner: str) -> str:
try:
return template.format(
click_id=click_id,
domain=domain,
domain_id=domain_id,
partner=partner,
)
except KeyError as e:
raise HTTPException(
status_code=500,
detail=f"Partner tracking_url_template uses unsupported placeholder: {str(e)}",
)
def generate_tracking_url(
partner: AffiliatePartner,
yield_domain: YieldDomain,
click_id: str,
) -> str:
"""
Generate the tracking URL for a partner.
Most affiliate networks expect parameters like:
- clickid / subid: Our click tracking ID
- ref: Domain name or user reference
"""
if not partner.tracking_url_template:
raise HTTPException(
status_code=503,
detail=f"Partner routing not configured for {partner.slug}. Missing tracking_url_template.",
)
return _safe_tracking_url(
partner.tracking_url_template,
click_id=click_id,
domain=yield_domain.domain,
domain_id=yield_domain.id,
partner=partner.slug,
)
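For illustration, a hypothetical partner template and the URL it expands to; the placeholder names are exactly the ones `_safe_tracking_url` supplies, and anything else raises a 500 via the KeyError handler above. Slug and host are made up:

template = "https://partner.example/track?subid={click_id}&ref={domain}&src={partner}"  # hypothetical
print(template.format(click_id="ab12cd34", domain="zahnarzt-zuerich.ch", domain_id=7, partner="doctolib"))
# https://partner.example/track?subid=ab12cd34&ref=zahnarzt-zuerich.ch&src=doctolib

Unused keyword arguments (here domain_id) are simply ignored by str.format, so templates only need the placeholders they care about.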
@router.get("/{domain}")
async def route_yield_domain(
domain: str,
request: Request,
db: AsyncSession = Depends(get_db),
direct: bool = Query(True, description="Direct redirect without landing page"),
):
"""
Route traffic for a yield domain.
This is the main entry point for yield domain traffic.
Query params:
- direct: If true, redirect immediately without landing page
"""
domain = domain.lower().strip()
# Find yield domain (must be connected + active)
yield_domain = (
await db.execute(
select(YieldDomain).where(
and_(
YieldDomain.domain == domain,
YieldDomain.status == "active",
YieldDomain.dns_verified == True,
or_(YieldDomain.connected_at.is_not(None), YieldDomain.dns_verified_at.is_not(None)),
)
)
)
).scalar_one_or_none()
if not yield_domain:
logger.warning(f"Route request for unknown/inactive/unconnected domain: {domain}")
raise HTTPException(status_code=404, detail="Domain not active for yield routing.")
# Resolve partner
partner: Optional[AffiliatePartner] = None
if yield_domain.partner_id:
partner = (
await db.execute(
select(AffiliatePartner).where(
and_(
AffiliatePartner.id == yield_domain.partner_id,
AffiliatePartner.is_active == True,
)
)
)
).scalar_one_or_none()
if not partner and yield_domain.detected_intent:
# Match full detected intent first (e.g. medical_dental)
partner = (
await db.execute(
select(AffiliatePartner)
.where(
and_(
AffiliatePartner.is_active == True,
AffiliatePartner.intent_categories.ilike(f"%{yield_domain.detected_intent}%"),
)
)
.order_by(AffiliatePartner.priority.desc())
)
).scalar_one_or_none()
if not partner:
raise HTTPException(status_code=503, detail="No active partner available for this domain intent.")
# Rate limit: max 120 clicks/10min per IP per domain
client_ip = _get_client_ip(request)
ip_hash = hash_ip(client_ip) if client_ip else None
if ip_hash:
cutoff = datetime.utcnow() - timedelta(minutes=10)
recent = (
await db.execute(
select(func.count(YieldTransaction.id)).where(
and_(
YieldTransaction.yield_domain_id == yield_domain.id,
YieldTransaction.event_type == "click",
YieldTransaction.ip_hash == ip_hash,
YieldTransaction.created_at >= cutoff,
)
)
)
).scalar() or 0
if recent >= 120:
raise HTTPException(status_code=429, detail="Too many requests. Please slow down.")
# Compute click economics (only CPC can be accounted immediately)
gross = Decimal("0")
net = Decimal("0")
currency = (partner.payout_currency or "CHF").upper()
if (partner.payout_type or "").lower() == "cpc":
gross = partner.payout_amount or Decimal("0")
net = (gross * USER_REVENUE_SHARE).quantize(Decimal("0.01"))
click_id = uuid4().hex
destination_url = generate_tracking_url(partner, yield_domain, click_id)
user_agent = request.headers.get("user-agent")
referrer = request.headers.get("referer")
geo_country = request.headers.get("cf-ipcountry") or request.headers.get("x-country")
geo_country = geo_country.strip().upper() if geo_country else None
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type="click",
partner_slug=partner.slug,
click_id=click_id,
destination_url=destination_url[:2000],
gross_amount=gross,
net_amount=net,
currency=currency,
referrer=referrer[:500] if referrer else None,
user_agent=user_agent[:500] if user_agent else None,
geo_country=geo_country[:2] if geo_country else None,
ip_hash=ip_hash,
status="confirmed",
confirmed_at=datetime.utcnow(),
)
db.add(transaction)
yield_domain.total_clicks += 1
yield_domain.last_click_at = datetime.utcnow()
if net > 0:
yield_domain.total_revenue += net
await track_event(
db,
event_name="yield_click",
request=request,
user_id=yield_domain.user_id,
is_authenticated=None,
source="routing",
domain=yield_domain.domain,
yield_domain_id=yield_domain.id,
click_id=click_id,
metadata={"partner": partner.slug, "currency": currency, "net_amount": float(net)},
)
await db.commit()
# Only direct redirect for MVP
return RedirectResponse(url=destination_url, status_code=302)
@router.get("/")
async def yield_routing_info():
"""Info endpoint for yield routing service."""
return {
"service": "Pounce Yield Routing",
"version": "2.0.0",
"docs": f"{settings.site_url}/docs#/yield-routing",
"status": "active",
}
# ============================================================================
# Host-based routing (for production deployment)
# ============================================================================
@router.api_route("/catch-all", methods=["GET", "HEAD"])
async def catch_all_route(
request: Request,
db: AsyncSession = Depends(get_db),
):
"""
Catch-all route for host-based routing.
In production, this endpoint handles requests where the Host header
is the yield domain itself (e.g., zahnarzt-zuerich.ch).
This requires:
1. Yield domains to CNAME to yield.pounce.io
2. Nginx/Caddy to route all hosts to this backend
3. This endpoint to parse the Host header
"""
host = request.headers.get("host", "").lower()
# Remove port if present
if ":" in host:
host = host.split(":")[0]
# Skip our own domains
our_domains = ["pounce.ch", "pounce.io", "localhost", "127.0.0.1"]
if any(host.endswith(d) for d in our_domains):
return {"status": "not a yield domain", "host": host}
# If host matches a connected yield domain, route it
yield_domain_id = (
await db.execute(
select(YieldDomain.id).where(
and_(
YieldDomain.domain == host,
YieldDomain.status == "active",
YieldDomain.dns_verified == True,
or_(YieldDomain.connected_at.is_not(None), YieldDomain.dns_verified_at.is_not(None)),
)
)
)
).scalar_one_or_none()
if not yield_domain_id:
raise HTTPException(status_code=404, detail="Host not configured for yield routing.")
return RedirectResponse(url=f"/api/v1/r/{host}?direct=true", status_code=302)
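To smoke-test the catch-all locally you can spoof the Host header; a sketch assuming the backend listens on 127.0.0.1:8000, is mounted under /api/v1, and httpx is installed:

import httpx

resp = httpx.get(
    "http://127.0.0.1:8000/api/v1/r/catch-all",
    headers={"host": "zahnarzt-zuerich.ch"},  # pretend to be the yield domain
    follow_redirects=False,
)
print(resp.status_code, resp.headers.get("location"))
# 302 /api/v1/r/zahnarzt-zuerich.ch?direct=true  (when the domain is connected and active)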

View File

@ -0,0 +1,563 @@
"""
Webhook endpoints for Yield affiliate partner callbacks.
Partners call these endpoints to report:
- Clicks (redirect happened)
- Leads (form submitted, signup, etc.)
- Sales (purchase completed)
Each partner may have different authentication methods:
- HMAC signature verification
- API key in header
- IP whitelist
"""
import hashlib
import hmac
import json
import logging
from datetime import datetime
from decimal import Decimal
from typing import Optional
from fastapi import APIRouter, BackgroundTasks, Depends, Header, HTTPException, Request
from pydantic import BaseModel, Field
from sqlalchemy import and_, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.api.deps import get_db
from app.config import get_settings
from app.models.yield_domain import YieldDomain, YieldTransaction, AffiliatePartner
from app.services.telemetry import track_event
logger = logging.getLogger(__name__)
settings = get_settings()
router = APIRouter(prefix="/yield-webhooks", tags=["yield-webhooks"])
# Revenue split: User gets 70%, Pounce keeps 30%
USER_REVENUE_SHARE = Decimal("0.70")
# ============================================================================
# Schemas
# ============================================================================
class PartnerEvent(BaseModel):
"""Generic partner event payload."""
event_type: str = Field(..., description="click, lead, or sale")
domain: str = Field(..., description="The yield domain that generated this event")
transaction_id: Optional[str] = Field(None, description="Partner's transaction ID")
click_id: Optional[str] = Field(None, description="Pounce click_id for attribution (UUID hex)")
amount: Optional[float] = Field(None, description="Gross commission amount")
currency: Optional[str] = Field("CHF", description="Currency code")
# Optional attribution data
geo_country: Optional[str] = None
referrer: Optional[str] = None
user_agent: Optional[str] = None
# Optional metadata
metadata: Optional[dict] = None
class WebhookResponse(BaseModel):
"""Response for webhook calls."""
success: bool
transaction_id: Optional[int] = None
message: str
# ============================================================================
# Signature Verification Helpers
# ============================================================================
def verify_hmac_signature(
payload: bytes,
signature: str,
secret: str,
algorithm: str = "sha256"
) -> bool:
"""Verify HMAC signature for webhook payload."""
expected = hmac.new(
secret.encode(),
payload,
hashlib.sha256 if algorithm == "sha256" else hashlib.sha1
).hexdigest()
return hmac.compare_digest(signature, expected)
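Seen from the partner's side, the signature is the hex SHA-256 HMAC over the exact request body, so the client must send the very bytes it signed (content=, not json=). A minimal sketch with placeholder URL, slug, and secret, assuming httpx:

import hashlib
import hmac
import json

import httpx

secret = "shared-webhook-secret"  # placeholder; must match the server's YIELD_WEBHOOK_SECRET
body = json.dumps({
    "event_type": "lead",
    "domain": "zahnarzt-zuerich.ch",
    "transaction_id": "tx-123",
    "click_id": "ab12cd34",
    "amount": 40.0,
    "currency": "CHF",
}).encode()
signature = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()

resp = httpx.post(
    "https://pounce.ch/api/v1/yield-webhooks/doctolib",  # slug and URL prefix are assumptions
    content=body,  # send the signed bytes verbatim
    headers={"content-type": "application/json", "x-webhook-signature": signature},
)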
def hash_ip(ip: str) -> str:
"""Hash IP address for privacy-compliant storage."""
return hashlib.sha256(f"{ip}|{settings.secret_key}".encode()).hexdigest()[:32]
def _get_webhook_secret(partner_slug: str) -> Optional[str]:
"""
Webhook secrets are configured via environment:
- YIELD_WEBHOOK_SECRET (global default)
- YIELD_WEBHOOK_SECRET_<PARTNER_SLUG_UPPER> (partner-specific override)
"""
import os
specific = os.getenv(f"YIELD_WEBHOOK_SECRET_{partner_slug.upper()}")
if specific:
return specific
return os.getenv("YIELD_WEBHOOK_SECRET") or None
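With both variables set, the partner-specific one wins; a quick illustration with placeholder values:

import os

os.environ["YIELD_WEBHOOK_SECRET"] = "global-secret"   # placeholder
os.environ["YIELD_WEBHOOK_SECRET_AWIN"] = "awin-only"  # placeholder

assert _get_webhook_secret("awin") == "awin-only"          # partner override
assert _get_webhook_secret("doctolib") == "global-secret"  # global fallback (slug hypothetical)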
# ============================================================================
# Generic Webhook Endpoint
# ============================================================================
@router.post("/{partner_slug}", response_model=WebhookResponse)
async def receive_partner_webhook(
partner_slug: str,
event: PartnerEvent,
request: Request,
background_tasks: BackgroundTasks,
db: AsyncSession = Depends(get_db),
x_webhook_signature: Optional[str] = Header(None),
x_api_key: Optional[str] = Header(None),
):
"""
Receive webhook callback from affiliate partner.
Partners POST events here when clicks, leads, or sales occur.
"""
# 1. Find partner
partner = (
await db.execute(
select(AffiliatePartner).where(
and_(
AffiliatePartner.slug == partner_slug,
AffiliatePartner.is_active == True,
)
)
)
).scalar_one_or_none()
if not partner:
logger.warning(f"Webhook from unknown partner: {partner_slug}")
raise HTTPException(status_code=404, detail="Unknown partner")
# 2. Verify authentication (strict)
secret = _get_webhook_secret(partner_slug)
if not secret:
raise HTTPException(status_code=503, detail="Webhook secret not configured on server.")
if not x_webhook_signature:
raise HTTPException(status_code=401, detail="Missing webhook signature.")
raw = await request.body()
if not verify_hmac_signature(raw, x_webhook_signature, secret):
raise HTTPException(status_code=401, detail="Invalid webhook signature.")
# 3. Find yield domain (must be active)
yield_domain = (
await db.execute(
select(YieldDomain).where(
and_(
YieldDomain.domain == event.domain.lower(),
YieldDomain.status == "active",
)
)
)
).scalar_one_or_none()
if not yield_domain:
logger.warning(f"Webhook for unknown/inactive domain: {event.domain}")
raise HTTPException(status_code=404, detail="Domain not found or inactive")
# 4. Calculate amounts
gross_amount = Decimal(str(event.amount)) if event.amount else Decimal("0")
net_amount = (gross_amount * USER_REVENUE_SHARE).quantize(Decimal("0.01"))  # whole cents, matching the routing path
# 5. Get client IP for hashing
client_ip = request.client.host if request.client else None
ip_hash = hash_ip(client_ip) if client_ip else None
# 6. Create transaction
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type=event.event_type,
partner_slug=partner_slug,
partner_transaction_id=event.transaction_id,
click_id=(event.click_id[:64] if event.click_id else None),
gross_amount=gross_amount,
net_amount=net_amount,
currency=event.currency or "CHF",
referrer=event.referrer,
user_agent=event.user_agent,
geo_country=event.geo_country,
ip_hash=ip_hash,
status="pending" if event.event_type in ["lead", "sale"] else "confirmed",
confirmed_at=datetime.utcnow() if event.event_type == "click" else None,
)
db.add(transaction)
# Optional: attribute to an existing click transaction (same yield_domain + click_id)
if event.click_id:
click_tx = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.yield_domain_id == yield_domain.id,
YieldTransaction.event_type == "click",
YieldTransaction.click_id == event.click_id[:64],
)
)
)
).scalar_one_or_none()
if not click_tx:
logger.warning(
f"Webhook received click_id but no matching click found: partner={partner_slug} "
f"domain={yield_domain.domain} click_id={event.click_id[:64]}"
)
# 7. Update domain aggregates
if event.event_type == "click":
yield_domain.total_clicks += 1
yield_domain.last_click_at = datetime.utcnow()
elif event.event_type in ["lead", "sale"]:
yield_domain.total_conversions += 1
yield_domain.last_conversion_at = datetime.utcnow()
# Add revenue when confirmed
if transaction.status == "confirmed":
yield_domain.total_revenue += net_amount
await track_event(
db,
event_name="yield_conversion",
request=request,
user_id=yield_domain.user_id,
is_authenticated=None,
source="webhook",
domain=yield_domain.domain,
yield_domain_id=yield_domain.id,
click_id=event.click_id,
metadata={
"partner": partner_slug,
"event_type": event.event_type,
"status": transaction.status,
"currency": transaction.currency,
"net_amount": float(net_amount),
"partner_transaction_id": event.transaction_id,
},
)
await db.commit()
await db.refresh(transaction)
logger.info(
f"Webhook processed: {partner_slug} -> {event.domain} "
f"({event.event_type}, gross={gross_amount}, net={net_amount})"
)
return WebhookResponse(
success=True,
transaction_id=transaction.id,
message=f"Event {event.event_type} recorded successfully"
)
# ============================================================================
# Awin-Specific Webhook
# ============================================================================
class AwinEvent(BaseModel):
"""Awin network postback format."""
clickRef: str # Our yield domain ID or domain name
transactionId: str
commission: float
commissionCurrency: str = "CHF"
status: str # "pending", "approved", "declined"
transactionType: str # "sale", "lead"
@router.post("/awin/postback", response_model=WebhookResponse)
async def receive_awin_postback(
event: AwinEvent,
request: Request,
db: AsyncSession = Depends(get_db),
x_awin_signature: Optional[str] = Header(None),
):
"""
Receive postback from Awin affiliate network.
Awin sends postbacks for tracked conversions.
"""
# Verify authentication (strict)
secret = _get_webhook_secret("awin")
if not secret:
raise HTTPException(status_code=503, detail="Webhook secret not configured on server.")
if not x_awin_signature:
raise HTTPException(status_code=401, detail="Missing webhook signature.")
raw = await request.body()
if not verify_hmac_signature(raw, x_awin_signature, secret):
raise HTTPException(status_code=401, detail="Invalid webhook signature.")
# Find domain by click reference
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.domain == event.clickRef.lower()))
).scalar_one_or_none()
if not yield_domain:
# Try to find by ID if clickRef is numeric
try:
domain_id = int(event.clickRef)
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.id == domain_id))
).scalar_one_or_none()
except ValueError:
pass
if not yield_domain:
logger.warning(f"Awin postback for unknown domain: {event.clickRef}")
raise HTTPException(status_code=404, detail="Domain not found")
# Calculate amounts
gross_amount = Decimal(str(event.commission))
net_amount = (gross_amount * USER_REVENUE_SHARE).quantize(Decimal("0.01"))
# Map Awin status to our status
status_map = {
"pending": "pending",
"approved": "confirmed",
"declined": "rejected",
}
status = status_map.get(event.status.lower(), "pending")
# Create or update transaction
existing_tx = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.partner_transaction_id == event.transactionId,
YieldTransaction.partner_slug.ilike("awin%"),
)
)
)
).scalar_one_or_none()
if existing_tx:
# Update existing transaction
existing_tx.status = status
if status == "confirmed":
existing_tx.confirmed_at = datetime.utcnow()
yield_domain.total_revenue += net_amount
transaction_id = existing_tx.id
else:
# Create new transaction
transaction = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type="lead" if event.transactionType.lower() == "lead" else "sale",
partner_slug=f"awin_{yield_domain.active_route or 'unknown'}",
partner_transaction_id=event.transactionId,
gross_amount=gross_amount,
net_amount=net_amount,
currency=event.commissionCurrency,
status=status,
confirmed_at=datetime.utcnow() if status == "confirmed" else None,
)
db.add(transaction)
# Update domain stats
yield_domain.total_conversions += 1
yield_domain.last_conversion_at = datetime.utcnow()
if status == "confirmed":
yield_domain.total_revenue += net_amount
await db.flush()
transaction_id = transaction.id
await db.commit()
logger.info(f"Awin postback processed: {event.transactionId} -> {status}")
return WebhookResponse(
success=True,
transaction_id=transaction_id,
message=f"Awin event processed ({status})"
)
# ============================================================================
# Transaction Confirmation Endpoint (Admin/Internal)
# ============================================================================
@router.post("/confirm/{transaction_id}", response_model=WebhookResponse)
async def confirm_transaction(
transaction_id: int,
db: AsyncSession = Depends(get_db),
x_internal_key: Optional[str] = Header(None),
):
"""
Manually confirm a pending transaction.
Internal endpoint for admin use or automated confirmation.
"""
internal_key = (settings.internal_api_key or "").strip()
if not internal_key:
raise HTTPException(status_code=503, detail="internal_api_key is not configured on server.")
if x_internal_key != internal_key:
raise HTTPException(status_code=401, detail="Unauthorized")
transaction = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.id == transaction_id,
YieldTransaction.status == "pending",
)
)
)
).scalar_one_or_none()
if not transaction:
raise HTTPException(status_code=404, detail="Transaction not found or not pending")
# Confirm transaction
transaction.status = "confirmed"
transaction.confirmed_at = datetime.utcnow()
# Update domain revenue
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.id == transaction.yield_domain_id))
).scalar_one_or_none()
if yield_domain:
yield_domain.total_revenue += transaction.net_amount
await db.commit()
logger.info(f"Transaction {transaction_id} confirmed manually")
return WebhookResponse(
success=True,
transaction_id=transaction_id,
message="Transaction confirmed"
)
# ============================================================================
# Batch Transaction Import (for reconciliation)
# ============================================================================
class BatchTransactionItem(BaseModel):
"""Single transaction in batch import."""
domain: str
event_type: str
partner_slug: str
transaction_id: str
click_id: Optional[str] = None
gross_amount: float
currency: str = "CHF"
status: str = "confirmed"
created_at: Optional[str] = None
class BatchImportRequest(BaseModel):
"""Batch transaction import request."""
transactions: list[BatchTransactionItem]
class BatchImportResponse(BaseModel):
"""Batch import response."""
success: bool
imported: int
skipped: int
errors: list[str]
@router.post("/batch-import", response_model=BatchImportResponse)
async def batch_import_transactions(
request_data: BatchImportRequest,
db: AsyncSession = Depends(get_db),
x_internal_key: Optional[str] = Header(None),
):
"""
Batch import transactions for reconciliation.
Internal endpoint for importing partner reports.
"""
internal_key = (settings.internal_api_key or "").strip()
if not internal_key:
raise HTTPException(status_code=503, detail="internal_api_key is not configured on server.")
if x_internal_key != internal_key:
raise HTTPException(status_code=401, detail="Unauthorized")
imported = 0
skipped = 0
errors = []
for item in request_data.transactions:
try:
# Find domain
yield_domain = (
await db.execute(select(YieldDomain).where(YieldDomain.domain == item.domain.lower()))
).scalar_one_or_none()
if not yield_domain:
errors.append(f"Domain not found: {item.domain}")
skipped += 1
continue
# Check for duplicate
existing = (
await db.execute(
select(YieldTransaction).where(
and_(
YieldTransaction.partner_transaction_id == item.transaction_id,
YieldTransaction.partner_slug == item.partner_slug,
)
)
)
).scalar_one_or_none()
if existing:
skipped += 1
continue
# Create transaction
gross = Decimal(str(item.gross_amount))
net = (gross * USER_REVENUE_SHARE).quantize(Decimal("0.01"))
tx = YieldTransaction(
yield_domain_id=yield_domain.id,
event_type=item.event_type,
partner_slug=item.partner_slug,
partner_transaction_id=item.transaction_id,
click_id=(item.click_id[:64] if item.click_id else None),
gross_amount=gross,
net_amount=net,
currency=item.currency,
status=item.status,
confirmed_at=datetime.utcnow() if item.status == "confirmed" else None,
)
db.add(tx)
# Update domain stats
if item.event_type == "click":
yield_domain.total_clicks += 1
else:
yield_domain.total_conversions += 1
if item.status == "confirmed":
yield_domain.total_revenue += net
imported += 1
except Exception as e:
errors.append(f"Error importing {item.domain}/{item.transaction_id}: {str(e)}")
skipped += 1
await db.commit()
return BatchImportResponse(
success=len(errors) == 0,
imported=imported,
skipped=skipped,
errors=errors[:10] # Limit error messages
)
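A reconciliation job could feed partner reports into this endpoint like so; a sketch assuming httpx and an INTERNAL_API_KEY env var matching settings.internal_api_key (slug and URL prefix are placeholders):

import os

import httpx

report = {
    "transactions": [{
        "domain": "zahnarzt-zuerich.ch",
        "event_type": "sale",
        "partner_slug": "doctolib",  # hypothetical slug
        "transaction_id": "rpt-2025-12-001",
        "gross_amount": 80.0,
        "currency": "CHF",
        "status": "confirmed",
    }]
}
resp = httpx.post(
    "https://pounce.ch/api/v1/yield-webhooks/batch-import",
    json=report,
    headers={"x-internal-key": os.environ["INTERNAL_API_KEY"]},
)
print(resp.json())  # e.g. {"success": true, "imported": 1, "skipped": 0, "errors": []}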

View File

@ -17,6 +17,11 @@ class Settings(BaseSettings):
# App Settings
app_name: str = "DomainWatch"
debug: bool = True
site_url: str = "https://pounce.ch" # Base URL for links in emails/API responses
# Internal admin operations (server-to-server / cron)
# MUST be set in production; used for protected internal endpoints.
internal_api_key: str = ""
# Email Settings (optional)
smtp_host: str = ""
@ -42,10 +47,50 @@ class Settings(BaseSettings):
enable_metrics: bool = True
metrics_path: str = "/metrics"
enable_db_query_metrics: bool = False
enable_business_metrics: bool = True
business_metrics_days: int = 30
business_metrics_cache_seconds: int = 60
# Ops / Backups (4B)
enable_db_backups: bool = False
backup_dir: str = "backups"
backup_retention_days: int = 14
# Ops / Alerting (4B) - no Docker required
ops_alerts_enabled: bool = False
ops_alert_recipients: str = "" # comma-separated emails; if empty -> CONTACT_EMAIL env fallback
ops_alert_cooldown_minutes: int = 180
ops_alert_backup_stale_seconds: int = 93600 # ~26h
# Rate limiting storage (SlowAPI / limits). Use Redis in production.
rate_limit_storage_uri: str = "memory://"
# =================================
# Referral rewards / Anti-fraud (3C.2)
# =================================
referral_rewards_enabled: bool = True
referral_rewards_cooldown_days: int = 7
referral_rewards_ip_window_days: int = 30
referral_rewards_require_ip_hash: bool = True
# =================================
# Yield / Intent Routing
# =================================
# Comma-separated list of nameservers the user must delegate to for Yield.
# Example: "ns1.pounce.io,ns2.pounce.io"
yield_nameservers: str = "ns1.pounce.io,ns2.pounce.io"
# CNAME/ALIAS target for simpler DNS setup (provider-dependent).
# Example: "yield.pounce.io"
yield_cname_target: str = "yield.pounce.io"
@property
def yield_nameserver_list(self) -> list[str]:
return [
ns.strip().lower()
for ns in (self.yield_nameservers or "").split(",")
if ns.strip()
]
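The property trims, lowercases, and drops empty entries, so slightly sloppy env values still parse; for example:

s = Settings(yield_nameservers=" NS1.pounce.io , ns2.pounce.io ,, ")
assert s.yield_nameserver_list == ["ns1.pounce.io", "ns2.pounce.io"]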
# Database pooling (PostgreSQL)
db_pool_size: int = 5
db_max_overflow: int = 10
@ -72,6 +117,13 @@ class Settings(BaseSettings):
moz_access_id: str = ""
moz_secret_key: str = ""
# ICANN CZDS (Centralized Zone Data Service)
# For downloading gTLD zone files (.com, .net, .org, etc.)
# Register at: https://czds.icann.org/
czds_username: str = ""
czds_password: str = ""
czds_data_dir: str = "/tmp/pounce_czds"
class Config:
env_file = ".env"
env_file_encoding = "utf-8"

View File

@ -120,12 +120,251 @@ async def apply_migrations(conn: AsyncConnection) -> None:
# 4) domain_listings pounce_score index (market sorting)
# ----------------------------------------------------
if await _table_exists(conn, "domain_listings"):
if not await _has_column(conn, "domain_listings", "sold_at"):
logger.info("DB migrations: adding column domain_listings.sold_at")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_at DATETIME"))
if not await _has_column(conn, "domain_listings", "sold_reason"):
logger.info("DB migrations: adding column domain_listings.sold_reason")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_reason VARCHAR(200)"))
if not await _has_column(conn, "domain_listings", "sold_price"):
logger.info("DB migrations: adding column domain_listings.sold_price")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_price FLOAT"))
if not await _has_column(conn, "domain_listings", "sold_currency"):
logger.info("DB migrations: adding column domain_listings.sold_currency")
await conn.execute(text("ALTER TABLE domain_listings ADD COLUMN sold_currency VARCHAR(3)"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_domain_listings_pounce_score "
"ON domain_listings(pounce_score)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_domain_listings_status "
"ON domain_listings(status)"
)
)
# ----------------------------------------------------
# 4b) listing_inquiries: deal workflow + audit trail
# ----------------------------------------------------
if await _table_exists(conn, "listing_inquiries"):
if not await _has_column(conn, "listing_inquiries", "buyer_user_id"):
logger.info("DB migrations: adding column listing_inquiries.buyer_user_id")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN buyer_user_id INTEGER"))
if not await _has_column(conn, "listing_inquiries", "closed_at"):
logger.info("DB migrations: adding column listing_inquiries.closed_at")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN closed_at DATETIME"))
if not await _has_column(conn, "listing_inquiries", "closed_reason"):
logger.info("DB migrations: adding column listing_inquiries.closed_reason")
await conn.execute(text("ALTER TABLE listing_inquiries ADD COLUMN closed_reason VARCHAR(200)"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_listing_created "
"ON listing_inquiries(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_listing_status "
"ON listing_inquiries(listing_id, status)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiries_buyer_user "
"ON listing_inquiries(buyer_user_id)"
)
)
# The table itself is created by `Base.metadata.create_all()` on startup.
# Here we only add indexes (idempotent) for existing DBs.
if await _table_exists(conn, "listing_inquiry_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_events_inquiry_created "
"ON listing_inquiry_events(inquiry_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_events_listing_created "
"ON listing_inquiry_events(listing_id, created_at)"
)
)
if await _table_exists(conn, "listing_inquiry_messages"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_inquiry_created "
"ON listing_inquiry_messages(inquiry_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_listing_created "
"ON listing_inquiry_messages(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_listing_inquiry_messages_sender_created "
"ON listing_inquiry_messages(sender_user_id, created_at)"
)
)
# ----------------------------------------------------
# 5) Yield tables indexes
# ----------------------------------------------------
if await _table_exists(conn, "yield_domains"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_domains_user_status "
"ON yield_domains(user_id, status)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_domains_domain "
"ON yield_domains(domain)"
)
)
if not await _has_column(conn, "yield_domains", "connected_at"):
logger.info("DB migrations: adding column yield_domains.connected_at")
await conn.execute(text("ALTER TABLE yield_domains ADD COLUMN connected_at DATETIME"))
if await _table_exists(conn, "yield_transactions"):
if not await _has_column(conn, "yield_transactions", "click_id"):
logger.info("DB migrations: adding column yield_transactions.click_id")
await conn.execute(text("ALTER TABLE yield_transactions ADD COLUMN click_id VARCHAR(64)"))
await conn.execute(text("CREATE INDEX IF NOT EXISTS ix_yield_transactions_click_id ON yield_transactions(click_id)"))
if not await _has_column(conn, "yield_transactions", "destination_url"):
logger.info("DB migrations: adding column yield_transactions.destination_url")
await conn.execute(text("ALTER TABLE yield_transactions ADD COLUMN destination_url TEXT"))
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_tx_domain_created "
"ON yield_transactions(yield_domain_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_tx_status_created "
"ON yield_transactions(status, created_at)"
)
)
if await _table_exists(conn, "yield_payouts"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_yield_payouts_user_status "
"ON yield_payouts(user_id, status)"
)
)
# ----------------------------------------------------
# 6) Referral rewards: subscriptions.referral_bonus_domains (3C.2)
# ----------------------------------------------------
if await _table_exists(conn, "subscriptions"):
if not await _has_column(conn, "subscriptions", "referral_bonus_domains"):
logger.info("DB migrations: adding column subscriptions.referral_bonus_domains")
await conn.execute(
text(
"ALTER TABLE subscriptions "
"ADD COLUMN referral_bonus_domains INTEGER NOT NULL DEFAULT 0"
)
)
# ----------------------------------------------------
# 6a) Telemetry events indexes
# ----------------------------------------------------
if await _table_exists(conn, "telemetry_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_event_name_created "
"ON telemetry_events(event_name, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_user_created "
"ON telemetry_events(user_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_listing_created "
"ON telemetry_events(listing_id, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_telemetry_yield_created "
"ON telemetry_events(yield_domain_id, created_at)"
)
)
# ----------------------------------------------------
# 6b) Ops alert events (persisted cooldown + history)
# ----------------------------------------------------
# NOTE: Table is created by Base.metadata.create_all() for new installs.
# Here we ensure indexes exist for older DBs.
if await _table_exists(conn, "ops_alert_events"):
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_ops_alert_key_created "
"ON ops_alert_events(alert_key, created_at)"
)
)
await conn.execute(
text(
"CREATE INDEX IF NOT EXISTS ix_ops_alert_status_created "
"ON ops_alert_events(status, created_at)"
)
)
# ----------------------------------------------------
# 7) User referral tracking columns
# ----------------------------------------------------
if await _table_exists(conn, "users"):
if not await _has_column(conn, "users", "referred_by_user_id"):
logger.info("DB migrations: adding column users.referred_by_user_id")
await conn.execute(text("ALTER TABLE users ADD COLUMN referred_by_user_id INTEGER"))
if not await _has_column(conn, "users", "referred_by_domain"):
logger.info("DB migrations: adding column users.referred_by_domain")
await conn.execute(text("ALTER TABLE users ADD COLUMN referred_by_domain VARCHAR(255)"))
if not await _has_column(conn, "users", "referral_code"):
logger.info("DB migrations: adding column users.referral_code")
await conn.execute(text("ALTER TABLE users ADD COLUMN referral_code VARCHAR(100)"))
if not await _has_column(conn, "users", "invite_code"):
logger.info("DB migrations: adding column users.invite_code")
await conn.execute(text("ALTER TABLE users ADD COLUMN invite_code VARCHAR(32)"))
# Unique index for invite_code (SQLite + Postgres)
await conn.execute(text("CREATE UNIQUE INDEX IF NOT EXISTS ix_users_invite_code ON users(invite_code)"))
# ----------------------------------------------------
# 8) Portfolio DNS verification columns
# ----------------------------------------------------
if await _table_exists(conn, "portfolio_domains"):
if not await _has_column(conn, "portfolio_domains", "is_dns_verified"):
logger.info("DB migrations: adding column portfolio_domains.is_dns_verified")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN is_dns_verified BOOLEAN DEFAULT 0"))
if not await _has_column(conn, "portfolio_domains", "verification_status"):
logger.info("DB migrations: adding column portfolio_domains.verification_status")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_status VARCHAR(50) DEFAULT 'unverified'"))
if not await _has_column(conn, "portfolio_domains", "verification_code"):
logger.info("DB migrations: adding column portfolio_domains.verification_code")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_code VARCHAR(100)"))
if not await _has_column(conn, "portfolio_domains", "verification_started_at"):
logger.info("DB migrations: adding column portfolio_domains.verification_started_at")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verification_started_at DATETIME"))
if not await _has_column(conn, "portfolio_domains", "verified_at"):
logger.info("DB migrations: adding column portfolio_domains.verified_at")
await conn.execute(text("ALTER TABLE portfolio_domains ADD COLUMN verified_at DATETIME"))
logger.info("DB migrations: done")

View File

@ -49,8 +49,8 @@ async def lifespan(app: FastAPI):
# Start scheduler (optional - recommended: run in separate process/container)
if settings.enable_scheduler:
start_scheduler()
logger.info("Scheduler started")
else:
logger.info("Scheduler disabled (ENABLE_SCHEDULER=false)")
@ -58,7 +58,7 @@ async def lifespan(app: FastAPI):
# Shutdown
if settings.enable_scheduler:
stop_scheduler()
logger.info("Application shutdown complete")
@ -90,7 +90,7 @@ Login: POST /api/v1/auth/login
## Support
For API issues, contact hello@pounce.ch
""",
version="1.0.0",
lifespan=lifespan,
@ -168,6 +168,22 @@ async def health_check():
}
@app.get("/api/health")
async def health_check_api():
"""
Health check behind Nginx `/api` proxy.
Nginx routes `/api/*` to the backend, so `https://pounce.ch/api/health` must exist.
"""
return await health_check()
@app.get("/api/v1/health")
async def health_check_api_v1():
"""Health check behind `/api/v1` prefix (convenience)."""
return await health_check()
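# Illustrative smoke test for the alias routes (assumptions: the app object is
# importable as `app.main.app` and a bare `/health` route exists):
from fastapi.testclient import TestClient
from app.main import app

def test_health_aliases_agree():
    client = TestClient(app)
    base = client.get("/health").json()
    assert client.get("/api/health").json() == base
    assert client.get("/api/v1/health").json() == base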
# Rate-limited endpoints - apply specific limits to sensitive routes
from fastapi import Depends

View File

@ -13,6 +13,10 @@ from app.models.listing import DomainListing, ListingInquiry, ListingView
from app.models.sniper_alert import SniperAlert, SniperAlertMatch
from app.models.seo_data import DomainSEOData
from app.models.yield_domain import YieldDomain, YieldTransaction, YieldPayout, AffiliatePartner
from app.models.telemetry import TelemetryEvent
from app.models.ops_alert import OpsAlertEvent
from app.models.domain_analysis_cache import DomainAnalysisCache
from app.models.zone_file import ZoneSnapshot, DroppedDomain
__all__ = [
"User",
@ -43,4 +47,12 @@ __all__ = [
"YieldTransaction",
"YieldPayout",
"AffiliatePartner",
# New: Telemetry (events)
"TelemetryEvent",
"OpsAlertEvent",
# New: Analyze cache
"DomainAnalysisCache",
# New: Zone file drops
"ZoneSnapshot",
"DroppedDomain",
]

View File

@ -2,7 +2,7 @@
from datetime import datetime
from enum import Enum
from sqlalchemy import String, Boolean, DateTime, ForeignKey, Text, Enum as SQLEnum
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.orm import Mapped, mapped_column, relationship, backref
from app.database import Base
@ -116,8 +116,11 @@ class DomainHealthCache(Base):
# Timestamp
checked_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
# Relationship
domain: Mapped["Domain"] = relationship("Domain", backref="health_cache")
# Relationship - cascade delete when domain is deleted
domain: Mapped["Domain"] = relationship(
"Domain",
backref=backref("health_cache", cascade="all, delete-orphan", uselist=False)
)
def __repr__(self) -> str:
return f"<DomainHealthCache {self.domain_id} status={self.status}>"

View File

@ -0,0 +1,25 @@
"""
Domain analysis cache (Phase 2 Diligence).
We store computed JSON to avoid repeated RDAP/DNS/HTTP checks on each click.
"""
from __future__ import annotations
from datetime import datetime
from sqlalchemy import DateTime, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class DomainAnalysisCache(Base):
__tablename__ = "domain_analysis_cache"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
domain: Mapped[str] = mapped_column(String(255), unique=True, index=True, nullable=False)
payload_json: Mapped[str] = mapped_column(Text, nullable=False)
computed_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
ttl_seconds: Mapped[int] = mapped_column(Integer, default=3600)
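# Sketch of the read path this cache implies (the real lookup helper is not in
# this diff): trust payload_json only while the per-row TTL still holds.
import json
from datetime import datetime, timedelta
from sqlalchemy import select

async def get_cached_analysis(db, domain: str) -> dict | None:
    row = (
        await db.execute(
            select(DomainAnalysisCache).where(DomainAnalysisCache.domain == domain.lower())
        )
    ).scalar_one_or_none()
    if row is None:
        return None
    if datetime.utcnow() - row.computed_at > timedelta(seconds=row.ttl_seconds):
        return None  # stale: caller re-runs RDAP/DNS/HTTP and overwrites payload_json
    return json.loads(row.payload_json)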

View File

@ -91,6 +91,10 @@ class DomainListing(Base):
# Status
status: Mapped[str] = mapped_column(String(30), default=ListingStatus.DRAFT.value, index=True)
sold_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
sold_reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
sold_price: Mapped[Optional[float]] = mapped_column(Float, nullable=True)
sold_currency: Mapped[Optional[str]] = mapped_column(String(3), nullable=True)
# Features
show_valuation: Mapped[bool] = mapped_column(Boolean, default=True)
@ -147,6 +151,7 @@ class ListingInquiry(Base):
id: Mapped[int] = mapped_column(primary_key=True, index=True)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
buyer_user_id: Mapped[Optional[int]] = mapped_column(ForeignKey("users.id"), index=True, nullable=True)
# Inquirer info
name: Mapped[str] = mapped_column(String(100), nullable=False)
@ -159,7 +164,8 @@ class ListingInquiry(Base):
offer_amount: Mapped[Optional[float]] = mapped_column(Float, nullable=True)
# Status
status: Mapped[str] = mapped_column(String(20), default="new") # new, read, replied, closed, spam
closed_reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
# Tracking
ip_address: Mapped[Optional[str]] = mapped_column(String(45), nullable=True)
@ -169,14 +175,72 @@ class ListingInquiry(Base):
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
read_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
replied_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
closed_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Relationships
listing: Mapped["DomainListing"] = relationship("DomainListing", back_populates="inquiries")
messages: Mapped[List["ListingInquiryMessage"]] = relationship(
"ListingInquiryMessage", back_populates="inquiry", cascade="all, delete-orphan"
)
events: Mapped[List["ListingInquiryEvent"]] = relationship(
"ListingInquiryEvent", back_populates="inquiry", cascade="all, delete-orphan"
)
def __repr__(self) -> str:
return f"<ListingInquiry from {self.email} for listing #{self.listing_id}>"
class ListingInquiryEvent(Base):
"""
Audit trail for inquiry status changes.
This is the minimal “deal system” log:
- who changed what status
- when it happened
- optional reason (close/spam)
"""
__tablename__ = "listing_inquiry_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
inquiry_id: Mapped[int] = mapped_column(ForeignKey("listing_inquiries.id"), index=True, nullable=False)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
actor_user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
old_status: Mapped[Optional[str]] = mapped_column(String(20), nullable=True)
new_status: Mapped[str] = mapped_column(String(20), nullable=False)
reason: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
ip_address: Mapped[Optional[str]] = mapped_column(String(45), nullable=True)
user_agent: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
inquiry: Mapped["ListingInquiry"] = relationship("ListingInquiry", back_populates="events")
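# Sketch of a status-transition helper writing to this audit trail (not part
# of this diff): one event row per change, saved alongside the status update.
async def record_inquiry_event(db, inquiry: "ListingInquiry", actor_user_id: int,
                               new_status: str, reason: str | None = None) -> None:
    db.add(
        ListingInquiryEvent(
            inquiry_id=inquiry.id,
            listing_id=inquiry.listing_id,
            actor_user_id=actor_user_id,
            old_status=inquiry.status,
            new_status=new_status,
            reason=reason,
        )
    )
    inquiry.status = new_status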
class ListingInquiryMessage(Base):
"""
Thread messages for listing inquiries (in-product negotiation).
- Buyer sends messages from their account
- Seller replies from Terminal
"""
__tablename__ = "listing_inquiry_messages"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
inquiry_id: Mapped[int] = mapped_column(ForeignKey("listing_inquiries.id"), index=True, nullable=False)
listing_id: Mapped[int] = mapped_column(ForeignKey("domain_listings.id"), index=True, nullable=False)
sender_user_id: Mapped[int] = mapped_column(ForeignKey("users.id"), index=True, nullable=False)
body: Mapped[str] = mapped_column(Text, nullable=False)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
inquiry: Mapped["ListingInquiry"] = relationship("ListingInquiry", back_populates="messages")
class ListingView(Base):
"""
Track listing page views for analytics.

View File

@ -0,0 +1,40 @@
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import DateTime, Index, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class OpsAlertEvent(Base):
"""
Persisted ops alert events.
Used for:
- cooldown across process restarts
- audit/history in admin UI
"""
__tablename__ = "ops_alert_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
alert_key: Mapped[str] = mapped_column(String(80), nullable=False, index=True)
severity: Mapped[str] = mapped_column(String(10), nullable=False, index=True) # "warn" | "page"
title: Mapped[str] = mapped_column(String(200), nullable=False)
detail: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
# "sent" | "skipped" | "error"
status: Mapped[str] = mapped_column(String(20), nullable=False, index=True)
recipients: Mapped[Optional[str]] = mapped_column(Text, nullable=True) # comma-separated
send_reason: Mapped[Optional[str]] = mapped_column(String(60), nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
__table_args__ = (
Index("ix_ops_alert_key_created", "alert_key", "created_at"),
Index("ix_ops_alert_status_created", "status", "created_at"),
)
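# Sketch of the restart-safe cooldown check this table enables (the window
# length here is an assumption): suppress while a recent "sent" row exists.
from datetime import datetime, timedelta
from sqlalchemy import select

async def is_in_cooldown(db, alert_key: str, cooldown_minutes: int = 60) -> bool:
    cutoff = datetime.utcnow() - timedelta(minutes=cooldown_minutes)
    row = (
        await db.execute(
            select(OpsAlertEvent.id)
            .where(
                OpsAlertEvent.alert_key == alert_key,
                OpsAlertEvent.status == "sent",
                OpsAlertEvent.created_at >= cutoff,
            )
            .limit(1)
        )
    ).scalar_one_or_none()
    return row is not None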

View File

@ -45,6 +45,14 @@ class PortfolioDomain(Base):
# Status
status: Mapped[str] = mapped_column(String(50), default="active") # active, expired, sold, parked
# DNS Verification (required for Yield and For Sale)
# All fields nullable=True to avoid migration issues on existing databases
is_dns_verified: Mapped[Optional[bool]] = mapped_column(Boolean, default=False, nullable=True)
verification_status: Mapped[Optional[str]] = mapped_column(String(50), default="unverified", nullable=True)
verification_code: Mapped[Optional[str]] = mapped_column(String(100), nullable=True)
verification_started_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
verified_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# Notes
notes: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
tags: Mapped[Optional[str]] = mapped_column(String(500), nullable=True) # Comma-separated

View File

@ -123,6 +123,8 @@ class Subscription(Base):
# Limits (can be overridden)
max_domains: Mapped[int] = mapped_column(Integer, default=5)
# Referral reward bonus (3C.2): additive, computed deterministically from qualified referrals
referral_bonus_domains: Mapped[int] = mapped_column(Integer, default=0)
check_frequency: Mapped[str] = mapped_column(String(50), default="daily")
# Stripe integration
@ -167,7 +169,9 @@ class Subscription(Base):
@property
def domain_limit(self) -> int:
"""Get maximum allowed domains for this subscription."""
base = int(self.max_domains or self.config["domain_limit"] or 0)
bonus = int(self.referral_bonus_domains or 0)
return max(0, base + bonus)
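# Quick illustration with made-up numbers (not from this diff):
#   Subscription(max_domains=25, referral_bonus_domains=10).domain_limit == 35
# i.e. the referral bonus is purely additive on top of the plan's base cap.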
@property
def portfolio_limit(self) -> int:

View File

@ -0,0 +1,56 @@
"""
Telemetry events (4A).
Store canonical product events for funnel KPIs:
- Deal funnel: listing_view → inquiry_created → message_sent → listing_marked_sold
- Yield funnel: yield_connected → yield_click → yield_conversion → payout_paid
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from sqlalchemy import Boolean, DateTime, ForeignKey, Index, Integer, String, Text
from sqlalchemy.orm import Mapped, mapped_column
from app.database import Base
class TelemetryEvent(Base):
__tablename__ = "telemetry_events"
id: Mapped[int] = mapped_column(primary_key=True, index=True)
# Who
user_id: Mapped[Optional[int]] = mapped_column(ForeignKey("users.id"), nullable=True, index=True)
# What
event_name: Mapped[str] = mapped_column(String(60), nullable=False, index=True)
# Entity links (optional)
listing_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
inquiry_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
yield_domain_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True, index=True)
click_id: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
domain: Mapped[Optional[str]] = mapped_column(String(255), nullable=True, index=True)
# Context
source: Mapped[Optional[str]] = mapped_column(String(30), nullable=True) # "public" | "terminal" | "webhook" | "scheduler" | "admin"
ip_hash: Mapped[Optional[str]] = mapped_column(String(64), nullable=True)
user_agent: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
referrer: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
metadata_json: Mapped[Optional[str]] = mapped_column(Text, nullable=True) # JSON string
# Flags
is_authenticated: Mapped[Optional[bool]] = mapped_column(Boolean, nullable=True)
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow, index=True)
__table_args__ = (
Index("ix_telemetry_event_name_created", "event_name", "created_at"),
Index("ix_telemetry_user_created", "user_id", "created_at"),
Index("ix_telemetry_listing_created", "listing_id", "created_at"),
Index("ix_telemetry_yield_created", "yield_domain_id", "created_at"),
)
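# Sketch of a writer for these canonical events (the real tracking service is
# not shown in this diff); metadata is stored as a JSON string per the model.
import json

async def track_event(db, event_name: str, *, user_id: int | None = None,
                      listing_id: int | None = None, inquiry_id: int | None = None,
                      yield_domain_id: int | None = None,
                      metadata: dict | None = None) -> None:
    db.add(
        TelemetryEvent(
            event_name=event_name,
            user_id=user_id,
            listing_id=listing_id,
            inquiry_id=inquiry_id,
            yield_domain_id=yield_domain_id,
            metadata_json=json.dumps(metadata) if metadata else None,
        )
    )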

View File

@ -1,7 +1,7 @@
"""User model."""
from datetime import datetime
from typing import Optional, List
from sqlalchemy import String, Boolean, DateTime
from sqlalchemy import String, Boolean, DateTime, Integer
from sqlalchemy.orm import Mapped, mapped_column, relationship
from app.database import Base
@ -40,6 +40,12 @@ class User(Base):
oauth_id: Mapped[Optional[str]] = mapped_column(String(255), nullable=True)
oauth_avatar: Mapped[Optional[str]] = mapped_column(String(500), nullable=True)
# Yield Referral Tracking (for viral growth)
referred_by_user_id: Mapped[Optional[int]] = mapped_column(Integer, nullable=True) # User who referred this user
referred_by_domain: Mapped[Optional[str]] = mapped_column(String(255), nullable=True) # Domain that referred
referral_code: Mapped[Optional[str]] = mapped_column(String(100), nullable=True) # Original referral code
invite_code: Mapped[Optional[str]] = mapped_column(String(32), nullable=True, unique=True, index=True) # user's own code
# Timestamps
created_at: Mapped[datetime] = mapped_column(DateTime, default=datetime.utcnow)
updated_at: Mapped[datetime] = mapped_column(

View File

@ -105,6 +105,8 @@ class YieldDomain(Base):
dns_verified: Mapped[bool] = mapped_column(Boolean, default=False)
dns_verified_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
# "Connect" timestamp for Yield (nameserver/CNAME verified)
connected_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
activated_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
paused_at: Mapped[Optional[datetime]] = mapped_column(DateTime, nullable=True)
@ -142,13 +144,6 @@ class YieldDomain(Base):
"""Check if domain is actively earning."""
return self.status == "active" and self.dns_verified
@property
def monthly_revenue(self) -> Decimal:
"""Estimate monthly revenue (placeholder - should compute from transactions)."""
# In production: calculate from last 30 days of transactions
return self.total_revenue
class YieldTransaction(Base):
"""
Revenue events from affiliate partners.
@ -170,6 +165,9 @@ class YieldTransaction(Base):
# Partner info
partner_slug: Mapped[str] = mapped_column(String(50), nullable=False)
partner_transaction_id: Mapped[Optional[str]] = mapped_column(String(200), nullable=True)
# Our click id for attribution across systems (UUID string)
click_id: Mapped[Optional[str]] = mapped_column(String(64), nullable=True, index=True)
destination_url: Mapped[Optional[str]] = mapped_column(Text, nullable=True)
# Amount
gross_amount: Mapped[Decimal] = mapped_column(Numeric(10, 2), default=0) # Full commission
@ -200,6 +198,7 @@ class YieldTransaction(Base):
__table_args__ = (
Index("ix_yield_tx_domain_created", "yield_domain_id", "created_at"),
Index("ix_yield_tx_status_created", "status", "created_at"),
Index("ix_yield_tx_click_id", "click_id"),
)
def __repr__(self) -> str:

View File

@ -0,0 +1,43 @@
"""
Zone File Models for .ch and .li domain drops
"""
from datetime import datetime
from sqlalchemy import Column, Integer, String, DateTime, Boolean, Index
from app.database import Base
class ZoneSnapshot(Base):
"""Stores metadata about zone file snapshots (not the full data)"""
__tablename__ = "zone_snapshots"
id = Column(Integer, primary_key=True)
tld = Column(String(10), nullable=False, index=True) # 'ch' or 'li'
snapshot_date = Column(DateTime, nullable=False, index=True)
domain_count = Column(Integer, nullable=False)
checksum = Column(String(64), nullable=False) # SHA256 of sorted domain list
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('ix_zone_snapshots_tld_date', 'tld', 'snapshot_date'),
)
class DroppedDomain(Base):
"""Stores domains that were dropped (found in previous snapshot but not current)"""
__tablename__ = "dropped_domains"
id = Column(Integer, primary_key=True)
domain = Column(String(255), nullable=False, index=True)
tld = Column(String(10), nullable=False, index=True)
dropped_date = Column(DateTime, nullable=False, index=True)
length = Column(Integer, nullable=False)
is_numeric = Column(Boolean, default=False)
has_hyphen = Column(Boolean, default=False)
created_at = Column(DateTime, default=datetime.utcnow)
__table_args__ = (
Index('ix_dropped_domains_tld_date', 'tld', 'dropped_date'),
Index('ix_dropped_domains_length', 'length'),
)
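# Sketch of the drop computation these models imply: set difference between
# consecutive snapshots plus the SHA256 of the sorted domain list (the exact
# serialization fed to the hash is an assumption).
import hashlib

def diff_snapshots(previous: set[str], current: set[str]) -> tuple[set[str], str]:
    dropped = previous - current  # present before, gone now
    checksum = hashlib.sha256("\n".join(sorted(current)).encode()).hexdigest()
    return dropped, checksum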

View File

@ -0,0 +1,304 @@
"""
Business KPIs exported as Prometheus metrics (4B Ops).
These KPIs are derived from real telemetry events in the database.
We cache computations to avoid putting load on the DB on every scrape.
"""
from __future__ import annotations
import json
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Any, Optional
from sqlalchemy import and_, func, select
from app.config import get_settings
from app.database import AsyncSessionLocal
from app.models.telemetry import TelemetryEvent
settings = get_settings()
try:
from prometheus_client import Gauge
except Exception: # pragma: no cover
Gauge = None # type: ignore
@dataclass(frozen=True)
class TelemetryWindowKpis:
window_days: int
start: datetime
end: datetime
# Deal
listing_views: int
inquiries_created: int
seller_replied_inquiries: int
inquiry_reply_rate: float
listings_with_inquiries: int
listings_sold: int
inquiry_to_sold_listing_rate: float
# Yield
connected_domains: int
clicks: int
conversions: int
conversion_rate: float
payouts_paid: int
payouts_paid_amount_total: float
_cache_until_by_days: dict[int, datetime] = {}
_cache_value_by_days: dict[int, TelemetryWindowKpis] = {}
def _safe_json(metadata_json: Optional[str]) -> dict[str, Any]:
if not metadata_json:
return {}
try:
value = json.loads(metadata_json)
return value if isinstance(value, dict) else {}
except Exception:
return {}
async def _compute_window_kpis(days: int) -> TelemetryWindowKpis:
end = datetime.utcnow()
start = end - timedelta(days=days)
async with AsyncSessionLocal() as db:
# Fast path: grouped counts for pure counter events
count_events = [
"listing_view",
"inquiry_created",
"yield_connected",
"yield_click",
"yield_conversion",
"payout_paid",
]
grouped = (
await db.execute(
select(TelemetryEvent.event_name, func.count(TelemetryEvent.id))
.where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name.in_(count_events),
)
)
.group_by(TelemetryEvent.event_name)
)
).all()
counts = {name: int(cnt) for name, cnt in grouped}
listing_views = counts.get("listing_view", 0)
inquiries_created = counts.get("inquiry_created", 0)
connected_domains = counts.get("yield_connected", 0)
clicks = counts.get("yield_click", 0)
conversions = counts.get("yield_conversion", 0)
payouts_paid = counts.get("payout_paid", 0)
# Distinct listing counts (deal)
listings_with_inquiries = (
await db.execute(
select(func.count(func.distinct(TelemetryEvent.listing_id))).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "inquiry_created",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalar() or 0
listings_sold = (
await db.execute(
select(func.count(func.distinct(TelemetryEvent.listing_id))).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "listing_marked_sold",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalar() or 0
# For rates we need intersections/uniques; keep it exact via minimal event fetch
inquiry_listing_ids = (
await db.execute(
select(func.distinct(TelemetryEvent.listing_id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "inquiry_created",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalars().all()
sold_listing_ids = (
await db.execute(
select(func.distinct(TelemetryEvent.listing_id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "listing_marked_sold",
TelemetryEvent.listing_id.isnot(None),
)
)
)
).scalars().all()
inquiry_set = {int(x) for x in inquiry_listing_ids if x is not None}
sold_set = {int(x) for x in sold_listing_ids if x is not None}
sold_from_inquiry = inquiry_set.intersection(sold_set)
inquiry_to_sold_listing_rate = (len(sold_from_inquiry) / len(inquiry_set)) if inquiry_set else 0.0
# Seller reply rate: unique inquiries with at least one seller message
msg_rows = (
await db.execute(
select(TelemetryEvent.inquiry_id, TelemetryEvent.metadata_json).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "message_sent",
TelemetryEvent.inquiry_id.isnot(None),
)
)
)
).all()
seller_replied_inquiries_set: set[int] = set()
for inquiry_id, metadata_json in msg_rows:
if inquiry_id is None:
continue
meta = _safe_json(metadata_json)
if meta.get("role") == "seller":
seller_replied_inquiries_set.add(int(inquiry_id))
seller_replied_inquiries = len(seller_replied_inquiries_set)
inquiry_reply_rate = (seller_replied_inquiries / inquiries_created) if inquiries_created else 0.0
# Payout amounts (sum of metadata amounts)
payout_rows = (
await db.execute(
select(TelemetryEvent.metadata_json).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "payout_paid",
TelemetryEvent.metadata_json.isnot(None),
)
)
)
).scalars().all()
payouts_paid_amount_total = 0.0
for metadata_json in payout_rows:
meta = _safe_json(metadata_json)
amount = meta.get("amount")
if isinstance(amount, (int, float)):
payouts_paid_amount_total += float(amount)
conversion_rate = (conversions / clicks) if clicks else 0.0
return TelemetryWindowKpis(
window_days=days,
start=start,
end=end,
listing_views=int(listing_views),
inquiries_created=int(inquiries_created),
seller_replied_inquiries=int(seller_replied_inquiries),
inquiry_reply_rate=float(inquiry_reply_rate),
listings_with_inquiries=int(listings_with_inquiries),
listings_sold=int(listings_sold),
inquiry_to_sold_listing_rate=float(inquiry_to_sold_listing_rate),
connected_domains=int(connected_domains),
clicks=int(clicks),
conversions=int(conversions),
conversion_rate=float(conversion_rate),
payouts_paid=int(payouts_paid),
payouts_paid_amount_total=float(payouts_paid_amount_total),
)
async def get_cached_window_kpis(days: int) -> Optional[TelemetryWindowKpis]:
"""Return cached KPIs for a window (recompute if TTL expired)."""
if not settings.enable_business_metrics:
return None
now = datetime.utcnow()
until = _cache_until_by_days.get(days)
cached = _cache_value_by_days.get(days)
if until is not None and cached is not None and now < until:
return cached
value = await _compute_window_kpis(int(days))
ttl_seconds = max(5, int(settings.business_metrics_cache_seconds))
_cache_until_by_days[int(days)] = now + timedelta(seconds=ttl_seconds)
_cache_value_by_days[int(days)] = value
return value
# -----------------------------
# Prometheus Gauges
# -----------------------------
if Gauge is not None:
_g = {
"deal_listing_views": Gauge("pounce_deal_listing_views", "Deal: listing views in window", ["window_days"]),
"deal_inquiries_created": Gauge("pounce_deal_inquiries_created", "Deal: inquiries created in window", ["window_days"]),
"deal_seller_replied_inquiries": Gauge(
"pounce_deal_seller_replied_inquiries", "Deal: inquiries with seller reply in window", ["window_days"]
),
"deal_inquiry_reply_rate": Gauge("pounce_deal_inquiry_reply_rate", "Deal: inquiry reply rate in window", ["window_days"]),
"deal_listings_with_inquiries": Gauge(
"pounce_deal_listings_with_inquiries", "Deal: distinct listings with inquiries in window", ["window_days"]
),
"deal_listings_sold": Gauge("pounce_deal_listings_sold", "Deal: distinct listings marked sold in window", ["window_days"]),
"deal_inquiry_to_sold_listing_rate": Gauge(
"pounce_deal_inquiry_to_sold_listing_rate", "Deal: (listings with inquiry) -> sold rate in window", ["window_days"]
),
"yield_connected_domains": Gauge("pounce_yield_connected_domains", "Yield: connected domains in window", ["window_days"]),
"yield_clicks": Gauge("pounce_yield_clicks", "Yield: clicks in window", ["window_days"]),
"yield_conversions": Gauge("pounce_yield_conversions", "Yield: conversions in window", ["window_days"]),
"yield_conversion_rate": Gauge("pounce_yield_conversion_rate", "Yield: conversion rate in window", ["window_days"]),
"yield_payouts_paid": Gauge("pounce_yield_payouts_paid", "Yield: payouts paid in window", ["window_days"]),
"yield_payouts_paid_amount_total": Gauge(
"pounce_yield_payouts_paid_amount_total", "Yield: total amount paid out in window", ["window_days"]
),
}
else: # pragma: no cover
_g = {}
async def update_prometheus_business_metrics() -> None:
"""Compute KPIs and set Prometheus gauges (no-op when disabled)."""
if Gauge is None or not _g:
return
if not settings.enable_business_metrics:
return
windows = {1, int(settings.business_metrics_days)}
for days in sorted(windows):
kpis = await get_cached_window_kpis(days)
if kpis is None:
continue
w = str(int(kpis.window_days))
_g["deal_listing_views"].labels(window_days=w).set(kpis.listing_views)
_g["deal_inquiries_created"].labels(window_days=w).set(kpis.inquiries_created)
_g["deal_seller_replied_inquiries"].labels(window_days=w).set(kpis.seller_replied_inquiries)
_g["deal_inquiry_reply_rate"].labels(window_days=w).set(kpis.inquiry_reply_rate)
_g["deal_listings_with_inquiries"].labels(window_days=w).set(kpis.listings_with_inquiries)
_g["deal_listings_sold"].labels(window_days=w).set(kpis.listings_sold)
_g["deal_inquiry_to_sold_listing_rate"].labels(window_days=w).set(kpis.inquiry_to_sold_listing_rate)
_g["yield_connected_domains"].labels(window_days=w).set(kpis.connected_domains)
_g["yield_clicks"].labels(window_days=w).set(kpis.clicks)
_g["yield_conversions"].labels(window_days=w).set(kpis.conversions)
_g["yield_conversion_rate"].labels(window_days=w).set(kpis.conversion_rate)
_g["yield_payouts_paid"].labels(window_days=w).set(kpis.payouts_paid)
_g["yield_payouts_paid_amount_total"].labels(window_days=w).set(kpis.payouts_paid_amount_total)

View File

@ -72,6 +72,21 @@ def instrument_app(app: FastAPI, *, metrics_path: str = "/metrics", enable_db_me
@app.get(metrics_path, include_in_schema=False)
async def _metrics_endpoint():
# Optional: export business KPIs derived from telemetry (cached).
try:
from app.observability.business_metrics import update_prometheus_business_metrics
await update_prometheus_business_metrics()
except Exception:
# Never break metrics scrape due to KPI computation issues.
pass
# Optional: export ops metrics (e.g. backup age).
try:
from app.observability.ops_metrics import update_prometheus_ops_metrics
await update_prometheus_ops_metrics()
except Exception:
pass
return Response(generate_latest(), media_type=CONTENT_TYPE_LATEST)
if enable_db_metrics:

View File

@ -0,0 +1,65 @@
"""
Ops/health metrics exported as Prometheus metrics (4B Ops).
These are low-frequency filesystem-based metrics (safe on scrape).
"""
from __future__ import annotations
from datetime import datetime
from pathlib import Path
from app.config import get_settings
settings = get_settings()
try:
from prometheus_client import Gauge
except Exception: # pragma: no cover
Gauge = None # type: ignore
if Gauge is not None:
db_backups_enabled = Gauge("pounce_db_backups_enabled", "DB backups enabled (1/0)")
db_backup_latest_unixtime = Gauge("pounce_db_backup_latest_unixtime", "Unix time of latest backup file (0 if none)")
db_backup_latest_age_seconds = Gauge("pounce_db_backup_latest_age_seconds", "Age of latest backup file (seconds)")
else: # pragma: no cover
db_backups_enabled = None # type: ignore
db_backup_latest_unixtime = None # type: ignore
db_backup_latest_age_seconds = None # type: ignore
def _backup_root() -> Path:
root = Path(settings.backup_dir)
if not root.is_absolute():
root = (Path.cwd() / root).resolve()
return root
async def update_prometheus_ops_metrics() -> None:
if Gauge is None:
return
db_backups_enabled.set(1 if settings.enable_db_backups else 0)
root = _backup_root()
if not root.exists() or not root.is_dir():
db_backup_latest_unixtime.set(0)
db_backup_latest_age_seconds.set(0)
return
files = [p for p in root.glob("*") if p.is_file()]
if not files:
db_backup_latest_unixtime.set(0)
db_backup_latest_age_seconds.set(0)
return
latest = max(files, key=lambda p: p.stat().st_mtime)
mtime = float(latest.stat().st_mtime)
now = datetime.utcnow().timestamp()
age = max(0.0, now - mtime)
db_backup_latest_unixtime.set(mtime)
db_backup_latest_age_seconds.set(age)

backend/app/routes/portfolio.py Executable file
View File

@ -0,0 +1,542 @@
"""Portfolio API routes."""
from datetime import datetime
from typing import Optional, List
from fastapi import APIRouter, Depends, HTTPException, status, Query
from pydantic import BaseModel, Field
from sqlalchemy import select, func, and_
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import get_db
from app.routes.auth import get_current_user
from app.models.user import User
from app.models.portfolio import PortfolioDomain, DomainValuation
from app.services.valuation import valuation_service
router = APIRouter(prefix="/portfolio", tags=["portfolio"])
# ============== Schemas ==============
class PortfolioDomainCreate(BaseModel):
"""Schema for creating a portfolio domain."""
domain: str = Field(..., min_length=3, max_length=255)
purchase_date: Optional[datetime] = None
purchase_price: Optional[float] = Field(None, ge=0)
purchase_registrar: Optional[str] = None
registrar: Optional[str] = None
renewal_date: Optional[datetime] = None
renewal_cost: Optional[float] = Field(None, ge=0)
auto_renew: bool = True
notes: Optional[str] = None
tags: Optional[str] = None
class PortfolioDomainUpdate(BaseModel):
"""Schema for updating a portfolio domain."""
purchase_date: Optional[datetime] = None
purchase_price: Optional[float] = Field(None, ge=0)
purchase_registrar: Optional[str] = None
registrar: Optional[str] = None
renewal_date: Optional[datetime] = None
renewal_cost: Optional[float] = Field(None, ge=0)
auto_renew: Optional[bool] = None
status: Optional[str] = None
notes: Optional[str] = None
tags: Optional[str] = None
class PortfolioDomainSell(BaseModel):
"""Schema for marking a domain as sold."""
sale_date: datetime
sale_price: float = Field(..., ge=0)
class PortfolioDomainResponse(BaseModel):
"""Response schema for portfolio domain."""
id: int
domain: str
purchase_date: Optional[datetime]
purchase_price: Optional[float]
purchase_registrar: Optional[str]
registrar: Optional[str]
renewal_date: Optional[datetime]
renewal_cost: Optional[float]
auto_renew: bool
estimated_value: Optional[float]
value_updated_at: Optional[datetime]
is_sold: bool
sale_date: Optional[datetime]
sale_price: Optional[float]
status: str
notes: Optional[str]
tags: Optional[str]
roi: Optional[float]
created_at: datetime
updated_at: datetime
class Config:
from_attributes = True
class PortfolioSummary(BaseModel):
"""Summary of user's portfolio."""
total_domains: int
active_domains: int
sold_domains: int
total_invested: float
total_value: float
total_sold_value: float
unrealized_profit: float
realized_profit: float
overall_roi: float
class ValuationResponse(BaseModel):
"""Response schema for domain valuation."""
domain: str
estimated_value: float
currency: str
scores: dict
factors: dict
confidence: str
source: str
calculated_at: str
# ============== Portfolio Endpoints ==============
@router.get("", response_model=List[PortfolioDomainResponse])
async def get_portfolio(
status: Optional[str] = Query(None, description="Filter by status"),
sort_by: str = Query("created_at", description="Sort field"),
sort_order: str = Query("desc", description="Sort order (asc/desc)"),
limit: int = Query(100, le=500),
offset: int = Query(0, ge=0),
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get user's portfolio domains."""
query = select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
# Filter by status
if status:
query = query.where(PortfolioDomain.status == status)
# Sorting
sort_column = getattr(PortfolioDomain, sort_by, PortfolioDomain.created_at)
if sort_order == "asc":
query = query.order_by(sort_column.asc())
else:
query = query.order_by(sort_column.desc())
# Pagination
query = query.offset(offset).limit(limit)
result = await db.execute(query)
domains = result.scalars().all()
# Calculate ROI for each domain
responses = []
for d in domains:
response = PortfolioDomainResponse(
id=d.id,
domain=d.domain,
purchase_date=d.purchase_date,
purchase_price=d.purchase_price,
purchase_registrar=d.purchase_registrar,
registrar=d.registrar,
renewal_date=d.renewal_date,
renewal_cost=d.renewal_cost,
auto_renew=d.auto_renew,
estimated_value=d.estimated_value,
value_updated_at=d.value_updated_at,
is_sold=d.is_sold,
sale_date=d.sale_date,
sale_price=d.sale_price,
status=d.status,
notes=d.notes,
tags=d.tags,
roi=d.roi,
created_at=d.created_at,
updated_at=d.updated_at,
)
responses.append(response)
return responses
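# Design note: because the response schema sets from_attributes=True and `roi`
# is read off the ORM object above, the field-by-field mapping repeated in this
# file could in principle collapse to one helper (sketch, not in this diff):
def _to_response(d: PortfolioDomain) -> PortfolioDomainResponse:
    return PortfolioDomainResponse.model_validate(d)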
@router.get("/summary", response_model=PortfolioSummary)
async def get_portfolio_summary(
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get portfolio summary statistics."""
result = await db.execute(
select(PortfolioDomain).where(PortfolioDomain.user_id == current_user.id)
)
domains = result.scalars().all()
total_domains = len(domains)
active_domains = sum(1 for d in domains if d.status == "active" and not d.is_sold)
sold_domains = sum(1 for d in domains if d.is_sold)
total_invested = sum(d.purchase_price or 0 for d in domains)
total_value = sum(d.estimated_value or 0 for d in domains if not d.is_sold)
total_sold_value = sum(d.sale_price or 0 for d in domains if d.is_sold)
# Calculate active investment for ROI
active_investment = sum(d.purchase_price or 0 for d in domains if not d.is_sold)
sold_investment = sum(d.purchase_price or 0 for d in domains if d.is_sold)
unrealized_profit = total_value - active_investment
realized_profit = total_sold_value - sold_investment
overall_roi = 0.0
if total_invested > 0:
overall_roi = ((total_value + total_sold_value - total_invested) / total_invested) * 100
return PortfolioSummary(
total_domains=total_domains,
active_domains=active_domains,
sold_domains=sold_domains,
total_invested=round(total_invested, 2),
total_value=round(total_value, 2),
total_sold_value=round(total_sold_value, 2),
unrealized_profit=round(unrealized_profit, 2),
realized_profit=round(realized_profit, 2),
overall_roi=round(overall_roi, 2),
)
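# Worked example with illustrative numbers: two domains bought at 100 each,
# one sold for 300, the other currently valued at 150.
#   total_invested = 200, total_value = 150, total_sold_value = 300
#   unrealized_profit = 150 - 100 = 50
#   realized_profit   = 300 - 100 = 200
#   overall_roi       = (150 + 300 - 200) / 200 * 100 = 125.0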
@router.post("", response_model=PortfolioDomainResponse, status_code=status.HTTP_201_CREATED)
async def add_portfolio_domain(
data: PortfolioDomainCreate,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Add a domain to portfolio."""
# Check if domain already exists in user's portfolio
existing = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.user_id == current_user.id,
PortfolioDomain.domain == data.domain.lower(),
)
)
)
if existing.scalar_one_or_none():
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="Domain already in portfolio",
)
# Get initial valuation
valuation = await valuation_service.estimate_value(data.domain, db, save_result=True)
estimated_value = valuation.get("estimated_value") if "error" not in valuation else None
# Create portfolio entry
domain = PortfolioDomain(
user_id=current_user.id,
domain=data.domain.lower(),
purchase_date=data.purchase_date,
purchase_price=data.purchase_price,
purchase_registrar=data.purchase_registrar,
registrar=data.registrar or data.purchase_registrar,
renewal_date=data.renewal_date,
renewal_cost=data.renewal_cost,
auto_renew=data.auto_renew,
estimated_value=estimated_value,
value_updated_at=datetime.utcnow() if estimated_value else None,
notes=data.notes,
tags=data.tags,
)
db.add(domain)
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.get("/{domain_id}", response_model=PortfolioDomainResponse)
async def get_portfolio_domain(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get a specific portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.put("/{domain_id}", response_model=PortfolioDomainResponse)
async def update_portfolio_domain(
domain_id: int,
data: PortfolioDomainUpdate,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Update a portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
# Update fields
update_data = data.model_dump(exclude_unset=True)
for field, value in update_data.items():
setattr(domain, field, value)
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.post("/{domain_id}/sell", response_model=PortfolioDomainResponse)
async def mark_domain_sold(
domain_id: int,
data: PortfolioDomainSell,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Mark a domain as sold."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
domain.is_sold = True
domain.sale_date = data.sale_date
domain.sale_price = data.sale_price
domain.status = "sold"
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
@router.delete("/{domain_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_portfolio_domain(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Delete a domain from portfolio."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
await db.delete(domain)
await db.commit()
@router.post("/{domain_id}/refresh-value", response_model=PortfolioDomainResponse)
async def refresh_domain_value(
domain_id: int,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Refresh the estimated value of a portfolio domain."""
result = await db.execute(
select(PortfolioDomain).where(
and_(
PortfolioDomain.id == domain_id,
PortfolioDomain.user_id == current_user.id,
)
)
)
domain = result.scalar_one_or_none()
if not domain:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail="Domain not found in portfolio",
)
# Get new valuation
valuation = await valuation_service.estimate_value(domain.domain, db, save_result=True)
if "error" not in valuation:
domain.estimated_value = valuation["estimated_value"]
domain.value_updated_at = datetime.utcnow()
await db.commit()
await db.refresh(domain)
return PortfolioDomainResponse(
id=domain.id,
domain=domain.domain,
purchase_date=domain.purchase_date,
purchase_price=domain.purchase_price,
purchase_registrar=domain.purchase_registrar,
registrar=domain.registrar,
renewal_date=domain.renewal_date,
renewal_cost=domain.renewal_cost,
auto_renew=domain.auto_renew,
estimated_value=domain.estimated_value,
value_updated_at=domain.value_updated_at,
is_sold=domain.is_sold,
sale_date=domain.sale_date,
sale_price=domain.sale_price,
status=domain.status,
notes=domain.notes,
tags=domain.tags,
roi=domain.roi,
created_at=domain.created_at,
updated_at=domain.updated_at,
)
# ============== Valuation Endpoints ==============
@router.get("/valuation/{domain}", response_model=ValuationResponse)
async def get_domain_valuation(
domain: str,
current_user: User = Depends(get_current_user),
db: AsyncSession = Depends(get_db),
):
"""Get estimated value for any domain."""
valuation = await valuation_service.estimate_value(domain, db, save_result=True)
if "error" in valuation:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail=valuation["error"],
)
return ValuationResponse(**valuation)

View File

@ -16,6 +16,10 @@ from app.models.subscription import Subscription, SubscriptionTier, TIER_CONFIG
from app.services.domain_checker import domain_checker
from app.services.email_service import email_service
from app.services.price_tracker import price_tracker
from app.services.yield_payouts import generate_payouts_for_previous_month
from app.services.db_backup import create_backup
from app.services.ops_alerts import run_ops_alert_checks
from app.services.referral_rewards import apply_referral_rewards_all
if TYPE_CHECKING:
from app.models.sniper_alert import SniperAlert
@ -450,6 +454,53 @@ async def send_health_change_alerts(db, changes: list):
logger.error(f"Failed to send health alert: {e}")
async def prepare_monthly_yield_payouts():
"""
Prepare Yield payouts for previous month (admin automation).
Safety:
- Only runs when `internal_api_key` is configured.
- Idempotent: generation skips existing payouts for the same period.
"""
if not (settings.internal_api_key or "").strip():
return
try:
async with AsyncSessionLocal() as db:
await generate_payouts_for_previous_month(db)
except Exception as e:
logger.exception(f"Yield payout preparation failed: {e}")
async def run_db_backup():
"""Create a verified DB backup (4B Ops)."""
if not settings.enable_db_backups:
return
try:
# backup is filesystem / subprocess based; no DB session needed here
create_backup(verify=True)
except Exception as e:
logger.exception(f"DB backup failed: {e}")
async def run_ops_alerting():
"""Evaluate and (optionally) send ops alerts (4B)."""
try:
await run_ops_alert_checks()
except Exception as e:
logger.exception(f"Ops alerting failed: {e}")
async def run_referral_rewards():
"""Recompute and apply referral reward bonuses (3C.2)."""
try:
async with AsyncSessionLocal() as db:
res = await apply_referral_rewards_all(db)
await db.commit()
logger.info("Referral rewards applied: processed=%s updated=%s", res.get("processed"), res.get("updated"))
except Exception as e:
logger.exception(f"Referral rewards job failed: {e}")
def setup_scheduler():
"""Configure and start the scheduler."""
# Daily domain check for Scout users at configured hour
@ -505,6 +556,42 @@ def setup_scheduler():
name="Weekly Digest Email",
replace_existing=True,
)
# Yield payout preparation: run on 2nd day of month at 02:10 UTC
scheduler.add_job(
prepare_monthly_yield_payouts,
CronTrigger(day=2, hour=2, minute=10),
id="yield_payout_prepare",
name="Yield Payout Preparation (Monthly)",
replace_existing=True,
)
# DB backup: daily at 01:30 UTC
scheduler.add_job(
run_db_backup,
CronTrigger(hour=1, minute=30),
id="db_backup",
name="DB Backup (Daily)",
replace_existing=True,
)
# Ops alerting: hourly at :12 (staggered)
scheduler.add_job(
run_ops_alerting,
CronTrigger(minute=12),
id="ops_alerting",
name="Ops Alerting (Hourly)",
replace_existing=True,
)
# Referral rewards: daily at 00:22 UTC (staggered)
scheduler.add_job(
run_referral_rewards,
CronTrigger(hour=0, minute=22),
id="referral_rewards",
name="Referral Rewards (Daily)",
replace_existing=True,
)
# TLD price scrape 2x daily for better historical data
# Morning scrape at 03:00 UTC
@ -570,6 +657,33 @@ def setup_scheduler():
replace_existing=True,
)
# Zone file sync for .ch and .li domains (daily at 05:00 UTC)
scheduler.add_job(
sync_zone_files,
CronTrigger(hour=5, minute=0), # Daily at 05:00 UTC
id="zone_file_sync",
name="Zone File Sync - Switch.ch (daily)",
replace_existing=True,
)
# CZDS zone file sync for gTLDs (daily at 06:00 UTC, after Switch sync)
scheduler.add_job(
sync_czds_zones,
CronTrigger(hour=6, minute=0), # Daily at 06:00 UTC
id="czds_zone_sync",
name="Zone File Sync - ICANN CZDS (daily)",
replace_existing=True,
)
# Zone data cleanup (hourly - delete drops older than 48h)
scheduler.add_job(
cleanup_zone_data,
CronTrigger(minute=45), # Every hour at :45
id="zone_cleanup",
name="Zone Data Cleanup (hourly)",
replace_existing=True,
)
logger.info(
f"Scheduler configured:"
f"\n - Scout domain check at {settings.check_hour:02d}:{settings.check_minute:02d} (daily)"
@ -580,6 +694,7 @@ def setup_scheduler():
f"\n - Auction scrape every 2 hours at :30"
f"\n - Expired auction cleanup every 15 minutes"
f"\n - Sniper alert matching every 30 minutes"
f"\n - Zone file sync daily at 05:00 UTC"
)
@ -744,12 +859,86 @@ async def scrape_auctions():
logger.exception(f"Auction scrape failed: {e}")
async def cleanup_zone_data():
"""Clean up old zone file data to save storage."""
logger.info("Starting zone data cleanup...")
try:
from app.services.zone_file import cleanup_old_drops, cleanup_old_snapshots
async with AsyncSessionLocal() as db:
# Delete drops older than 48h
drops_deleted = await cleanup_old_drops(db, hours=48)
# Delete snapshots older than 7 days
snapshots_deleted = await cleanup_old_snapshots(db, keep_days=7)
logger.info(f"Zone cleanup: {drops_deleted} drops, {snapshots_deleted} snapshots deleted")
except Exception as e:
logger.exception(f"Zone data cleanup failed: {e}")
async def sync_zone_files():
"""Sync zone files from Switch.ch (.ch, .li) and ICANN CZDS (gTLDs)."""
logger.info("Starting zone file sync...")
try:
from app.services.zone_file import ZoneFileService
service = ZoneFileService()
async with AsyncSessionLocal() as db:
# Sync Switch.ch zones (.ch, .li)
for tld in ["ch", "li"]:
try:
result = await service.run_daily_sync(db, tld)
logger.info(f".{tld} zone sync: {len(result.get('dropped', []))} dropped, {result.get('new_count', 0)} new")
except Exception as e:
logger.error(f".{tld} zone sync failed: {e}")
logger.info("Switch.ch zone file sync completed")
except Exception as e:
logger.exception(f"Zone file sync failed: {e}")
async def sync_czds_zones():
"""Sync zone files from ICANN CZDS (gTLDs like .xyz, .org, .dev, .app)."""
logger.info("Starting CZDS zone file sync...")
try:
from app.services.czds_client import CZDSClient, APPROVED_TLDS
from app.config import get_settings
settings = get_settings()
# Skip if credentials not configured
if not settings.czds_username or not settings.czds_password:
logger.info("CZDS credentials not configured, skipping CZDS sync")
return
client = CZDSClient()
async with AsyncSessionLocal() as db:
results = await client.sync_all_zones(db, APPROVED_TLDS)
success_count = sum(1 for r in results if r["status"] == "success")
total_dropped = sum(r["dropped_count"] for r in results)
logger.info(f"CZDS sync complete: {success_count}/{len(APPROVED_TLDS)} zones, {total_dropped:,} dropped")
except Exception as e:
logger.exception(f"CZDS zone file sync failed: {e}")
async def match_sniper_alerts():
"""Match active sniper alerts against current auctions and notify users."""
"""Match active sniper alerts against auctions AND drops and notify users."""
from app.models.sniper_alert import SniperAlert, SniperAlertMatch
from app.models.auction import DomainAuction
from app.models.zone_file import DroppedDomain
logger.info("Matching sniper alerts against new auctions...")
logger.info("Matching sniper alerts against auctions and drops...")
try:
async with AsyncSessionLocal() as db:
@ -764,39 +953,65 @@ async def match_sniper_alerts():
return
# Get recent auctions (added in last 2 hours)
auction_cutoff = datetime.utcnow() - timedelta(hours=2)
auctions_result = await db.execute(
select(DomainAuction).where(
and_(
DomainAuction.is_active == True,
DomainAuction.scraped_at >= auction_cutoff,
)
)
)
auctions = auctions_result.scalars().all()
# Get recent drops (last 24 hours)
drop_cutoff = datetime.utcnow() - timedelta(hours=24)
drops_result = await db.execute(
select(DroppedDomain).where(DroppedDomain.dropped_date >= drop_cutoff)
)
drops = drops_result.scalars().all()
logger.info(f"Checking {len(alerts)} alerts against {len(auctions)} auctions and {len(drops)} drops")
matches_created = 0
notifications_sent = 0
for alert in alerts:
matching_items = []
# Match against auctions
for auction in auctions:
if _auction_matches_alert(auction, alert):
matching_items.append({
'domain': auction.domain,
'source': 'auction',
'platform': auction.platform,
'price': auction.current_bid,
'end_time': auction.end_time,
'url': auction.auction_url,
})
# Match against drops
for drop in drops:
if _drop_matches_alert(drop, alert):
full_domain = f"{drop.domain}.{drop.tld}"
matching_items.append({
'domain': full_domain,
'source': 'drop',
'platform': f'.{drop.tld} zone',
'price': 0,
'end_time': None,
'url': f"https://pounce.ch/terminal/hunt?tab=drops&search={drop.domain}",
})
if matching_items:
for item in matching_items:
# Check if this match already exists
existing = await db.execute(
select(SniperAlertMatch).where(
and_(
SniperAlertMatch.alert_id == alert.id,
SniperAlertMatch.domain == item['domain'],
)
)
)
@ -806,48 +1021,61 @@ async def match_sniper_alerts():
# Create new match
match = SniperAlertMatch(
alert_id=alert.id,
domain=item['domain'],
platform=item['platform'],
current_bid=item['price'],
end_time=item['end_time'] or datetime.utcnow(),
auction_url=item['url'],
matched_at=datetime.utcnow(),
)
db.add(match)
matches_created += 1
# Update alert stats
alert.matches_count = (alert.matches_count or 0) + 1
alert.last_matched_at = datetime.utcnow()
# Update alert last_triggered
alert.last_triggered = datetime.utcnow()
# Send notification if enabled (batch notification)
if alert.notify_email and matching_items:
try:
user_result = await db.execute(
select(User).where(User.id == alert.user_id)
)
user = user_result.scalar_one_or_none()
if user and email_service.is_configured():
auction_matches = [m for m in matching_items if m['source'] == 'auction']
drop_matches = [m for m in matching_items if m['source'] == 'drop']
# Build HTML content
html_parts = [f'<h2>Your Sniper Alert "{alert.name}" matched!</h2>']
if auction_matches:
html_parts.append(f'<h3>🎯 {len(auction_matches)} Auction Match{"es" if len(auction_matches) > 1 else ""}</h3><ul>')
for m in auction_matches[:10]:
html_parts.append(f'<li><strong>{m["domain"]}</strong> - ${m["price"]:.0f} on {m["platform"]}</li>')
html_parts.append('</ul>')
if drop_matches:
html_parts.append(f'<h3>🔥 {len(drop_matches)} Fresh Drop{"s" if len(drop_matches) > 1 else ""}</h3><ul>')
for m in drop_matches[:10]:
html_parts.append(f'<li><strong>{m["domain"]}</strong> - Just dropped!</li>')
html_parts.append('</ul>')
html_parts.append('<p><a href="https://pounce.ch/terminal/sniper">View all matches in Pounce</a></p>')
await email_service.send_email(
to_email=user.email,
subject=f"🎯 Sniper Alert: {len(matching_auctions)} matching domains found!",
html_content=f"""
<h2>Your Sniper Alert "{alert.name}" matched!</h2>
<p>We found {len(matching_auctions)} domains matching your criteria:</p>
<ul>
{"".join(f"<li><strong>{a.domain}</strong> - ${a.current_bid:.0f} on {a.platform}</li>" for a in matching_auctions[:10])}
</ul>
<p><a href="https://pounce.ch/command/alerts">View all matches in your Command Center</a></p>
"""
subject=f"🎯 Sniper Alert: {len(matching_items)} matching domains found!",
html_content=''.join(html_parts),
)
notifications_sent += 1
alert.notifications_sent = (alert.notifications_sent or 0) + 1
except Exception as e:
logger.error(f"Failed to send sniper alert notification: {e}")
await db.commit()
logger.info(f"Sniper alert matching complete: {matches_created} matches created, {notifications_sent} notifications sent")
logger.info(f"Sniper alert matching complete: {matches_created} matches, {notifications_sent} notifications")
except Exception as e:
logger.exception(f"Sniper alert matching failed: {e}")
@ -857,9 +1085,16 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
"""Check if an auction matches the criteria of a sniper alert."""
domain_name = auction.domain.rsplit('.', 1)[0] if '.' in auction.domain else auction.domain
# Check keyword filter (must contain any of the keywords)
if alert.keywords:
required = [k.strip().lower() for k in alert.keywords.split(',')]
if not any(kw in domain_name.lower() for kw in required):
return False
# Check exclude keywords
if alert.exclude_keywords:
excluded = [k.strip().lower() for k in alert.exclude_keywords.split(',')]
if any(kw in domain_name.lower() for kw in excluded):
return False
# Check TLD filter
@ -868,6 +1103,12 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
if auction.tld.lower() not in allowed_tlds:
return False
# Check platform filter
if alert.platforms:
allowed_platforms = [p.strip().lower() for p in alert.platforms.split(',')]
if auction.platform.lower() not in allowed_platforms:
return False
# Check length filters
if alert.min_length and len(domain_name) < alert.min_length:
return False
@ -880,17 +1121,68 @@ def _auction_matches_alert(auction: "DomainAuction", alert: "SniperAlert") -> bo
if alert.max_price and auction.current_bid > alert.max_price:
return False
# Check bids filter (low competition)
if alert.max_bids and auction.num_bids and auction.num_bids > alert.max_bids:
return False
# Check no_numbers filter
if alert.no_numbers:
if any(c.isdigit() for c in domain_name):
return False
# Check no_hyphens filter
if alert.no_hyphens:
if '-' in domain_name:
return False
# Check exclude_chars
if alert.exclude_chars:
excluded = set(c.strip().lower() for c in alert.exclude_chars.split(','))
if any(c in excluded for c in domain_name.lower()):
return False
return True
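# Illustrative quick check of the matcher above. SimpleNamespace stands in for
# the SQLAlchemy SniperAlert/DomainAuction rows; the field names are taken from
# the filters used here, everything else about the real models is assumed.
#
#     from types import SimpleNamespace
#     alert = SimpleNamespace(
#         keywords="pay,bank", exclude_keywords="casino",
#         tlds="com,io", platforms="GoDaddy",
#         min_length=4, max_length=12,
#         min_price=None, max_price=500, max_bids=3,
#         no_numbers=True, no_hyphens=True, exclude_chars=None,
#     )
#     auction = SimpleNamespace(domain="paywise.com", tld="com",
#                               platform="GoDaddy", current_bid=120.0, num_bids=2)
#     assert _auction_matches_alert(auction, alert)  # every filter passes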
def _drop_matches_alert(drop, alert: "SniperAlert") -> bool:
"""Check if a dropped domain matches the criteria of a sniper alert."""
domain_name = drop.domain # Already just the name without TLD
# Check keyword filter
if alert.keywords:
required = [k.strip().lower() for k in alert.keywords.split(',')]
if not any(kw in domain_name.lower() for kw in required):
return False
# Check exclude keywords
if alert.exclude_keywords:
excluded = [k.strip().lower() for k in alert.exclude_keywords.split(',')]
if any(kw in domain_name.lower() for kw in excluded):
return False
# Check TLD filter
if alert.tlds:
allowed_tlds = [t.strip().lower() for t in alert.tlds.split(',')]
if drop.tld.lower() not in allowed_tlds:
return False
# Check length filters
if alert.min_length and len(domain_name) < alert.min_length:
return False
if alert.max_length and len(domain_name) > alert.max_length:
return False
# Check no_numbers filter (use drop.is_numeric)
if alert.no_numbers and drop.is_numeric:
return False
# Check no_hyphens filter (use drop.has_hyphen)
if alert.no_hyphens and drop.has_hyphen:
return False
# Check exclude_chars
if alert.exclude_chars:
excluded = set(c.strip().lower() for c in alert.exclude_chars.split(','))
if any(c in excluded for c in domain_name.lower()):
            return False
    return True

View File

@ -0,0 +1,35 @@
"""
Analyze schemas (Alpha Terminal - Phase 2 Diligence).
Open-data-first: we return null + reason when data isn't available.
"""
from __future__ import annotations
from datetime import datetime
from typing import Any, Optional
from pydantic import BaseModel, Field
class AnalyzeItem(BaseModel):
key: str
label: str
value: Optional[Any] = None
status: str = Field(default="info", description="pass|warn|fail|info|na")
source: str = Field(default="internal", description="internal|rdap|whois|dns|http|ssl|db|open_data")
details: dict[str, Any] = Field(default_factory=dict)
class AnalyzeSection(BaseModel):
key: str
title: str
items: list[AnalyzeItem] = Field(default_factory=list)
class AnalyzeResponse(BaseModel):
domain: str
computed_at: datetime
cached: bool = False
sections: list[AnalyzeSection]
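# A minimal example payload (illustrative only) showing the open-data-first
# convention: when a datum is unavailable we return value=None, status="na",
# and a human-readable reason in details instead of guessing.
#
#     AnalyzeResponse(
#         domain="example.com",
#         computed_at=datetime.utcnow(),
#         sections=[AnalyzeSection(key="authority", title="Authority", items=[
#             AnalyzeItem(key="created_at", label="Creation Date", value=None,
#                         status="na", source="rdap",
#                         details={"reason": "Not provided by RDAP/WHOIS for this TLD."}),
#         ])],
#     )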

View File

@ -10,6 +10,8 @@ class UserCreate(BaseModel):
email: EmailStr
password: str = Field(..., min_length=8, max_length=100)
name: Optional[str] = Field(None, max_length=100)
# Yield referral tracking
ref: Optional[str] = Field(None, max_length=100, description="Referral code from yield domain")
class UserLogin(BaseModel):
@ -49,3 +51,26 @@ class TokenData(BaseModel):
user_id: Optional[int] = None
email: Optional[str] = None
class ReferralStats(BaseModel):
"""Referral reward snapshot for the current user (3C.2)."""
window_days: int = 30
referred_users_total: int = 0
qualified_referrals_total: int = 0
referral_link_views_window: int = 0
bonus_domains: int = 0
next_reward_at: int = 0
badge: Optional[str] = None # "verified_referrer" | "elite_referrer"
cooldown_days: int = 7
disqualified_cooldown_total: int = 0
disqualified_missing_ip_total: int = 0
disqualified_shared_ip_total: int = 0
disqualified_duplicate_ip_total: int = 0
class ReferralLinkResponse(BaseModel):
invite_code: str
url: str
stats: ReferralStats

View File

@ -0,0 +1,51 @@
"""CFO (Management) schemas."""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class CfoMonthlyBucket(BaseModel):
month: str # YYYY-MM
total_cost_usd: float = 0.0
domains: int = 0
class CfoUpcomingCostRow(BaseModel):
domain_id: int
domain: str
renewal_date: Optional[datetime] = None
renewal_cost_usd: Optional[float] = None
cost_source: str = Field(default="unknown", description="portfolio|tld_prices|unknown")
is_sold: bool = False
class CfoKillListRow(BaseModel):
domain_id: int
domain: str
renewal_date: Optional[datetime] = None
renewal_cost_usd: Optional[float] = None
cost_source: str = "unknown"
auto_renew: bool = True
is_dns_verified: bool = False
yield_net_60d: float = 0.0
yield_clicks_60d: int = 0
reason: str
class CfoSummaryResponse(BaseModel):
computed_at: datetime
upcoming_30d_total_usd: float = 0.0
upcoming_30d_rows: list[CfoUpcomingCostRow] = []
monthly: list[CfoMonthlyBucket] = []
kill_list: list[CfoKillListRow] = []
class SetToDropResponse(BaseModel):
domain_id: int
auto_renew: bool
updated_at: datetime

View File

@ -0,0 +1,93 @@
"""HUNT (Discovery) schemas."""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class HuntSniperItem(BaseModel):
domain: str
platform: str
auction_url: str
current_bid: float
currency: str
end_time: datetime
age_years: Optional[int] = None
backlinks: Optional[int] = None
pounce_score: Optional[int] = None
class HuntSniperResponse(BaseModel):
items: list[HuntSniperItem]
total: int
filtered_out_missing_data: int = 0
last_updated: Optional[datetime] = None
class TrendItem(BaseModel):
title: str
approx_traffic: Optional[str] = None
published_at: Optional[datetime] = None
link: Optional[str] = None
class TrendsResponse(BaseModel):
geo: str = "US"
items: list[TrendItem]
fetched_at: datetime
class KeywordAvailabilityRequest(BaseModel):
keywords: list[str] = Field(min_length=1, max_length=25)
tlds: list[str] = Field(default_factory=lambda: ["com", "io", "ai", "net", "org"], max_length=20)
class KeywordAvailabilityRow(BaseModel):
keyword: str
domain: str
tld: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class KeywordAvailabilityResponse(BaseModel):
items: list[KeywordAvailabilityRow]
class TypoCheckRequest(BaseModel):
brand: str = Field(min_length=2, max_length=50)
tlds: list[str] = Field(default_factory=lambda: ["com"], max_length=10)
limit: int = Field(default=50, ge=1, le=200)
class TypoCandidate(BaseModel):
domain: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class TypoCheckResponse(BaseModel):
brand: str
items: list[TypoCandidate]
class BrandableRequest(BaseModel):
pattern: str = Field(description="cvcvc|cvccv|human", examples=["cvcvc"])
tlds: list[str] = Field(default_factory=lambda: ["com"], max_length=10)
limit: int = Field(default=30, ge=1, le=100)
max_checks: int = Field(default=400, ge=50, le=2000)
class BrandableCandidate(BaseModel):
domain: str
is_available: Optional[bool] = None
status: str # available|taken|unknown
class BrandableResponse(BaseModel):
pattern: str
items: list[BrandableCandidate]
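# Hypothetical sketch of what a "cvcvc" pattern expansion could look like on
# the generator side; the real server-side implementation is not shown in this
# diff and may differ (alphabet choice, pronounceability filters, etc.).
#
#     import random
#     CONSONANTS, VOWELS = "bcdfghjklmnpqrstvwz", "aeiou"
#     def sample_cvcvc(n: int = 5, seed: int = 42) -> list[str]:
#         rng = random.Random(seed)
#         out: set[str] = set()
#         while len(out) < n:
#             out.add("".join(rng.choice(CONSONANTS if i % 2 == 0 else VOWELS)
#                             for i in range(5)))
#         return sorted(out)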

View File

@ -0,0 +1,33 @@
"""
Referral schemas (3C.2).
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class ReferralKpiWindow(BaseModel):
days: int = Field(ge=1, le=365)
start: datetime
end: datetime
class ReferralReferrerRow(BaseModel):
user_id: int
email: str
invite_code: Optional[str] = None
created_at: datetime
referred_users_total: int = 0
referred_users_window: int = 0
referral_link_views_window: int = 0
class ReferralKpisResponse(BaseModel):
window: ReferralKpiWindow
totals: dict[str, int]
referrers: list[ReferralReferrerRow]

View File

@ -0,0 +1,47 @@
"""
Telemetry schemas (4A.2).
"""
from __future__ import annotations
from datetime import datetime
from typing import Optional
from pydantic import BaseModel, Field
class TelemetryKpiWindow(BaseModel):
days: int = Field(ge=1, le=365)
start: datetime
end: datetime
class DealFunnelKpis(BaseModel):
listing_views: int = 0
inquiries_created: int = 0
seller_replied_inquiries: int = 0
inquiry_reply_rate: float = 0.0
listings_with_inquiries: int = 0
listings_sold: int = 0
inquiry_to_sold_listing_rate: float = 0.0
median_reply_seconds: Optional[float] = None
median_time_to_sold_seconds: Optional[float] = None
class YieldFunnelKpis(BaseModel):
connected_domains: int = 0
clicks: int = 0
conversions: int = 0
conversion_rate: float = 0.0
payouts_paid: int = 0
payouts_paid_amount_total: float = 0.0
class TelemetryKpisResponse(BaseModel):
window: TelemetryKpiWindow
deal: DealFunnelKpis
yield_: YieldFunnelKpis = Field(alias="yield")

View File

@ -73,6 +73,7 @@ class YieldDomainResponse(BaseModel):
# DNS
dns_verified: bool = False
dns_verified_at: Optional[datetime] = None
connected_at: Optional[datetime] = None
# Stats
total_clicks: int = 0
@ -108,6 +109,7 @@ class YieldTransactionResponse(BaseModel):
id: int
event_type: str
partner_slug: str
click_id: Optional[str] = None
gross_amount: Decimal
net_amount: Decimal

View File

@ -0,0 +1,6 @@
"""Database seed scripts."""
from app.seeds.yield_partners import seed_partners
__all__ = ["seed_partners"]

View File

@ -0,0 +1,474 @@
"""
Seed data for Yield affiliate partners.
Run via: python -m app.seeds.yield_partners
Or: from app.seeds.yield_partners import seed_partners; await seed_partners(db)
"""
import asyncio
import logging
from decimal import Decimal
from typing import Any
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.database import AsyncSessionLocal
from app.models.yield_domain import AffiliatePartner
logger = logging.getLogger(__name__)
# Partner configurations grouped by category
PARTNER_SEED_DATA: list[dict[str, Any]] = [
# =========================================================================
# MEDICAL / HEALTH
# =========================================================================
{
"name": "Comparis Dental",
"slug": "comparis_dental",
"network": "direct",
"intent_categories": "medical_dental",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Dental treatment comparison platform. High conversion for Swiss dental searches.",
"priority": 100,
},
{
"name": "Swisssmile",
"slug": "swisssmile",
"network": "awin",
"intent_categories": "medical_dental",
"geo_countries": "CH,DE,AT",
"payout_type": "cpl",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "Premium dental clinics network.",
"priority": 90,
},
{
"name": "Comparis Health",
"slug": "comparis_health",
"network": "direct",
"intent_categories": "medical_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("20.00"),
"payout_currency": "CHF",
"description": "Health insurance comparison.",
"priority": 100,
},
{
"name": "Sanitas",
"slug": "sanitas",
"network": "awin",
"intent_categories": "medical_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("35.00"),
"payout_currency": "CHF",
"description": "Swiss health insurance provider.",
"priority": 80,
},
{
"name": "Swiss Esthetic",
"slug": "swissesthetic",
"network": "direct",
"intent_categories": "medical_beauty",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("40.00"),
"payout_currency": "CHF",
"description": "Aesthetic treatments and beauty clinics.",
"priority": 90,
},
# =========================================================================
# FINANCE / INSURANCE
# =========================================================================
{
"name": "Comparis Insurance",
"slug": "comparis_insurance",
"network": "direct",
"intent_categories": "finance_insurance",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "All-in-one insurance comparison.",
"priority": 100,
},
{
"name": "Bonus.ch",
"slug": "bonus_ch",
"network": "awin",
"intent_categories": "finance_insurance",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Swiss insurance comparison portal.",
"priority": 80,
},
{
"name": "Comparis Hypo",
"slug": "comparis_hypo",
"network": "direct",
"intent_categories": "finance_mortgage",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("100.00"),
"payout_currency": "CHF",
"description": "Mortgage comparison - high value leads.",
"priority": 100,
},
{
"name": "MoneyPark",
"slug": "moneypark",
"network": "awin",
"intent_categories": "finance_mortgage",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("120.00"),
"payout_currency": "CHF",
"description": "Independent mortgage broker.",
"priority": 90,
},
{
"name": "Neon Bank",
"slug": "neon_bank",
"network": "partnerstack",
"intent_categories": "finance_banking",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("50.00"),
"payout_currency": "CHF",
"description": "Swiss mobile banking app.",
"priority": 80,
},
# =========================================================================
# LEGAL
# =========================================================================
{
"name": "Legal CH",
"slug": "legal_ch",
"network": "direct",
"intent_categories": "legal_general",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("50.00"),
"payout_currency": "CHF",
"description": "Lawyer matching service.",
"priority": 100,
},
{
"name": "Anwalt24",
"slug": "anwalt24",
"network": "awin",
"intent_categories": "legal_general",
"geo_countries": "DE,AT",
"payout_type": "cpl",
"payout_amount": Decimal("35.00"),
"payout_currency": "EUR",
"description": "German lawyer directory.",
"priority": 80,
},
# =========================================================================
# REAL ESTATE
# =========================================================================
{
"name": "Homegate",
"slug": "homegate",
"network": "awin",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH",
"payout_type": "cpc",
"payout_amount": Decimal("0.50"),
"payout_currency": "CHF",
"description": "Switzerland's #1 real estate platform.",
"priority": 100,
},
{
"name": "ImmoScout24",
"slug": "immoscout",
"network": "awin",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH,DE",
"payout_type": "cpc",
"payout_amount": Decimal("0.40"),
"payout_currency": "CHF",
"description": "Real estate marketplace.",
"priority": 90,
},
{
"name": "Comparis Immo",
"slug": "comparis_immo",
"network": "direct",
"intent_categories": "realestate_buy,realestate_rent",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("15.00"),
"payout_currency": "CHF",
"description": "Property valuation and search.",
"priority": 85,
},
# =========================================================================
# TRAVEL
# =========================================================================
{
"name": "Skyscanner",
"slug": "skyscanner",
"network": "awin",
"intent_categories": "travel_flights",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.30"),
"payout_currency": "CHF",
"description": "Flight comparison engine.",
"priority": 90,
},
{
"name": "Booking.com",
"slug": "booking_com",
"network": "awin",
"intent_categories": "travel_hotels",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("4.00"), # 4% commission
"payout_currency": "CHF",
"description": "World's leading accommodation site.",
"priority": 100,
},
# =========================================================================
# AUTOMOTIVE
# =========================================================================
{
"name": "AutoScout24",
"slug": "autoscout",
"network": "awin",
"intent_categories": "auto_buy",
"geo_countries": "CH,DE",
"payout_type": "cpc",
"payout_amount": Decimal("0.60"),
"payout_currency": "CHF",
"description": "Auto marketplace.",
"priority": 100,
},
{
"name": "Comparis Auto",
"slug": "comparis_auto",
"network": "direct",
"intent_categories": "auto_buy,auto_service",
"geo_countries": "CH",
"payout_type": "cpl",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Car insurance & leasing comparison.",
"priority": 90,
},
# =========================================================================
# JOBS
# =========================================================================
{
"name": "Jobs.ch",
"slug": "jobs_ch",
"network": "awin",
"intent_categories": "jobs",
"geo_countries": "CH",
"payout_type": "cpc",
"payout_amount": Decimal("0.40"),
"payout_currency": "CHF",
"description": "Swiss job board.",
"priority": 100,
},
{
"name": "Indeed",
"slug": "indeed",
"network": "awin",
"intent_categories": "jobs",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.25"),
"payout_currency": "CHF",
"description": "Global job search engine.",
"priority": 80,
},
# =========================================================================
# EDUCATION
# =========================================================================
{
"name": "Udemy",
"slug": "udemy",
"network": "awin",
"intent_categories": "education",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("10.00"), # Per sale
"payout_currency": "USD",
"description": "Online courses platform.",
"priority": 80,
},
# =========================================================================
# TECHNOLOGY / HOSTING
# =========================================================================
{
"name": "Hostpoint",
"slug": "hostpoint",
"network": "partnerstack",
"intent_categories": "tech_hosting",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("30.00"),
"payout_currency": "CHF",
"description": "Swiss web hosting leader.",
"priority": 100,
},
{
"name": "Infomaniak",
"slug": "infomaniak",
"network": "direct",
"intent_categories": "tech_hosting",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("25.00"),
"payout_currency": "CHF",
"description": "Eco-friendly Swiss hosting.",
"priority": 90,
},
# =========================================================================
# SHOPPING
# =========================================================================
{
"name": "Galaxus",
"slug": "galaxus",
"network": "awin",
"intent_categories": "shopping_general",
"geo_countries": "CH",
"payout_type": "cps",
"payout_amount": Decimal("2.00"), # 2% commission
"payout_currency": "CHF",
"description": "Switzerland's largest online shop.",
"priority": 100,
},
{
"name": "Zalando",
"slug": "zalando",
"network": "awin",
"intent_categories": "shopping_fashion",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("8.00"), # 8% commission
"payout_currency": "CHF",
"description": "Fashion & lifestyle.",
"priority": 100,
},
# =========================================================================
# FOOD / DELIVERY
# =========================================================================
{
"name": "Uber Eats",
"slug": "uber_eats",
"network": "awin",
"intent_categories": "food_restaurant,food_delivery",
"geo_countries": "CH,DE",
"payout_type": "cps",
"payout_amount": Decimal("5.00"),
"payout_currency": "CHF",
"description": "Food delivery service.",
"priority": 90,
},
# =========================================================================
# POUNCE SELF-PROMOTION (Viral Growth)
# =========================================================================
{
"name": "Pounce Promo",
"slug": "pounce_promo",
"network": "internal",
"intent_categories": "investment_domains,tech_dev,generic",
"geo_countries": "CH,DE,AT",
"payout_type": "cps",
"payout_amount": Decimal("0"), # 30% lifetime commission handled separately
"payout_currency": "CHF",
"description": "Pounce self-promotion. Domain owners earn 30% lifetime commission on referrals.",
"priority": 50, # Higher than generic but lower than high-value partners
},
# =========================================================================
# GENERIC FALLBACK
# =========================================================================
{
"name": "Generic Affiliate",
"slug": "generic_affiliate",
"network": "internal",
"intent_categories": "generic",
"geo_countries": "CH,DE,AT",
"payout_type": "cpc",
"payout_amount": Decimal("0.10"),
"payout_currency": "CHF",
"description": "Fallback for unclassified domains - shows Pounce marketplace.",
"priority": 1,
},
]
async def seed_partners(db: AsyncSession) -> int:
"""
Seed affiliate partners into database.
Idempotent: updates existing partners by slug, creates new ones.
Returns:
Number of partners created/updated.
"""
count = 0
for data in PARTNER_SEED_DATA:
slug = data["slug"]
# Check if partner exists
result = await db.execute(
select(AffiliatePartner).where(AffiliatePartner.slug == slug)
)
existing = result.scalar_one_or_none()
if existing:
# Update existing partner
for key, value in data.items():
setattr(existing, key, value)
logger.info(f"Updated partner: {slug}")
else:
# Create new partner
partner = AffiliatePartner(**data)
db.add(partner)
logger.info(f"Created partner: {slug}")
count += 1
await db.commit()
logger.info(f"Seeded {count} affiliate partners")
return count
async def main():
"""Run seed script standalone."""
logging.basicConfig(level=logging.INFO)
async with AsyncSessionLocal() as db:
count = await seed_partners(db)
print(f"✅ Seeded {count} affiliate partners")
if __name__ == "__main__":
asyncio.run(main())

View File

@ -0,0 +1,2 @@
"""Analyze services package (Alpha Terminal)."""

View File

@ -0,0 +1,2 @@
"""Analyzer implementations."""

View File

@ -0,0 +1,102 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.domain_health import get_health_checker
class BasicRiskAnalyzer:
key = "basic_risk"
ttl_seconds = 60 * 10 # 10m (HTTP/SSL/DNS can change quickly)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
if ctx.fast:
return [
AnalyzerContribution(
quadrant="risk",
items=[
AnalyzeItem(
key="risk_skipped_fast_mode",
label="Risk Signals",
value=None,
status="na",
source="internal",
details={"reason": "Fast mode enabled: skip HTTP/SSL checks."},
)
],
)
]
health = ctx.health
if health is None:
health = await get_health_checker().check_domain(ctx.domain)
# health object has attributes; keep access defensive
score = int(getattr(health, "score", 0) or 0)
status = getattr(getattr(health, "status", None), "value", None) or str(getattr(health, "status", "unknown"))
signals = getattr(health, "signals", []) or []
http = getattr(health, "http", None)
ssl = getattr(health, "ssl", None)
dns = getattr(health, "dns", None)
http_status_code = getattr(http, "status_code", None) if http else None
http_reachable = bool(getattr(http, "is_reachable", False)) if http else False
http_parked = bool(getattr(http, "is_parked", False)) if http else False
redirect_url = getattr(http, "redirect_url", None) if http else None
parking_signals = getattr(http, "parking_signals", []) if http else []
http_error = getattr(http, "error", None) if http else None
ssl_has = bool(getattr(ssl, "has_ssl", False)) if ssl else False
ssl_valid = bool(getattr(ssl, "is_valid", False)) if ssl else False
ssl_days = getattr(ssl, "days_until_expiry", None) if ssl else None
ssl_issuer = getattr(ssl, "issuer", None) if ssl else None
ssl_error = getattr(ssl, "error", None) if ssl else None
dns_has_ns = bool(getattr(dns, "has_nameservers", False)) if dns else False
dns_has_a = bool(getattr(dns, "has_a_record", False)) if dns else False
dns_parking_ns = bool(getattr(dns, "is_parking_ns", False)) if dns else False
items = [
AnalyzeItem(
key="health_score",
label="Health Score",
value=score,
status="pass" if score >= 80 else "warn" if score >= 50 else "fail",
source="internal",
details={"status": status, "signals": signals},
),
AnalyzeItem(
key="dns_infra",
label="DNS Infra",
value={"has_ns": dns_has_ns, "has_a": dns_has_a},
status="pass" if (dns_has_ns and dns_has_a and not dns_parking_ns) else "warn",
source="dns",
details={"parking_ns": dns_parking_ns},
),
AnalyzeItem(
key="http",
label="HTTP",
value=http_status_code,
status="pass" if http_reachable and (http_status_code or 0) < 400 else "warn",
source="http",
details={
"reachable": http_reachable,
"is_parked": http_parked,
"redirect_url": redirect_url,
"parking_signals": parking_signals,
"error": http_error,
},
),
AnalyzeItem(
key="ssl",
label="SSL",
value=ssl_days if ssl_has else None,
status="pass" if ssl_has and ssl_valid else "warn",
source="ssl",
details={"has_certificate": ssl_has, "is_valid": ssl_valid, "issuer": ssl_issuer, "error": ssl_error},
),
]
return [AnalyzerContribution(quadrant="risk", items=items)]

View File

@ -0,0 +1,56 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
class DomainFactsAnalyzer:
key = "domain_facts"
ttl_seconds = 60 * 60 * 12 # 12h (RDAP/WHOIS changes slowly)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
check = ctx.check
items = [
AnalyzeItem(
key="availability",
label="Availability",
value="available" if check.is_available else "taken",
status="pass" if check.is_available else "warn",
source=str(check.check_method or "internal"),
details={"status": str(check.status.value)},
),
AnalyzeItem(
key="created_at",
label="Creation Date",
value=check.creation_date.isoformat() if check.creation_date else None,
status="pass" if check.creation_date else "na",
source=str(check.check_method or "internal"),
details={"reason": None if check.creation_date else "Not provided by RDAP/WHOIS for this TLD."},
),
AnalyzeItem(
key="expires_at",
label="Expiry Date",
value=check.expiration_date.isoformat() if check.expiration_date else None,
status="pass" if check.expiration_date else "na",
source=str(check.check_method or "internal"),
details={"reason": None if check.expiration_date else "Not provided by RDAP/WHOIS for this TLD."},
),
AnalyzeItem(
key="registrar",
label="Registrar",
value=check.registrar,
status="info" if check.registrar else "na",
source=str(check.check_method or "internal"),
details={},
),
AnalyzeItem(
key="nameservers",
label="Nameservers",
value=check.name_servers or [],
status="info" if check.name_servers else "na",
source="dns",
details={},
),
]
return [AnalyzerContribution(quadrant="authority", items=items)]

View File

@ -0,0 +1,30 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.radio_test import run_radio_test
class RadioTestAnalyzer:
key = "radio_test"
ttl_seconds = 60 * 60 * 24 * 7 # deterministic, effectively stable
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
radio = run_radio_test(ctx.domain)
item = AnalyzeItem(
key="radio_test",
label="Radio Test",
value=radio.status,
status=radio.status,
source="internal",
details={
"sld": radio.sld,
"syllables": radio.syllables,
"length": radio.length,
"has_hyphen": radio.has_hyphen,
"has_digits": radio.has_digits,
"rationale": radio.rationale,
},
)
return [AnalyzerContribution(quadrant="authority", items=[item])]

View File

@ -0,0 +1,23 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.tld_matrix import run_tld_matrix
class TldMatrixAnalyzer:
key = "tld_matrix"
ttl_seconds = 60 * 30 # 30m (availability can change)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
rows = await run_tld_matrix(ctx.domain)
item = AnalyzeItem(
key="tld_matrix",
label="TLD Matrix",
value=[row.__dict__ for row in rows],
status="info",
source="dns",
details={"tlds": [r.tld for r in rows]},
)
return [AnalyzerContribution(quadrant="market", items=[item])]

View File

@ -0,0 +1,73 @@
from __future__ import annotations
from app.schemas.analyze import AnalyzeItem
from app.services.analyze.base import AnalyzerContribution, AnalyzeContext
from app.services.analyze.renewal_cost import get_tld_price_snapshot
class TldPricingAnalyzer:
key = "tld_pricing"
ttl_seconds = 60 * 60 * 6 # 6h (DB updates periodically)
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]:
tld = ctx.domain.split(".")[-1].lower()
snap = await get_tld_price_snapshot(ctx.db, tld)
market_items = [
AnalyzeItem(
key="tld_cheapest_register_usd",
label="Cheapest registration (USD)",
value=snap.min_register_usd,
status="info" if snap.min_register_usd is not None else "na",
source="db",
details={
"registrar": snap.min_register_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_register_usd is not None else "No TLD price data collected yet.",
},
),
AnalyzeItem(
key="tld_cheapest_renew_usd",
label="Cheapest renewal (USD)",
value=snap.min_renew_usd,
status="info" if snap.min_renew_usd is not None else "na",
source="db",
details={
"registrar": snap.min_renew_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_renew_usd is not None else "No TLD price data collected yet.",
},
),
AnalyzeItem(
key="tld_cheapest_transfer_usd",
label="Cheapest transfer (USD)",
value=snap.min_transfer_usd,
status="info" if snap.min_transfer_usd is not None else "na",
source="db",
details={
"registrar": snap.min_transfer_registrar,
"latest_recorded_at": snap.latest_recorded_at.isoformat() if snap.latest_recorded_at else None,
"reason": None if snap.min_transfer_usd is not None else "No TLD price data collected yet.",
},
),
]
value_items = [
AnalyzeItem(
key="renewal_burn_usd_per_year",
label="Renewal burn (USD/year)",
value=snap.min_renew_usd,
status="info" if snap.min_renew_usd is not None else "na",
source="db",
details={
"assumption": "Cheapest renewal among tracked registrars (your DB).",
"registrar": snap.min_renew_registrar,
},
)
]
return [
AnalyzerContribution(quadrant="market", items=market_items),
AnalyzerContribution(quadrant="value", items=value_items),
]

View File

@ -0,0 +1,41 @@
"""
Analyzer plugin interface (Alpha Terminal - Diligence).
Each analyzer contributes items to one or more quadrants:
authority | market | risk | value
"""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime
from typing import Protocol
from sqlalchemy.ext.asyncio import AsyncSession
from app.schemas.analyze import AnalyzeItem
from app.services.domain_checker import DomainCheckResult
@dataclass(frozen=True)
class AnalyzeContext:
db: AsyncSession
domain: str
computed_at: datetime
fast: bool
check: DomainCheckResult
health: object | None # DomainHealthReport or None (kept as object to avoid import cycles)
@dataclass(frozen=True)
class AnalyzerContribution:
quadrant: str # authority|market|risk|value
items: list[AnalyzeItem]
class Analyzer(Protocol):
key: str
ttl_seconds: int
async def analyze(self, ctx: AnalyzeContext) -> list[AnalyzerContribution]: ...

View File

@ -0,0 +1,91 @@
"""
Radio Test analyzer (open-data, deterministic).
No external API, no LLM. This is a heuristic that explains its decision.
"""
from __future__ import annotations
import re
from dataclasses import dataclass
VOWELS = set("aeiou")
def _count_syllables(word: str) -> int:
"""
Approximate syllable count for brandability heuristics.
Uses vowel-group counting with a few basic adjustments.
"""
w = re.sub(r"[^a-z]", "", word.lower())
if not w:
return 0
groups = 0
prev_vowel = False
for i, ch in enumerate(w):
is_vowel = ch in VOWELS or (ch == "y" and i > 0)
if is_vowel and not prev_vowel:
groups += 1
prev_vowel = is_vowel
# common silent-e adjustment
if w.endswith("e") and groups > 1:
groups -= 1
return max(1, groups)
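# Rough behavior for intuition (heuristic, not linguistically exact):
#   _count_syllables("pounce") -> 1  ("ou" group; trailing silent e dropped)
#   _count_syllables("domain") -> 2  ("o" + "ai")
#   _count_syllables("crypto") -> 2  ("y" counts as a vowel after position 0)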
@dataclass(frozen=True)
class RadioTestResult:
sld: str
syllables: int
length: int
has_hyphen: bool
has_digits: bool
status: str # pass|warn|fail
rationale: str
def run_radio_test(domain: str) -> RadioTestResult:
sld = (domain or "").split(".")[0].lower()
length = len(sld)
has_hyphen = "-" in sld
has_digits = bool(re.search(r"\d", sld))
syllables = _count_syllables(sld)
# Hard fails: ugly for radio + high typo risk
if length >= 20 or has_hyphen or (has_digits and length > 4):
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="fail",
rationale="Hard to say/spell on radio (length/hyphen/digits).",
)
    # Ideal: 2-3 syllables, 4-12 chars, no digits
if 2 <= syllables <= 3 and 4 <= length <= 12 and not has_digits:
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="pass",
rationale="Short, pronounceable, low spelling friction.",
)
return RadioTestResult(
sld=sld,
syllables=syllables,
length=length,
has_hyphen=has_hyphen,
has_digits=has_digits,
status="warn",
rationale="Usable, but not ideal (syllables/length/digits).",
)
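# Examples (deterministic, which is why the analyzer caches this for a week):
#   run_radio_test("getpounce.io")   -> status="pass"  (2 syllables, 9 chars)
#   run_radio_test("my-crypto4u.com") -> status="fail" (hyphen)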

View File

@ -0,0 +1,21 @@
"""Analyzer registry (Alpha Terminal - Diligence)."""
from __future__ import annotations
from app.services.analyze.analyzers.domain_facts import DomainFactsAnalyzer
from app.services.analyze.analyzers.radio_test import RadioTestAnalyzer
from app.services.analyze.analyzers.tld_matrix import TldMatrixAnalyzer
from app.services.analyze.analyzers.tld_pricing import TldPricingAnalyzer
from app.services.analyze.analyzers.basic_risk import BasicRiskAnalyzer
def get_default_analyzers():
# Order matters (UX)
return [
DomainFactsAnalyzer(),
RadioTestAnalyzer(),
TldMatrixAnalyzer(),
TldPricingAnalyzer(),
BasicRiskAnalyzer(),
]

View File

@ -0,0 +1,93 @@
"""
TLD pricing snapshot (open-data via internal DB).
Uses our own collected TLD price history (no external API calls here).
"""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime
from typing import Optional
from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.tld_price import TLDPrice
@dataclass(frozen=True)
class TldPriceSnapshot:
tld: str
min_register_usd: Optional[float]
min_register_registrar: Optional[str]
min_renew_usd: Optional[float]
min_renew_registrar: Optional[str]
min_transfer_usd: Optional[float]
min_transfer_registrar: Optional[str]
latest_recorded_at: Optional[datetime]
async def get_tld_price_snapshot(db: AsyncSession, tld: str) -> TldPriceSnapshot:
tld = (tld or "").lower().lstrip(".")
# Latest record per registrar for this TLD, then take min renew.
subq = (
select(
TLDPrice.registrar,
func.max(TLDPrice.recorded_at).label("max_date"),
)
.where(TLDPrice.tld == tld)
.group_by(TLDPrice.registrar)
.subquery()
)
rows = (
await db.execute(
select(TLDPrice)
.join(
subq,
(TLDPrice.registrar == subq.c.registrar)
& (TLDPrice.recorded_at == subq.c.max_date),
)
.where(TLDPrice.tld == tld)
)
).scalars().all()
if not rows:
return TldPriceSnapshot(
tld=tld,
min_register_usd=None,
min_register_registrar=None,
min_renew_usd=None,
min_renew_registrar=None,
min_transfer_usd=None,
min_transfer_registrar=None,
latest_recorded_at=None,
)
def _reg_price(r) -> float:
return float(r.registration_price or 1e12)
def _renew_price(r) -> float:
return float(r.renewal_price or r.registration_price or 1e12)
def _transfer_price(r) -> float:
return float(r.transfer_price or r.registration_price or 1e12)
best_reg = min(rows, key=_reg_price)
best_renew = min(rows, key=_renew_price)
best_transfer = min(rows, key=_transfer_price)
latest = max((r.recorded_at for r in rows if r.recorded_at), default=None)
return TldPriceSnapshot(
tld=tld,
min_register_usd=float(best_reg.registration_price),
min_register_registrar=str(best_reg.registrar),
min_renew_usd=float(best_renew.renewal_price or best_renew.registration_price),
min_renew_registrar=str(best_renew.registrar),
min_transfer_usd=float(best_transfer.transfer_price or best_transfer.registration_price),
min_transfer_registrar=str(best_transfer.registrar),
latest_recorded_at=latest,
)

View File

@ -0,0 +1,128 @@
"""
Analyze service orchestrator (Alpha Terminal).
Implements the plan:
- Quadrants: authority | market | risk | value
- Analyzer registry (plugin-like)
- Open-data-first (null + reason)
"""
from __future__ import annotations
import json
from datetime import datetime, timedelta, timezone
from fastapi import HTTPException, status
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.domain_analysis_cache import DomainAnalysisCache
from app.schemas.analyze import AnalyzeResponse, AnalyzeSection
from app.services.analyze.base import AnalyzeContext
from app.services.analyze.registry import get_default_analyzers
from app.services.domain_checker import domain_checker
from app.services.domain_health import get_health_checker
DEFAULT_CACHE_TTL_SECONDS = 60 * 10 # conservative fallback (10m)
def _utcnow() -> datetime:
return datetime.now(timezone.utc)
def _is_cache_valid(row: DomainAnalysisCache) -> bool:
ttl = int(row.ttl_seconds or 0)
if ttl <= 0:
return False
computed = row.computed_at
if computed is None:
return False
if computed.tzinfo is None:
# stored as naive UTC typically
computed = computed.replace(tzinfo=timezone.utc)
return computed + timedelta(seconds=ttl) > _utcnow()
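# e.g. a row computed 5 minutes ago with ttl_seconds=600 is still valid;
# ttl_seconds of 0/NULL or a missing computed_at always forces a recompute.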
async def get_domain_analysis(
db: AsyncSession,
domain: str,
*,
fast: bool = False,
refresh: bool = False,
) -> AnalyzeResponse:
is_valid, error = domain_checker.validate_domain(domain)
if not is_valid:
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=error)
norm = domain_checker._normalize_domain(domain) # internal normalize
# Cache lookup
if not refresh:
row = (
await db.execute(select(DomainAnalysisCache).where(DomainAnalysisCache.domain == norm))
).scalar_one_or_none()
if row and _is_cache_valid(row):
payload = json.loads(row.payload_json)
payload["cached"] = True
return AnalyzeResponse.model_validate(payload)
computed_at = _utcnow()
# Core domain facts via RDAP/DNS/WHOIS (shared input for analyzers)
check = await domain_checker.check_domain(norm, quick=False)
# Health is expensive; compute once only when needed
health = None
if not fast:
health = await get_health_checker().check_domain(norm)
ctx = AnalyzeContext(db=db, domain=norm, computed_at=computed_at, fast=fast, check=check, health=health)
analyzers = get_default_analyzers()
ttl = DEFAULT_CACHE_TTL_SECONDS
# Quadrants per plan (stable ordering)
quadrants: dict[str, AnalyzeSection] = {
"authority": AnalyzeSection(key="authority", title="Authority", items=[]),
"market": AnalyzeSection(key="market", title="Market", items=[]),
"risk": AnalyzeSection(key="risk", title="Risk", items=[]),
"value": AnalyzeSection(key="value", title="Value", items=[]),
}
for a in analyzers:
ttl = min(ttl, int(getattr(a, "ttl_seconds", DEFAULT_CACHE_TTL_SECONDS) or DEFAULT_CACHE_TTL_SECONDS))
contributions = await a.analyze(ctx)
for c in contributions:
if c.quadrant in quadrants:
quadrants[c.quadrant].items.extend(c.items)
resp = AnalyzeResponse(
domain=norm,
computed_at=computed_at,
cached=False,
sections=[quadrants["authority"], quadrants["market"], quadrants["risk"], quadrants["value"]],
)
# Upsert cache (best-effort)
payload = resp.model_dump(mode="json")
payload_json = json.dumps(payload, separators=(",", ":"), ensure_ascii=False)
existing = (
await db.execute(select(DomainAnalysisCache).where(DomainAnalysisCache.domain == norm))
).scalar_one_or_none()
if existing:
existing.payload_json = payload_json
existing.computed_at = computed_at.replace(tzinfo=None)
existing.ttl_seconds = int(ttl or DEFAULT_CACHE_TTL_SECONDS)
else:
db.add(
DomainAnalysisCache(
domain=norm,
payload_json=payload_json,
computed_at=computed_at.replace(tzinfo=None),
ttl_seconds=int(ttl or DEFAULT_CACHE_TTL_SECONDS),
)
)
return resp
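# Hypothetical route wiring (names assumed; the actual router lives elsewhere):
#
#     @router.get("/analyze/{domain}", response_model=AnalyzeResponse)
#     async def analyze(domain: str, fast: bool = False, refresh: bool = False,
#                       db: AsyncSession = Depends(get_db)):
#         return await get_domain_analysis(db, domain, fast=fast, refresh=refresh)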

View File

@ -0,0 +1,66 @@
"""
TLD Matrix analyzer (open-data).
We check availability for the same SLD across a small, curated TLD set.
This is intentionally small to keep the UX fast.
"""
from __future__ import annotations
import asyncio
from dataclasses import dataclass
from app.services.domain_checker import domain_checker
DEFAULT_TLDS = ["com", "net", "org", "io", "ai", "co", "ch"]
@dataclass(frozen=True)
class TldMatrixRow:
tld: str
domain: str
is_available: bool | None
status: str # available|taken|unknown
method: str
error: str | None = None
async def _check_one(domain: str) -> TldMatrixRow:
try:
res = await domain_checker.check_domain(domain, quick=True)
return TldMatrixRow(
tld=domain.split(".")[-1],
domain=domain,
is_available=bool(res.is_available),
status="available" if res.is_available else "taken",
method=str(res.check_method or "dns"),
error=res.error_message,
)
except Exception as e: # noqa: BLE001
return TldMatrixRow(
tld=domain.split(".")[-1],
domain=domain,
is_available=None,
status="unknown",
method="error",
error=str(e),
)
async def run_tld_matrix(domain: str, tlds: list[str] | None = None) -> list[TldMatrixRow]:
sld = (domain or "").split(".")[0].lower().strip()
tlds = [t.lower().lstrip(".") for t in (tlds or DEFAULT_TLDS)]
    # De-duplicate candidates (e.g. when the input's own TLD is also in the list)
seen = set()
candidates: list[str] = []
for t in tlds:
d = f"{sld}.{t}"
if d not in seen:
candidates.append(d)
seen.add(d)
rows = await asyncio.gather(*[_check_one(d) for d in candidates])
return list(rows)
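# Example: run_tld_matrix("pounce.ch") fans out to pounce.com/.net/.org/.io/
# .ai/.co/.ch concurrently (quick DNS-level checks); per-domain failures come
# back as status="unknown" rather than raising.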

View File

@ -68,11 +68,11 @@ class AuctionScraperService:
"""
Orchestrates scraping across multiple sources and stores results in DB.
"""
def __init__(self):
self.http_client: Optional[httpx.AsyncClient] = None
self._last_request: Dict[str, datetime] = {}
async def _get_client(self) -> httpx.AsyncClient:
"""Get or create HTTP client with appropriate headers (and optional proxy)."""
if self.http_client is None or self.http_client.is_closed:
@ -92,7 +92,7 @@ class AuctionScraperService:
},
)
return self.http_client
async def _rate_limit(self, platform: str):
"""Enforce rate limiting per platform."""
min_interval = 60 / RATE_LIMITS.get(platform, 10)
@ -102,7 +102,7 @@ class AuctionScraperService:
if elapsed < min_interval:
await asyncio.sleep(min_interval - elapsed)
self._last_request[platform] = datetime.utcnow()
# ----------------------------
# Parsing & validation helpers
# ----------------------------
@ -305,7 +305,7 @@ class AuctionScraperService:
cleaned["currency"] = str(currency).strip().upper()
return cleaned
async def _store_auction(self, db: AsyncSession, auction_data: Dict[str, Any]) -> str:
"""Store or update an auction in the database. Returns 'new', 'updated' or 'skipped'."""
cleaned = self._sanitize_auction_payload(auction_data)
@ -325,7 +325,7 @@ class AuctionScraperService:
)
)
existing = existing.scalar_one_or_none()
if existing:
# Prevent "end_time drift" on sources that only provide rounded time-left.
# `end_time` must be monotonically decreasing (or stable) across scrapes.
@ -384,15 +384,15 @@ class AuctionScraperService:
) -> Dict[str, Any]:
"""Scrape ExpiredDomains provider-specific auction pages (real Price/Bids/Endtime)."""
result = {"found": 0, "new": 0, "updated": 0}
log = AuctionScrapeLog(platform=platform)
db.add(log)
await db.commit()
try:
await self._rate_limit("ExpiredDomains")
client = await self._get_client()
resp = await client.get(url, timeout=20.0)
if resp.status_code != 200:
raise Exception(f"HTTP {resp.status_code}")
@ -416,11 +416,11 @@ class AuctionScraperService:
cols = row.find_all("td")
if len(cols) < len(headers):
continue
domain = cols[header_index["Domain"]].get_text(" ", strip=True).lower()
if not domain or "." not in domain:
continue
tld = domain.rsplit(".", 1)[-1].lower()
parsed_price = self._parse_price_currency(cols[header_index["Price"]].get_text(" ", strip=True))
@ -429,7 +429,7 @@ class AuctionScraperService:
current_bid, currency = parsed_price
if current_bid <= 0:
continue
bids_raw = cols[header_index["Bids"]].get_text(" ", strip=True)
try:
num_bids = int(re.sub(r"[^0-9]", "", bids_raw) or "0")
@ -446,26 +446,26 @@ class AuctionScraperService:
href = domain_link.get("href") if domain_link else None
if href and href.startswith("/"):
href = f"https://www.expireddomains.net{href}"
auction_data = {
"domain": domain,
"tld": tld,
"platform": platform,
"platform_auction_id": None,
"auction_url": href or build_affiliate_url(platform, domain),
"current_bid": current_bid,
"currency": currency,
"num_bids": num_bids,
"end_time": end_time,
"scrape_source": f"expireddomains:{url}",
}
status = await self._store_auction(db, auction_data)
if status == "skipped":
continue
result["found"] += 1
result[status] += 1
result["found"] += 1
result[status] += 1
await db.commit()
log.completed_at = datetime.utcnow()
@ -474,16 +474,16 @@ class AuctionScraperService:
log.auctions_new = result["new"]
log.auctions_updated = result["updated"]
await db.commit()
except Exception as e:
log.completed_at = datetime.utcnow()
log.status = "failed"
log.error_message = str(e)[:500]
await db.commit()
logger.error(f"ExpiredDomains({platform}) scrape failed: {e}")
return result
async def _scrape_expireddomains_godaddy(self, db: AsyncSession) -> Dict[str, Any]:
return await self._scrape_expireddomains_auction_page(
db=db,
@ -509,15 +509,15 @@ class AuctionScraperService:
"""Scrape Park.io public auctions page (includes price + close date)."""
platform = "Park.io"
result = {"found": 0, "new": 0, "updated": 0}
log = AuctionScrapeLog(platform=platform)
db.add(log)
await db.commit()
try:
await self._rate_limit(platform)
client = await self._get_client()
resp = await client.get("https://park.io/auctions", timeout=20.0)
if resp.status_code != 200:
raise Exception(f"HTTP {resp.status_code}")
@ -531,8 +531,8 @@ class AuctionScraperService:
for row in rows[:200]:
cols = row.find_all("td")
if len(cols) < 5:
continue
domain = cols[1].get_text(" ", strip=True).lower()
if not domain or "." not in domain:
continue
@ -544,14 +544,14 @@ class AuctionScraperService:
continue
current_bid, currency = parsed_price
if current_bid <= 0:
continue
bids_raw = cols[3].get_text(" ", strip=True)
try:
num_bids = int(re.sub(r"[^0-9]", "", bids_raw) or "0")
except Exception:
continue
close_raw = cols[4].get_text(" ", strip=True)
try:
# Park.io displays a naive timestamp in their platform timezone.
@ -567,7 +567,7 @@ class AuctionScraperService:
href = link_el["href"] if link_el else None
if href and href.startswith("/"):
href = f"https://park.io{href}"
auction_data = {
"domain": domain,
"tld": tld,
@ -579,13 +579,13 @@ class AuctionScraperService:
"end_time": end_time,
"scrape_source": "park.io:auctions",
}
status = await self._store_auction(db, auction_data)
if status == "skipped":
continue
result["found"] += 1
result[status] += 1
await db.commit()
log.completed_at = datetime.utcnow()
@ -594,29 +594,29 @@ class AuctionScraperService:
log.auctions_new = result["new"]
log.auctions_updated = result["updated"]
await db.commit()
except Exception as e:
log.completed_at = datetime.utcnow()
log.status = "failed"
log.error_message = str(e)[:500]
await db.commit()
logger.error(f"Park.io scrape failed: {e}")
return result
async def _scrape_sav_public(self, db: AsyncSession) -> Dict[str, Any]:
"""Scrape Sav auctions from their HTML table endpoint."""
platform = "Sav"
result = {"found": 0, "new": 0, "updated": 0}
log = AuctionScrapeLog(platform=platform)
db.add(log)
await db.commit()
try:
await self._rate_limit(platform)
client = await self._get_client()
now = datetime.utcnow()
for page in range(0, 3):
resp = await client.post(
@ -636,7 +636,7 @@ class AuctionScraperService:
cells = row.find_all("td")
if len(cells) < 7:
continue
domain_link = cells[1].find("a")
domain = domain_link.get_text(" ", strip=True).lower() if domain_link else ""
if not domain or "." not in domain:
@ -655,8 +655,8 @@ class AuctionScraperService:
try:
num_bids = int(re.sub(r"[^0-9]", "", bids_raw) or "0")
except Exception:
continue
time_left_raw = cells[6].get_text(" ", strip=True)
delta = self._parse_timeleft(time_left_raw)
if not delta:
@ -666,7 +666,7 @@ class AuctionScraperService:
href = domain_link.get("href") if domain_link else None
if href and href.startswith("/"):
href = f"https://www.sav.com{href}"
auction_data = {
"domain": domain,
"tld": tld,
@ -678,15 +678,15 @@ class AuctionScraperService:
"end_time": end_time,
"scrape_source": f"sav:load_domains_ajax:{page}",
}
status = await self._store_auction(db, auction_data)
if status == "skipped":
continue
result["found"] += 1
result[status] += 1
await asyncio.sleep(1)
await db.commit()
log.completed_at = datetime.utcnow()
@ -695,16 +695,16 @@ class AuctionScraperService:
log.auctions_new = result["new"]
log.auctions_updated = result["updated"]
await db.commit()
except Exception as e:
log.completed_at = datetime.utcnow()
log.status = "failed"
log.error_message = str(e)[:500]
await db.commit()
logger.error(f"Sav scrape failed: {e}")
return result
# ----------------------------
# Orchestration
# ----------------------------
@ -729,7 +729,7 @@ class AuctionScraperService:
for item in hidden_api_result.get("items", []):
action = await self._store_auction(db, item)
if action == "skipped":
continue
platform = item.get("platform", "Unknown")
_touch_platform(platform)
results["platforms"][platform]["found"] += 1
@ -804,98 +804,98 @@ class AuctionScraperService:
results["errors"].append(f"Playwright: {error}")
except Exception as e:
results["errors"].append(f"Playwright: {str(e)}")
await db.commit()
await self._cleanup_ended_auctions(db)
return results
# ----------------------------
# Tier 1 helpers (official APIs)
# ----------------------------
async def _fetch_dropcatch_api(self, db: AsyncSession) -> Dict[str, Any]:
platform = "DropCatch"
result = {"found": 0, "new": 0, "updated": 0, "source": "api"}
if not dropcatch_client.is_configured:
return result
log = AuctionScrapeLog(platform=platform)
db.add(log)
await db.commit()
try:
api_result = await dropcatch_client.search_auctions(page_size=100)
auctions = api_result.get("auctions") or api_result.get("items") or []
result["found"] = len(auctions)
for dc_auction in auctions:
auction_data = dropcatch_client.transform_to_pounce_format(dc_auction)
status = await self._store_auction(db, auction_data)
if status == "skipped":
continue
result[status] += 1
await db.commit()
log.status = "success"
log.auctions_found = result["found"]
log.auctions_new = result["new"]
log.auctions_updated = result["updated"]
log.completed_at = datetime.utcnow()
await db.commit()
except Exception as e:
log.status = "failed"
log.error_message = str(e)[:500]
log.completed_at = datetime.utcnow()
await db.commit()
return result
async def _fetch_sedo_api(self, db: AsyncSession) -> Dict[str, Any]:
platform = "Sedo"
result = {"found": 0, "new": 0, "updated": 0, "source": "api"}
if not sedo_client.is_configured:
return result
log = AuctionScrapeLog(platform=platform)
db.add(log)
await db.commit()
try:
api_result = await sedo_client.search_auctions(page_size=100)
listings = api_result.get("domains") or api_result.get("items") or api_result.get("result") or []
if isinstance(listings, dict):
listings = list(listings.values()) if listings else []
result["found"] = len(listings)
for sedo_listing in listings:
auction_data = sedo_client.transform_to_pounce_format(sedo_listing)
status = await self._store_auction(db, auction_data)
if status == "skipped":
continue
result[status] += 1
await db.commit()
log.status = "success"
log.auctions_found = result["found"]
log.auctions_new = result["new"]
log.auctions_updated = result["updated"]
log.completed_at = datetime.utcnow()
await db.commit()
except Exception as e:
log.status = "failed"
log.error_message = str(e)[:500]
log.completed_at = datetime.utcnow()
await db.commit()
return result
# ----------------------------
# DB cleanup / queries
# ----------------------------
@ -903,7 +903,7 @@ class AuctionScraperService:
async def _cleanup_ended_auctions(self, db: AsyncSession):
"""Mark auctions that have ended as inactive and delete very old inactive auctions."""
now = datetime.utcnow()
from sqlalchemy import update
await db.execute(
@ -911,14 +911,14 @@ class AuctionScraperService:
.where(and_(DomainAuction.end_time < now, DomainAuction.is_active == True))
.values(is_active=False)
)
cutoff = now - timedelta(days=30)
await db.execute(
delete(DomainAuction).where(and_(DomainAuction.is_active == False, DomainAuction.end_time < cutoff))
)
await db.commit()
async def get_active_auctions(
self,
db: AsyncSession,
@ -934,7 +934,7 @@ class AuctionScraperService:
) -> List[DomainAuction]:
"""Get active auctions from database with filters."""
query = select(DomainAuction).where(DomainAuction.is_active == True)
if platform:
query = query.where(DomainAuction.platform == platform)
if tld:
@ -948,7 +948,7 @@ class AuctionScraperService:
if ending_within_hours:
cutoff = datetime.utcnow() + timedelta(hours=ending_within_hours)
query = query.where(DomainAuction.end_time <= cutoff)
if sort_by == "end_time":
query = query.order_by(DomainAuction.end_time.asc())
elif sort_by == "bid_asc":
@ -957,17 +957,17 @@ class AuctionScraperService:
query = query.order_by(DomainAuction.current_bid.desc())
elif sort_by == "bids":
query = query.order_by(DomainAuction.num_bids.desc())
result = await db.execute(query.offset(offset).limit(limit))
return list(result.scalars().all())
async def get_auction_count(self, db: AsyncSession) -> int:
"""Get total count of active auctions."""
from sqlalchemy import func
result = await db.execute(select(func.count(DomainAuction.id)).where(DomainAuction.is_active == True))
return result.scalar() or 0
async def close(self):
"""Close HTTP client."""
if self.http_client and not self.http_client.is_closed:

View File

@ -3,6 +3,7 @@ from datetime import datetime, timedelta
from typing import Optional
import bcrypt
import secrets
from jose import JWTError, jwt
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
@ -92,11 +93,21 @@ class AuthService:
name: Optional[str] = None
) -> User:
"""Create a new user with default subscription."""
async def _generate_unique_invite_code() -> str:
# 12 hex chars; easy to validate + share + embed in URLs.
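            # (6 random bytes = 48 bits of entropy, so collisions are
            # vanishingly rare; the loop below still retries against the DB.)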
for _ in range(12):
code = secrets.token_hex(6)
exists = await db.execute(select(User.id).where(User.invite_code == code))
if exists.scalar_one_or_none() is None:
return code
raise RuntimeError("Failed to generate unique invite code")
# Create user (normalize email to lowercase)
user = User(
email=email.lower().strip(),
hashed_password=AuthService.hash_password(password),
name=name,
invite_code=await _generate_unique_invite_code(),
)
db.add(user)
await db.flush()

View File

@ -0,0 +1,499 @@
"""
ICANN CZDS (Centralized Zone Data Service) Client
==================================================
Downloads zone files from ICANN CZDS, parses them, and detects dropped domains.
Authentication: OAuth2 with username/password
Zone Format: Standard DNS zone file format (.txt.gz)
Usage:
client = CZDSClient(username, password)
await client.sync_all_zones(db)
"""
import asyncio
import gzip
import hashlib
import logging
import os
import re
import shutil
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional
import httpx
from sqlalchemy import select, func, text
from sqlalchemy.ext.asyncio import AsyncSession
from app.config import get_settings
from app.models.zone_file import ZoneSnapshot, DroppedDomain
logger = logging.getLogger(__name__)
settings = get_settings()
# ============================================================================
# CONSTANTS
# ============================================================================
CZDS_AUTH_URL = "https://account-api.icann.org/api/authenticate"
CZDS_ZONES_URL = "https://czds-api.icann.org/czds/downloads/links"
CZDS_DOWNLOAD_BASE = "https://czds-download-api.icann.org"
# TLDs we have approved access to
APPROVED_TLDS = ["xyz", "org", "online", "info", "dev", "app"]
# Regex to extract domain names from zone file NS records
# Format: example.tld. IN NS ns1.example.com.
NS_RECORD_PATTERN = re.compile(r'^([a-z0-9][-a-z0-9]*)\.[a-z]+\.\s+\d*\s*IN\s+NS\s+', re.IGNORECASE)
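# Illustrative helper (not part of the original diff) showing what the pattern
# captures on a typical delegation line; the sample line itself is an assumption.
def _example_ns_capture() -> str:
    m = NS_RECORD_PATTERN.match("example.xyz. 86400 IN NS ns1.registrar.com.")
    return m.group(1) if m else ""  # -> "example"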
# ============================================================================
# CZDS CLIENT
# ============================================================================
class CZDSClient:
"""Client for ICANN CZDS zone file downloads."""
def __init__(
self,
username: Optional[str] = None,
password: Optional[str] = None,
data_dir: Optional[Path] = None
):
self.username = username or os.getenv("CZDS_USERNAME") or settings.czds_username
self.password = password or os.getenv("CZDS_PASSWORD") or settings.czds_password
self.data_dir = data_dir or Path(os.getenv("CZDS_DATA_DIR", "/tmp/pounce_czds"))
self.data_dir.mkdir(parents=True, exist_ok=True)
self._token: Optional[str] = None
self._token_expires: Optional[datetime] = None
async def _authenticate(self) -> str:
"""Authenticate with ICANN and get access token."""
if self._token and self._token_expires and datetime.utcnow() < self._token_expires:
return self._token
if not self.username or not self.password:
raise ValueError("CZDS credentials not configured. Set CZDS_USERNAME and CZDS_PASSWORD.")
logger.info("Authenticating with ICANN CZDS...")
async with httpx.AsyncClient(timeout=30) as client:
response = await client.post(
CZDS_AUTH_URL,
json={"username": self.username, "password": self.password},
headers={"Content-Type": "application/json"}
)
if response.status_code != 200:
logger.error(f"CZDS authentication failed: {response.status_code} {response.text}")
raise RuntimeError(f"CZDS authentication failed: {response.status_code}")
data = response.json()
self._token = data.get("accessToken")
# Token expires in 24 hours, refresh after 23 hours
self._token_expires = datetime.utcnow() + timedelta(hours=23)
logger.info("CZDS authentication successful")
return self._token
async def get_available_zones(self) -> dict[str, str]:
"""
Get list of zone files available for download.
Returns dict mapping TLD to download URL.
"""
token = await self._authenticate()
async with httpx.AsyncClient(timeout=60) as client:
response = await client.get(
CZDS_ZONES_URL,
headers={"Authorization": f"Bearer {token}"}
)
if response.status_code != 200:
logger.error(f"Failed to get zone list: {response.status_code}")
return {}
# Response is a list of download URLs
urls = response.json()
# Extract TLDs and their URLs
zones = {}
for url in urls:
# URL format: https://czds-download-api.icann.org/czds/downloads/xyz.zone
match = re.search(r'/([a-z0-9-]+)\.zone$', url, re.IGNORECASE)
if match:
tld = match.group(1).lower()
zones[tld] = url
logger.info(f"Available zones: {list(zones.keys())}")
return zones
async def download_zone(self, tld: str, download_url: Optional[str] = None) -> Optional[Path]:
"""
Download a zone file for a specific TLD.
Args:
tld: The TLD to download
download_url: Optional explicit download URL (from get_available_zones)
"""
token = await self._authenticate()
# Use provided URL or construct one
if not download_url:
download_url = f"{CZDS_DOWNLOAD_BASE}/czds/downloads/{tld}.zone"
output_path = self.data_dir / f"{tld}.zone.txt.gz"
logger.info(f"Downloading zone file for .{tld} from {download_url}...")
async with httpx.AsyncClient(timeout=600, follow_redirects=True) as client:
try:
async with client.stream(
"GET",
download_url,
headers={"Authorization": f"Bearer {token}"}
) as response:
if response.status_code != 200:
logger.error(f"Failed to download .{tld}: {response.status_code}")
return None
# Stream to file
with open(output_path, "wb") as f:
async for chunk in response.aiter_bytes(chunk_size=1024 * 1024):
f.write(chunk)
file_size = output_path.stat().st_size / (1024 * 1024)
logger.info(f"Downloaded .{tld} zone file: {file_size:.1f} MB")
return output_path
except Exception as e:
logger.error(f"Error downloading .{tld}: {e}")
return None
def extract_zone_file(self, gz_path: Path) -> Path:
"""Extract gzipped zone file."""
output_path = gz_path.with_suffix('') # Remove .gz
logger.info(f"Extracting {gz_path.name}...")
with gzip.open(gz_path, 'rb') as f_in:
with open(output_path, 'wb') as f_out:
shutil.copyfileobj(f_in, f_out)
# Remove gz file to save space
gz_path.unlink()
return output_path
def parse_zone_file(self, zone_path: Path, tld: str) -> set[str]:
"""
Parse zone file and extract unique domain names.
Zone files contain various record types. We extract domains from:
- NS records (most reliable indicator of active domain)
- A/AAAA records
Returns set of domain names (without TLD suffix).
"""
logger.info(f"Parsing zone file for .{tld}...")
domains = set()
line_count = 0
with open(zone_path, 'r', encoding='utf-8', errors='ignore') as f:
for line in f:
line_count += 1
# Skip comments and empty lines
if line.startswith(';') or not line.strip():
continue
# Look for NS records which indicate delegated domains
# Format: example.tld. 86400 IN NS ns1.registrar.com.
parts = line.split()
if len(parts) >= 4:
# First column is the domain name
name = parts[0].rstrip('.')
# Must end with our TLD
if name.lower().endswith(f'.{tld}'):
# Extract just the domain name part
domain_name = name[:-(len(tld) + 1)]
# Skip the TLD itself and subdomains
if domain_name and '.' not in domain_name:
domains.add(domain_name.lower())
logger.info(f"Parsed .{tld}: {len(domains):,} unique domains from {line_count:,} lines")
return domains
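# Worked example (illustrative, not in the original diff) of the suffix stripping
# above, for tld = "xyz":
#   parts[0] = "example.xyz." -> rstrip('.') -> "example.xyz"
#   "example.xyz"[:-(len("xyz") + 1)] -> "example"   (kept)
#   "sub.example.xyz" -> "sub.example"               (skipped: still contains '.')
#   the zone apex "xyz." fails the ".xyz" suffix check and never reaches this step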
def compute_checksum(self, domains: set[str]) -> str:
"""Compute SHA256 checksum of sorted domain list."""
sorted_domains = "\n".join(sorted(domains))
return hashlib.sha256(sorted_domains.encode()).hexdigest()
async def get_previous_domains(self, tld: str) -> Optional[set[str]]:
"""Load previous day's domain set from cache file."""
cache_file = self.data_dir / f"{tld}_domains.txt"
if cache_file.exists():
try:
content = cache_file.read_text()
return set(line.strip() for line in content.splitlines() if line.strip())
except Exception as e:
logger.warning(f"Failed to load cache for .{tld}: {e}")
return None
async def save_domains(self, tld: str, domains: set[str]):
"""Save current domains to cache file."""
cache_file = self.data_dir / f"{tld}_domains.txt"
cache_file.write_text("\n".join(sorted(domains)))
logger.info(f"Saved {len(domains):,} domains for .{tld}")
async def process_drops(
self,
db: AsyncSession,
tld: str,
previous: set[str],
current: set[str]
) -> list[dict]:
"""Find and store dropped domains."""
dropped = previous - current
if not dropped:
logger.info(f"No dropped domains found for .{tld}")
return []
logger.info(f"Found {len(dropped):,} dropped domains for .{tld}")
today = datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
# Batch insert for performance
dropped_records = []
batch_size = 1000
batch = []
for name in dropped:
record = DroppedDomain(
domain=f"{name}.{tld}",
tld=tld,
dropped_date=today,
length=len(name),
is_numeric=name.isdigit(),
has_hyphen='-' in name
)
batch.append(record)
dropped_records.append({
"domain": f"{name}.{tld}",
"length": len(name),
"is_numeric": name.isdigit(),
"has_hyphen": '-' in name
})
if len(batch) >= batch_size:
db.add_all(batch)
await db.flush()
batch = []
# Add remaining
if batch:
db.add_all(batch)
await db.commit()
return dropped_records
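# Core detection (illustrative sketch, not in the diff) is a plain set difference
# between yesterday's and today's snapshots:
#   previous = {"alpha", "bravo", "charlie"}
#   current  = {"alpha", "charlie", "delta"}
#   previous - current -> {"bravo"}   # dropped, stored above
#   current - previous -> {"delta"}   # newly registered, reported by sync_zone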
async def sync_zone(
self,
db: AsyncSession,
tld: str,
download_url: Optional[str] = None
) -> dict:
"""
Sync a single zone file:
1. Download zone file
2. Extract and parse
3. Compare with previous snapshot
4. Store dropped domains
5. Save new snapshot
Args:
db: Database session
tld: TLD to sync
download_url: Optional explicit download URL
"""
logger.info(f"Starting sync for .{tld}")
result = {
"tld": tld,
"status": "pending",
"current_count": 0,
"previous_count": 0,
"dropped_count": 0,
"new_count": 0,
"error": None
}
try:
# Download zone file
gz_path = await self.download_zone(tld, download_url)
if not gz_path:
result["status"] = "download_failed"
result["error"] = "Failed to download zone file"
return result
# Extract
zone_path = self.extract_zone_file(gz_path)
# Parse
current_domains = self.parse_zone_file(zone_path, tld)
result["current_count"] = len(current_domains)
# Clean up zone file (can be very large)
zone_path.unlink()
# Get previous snapshot
previous_domains = await self.get_previous_domains(tld)
if previous_domains:
result["previous_count"] = len(previous_domains)
# Find dropped domains
dropped = await self.process_drops(db, tld, previous_domains, current_domains)
result["dropped_count"] = len(dropped)
result["new_count"] = len(current_domains - previous_domains)
# Save current snapshot
await self.save_domains(tld, current_domains)
# Save snapshot metadata
checksum = self.compute_checksum(current_domains)
snapshot = ZoneSnapshot(
tld=tld,
snapshot_date=datetime.utcnow(),
domain_count=len(current_domains),
checksum=checksum
)
db.add(snapshot)
await db.commit()
result["status"] = "success"
logger.info(
f"Sync complete for .{tld}: "
f"{result['current_count']:,} domains, "
f"{result['dropped_count']:,} dropped, "
f"{result['new_count']:,} new"
)
except Exception as e:
logger.exception(f"Error syncing .{tld}: {e}")
result["status"] = "error"
result["error"] = str(e)
return result
async def sync_all_zones(
self,
db: AsyncSession,
tlds: Optional[list[str]] = None
) -> list[dict]:
"""
Sync all approved zone files.
Args:
db: Database session
tlds: Optional list of TLDs to sync. Defaults to APPROVED_TLDS.
Returns:
List of sync results for each TLD.
"""
target_tlds = tlds or APPROVED_TLDS
# Get available zones with their download URLs
available_zones = await self.get_available_zones()
logger.info(f"Starting CZDS sync for {len(target_tlds)} zones: {target_tlds}")
logger.info(f"Available zones: {list(available_zones.keys())}")
results = []
for tld in target_tlds:
# Get the actual download URL for this TLD
download_url = available_zones.get(tld)
if not download_url:
logger.warning(f"No download URL available for .{tld}")
results.append({
"tld": tld,
"status": "not_available",
"current_count": 0,
"previous_count": 0,
"dropped_count": 0,
"new_count": 0,
"error": f"No access to .{tld} zone"
})
continue
result = await self.sync_zone(db, tld, download_url)
results.append(result)
# Small delay between zones to be nice to ICANN servers
await asyncio.sleep(2)
# Summary
success_count = sum(1 for r in results if r["status"] == "success")
total_dropped = sum(r["dropped_count"] for r in results)
logger.info(
f"CZDS sync complete: "
f"{success_count}/{len(target_tlds)} zones successful, "
f"{total_dropped:,} total dropped domains"
)
return results
# ============================================================================
# STANDALONE SCRIPT
# ============================================================================
async def main():
"""Standalone script to run CZDS sync."""
import sys
from app.database import AsyncSessionLocal, init_db
# Initialize database
await init_db()
# Parse arguments
tlds = sys.argv[1:] if len(sys.argv) > 1 else APPROVED_TLDS
print(f"🌐 CZDS Zone File Sync")
print(f"📂 TLDs: {', '.join(tlds)}")
print("-" * 50)
client = CZDSClient()
async with AsyncSessionLocal() as db:
results = await client.sync_all_zones(db, tlds)
print("\n" + "=" * 50)
print("📊 RESULTS")
print("=" * 50)
for r in results:
status_icon = "✅" if r["status"] == "success" else "❌"
print(f"{status_icon} .{r['tld']}: {r['current_count']:,} domains, "
f"{r['dropped_count']:,} dropped, {r['new_count']:,} new")
if r["error"]:
print(f" ⚠️ Error: {r['error']}")
total_dropped = sum(r["dropped_count"] for r in results)
print(f"\n🎯 Total dropped domains: {total_dropped:,}")
if __name__ == "__main__":
asyncio.run(main())

View File

@ -0,0 +1,201 @@
"""
DB backup utilities (4B Ops).
Supports:
- SQLite: file copy + integrity_check verification
- Postgres: pg_dump custom format + pg_restore --list verification
This is real ops code: it will fail loudly if the platform tooling isn't available.
"""
from __future__ import annotations
import os
import shutil
import subprocess
from dataclasses import dataclass
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional
from sqlalchemy.engine.url import make_url
from app.config import get_settings
settings = get_settings()
@dataclass(frozen=True)
class BackupResult:
path: str
size_bytes: int
created_at: str
verified: bool
verification_detail: Optional[str] = None
def _backup_root() -> Path:
root = Path(settings.backup_dir)
if not root.is_absolute():
# Keep backups next to backend working dir by default
root = (Path.cwd() / root).resolve()
root.mkdir(parents=True, exist_ok=True)
return root
def _timestamp() -> str:
return datetime.utcnow().strftime("%Y%m%dT%H%M%SZ")
def _cleanup_old_backups(root: Path, retention_days: int) -> int:
if retention_days <= 0:
return 0
cutoff = datetime.utcnow() - timedelta(days=retention_days)
removed = 0
for p in root.glob("*"):
if not p.is_file():
continue
try:
mtime = datetime.utcfromtimestamp(p.stat().st_mtime)
if mtime < cutoff:
p.unlink()
removed += 1
except Exception:
continue
return removed
def _sqlite_path_from_url(database_url: str) -> Path:
url = make_url(database_url)
db_path = url.database
if not db_path:
raise RuntimeError("SQLite database path missing in DATABASE_URL")
p = Path(db_path)
if not p.is_absolute():
p = (Path.cwd() / p).resolve()
return p
def _verify_sqlite(path: Path) -> tuple[bool, str]:
import sqlite3
conn = sqlite3.connect(str(path))
try:
row = conn.execute("PRAGMA integrity_check;").fetchone()
ok = bool(row and str(row[0]).lower() == "ok")
return ok, str(row[0]) if row else "no result"
finally:
conn.close()
def _pg_dump_backup(database_url: str, out_file: Path) -> None:
url = make_url(database_url)
if not url.database:
raise RuntimeError("Postgres database name missing in DATABASE_URL")
env = os.environ.copy()
if url.password:
env["PGPASSWORD"] = str(url.password)
cmd = [
"pg_dump",
"--format=custom",
"--no-owner",
"--no-privileges",
"--file",
str(out_file),
]
if url.host:
cmd += ["--host", str(url.host)]
if url.port:
cmd += ["--port", str(url.port)]
if url.username:
cmd += ["--username", str(url.username)]
cmd += [str(url.database)]
proc = subprocess.run(cmd, env=env, capture_output=True, text=True)
if proc.returncode != 0:
raise RuntimeError(f"pg_dump failed: {proc.stderr.strip() or proc.stdout.strip()}")
def _verify_pg_dump(out_file: Path) -> tuple[bool, str]:
# Basic size check
if out_file.stat().st_size < 1024:
return False, "backup file too small"
proc = subprocess.run(
["pg_restore", "--list", str(out_file)],
capture_output=True,
text=True,
)
if proc.returncode != 0:
return False, proc.stderr.strip() or proc.stdout.strip() or "pg_restore failed"
return True, "pg_restore --list OK"
def create_backup(*, verify: bool = True) -> BackupResult:
root = _backup_root()
_cleanup_old_backups(root, settings.backup_retention_days)
db_url = settings.database_url
driver = make_url(db_url).drivername
created_at = datetime.utcnow().isoformat() + "Z"
if driver.startswith("sqlite"):
src = _sqlite_path_from_url(db_url)
if not src.exists():
raise RuntimeError(f"SQLite DB file not found: {src}")
out = root / f"sqlite-backup-{_timestamp()}{src.suffix or '.db'}"
shutil.copy2(src, out)
ok = True
detail = None
if verify:
ok, detail = _verify_sqlite(out)
if not ok:
raise RuntimeError(f"SQLite backup verification failed: {detail}")
return BackupResult(
path=str(out),
size_bytes=out.stat().st_size,
created_at=created_at,
verified=ok,
verification_detail=detail,
)
if driver.startswith("postgresql"):
out = root / f"pg-backup-{_timestamp()}.dump"
_pg_dump_backup(db_url, out)
ok = True
detail = None
if verify:
ok, detail = _verify_pg_dump(out)
if not ok:
raise RuntimeError(f"Postgres backup verification failed: {detail}")
return BackupResult(
path=str(out),
size_bytes=out.stat().st_size,
created_at=created_at,
verified=ok,
verification_detail=detail,
)
raise RuntimeError(f"Unsupported database driver for backups: {driver}")
def list_backups(limit: int = 20) -> list[dict]:
root = _backup_root()
files = [p for p in root.glob("*") if p.is_file()]
files.sort(key=lambda p: p.stat().st_mtime, reverse=True)
out: list[dict] = []
for p in files[: max(1, limit)]:
st = p.stat()
out.append(
{
"name": p.name,
"path": str(p),
"size_bytes": st.st_size,
"modified_at": datetime.utcfromtimestamp(st.st_mtime).isoformat() + "Z",
}
)
return out
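# Usage sketch (illustrative; the import path is an assumption, not part of the diff):
#
#   from app.services.backup import create_backup, list_backups
#
#   result = create_backup(verify=True)
#   print(result.path, result.size_bytes, result.verified)
#   for b in list_backups(limit=5):
#       print(b["name"], b["size_bytes"], b["modified_at"])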

View File

@ -265,8 +265,8 @@ class DomainChecker:
return None
except Exception as e:
logger.warning(f"Custom RDAP error for {domain}: {e}")
return None
return None
async def _check_rdap(self, domain: str) -> Optional[DomainCheckResult]:
"""
Check domain using RDAP (Registration Data Access Protocol).
@ -460,11 +460,11 @@ class DomainChecker:
]
if any(phrase in error_str for phrase in not_found_phrases):
return DomainCheckResult(
domain=domain,
status=DomainStatus.AVAILABLE,
is_available=True,
check_method="whois",
)
domain=domain,
status=DomainStatus.AVAILABLE,
is_available=True,
check_method="whois",
)
# Otherwise it's a real error
return DomainCheckResult(
domain=domain,

View File

@ -16,6 +16,7 @@ import logging
import ssl
import socket
import re
import ipaddress
from datetime import datetime, timezone, timedelta
from dataclasses import dataclass, field
from typing import Optional, List, Dict, Any
@ -173,6 +174,45 @@ class DomainHealthChecker:
self._dns_resolver = dns.resolver.Resolver()
self._dns_resolver.timeout = 3
self._dns_resolver.lifetime = 5
def _is_public_ip(self, ip: str) -> bool:
try:
addr = ipaddress.ip_address(ip)
return bool(getattr(addr, "is_global", False))
except Exception:
return False
async def _ssrf_guard(self, domain: str) -> tuple[bool, str | None]:
"""
SSRF hardening for HTTP/SSL probes.
We block domains that resolve exclusively to non-public IPs.
"""
loop = asyncio.get_event_loop()
def _resolve_ips() -> list[str]:
ips: list[str] = []
try:
a = self._dns_resolver.resolve(domain, "A")
ips.extend([str(r.address) for r in a])
except Exception:
pass
try:
aaaa = self._dns_resolver.resolve(domain, "AAAA")
ips.extend([str(r.address) for r in aaaa])
except Exception:
pass
# de-dup
return list(dict.fromkeys([i.strip() for i in ips if i]))
ips = await loop.run_in_executor(None, _resolve_ips)
if not ips:
return True, None # nothing to block; will fail naturally if unreachable
if any(self._is_public_ip(ip) for ip in ips):
return True, None
return False, f"blocked_ssrf: resolved_non_public_ips={ips}"
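# Behavior sketch (illustrative, not in the diff) for _is_public_ip via ipaddress:
#   "8.8.8.8"     -> True   (globally routable)
#   "10.0.0.5"    -> False  (RFC 1918 private)
#   "127.0.0.1"   -> False  (loopback)
#   "169.254.1.1" -> False  (link-local)
#   "not-an-ip"   -> False  (parse error)
# A domain whose A/AAAA answers are all non-public is therefore rejected before
# any HTTP/SSL probe is made.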
async def check_domain(self, domain: str) -> DomainHealthReport:
"""
@ -299,10 +339,15 @@ class DomainHealthChecker:
- Parking/for-sale detection
"""
result = HTTPCheckResult()
allowed, reason = await self._ssrf_guard(domain)
if not allowed:
result.error = reason
return result
async with httpx.AsyncClient(
timeout=10.0,
follow_redirects=True,
follow_redirects=False,
headers={
"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36"
}
@ -311,7 +356,24 @@ class DomainHealthChecker:
url = f"{scheme}://{domain}"
try:
start = asyncio.get_event_loop().time()
response = await client.get(url)
# Follow redirects manually with host/IP guard
current_url = url
for _ in range(5):
response = await client.get(current_url)
if response.status_code in (301, 302, 303, 307, 308) and response.headers.get("location"):
next_url = str(httpx.URL(current_url).join(response.headers["location"]))
next_host = httpx.URL(next_url).host
if not next_host:
break
ok, why = await self._ssrf_guard(next_host)
if not ok:
result.error = why
return result
current_url = next_url
continue
break
end = asyncio.get_event_loop().time()
result.status_code = response.status_code
@ -320,7 +382,7 @@ class DomainHealthChecker:
result.response_time_ms = (end - start) * 1000
# Check for redirects
if response.history:
if str(response.url) != url:
result.redirect_url = str(response.url)
# Check for parking keywords in content
@ -355,6 +417,11 @@ class DomainHealthChecker:
2. On validation failure, extract cert info without validation
"""
result = SSLCheckResult()
allowed, reason = await self._ssrf_guard(domain)
if not allowed:
result.error = reason
return result
loop = asyncio.get_event_loop()

View File

@ -22,10 +22,12 @@ Environment Variables Required:
"""
import logging
import os
import uuid
from typing import Optional, List
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart
from datetime import datetime
from email.utils import formatdate
import aiosmtplib
from jinja2 import Template
@ -273,6 +275,11 @@ TEMPLATES = {
Visit pounce.ch
</a>
</div>
{% if unsubscribe_url %}
<p style="margin: 32px 0 0 0; font-size: 12px; color: #999999; line-height: 1.6;">
<a href="{{ unsubscribe_url }}" style="color: #666666; text-decoration: none;">Unsubscribe</a>
</p>
{% endif %}
""",
"listing_inquiry": """
@ -303,6 +310,26 @@ TEMPLATES = {
<p style="margin: 24px 0 0 0; font-size: 13px; color: #999999;">
<a href="https://pounce.ch/terminal/listing" style="color: #666666;">Manage your listings →</a>
</p>
""",
"listing_message": """
<h2 style="margin: 0 0 24px 0; font-size: 20px; font-weight: 600; color: #000000;">
New message on {{ domain }}
</h2>
<p style="margin: 0 0 16px 0; font-size: 15px; color: #333333; line-height: 1.6;">
From: <strong style="color:#000000;">{{ sender_name }}</strong>
</p>
<div style="margin: 16px 0; padding: 20px; background: #fafafa; border-radius: 6px; border-left: 3px solid #000000;">
<p style="margin: 0; font-size: 14px; color: #333333; line-height: 1.6; white-space: pre-wrap;">{{ message }}</p>
</div>
<div style="margin: 24px 0 0 0;">
<a href="{{ thread_url }}" style="display: inline-block; padding: 12px 32px; background: #000000; color: #ffffff; text-decoration: none; border-radius: 6px; font-size: 15px; font-weight: 500;">
Open thread
</a>
</div>
<p style="margin: 16px 0 0 0; font-size: 13px; color: #999999;">
Sent: {{ timestamp }}
</p>
""",
}
@ -341,6 +368,7 @@ class EmailService:
subject: str,
html_content: str,
text_content: Optional[str] = None,
headers: Optional[dict[str, str]] = None,
) -> bool:
"""
Send an email via SMTP.
@ -364,6 +392,15 @@ class EmailService:
msg["Subject"] = subject
msg["From"] = f"{SMTP_CONFIG['from_name']} <{SMTP_CONFIG['from_email']}>"
msg["To"] = to_email
msg["Date"] = formatdate(localtime=False)
msg["Message-ID"] = EmailService._make_message_id()
msg["Reply-To"] = SMTP_CONFIG["from_email"]
# Optional extra headers (deliverability + RFC 8058 List-Unsubscribe)
if headers:
for k, v in headers.items():
if v:
msg[k] = v
# Add text part (fallback)
if text_content:
@ -400,6 +437,16 @@ class EmailService:
except Exception as e:
logger.error(f"Failed to send email to {to_email}: {e}")
return False
@staticmethod
def _make_message_id() -> str:
"""
Generate a stable Message-ID with the sender domain.
Helps deliverability and threading in some clients.
"""
from_email = str(SMTP_CONFIG.get("from_email") or "hello@pounce.ch")
domain = from_email.split("@")[-1] if "@" in from_email else "pounce.ch"
return f"<{uuid.uuid4().hex}@{domain}>"
# ============== Domain Alerts ==============
@ -601,15 +648,22 @@ class EmailService:
@staticmethod
async def send_newsletter_welcome(
to_email: str,
unsubscribe_url: Optional[str] = None,
) -> bool:
"""Send newsletter subscription welcome email."""
html = EmailService._render_email("newsletter_welcome")
html = EmailService._render_email("newsletter_welcome", unsubscribe_url=unsubscribe_url)
extra_headers: dict[str, str] = {}
if unsubscribe_url:
extra_headers["List-Unsubscribe"] = f"<{unsubscribe_url}>"
extra_headers["List-Unsubscribe-Post"] = "List-Unsubscribe=One-Click"
return await EmailService.send_email(
to_email=to_email,
subject="You're on the list. Welcome to POUNCE.",
html_content=html,
text_content="Welcome to POUNCE Insights. Expect market moves, strategies, and feature drops. No spam.",
headers=extra_headers or None,
)
# ============== Listing Inquiries ==============
@ -646,6 +700,32 @@ class EmailService:
text_content=f"New inquiry from {name} ({email}) for {domain}. Message: {message}",
)
@staticmethod
async def send_listing_message(
to_email: str,
domain: str,
sender_name: str,
message: str,
thread_url: str,
) -> bool:
"""Send notification when a new in-product message is posted."""
html = EmailService._render_email(
"listing_message",
domain=domain,
sender_name=sender_name,
message=message,
thread_url=thread_url,
timestamp=datetime.utcnow().strftime("%Y-%m-%d %H:%M UTC"),
)
subject = f"New message on {domain}"
return await EmailService.send_email(
to_email=to_email,
subject=subject,
html_content=html,
text_content=f"New message on {domain} from {sender_name}: {message}",
)
# Global instance
email_service = EmailService()

View File

@ -26,7 +26,7 @@ logger = logging.getLogger(__name__)
def build_affiliate_url(platform: str, domain: str, original_url: Optional[str] = None) -> str:
"""
Build an affiliate URL for a given platform and domain.
If the affiliate program is not configured, returns the plain provider URL.
If `original_url` is provided, it is preferred (e.g. ExpiredDomains click-through links).
"""
@ -73,14 +73,14 @@ def build_affiliate_url(platform: str, domain: str, original_url: Optional[str]
class DynadotApiScraper:
"""
Scraper for Dynadot Marketplace using their hidden JSON API.
Endpoint:
- https://www.dynadot.com/dynadot-vue-api/dynadot-service/marketplace-api
"""
BASE_URL = "https://www.dynadot.com"
MARKETPLACE_API = "/dynadot-vue-api/dynadot-service/marketplace-api"
def _parse_end_time(self, item: Dict[str, Any]) -> Optional[datetime]:
# Dynadot often provides an epoch timestamp in ms
end_time_stamp = item.get("end_time_stamp")
@ -121,7 +121,7 @@ class DynadotApiScraper:
}
if keyword:
params["keyword"] = keyword
resp = await client.post(
f"{self.BASE_URL}{self.MARKETPLACE_API}",
params=params,
@ -131,13 +131,13 @@ class DynadotApiScraper:
"Referer": "https://www.dynadot.com/market",
},
)
if resp.status_code != 200:
return {"items": [], "total": 0, "error": f"HTTP {resp.status_code}: {resp.text[:200]}"}
data = resp.json()
listings = data.get("data", {}).get("records", []) or data.get("data", {}).get("list", [])
transformed: List[Dict[str, Any]] = []
for item in listings:
domain = item.get("domain") or item.get("name") or item.get("utf8_name") or ""
@ -170,21 +170,21 @@ class DynadotApiScraper:
transformed.append(
{
"domain": domain,
"domain": domain,
"tld": tld,
"platform": "Dynadot",
"platform": "Dynadot",
"current_bid": current_bid,
"currency": str(item.get("bid_price_currency") or "USD").upper(),
"num_bids": num_bids,
"end_time": end_time,
"auction_url": build_affiliate_url("Dynadot", domain),
"buy_now_price": float(item.get("accepted_bid_price")) if item.get("accepted_bid_price") else None,
"buy_now_price": float(item.get("accepted_bid_price")) if item.get("accepted_bid_price") else None,
"age_years": int(item.get("age", 0) or 0) or None,
"backlinks": int(item.get("links", 0) or 0) or None,
"scrape_source": "dynadot:hidden_api",
}
)
return {
"items": transformed,
"total": data.get("data", {}).get("total_count", len(transformed)),
@ -197,10 +197,10 @@ class DynadotApiScraper:
class HiddenApiScraperService:
"""Orchestrates enabled hidden API scrapers."""
def __init__(self):
self.dynadot = DynadotApiScraper()
async def scrape_all(self, limit_per_platform: int = 100) -> Dict[str, Any]:
results: Dict[str, Any] = {"total_found": 0, "platforms": {}, "errors": [], "items": []}
@ -212,12 +212,12 @@ class HiddenApiScraperService:
}
results["items"].extend(dynadot_data.get("items", []))
results["total_found"] += len(dynadot_data.get("items", []))
if dynadot_data.get("error"):
results["errors"].append(f"Dynadot: {dynadot_data['error']}")
except Exception as e:
results["errors"].append(f"Dynadot: {str(e)}")
return results

View File

@ -0,0 +1,2 @@
"""HUNT services package."""

View File

@ -0,0 +1,76 @@
"""
Brandable generator (no external APIs).
Generates pronounceable strings and checks availability via internal DomainChecker.
"""
from __future__ import annotations
import asyncio
import secrets
from dataclasses import dataclass
from app.services.domain_checker import domain_checker
VOWELS = "aeiou"
CONSONANTS = "bcdfghjklmnpqrstvwxz"
def _rand_choice(alphabet: str) -> str:
return alphabet[secrets.randbelow(len(alphabet))]
def generate_cvcvc() -> str:
return (
_rand_choice(CONSONANTS)
+ _rand_choice(VOWELS)
+ _rand_choice(CONSONANTS)
+ _rand_choice(VOWELS)
+ _rand_choice(CONSONANTS)
)
def generate_cvccv() -> str:
return (
_rand_choice(CONSONANTS)
+ _rand_choice(VOWELS)
+ _rand_choice(CONSONANTS)
+ _rand_choice(CONSONANTS)
+ _rand_choice(VOWELS)
)
HUMAN_SUFFIXES = ["ly", "ri", "ro", "na", "no", "mi", "li", "sa", "ta", "ya"]
def generate_human() -> str:
# two syllable-ish: CV + CV + suffix
base = _rand_choice(CONSONANTS) + _rand_choice(VOWELS) + _rand_choice(CONSONANTS) + _rand_choice(VOWELS)
return base + HUMAN_SUFFIXES[secrets.randbelow(len(HUMAN_SUFFIXES))]
@dataclass(frozen=True)
class AvailabilityResult:
domain: str
is_available: bool | None
status: str
async def check_domains(domains: list[str], *, concurrency: int = 25) -> list[AvailabilityResult]:
sem = asyncio.Semaphore(concurrency)
async def _one(d: str) -> AvailabilityResult:
async with sem:
try:
res = await domain_checker.check_domain(d, quick=True)
return AvailabilityResult(
domain=d,
is_available=bool(res.is_available),
status="available" if res.is_available else "taken",
)
except Exception:
return AvailabilityResult(domain=d, is_available=None, status="unknown")
return list(await asyncio.gather(*[_one(d) for d in domains]))
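# Usage sketch (illustrative, not part of the diff; the ".com" suffix is an assumption):
#
# async def _demo() -> None:
#     names = {generate_cvcvc() for _ in range(20)} | {generate_human() for _ in range(10)}
#     for r in await check_domains([f"{n}.com" for n in names], concurrency=10):
#         if r.is_available:
#             print(r.domain)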

View File

@ -0,0 +1,60 @@
"""
Trend Surfer: fetch trending search queries via public RSS (no API key).
Note: This still performs an external HTTP request to the Google Trends RSS feed.
It is a public endpoint rather than a paid API, so no key is required.
"""
from __future__ import annotations
from datetime import datetime, timezone
from typing import Optional
from xml.etree import ElementTree as ET
import httpx
async def fetch_google_trends_daily_rss(geo: str = "US", *, timeout_seconds: float = 10.0) -> list[dict]:
geo = (geo or "US").upper().strip()
# Google has changed/retired older RSS paths (e.g. /trends/trendingsearches/daily/rss).
# This endpoint is currently the stable public feed for daily search trends.
url = f"https://trends.google.com/trending/rss?geo={geo}"
headers = {
# Use a browser-like UA; Google returns 404 for some automated clients otherwise.
"User-Agent": "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36",
"Accept": "application/rss+xml, application/xml;q=0.9, text/xml;q=0.8, */*;q=0.5",
}
async with httpx.AsyncClient(timeout=timeout_seconds, follow_redirects=True, headers=headers) as client:
res = await client.get(url)
res.raise_for_status()
xml = res.text
root = ET.fromstring(xml)
items: list[dict] = []
for item in root.findall(".//item"):
title = item.findtext("title") or ""
link = item.findtext("link")
pub = item.findtext("pubDate")
approx = item.findtext("{*}approx_traffic") # namespaced
published_at: Optional[datetime] = None
if pub:
try:
# Example: "Sat, 14 Dec 2025 00:00:00 +0000"
published_at = datetime.strptime(pub, "%a, %d %b %Y %H:%M:%S %z").astimezone(timezone.utc)
except Exception:
published_at = None
if title.strip():
items.append(
{
"title": title.strip(),
"approx_traffic": approx.strip() if approx else None,
"published_at": published_at,
"link": link,
}
)
return items
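# Usage sketch (illustrative, not part of the diff):
#
# import asyncio
#
# async def _demo() -> None:
#     for item in await fetch_google_trends_daily_rss("CH"):
#         print(item["title"], item["approx_traffic"], item["published_at"])
#
# asyncio.run(_demo())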

View File

@ -0,0 +1,89 @@
"""
Typo generator for Trend Surfer / brand typos.
No external APIs.
"""
from __future__ import annotations
import string
KEYBOARD_NEIGHBORS = {
# simplified QWERTY adjacency (enough for useful typos)
"q": "wa",
"w": "qase",
"e": "wsdr",
"r": "edft",
"t": "rfgy",
"y": "tghu",
"u": "yhji",
"i": "ujko",
"o": "iklp",
"p": "ol",
"a": "qwsz",
"s": "awedxz",
"d": "serfcx",
"f": "drtgvc",
"g": "ftyhbv",
"h": "gyujbn",
"j": "huikmn",
"k": "jiolm",
"l": "kop",
"z": "asx",
"x": "zsdc",
"c": "xdfv",
"v": "cfgb",
"b": "vghn",
"n": "bhjm",
"m": "njk",
}
def _normalize_brand(brand: str) -> str:
b = (brand or "").lower().strip()
b = "".join(ch for ch in b if ch in string.ascii_lowercase)
return b
def generate_typos(brand: str, *, limit: int = 100) -> list[str]:
b = _normalize_brand(brand)
if len(b) < 2:
return []
candidates: list[str] = []
seen = set()
def _add(s: str):
if s and s not in seen and s != b:
seen.add(s)
candidates.append(s)
# 1) single deletion
for i in range(len(b)):
_add(b[:i] + b[i + 1 :])
if len(candidates) >= limit:
return candidates
# 2) single insertion (duplicate char)
for i in range(len(b)):
_add(b[:i] + b[i] + b[i:])
if len(candidates) >= limit:
return candidates
# 3) adjacent transposition
for i in range(len(b) - 1):
_add(b[:i] + b[i + 1] + b[i] + b[i + 2 :])
if len(candidates) >= limit:
return candidates
# 4) neighbor substitution
for i, ch in enumerate(b):
neigh = KEYBOARD_NEIGHBORS.get(ch, "")
for n in neigh:
_add(b[:i] + n + b[i + 1 :])
if len(candidates) >= limit:
return candidates
return candidates[:limit]
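# Worked example (illustrative): generate_typos("pounce") yields, in stage order,
# deletions ("ounce", "punce", "ponce", "pouce", "poune", "pounc"),
# duplications ("ppounce", "poounce", ...), transpositions ("opunce", "puonce", ...)
# and QWERTY neighbor substitutions ("oounce", "lounce", ...), deduplicated and
# capped at `limit`.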

View File

@ -217,6 +217,33 @@ INTENT_PATTERNS = {
"partners": ["capterra", "g2"]
},
# Investment / Crypto / Finance Tech - HIGH POUNCE CONVERSION
"investment_domains": {
"keywords": [
"invest", "investment", "investor", "portfolio", "asset", "assets",
"trading", "trader", "crypto", "bitcoin", "blockchain", "nft",
"domain", "domains", "digital", "passive", "income", "yield",
"startup", "founder", "entrepreneur", "venture", "capital"
],
"patterns": [r"invest\w*", r"trad\w*", r"crypto\w*", r"domain\w*"],
"potential": "high",
"partners": ["pounce_promo"], # Pounce self-promotion
"pounce_affinity": True, # Flag for Pounce self-promotion
},
# Tech / Developer - GOOD POUNCE CONVERSION
"tech_dev": {
"keywords": [
"dev", "developer", "code", "coding", "tech", "technology",
"api", "sdk", "github", "git", "open-source", "opensource",
"web3", "defi", "fintech", "proptech", "saas"
],
"patterns": [r"dev\w*", r"tech\w*", r"web\d*"],
"potential": "medium",
"partners": ["pounce_promo"],
"pounce_affinity": True,
},
# Food / Restaurant
"food_restaurant": {
"keywords": [

View File

@ -0,0 +1,256 @@
"""
Ops alerting (4B) without external monitoring stack.
Runs in the scheduler process:
- checks backup freshness (if backups enabled)
- checks basic 24h business signals from telemetry (deal inquiries / yield clicks)
- sends an aggregated email alert with cooldown to avoid spam
"""
from __future__ import annotations
import logging
import os
from dataclasses import dataclass
from datetime import datetime, timedelta
from pathlib import Path
from sqlalchemy import and_, func, select
from app.config import get_settings
from app.database import AsyncSessionLocal
from app.models.ops_alert import OpsAlertEvent
from app.models.telemetry import TelemetryEvent
from app.services.email_service import CONTACT_EMAIL, email_service
logger = logging.getLogger(__name__)
settings = get_settings()
@dataclass(frozen=True)
class OpsFinding:
key: str
severity: str # "warn" | "page"
title: str
detail: str
def _parse_recipients(raw: str) -> list[str]:
emails = [e.strip() for e in (raw or "").split(",") if e.strip()]
if emails:
return emails
fallback = (CONTACT_EMAIL or os.getenv("CONTACT_EMAIL", "")).strip()
return [fallback] if fallback else []
def _backup_root() -> Path:
root = Path(settings.backup_dir)
if not root.is_absolute():
root = (Path.cwd() / root).resolve()
return root
def _latest_backup_age_seconds() -> float | None:
root = _backup_root()
if not root.exists() or not root.is_dir():
return None
files = [p for p in root.glob("*") if p.is_file()]
if not files:
return None
latest = max(files, key=lambda p: p.stat().st_mtime)
now = datetime.utcnow().timestamp()
return max(0.0, now - float(latest.stat().st_mtime))
async def evaluate_ops_findings() -> list[OpsFinding]:
findings: list[OpsFinding] = []
# Backup stale check
if settings.enable_db_backups:
age = _latest_backup_age_seconds()
if age is None:
findings.append(
OpsFinding(
key="backup_missing",
severity="page",
title="DB backups enabled but no backup file found",
detail=f"backup_dir={_backup_root()}",
)
)
elif age > float(settings.ops_alert_backup_stale_seconds):
findings.append(
OpsFinding(
key="backup_stale",
severity="page",
title="DB backup is stale",
detail=f"latest_backup_age_seconds={int(age)} threshold={int(settings.ops_alert_backup_stale_seconds)}",
)
)
# 24h telemetry signal checks (business sanity)
end = datetime.utcnow()
start = end - timedelta(days=1)
async with AsyncSessionLocal() as db:
inquiries_24h = (
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "inquiry_created",
)
)
)
).scalar() or 0
yield_clicks_24h = (
await db.execute(
select(func.count(TelemetryEvent.id)).where(
and_(
TelemetryEvent.created_at >= start,
TelemetryEvent.created_at <= end,
TelemetryEvent.event_name == "yield_click",
)
)
)
).scalar() or 0
if int(inquiries_24h) == 0:
findings.append(
OpsFinding(
key="deal_inquiries_zero_24h",
severity="warn",
title="No inquiries created in last 24h",
detail="Deal funnel might be broken or traffic is zero.",
)
)
if int(yield_clicks_24h) == 0:
findings.append(
OpsFinding(
key="yield_clicks_zero_24h",
severity="warn",
title="No yield clicks in last 24h",
detail="Yield routing might be misconfigured or traffic is zero.",
)
)
return findings
async def _cooldown_ok(db, key: str) -> bool:
cooldown = max(5, int(settings.ops_alert_cooldown_minutes))
cutoff = datetime.utcnow() - timedelta(minutes=cooldown)
last_sent = (
await db.execute(
select(OpsAlertEvent.created_at)
.where(
OpsAlertEvent.alert_key == key,
OpsAlertEvent.status == "sent",
OpsAlertEvent.created_at >= cutoff,
)
.order_by(OpsAlertEvent.created_at.desc())
.limit(1)
)
).scalar_one_or_none()
return last_sent is None
async def send_ops_alerts(findings: list[OpsFinding]) -> dict:
recipients = _parse_recipients(settings.ops_alert_recipients)
if not recipients:
logger.warning("Ops alerts enabled but no recipients configured (OPS_ALERT_RECIPIENTS/CONTACT_EMAIL).")
return {"sent": 0, "skipped": len(findings), "reason": "no_recipients"}
if not email_service.is_configured():
return {"sent": 0, "skipped": len(findings), "reason": "smtp_not_configured"}
async with AsyncSessionLocal() as db:
actionable: list[OpsFinding] = []
skipped = 0
for f in findings:
if await _cooldown_ok(db, f.key):
actionable.append(f)
else:
skipped += 1
db.add(
OpsAlertEvent(
alert_key=f.key,
severity=f.severity,
title=f.title,
detail=f.detail,
status="skipped",
recipients=",".join(recipients) if recipients else None,
send_reason="cooldown",
)
)
if not actionable:
await db.commit()
return {"sent": 0, "skipped": len(findings), "reason": "cooldown"}
sev = "PAGE" if any(f.severity == "page" for f in actionable) else "WARN"
subject = f"[pounce][{sev}] Ops alerts ({len(actionable)})"
items_html = "".join(
f"""
<div style="padding: 12px 14px; background: #fafafa; border-radius: 8px; border-left: 3px solid {'#ef4444' if f.severity=='page' else '#f59e0b'}; margin: 10px 0;">
<div style="font-weight:600; color:#000;">{f.title}</div>
<div style="margin-top:6px; font-size:13px; color:#444; font-family: ui-monospace, SFMono-Regular, Menlo, Monaco, Consolas, 'Liberation Mono', 'Courier New', monospace;">
{f.key}: {f.detail}
</div>
</div>
""".strip()
for f in actionable
)
html = f"""
<h2 style="margin: 0 0 16px 0; font-size: 18px; font-weight: 700; color: #000000;">
Ops alerts
</h2>
<p style="margin: 0 0 16px 0; color:#333; line-height:1.6;">
Detected {len(actionable)} issue(s). (Cooldown: {int(settings.ops_alert_cooldown_minutes)} min)
</p>
{items_html}
<p style="margin: 18px 0 0 0; font-size: 12px; color:#777;">
Timestamp: {datetime.utcnow().isoformat()}Z
</p>
""".strip()
text = "\n".join([f"- [{f.severity.upper()}] {f.title} ({f.key}) :: {f.detail}" for f in actionable])
sent = 0
for to in recipients:
ok = await email_service.send_email(to_email=to, subject=subject, html_content=html, text_content=text)
sent += 1 if ok else 0
# Persist sent events for cooldown + history
async with AsyncSessionLocal() as db:
for f in actionable:
db.add(
OpsAlertEvent(
alert_key=f.key,
severity=f.severity,
title=f.title,
detail=f.detail,
status="sent" if sent else "error",
recipients=",".join(recipients) if recipients else None,
send_reason=None if sent else "send_failed",
)
)
await db.commit()
return {"sent": sent, "actionable": len(actionable), "recipients": recipients}
async def run_ops_alert_checks() -> dict:
"""
Entry point for scheduler/admin.
Returns findings + send status (if enabled).
"""
findings = await evaluate_ops_findings()
if not settings.ops_alerts_enabled:
return {"enabled": False, "findings": [f.__dict__ for f in findings]}
send_status = await send_ops_alerts(findings)
return {"enabled": True, "findings": [f.__dict__ for f in findings], "send": send_status}

View File

@ -0,0 +1,245 @@
"""
Referral rewards (3C.2).
Goals:
- Deterministic, abuse-resistant rewards
- No manual state tracking per referral; we compute from authoritative DB state
- Idempotent updates (can be run via scheduler and on-demand)
Current reward:
- For every N qualified referrals, grant +M bonus watchlist domain slots.
Qualified referral definition:
- referred user has `users.referred_by_user_id = referrer.id`
- referred user is_active AND is_verified
- referred user has a currently active subscription on a paid tier (Trader or Tycoon, not Scout)
"""
from __future__ import annotations
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional
from sqlalchemy import and_, func, or_, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.config import get_settings
from app.models.subscription import Subscription, SubscriptionStatus, SubscriptionTier
from app.models.telemetry import TelemetryEvent
from app.models.user import User
QUALIFIED_REFERRAL_BATCH_SIZE = 3
BONUS_DOMAINS_PER_BATCH = 5
settings = get_settings()
def compute_bonus_domains(qualified_referrals: int) -> int:
if qualified_referrals <= 0:
return 0
batches = qualified_referrals // QUALIFIED_REFERRAL_BATCH_SIZE
return int(batches * BONUS_DOMAINS_PER_BATCH)
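# Worked example (illustrative) with the constants above (batch size 3, 5 slots each):
#   compute_bonus_domains(2) ->  0   (2 // 3 = 0 batches)
#   compute_bonus_domains(7) -> 10   (7 // 3 = 2 batches * 5 slots)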
def compute_badge(qualified_referrals: int) -> Optional[str]:
if qualified_referrals >= 10:
return "elite_referrer"
if qualified_referrals >= 3:
return "verified_referrer"
return None
@dataclass(frozen=True)
class ReferralRewardSnapshot:
referrer_user_id: int
referred_users_total: int
qualified_referrals_total: int
cooldown_days: int
disqualified_cooldown_total: int
disqualified_missing_ip_total: int
disqualified_shared_ip_total: int
disqualified_duplicate_ip_total: int
bonus_domains: int
badge: Optional[str]
computed_at: datetime
async def get_referral_reward_snapshot(db: AsyncSession, referrer_user_id: int) -> ReferralRewardSnapshot:
# Total referred users (all-time)
referred_users_total = int(
(
await db.execute(
select(func.count(User.id)).where(User.referred_by_user_id == referrer_user_id)
)
).scalar()
or 0
)
now = datetime.utcnow()
cooldown_days = max(0, int(getattr(settings, "referral_rewards_cooldown_days", 7) or 0))
cooldown_cutoff = now - timedelta(days=cooldown_days) if cooldown_days else None
# Referrer IP hashes (window) for self-ref/shared-ip checks
ip_window_days = max(1, int(getattr(settings, "referral_rewards_ip_window_days", 30) or 30))
ip_window_start = now - timedelta(days=ip_window_days)
referrer_ip_rows = (
await db.execute(
select(TelemetryEvent.ip_hash)
.where(
and_(
TelemetryEvent.user_id == referrer_user_id,
TelemetryEvent.ip_hash.isnot(None),
TelemetryEvent.created_at >= ip_window_start,
TelemetryEvent.created_at <= now,
)
)
.distinct()
)
).all()
referrer_ip_hashes = {str(r[0]) for r in referrer_ip_rows if r and r[0]}
# Referred user's registration IP hash (from telemetry) as subquery
reg_ip_subq = (
select(
TelemetryEvent.user_id.label("user_id"),
func.max(TelemetryEvent.ip_hash).label("signup_ip_hash"),
)
.where(
and_(
TelemetryEvent.event_name == "user_registered",
TelemetryEvent.user_id.isnot(None),
)
)
.group_by(TelemetryEvent.user_id)
.subquery()
)
# Candidate referred users (paid + verified + active)
rows = (
await db.execute(
select(
User.id,
User.created_at,
Subscription.started_at,
reg_ip_subq.c.signup_ip_hash,
)
.select_from(User)
.join(Subscription, Subscription.user_id == User.id)
.outerjoin(reg_ip_subq, reg_ip_subq.c.user_id == User.id)
.where(
and_(
User.referred_by_user_id == referrer_user_id,
User.is_active == True,
User.is_verified == True,
Subscription.tier.in_([SubscriptionTier.TRADER, SubscriptionTier.TYCOON]),
Subscription.status.in_([SubscriptionStatus.ACTIVE, SubscriptionStatus.PAST_DUE]),
or_(Subscription.expires_at.is_(None), Subscription.expires_at >= now),
)
)
)
).all()
require_ip = bool(getattr(settings, "referral_rewards_require_ip_hash", True))
disqualified_cooldown_total = 0
disqualified_missing_ip_total = 0
disqualified_shared_ip_total = 0
disqualified_duplicate_ip_total = 0
qualified_ip_hashes: set[str] = set()
qualified_referrals_total = 0
for _user_id, user_created_at, sub_started_at, signup_ip_hash in rows:
# Cooldown: user account age AND subscription age must pass cooldown
if cooldown_cutoff is not None:
if (user_created_at and user_created_at > cooldown_cutoff) or (
sub_started_at and sub_started_at > cooldown_cutoff
):
disqualified_cooldown_total += 1
continue
ip_hash = str(signup_ip_hash) if signup_ip_hash else None
if require_ip and not ip_hash:
disqualified_missing_ip_total += 1
continue
if ip_hash and referrer_ip_hashes and ip_hash in referrer_ip_hashes:
disqualified_shared_ip_total += 1
continue
if ip_hash and ip_hash in qualified_ip_hashes:
disqualified_duplicate_ip_total += 1
continue
if ip_hash:
qualified_ip_hashes.add(ip_hash)
qualified_referrals_total += 1
bonus_domains = compute_bonus_domains(qualified_referrals_total)
badge = compute_badge(qualified_referrals_total)
return ReferralRewardSnapshot(
referrer_user_id=referrer_user_id,
referred_users_total=referred_users_total,
qualified_referrals_total=qualified_referrals_total,
cooldown_days=cooldown_days,
disqualified_cooldown_total=disqualified_cooldown_total,
disqualified_missing_ip_total=disqualified_missing_ip_total,
disqualified_shared_ip_total=disqualified_shared_ip_total,
disqualified_duplicate_ip_total=disqualified_duplicate_ip_total,
bonus_domains=bonus_domains,
badge=badge,
computed_at=datetime.utcnow(),
)
async def apply_referral_rewards_for_user(db: AsyncSession, referrer_user_id: int) -> ReferralRewardSnapshot:
"""
Apply rewards to the referrer's subscription row, based on current qualified referrals.
This is idempotent: it sets the bonus to the computed value.
"""
snapshot = await get_referral_reward_snapshot(db, referrer_user_id)
sub_res = await db.execute(select(Subscription).where(Subscription.user_id == referrer_user_id))
sub = sub_res.scalar_one_or_none()
if not sub:
# Create default subscription so bonus can be stored
sub = Subscription(user_id=referrer_user_id, tier=SubscriptionTier.SCOUT, max_domains=5)
db.add(sub)
await db.flush()
desired = int(snapshot.bonus_domains)
current = int(getattr(sub, "referral_bonus_domains", 0) or 0)
if current != desired:
sub.referral_bonus_domains = desired
await db.flush()
return snapshot
async def apply_referral_rewards_all(db: AsyncSession) -> dict[str, int]:
"""
Apply rewards for all users that have an invite_code.
"""
res = await db.execute(select(User.id).where(User.invite_code.isnot(None)))
user_ids = [int(r[0]) for r in res.all()]
updated = 0
processed = 0
for user_id in user_ids:
processed += 1
snap = await get_referral_reward_snapshot(db, user_id)
sub_res = await db.execute(select(Subscription).where(Subscription.user_id == user_id))
sub = sub_res.scalar_one_or_none()
if not sub:
sub = Subscription(user_id=user_id, tier=SubscriptionTier.SCOUT, max_domains=5)
db.add(sub)
await db.flush()
desired = int(snap.bonus_domains)
current = int(getattr(sub, "referral_bonus_domains", 0) or 0)
if current != desired:
sub.referral_bonus_domains = desired
updated += 1
return {"processed": processed, "updated": updated}

View File

@ -140,10 +140,41 @@ class SedoAPIClient:
"""Parse XML response from Sedo API."""
try:
root = ElementTree.fromstring(xml_text)
# Check for error response
if root.tag == "fault" or root.find(".//faultcode") is not None:
fault_code = root.findtext(".//faultcode") or root.findtext("faultcode")
fault_string = root.findtext(".//faultstring") or root.findtext("faultstring")
return {"error": True, "faultcode": fault_code, "faultstring": fault_string}
# Parse SEDOSEARCH response (domain listings)
if root.tag == "SEDOSEARCH":
items = []
for item in root.findall("item"):
domain_data = {}
for child in item:
# Get the text content, handle type attribute
value = child.text
type_attr = child.get("type", "")
# Convert types
if "double" in type_attr or "int" in type_attr:
try:
value = float(value) if value else 0
except (TypeError, ValueError):
pass
domain_data[child.tag] = value
items.append(domain_data)
return {"items": items, "count": len(items)}
# Generic XML to dict fallback
return self._xml_to_dict(root)
except Exception as e:
logger.warning(f"Failed to parse XML: {e}")
return {"raw": xml_text}
return {"raw": xml_text, "error": str(e)}
def _xml_to_dict(self, element) -> Dict[str, Any]:
"""Convert XML element to dictionary."""
@ -171,20 +202,18 @@ class SedoAPIClient:
"""
Search for domains listed on Sedo marketplace.
Returns domains for sale (not auctions).
Returns domains for sale (XML parsed to dict).
"""
params = {
"output_method": "json", # Request JSON response
}
params = {}
if keyword:
params["keyword"] = keyword
if tld:
params["tld"] = tld.lstrip(".")
if min_price is not None:
params["minprice"] = min_price
params["minprice"] = int(min_price)
if max_price is not None:
params["maxprice"] = max_price
params["maxprice"] = int(max_price)
if page:
params["page"] = page
if page_size:
@ -202,11 +231,11 @@ class SedoAPIClient:
) -> Dict[str, Any]:
"""
Search for active domain auctions on Sedo.
Note: Sedo API doesn't have a dedicated auction filter.
We filter by type='A' (auction) in post-processing.
"""
params = {
"output_method": "json",
"auction": "true", # Only auctions
}
params = {}
if keyword:
params["keyword"] = keyword
@ -217,7 +246,72 @@ class SedoAPIClient:
if page_size:
params["pagesize"] = min(page_size, 100)
return await self._request("DomainSearch", params)
result = await self._request("DomainSearch", params)
# Filter to only show auctions (type='A')
if "items" in result:
result["items"] = [
item for item in result["items"]
if item.get("type") == "A"
]
result["count"] = len(result["items"])
return result
async def get_listings_for_display(
self,
keyword: Optional[str] = None,
tld: Optional[str] = None,
page_size: int = 50,
) -> List[Dict[str, Any]]:
"""
Get Sedo listings formatted for display in Pounce.
Returns a list of domains with affiliate URLs.
"""
result = await self.search_domains(
keyword=keyword,
tld=tld,
page_size=page_size
)
if "error" in result or "items" not in result:
logger.warning(f"Sedo API error: {result}")
return []
listings = []
for item in result.get("items", []):
domain = item.get("domain", "")
if not domain:
continue
# Get price (Sedo returns 0 for "Make Offer")
price = item.get("price", 0)
if isinstance(price, str):
try:
price = float(price)
except (TypeError, ValueError):
price = 0
# Use the URL from Sedo (includes partner ID and tracking)
url = item.get("url", f"https://sedo.com/search/details/?domain={domain}&partnerid={self.partner_id}")
# Determine listing type
listing_type = item.get("type", "D") # D=Direct, A=Auction
is_auction = listing_type == "A"
listings.append({
"domain": domain,
"tld": domain.rsplit(".", 1)[1] if "." in domain else "",
"price": price,
"price_type": "bid" if is_auction else ("make_offer" if price == 0 else "fixed"),
"is_auction": is_auction,
"platform": "Sedo",
"url": url,
"rank": item.get("rank", 0),
})
return listings
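# Mapping sketch (illustrative) of how Sedo's `type` and `price` become price_type:
#   type "A", price 250 -> "bid"         (auction listing)
#   type "D", price 0   -> "make_offer"  (Sedo returns 0 for make-offer listings)
#   type "D", price 499 -> "fixed"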
async def get_domain_details(self, domain: str) -> Dict[str, Any]:
"""Get detailed information about a specific domain."""

View File

@ -5,8 +5,8 @@ Handles subscription payments for pounce.ch
Subscription Tiers:
- Scout (Free): $0/month
- Trader: €19/month (or ~$21)
- Tycoon: €49/month (or ~$54)
- Trader: $9/month
- Tycoon: $29/month
Environment Variables Required:
- STRIPE_SECRET_KEY: Stripe API secret key
@ -24,7 +24,7 @@ from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.user import User
from app.models.subscription import Subscription
from app.models.subscription import Subscription, SubscriptionTier, SubscriptionStatus
logger = logging.getLogger(__name__)
@ -216,7 +216,8 @@ class StripeService:
Handle Stripe webhook events.
Important events:
- checkout.session.completed: Payment successful
- checkout.session.completed: Payment successful (initial)
- invoice.payment_succeeded: Invoice paid (recurring & initial)
- customer.subscription.updated: Subscription changed
- customer.subscription.deleted: Subscription cancelled
- invoice.payment_failed: Payment failed
@ -231,41 +232,78 @@ class StripeService:
payload, sig_header, webhook_secret
)
except ValueError:
logger.error("❌ Invalid webhook payload")
raise ValueError("Invalid payload")
except stripe.error.SignatureVerificationError:
logger.error("❌ Invalid webhook signature")
raise ValueError("Invalid signature")
event_type = event["type"]
data = event["data"]["object"]
logger.info(f"Processing Stripe webhook: {event_type}")
logger.info(f"🔔 Processing Stripe webhook: {event_type}")
logger.info(f" Event ID: {event.get('id')}")
if event_type == "checkout.session.completed":
await StripeService._handle_checkout_complete(data, db)
elif event_type == "customer.subscription.updated":
await StripeService._handle_subscription_updated(data, db)
elif event_type == "customer.subscription.deleted":
await StripeService._handle_subscription_cancelled(data, db)
elif event_type == "invoice.payment_failed":
await StripeService._handle_payment_failed(data, db)
return {"status": "success", "event_type": event_type}
try:
if event_type == "checkout.session.completed":
await StripeService._handle_checkout_complete(data, db)
elif event_type == "invoice.payment_succeeded":
# This is the main event for successful payments!
await StripeService._handle_invoice_paid(data, db)
elif event_type == "customer.subscription.updated":
await StripeService._handle_subscription_updated(data, db)
elif event_type == "customer.subscription.deleted":
await StripeService._handle_subscription_cancelled(data, db)
elif event_type == "invoice.payment_failed":
await StripeService._handle_payment_failed(data, db)
else:
logger.info(f" Unhandled event type: {event_type} (acknowledged)")
return {"status": "success", "event_type": event_type}
except Exception as e:
logger.exception(f"❌ Error processing webhook {event_type}: {e}")
# Still return success to prevent Stripe from retrying
# The error is logged for investigation
return {"status": "error_logged", "event_type": event_type, "error": str(e)}
@staticmethod
async def _handle_checkout_complete(data: Dict, db: AsyncSession):
"""Handle successful checkout - activate subscription."""
"""
Handle successful checkout - activate subscription.
IMPORTANT: This must be idempotent! Stripe may send webhooks multiple times.
"""
session_id = data.get("id")
user_id = data.get("metadata", {}).get("user_id")
plan = data.get("metadata", {}).get("plan")
plan = data.get("metadata", {}).get("plan") # "trader" or "tycoon"
customer_id = data.get("customer")
subscription_id = data.get("subscription")
logger.info(f"🔔 Checkout complete webhook received:")
logger.info(f" Session: {session_id}")
logger.info(f" User ID: {user_id}")
logger.info(f" Plan: {plan}")
logger.info(f" Customer: {customer_id}")
logger.info(f" Subscription: {subscription_id}")
if not user_id or not plan:
logger.error("Missing user_id or plan in checkout metadata")
logger.error(f"Missing user_id or plan in checkout metadata: {data.get('metadata')}")
return
# Convert plan string to SubscriptionTier enum
tier_map = {
"trader": SubscriptionTier.TRADER,
"tycoon": SubscriptionTier.TYCOON,
"scout": SubscriptionTier.SCOUT,
}
tier_enum = tier_map.get(plan.lower(), SubscriptionTier.SCOUT)
# Get user
result = await db.execute(
select(User).where(User.id == int(user_id))
@ -273,9 +311,23 @@ class StripeService:
user = result.scalar_one_or_none()
if not user:
logger.error(f"User {user_id} not found for checkout")
logger.error(f"User {user_id} not found for checkout")
return
logger.info(f" User email: {user.email}")
# IDEMPOTENCY CHECK: Check if this subscription_id was already processed
if subscription_id:
existing_sub = await db.execute(
select(Subscription).where(
Subscription.stripe_subscription_id == subscription_id
)
)
existing = existing_sub.scalar_one_or_none()
if existing:
logger.info(f"⚠️ Subscription {subscription_id} already processed (idempotent)")
return
# Update user's Stripe customer ID
user.stripe_customer_id = customer_id
@ -285,29 +337,161 @@ class StripeService:
)
subscription = sub_result.scalar_one_or_none()
tier_info = TIER_FEATURES.get(plan.lower(), TIER_FEATURES["scout"])
if subscription:
# Only upgrade if actually changing
old_tier = subscription.tier
subscription.tier = tier_enum
subscription.status = SubscriptionStatus.ACTIVE
subscription.stripe_subscription_id = subscription_id
subscription.max_domains = tier_info["max_domains"]
subscription.check_frequency = tier_info["check_frequency"]
subscription.updated_at = datetime.utcnow()
logger.info(f"✅ Updated subscription: {old_tier}{tier_enum}")
else:
subscription = Subscription(
user_id=user.id,
tier=tier_enum,
status=SubscriptionStatus.ACTIVE,
stripe_subscription_id=subscription_id,
max_domains=tier_info["max_domains"],
check_frequency=tier_info["check_frequency"],
)
db.add(subscription)
logger.info(f"✅ Created new subscription: {tier_enum}")
try:
await db.commit()
logger.info(f"✅ Activated {plan} subscription for user {user_id} ({user.email})")
except Exception as e:
logger.exception(f"❌ Failed to commit subscription: {e}")
await db.rollback()
raise
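For reference, the handler above reads only a handful of fields from the checkout session object; an abbreviated sketch with made-up values would be:
# Abbreviated checkout.session.completed object; only the fields
# read by _handle_checkout_complete are shown (illustrative values).
example_session = {
    "id": "cs_test_123",
    "customer": "cus_123",
    "subscription": "sub_123",
    "metadata": {"user_id": "42", "plan": "trader"},
}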
@staticmethod
async def _handle_invoice_paid(data: Dict, db: AsyncSession):
"""
Handle successful invoice payment.
This is the MAIN event for activating subscriptions!
Called for both initial payments and recurring payments.
Invoice structure has metadata in:
- parent.subscription_details.metadata (for subscription invoices)
- lines.data[0].metadata (line item level)
"""
invoice_id = data.get("id")
customer_id = data.get("customer")
customer_email = data.get("customer_email")
billing_reason = data.get("billing_reason") # "subscription_create", "subscription_cycle", etc.
logger.info(f"🧾 Invoice paid webhook received:")
logger.info(f" Invoice: {invoice_id}")
logger.info(f" Customer: {customer_id}")
logger.info(f" Email: {customer_email}")
logger.info(f" Billing reason: {billing_reason}")
# Extract metadata from subscription details
parent = data.get("parent", {})
subscription_details = parent.get("subscription_details", {})
metadata = subscription_details.get("metadata", {})
subscription_id = subscription_details.get("subscription")
user_id = metadata.get("user_id")
plan = metadata.get("plan")
# Fallback: try to get from line items
if not user_id or not plan:
lines = data.get("lines", {}).get("data", [])
if lines:
line_metadata = lines[0].get("metadata", {})
user_id = user_id or line_metadata.get("user_id")
plan = plan or line_metadata.get("plan")
logger.info(f" User ID: {user_id}")
logger.info(f" Plan: {plan}")
logger.info(f" Subscription ID: {subscription_id}")
if not user_id or not plan:
logger.warning(f"⚠️ No user_id or plan in invoice metadata, skipping")
logger.warning(f" Full parent: {parent}")
return
# Convert plan string to SubscriptionTier enum
tier_map = {
"trader": SubscriptionTier.TRADER,
"tycoon": SubscriptionTier.TYCOON,
"scout": SubscriptionTier.SCOUT,
}
tier_enum = tier_map.get(plan.lower(), SubscriptionTier.SCOUT)
# Get user
result = await db.execute(
select(User).where(User.id == int(user_id))
)
user = result.scalar_one_or_none()
if not user:
logger.error(f"❌ User {user_id} not found for invoice")
return
logger.info(f" Found user: {user.email}")
# Update user's Stripe customer ID if not set
if not user.stripe_customer_id:
user.stripe_customer_id = customer_id
# IDEMPOTENCY CHECK: Check if this subscription_id was already processed with this tier
if subscription_id:
existing_sub = await db.execute(
select(Subscription).where(
Subscription.stripe_subscription_id == subscription_id,
Subscription.tier == tier_enum
)
)
existing = existing_sub.scalar_one_or_none()
if existing:
logger.info(f"⚠️ Subscription {subscription_id} already active as {tier_enum} (idempotent)")
return
# Create or update subscription
sub_result = await db.execute(
select(Subscription).where(Subscription.user_id == user.id)
)
subscription = sub_result.scalar_one_or_none()
tier_info = TIER_FEATURES.get(plan.lower(), TIER_FEATURES["scout"])
if subscription:
old_tier = subscription.tier
subscription.tier = tier_enum
subscription.status = SubscriptionStatus.ACTIVE
subscription.stripe_subscription_id = subscription_id
subscription.max_domains = tier_info["max_domains"]
subscription.check_frequency = tier_info["check_frequency"]
subscription.updated_at = datetime.utcnow()
logger.info(f"✅ Updated subscription: {old_tier}{tier_enum}")
else:
subscription = Subscription(
user_id=user.id,
tier=tier_enum,
status=SubscriptionStatus.ACTIVE,
stripe_subscription_id=subscription_id,
max_domains=tier_info["max_domains"],
check_frequency=tier_info["check_frequency"],
)
db.add(subscription)
logger.info(f"✅ Created new subscription: {tier_enum}")
try:
await db.commit()
logger.info(f"✅ Activated {plan} subscription for user {user_id} ({user.email}) via invoice")
except Exception as e:
logger.exception(f"❌ Failed to commit subscription: {e}")
await db.rollback()
raise
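As a companion to the docstring above, an abbreviated invoice payload (illustrative values) showing the two metadata locations the handler checks:
# Abbreviated invoice.payment_succeeded object; only the paths read
# by _handle_invoice_paid are shown (illustrative values).
example_invoice = {
    "id": "in_123",
    "customer": "cus_123",
    "customer_email": "user@example.com",
    "billing_reason": "subscription_create",
    "parent": {
        "subscription_details": {
            "subscription": "sub_123",
            "metadata": {"user_id": "42", "plan": "trader"},  # primary location
        }
    },
    "lines": {
        "data": [{"metadata": {"user_id": "42", "plan": "trader"}}]  # fallback
    },
}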
@staticmethod
async def _handle_subscription_updated(data: Dict, db: AsyncSession):
"""Handle subscription update (plan change, renewal, etc.)."""
@ -340,8 +524,8 @@ class StripeService:
subscription = result.scalar_one_or_none()
if subscription:
subscription.tier = "scout"
subscription.is_active = True # Scout is still active
subscription.tier = SubscriptionTier.SCOUT # Use enum
subscription.status = SubscriptionStatus.ACTIVE # Scout is still active
subscription.stripe_subscription_id = None
subscription.max_domains = TIER_FEATURES["scout"]["max_domains"]
subscription.check_frequency = TIER_FEATURES["scout"]["check_frequency"]

View File

@ -0,0 +1,79 @@
"""
Telemetry service (4A).
Single entry-point for writing canonical product events.
"""
from __future__ import annotations
import hashlib
import json
from typing import Any, Optional
from fastapi import Request
from sqlalchemy.ext.asyncio import AsyncSession
from app.config import get_settings
from app.models.telemetry import TelemetryEvent
settings = get_settings()
def _hash_ip(ip: str) -> str:
return hashlib.sha256(f"{ip}|{settings.secret_key}".encode()).hexdigest()[:32]
def _get_client_ip(request: Request) -> Optional[str]:
xff = request.headers.get("x-forwarded-for")
if xff:
ip = xff.split(",")[0].strip()
if ip:
return ip
cf_ip = request.headers.get("cf-connecting-ip")
if cf_ip:
return cf_ip.strip()
return request.client.host if request.client else None
async def track_event(
db: AsyncSession,
*,
event_name: str,
request: Optional[Request] = None,
user_id: Optional[int] = None,
is_authenticated: Optional[bool] = None,
source: Optional[str] = None,
domain: Optional[str] = None,
listing_id: Optional[int] = None,
inquiry_id: Optional[int] = None,
yield_domain_id: Optional[int] = None,
click_id: Optional[str] = None,
referrer: Optional[str] = None,
user_agent: Optional[str] = None,
metadata: Optional[dict[str, Any]] = None,
) -> None:
ip_hash = None
if request is not None:
ip = _get_client_ip(request)
ip_hash = _hash_ip(ip) if ip else None
user_agent = user_agent or request.headers.get("user-agent")
referrer = referrer or request.headers.get("referer")
row = TelemetryEvent(
user_id=user_id,
event_name=event_name,
listing_id=listing_id,
inquiry_id=inquiry_id,
yield_domain_id=yield_domain_id,
click_id=click_id[:64] if click_id else None,
domain=domain,
source=source,
ip_hash=ip_hash,
user_agent=user_agent[:500] if user_agent else None,
referrer=referrer[:500] if referrer else None,
metadata_json=json.dumps(metadata or {}, ensure_ascii=False) if metadata else None,
is_authenticated=is_authenticated,
)
db.add(row)
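A sketch of a typical caller, assuming a FastAPI route and a `get_db` session dependency (both assumptions, not part of this service):
# Illustrative caller; get_db and the route itself are assumptions.
from fastapi import APIRouter, Depends, Request
from sqlalchemy.ext.asyncio import AsyncSession

router = APIRouter()

@router.get("/listings/{listing_id}")
async def view_listing(listing_id: int, request: Request, db: AsyncSession = Depends(get_db)):
    await track_event(
        db,
        event_name="listing_viewed",
        request=request,  # IP hash, user agent, and referrer are derived from it
        listing_id=listing_id,
        is_authenticated=False,
    )
    await db.commit()  # track_event only stages the row; committing is the caller's job
    return {"ok": True}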

View File

@ -0,0 +1,169 @@
"""
Yield DNS verification helpers.
Production-grade DNS checks for the Yield Connect flow:
- Option A (recommended): Nameserver delegation to our nameservers
- Option B (simpler): CNAME/ALIAS to a shared target
"""
from __future__ import annotations
from dataclasses import dataclass
from typing import Optional
import dns.resolver
@dataclass(frozen=True)
class YieldDNSCheckResult:
verified: bool
method: Optional[str] # "nameserver" | "cname" | None
actual_ns: list[str]
cname_ok: bool
error: Optional[str]
def _resolver() -> dns.resolver.Resolver:
r = dns.resolver.Resolver()
r.timeout = 3
r.lifetime = 5
return r
def _normalize_host(host: str) -> str:
return host.rstrip(".").lower().strip()
def _resolve_ns(domain: str) -> list[str]:
r = _resolver()
answers = r.resolve(domain, "NS")
# NS answers are RRset with .target
return sorted({_normalize_host(str(rr.target)) for rr in answers})
def _resolve_cname(domain: str) -> list[str]:
r = _resolver()
answers = r.resolve(domain, "CNAME")
return sorted({_normalize_host(str(rr.target)) for rr in answers})
def _resolve_a(host: str) -> list[str]:
r = _resolver()
answers = r.resolve(host, "A")
return sorted({str(rr) for rr in answers})
def verify_yield_dns(domain: str, expected_nameservers: list[str], cname_target: str) -> YieldDNSCheckResult:
"""
Verify that a domain is connected for Yield.
We accept:
- Nameserver delegation (NS contains all expected nameservers), OR
- CNAME/ALIAS to `cname_target` (either CNAME matches, or A records match target A records)
"""
domain = _normalize_host(domain)
expected_ns = sorted({_normalize_host(ns) for ns in expected_nameservers if ns})
target = _normalize_host(cname_target)
if not domain:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=[],
cname_ok=False,
error="Domain is empty",
)
if not expected_ns and not target:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=[],
cname_ok=False,
error="Yield DNS is not configured on server",
)
# Option A: NS delegation
try:
actual_ns = _resolve_ns(domain)
if expected_ns and set(expected_ns).issubset(set(actual_ns)):
return YieldDNSCheckResult(
verified=True,
method="nameserver",
actual_ns=actual_ns,
cname_ok=False,
error=None,
)
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
actual_ns = []
except Exception as e:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=[],
cname_ok=False,
error=str(e),
)
# Option B: CNAME / ALIAS
if not target:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=actual_ns,
cname_ok=False,
error="Yield CNAME target is not configured on server",
)
# 1) Direct CNAME check (works for subdomain CNAME setups)
try:
cnames = _resolve_cname(domain)
if any(c == target for c in cnames):
return YieldDNSCheckResult(
verified=True,
method="cname",
actual_ns=actual_ns,
cname_ok=True,
error=None,
)
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
pass
except Exception as e:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=actual_ns,
cname_ok=False,
error=str(e),
)
# 2) ALIAS/ANAME flattening: compare A records against target A records
try:
target_as = set(_resolve_a(target))
domain_as = set(_resolve_a(domain))
if target_as and domain_as and domain_as.issubset(target_as):
return YieldDNSCheckResult(
verified=True,
method="cname",
actual_ns=actual_ns,
cname_ok=True,
error=None,
)
except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
pass
except Exception as e:
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=actual_ns,
cname_ok=False,
error=str(e),
)
return YieldDNSCheckResult(
verified=False,
method=None,
actual_ns=actual_ns,
cname_ok=False,
error=None,
)
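A minimal caller sketch; the domain, nameservers, and CNAME target below are placeholders, not the service's real configuration:
# Illustrative check with placeholder hostnames.
result = verify_yield_dns(
    "example.ch",
    expected_nameservers=["ns1.example-dns.net", "ns2.example-dns.net"],
    cname_target="connect.example-platform.net",
)
if result.verified:
    print(f"Connected via {result.method}")  # "nameserver" or "cname"
else:
    print(f"Not connected: {result.error or 'no matching NS or CNAME found'}")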

View File

@ -0,0 +1,132 @@
"""
Yield payout generation helpers (ledger).
Used by:
- Admin endpoints (manual ops)
- Scheduler (automatic monthly preparation)
"""
from __future__ import annotations
from datetime import datetime, timedelta
from decimal import Decimal
from sqlalchemy import and_, func, select
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.yield_domain import YieldDomain, YieldPayout, YieldTransaction
async def generate_payouts_for_period(
db: AsyncSession,
*,
period_start: datetime,
period_end: datetime,
) -> tuple[int, int]:
"""
Create payouts for confirmed, unpaid transactions and assign payout_id.
Returns: (created_count, skipped_existing_count)
"""
if period_end <= period_start:
raise ValueError("period_end must be after period_start")
aggregates = (
await db.execute(
select(
YieldDomain.user_id.label("user_id"),
YieldTransaction.currency.label("currency"),
func.count(YieldTransaction.id).label("tx_count"),
func.coalesce(func.sum(YieldTransaction.net_amount), 0).label("amount"),
)
.join(YieldDomain, YieldDomain.id == YieldTransaction.yield_domain_id)
.where(
and_(
YieldTransaction.status == "confirmed",
YieldTransaction.payout_id.is_(None),
YieldTransaction.created_at >= period_start,
YieldTransaction.created_at < period_end,
)
)
.group_by(YieldDomain.user_id, YieldTransaction.currency)
)
).all()
created = 0
skipped = 0
for row in aggregates:
user_id = int(row.user_id)
currency = (row.currency or "CHF").upper()
tx_count = int(row.tx_count or 0)
amount = Decimal(str(row.amount or 0))
if tx_count <= 0 or amount <= 0:
continue
existing = (
await db.execute(
select(YieldPayout).where(
and_(
YieldPayout.user_id == user_id,
YieldPayout.currency == currency,
YieldPayout.period_start == period_start,
YieldPayout.period_end == period_end,
)
)
)
).scalar_one_or_none()
if existing:
skipped += 1
continue
payout = YieldPayout(
user_id=user_id,
amount=amount,
currency=currency,
period_start=period_start,
period_end=period_end,
transaction_count=tx_count,
status="pending",
payment_method=None,
payment_reference=None,
)
db.add(payout)
await db.flush()
tx_ids = (
await db.execute(
select(YieldTransaction.id)
.join(YieldDomain, YieldDomain.id == YieldTransaction.yield_domain_id)
.where(
and_(
YieldDomain.user_id == user_id,
YieldTransaction.currency == currency,
YieldTransaction.status == "confirmed",
YieldTransaction.payout_id.is_(None),
YieldTransaction.created_at >= period_start,
YieldTransaction.created_at < period_end,
)
)
)
).scalars().all()
for tx_id in tx_ids:
tx = (
await db.execute(select(YieldTransaction).where(YieldTransaction.id == tx_id))
).scalar_one()
tx.payout_id = payout.id
created += 1
await db.commit()
return created, skipped
async def generate_payouts_for_previous_month(db: AsyncSession) -> tuple[int, int]:
now = datetime.utcnow()
month_start = now.replace(day=1, hour=0, minute=0, second=0, microsecond=0)
prev_month_end = month_start
prev_month_start = (month_start - timedelta(days=1)).replace(day=1)
return await generate_payouts_for_period(db, period_start=prev_month_start, period_end=prev_month_end)
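A worked example of the window computed above: if `now` is 2025-12-16 10:30 UTC, then
# month_start      = 2025-12-01 00:00  (first day of the current month)
# prev_month_end   = 2025-12-01 00:00  (exclusive upper bound)
# prev_month_start = 2025-11-01 00:00
# so transactions with created_at in [2025-11-01, 2025-12-01) are aggregated.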

View File

@ -0,0 +1,400 @@
"""
Zone File Service for .ch and .li domains
==========================================
Uses DNS AXFR zone transfer to fetch domain lists from Switch.ch
Compares daily snapshots to find freshly dropped domains.
Storage: We only store the diff (dropped/new domains) to minimize disk usage.
"""
import asyncio
import hashlib
import logging
from datetime import datetime, timedelta
from pathlib import Path
from typing import Optional
from sqlalchemy import select, func
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.zone_file import ZoneSnapshot, DroppedDomain
logger = logging.getLogger(__name__)
# ============================================================================
# TSIG KEYS (from Switch.ch documentation)
# ============================================================================
TSIG_KEYS = {
"ch": {
"name": "tsig-zonedata-ch-public-21-01",
"algorithm": "hmac-sha512",
"secret": "stZwEGApYumtXkh73qMLPqfbIDozWKZLkqRvcjKSpRnsor6A6MxixRL6C2HeSVBQNfMW4wer+qjS0ZSfiWiJ3Q=="
},
"li": {
"name": "tsig-zonedata-li-public-21-01",
"algorithm": "hmac-sha512",
"secret": "t8GgeCn+fhPaj+cRy1epox2Vj4hZ45ax6v3rQCkkfIQNg5fsxuU23QM5mzz+BxJ4kgF/jiQyBDBvL+XWPE6oCQ=="
}
}
ZONE_SERVER = "zonedata.switch.ch"
# ============================================================================
# ZONE FILE SERVICE
# ============================================================================
class ZoneFileService:
"""Service for fetching and analyzing zone files"""
def __init__(self, data_dir: Optional[Path] = None):
self.data_dir = data_dir or Path("/tmp/pounce_zones")
self.data_dir.mkdir(parents=True, exist_ok=True)
def _get_key_file_path(self, tld: str) -> Path:
"""Generate TSIG key file for dig command"""
key_path = self.data_dir / f"{tld}_zonedata.key"
key_info = TSIG_KEYS.get(tld)
if not key_info:
raise ValueError(f"Unknown TLD: {tld}")
# Write TSIG key file in BIND format
key_content = f"""key "{key_info['name']}" {{
algorithm {key_info['algorithm']};
secret "{key_info['secret']}";
}};
"""
key_path.write_text(key_content)
return key_path
async def fetch_zone_file(self, tld: str) -> set[str]:
"""
Fetch zone file via DNS AXFR transfer.
Returns set of domain names (without TLD suffix).
"""
if tld not in TSIG_KEYS:
raise ValueError(f"Unsupported TLD: {tld}. Only 'ch' and 'li' are supported.")
logger.info(f"Starting zone transfer for .{tld}")
key_file = self._get_key_file_path(tld)
# Build dig command
cmd = [
"dig",
"-k", str(key_file),
f"@{ZONE_SERVER}",
"+noall",
"+answer",
"+noidnout",
"+onesoa",
"AXFR",
f"{tld}."
]
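# For reference, the equivalent shell invocation for .ch would be roughly
# (the key path depends on data_dir):
#   dig -k /tmp/pounce_zones/ch_zonedata.key @zonedata.switch.ch \
#       +noall +answer +noidnout +onesoa AXFR ch.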
try:
# Run dig command (this can take a while for large zones)
process = await asyncio.create_subprocess_exec(
*cmd,
stdout=asyncio.subprocess.PIPE,
stderr=asyncio.subprocess.PIPE
)
stdout, stderr = await asyncio.wait_for(
process.communicate(),
timeout=600 # 10 minutes timeout for large zones
)
if process.returncode != 0:
error_msg = stderr.decode() if stderr else "Unknown error"
logger.error(f"Zone transfer failed for .{tld}: {error_msg}")
raise RuntimeError(f"Zone transfer failed: {error_msg}")
# Parse output to extract domain names
domains = set()
for line in stdout.decode().splitlines():
parts = line.split()
if len(parts) >= 1:
domain = parts[0].rstrip('.')
# Only include actual domain names (not the TLD itself)
if domain and domain != tld and '.' in domain:
# Extract just the name part (before the TLD)
name = domain.rsplit('.', 1)[0]
if name:
domains.add(name.lower())
logger.info(f"Zone transfer complete for .{tld}: {len(domains)} domains")
return domains
except asyncio.TimeoutError:
logger.error(f"Zone transfer timed out for .{tld}")
raise RuntimeError("Zone transfer timed out")
except FileNotFoundError:
logger.error("dig command not found. Please install bind-utils or dnsutils.")
raise RuntimeError("dig command not available")
def compute_checksum(self, domains: set[str]) -> str:
"""Compute SHA256 checksum of sorted domain list"""
sorted_domains = "\n".join(sorted(domains))
return hashlib.sha256(sorted_domains.encode()).hexdigest()
async def get_previous_snapshot(self, db: AsyncSession, tld: str) -> Optional[set[str]]:
"""Load previous day's domain set from cache file"""
cache_file = self.data_dir / f"{tld}_domains.txt"
if cache_file.exists():
try:
content = cache_file.read_text()
return set(line.strip() for line in content.splitlines() if line.strip())
except Exception as e:
logger.warning(f"Failed to load cache for .{tld}: {e}")
return None
async def save_snapshot(self, db: AsyncSession, tld: str, domains: set[str]):
"""Save current snapshot to cache and database"""
# Save to cache file
cache_file = self.data_dir / f"{tld}_domains.txt"
cache_file.write_text("\n".join(sorted(domains)))
# Save metadata to database
checksum = self.compute_checksum(domains)
snapshot = ZoneSnapshot(
tld=tld,
snapshot_date=datetime.utcnow(),
domain_count=len(domains),
checksum=checksum
)
db.add(snapshot)
await db.commit()
logger.info(f"Saved snapshot for .{tld}: {len(domains)} domains")
async def process_drops(
self,
db: AsyncSession,
tld: str,
previous: set[str],
current: set[str]
) -> list[dict]:
"""Find and store dropped domains"""
dropped = previous - current
if not dropped:
logger.info(f"No dropped domains found for .{tld}")
return []
logger.info(f"Found {len(dropped)} dropped domains for .{tld}")
today = datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
# Store dropped domains
dropped_records = []
for name in dropped:
record = DroppedDomain(
domain=f"{name}.{tld}",
tld=tld,
dropped_date=today,
length=len(name),
is_numeric=name.isdigit(),
has_hyphen='-' in name
)
db.add(record)
dropped_records.append({
"domain": f"{name}.{tld}",
"length": len(name),
"is_numeric": name.isdigit(),
"has_hyphen": '-' in name
})
await db.commit()
return dropped_records
async def run_daily_sync(self, db: AsyncSession, tld: str) -> dict:
"""
Run daily zone file sync:
1. Fetch current zone file
2. Compare with previous snapshot
3. Store dropped domains
4. Save new snapshot
"""
logger.info(f"Starting daily sync for .{tld}")
# Get previous snapshot
previous = await self.get_previous_snapshot(db, tld)
# Fetch current zone
current = await self.fetch_zone_file(tld)
result = {
"tld": tld,
"current_count": len(current),
"previous_count": len(previous) if previous else 0,
"dropped": [],
"new_count": 0
}
if previous:
# Find dropped domains
result["dropped"] = await self.process_drops(db, tld, previous, current)
result["new_count"] = len(current - previous)
# Save current snapshot
await self.save_snapshot(db, tld, current)
logger.info(f"Daily sync complete for .{tld}: {len(result['dropped'])} dropped, {result['new_count']} new")
return result
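A sketch of driving the daily sync for both supported zones, reusing the `AsyncSessionLocal` factory that the seed script further below also imports:
# Illustrative runner for both zones.
import asyncio
from app.database import AsyncSessionLocal

async def sync_all_zones():
    service = ZoneFileService()
    async with AsyncSessionLocal() as db:
        for tld in ("ch", "li"):
            result = await service.run_daily_sync(db, tld)
            print(f".{tld}: {len(result['dropped'])} dropped, {result['new_count']} new")

if __name__ == "__main__":
    asyncio.run(sync_all_zones())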
# ============================================================================
# API FUNCTIONS
# ============================================================================
async def get_dropped_domains(
db: AsyncSession,
tld: Optional[str] = None,
hours: int = 24,
min_length: Optional[int] = None,
max_length: Optional[int] = None,
exclude_numeric: bool = False,
exclude_hyphen: bool = False,
keyword: Optional[str] = None,
limit: int = 100,
offset: int = 0
) -> dict:
"""
Get recently dropped domains with filters.
Only returns drops from last 24-48h (we don't store older data).
"""
cutoff = datetime.utcnow() - timedelta(hours=hours)
query = select(DroppedDomain).where(DroppedDomain.dropped_date >= cutoff)
count_query = select(func.count(DroppedDomain.id)).where(DroppedDomain.dropped_date >= cutoff)
if tld:
query = query.where(DroppedDomain.tld == tld)
count_query = count_query.where(DroppedDomain.tld == tld)
if min_length:
query = query.where(DroppedDomain.length >= min_length)
count_query = count_query.where(DroppedDomain.length >= min_length)
if max_length:
query = query.where(DroppedDomain.length <= max_length)
count_query = count_query.where(DroppedDomain.length <= max_length)
if exclude_numeric:
query = query.where(DroppedDomain.is_numeric == False)
count_query = count_query.where(DroppedDomain.is_numeric == False)
if exclude_hyphen:
query = query.where(DroppedDomain.has_hyphen == False)
count_query = count_query.where(DroppedDomain.has_hyphen == False)
if keyword:
query = query.where(DroppedDomain.domain.ilike(f"%{keyword}%"))
count_query = count_query.where(DroppedDomain.domain.ilike(f"%{keyword}%"))
# Get total count
total_result = await db.execute(count_query)
total = total_result.scalar() or 0
# Get items with pagination
query = query.order_by(DroppedDomain.dropped_date.desc(), DroppedDomain.length)
query = query.offset(offset).limit(limit)
result = await db.execute(query)
items = result.scalars().all()
return {
"total": total,
"items": [
{
"domain": item.domain,
"tld": item.tld,
"dropped_date": item.dropped_date.isoformat(),
"length": item.length,
"is_numeric": item.is_numeric,
"has_hyphen": item.has_hyphen
}
for item in items
]
}
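An illustrative query using the filters above, e.g. short, clean .ch drops from the last 24 hours (must run inside a coroutine with an open session):
# Illustrative call; run inside an async context with a session `db`.
drops = await get_dropped_domains(
    db,
    tld="ch",
    hours=24,
    max_length=6,
    exclude_numeric=True,
    exclude_hyphen=True,
    limit=20,
)
print(drops["total"], [d["domain"] for d in drops["items"]])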
async def get_zone_stats(db: AsyncSession) -> dict:
"""Get zone file statistics"""
# Get latest snapshots
ch_query = select(ZoneSnapshot).where(ZoneSnapshot.tld == "ch").order_by(ZoneSnapshot.snapshot_date.desc()).limit(1)
li_query = select(ZoneSnapshot).where(ZoneSnapshot.tld == "li").order_by(ZoneSnapshot.snapshot_date.desc()).limit(1)
ch_result = await db.execute(ch_query)
li_result = await db.execute(li_query)
ch_snapshot = ch_result.scalar_one_or_none()
li_snapshot = li_result.scalar_one_or_none()
# Count drops from last 24h only
cutoff_24h = datetime.utcnow() - timedelta(hours=24)
drops_query = select(func.count(DroppedDomain.id)).where(DroppedDomain.dropped_date >= cutoff_24h)
drops_result = await db.execute(drops_query)
daily_drops = drops_result.scalar() or 0
return {
"ch": {
"domain_count": ch_snapshot.domain_count if ch_snapshot else 0,
"last_sync": ch_snapshot.snapshot_date.isoformat() if ch_snapshot else None
},
"li": {
"domain_count": li_snapshot.domain_count if li_snapshot else 0,
"last_sync": li_snapshot.snapshot_date.isoformat() if li_snapshot else None
},
"daily_drops": daily_drops
}
async def cleanup_old_drops(db: AsyncSession, hours: int = 48) -> int:
"""
Delete dropped domains older than specified hours.
Default: Keep only last 48h for safety margin (24h display + 24h buffer).
Returns number of deleted records.
"""
from sqlalchemy import delete
cutoff = datetime.utcnow() - timedelta(hours=hours)
# Delete old drops
stmt = delete(DroppedDomain).where(DroppedDomain.dropped_date < cutoff)
result = await db.execute(stmt)
await db.commit()
deleted = result.rowcount
if deleted > 0:
logger.info(f"Cleaned up {deleted} old dropped domains (older than {hours}h)")
return deleted
async def cleanup_old_snapshots(db: AsyncSession, keep_days: int = 7) -> int:
"""
Delete zone snapshots older than specified days.
Keep at least 7 days of metadata for debugging.
Returns number of deleted records.
"""
from sqlalchemy import delete
cutoff = datetime.utcnow() - timedelta(days=keep_days)
stmt = delete(ZoneSnapshot).where(ZoneSnapshot.snapshot_date < cutoff)
result = await db.execute(stmt)
await db.commit()
deleted = result.rowcount
if deleted > 0:
logger.info(f"Cleaned up {deleted} old zone snapshots (older than {keep_days}d)")
return deleted

View File

@ -57,6 +57,13 @@ MOZ_SECRET_KEY=
# Sentry Error Tracking
SENTRY_DSN=
# ============== ZONE FILE SERVICES ==============
# ICANN CZDS (Centralized Zone Data Service)
# Register at: https://czds.icann.org/
CZDS_USERNAME=
CZDS_PASSWORD=
CZDS_DATA_DIR=/tmp/pounce_czds
# ============== PRODUCTION SETTINGS ==============
# Uncomment for production deployment:
# DATABASE_URL=postgresql+asyncpg://user:pass@localhost/pounce

View File

@ -54,3 +54,6 @@ redis>=5.0.0
# Production Database (optional)
# asyncpg>=0.30.0 # Already included above
# ICANN CZDS Zone File Access
pyCZDS>=1.7.0

View File

@ -0,0 +1,36 @@
"""Seed auction data for development."""
import asyncio
import sys
import os
# Add parent directory to path
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))
from app.database import AsyncSessionLocal
from app.services.auction_scraper import auction_scraper
async def main():
"""Seed auction data."""
async with AsyncSessionLocal() as db:
print("Seeding sample auction data...")
result = await auction_scraper.seed_sample_auctions(db)
print(f"✓ Seeded {result['found']} auctions ({result['new']} new, {result['updated']} updated)")
# Also try to scrape real data
print("\nAttempting to scrape real auction data...")
try:
scrape_result = await auction_scraper.scrape_all_platforms(db)
print(f"✓ Scraped {scrape_result['total_found']} auctions from platforms:")
for platform, stats in scrape_result['platforms'].items():
print(f" - {platform}: {stats.get('found', 0)} found")
if scrape_result['errors']:
print(f" Errors: {scrape_result['errors']}")
except Exception as e:
print(f" Scraping failed (this is okay): {e}")
print("\n✓ Done!")
if __name__ == "__main__":
asyncio.run(main())

Some files were not shown because too many files have changed in this diff.