diff --git a/MARKET_CONCEPT.md b/MARKET_CONCEPT.md
index 2aa23db..01fd25e 100644
--- a/MARKET_CONCEPT.md
+++ b/MARKET_CONCEPT.md
@@ -1,447 +1,173 @@
-# 🎯 POUNCE MARKET – The Concept for the Unicorn Journey
+# 🎯 POUNCE MARKET – The Heart of the Platform
+
+> **Last updated:** December 11, 2025
---
-# 📦 PART 1: INVENTORY – What do we have?
+## 📋 Executive Summary
-## Overview: Code Inventory
+The **Market Page** is the heart of Pounce. This is where all data sources converge and are presented to the user as a **"Clean Feed"**.
-### ✅ KEEP – Works well, vision-aligned
+### Vision (from pounce_terminal.md)
+> *"The Market Page shows all domains that either:*
+> 1. *Are up for sale (auctions)*
+> 2. *Will become available soon (drops)*
+> 3. *Are offered directly via Pounce (Pounce Direct)"*
-| Component | Path | Status | Description |
-|-----------|------|--------|-------------|
-| **Listings API** | `backend/app/api/listings.py` | ✅ Complete | Pounce Direct marketplace with DNS verification |
-| **Listing Model** | `backend/app/models/listing.py` | ✅ Complete | DomainListing, ListingInquiry, ListingView |
-| **My Listings Page** | `frontend/src/app/terminal/listing/page.tsx` | ✅ Complete | Seller dashboard with verification wizard |
-| **Public Marketplace** | `frontend/src/app/buy/page.tsx` | ✅ Complete | Public browse page for listings |
-| **Listing Detail** | `frontend/src/app/buy/[slug]/page.tsx` | ✅ Complete | Public landing page per listing |
-| **Sniper Alerts API** | `backend/app/api/sniper_alerts.py` | ✅ Complete | Alert matching for auctions |
-| **Sniper Alert Model** | `backend/app/models/sniper_alert.py` | ✅ Complete | SniperAlert, SniperAlertMatch |
-| **Scheduler** | `backend/app/scheduler.py` | ✅ Complete | APScheduler with scraping, alerts, checks |
-| **Valuation Service** | `backend/app/services/valuation.py` | ✅ Complete | Pounce Score calculation |
-| **TLD Prices API** | `backend/app/api/tld_prices.py` | ✅ Complete | Intel/pricing feature |
-| **TLD Scraper** | `backend/app/services/tld_scraper/` | ✅ Works | Porkbun + aggregator |
-| **Portfolio API** | `backend/app/api/portfolio.py` | ✅ Complete | Manage your own domains |
-| **Domain Health** | `backend/app/services/domain_health.py` | ✅ Complete | 4-layer monitoring |
-| **SEO Analyzer** | `backend/app/services/seo_analyzer.py` | ✅ Complete | Moz API integration |
-| **Email Service** | `backend/app/services/email_service.py` | ✅ Complete | Notifications |
-| **Stripe Service** | `backend/app/services/stripe_service.py` | ✅ Complete | Subscriptions |
+### Current Status: Phase 1 – Intelligence
----
-
-### ⚠️ REWORK – Works, but needs optimization
-
-| Component | Path | Problem | Solution |
-|-----------|------|---------|----------|
-| **Auction Scraper** | `backend/app/services/auction_scraper.py` | Scraping is fragile, often empty | API-first + fallback logic |
-| **Auctions API** | `backend/app/api/auctions.py` | No Pounce Direct integration | Build a unified feed |
-| **Market Page** | `frontend/src/app/terminal/market/page.tsx` | Shows external auctions only | Integrate Pounce Direct |
-| **Pounce Score** | In `market/page.tsx` | Too simple (length + TLD only) | Extend with market signals |
-| **Public Auctions** | `frontend/src/app/auctions/page.tsx` | No Pounce Direct highlight | Visual hierarchy |
-
----
-
-### ❌ REMOVE / CONSOLIDATE – Redundant or outdated
-
-| Component | Path | Reason | Action |
-|-----------|------|--------|--------|
-| **Empty folder** | `frontend/src/app/dashboard/` | Empty (legacy of /command) | Delete |
-| **Empty folder** | `frontend/src/app/portfolio/` | Empty (legacy) | Delete |
-| **Empty folder** | `frontend/src/app/settings/` | Empty (legacy) | Delete |
-| **Empty folder** | `frontend/src/app/watchlist/` | Empty (legacy) | Delete |
-| **Empty folder** | `frontend/src/app/careers/` | No content | Delete or TODO |
-| **Intelligence Redirect** | `frontend/src/app/intelligence/page.tsx` | Redirects to /tld-pricing | Check whether still needed |
-| **Market Public** | `frontend/src/app/market/page.tsx` | Duplicate? Check | Possibly consolidate with /auctions |
-
----
-
-## Detailed Analysis per Area
-
-### 1. BACKEND: API Routes (`backend/app/api/`)
-
-```
-backend/app/api/
-├── __init__.py          ✅ Router registration
-├── admin.py             ✅ Admin panel APIs
-├── auctions.py          ⚠️ Rework (unified feed)
-├── auth.py              ✅ Login/register/JWT
-├── blog.py              ✅ Blog feature
-├── check.py             ✅ Domain availability check
-├── contact.py           ✅ Contact form
-├── deps.py              ✅ Dependencies
-├── domains.py           ✅ Watchlist
-├── listings.py          ✅ Pounce Direct marketplace
-├── oauth.py             ✅ Google/GitHub OAuth
-├── portfolio.py         ✅ Portfolio management
-├── price_alerts.py      ✅ TLD price alerts
-├── seo.py               ✅ SEO Juice (Tycoon)
-├── sniper_alerts.py     ✅ Auction sniper alerts
-├── subscription.py      ✅ Stripe integration
-├── tld_prices.py        ✅ TLD pricing data
-└── webhooks.py          ✅ Stripe webhooks
-```
-
-**Action:**
-- `auctions.py`: add a unified feed endpoint that combines Pounce Direct + external auctions
-
----
-
-### 2. BACKEND: Services (`backend/app/services/`)
-
-```
-backend/app/services/
-├── auction_scraper.py   ⚠️ Improve fallback logic
-├── auth.py              ✅ Keep
-├── domain_checker.py    ✅ Keep
-├── domain_health.py     ✅ Keep
-├── email_service.py     ✅ Keep
-├── price_tracker.py     ✅ Keep
-├── seo_analyzer.py      ✅ Keep
-├── stripe_service.py    ✅ Keep
-├── valuation.py         ⚠️ Integrate Pounce Score v2.0
-└── tld_scraper/
-    ├── aggregator.py    ✅ Keep
-    ├── base.py          ✅ Keep
-    ├── porkbun.py       ✅ Keep
-    └── tld_list.py      ✅ Keep
-```
-
-**Actions:**
-1. `auction_scraper.py`: add a `scrape_with_fallback()` method
-2. `valuation.py`: Pounce Score v2.0 with market signals
-
----
-
-### 3. BACKEND: Models (`backend/app/models/`)
-
-```
-backend/app/models/
-├── admin_log.py         ✅ Keep
-├── auction.py           ✅ DomainAuction, AuctionScrapeLog
-├── blog.py              ✅ Keep
-├── domain.py            ✅ Domain, DomainCheck
-├── listing.py           ✅ DomainListing, ListingInquiry, ListingView
-├── newsletter.py        ✅ Keep
-├── portfolio.py         ✅ PortfolioDomain
-├── price_alert.py       ✅ TLDPriceAlert
-├── seo_data.py          ✅ DomainSEOData
-├── sniper_alert.py      ✅ SniperAlert, SniperAlertMatch
-├── subscription.py      ✅ Subscription, tier config
-├── tld_price.py         ✅ TLDPrice, TLDInfo
-└── user.py              ✅ User
-```
-
-**Status:** All models are clean and vision-aligned. No changes needed.
-
----
-
-### 4. FRONTEND: Terminal (Authenticated) (`frontend/src/app/terminal/`)
-
-```
-frontend/src/app/terminal/
-├── page.tsx             ✅ Redirect to /radar
-├── radar/page.tsx       ✅ Dashboard
-├── market/page.tsx      ⚠️ Integrate Pounce Direct!
-├── intel/page.tsx       ✅ TLD overview
-├── intel/[tld]/page.tsx ✅ TLD detail
-├── watchlist/page.tsx   ✅ Domain monitoring
-├── listing/page.tsx     ✅ My Listings (seller dashboard)
-├── settings/page.tsx    ✅ User settings
-└── welcome/page.tsx     ✅ Onboarding
-```
-
-**Actions:**
-1. `market/page.tsx`: show Pounce Direct listings in the feed
-2. `market/page.tsx`: visual hierarchy (🐾 Pounce vs 🟢 External)
-
----
-
-### 5. FRONTEND: Public Pages (`frontend/src/app/`)
-
-```
-frontend/src/app/
-├── page.tsx             ✅ Landing page
-├── auctions/page.tsx    ⚠️ Highlight Pounce Direct
-├── buy/page.tsx         ✅ Marketplace browse
-├── buy/[slug]/page.tsx  ✅ Listing detail
-├── tld-pricing/         ✅ TLD Intel public
-├── pricing/page.tsx     ✅ Subscription tiers
-├── blog/                ✅ Blog
-├── login/page.tsx       ✅ Auth
-├── register/page.tsx    ✅ Auth
-└── ...                  ✅ Legal, contact, etc.
-```
-
-**Actions:**
-1. `auctions/page.tsx`: feature "🐾 Pounce Direct" listings prominently
-2. Consolidate: merge `/market/` with `/auctions/`?
-
----
-
-### 6. FRONTEND: API Client (`frontend/src/lib/api.ts`)
-
-**Status:** ✅ Complete
-
-Contains all required methods:
-- `getAuctions()` – external auctions
-- `getMarketplaceListings()` – TODO: wire up the backend (currently returns an empty list)
-
-**Action:**
-- `getMarketplaceListings()` → connect to the backend `/listings` endpoint
-
----
-
-## Summary: Cleanup List
-
-### Delete immediately (empty folders):
-```bash
-rm -rf frontend/src/app/dashboard/
-rm -rf frontend/src/app/portfolio/
-rm -rf frontend/src/app/settings/
-rm -rf frontend/src/app/watchlist/
-rm -rf frontend/src/app/careers/
-```
-
-### Consolidate:
-- `/market/page.tsx` and `/auctions/page.tsx` → one page for the public market
-- `/intelligence/page.tsx`: check whether the redirect is still needed
-
-### Code changes:
-1. **Market Page (Terminal)**: Pounce Direct + external in one feed
-2. **Auctions Page (Public)**: Pounce Direct prominent
-3. **API Client**: wire `getMarketplaceListings()` to the backend
-4. **Auctions API**: unified feed endpoint
-5. **Pounce Score**: v2.0 with market signals
-
----
-
-# 🚀 PART 2: CONCEPT – Where do we go from here?
-
-## Executive Summary
-
-The current market page works technically, but it is not yet "unicorn-ready".
-This concept transforms it from a simple auction aggregator into the **central domain-intelligence platform**.
-
----
-
-## 📊 AS-IS Analysis: Current Implementation
-
-### Data Sources (Backend)
 ```
 ┌─────────────────────────────────────────────────────────────────┐
-│                       CURRENT DATA FLOW                         │
+│               POUNCE MARKET – Current Data Flow                 │
 ├─────────────────────────────────────────────────────────────────┤
 │                                                                 │
-│  ExpiredDomains.net ──┐                                         │
-│                       │                                         │
-│  GoDaddy RSS Feed ────┼──► Web Scraper ──► PostgreSQL/SQLite    │
-│                       │    (hourly)        (domain_auctions)    │
-│  Sedo Public Search ──┤                                         │
-│                       │                                         │
-│  NameJet Public ──────┤                                         │
-│                       │                                         │
-│  DropCatch Public ────┘                                         │
+│  DATA SOURCES:                                                  │
+│  ─────────────────────────────────────────────────────────────  │
+│                                                                 │
+│  📦 WEB SCRAPING (primary source)                               │
+│  ├── ExpiredDomains.net (325 auctions)  ✅                      │
+│  ├── GoDaddy RSS feed (10 auctions)     ✅                      │
+│  ├── Sedo public (7 auctions)           ✅                      │
+│  ├── NameJet public (6 auctions)        ✅                      │
+│  └── DropCatch public (7 auctions)      ✅                      │
+│                                                                 │
+│  🔌 OFFICIAL APIs (configured)                                  │
+│  ├── DropCatch Partner API  ⚠️ (own activity only)              │
+│  └── Sedo Partner API       ⏳ (credentials missing)            │
+│                                                                 │
+│  🐾 POUNCE DIRECT (user listings)                               │
+│  └── DNS-verified sale offers  ✅ (0 listings)                  │
+│                                                                 │
+│  🔮 ZONE FILES (Phase 3 – future)                               │
+│  ├── Verisign .com/.net  🔒                                     │
+│  └── PIR .org            🔒                                     │
+│                                                                 │
+│  ─────────────────────────────────────────────────────────────  │
+│  TOTAL: 355 domains in the feed | 0 Pounce Direct               │
 │                                                                 │
 └─────────────────────────────────────────────────────────────────┘
 ```
-### Problems with the Current Setup
-
-| Problem | Impact | Severity |
-|---------|--------|----------|
-| **Web scraping is fragile** | Sites change their layout → scraper breaks | 🔴 High |
-| **Data is often stale** | End times are wrong, prices incorrect | 🔴 High |
-| **No "Pounce Direct" content** | External data only, no USP | 🔴 High |
-| **Rate limiting & blocking** | Platforms block scrapers | 🟡 Medium |
-| **No true real-time data** | Hourly scraping is too slow | 🟡 Medium |
-| **Pounce Score is simplistic** | Length + TLD only, no real signals | 🟡 Medium |
-
---
-## 🚀 TO-BE Concept: The Unicorn Architecture
+## 📋 PART 1: Inventory – What do we have?
-### Phase 1: The "Clean Feed" (now → 3 months)
+### A. Backend Components ✅
-**Goal:** The best auction overview with real added value.
+| Component | Status | Description |
+|-----------|--------|-------------|
+| **Unified Feed API** `/auctions/feed` | ✅ Live | Combines Pounce Direct + external |
+| **Pounce Score v2.0** | ✅ Live | Length, TLD, bids, time pressure |
+| **Vanity Filter** | ✅ Live | Premium domains for public users |
+| **Auction Scraper** | ✅ Running | 5 platforms, scheduler active |
+| **Listings API** | ✅ Done | DNS verification, inquiry system |
+| **Sniper Alerts** | ✅ Done | Keyword matching, notifications |
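
The Pounce Score v2.0 factors named in the table (length, TLD, bids, time pressure) reduce to a simple additive model. A rough sketch — the weights here are illustrative, not the production values:

```python
def pounce_score_v2(name: str, tld: str, num_bids: int, hours_left: float) -> int:
    """Toy version of the v2.0 idea: intrinsic value + market signals, clamped to 0-100."""
    score = 50  # baseline
    # Length: short names are worth more
    score += {1: 50, 2: 45, 3: 40, 4: 30, 5: 20, 6: 15, 7: 10}.get(len(name), max(0, 15 - len(name)))
    # TLD premium
    score += {'com': 20, 'ai': 25, 'io': 18, 'co': 12, 'de': 10, 'ch': 10}.get(tld, 0)
    # Market signals: bid activity and time pressure
    if num_bids >= 10: score += 10
    elif num_bids >= 5: score += 5
    if hours_left < 1: score += 10
    elif hours_left < 4: score += 5
    # Penalty: hyphens read as spammy
    if '-' in name: score -= 30
    return max(0, min(100, score))

print(pounce_score_v2("pixel", "com", 12, 0.5))       # clamped to 100
print(pounce_score_v2("crypto-bank", "io", 0, 50.0))  # 42
```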
-#### 1.1 Data Strategy: Hybrid Approach
+### B. Frontend Components ✅
+| Page | Status | Description |
+|------|--------|-------------|
+| `/terminal/market` | ✅ Live | Full market feed for authenticated users |
+| `/auctions` | ✅ Live | Public market with vanity filter |
+| `/buy` | ✅ Live | Pounce Direct marketplace browse |
+| `/buy/[slug]` | ✅ Live | Listing detail page |
+| `/terminal/listing` | ✅ Live | Seller dashboard |
+
+### C. Data Sources – Reality Check
+
+#### Official APIs – The Sobering Truth
+
+**DropCatch API:**
 ```
-┌─────────────────────────────────────────────────────────────────┐
-│                    NEW DATA ARCHITECTURE                        │
-├─────────────────────────────────────────────────────────────────┤
-│                                                                 │
-│  TIER 1: OFFICIAL APIs (reliable, real-time)                    │
-│  ────────────────────────────────────────────────────────────   │
-│  • GoDaddy Partner API (if a partner account exists)            │
-│  • Sedo Partner API (affiliate program)                         │
-│  • DropCatch Public API                                         │
-│                                                                 │
-│  TIER 2: WEB SCRAPING (backup, validated)                       │
-│  ────────────────────────────────────────────────────────────   │
-│  • ExpiredDomains.net (deleted domains)                         │
-│  • NameJet Public (with fallback logic)                         │
-│                                                                 │
-│  TIER 3: POUNCE EXCLUSIVE (our USP!)                            │
-│  ────────────────────────────────────────────────────────────   │
-│  • User listings ("Pounce Direct" / "For Sale")                 │
-│  • DNS-verified owners                                          │
-│  • Instant-buy option                                           │
-│                                                                 │
-└─────────────────────────────────────────────────────────────────┘
+Status:  ✅ Authenticated
+Problem: Shows only YOUR OWN activity (bids, backorders),
+         NOT the public auction inventory
+Use:     User integration (connect your DropCatch account)
 ```
-#### 1.2 The "Clean Feed" Algorithm
+**Sedo API:**
+```
+Status:        ⏳ Client ready, credentials missing
+Where to find: Sedo.com → My Sedo → API access
+Required:      Partner ID + SignKey
+```
+
+#### Web Scraping – Our Primary Source
```python
-# Spam filter v2.0 (vanity filter)
+# Current scraper architecture
+TIER_1_APIS = [
+    ("DropCatch", _fetch_dropcatch_api),   # for your own activity
+    ("Sedo", _fetch_sedo_api),             # when configured
+]
+
+TIER_2_SCRAPING = [
+    ("ExpiredDomains", _scrape_expireddomains),  # 325 domains
+    ("GoDaddy", _scrape_godaddy_rss),            # 10 domains
+    ("Sedo", _scrape_sedo_public),               # 7 domains (fallback)
+    ("NameJet", _scrape_namejet_public),         # 6 domains
+    ("DropCatch", _scrape_dropcatch_public),     # 7 domains (fallback)
+]
+```
+
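The two tiers above can be combined by preferring Tier-1 API results and scraping only where the API yielded nothing. A hedged sketch of that strategy — the function shape is an assumption, and the fetchers below are stand-ins, not the real `_fetch_*`/`_scrape_*` implementations:

```python
def fetch_with_fallback(apis, scrapers):
    """Collect listings per platform: API first, scraper only if the
    API raised an error or returned nothing."""
    results = {}
    for platform, fetch in apis:
        try:
            rows = fetch()
        except Exception:
            rows = []  # broken API credentials must not kill the whole run
        if rows:
            results[platform] = rows
    for platform, scrape in scrapers:
        if not results.get(platform):
            results[platform] = scrape()
    return results

# Stand-in fetchers for illustration only:
feed = fetch_with_fallback(
    apis=[("Sedo", lambda: [])],                    # configured but empty
    scrapers=[("Sedo", lambda: ["example.com"]),    # fallback fills the gap
              ("GoDaddy", lambda: ["flipflop.io"])],
)
print(feed)  # {'Sedo': ['example.com'], 'GoDaddy': ['flipflop.io']}
```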
+---
+
+## 🎯 PART 2: The Concept – The 3 Pillars of the Market
+
+### Pillar 1: AUCTIONS (External Platforms)
+
+> *"Show all relevant auctions from GoDaddy, Sedo, NameJet, etc."*
+
+**Data sources:**
+- Web scraping (primary)
+- Partner APIs (where available)
+
+**Filter strategy:**
+```python
+# Vanity filter for public users (from pounce_features.md)
def is_premium_domain(domain: str) -> bool:
- name = domain.rsplit('.', 1)[0]
- tld = domain.rsplit('.', 1)[1]
+ name, tld = domain.rsplit('.', 1)
-    # RULE 1: premium TLDs only for public users
-    premium_tlds = ['com', 'io', 'ai', 'co', 'de', 'ch', 'net', 'org', 'app', 'dev']
-    if tld not in premium_tlds:
+    # Premium TLDs only
+    if tld not in ['com', 'io', 'ai', 'co', 'ch', 'de', 'net', 'org', 'app', 'dev']:
return False
-    # RULE 2: no spam patterns
-    if len(name) > 12:  # short = premium
-        return False
-    if name.count('-') > 0:  # no hyphens
-        return False
-    if sum(c.isdigit() for c in name) > 1:  # at most 1 digit
-        return False
-    if any(word in name.lower() for word in ['xxx', 'casino', 'loan', 'cheap']):
-        return False
-
-    # RULE 3: consonant check (no "xkqzfgh.com"); requires itertools.groupby
-    consonants = 'bcdfghjklmnpqrstvwxyz'
-    max_consonant_streak = max(len(list(g)) for k, g in groupby(name, key=lambda c: c.lower() in consonants) if k)
-    if max_consonant_streak > 4:
-        return False
+    # No spam patterns
+    if len(name) > 12: return False
+    if '-' in name: return False
+    if sum(c.isdigit() for c in name) > 1: return False
return True
```
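
Applied to a raw feed, the simplified filter from the `+` lines above behaves like this:

```python
def is_premium_domain(domain: str) -> bool:
    # Same logic as the "+" version of the filter above
    name, tld = domain.rsplit('.', 1)
    if tld not in ['com', 'io', 'ai', 'co', 'ch', 'de', 'net', 'org', 'app', 'dev']:
        return False
    if len(name) > 12: return False   # short = premium
    if '-' in name: return False      # no hyphens
    if sum(c.isdigit() for c in name) > 1: return False  # at most 1 digit
    return True

raw = ["pixel.com", "cheap-loans4u.biz", "crypto-bank.io", "swift.io"]
clean = [d for d in raw if is_premium_domain(d)]
print(clean)  # ['pixel.com', 'swift.io']
```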
-#### 1.3 Pounce Score 2.0
-
-The current score is too simplistic. Here is the improved version:
-
-```python
-def calculate_pounce_score_v2(domain: str, auction_data: dict) -> int:
-    score = 50  # baseline
-    name = domain.rsplit('.', 1)[0]
-    tld = domain.rsplit('.', 1)[1]
-
-    # ──────────────────────────────────────────────────────────────
-    # A) INTRINSIC VALUE (the domain itself)
-    # ──────────────────────────────────────────────────────────────
-
-    # Length (short = valuable)
-    length_scores = {1: 50, 2: 45, 3: 40, 4: 30, 5: 20, 6: 15, 7: 10}
-    score += length_scores.get(len(name), max(0, 15 - len(name)))
-
-    # TLD premium
-    tld_scores = {'com': 20, 'ai': 25, 'io': 18, 'co': 12, 'de': 10, 'ch': 10}
-    score += tld_scores.get(tld, 0)
-
-    # Dictionary word bonus
-    common_words = ['tech', 'data', 'cloud', 'app', 'dev', 'net', 'hub', 'lab', 'pro']
-    if name.lower() in common_words or any(word in name.lower() for word in common_words):
-        score += 15
-
-    # ──────────────────────────────────────────────────────────────
-    # B) MARKET SIGNALS (activity)
-    # ──────────────────────────────────────────────────────────────
-
-    # Bid activity (more bids = more interest)
-    bids = auction_data.get('num_bids', 0)
-    if bids >= 20: score += 15
-    elif bids >= 10: score += 10
-    elif bids >= 5: score += 5
-
-    # Time pressure (ending soon = opportunity)
-    hours_left = auction_data.get('hours_left', 999)
-    if hours_left < 1: score += 10  # HOT!
-    elif hours_left < 4: score += 5
-
-    # Price-to-value ratio
-    current_bid = auction_data.get('current_bid', 0)
-    estimated_value = estimate_base_value(name, tld)
-    if current_bid > 0 and estimated_value > current_bid * 1.5:
-        score += 15  # undervalued!
-
-    # ──────────────────────────────────────────────────────────────
-    # C) PENALTIES (deductions)
-    # ──────────────────────────────────────────────────────────────
-
-    if '-' in name: score -= 30
-    if any(c.isdigit() for c in name) and len(name) > 3: score -= 20
-    if len(name) > 15: score -= 25
-
-    return max(0, min(100, score))
-```
+**UI presentation:**
+| Domain | Source | Price | Status | Action |
+|--------|--------|-------|--------|--------|
+| **crypto-bank.io** | 🟢 GoDaddy | $2,500 | ⏱️ 2h left | [Bid →] |
+| **meta-shop.com** | 🟢 Sedo | $5,000 | 🤝 Offer | [View →] |
---
-### Phase 2: The "Pounce Direct" Marketplace (3 → 6 months)
+### Pillar 2: POUNCE DIRECT (User Listings)
-**Goal:** Own inventory = unique content = USP
-
-#### 2.1 The Killer Feature: "Pounce Direct"
+> *"These are the domains available ONLY on Pounce. Our USP."*
+**The concept (from pounce_terminal.md):**
 ```
 ┌─────────────────────────────────────────────────────────────────┐
-│                   POUNCE DIRECT INTEGRATION                     │
-├─────────────────────────────────────────────────────────────────┤
-│                                                                 │
-│  MARKET FEED (mixed)                                            │
-│  ────────────────────────────────────────────────────────────   │
-│                                                                 │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ 🐾 POUNCE DIRECT                                          │  │
-│  │ ────────────────────────────────────────────────────────  │  │
-│  │ zurich-immo.ch       $950    ⚡ INSTANT          [BUY]    │  │
-│  │ ✅ Verified Owner                                         │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                                                                 │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ 🟢 EXTERNAL AUCTION                                       │  │
-│  │ ────────────────────────────────────────────────────────  │  │
-│  │ techflow.io          $250    ⏱️ 6h left         [BID →]   │  │
-│  │ via GoDaddy                                               │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                                                                 │
-└─────────────────────────────────────────────────────────────────┘
-```
-
-#### 2.2 Why This Is Brilliant
-
-| Advantage | Explanation |
-|-----------|-------------|
-| **Unique content** | Domains available ONLY on Pounce |
-| **Higher conversion** | "Instant buy" instead of "bid on external site" |
-| **Vendor lock-in** | Sellers list with us (because of 0% commission) |
-| **SEO power** | Every listing = its own landing page |
-| **Trust signal** | DNS verification = quality guarantee |
-
-#### 2.3 The Seller Flow (from `pounce_terminal.md`)
-
-```
-┌─────────────────────────────────────────────────────────────────┐
-│                        LISTING WIZARD                           │
+│                POUNCE DIRECT – The Listing Wizard               │
 ├─────────────────────────────────────────────────────────────────┤
 │                                                                 │
 │  STEP 1: ENTER DOMAIN                                           │
 │  ────────────────────────────────────────────────────────────   │
-│  [________________________]  zurich-immo.ch                     │
-│  Price: [$950]   ○ Fixed price  ○ Open to offers                │
+│  Domain: [zurich-immo.ch___________]                            │
+│  Price:  [$950_______]   ○ Fixed price  ○ Open to offers        │
 │                                                                 │
-│  STEP 2: DNS VERIFICATION                                       │
+│  STEP 2: DNS VERIFICATION (trust check)                         │
 │  ────────────────────────────────────────────────────────────   │
-│  Add this TXT record to your domain:                            │
+│  Add this TXT record at your registrar:                         │
 │                                                                 │
-│  Name:  _pounce-verify                                          │
+│  Name:  _pounce-verify                                          │
 │  Value: pounce-verify-8a3f7b9c2e1d                              │
 │                                                                 │
 │  [🔍 VERIFY DNS]                                                │
@@ -449,967 +175,335 @@ def calculate_pounce_score_v2(domain: str, auction_data: dict) -> int:
 │  STEP 3: LIVE!                                                  │
 │  ────────────────────────────────────────────────────────────   │
 │  ✅ Domain verified!                                            │
-│  Your listing is now visible in the market feed.                │
+│  Your listing now appears in the market feed.                   │
+│                                                                 │
+└─────────────────────────────────────────────────────────────────┘
+```
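
The Step-2 check reduces to generating a random token and matching it against the TXT records found at `_pounce-verify.<domain>`. A stdlib-only sketch — the actual DNS lookup (done by the backend with a resolver library) is stubbed out here, and the function names are illustrative:

```python
import secrets

def make_verification_token() -> str:
    # Mirrors the "pounce-verify-8a3f7b9c2e1d" format shown in the wizard
    return "pounce-verify-" + secrets.token_hex(6)

def is_verified(expected_token: str, txt_records: list[str]) -> bool:
    """txt_records would come from a TXT lookup on _pounce-verify.<domain>."""
    return expected_token in txt_records

token = make_verification_token()
print(is_verified(token, [token, "v=spf1 -all"]))  # True: record was added
print(is_verified(token, []))                      # False: nothing published yet
```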
+
+**Why this is brilliant:**
+
+| Advantage | Explanation |
+|-----------|-------------|
+| **Unique content** | Domains available ONLY on Pounce |
+| **Higher conversion** | "Instant buy" instead of "bid on external site" |
+| **Vendor lock-in** | Sellers list with us (0% commission!) |
+| **SEO power** | Every listing = its own landing page |
+| **Trust signal** | DNS verification = quality guarantee |
+
+**UI presentation:**
+| Domain | Source | Price | Status | Action |
+|--------|--------|-------|--------|--------|
+| **zurich-immo.ch** | 🐾 **Pounce** | **$950** | ⚡ **Instant** | **[Buy Now]** |
+
+---
+
+### Pillar 3: DROPS (Domains Becoming Available Soon)
+
+> *"Show domains BEFORE they land in auctions."*
+
+**Phase 1 (now): deleted domains via scraping**
+```
+ExpiredDomains.net → deleted-domains list → Pounce filter → feed
+```
+
+**Phase 3 (future): zone file analysis**
+```
+┌─────────────────────────────────────────────────────────────────┐
+│            ZONE FILE PIPELINE – The Unicorn Strategy            │
+├─────────────────────────────────────────────────────────────────┤
+│                                                                 │
+│  1. DAILY DOWNLOAD (4:00 UTC)                                   │
+│     └── zone files from Verisign, PIR, etc.                     │
+│                                                                 │
+│  2. DIFF ANALYSIS                                               │
+│     ├── what was there yesterday but is gone today?             │
+│     └── these domains will DROP in 1–5 days!                    │
+│                                                                 │
+│  3. POUNCE ALGORITHM                                            │
+│     └── let only premium domains through (score > 70)           │
+│                                                                 │
+│  4. OUTPUT: "Drops Tomorrow" (Tycoon exclusive)                 │
+│     └── domains BEFORE they appear in auctions                  │
 │                                                                 │
 └─────────────────────────────────────────────────────────────────┘
 ```
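
Step 2 of the pipeline is plain set arithmetic over two daily snapshots, e.g.:

```python
def find_pending_drops(yesterday: set[str], today: set[str]) -> set[str]:
    """Domains present in yesterday's zone file but missing today are
    about to drop (they sit in the 1-5 day redemption period)."""
    return yesterday - today

# Toy snapshots for illustration:
yesterday = {"pixel.com", "swift.io", "keeper.com"}
today = {"keeper.com", "newreg.com"}
print(sorted(find_pending_drops(yesterday, today)))  # ['pixel.com', 'swift.io']
```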
---
-### Phase 3: Data Sovereignty (6 → 12 months) 🔒
+## 🔧 PART 3: Technical Architecture
-**Goal:** Independence from external sources. **OUR OWN DATA = OUR OWN MONOPOLY.**
-
-> *"Pounce knows things GoDaddy keeps from you."* – pounce_strategy.md
-
-#### 3.1 Zone File Analysis – The Unicorn Driver
-
-**What are zone files?**
-Zone files are the "master lists" of all registered domains per TLD. They are updated daily by the registries (Verisign, PIR, etc.).
-
-**Who gets access?**
-- Anyone can apply with the ICANN-accredited registries
-- Verisign (.com/.net): https://www.verisign.com/en_US/channel-resources/domain-registry-products/zone-file/index.xhtml
-- PIR (.org): Zone File Access Program
-- Donuts (.xyz, .online, etc.): TLD Zone File Access
-
-**Cost:** $0 – $10,000/year depending on TLD and usage
-
-```
-┌─────────────────────────────────────────────────────────────────┐
-│            ZONE FILE PIPELINE – The Data Revolution             │
-├─────────────────────────────────────────────────────────────────┤
-│                                                                 │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ TIER 1: CRITICAL TLDs (apply immediately)                 │  │
-│  ├───────────────────────────────────────────────────────────┤  │
-│  │ Verisign → .com, .net     ~160M + 13M domains             │  │
-│  │ PIR      → .org           ~10M domains                    │  │
-│  │ Afilias  → .info          ~4M domains                     │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                              │                                  │
-│                              ▼                                  │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ TIER 2: PREMIUM TLDs (Phase 2)                            │  │
-│  ├───────────────────────────────────────────────────────────┤  │
-│  │ CentralNIC → .io, .co       premium for startups          │  │
-│  │ Google     → .app, .dev     tech domains                  │  │
-│  │ Donuts     → .xyz, .online  volume                        │  │
-│  │ SWITCH     → .ch            Swiss market                  │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                              │                                  │
-│                              ▼                                  │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ POUNCE INTELLIGENCE ENGINE                                │  │
-│  ├───────────────────────────────────────────────────────────┤  │
-│  │                                                           │  │
-│  │ 1. DAILY DOWNLOAD (4:00 UTC)                              │  │
-│  │    └── ~500GB of compressed data per day                  │  │
-│  │                                                           │  │
-│  │ 2. DIFF ANALYSIS                                          │  │
-│  │    └── what is NEW? what is GONE?                         │  │
-│  │                                                           │  │
-│  │ 3. DROP PREDICTION                                        │  │
-│  │    └── domains vanishing from the zone = dropping         │  │
-│  │                                                           │  │
-│  │ 4. QUALITY SCORING (Pounce Algorithm)                     │  │
-│  │    └── let only premium domains through                   │  │
-│  │                                                           │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                              │                                  │
-│                              ▼                                  │
-│  ┌───────────────────────────────────────────────────────────┐  │
-│  │ OUTPUT: EXCLUSIVE INTELLIGENCE                            │  │
-│  ├───────────────────────────────────────────────────────────┤  │
-│  │                                                           │  │
-│  │ 🔮 "Drops Tomorrow" – domains BEFORE they hit auctions    │  │
-│  │ 📈 "Trending Registrations" – what is being hyped         │  │
-│  │ ⚠️ "Expiring Premium" – high-value domains near the end   │  │
-│  │ 🔍 "Pattern Detection" – which keywords are exploding     │  │
-│  │                                                           │  │
-│  └───────────────────────────────────────────────────────────┘  │
-│                                                                 │
-└─────────────────────────────────────────────────────────────────┘
-```
-
-#### 3.2 The Pounce Algorithm – The "No-Bullshit" Filter
+### The Unified Feed API Endpoint
 ```python
-# backend/app/services/zone_analyzer.py (NEW – TO BE BUILT)
+# backend/app/api/auctions.py
-class ZoneFileAnalyzer:
-    """
-    Analyzes zone files and finds premium opportunities.
-
-    Input:  raw zone file (millions of domains)
-    Output: filtered premium list (hundreds)
-    """
-
-    async def analyze_drops(self, yesterday: set, today: set) -> list:
-        """
-        Finds domains that have disappeared from the zone.
-        These domains drop in 1-5 days (redemption period).
-        """
-        dropped = yesterday - today  # set difference
-
-        premium_drops = []
-        for domain in dropped:
-            score = self.calculate_pounce_score(domain)
-
-            # let only premium through
-            if score >= 70:
-                premium_drops.append({
-                    "domain": domain,
-                    "score": score,
-                    "drop_date": self.estimate_drop_date(domain),
-                    "estimated_value": self.estimate_value(domain),
-                })
-
-        return sorted(premium_drops, key=lambda x: x['score'], reverse=True)
-
-    def calculate_pounce_score(self, domain: str) -> int:
-        """
-        The Pounce Algorithm – quality filter for domains.
-
-        Factors:
-        - length (short = valuable)
-        - TLD (com > io > xyz)
-        - no digits/hyphens
-        - dictionary word bonus
-        - historical data (when available)
-        """
-        name = domain.rsplit('.', 1)[0]
-        tld = domain.rsplit('.', 1)[1]
-        score = 50  # baseline
-
-        # length score
-        length_scores = {1: 50, 2: 45, 3: 40, 4: 30, 5: 20, 6: 15, 7: 10}
-        score += length_scores.get(len(name), max(0, 15 - len(name)))
-
-        # TLD premium
-        tld_scores = {'com': 20, 'ai': 25, 'io': 18, 'co': 12, 'ch': 15, 'de': 10}
-        score += tld_scores.get(tld, 0)
-
-        # penalties
-        if '-' in name: score -= 30
-        if any(c.isdigit() for c in name): score -= 20
-        if len(name) > 12: score -= 15
-
-        # dictionary word bonus
-        if self.is_dictionary_word(name):
-            score += 25
-
-        return max(0, min(100, score))
-```
-
-#### 3.3 The "Drops Tomorrow" Feed – Tycoon Exclusive
-
-```
-┌─────────────────────────────────────────────────────────────────┐
-│          🔮 DROPS TOMORROW – Tycoon Exclusive ($29/mo)          │
-├─────────────────────────────────────────────────────────────────┤
-│                                                                 │
-│  These domains are NOT in auctions!                             │
-│  You can register them directly at a registrar.                 │
-│                                                                 │
-│  ─────────────────────────────────────────────────────────────  │
-│                                                                 │
-│  Domain        TLD    Score   Est. Value   Drops In             │
-│  ─────────────────────────────────────────────────────────────  │
-│  pixel.com     .com   95      $50,000      23h 45m              │
-│  swift.io      .io    88      $8,000       23h 12m              │
-│  quantum.ai    .ai    92      $25,000      22h 58m              │
-│  nexus.dev     .dev   84      $4,500       22h 30m              │
-│  fusion.co     .co    81      $3,200       21h 15m              │
-│                                                                 │
-│  ─────────────────────────────────────────────────────────────  │
-│                                                                 │
-│  💡 Pro tip: place a backorder for these domains at your        │
-│     registrar. First come, first served...                      │
-│                                                                 │
-│  [🔔 Set alert for "pixel.com"]                                 │
-│                                                                 │
-└─────────────────────────────────────────────────────────────────┘
-```
-
-#### 3.4 Why This Creates a MONOPOLY
-
-| Competitor | Data Source | Problem |
-|------------|-------------|---------|
-| **ExpiredDomains.net** | Zone files | Shows EVERYTHING (spam hell) |
-| **GoDaddy Auctions** | Own data | GoDaddy domains only |
-| **Sedo** | User listings | Overpriced, low volume |
-| **Pounce** | Zone files + **algorithm** | **Premium-filtered, clean** |
-
-**The difference:**
-- ExpiredDomains shows you 100,000 domains a day. 99,990 of them are junk.
-- Pounce shows you 100 premium domains. Every one of them is worth a look.
-
-**That sells subscriptions:**
-> *"I pay $29/month because Pounce saves me 20 hours of research per week."*
-#### 3.5 Technical Implementation – Server Requirements
-
-```
-┌─────────────────────────────────────────────────────────────────┐
-│              ZONE FILE PROCESSING – Infrastructure              │
-├─────────────────────────────────────────────────────────────────┤
-│                                                                 │
-│  SERVER REQUIREMENTS:                                           │
-│  ────────────────────────────────────────────────────────────   │
-│  • Storage: 2TB SSD (zone files are ~500GB/day compressed)      │
-│  • RAM: 64GB+ (for efficient set diffing)                       │
-│  • CPU: 16+ cores (parallel analysis)                           │
-│  • Cost: ~$300-500/month (Hetzner/OVH dedicated)                │
-│                                                                 │
-│  PROCESSING PIPELINE:                                           │
-│  ────────────────────────────────────────────────────────────   │
-│  04:00 UTC → zone file download (FTP/HTTPS)                     │
-│  04:30 UTC → decompression & parsing                            │
-│  05:00 UTC → diff analysis (yesterday vs today)                 │
-│  05:30 UTC → quality scoring (Pounce Algorithm)                 │
-│  06:00 UTC → database update (PostgreSQL)                       │
-│  06:15 UTC → alert matching (sniper alerts)                     │
-│  06:30 UTC → user notifications (email/SMS)                     │
-│                                                                 │
-│  STORAGE STRATEGY:                                              │
-│  ────────────────────────────────────────────────────────────   │
-│  • Store only premium domains (score > 50)                      │
-│  • 90 days of history for trend analysis                        │
-│  • Archive older data (S3 Glacier)                              │
-│                                                                 │
-└─────────────────────────────────────────────────────────────────┘
-```
-
-#### 3.6 Phase 1 vs Phase 3 – What Comes First?
-
-| Phase | Data Source | Status |
-|-------|-------------|--------|
-| **Phase 1 (NOW)** | Web scraping + Pounce Direct | ✅ Implemented |
-| **Phase 3 (6-12 mo)** | Zone files | 🔜 Planned |
-
-**Why wait?**
-1. Zone file access requires contracts with the registries (1-3 months)
-2. Infrastructure investment (~$500/month for servers)
-3. The algorithm needs testing (avoid false positives)
-
-**What we do NOW:**
-- Perfect scraping + Pounce Direct
-- Build the user base (which zone files will later monetize)
-- Develop the algorithm (it works without zone files too)
-
----
-
-## 💡 Concrete Changes for the Market Page
-
-### Frontend Changes
-
-#### 1. Improve the Visual Hierarchy
-
-```tsx
-// BEFORE: all items look the same
-// (the component names below are placeholders; the original JSX was lost)
-<div>
-  {items.map(item => <AuctionCard key={item.id} item={item} />)}
-</div>
-
-// AFTER: highlight Pounce Direct
-<div>
-  {/* Featured: Pounce Direct (if present) */}
-  {pounceDirectItems.length > 0 && (
-    <section className="pounce-direct">
-      <h3>Pounce Direct – Verified Instant Buy</h3>
-      {pounceDirectItems.map(item => <AuctionCard key={item.id} item={item} featured />)}
-    </section>
-  )}
-
-  {/* Standard: external auctions */}
-  <section>
-    {externalItems.map(item => <AuctionCard key={item.id} item={item} />)}
-  </section>
-</div>
-```
-
-#### 2. Filter Presets for User Journeys
-
-```tsx
-// Quick-filter buttons based on user intent
-const FILTER_PRESETS = {
-  'ending-soon': {
-    label: '⏱️ Ending Soon',
-    filter: { hours_left: { max: 4 } },
-    sort: 'time_asc'
-  },
-  'bargains': {
-    label: '💰 Under $100',
-    filter: { price: { max: 100 }, score: { min: 60 } },
-    sort: 'score_desc'
-  },
-  'premium': {
-    label: '💎 Premium Only',
-    filter: { score: { min: 80 }, tld: ['com', 'io', 'ai'] },
-    sort: 'price_desc'
-  },
-  'pounce-only': {
-    label: '🐾 Pounce Direct',
-    filter: { source: 'pounce' },
-    sort: 'created_desc'
-  }
-}
-```
-
-#### 3. "Opportunity Score" instead of just "Pounce Score"
-
-```tsx
-// Show WHY a domain is interesting
-function OpportunityIndicators({ item }) {
-  const indicators = []
-
-  if (item.hoursLeft < 2) indicators.push({ icon: '🔥', label: 'Ending soon' })
-  if (item.numBids < 3) indicators.push({ icon: '🎯', label: 'Low competition' })
-  if (item.valueRatio > 2) indicators.push({ icon: '📈', label: 'Undervalued' })
-  if (item.isPounce) indicators.push({ icon: '⚡', label: 'Instant buy' })
-
-  // Markup below is reconstructed; the original tags were lost in extraction
-  return (
-    <div className="flex gap-1">
-      {indicators.map(ind => (
-        <span key={ind.label} title={ind.label}>
-          {ind.icon}
-        </span>
-      ))}
-    </div>
-  )
-}
-```
-
-### Backend changes
-
-#### 1. Unified Feed API
-
-```python
-# NEW ENDPOINT: /api/v1/market/feed
@router.get("/feed")
async def get_market_feed(
- # Filter
- source: Optional[str] = Query(None, enum=['all', 'pounce', 'external']),
- score_min: int = Query(0, ge=0, le=100),
- price_max: Optional[float] = None,
- tld: Optional[List[str]] = Query(None),
- ending_within: Optional[int] = Query(None, description="Hours"),
-
- # Sort
- sort_by: str = Query('score', enum=['score', 'price', 'time', 'bids']),
-
- # Pagination
- limit: int = Query(30, le=100),
+ source: str = Query("all", enum=["all", "pounce", "external"]),
+ keyword: Optional[str] = None,
+ tld: Optional[str] = None,
+ min_price: Optional[float] = None,
+ max_price: Optional[float] = None,
+ min_score: int = Query(0, ge=0, le=100),
+    ending_within: Optional[int] = None,  # hours
+ verified_only: bool = False,
+ sort_by: str = Query("score", enum=["score", "price_asc", "price_desc", "time", "newest"]),
+ limit: int = Query(50, le=200),
offset: int = Query(0),
-
- # Auth
current_user: Optional[User] = Depends(get_current_user_optional),
):
"""
- Unified market feed combining:
- - Pounce Direct listings (user-listed domains)
- - External auctions (scraped from platforms)
+    🐾 UNIFIED MARKET FEED: the heart of Pounce
- For non-authenticated users:
- - Apply vanity filter (premium domains only)
- - Blur "Deal Score" (tease upgrade)
+    Combines:
+    - 🐾 Pounce Direct: DNS-verified user listings (instant buy)
+    - 🟢 External auctions: scraped from GoDaddy, Sedo, etc.
+    - 🔮 Drops: domains that will become available soon (Phase 3)
+
+    For non-authenticated users:
+    - Vanity filter active (premium domains only)
+    - Pounce Score visible, but limited details
+
+    For authenticated users (Trader/Tycoon):
+    - Full access to all domains
+    - Advanced filtering
+    - Valuation data
"""
-
- items = []
-
- # 1. Get Pounce Direct listings
- pounce_listings = await get_published_listings(db)
- for listing in pounce_listings:
- items.append({
- 'type': 'pounce_direct',
- 'domain': listing.domain,
- 'price': listing.asking_price,
- 'source': 'Pounce',
- 'status': 'instant',
- 'verified': listing.verification_status == 'verified',
- 'url': f'/buy/{listing.slug}', # Internal!
- })
-
- # 2. Get external auctions
- auctions = await get_active_auctions(db)
- for auction in auctions:
- # Apply vanity filter for non-auth users
- if not current_user and not is_premium_domain(auction.domain):
- continue
-
- items.append({
- 'type': 'auction',
- 'domain': auction.domain,
- 'price': auction.current_bid,
- 'source': auction.platform,
- 'status': 'auction',
- 'time_left': format_time_remaining(auction.end_time),
- 'url': auction.affiliate_url, # External
- })
-
- # 3. Calculate scores
- for item in items:
- item['pounce_score'] = calculate_pounce_score_v2(
- item['domain'],
- item
- )
-
- # 4. Sort and paginate
- items = sorted(items, key=lambda x: x['pounce_score'], reverse=True)
-
- return {
- 'items': items[offset:offset+limit],
- 'total': len(items),
- 'filters_applied': {...},
- }
```
-#### 2. Scraper improvements
+### Pounce Score v2.0
```python
-class AuctionScraperService:
- """
- IMPROVED: Resilient scraping with fallbacks
+def calculate_pounce_score_v2(domain: str, auction_data: dict) -> int:
"""
+    The Pounce Score: quality and opportunity rating
- async def scrape_with_fallback(self, platform: str, db: AsyncSession):
- """Try multiple methods to get data"""
-
- methods = [
- (f'_scrape_{platform.lower()}_api', 'API'), # Best: Official API
- (f'_scrape_{platform.lower()}_rss', 'RSS'), # Good: RSS Feed
- (f'_scrape_{platform.lower()}_html', 'HTML'), # Fallback: HTML Scrape
- ]
-
- for method_name, method_type in methods:
- method = getattr(self, method_name, None)
- if not method:
- continue
-
- try:
- result = await method(db)
- if result['found'] > 0:
- logger.info(f"{platform}: Got {result['found']} via {method_type}")
- return result
- except Exception as e:
- logger.warning(f"{platform} {method_type} failed: {e}")
- continue
-
- # All methods failed
- logger.error(f"{platform}: All scrape methods failed")
- return {'found': 0, 'new': 0, 'updated': 0, 'error': 'All methods failed'}
+    A) INTRINSIC VALUE (the domain itself)
+       - Length (short = valuable)
+       - TLD premium (com > io > xyz)
+       - Dictionary-word bonus
+
+    B) MARKET SIGNALS (activity)
+       - Bid activity (more bids = more interest)
+       - Time pressure (ending soon = opportunity)
+       - Price-to-value ratio (undervalued = 🔥)
+
+    C) PENALTIES
+       - Hyphens (-30)
+       - Digits when the name is longer than 3 chars (-20)
+       - Too long, >15 chars (-25)
+    """
+    score = 50  # baseline
+    name = domain.rsplit('.', 1)[0]
+    tld = domain.rsplit('.', 1)[1] if '.' in domain else ''
+
+    # Length
+    if len(name) <= 3: score += 30
+    elif len(name) == 4: score += 25
+    elif len(name) == 5: score += 20
+    elif len(name) <= 7: score += 10
+
+    # TLD
+    tld_scores = {'com': 20, 'ai': 25, 'io': 18, 'co': 12, 'ch': 15}
+    score += tld_scores.get(tld, 0)
+
+    # Market signals
+    bids = auction_data.get('num_bids', 0)
+    if bids >= 20: score += 15
+    elif bids >= 10: score += 10
+    elif bids >= 5: score += 5
+
+    # Penalties
+    if '-' in name: score -= 30
+    if any(c.isdigit() for c in name) and len(name) > 3: score -= 20
+    if len(name) > 15: score -= 25  # documented above, was missing from the code
+
+    return max(0, min(100, score))
```
---
-## 📊 Metrics for success
+## 🗺️ PART 4: Roadmap
-### KPIs for Phase 1
+### ✅ DONE (as of December 11, 2025)
-| Metric | Target (3 months) | Measured via |
-|--------|-------------------|--------------|
-| **Daily Active Users (DAU)** | 500 | PostHog |
-| **Conversion Rate (Free → Trader)** | 5% | Stripe |
-| **Domains in Feed** | 1000+ | DB query |
-| **Avg. Session Duration** | > 3 min | PostHog |
-| **Scrape Success Rate** | > 95% | Logs |
+- [x] Unified Feed API `/auctions/feed`
+- [x] Pounce Score v2.0 with market signals
+- [x] Vanity filter for public users
+- [x] Pounce Direct listing system (DNS verification)
+- [x] Sniper Alerts with keyword matching
+- [x] Web scraping for 5 platforms
+- [x] DropCatch API client (for user integration)
+- [x] Sedo API client (ready for credentials)
-### KPIs for Phase 2
+### 🎯 NEXT STEPS (this week)
-| Metric | Target (6 months) | Measured via |
-|--------|-------------------|--------------|
-| **Pounce Direct Listings** | 100+ | DB query |
-| **First Sale via Pounce** | ✅ | Manual |
-| **GMV (Gross Merchandise Value)** | $50,000 | Tracked |
-| **Repeat Sellers** | 20% | DB query |
+1. **Enter the Sedo API credentials**
+   - Sedo.com → "Mein Sedo" → API access
+   - Partner ID + SignKey in `.env`
+
+2. **Create the first Pounce Direct listings**
+   - Test domains to verify the flow
+   - Puts "unique content" into the feed
+
+3. **Improve scraper stability**
+   - Test the fallback logic
+   - Harden the error handling
+
+### 🔮 PHASE 3 (6-12 months)
+
+1. **Apply for zone file access**
+   - Verisign (.com/.net)
+   - PIR (.org)
+   - Cost: $0-$10,000/year
+
+2. **"Drops Tomorrow" feature**
+   - Zone file diff analysis
+   - Tycoon exclusive ($29/mo)
+
+3. **Pounce Instant Exchange**
+   - Integrated escrow service
+   - 5% fee (vs. 15-20% at competitors)
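The fee advantage in step 3 is simple arithmetic; a sketch (the 5% and 15-20% figures come from the roadmap above, the function name is ours):

```python
def escrow_fee(sale_price: float, rate: float) -> float:
    """Fee a marketplace keeps on a domain sale."""
    return round(sale_price * rate, 2)

# On a $10,000 sale:
pounce = escrow_fee(10_000, 0.05)      # Pounce Instant Exchange: 5%
competitor = escrow_fee(10_000, 0.15)  # lower end of the 15-20% competitor range
savings = competitor - pounce
```

Even at the competitors' lower bound, a seller keeps $1,000 more of a $10,000 sale on Pounce, which is the core of the Phase 3 pitch.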
---
-## 🛠️ Paying down technical debt
+## 🎨 PART 5: UI/UX Design
-### Priority 1: Scraper stability
+### The master table (from pounce_terminal.md)
-```python
-# Problem: the scraper breaks when the HTML changes
+| Column | Content | Visualization |
+|--------|---------|---------------|
+| **Domain** | Domain name | Bold. For "Pounce Direct": 🐾 icon |
+| **Pounce Score** | Quality algorithm | 0-100 (green > 80, yellow 50-80, red < 50) |
+| **Price / Bid** | Price or current bid | `$500` or `$50 (Bid)` |
+| **Status / Time** | Countdown or availability | ⏱️ `4h left` or ⚡ `Instant` |
+| **Source** | Origin | 🟢 GoDaddy, 🐾 Pounce |
+| **Action** | The button | `[Bid →]` or `[Buy Now]` |
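The Pounce Score color bands from the table map to a small lookup; a sketch using the thresholds above (treating the boundary values 50 and 80 as yellow is our assumption, since the table leaves them ambiguous):

```python
def score_color(score: int) -> str:
    """Map a 0-100 Pounce Score to the traffic-light color used in the master table."""
    if score > 80:
        return "green"   # premium
    if score >= 50:
        return "yellow"  # mid-tier
    return "red"         # likely spam
```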
-# Solution: defensive parsing with fallbacks
-def parse_domain_from_row(row) -> Optional[str]:
- """Try multiple selectors to find domain"""
- selectors = [
- 'a.domain-name',
- 'td.domain a',
- 'span[data-domain]',
- 'a[href*="domain"]',
- ]
-
- for selector in selectors:
- elem = row.select_one(selector)
- if elem:
- text = elem.get_text(strip=True)
- if '.' in text and len(text) < 100:
- return text.lower()
-
- return None
+### Filter bar
+
+```
+[Toggle]   Hide Spam (default: ON)
+[Toggle]   Pounce Direct Only
+[Dropdown] TLD: .com, .ai, .io, .ch
+[Dropdown] Price: < $100, < $1k, High Roller
+[Dropdown] Ending: 1h, 4h, 24h, 7d
```
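The filter bar maps almost 1:1 onto the `/auctions/feed` query parameters shown earlier; a sketch of that translation (parameter names follow the feed signature, but the mapping itself, including reading "Hide Spam" as a score floor of 50, is an assumption):

```python
def filter_bar_to_params(state: dict) -> dict:
    """Translate UI filter-bar state into /auctions/feed query params."""
    params = {}
    if state.get("pounce_direct_only"):
        params["source"] = "pounce"
    if state.get("tld"):
        params["tld"] = state["tld"].lstrip(".")  # ".ai" -> "ai"
    if state.get("max_price") is not None:
        params["max_price"] = state["max_price"]
    if state.get("ending_within_hours") is not None:
        params["ending_within"] = state["ending_within_hours"]
    if state.get("hide_spam", True):
        params["min_score"] = 50  # "Hide Spam" as a Pounce Score floor (assumption)
    return params

params = filter_bar_to_params(
    {"pounce_direct_only": True, "tld": ".ai", "ending_within_hours": 4}
)
```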
-### Priority 2: Caching layer
+### Visual hierarchy
-```python
-# Problem: every request hits the database
+```tsx
+// Pounce Direct items are displayed prominently
+// (<ItemCard> is a placeholder component name; the original markup was lost)
+{pounceDirectItems.length > 0 && (
+  <section className="pounce-featured">
+    <h3>Pounce Exclusive: Verified Instant Buy</h3>
+    {pounceDirectItems.map(item => <ItemCard key={item.domain} item={item} />)}
+  </section>
+)}
-# Solution: Redis cache for feed data
-import json
-
-from redis import asyncio as aioredis
-
-redis = aioredis.from_url("redis://localhost:6379")
-
-async def get_market_feed_cached(filters: dict) -> list:
-    cache_key = f"market:feed:{hash(str(filters))}"
-
-    # Try cache first
-    cached = await redis.get(cache_key)
-    if cached:
-        return json.loads(cached)
-
-    # Generate fresh data
-    data = await generate_market_feed(filters)
-
-    # Cache for 5 minutes
-    await redis.setex(cache_key, 300, json.dumps(data))
-
-    return data
-```
-
-### Priority 3: Per-user rate limiting
-
-```python
-# Problem: power users could overload the API
-
-# Solution: tiered rate limits
-RATE_LIMITS = {
-    'scout': '50/hour',
-    'trader': '200/hour',
-    'tycoon': '1000/hour',
-}
+// External auctions below
+// (<ItemCard> is a placeholder component name)
+<section>
+  <h3>Active Auctions</h3>
+  {externalItems.map(item => <ItemCard key={item.domain} item={item} />)}
+</section>
```
---
-## 🎯 Next steps
+## 💰 PART 6: Monetization
-### ✅ DONE (December 11, 2025)
-- [x] Implement Pounce Score v2.0: `_calculate_pounce_score_v2()` in `auctions.py`
-- [x] Deploy the unified `/auctions/feed` API: live and functional
-- [x] Integrate Pounce Direct listings into the feed: combined with external auctions
-- [x] "🐾 Pounce Direct" badge and highlighting: visual hierarchy implemented
-- [x] Filter presets in the frontend: "Pounce Only", "Verified", price filters
-- [x] Zone file access guide: `ZONE_FILE_ACCESS.md` created
+### Tier-based features (from pounce_pricing.md)
-### Next week
-- [ ] Create the first Pounce Direct listings (test data)
-- [ ] Implement scraper fallbacks
-- [ ] Apply for Verisign zone file access
+| Feature | Scout ($0) | Trader ($9) | Tycoon ($29) |
+|---------|------------|-------------|--------------|
+| **Market Feed** | 🌪️ Raw (vanity filter) | ✨ Curated (clean) | ✨ Curated + priority |
+| **Alert Speed** | 🐢 Daily | 🐇 Hourly | ⚡ Real-time (10m) |
+| **Watchlist** | 5 domains | 50 domains | 500 domains |
+| **Sell Domains** | ❌ | ✅ 5 listings | ✅ 50 listings + featured |
+| **Pounce Score** | ❌ Locked | ✅ Basic | ✅ + SEO data |
+| **Drops Tomorrow** | ❌ | ❌ | ✅ Exclusive |
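The tier table could back a simple entitlement lookup; a sketch with the limits taken from the table (the dictionary structure and function name are ours):

```python
TIER_LIMITS = {
    "scout":  {"watchlist": 5,   "listings": 0,  "alert_interval": "daily"},
    "trader": {"watchlist": 50,  "listings": 5,  "alert_interval": "hourly"},
    "tycoon": {"watchlist": 500, "listings": 50, "alert_interval": "10m"},
}

def can_add_to_watchlist(tier: str, current_count: int) -> bool:
    """True while the user is below their tier's watchlist cap."""
    return current_count < TIER_LIMITS[tier]["watchlist"]
```

Keeping the limits in one table like this means the upgrade prompt ("you've hit your Scout limit") and the enforcement logic can never drift apart.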
-### Next month
-- [ ] Opportunity indicators in the UI
-- [ ] Redis caching layer
-- [ ] PIR (.org) zone file access
+### The "conversion trap" (from pounce_features.md)
+
+When a logged-out user clicks "Buy Now" on a Pounce Direct listing:
+
+```
+┌─────────────────────────────────────────────────────────────────┐
+│ 🔒 Secure Transaction                                           │
+├─────────────────────────────────────────────────────────────────┤
+│                                                                 │
+│ You are about to view a verified direct listing.                │
+│                                                                 │
+│ To contact the seller and enjoy buyer protection,               │
+│ please log in.                                                  │
+│                                                                 │
+│ [Login]   [Create Free Scout Account]                           │
+│                                                                 │
+└─────────────────────────────────────────────────────────────────┘
+```
+
+---
+
+## 🔑 PART 7: Critical insights
+
+### API reality vs. expectation
+
+| API | Expectation | Reality |
+|-----|-------------|---------|
+| **DropCatch** | All public auctions | ❌ Only our own bids/backorders |
+| **Sedo** | TBD | ⏳ Credentials still missing |
+
+**Consequence:**
+- Web scraping remains our **primary source** for public data
+- APIs are useful for **user integration** ("connect your DropCatch account")
+- **Zone files** are the long-term path to data sovereignty
+
+### The real USP: Pounce Direct
+
+> *"Domains you can ONLY get on Pounce."*
+
+That is the key. Not aggregation (anyone can do that), but **unique content** through user listings.
+
+**Priority:** activate the first Pounce Direct listings!
+
+---
+
+## 📋 Launch checklist
+
+### Backend
+- [x] Unified Feed API
+- [x] Pounce Score v2.0
+- [x] Vanity filter
+- [x] Scrapers active
+- [ ] Enter the Sedo API credentials
+- [ ] Tune the scheduler interval
+
+### Frontend
+- [x] Terminal Market Page
+- [x] Public Auctions Page
+- [x] Pounce Direct highlighting
+- [x] Filters (source, TLD, price)
+- [ ] "Hot Right Now" section
+- [ ] Better empty states
+
+### Content
+- [ ] Create the first 5 test listings
+- [ ] Test DNS verification
+- [ ] Validate the listing-to-feed flow
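The DNS verification item above boils down to finding an expected token among a domain's TXT records; the pure matching step can be sketched like this (the `pounce-verify=` record prefix is an assumption, and in production the records would come from a resolver lookup rather than a hard-coded list):

```python
def is_listing_verified(txt_records: list[str], expected_token: str) -> bool:
    """Check whether any TXT record carries the expected verification token."""
    prefix = "pounce-verify="
    return any(
        r.startswith(prefix) and r[len(prefix):] == expected_token
        for r in txt_records
    )

# Hypothetical records as a resolver might return them
records = ["v=spf1 include:example.net ~all", "pounce-verify=abc123"]
```

Separating the matching logic from the network lookup keeps the verification flow testable without touching DNS.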
---
## 📝 Conclusion
-The Market Page is the heart of Pounce. With these changes it becomes:
+The Market Page is **functional**, but the real USP (Pounce Direct) has not been activated yet.
-1. **More reliable** (scraper fallbacks, caching)
-2. **More valuable** (Pounce Direct = unique content)
-3. **Stickier** (better UX, personalized filters)
-4. **More scalable** (unicorn-ready architecture)
-
-The road to unicorn status runs through **data sovereignty** and **unique content**.
-Pounce Direct is the first step.
-
----
-
-# 🔧 PART 3: ACTION PLAN: what exactly do we do?
-
-## Phase A: Cleanup (today)
-
-### 1. Delete empty folders
-
-```bash
-# These folders are empty, legacy from the old /command routing
-rm -rf frontend/src/app/dashboard/
-rm -rf frontend/src/app/portfolio/
-rm -rf frontend/src/app/settings/
-rm -rf frontend/src/app/watchlist/
-rm -rf frontend/src/app/careers/
-```
-
-### 2. Review redundant pages
-
-| Page | Decision |
-|------|----------|
-| `/market/page.tsx` | ❌ Remove: redirect to `/auctions` |
-| `/intelligence/page.tsx` | ⚠️ Review: redirect to `/tld-pricing` |
-
----
-
-## Phase B: Pounce Direct integration (this week)
-
-### 1. Backend: unified market feed API
-
-**File:** `backend/app/api/auctions.py`
-
-Add a new endpoint:
-
-```python
-@router.get("/feed")
-async def get_unified_market_feed(
-    source: str = Query("all", enum=["all", "pounce", "external"]),
-    db: AsyncSession = Depends(get_db),  # the body below queries `db`; dependency name assumed
-    # ... filters
-):
- """
- Unified feed combining:
- - Pounce Direct (user listings)
- - External auctions (scraped)
- """
- items = []
-
- # 1. Pounce Direct Listings
- if source in ["all", "pounce"]:
- listings = await db.execute(
- select(DomainListing)
- .where(DomainListing.status == "active")
- )
- for listing in listings.scalars():
- items.append({
- "type": "pounce_direct",
- "domain": listing.domain,
- "price": listing.asking_price,
- "source": "Pounce",
- "status": "instant",
- "verified": listing.is_verified,
- "url": f"/buy/{listing.slug}",
- })
-
- # 2. External Auctions
- if source in ["all", "external"]:
- auctions = await db.execute(
- select(DomainAuction)
- .where(DomainAuction.is_active == True)
- )
- for auction in auctions.scalars():
- items.append({
- "type": "auction",
- "domain": auction.domain,
- "price": auction.current_bid,
- "source": auction.platform,
- "status": "auction",
- "time_left": _format_time_remaining(auction.end_time),
- "url": auction.affiliate_url,
- })
-
- return {"items": items, "total": len(items)}
-```
-
-### 2. Frontend: extend the API client
-
-**Datei:** `frontend/src/lib/api.ts`
-
-```typescript
-async getMarketFeed(
- source: 'all' | 'pounce' | 'external' = 'all',
- filters?: {
- keyword?: string
- tld?: string
- minPrice?: number
- maxPrice?: number
- }
-) {
- const params = new URLSearchParams({ source })
- if (filters?.keyword) params.append('keyword', filters.keyword)
- if (filters?.tld) params.append('tld', filters.tld)
- if (filters?.minPrice) params.append('min_price', filters.minPrice.toString())
- if (filters?.maxPrice) params.append('max_price', filters.maxPrice.toString())
-
- return this.request<{
- items: MarketItem[]
- total: number
- }>(`/auctions/feed?${params.toString()}`)
-}
-```
-
-### 3. Frontend: update the Market Page
-
-**File:** `frontend/src/app/terminal/market/page.tsx`
-
-Changes:
-1. Call `api.getMarketFeed()` instead of `api.getAuctions()`
-2. Visually highlight Pounce Direct items
-3. Enable the "Pounce Exclusive" filter
-
----
-
-## Phase C: Public page alignment (next week)
-
-### 1. `/auctions/page.tsx`: highlight Pounce Direct
-
-```tsx
-// Group items
-const pounceItems = items.filter(i => i.type === 'pounce_direct')
-const externalItems = items.filter(i => i.type === 'auction')
-
-// Markup below is reconstructed; <ItemCard> is a placeholder component name
-return (
-  <>
-    {/* Featured: Pounce Direct */}
-    {pounceItems.length > 0 && (
-      <section className="pounce-featured">
-        <h3>Pounce Exclusive: Verified Instant Buy</h3>
-        {pounceItems.map(item => (
-          <ItemCard key={item.domain} item={item} featured />
-        ))}
-      </section>
-    )}
-
-    {/* Standard: external */}
-    <section>
-      {externalItems.map(item => (
-        <ItemCard key={item.domain} item={item} />
-      ))}
-    </section>
-  </>
-)
-```
-
-### 2. Consolidation
-
-| Action | Details |
-|--------|---------|
-| Remove `/market/page.tsx` | Redirect to `/auctions` |
-| Rename `/auctions/page.tsx` | Shown as "Market" in the navigation |
-
----
-
-## Phase D: Score & scraper improvements (weeks 2-3)
-
-### 1. Pounce Score v2.0
-
-**File:** `backend/app/services/valuation.py`
-
-Extend with:
-- Bid activity score
-- Time pressure score
-- Value ratio score
-- Platform trust score
-
-### 2. Scraper Fallbacks
-
-**Datei:** `backend/app/services/auction_scraper.py`
-
-```python
-async def scrape_with_fallback(self, platform: str, db: AsyncSession):
- methods = [
- (f'_scrape_{platform.lower()}_api', 'API'),
- (f'_scrape_{platform.lower()}_rss', 'RSS'),
- (f'_scrape_{platform.lower()}_html', 'HTML'),
- ]
-
- for method_name, method_type in methods:
- method = getattr(self, method_name, None)
- if not method:
- continue
-
- try:
- result = await method(db)
- if result['found'] > 0:
- return result
- except Exception as e:
- logger.warning(f"{platform} {method_type} failed: {e}")
-
- return {'found': 0, 'error': 'All methods failed'}
-```
-
----
-
-## Checklist for the clean start
-
-### Backend:
-- [ ] Create the unified feed endpoint `/auctions/feed`
-- [ ] Integrate Pounce Score v2.0 into `valuation.py`
-- [ ] Add the scraper fallback logic
-
-### Frontend:
-- [ ] Delete the empty folders
-- [ ] Implement `api.getMarketFeed()`
-- [ ] Market Page: Pounce Direct integration
-- [ ] Auctions Page: visual hierarchy
-- [ ] Turn `/market/page.tsx` into a redirect
-
-### Testing:
-- [ ] Create a listing: does it appear in the market feed?
-- [ ] DNS verification: does it work?
-- [ ] External auctions: do they load?
-- [ ] "Pounce Only" filter: does it show only listings?
-
----
-
-## Visualization: data flow
-
-```
-┌─────────────────────────────────────────────────────────────┐
-│                        MARKET FEED                          │
-├─────────────────────────────────────────────────────────────┤
-│                                                             │
-│  ┌─────────────┐   ┌─────────────┐   ┌─────────────┐        │
-│  │  LISTINGS   │   │  AUCTIONS   │   │  SCHEDULER  │        │
-│  │  (Pounce)   │   │ (External)  │   │  (Scrape)   │        │
-│  └──────┬──────┘   └──────┬──────┘   └──────┬──────┘        │
-│         │                 │                 │               │
-│         └─────────────────┼─────────────────┘               │
-│                           ▼                                 │
-│                ┌────────────────────┐                       │
-│                │   /auctions/feed   │                       │
-│                │   (Unified API)    │                       │
-│                └─────────┬──────────┘                       │
-│          ┌───────────────┼───────────────┐                  │
-│          ▼               ▼               ▼                  │
-│   ┌────────────┐  ┌────────────┐  ┌────────────┐            │
-│   │  TERMINAL  │  │   PUBLIC   │  │   ADMIN    │            │
-│   │  /market   │  │  /auctions │  │   /admin   │            │
-│   └────────────┘  └────────────┘  └────────────┘            │
-│                                                             │
-└─────────────────────────────────────────────────────────────┘
-```
-
----
-
-# 🚀 PART 4: ROADMAP TO UNICORN
-
-## The 4 phases (from pounce_strategy.md)
-
-```
-┌─────────────────────────────────────────────────────────────┐
-│                  POUNCE UNICORN ROADMAP                     │
-├─────────────────────────────────────────────────────────────┤
-│                                                             │
-│ PHASE 1: INTELLIGENCE (0-18 months)                         │
-│ ─────────────────────────────────────────────────────────── │
-│ Goal: 10,000 users, $1M ARR, data sovereignty               │
-│                                                             │
-│ ✅ Pounce Terminal (dashboard)                              │
-│ ✅ TLD pricing (market barometer)                           │
-│ ✅ Auction aggregator (scraping)                            │
-│ ✅ Watchlist/monitoring                                     │
-│ ⏳ Pounce Direct (marketplace)                              │
-│ 🔜 Zone file analysis                                       │
-│                                                             │
-│ Status: WE ARE HERE ──────────────────────────────────────  │
-│                                                             │
-├─────────────────────────────────────────────────────────────┤
-│                                                             │
-│ PHASE 2: LIQUIDITY (18-36 months)                           │
-│ ─────────────────────────────────────────────────────────── │
-│ Goal: own the transaction flow, $10M ARR                    │
-│                                                             │
-│ 🔮 Pounce Instant Exchange (integrated escrow)              │
-│ 🔮 "Buy Now" buttons in the dashboard                       │
-│ 🔮 5% transaction fee (vs. 15-20% at competitors)           │
-│                                                             │
-├─────────────────────────────────────────────────────────────┤
-│                                                             │
-│ PHASE 3: FINANCIALIZATION (3-5 years)                       │
-│ ─────────────────────────────────────────────────────────── │
-│ Goal: domains as an asset class, $50-100M ARR               │
-│                                                             │
-│ 🔮 Fractional ownership (shares in premium domains)         │
-│ 🔮 Domain-backed lending (credit against a domain)          │
-│ 🔮 → we become a FINTECH                                    │
-│                                                             │
-├─────────────────────────────────────────────────────────────┤
-│                                                             │
-│ PHASE 4: EMPIRE (5+ years)                                  │
-│ ─────────────────────────────────────────────────────────── │
-│ Goal: $1B valuation, "too big to fail"                      │
-│                                                             │
-│ 🔮 Pounce Enterprise Sentinel (B2B brand protection)        │
-│ 🔮 Fortune 500 customers (Apple, Tesla, etc.)               │
-│ 🔮 AI-powered phishing takedowns                            │
-│                                                             │
-└─────────────────────────────────────────────────────────────┘
-```
-
-## What WE do NOW (perfect Phase 1)
-
-### Priority 1: Perfect Pounce Direct
-- [x] Listing system built
-- [x] DNS verification works
-- [ ] **Show it in the market feed**: NEXT STEP
-- [ ] Visual hierarchy (🐾 Pounce vs. 🟢 External)
-
-### Priority 2: Improve data quality
-- [x] Scraping is running
-- [ ] Fallback logic for the scrapers
-- [ ] Pounce Score v2.0
-
-### Priority 3: Prepare for zone files
-- [ ] Apply for Verisign zone file access
-- [ ] Develop the algorithm (can be tested locally)
-- [ ] Plan the server infrastructure
-
----
-
-## Summary: the road to unicorn status
-
-```
-     TODAY               6 MONTHS            18+ MONTHS
-       │                    │                    │
-       ▼                    ▼                    ▼
- ┌─────────────┐      ┌─────────────┐     ┌─────────────┐
- │  SCRAPING   │ ───▶ │ ZONE FILES  │ ──▶ │   FINTECH   │
- │  + POUNCE   │      │  ANALYSIS   │     │  EXCHANGE   │
- │   DIRECT    │      │             │     │             │
- └─────────────┘      └─────────────┘     └─────────────┘
-       │                    │                    │
- "Content filler"     "Data monopoly"      "Asset class"
- Site feels alive     Exclusive intel      Domains = stocks
-```
-
----
-
-## 🐾 The mantra
-
-> **"Don't guess. Know."**
->
-> Phase 1: Intelligence
->
-> **"Don't just buy. Invest."**
->
-> Phase 3: Asset class
-
-The road to unicorn status runs through **data sovereignty** and **unique content**.
-
-1. **Today:** Pounce Direct (user listings) = unique content
-2. **Tomorrow:** Zone files = exclusive intelligence
-3. **The day after:** Fintech = billion-dollar valuation
-
----
-
-**Ready to start?** 🚀
-
-Tell me where to begin:
-1. **Cleanup**: delete the empty folders
-2. **Backend**: create the unified feed API
-3. **Frontend**: Market Page with Pounce Direct
+**The order:**
+1. ✅ Aggregation works (scraping)
+2. ⏳ Activate Pounce Direct (user listings)
+3. 🔮 Zone files for data sovereignty (Phase 3)
+> *"The road to unicorn status does not run through better scraping, but through unique content."*
+>
+> – pounce_strategy.md
diff --git a/backend/app/api/admin.py b/backend/app/api/admin.py
index 1cb1c93..9f408d4 100644
--- a/backend/app/api/admin.py
+++ b/backend/app/api/admin.py
@@ -981,3 +981,126 @@ async def get_activity_log(
],
"total": total,
}
+
+
+# ============== API Connection Tests ==============
+
+@router.get("/test-apis")
+async def test_external_apis(
+ admin: User = Depends(require_admin),
+):
+ """
+ Test connections to all external APIs.
+
+ Returns status of:
+ - DropCatch API
+ - Sedo API
+ - Moz API (if configured)
+ """
+ from app.services.dropcatch_api import dropcatch_client
+ from app.services.sedo_api import sedo_client
+
+ results = {
+ "tested_at": datetime.utcnow().isoformat(),
+ "apis": {}
+ }
+
+ # Test DropCatch API
+ try:
+ dropcatch_result = await dropcatch_client.test_connection()
+ results["apis"]["dropcatch"] = dropcatch_result
+ except Exception as e:
+ results["apis"]["dropcatch"] = {
+ "success": False,
+ "error": str(e),
+ "configured": dropcatch_client.is_configured
+ }
+
+ # Test Sedo API
+ try:
+ sedo_result = await sedo_client.test_connection()
+ results["apis"]["sedo"] = sedo_result
+ except Exception as e:
+ results["apis"]["sedo"] = {
+ "success": False,
+ "error": str(e),
+ "configured": sedo_client.is_configured
+ }
+
+ # Summary
+ results["summary"] = {
+ "total": len(results["apis"]),
+ "configured": sum(1 for api in results["apis"].values() if api.get("configured")),
+ "connected": sum(1 for api in results["apis"].values() if api.get("success")),
+ }
+
+ return results
+
+
+@router.post("/trigger-scrape")
+async def trigger_auction_scrape(
+ background_tasks: BackgroundTasks,
+ db: Database,
+ admin: User = Depends(require_admin),
+):
+ """
+ Manually trigger auction scraping from all sources.
+
+ This will:
+ 1. Try Tier 1 APIs (DropCatch, Sedo) first
+ 2. Fall back to web scraping for others
+ """
+ from app.services.auction_scraper import AuctionScraperService
+
+ scraper = AuctionScraperService()
+
+ # Run scraping in background
+ async def run_scrape():
+ async with db.begin():
+ return await scraper.scrape_all_platforms(db)
+
+ background_tasks.add_task(run_scrape)
+
+ return {
+ "message": "Auction scraping started in background",
+ "note": "Check /admin/scrape-status for results"
+ }
+
+
+@router.get("/scrape-status")
+async def get_scrape_status(
+ db: Database,
+ admin: User = Depends(require_admin),
+ limit: int = 10,
+):
+ """Get recent scrape logs."""
+ from app.models.auction import AuctionScrapeLog
+
+ query = (
+ select(AuctionScrapeLog)
+ .order_by(desc(AuctionScrapeLog.started_at))
+ .limit(limit)
+ )
+
+ try:
+ result = await db.execute(query)
+ logs = result.scalars().all()
+ except Exception:
+ return {"logs": [], "error": "Table not found"}
+
+ return {
+ "logs": [
+ {
+ "id": log.id,
+ "platform": log.platform,
+ "status": log.status,
+ "auctions_found": log.auctions_found,
+ "auctions_new": log.auctions_new,
+ "auctions_updated": log.auctions_updated,
+ "error_message": log.error_message,
+ "started_at": log.started_at.isoformat() if log.started_at else None,
+ "completed_at": log.completed_at.isoformat() if log.completed_at else None,
+ }
+ for log in logs
+ ]
+ }
diff --git a/backend/app/config.py b/backend/app/config.py
index df49aad..89b67d2 100644
--- a/backend/app/config.py
+++ b/backend/app/config.py
@@ -33,6 +33,27 @@ class Settings(BaseSettings):
check_minute: int = 0
scheduler_check_interval_hours: int = 24
+ # =================================
+ # External API Credentials
+ # =================================
+
+ # DropCatch API (Official Partner API)
+ # Docs: https://www.dropcatch.com/hiw/dropcatch-api
+ dropcatch_client_id: str = ""
+ dropcatch_client_secret: str = ""
+ dropcatch_api_base: str = "https://api.dropcatch.com"
+
+ # Sedo API (Partner API - XML-RPC)
+ # Docs: https://api.sedo.com/apidocs/v1/
+# Find your credentials: Sedo.com → "Mein Sedo" → API access
+ sedo_partner_id: str = ""
+ sedo_sign_key: str = ""
+ sedo_api_base: str = "https://api.sedo.com/api/v1/"
+
+ # Moz API (SEO Data)
+ moz_access_id: str = ""
+ moz_secret_key: str = ""
+
class Config:
env_file = ".env"
env_file_encoding = "utf-8"
diff --git a/backend/app/services/auction_scraper.py b/backend/app/services/auction_scraper.py
index bbaffcb..8a8b6bc 100644
--- a/backend/app/services/auction_scraper.py
+++ b/backend/app/services/auction_scraper.py
@@ -1,15 +1,18 @@
"""
Domain Auction Scraper Service
-Scrapes real auction data from various platforms WITHOUT using their APIs.
-Uses web scraping to get publicly available auction information.
+Data Acquisition Strategy (from MARKET_CONCEPT.md):
-Supported Platforms:
+TIER 1: OFFICIAL APIs (Most Reliable)
+- DropCatch API (Official Partner) → WE HAVE THIS!
+
+TIER 2: WEB SCRAPING (Fallback)
- ExpiredDomains.net (aggregator for deleted domains)
- GoDaddy Auctions (public listings via RSS/public pages)
- Sedo (public marketplace)
- NameJet (public auctions)
-- DropCatch (public auctions)
+
+The scraper tries Tier 1 first, then falls back to Tier 2 if needed.
IMPORTANT:
- Respects robots.txt
@@ -31,6 +34,8 @@ from sqlalchemy import select, and_, delete
from sqlalchemy.ext.asyncio import AsyncSession
from app.models.auction import DomainAuction, AuctionScrapeLog
+from app.services.dropcatch_api import dropcatch_client
+from app.services.sedo_api import sedo_client
logger = logging.getLogger(__name__)
@@ -102,15 +107,41 @@ class AuctionScraperService:
"errors": [],
}
- # Scrape each platform
+        # ───────────────────────────────────────────────────────────────
+        # TIER 1: Official APIs (best data quality)
+        # ───────────────────────────────────────────────────────────────
+ tier1_apis = [
+ ("DropCatch", self._fetch_dropcatch_api), # We have API access!
+ ("Sedo", self._fetch_sedo_api), # We have API access!
+ ]
+
+ for platform_name, api_func in tier1_apis:
+ try:
+ api_result = await api_func(db)
+ if api_result.get("found", 0) > 0:
+ results["platforms"][platform_name] = api_result
+ results["total_found"] += api_result.get("found", 0)
+ results["total_new"] += api_result.get("new", 0)
+ results["total_updated"] += api_result.get("updated", 0)
+                    logger.info(f"✅ {platform_name} API: {api_result['found']} auctions")
+            except Exception as e:
+                logger.warning(f"⚠️ {platform_name} API failed, will try scraping: {e}")
+
+        # ───────────────────────────────────────────────────────────────
+        # TIER 2: Web scraping (fallback for platforms without API access)
+        # ───────────────────────────────────────────────────────────────
scrapers = [
("ExpiredDomains", self._scrape_expireddomains),
("GoDaddy", self._scrape_godaddy_public),
- ("Sedo", self._scrape_sedo_public),
("NameJet", self._scrape_namejet_public),
- ("DropCatch", self._scrape_dropcatch_public),
]
+ # Add fallbacks only if APIs failed
+ if "DropCatch" not in results["platforms"]:
+ scrapers.append(("DropCatch", self._scrape_dropcatch_public))
+ if "Sedo" not in results["platforms"]:
+ scrapers.append(("Sedo", self._scrape_sedo_public))
+
for platform_name, scraper_func in scrapers:
try:
platform_result = await scraper_func(db)
@@ -561,13 +592,206 @@ class AuctionScraperService:
return result
- async def _scrape_dropcatch_public(self, db: AsyncSession) -> Dict[str, Any]:
+ async def _fetch_dropcatch_api(self, db: AsyncSession) -> Dict[str, Any]:
"""
- Scrape DropCatch public auction listings.
- DropCatch shows pending delete auctions publicly.
+        🐾 TIER 1: Fetch DropCatch auctions via the OFFICIAL API
+
+ This is our preferred method - faster, more reliable, more data.
+ Uses the official DropCatch Partner API.
"""
platform = "DropCatch"
- result = {"found": 0, "new": 0, "updated": 0}
+ result = {"found": 0, "new": 0, "updated": 0, "source": "api"}
+
+ if not dropcatch_client.is_configured:
+ logger.info("DropCatch API not configured, skipping")
+ return result
+
+ log = AuctionScrapeLog(platform=platform)
+ db.add(log)
+ await db.commit()
+
+ try:
+ # Fetch auctions from official API
+ api_result = await dropcatch_client.search_auctions(page_size=100)
+
+ auctions = api_result.get("auctions") or api_result.get("items") or []
+ result["found"] = len(auctions)
+
+ for dc_auction in auctions:
+ try:
+ # Transform to our format
+ auction_data = dropcatch_client.transform_to_pounce_format(dc_auction)
+
+ if not auction_data["domain"]:
+ continue
+
+ # Check if exists
+ existing = await db.execute(
+ select(DomainAuction).where(
+ and_(
+ DomainAuction.domain == auction_data["domain"],
+ DomainAuction.platform == platform
+ )
+ )
+ )
+ existing_auction = existing.scalar_one_or_none()
+
+ if existing_auction:
+ # Update existing
+ existing_auction.current_bid = auction_data["current_bid"]
+ existing_auction.num_bids = auction_data["num_bids"]
+ existing_auction.end_time = auction_data["end_time"]
+ existing_auction.is_active = True
+ existing_auction.updated_at = datetime.utcnow()
+ result["updated"] += 1
+ else:
+ # Create new
+ new_auction = DomainAuction(
+ domain=auction_data["domain"],
+ tld=auction_data["tld"],
+ platform=platform,
+ current_bid=auction_data["current_bid"],
+ currency=auction_data["currency"],
+ num_bids=auction_data["num_bids"],
+ end_time=auction_data["end_time"],
+ auction_url=auction_data["auction_url"],
+ age_years=auction_data.get("age_years"),
+ buy_now_price=auction_data.get("buy_now_price"),
+ reserve_met=auction_data.get("reserve_met"),
+ traffic=auction_data.get("traffic"),
+ is_active=True,
+ )
+ db.add(new_auction)
+ result["new"] += 1
+
+ except Exception as e:
+ logger.warning(f"Error processing DropCatch auction: {e}")
+ continue
+
+ await db.commit()
+
+ log.status = "success"
+ log.auctions_found = result["found"]
+ log.auctions_new = result["new"]
+ log.auctions_updated = result["updated"]
+ log.completed_at = datetime.utcnow()
+ await db.commit()
+
+ logger.info(f"DropCatch API: Found {result['found']}, New {result['new']}, Updated {result['updated']}")
+ return result
+
+ except Exception as e:
+ logger.error(f"DropCatch API error: {e}")
+ log.status = "failed"
+ log.error_message = str(e)[:500]
+ log.completed_at = datetime.utcnow()
+ await db.commit()
+ return result
+
+ async def _fetch_sedo_api(self, db: AsyncSession) -> Dict[str, Any]:
+ """
+ TIER 1: Fetch Sedo auctions via the official API.
+
+ This is the preferred method for Sedo data.
+ Uses the official Sedo Partner API.
+ """
+ platform = "Sedo"
+ result = {"found": 0, "new": 0, "updated": 0, "source": "api"}
+
+ if not sedo_client.is_configured:
+ logger.info("Sedo API not configured, skipping")
+ return result
+
+ log = AuctionScrapeLog(platform=platform)
+ db.add(log)
+ await db.commit()
+
+ try:
+ # Fetch auctions from official API
+ api_result = await sedo_client.search_auctions(page_size=100)
+
+ # Sedo response structure may vary
+ listings = api_result.get("domains") or api_result.get("items") or api_result.get("result") or []
+ if isinstance(listings, dict):
+ listings = list(listings.values()) if listings else []
+
+ result["found"] = len(listings)
+
+ for sedo_listing in listings:
+ try:
+ # Transform to our format
+ auction_data = sedo_client.transform_to_pounce_format(sedo_listing)
+
+ if not auction_data["domain"]:
+ continue
+
+ # Check if exists
+ existing = await db.execute(
+ select(DomainAuction).where(
+ and_(
+ DomainAuction.domain == auction_data["domain"],
+ DomainAuction.platform == platform
+ )
+ )
+ )
+ existing_auction = existing.scalar_one_or_none()
+
+ if existing_auction:
+ # Update existing
+ existing_auction.current_bid = auction_data["current_bid"]
+ existing_auction.num_bids = auction_data["num_bids"]
+ existing_auction.end_time = auction_data["end_time"]
+ existing_auction.is_active = True
+ existing_auction.updated_at = datetime.utcnow()
+ result["updated"] += 1
+ else:
+ # Create new
+ new_auction = DomainAuction(
+ domain=auction_data["domain"],
+ tld=auction_data["tld"],
+ platform=platform,
+ current_bid=auction_data["current_bid"],
+ currency=auction_data["currency"],
+ num_bids=auction_data["num_bids"],
+ end_time=auction_data["end_time"],
+ auction_url=auction_data["auction_url"],
+ buy_now_price=auction_data.get("buy_now_price"),
+ is_active=True,
+ )
+ db.add(new_auction)
+ result["new"] += 1
+
+ except Exception as e:
+ logger.warning(f"Error processing Sedo listing: {e}")
+ continue
+
+ await db.commit()
+
+ log.status = "success"
+ log.auctions_found = result["found"]
+ log.auctions_new = result["new"]
+ log.auctions_updated = result["updated"]
+ log.completed_at = datetime.utcnow()
+ await db.commit()
+
+ logger.info(f"Sedo API: Found {result['found']}, New {result['new']}, Updated {result['updated']}")
+ return result
+
+ except Exception as e:
+ logger.error(f"Sedo API error: {e}")
+ log.status = "failed"
+ log.error_message = str(e)[:500]
+ log.completed_at = datetime.utcnow()
+ await db.commit()
+ return result
+
+ async def _scrape_dropcatch_public(self, db: AsyncSession) -> Dict[str, Any]:
+ """
+ TIER 2 FALLBACK: Scrape DropCatch public auction listings.
+ Only used if the API is not configured or fails.
+ """
+ platform = "DropCatch"
+ result = {"found": 0, "new": 0, "updated": 0, "source": "scrape"}
log = AuctionScrapeLog(platform=platform)
db.add(log)
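
The tiered ingest above follows one pattern: try each official API, record the platforms that returned data, and queue a scraper only for platforms the API tier did not cover. A simplified, synchronous sketch of that orchestration (fetcher signatures here are illustrative; the real service methods are async and take a SQLAlchemy session):

```python
def run_ingest(tier1_fetchers, tier2_scrapers):
    """Tier 1 (official APIs) first; Tier 2 (scrapers) only as fallback."""
    results = {"platforms": {}, "total_found": 0}

    for name, fetch in tier1_fetchers:
        try:
            r = fetch()
            if r.get("found", 0) > 0:
                results["platforms"][name] = r
                results["total_found"] += r["found"]
        except Exception:
            # An API failure is non-fatal: the platform stays
            # eligible for the scraper tier below.
            pass

    for name, scrape in tier2_scrapers:
        if name in results["platforms"]:
            continue  # API already delivered data for this platform
        r = scrape()
        results["platforms"][name] = r
        results["total_found"] += r.get("found", 0)

    return results
```

Note that a platform whose API returned zero auctions also falls through to the scraper tier, matching the `found > 0` check in the service above.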
diff --git a/backend/app/services/dropcatch_api.py b/backend/app/services/dropcatch_api.py
new file mode 100644
index 0000000..85125e2
--- /dev/null
+++ b/backend/app/services/dropcatch_api.py
@@ -0,0 +1,334 @@
+"""
+DropCatch Official API Client
+
+This service provides access to DropCatch's official API for:
+- Searching domain auctions
+- Getting auction details
+- Backorder management
+
+API Documentation: https://www.dropcatch.com/hiw/dropcatch-api
+Interactive Docs: https://api.dropcatch.com/swagger
+
+SECURITY:
+- Credentials are loaded from environment variables
+- NEVER hardcode credentials in this file
+
+Usage:
+ from app.services.dropcatch_api import dropcatch_client
+
+ # Get active auctions
+ auctions = await dropcatch_client.search_auctions(keyword="tech")
+"""
+import logging
+from datetime import datetime, timedelta
+from typing import Optional, List, Dict, Any
+import httpx
+
+from app.config import get_settings
+
+logger = logging.getLogger(__name__)
+
+
+class DropCatchAPIClient:
+ """
+ Official DropCatch API Client.
+
+ This uses the V2 API endpoints (V1 is deprecated).
+ Authentication is via OAuth2 client credentials.
+ """
+
+ def __init__(self):
+ self.settings = get_settings()
+ self.base_url = self.settings.dropcatch_api_base or "https://api.dropcatch.com"
+ self.client_id = self.settings.dropcatch_client_id
+ self.client_secret = self.settings.dropcatch_client_secret
+
+ # Token cache
+ self._access_token: Optional[str] = None
+ self._token_expires_at: Optional[datetime] = None
+
+ # HTTP client
+ self._client: Optional[httpx.AsyncClient] = None
+
+ @property
+ def is_configured(self) -> bool:
+ """Check if API credentials are configured."""
+ return bool(self.client_id and self.client_secret)
+
+ async def _get_client(self) -> httpx.AsyncClient:
+ """Get or create HTTP client."""
+ if self._client is None or self._client.is_closed:
+ self._client = httpx.AsyncClient(
+ timeout=30.0,
+ headers={
+ "Content-Type": "application/json",
+ "User-Agent": "Pounce/1.0 (Domain Intelligence Platform)"
+ }
+ )
+ return self._client
+
+ async def close(self):
+ """Close the HTTP client."""
+ if self._client and not self._client.is_closed:
+ await self._client.aclose()
+ self._client = None
+
+ async def _authenticate(self) -> str:
+ """
+ Authenticate with DropCatch API and get access token.
+
+ POST https://api.dropcatch.com/authorize
+ Body: { "clientId": "...", "clientSecret": "..." }
+
+ Returns: Access token string
+ """
+ if not self.is_configured:
+ raise ValueError("DropCatch API credentials not configured")
+
+ # Check if we have a valid cached token
+ if self._access_token and self._token_expires_at:
+ if datetime.utcnow() < self._token_expires_at - timedelta(minutes=5):
+ return self._access_token
+
+ client = await self._get_client()
+
+ try:
+ response = await client.post(
+ f"{self.base_url}/authorize",
+ json={
+ "clientId": self.client_id,
+ "clientSecret": self.client_secret
+ }
+ )
+
+ if response.status_code != 200:
+ logger.error(f"DropCatch auth failed: {response.status_code} - {response.text}")
+ raise Exception(f"Authentication failed: {response.status_code}")
+
+ data = response.json()
+
+ # Extract token - the response format may vary
+ # Common formats: { "token": "...", "expiresIn": 3600 }
+ # or: { "accessToken": "...", "expiresIn": 3600 }
+ self._access_token = data.get("token") or data.get("accessToken") or data.get("access_token")
+ if not self._access_token:
+     raise Exception(f"Authentication response contained no token (keys: {list(data.keys())})")
+
+ # Calculate expiry (default 1 hour if not specified)
+ expires_in = data.get("expiresIn") or data.get("expires_in") or 3600
+ self._token_expires_at = datetime.utcnow() + timedelta(seconds=expires_in)
+
+ logger.info("DropCatch API: Successfully authenticated")
+ return self._access_token
+
+ except httpx.HTTPError as e:
+ logger.error(f"DropCatch auth HTTP error: {e}")
+ raise
+
+ async def _request(
+ self,
+ method: str,
+ endpoint: str,
+ params: Optional[Dict] = None,
+ json_data: Optional[Dict] = None
+ ) -> Dict[str, Any]:
+ """Make an authenticated API request."""
+ token = await self._authenticate()
+ client = await self._get_client()
+
+ headers = {
+ "Authorization": f"Bearer {token}"
+ }
+
+ url = f"{self.base_url}{endpoint}"
+
+ try:
+ response = await client.request(
+ method=method,
+ url=url,
+ params=params,
+ json=json_data,
+ headers=headers
+ )
+
+ if response.status_code == 401:
+ # Token expired, re-authenticate
+ self._access_token = None
+ token = await self._authenticate()
+ headers["Authorization"] = f"Bearer {token}"
+ response = await client.request(
+ method=method,
+ url=url,
+ params=params,
+ json=json_data,
+ headers=headers
+ )
+
+ response.raise_for_status()
+ return response.json()
+
+ except httpx.HTTPError as e:
+ logger.error(f"DropCatch API request failed: {e}")
+ raise
+
+ # =========================================================================
+ # AUCTION ENDPOINTS (V2)
+ # =========================================================================
+
+ async def search_auctions(
+ self,
+ keyword: Optional[str] = None,
+ tld: Optional[str] = None,
+ min_price: Optional[float] = None,
+ max_price: Optional[float] = None,
+ ending_within_hours: Optional[int] = None,
+ page_size: int = 100,
+ page_token: Optional[str] = None,
+ ) -> Dict[str, Any]:
+ """
+ Search for domain auctions.
+
+ Endpoint: GET /v2/auctions (or similar - check interactive docs)
+
+ Returns:
+ {
+ "auctions": [...],
+ "cursor": {
+ "next": "...",
+ "previous": "..."
+ }
+ }
+ """
+ params = {
+ "pageSize": page_size,
+ }
+
+ if keyword:
+ params["searchTerm"] = keyword
+ if tld:
+ params["tld"] = tld.lstrip(".")
+ if min_price is not None:
+ params["minPrice"] = min_price
+ if max_price is not None:
+ params["maxPrice"] = max_price
+ if ending_within_hours:
+ params["endingWithinHours"] = ending_within_hours
+ if page_token:
+ params["pageToken"] = page_token
+
+ return await self._request("GET", "/v2/auctions", params=params)
+
+ async def get_auction(self, auction_id: int) -> Dict[str, Any]:
+ """Get details for a specific auction."""
+ return await self._request("GET", f"/v2/auctions/{auction_id}")
+
+ async def get_ending_soon(
+ self,
+ hours: int = 24,
+ page_size: int = 50
+ ) -> Dict[str, Any]:
+ """Get auctions ending soon."""
+ return await self.search_auctions(
+ ending_within_hours=hours,
+ page_size=page_size
+ )
+
+ async def get_hot_auctions(self, page_size: int = 50) -> Dict[str, Any]:
+ """
+ Get hot/popular auctions (high bid activity).
+ Note: The actual endpoint may vary - check interactive docs.
+ """
+ # This might be a different endpoint or sort parameter
+ params = {
+ "pageSize": page_size,
+ "sortBy": "bidCount", # or "popularity" - check docs
+ "sortOrder": "desc"
+ }
+ return await self._request("GET", "/v2/auctions", params=params)
+
+ # =========================================================================
+ # BACKORDER ENDPOINTS (V2)
+ # =========================================================================
+
+ async def search_backorders(
+ self,
+ keyword: Optional[str] = None,
+ page_size: int = 100,
+ page_token: Optional[str] = None,
+ ) -> Dict[str, Any]:
+ """Search for available backorders (domains dropping soon)."""
+ params = {"pageSize": page_size}
+
+ if keyword:
+ params["searchTerm"] = keyword
+ if page_token:
+ params["pageToken"] = page_token
+
+ return await self._request("GET", "/v2/backorders", params=params)
+
+ # =========================================================================
+ # UTILITY METHODS
+ # =========================================================================
+
+ async def test_connection(self) -> Dict[str, Any]:
+ """Test the API connection and credentials."""
+ if not self.is_configured:
+ return {
+ "success": False,
+ "error": "API credentials not configured",
+ "configured": False
+ }
+
+ try:
+ await self._authenticate()
+ return {
+ "success": True,
+ "configured": True,
+ "client_id": self.client_id.split(":")[0] if ":" in self.client_id else self.client_id,
+ "authenticated_at": datetime.utcnow().isoformat()
+ }
+ except Exception as e:
+ return {
+ "success": False,
+ "error": str(e),
+ "configured": True
+ }
+
+ def transform_to_pounce_format(self, dc_auction: Dict) -> Dict[str, Any]:
+ """
+ Transform DropCatch auction to Pounce internal format.
+
+ Maps DropCatch fields to our DomainAuction model.
+ """
+ domain = dc_auction.get("domainName") or dc_auction.get("domain", "")
+ tld = domain.rsplit(".", 1)[1] if "." in domain else ""
+
+ # Parse end time (format may vary)
+ end_time_str = dc_auction.get("auctionEndTime") or dc_auction.get("endTime")
+ if end_time_str:
+ try:
+ end_time = datetime.fromisoformat(end_time_str.replace("Z", "+00:00"))
+ except (ValueError, TypeError):
+ end_time = datetime.utcnow() + timedelta(days=1)
+ else:
+ end_time = datetime.utcnow() + timedelta(days=1)
+
+ return {
+ "domain": domain,
+ "tld": tld,
+ "platform": "DropCatch",
+ "current_bid": dc_auction.get("currentBid") or dc_auction.get("price", 0),
+ "currency": "USD",
+ "num_bids": dc_auction.get("bidCount") or dc_auction.get("numberOfBids", 0),
+ "end_time": end_time,
+ "auction_url": f"https://www.dropcatch.com/domain/{domain}",
+ "age_years": dc_auction.get("yearsOld") or dc_auction.get("age"),
+ "buy_now_price": dc_auction.get("buyNowPrice"),
+ "reserve_met": dc_auction.get("reserveMet"),
+ "traffic": dc_auction.get("traffic"),
+ "external_id": str(dc_auction.get("auctionId") or dc_auction.get("id", "")),
+ }
+
+
+# Singleton instance
+dropcatch_client = DropCatchAPIClient()
+
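
The `_authenticate` method above caches the bearer token and refreshes it five minutes before expiry, so concurrent scrape runs don't re-authenticate on every request. The refresh-with-buffer logic, extracted into a standalone sketch (class and parameter names are illustrative, not part of the client):

```python
from datetime import datetime, timedelta

class TokenCache:
    """Cache a short-lived API token; refresh 5 minutes before expiry."""

    def __init__(self, fetch_token):
        self._fetch = fetch_token  # callable: () -> (token, expires_in_seconds)
        self._token = None
        self._expires_at = None

    def get(self, now=None):
        now = now or datetime.utcnow()
        if (
            self._token
            and self._expires_at
            and now < self._expires_at - timedelta(minutes=5)
        ):
            return self._token  # still comfortably inside the validity window
        # Expired, about to expire, or never fetched: get a fresh token.
        self._token, expires_in = self._fetch()
        self._expires_at = now + timedelta(seconds=expires_in)
        return self._token
```

The 5-minute buffer trades a slightly earlier refresh for never sending a token that expires mid-request.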
diff --git a/backend/app/services/sedo_api.py b/backend/app/services/sedo_api.py
new file mode 100644
index 0000000..cf58570
--- /dev/null
+++ b/backend/app/services/sedo_api.py
@@ -0,0 +1,314 @@
+"""
+Sedo Official API Client
+
+This service provides access to Sedo's official API for:
+- Domain search and auctions
+- Marketplace listings
+- Domain pricing
+
+API Documentation: https://api.sedo.com/apidocs/v1/
+Type: XML-RPC based API
+
+SECURITY:
+- Credentials are loaded from environment variables
+- NEVER hardcode credentials in this file
+
+WHERE TO FIND YOUR CREDENTIALS:
+1. Login to https://sedo.com
+2. Go to "Mein Sedo" / "My Sedo"
+3. Navigate to "API-Zugang" / "API Access"
+4. You'll find:
+ - Partner ID (your user ID)
+ - SignKey (signature key for authentication)
+
+Usage:
+ from app.services.sedo_api import sedo_client
+
+ # Search domains for sale
+ listings = await sedo_client.search_domains(keyword="tech")
+"""
+import logging
+import hashlib
+import time
+from datetime import datetime, timedelta
+from typing import Optional, List, Dict, Any
+import httpx
+from xml.etree import ElementTree
+
+from app.config import get_settings
+
+logger = logging.getLogger(__name__)
+
+
+class SedoAPIClient:
+ """
+ Official Sedo API Client.
+
+ Sedo uses an XML-RPC style API with signature-based authentication.
+ Each request must include:
+ - partnerid: Your partner ID
+ - signkey: Your signature key (or hashed signature)
+ """
+
+ def __init__(self):
+ self.settings = get_settings()
+ self.base_url = self.settings.sedo_api_base or "https://api.sedo.com/api/v1/"
+ self.partner_id = self.settings.sedo_partner_id
+ self.sign_key = self.settings.sedo_sign_key
+
+ # HTTP client
+ self._client: Optional[httpx.AsyncClient] = None
+
+ @property
+ def is_configured(self) -> bool:
+ """Check if API credentials are configured."""
+ return bool(self.partner_id and self.sign_key)
+
+ async def _get_client(self) -> httpx.AsyncClient:
+ """Get or create HTTP client."""
+ if self._client is None or self._client.is_closed:
+ self._client = httpx.AsyncClient(
+ timeout=30.0,
+ headers={
+ "Content-Type": "application/x-www-form-urlencoded",
+ "User-Agent": "Pounce/1.0 (Domain Intelligence Platform)"
+ }
+ )
+ return self._client
+
+ async def close(self):
+ """Close the HTTP client."""
+ if self._client and not self._client.is_closed:
+ await self._client.aclose()
+ self._client = None
+
+ def _generate_signature(self, params: Dict[str, Any]) -> str:
+ """
+ Generate request signature for Sedo API.
+
+ The signature is typically: MD5(signkey + sorted_params)
+ Check Sedo docs for exact implementation.
+ """
+ # Simple implementation - may need adjustment based on actual Sedo requirements
+ sorted_params = "&".join(f"{k}={v}" for k, v in sorted(params.items()))
+ signature_base = f"{self.sign_key}{sorted_params}"
+ return hashlib.md5(signature_base.encode()).hexdigest()
+
+ async def _request(
+ self,
+ endpoint: str,
+ params: Optional[Dict] = None
+ ) -> Dict[str, Any]:
+ """Make an authenticated API request."""
+ if not self.is_configured:
+ raise ValueError("Sedo API credentials not configured")
+
+ client = await self._get_client()
+
+ # Base params for all requests
+ request_params = {
+ "partnerid": self.partner_id,
+ "signkey": self.sign_key,
+ **(params or {})
+ }
+
+ url = f"{self.base_url.rstrip('/')}/{endpoint.lstrip('/')}"
+
+ try:
+ response = await client.get(url, params=request_params)
+ response.raise_for_status()
+
+ # Sedo API can return XML or JSON depending on endpoint
+ content_type = response.headers.get("content-type", "")
+
+ if "xml" in content_type:
+ return self._parse_xml_response(response.text)
+ elif "json" in content_type:
+ return response.json()
+ else:
+ # Try JSON first, fallback to XML
+ try:
+ return response.json()
+ except ValueError:
+ return self._parse_xml_response(response.text)
+
+ except httpx.HTTPError as e:
+ logger.error(f"Sedo API request failed: {e}")
+ raise
+
+ def _parse_xml_response(self, xml_text: str) -> Dict[str, Any]:
+ """Parse XML response from Sedo API."""
+ try:
+ root = ElementTree.fromstring(xml_text)
+ return self._xml_to_dict(root)
+ except Exception as e:
+ logger.warning(f"Failed to parse XML: {e}")
+ return {"raw": xml_text}
+
+ def _xml_to_dict(self, element) -> Dict[str, Any]:
+ """Convert XML element to dictionary."""
+ result = {}
+ for child in element:
+ if len(child) > 0:
+ result[child.tag] = self._xml_to_dict(child)
+ else:
+ result[child.tag] = child.text
+ return result
+
+ # =========================================================================
+ # DOMAIN SEARCH ENDPOINTS
+ # =========================================================================
+
+ async def search_domains(
+ self,
+ keyword: Optional[str] = None,
+ tld: Optional[str] = None,
+ min_price: Optional[float] = None,
+ max_price: Optional[float] = None,
+ page: int = 1,
+ page_size: int = 100,
+ ) -> Dict[str, Any]:
+ """
+ Search for domains listed on Sedo marketplace.
+
+ Returns domains for sale (not auctions).
+ """
+ params = {
+ "output_method": "json", # Request JSON response
+ }
+
+ if keyword:
+ params["keyword"] = keyword
+ if tld:
+ params["tld"] = tld.lstrip(".")
+ if min_price is not None:
+ params["minprice"] = min_price
+ if max_price is not None:
+ params["maxprice"] = max_price
+ if page:
+ params["page"] = page
+ if page_size:
+ params["pagesize"] = min(page_size, 100)
+
+ return await self._request("DomainSearch", params)
+
+ async def search_auctions(
+ self,
+ keyword: Optional[str] = None,
+ tld: Optional[str] = None,
+ ending_within_hours: Optional[int] = None,
+ page: int = 1,
+ page_size: int = 100,
+ ) -> Dict[str, Any]:
+ """
+ Search for active domain auctions on Sedo.
+ """
+ params = {
+ "output_method": "json",
+ "auction": "true", # Only auctions
+ }
+
+ if keyword:
+ params["keyword"] = keyword
+ if tld:
+ params["tld"] = tld.lstrip(".")
+ if page:
+ params["page"] = page
+ if page_size:
+ params["pagesize"] = min(page_size, 100)
+
+ return await self._request("DomainSearch", params)
+
+ async def get_domain_details(self, domain: str) -> Dict[str, Any]:
+ """Get detailed information about a specific domain."""
+ params = {
+ "domain": domain,
+ "output_method": "json",
+ }
+ return await self._request("DomainDetails", params)
+
+ async def get_ending_soon_auctions(
+ self,
+ hours: int = 24,
+ page_size: int = 50
+ ) -> Dict[str, Any]:
+ """Get auctions ending soon."""
+ return await self.search_auctions(
+ ending_within_hours=hours,
+ page_size=page_size
+ )
+
+ # =========================================================================
+ # UTILITY METHODS
+ # =========================================================================
+
+ async def test_connection(self) -> Dict[str, Any]:
+ """Test the API connection and credentials."""
+ if not self.is_configured:
+ return {
+ "success": False,
+ "error": "API credentials not configured",
+ "configured": False,
+ "hint": "Find your credentials at: Sedo.com โ Mein Sedo โ API-Zugang"
+ }
+
+ try:
+ # Try a simple search to test connection
+ result = await self.search_domains(keyword="test", page_size=1)
+ return {
+ "success": True,
+ "configured": True,
+ "partner_id": self.partner_id,
+ "authenticated_at": datetime.utcnow().isoformat()
+ }
+ except Exception as e:
+ return {
+ "success": False,
+ "error": str(e),
+ "configured": True
+ }
+
+ def transform_to_pounce_format(self, sedo_listing: Dict) -> Dict[str, Any]:
+ """
+ Transform Sedo listing to Pounce internal format.
+
+ Maps Sedo fields to our DomainAuction model.
+ """
+ domain = sedo_listing.get("domain") or sedo_listing.get("domainname", "")
+ tld = domain.rsplit(".", 1)[1] if "." in domain else ""
+
+ # Parse end time if auction
+ end_time_str = sedo_listing.get("auctionend") or sedo_listing.get("enddate")
+ if end_time_str:
+ try:
+ end_time = datetime.fromisoformat(end_time_str.replace("Z", "+00:00"))
+ except (ValueError, TypeError):
+ end_time = datetime.utcnow() + timedelta(days=7)
+ else:
+ end_time = datetime.utcnow() + timedelta(days=7)
+
+ # Price handling
+ price = sedo_listing.get("price") or sedo_listing.get("currentbid") or 0
+ if isinstance(price, str):
+ price = float(price.replace(",", "").replace("$", "").replace("€", ""))
+
+ return {
+ "domain": domain,
+ "tld": tld,
+ "platform": "Sedo",
+ "current_bid": price,
+ "buy_now_price": sedo_listing.get("buynow") or sedo_listing.get("bin"),
+ "currency": sedo_listing.get("currency", "EUR"),
+ "num_bids": sedo_listing.get("numbids") or sedo_listing.get("bidcount", 0),
+ "end_time": end_time,
+ "auction_url": f"https://sedo.com/search/details/?domain={domain}",
+ "age_years": None,
+ "reserve_met": sedo_listing.get("reservemet"),
+ "traffic": sedo_listing.get("traffic"),
+ "is_auction": sedo_listing.get("isauction") == "1" or sedo_listing.get("auction") is True,
+ }
+
+
+# Singleton instance
+sedo_client = SedoAPIClient()
+
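
`transform_to_pounce_format` has to cope with Sedo price fields arriving either as numbers or as locale-formatted strings. The cleanup step in isolation, as a sketch that also tolerates a missing field (function name is illustrative):

```python
def normalize_price(raw):
    """Coerce a Sedo price field (number, "1,250", "€1,250", None) to float."""
    if raw is None:
        return 0.0
    if isinstance(raw, (int, float)):
        return float(raw)
    # Strip thousands separators and currency symbols before parsing.
    cleaned = raw.replace(",", "").replace("$", "").replace("€", "").strip()
    return float(cleaned) if cleaned else 0.0
```

This mirrors the string handling in the transform above while defaulting absent or empty values to `0.0` instead of raising.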