commit 5d57b1f3498393a89eaff2725ec691e5fa92ade6
Author: CCS Admin
Date:   Tue Feb 24 07:24:00 2026 +0000

    feat: project scaffolding with FastAPI, config, database connection

    - Initialize project structure with backend/app/ package layout
    - Add FastAPI app with CORS middleware and health check endpoint
    - Add Pydantic Settings config with DB, JWT, SMTP, and app settings
    - Add SQLAlchemy database engine and session management
    - Add requirements.txt with all dependencies (FastAPI, SQLAlchemy, Alembic, etc.)
    - Add .env.example template and .gitignore
    - Add empty frontend/ and backend test scaffolding
    - Include project specification and design/implementation plans

    Co-Authored-By: Claude Opus 4.6

diff --git a/.gitignore b/.gitignore
new file mode 100644
index 0000000..d45ed33
--- /dev/null
+++ b/.gitignore
@@ -0,0 +1,47 @@
+# Python
+__pycache__/
+*.py[cod]
+*$py.class
+*.so
+*.egg-info/
+*.egg
+dist/
+build/
+*.whl
+
+# Virtual environments
+venv/
+.venv/
+env/
+
+# Environment files
+.env
+
+# IDE
+.vscode/
+.idea/
+*.swp
+*.swo
+*~
+
+# Testing
+.pytest_cache/
+.coverage
+htmlcov/
+backend/alembic/versions/*.pyc
+
+# Node / Frontend
+node_modules/
+dist/
+.next/
+.nuxt/
+
+# Data files (reference Excel/CSV)
+data/
+
+# OS
+.DS_Store
+Thumbs.db
+
+# Logs
+*.log
diff --git a/Dak_projekt_spezifikation_final.md b/Dak_projekt_spezifikation_final.md
new file mode 100644
index 0000000..da1a9f9
--- /dev/null
+++ b/Dak_projekt_spezifikation_final.md
@@ -0,0 +1,899 @@
+# DAK Zweitmeinungs-Portal — Implementation Specification
+
+## FINAL v4.0 — All Decisions Made
+**Date:** 24.02.2026
+**Status:** READY FOR IMPLEMENTATION
+
+---
+
+## Decision Log (complete)
+
+| # | Topic | Decision |
+|---|-------|----------|
+| 1 | Historical import | All Excel sheets (2023-2026) → DB |
+| 2 | Therapy changes | Dedicated sheet AND integrated into the Gutachten sheet |
+| 3 | Abrechnung_DAK.xlsx | Stays in parallel; DB = master for reporting |
+| 4 | DAK registration | Domain whitelist (@dak.de) + invitation links |
+| 5 | Notifications | Bidirectional: Admin↔DAK via e-mail + in-app |
+| 6 | Bulk coding | Yes, queue interface for historical backfill |
+| 7 | Database | MariaDB 10.11.14 (dak_c2s on localhost) |
+| 8 | Deployment | Native, systemd service, Plesk nginx |
+| 9 | Nginx | Via Plesk "Additional nginx directives" |
+| 10 | Dev DB | MariaDB directly on sv-frontend |
+| 11 | Process manager | Manually managed systemd service |
+| 12 | Max upload | 20 MB |
+| 13 | Docker | No — native deployment |
+
+---
+
+## 1. Infrastructure
+
+### Production (Hetzner 1)
+```
+Server:       Hetzner 1 (Plesk-managed)
+Domain:       dak.complexcaresolutions.de (SSL in place)
+Database:     MariaDB 10.11.14 → DB: dak_c2s / User: dak_c2s_admin
+Python:       3.11.2
+Node.js:      available (Plesk module)
+Nginx:        Plesk-managed → "Additional nginx directives"
+Process Mgr:  systemd (manual)
+Git:          Plesk module
+```
+
+### Development (sv-frontend)
+```
+Server:       Debian-based, full dev server
+Python:       installed
+Node.js:      installed
+Claude Code:  running
+MariaDB:      local, for development
+Git:          → GitHub (complexcaresolutions/dak.c2s)
+```
+
+### E-Mail
+```
+SMTP Host:  smtp.complexcaresolutions.de
+Port:       465 (SSL)
+User:       noreply@complexcaresolutions.de
+Password:   9Vw0!3y6o
+Sender:     noreply@complexcaresolutions.de
+```
+
+### Git
+```
+Organization: complexcaresolutions (GitHub Business)
+Repository:   dak.c2s (still to be created)
+Workflow:     sv-frontend → git push → GitHub → Hetzner 1 git pull
+```
+
+---
+
+## 2. Architecture
+
+```
+┌─────────────────────────────────────────────────┐
+│ Hetzner 1 (Plesk)                               │
+│                                                 │
+│ Nginx (Plesk + Additional directives)           │
+│ ┌────────────────────────────────────────────┐  │
+│ │ dak.complexcaresolutions.de:443 (SSL)      │  │
+│ │                                            │  │
+│ │ /api/* → proxy_pass 127.0.0.1:8000         │  │
+│ │ /docs  → proxy_pass 127.0.0.1:8000         │  │
+│ │ /*     → /frontend/dist/ (React SPA)       │  │
+│ └──────────────┬────────────────┬────────────┘  │
+│                │                │               │
+│ ┌──────────────▼─────┐ ┌────────▼──────────┐    │
+│ │ FastAPI Backend    │ │ React Frontend    │    │
+│ │ Gunicorn+Uvicorn   │ │ Static Build      │    │
+│ │ systemd managed    │ │ /frontend/dist    │    │
+│ │ Port 8000          │ │                   │    │
+│ └──────────┬─────────┘ └───────────────────┘    │
+│            │                                    │
+│ ┌──────────▼─────────┐                          │
+│ │ MariaDB 10.11.14   │                          │
+│ │ DB: dak_c2s        │                          │
+│ │ localhost:3306     │                          │
+│ └────────────────────┘                          │
+└─────────────────────────────────────────────────┘
+```
+
+### Tech Stack
+
+| Layer | Technology | Details |
+|-------|------------|---------|
+| Frontend | React 18 + TypeScript | Vite, Tailwind CSS, shadcn/ui, Recharts |
+| Backend | Python 3.11 + FastAPI | SQLAlchemy 2.0, Pandas, openpyxl |
+| DB driver | PyMySQL + cryptography | mysql+pymysql dialect |
+| Auth | python-jose + passlib + pyotp | JWT, bcrypt, TOTP |
+| Migrations | Alembic | MariaDB-compatible |
+| Process | Gunicorn + Uvicorn workers | systemd service |
+| E-Mail | smtplib | SSL, port 465 |
+
+---
+
+## 3. Source Data Analysis
+
+### 3.1 CRM CSV (weekly import)
+```
+Filename pattern: YYYY-MM-DD-HHMM.csv
+Encoding: UTF-8-BOM
+Delimiter: comma
+Columns: Hauptkontakt, Name, Thema, Erstellungsdatum, Modul
+
+Hauptkontakt (pipe-delimited):
+  "Nachname | Vorname | Geburtsdatum | KVNR"
+  Example: "Tonn | Regina | 28.04.1960 | D410126355"
+
+  Edge cases:
+  - KVNR may be missing: "Daum | Luana | 05.02.2016 |"
+  - Malformed birth dates: "29.08.0196"
+  - Whitespace around the pipes
+
+Erstellungsdatum:
+  Format: "DD.MM.YY, HH:MM"
+  Example: "02.02.26, 08:50"
+
+Modul → Fallgruppe mapping:
+  "Zweitmeinung Onkologie"    → "onko"
+  "Zweitmeinung Kardiologie"  → "kardio"
+  "Zweitmeinung Intensiv"     → "intensiv"
+  "Zweitmeinung Gallenblase"  → "galle"
+  "Zweitmeinung Schilddrüse"  → "sd"
+  "Begutachtung *"            → derive from context
+```
+
+### 3.2 Abrechnung_DAK.xlsx (master file)
+```
+Sheets: 2026, 2025, 2024, 2023, 2020-2022, Gutachten, Übersicht, ...
+Columns per year sheet (39):
+  ID, KW, Datum, Anrede, Vorname, Nachname, Geburtsdatum, KVNR,
+  Versicherung, ICD, Fallgruppe, Straße, PLZ, Ort, E-Mail,
+  Ansprechpartner, Telefonnummer, Mobiltelefon, Unterlagen,
+  Unterlagen verschickt, Erhalten, Unterlagen erhalten,
+  Unterlagen an Gutachter, Gutachten, Gutachter, Gutachten erstellt,
+  Gutachten versendet, Schweigepflicht, Ablehnung, Abbruch,
+  Abbruch_Datum, Kurzbeschreibung, Fragestellung, Kommentar,
+  E-Mail2, Telefon2, Sonstiges, Abgerechnet, Abrechnung_Datum
+
+The "2020-2022" sheet has an extra "Jahr" column (position 2)
+
+Data volume:
+  2026: 68 rows
+  2025: 631 rows
+  2024: 769 rows
+  2023: 635 rows
+  2020-2022: 1182 rows
+  TOTAL: ~3,285 cases
+
+Note: ignore "Tabelle1" (temporary working sheet)
+```
+
+### 3.3 Berichtswesen (reference format)
+```
+4 sheets (5 going forward):
+
+Sheet 1 "Auswertung KW gesamt":
+  Row 1: "Gesamtübersicht"
+  Row 2: [empty] [empty] [CURRENT_YEAR] [empty] [PREVIOUS_YEAR] [empty]
+  Row 3: "Gesamtzahl an Erstberatungen" [empty] [value] [empty] [prev-year value] [empty]
+  Row 4: "Anzahl Ablehnungen" [empty] [value] [percent] [prev-year value] [prev-year percent]
+  Row 5: "Anzahl versendeter Unterlagen" [empty] [value] [percent] [prev-year value] [prev-year percent]
+  Row 6: "Anzahl keine Rückmeldung" [empty] [value] [percent] [prev-year value] [prev-year percent]
+  Row 7: "Anzahl erstellter Gutachten" [empty] [value] [percent] [prev-year value] [prev-year percent]
+  Rows 8-9: empty
+  Row 10: KW | Anzahl Erstberatungen | Unterlagen | Ablehnung | Keine Rückmeldung | Gutachten
+  Row 11+: data per KW (1-52)
+
+Sheet 2 "Auswertung nach Fachgebieten":
+  Row 1: "Übersicht nach Fallgruppen"
+  Row 3: group headers: onko | kardio | intensiv | galle | sd
+  Row 4: KW | Anzahl | Gutachten | Keine RM/Ablehnung (×5 case groups)
+  Row 5+: data per KW
+
+  NOTE: 2023 had only 3 case groups (onko, kardio, intensiv);
+  galle + sd were added from 2024 on
+
+Sheet 3 "Auswertung Gutachten":
+  Row 3: group headers: Gesamt | onko | kardio | intensiv | galle | sd
+  Row 4: KW | Gutachten | Alternative | Bestätigung (×6)
+  Row 5+: data per KW
+
+Sheet 4 "Auswertung Therapieänderungen" (NEW):
+  → Same structure as sheet 3, but with:
+  KW | Gutachten | TA Ja | TA Nein | Diagnosekorrektur | Unterversorgung | Übertherapie
+
+Sheet 5 "Auswertung ICD onko":
+  Row 1: ICD | Anzahl von ICD
+  Row 2+: data (ICD normalized, uppercase, sorted)
+```
+
+---
+
+## 4. MariaDB Schema
+
+```sql
+-- ============================================
+-- First: make sure the charset is correct
+-- ============================================
+ALTER DATABASE dak_c2s CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
+
+-- ============================================
+-- USERS & AUTH
+-- ============================================
+
+CREATE TABLE IF NOT EXISTS users (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    username VARCHAR(100) NOT NULL,
+    email VARCHAR(255) NOT NULL,
+    password_hash VARCHAR(255) NOT NULL,
+    role VARCHAR(20) NOT NULL DEFAULT 'dak_mitarbeiter',
+    mfa_secret VARCHAR(255) DEFAULT NULL,
+    mfa_enabled TINYINT(1) NOT NULL DEFAULT 0,
+    is_active TINYINT(1) NOT NULL DEFAULT 1,
+    must_change_password TINYINT(1) NOT NULL DEFAULT 0,
+    last_login DATETIME DEFAULT NULL,
+    failed_login_attempts INT UNSIGNED NOT NULL DEFAULT 0,
+    locked_until DATETIME DEFAULT NULL,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
+    UNIQUE KEY uk_username (username),
+    UNIQUE KEY uk_email (email),
+    CONSTRAINT chk_role CHECK (role IN ('admin', 'dak_mitarbeiter'))
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS refresh_tokens (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    user_id INT UNSIGNED NOT NULL,
+    token_hash VARCHAR(255) NOT NULL,
+    expires_at DATETIME NOT NULL,
+    revoked TINYINT(1) NOT NULL DEFAULT 0,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    INDEX idx_user (user_id),
+    INDEX idx_token (token_hash),
+    CONSTRAINT fk_rt_user FOREIGN KEY (user_id) REFERENCES users(id) ON DELETE CASCADE
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS invitation_links (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    token VARCHAR(255) NOT NULL,
+    email VARCHAR(255) DEFAULT NULL,
+    role VARCHAR(20) NOT NULL DEFAULT 'dak_mitarbeiter',
+    created_by INT UNSIGNED DEFAULT NULL,
+    expires_at DATETIME NOT NULL,
+    used_at DATETIME DEFAULT NULL,
+    used_by INT UNSIGNED DEFAULT NULL,
+    is_active TINYINT(1) NOT NULL DEFAULT 1,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    UNIQUE KEY uk_token (token),
+    CONSTRAINT fk_inv_created FOREIGN KEY (created_by) REFERENCES users(id),
+    CONSTRAINT fk_inv_used FOREIGN KEY (used_by) REFERENCES users(id)
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS allowed_domains (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    domain VARCHAR(255) NOT NULL,
+    role VARCHAR(20) NOT NULL DEFAULT 'dak_mitarbeiter',
+    is_active TINYINT(1) NOT NULL DEFAULT 1,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    UNIQUE KEY uk_domain (domain)
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+INSERT IGNORE INTO allowed_domains (domain, role) VALUES ('dak.de', 'dak_mitarbeiter');
+
+-- ============================================
+-- CORE: CASES
+-- ============================================
+
+CREATE TABLE IF NOT EXISTS cases (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    fall_id VARCHAR(100) DEFAULT NULL,
+    crm_ticket_id VARCHAR(20) DEFAULT NULL,
+
+    jahr SMALLINT UNSIGNED NOT NULL,
+    kw TINYINT UNSIGNED NOT NULL,
+    datum DATE NOT NULL,
+
+    anrede VARCHAR(20) DEFAULT NULL,
+    vorname VARCHAR(100) DEFAULT NULL,
+    nachname VARCHAR(100) NOT NULL,
+    geburtsdatum DATE DEFAULT NULL,
+    kvnr VARCHAR(20) DEFAULT NULL,
+    versicherung VARCHAR(50) NOT NULL DEFAULT 'DAK',
+
+    icd TEXT DEFAULT NULL,
+    fallgruppe VARCHAR(20) NOT NULL,
+
+    strasse VARCHAR(255) DEFAULT NULL,
+    plz VARCHAR(10) DEFAULT NULL,
+    ort VARCHAR(100) DEFAULT NULL,
+    email VARCHAR(255) DEFAULT NULL,
+    ansprechpartner VARCHAR(200) DEFAULT NULL,
+    telefonnummer VARCHAR(50) DEFAULT NULL,
+    mobiltelefon VARCHAR(50) DEFAULT NULL,
+    email2 VARCHAR(255) DEFAULT NULL,
+    telefon2 VARCHAR(50) DEFAULT NULL,
+
+    unterlagen TINYINT(1) NOT NULL DEFAULT 0,
+    unterlagen_verschickt DATE DEFAULT NULL,
+    erhalten TINYINT(1) DEFAULT NULL,
+    unterlagen_erhalten DATE DEFAULT NULL,
+    unterlagen_an_gutachter DATE DEFAULT NULL,
+    gutachten TINYINT(1) NOT NULL DEFAULT 0,
+    gutachter VARCHAR(100) DEFAULT NULL,
+    gutachten_erstellt DATE DEFAULT NULL,
+    gutachten_versendet DATE DEFAULT NULL,
+    schweigepflicht TINYINT(1) NOT NULL DEFAULT 0,
+    ablehnung TINYINT(1) NOT NULL DEFAULT 0,
+    abbruch TINYINT(1) NOT NULL DEFAULT 0,
+    abbruch_datum DATE DEFAULT NULL,
+
+    gutachten_typ VARCHAR(20) DEFAULT NULL,
+    therapieaenderung VARCHAR(5) DEFAULT NULL,
+    ta_diagnosekorrektur TINYINT(1) NOT NULL DEFAULT 0,
+    ta_unterversorgung TINYINT(1) NOT NULL DEFAULT 0,
+    ta_uebertherapie TINYINT(1) NOT NULL DEFAULT 0,
+
+    kurzbeschreibung TEXT DEFAULT NULL,
+    fragestellung TEXT DEFAULT NULL,
+    kommentar TEXT DEFAULT NULL,
+    sonstiges TEXT DEFAULT NULL,
+
+    abgerechnet TINYINT(1) NOT NULL DEFAULT 0,
+    abrechnung_datum DATE DEFAULT NULL,
+
+    import_source VARCHAR(255) DEFAULT NULL,
+    imported_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    updated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
+    updated_by INT UNSIGNED DEFAULT NULL,
+    icd_entered_by INT UNSIGNED DEFAULT NULL,
+    icd_entered_at DATETIME DEFAULT NULL,
+    coding_completed_by INT UNSIGNED DEFAULT NULL,
+    coding_completed_at DATETIME DEFAULT NULL,
+
+    UNIQUE KEY uk_fall_id (fall_id),
+    INDEX idx_jahr_kw (jahr, kw),
+    INDEX idx_kvnr (kvnr),
+    INDEX idx_fallgruppe (fallgruppe),
+    INDEX idx_datum (datum),
+    INDEX idx_nachname_vorname (nachname, vorname),
+    INDEX idx_pending_icd (jahr, kw, fallgruppe, icd(20)),
+    INDEX idx_pending_coding (gutachten, gutachten_typ),
+
+    CONSTRAINT fk_c_updated FOREIGN KEY (updated_by) REFERENCES users(id),
+    CONSTRAINT fk_c_icd_by FOREIGN KEY (icd_entered_by) REFERENCES users(id),
+    CONSTRAINT fk_c_coding_by FOREIGN KEY (coding_completed_by) REFERENCES users(id),
+    CONSTRAINT chk_fallgruppe CHECK (fallgruppe IN ('onko', 'kardio', 'intensiv', 'galle', 'sd')),
+    CONSTRAINT chk_gutachten_typ CHECK (gutachten_typ IS NULL OR gutachten_typ IN ('Bestätigung', 'Alternative')),
+    CONSTRAINT chk_ta CHECK (therapieaenderung IS NULL OR therapieaenderung IN ('Ja', 'Nein'))
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS case_icd_codes (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    case_id INT UNSIGNED NOT NULL,
+    icd_code VARCHAR(20) NOT NULL,
+    icd_hauptgruppe VARCHAR(10) DEFAULT NULL,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    INDEX idx_case (case_id),
+    INDEX idx_code (icd_code),
+    INDEX idx_haupt (icd_hauptgruppe),
+    CONSTRAINT fk_icd_case FOREIGN KEY (case_id) REFERENCES cases(id) ON DELETE CASCADE
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+-- ============================================
+-- REPORTING
+-- ============================================
+
+CREATE TABLE IF NOT EXISTS weekly_reports (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    jahr SMALLINT UNSIGNED NOT NULL,
+    kw TINYINT UNSIGNED NOT NULL,
+    report_date DATE NOT NULL,
+    report_file_path VARCHAR(500) DEFAULT NULL,
+    report_data JSON DEFAULT NULL,
+    generated_by INT UNSIGNED DEFAULT NULL,
+    generated_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    UNIQUE KEY uk_jahr_kw (jahr, kw),
+    CONSTRAINT fk_rpt_by FOREIGN KEY (generated_by) REFERENCES users(id)
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS yearly_summary (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    jahr SMALLINT UNSIGNED NOT NULL,
+    kw TINYINT UNSIGNED NOT NULL,
+    erstberatungen INT UNSIGNED DEFAULT 0,
+    ablehnungen INT UNSIGNED DEFAULT 0,
+    unterlagen INT UNSIGNED DEFAULT 0,
+    keine_rueckmeldung INT UNSIGNED DEFAULT 0,
+    gutachten_gesamt INT UNSIGNED DEFAULT 0,
+    gutachten_alternative INT UNSIGNED DEFAULT 0,
+    gutachten_bestaetigung INT UNSIGNED DEFAULT 0,
+    onko_anzahl INT UNSIGNED DEFAULT 0, onko_gutachten INT UNSIGNED DEFAULT 0, onko_keine_rm INT UNSIGNED DEFAULT 0,
+    kardio_anzahl INT UNSIGNED DEFAULT 0, kardio_gutachten INT UNSIGNED DEFAULT 0, kardio_keine_rm INT UNSIGNED DEFAULT 0,
+    intensiv_anzahl INT UNSIGNED DEFAULT 0, intensiv_gutachten INT UNSIGNED DEFAULT 0, intensiv_keine_rm INT UNSIGNED DEFAULT 0,
+    galle_anzahl INT UNSIGNED DEFAULT 0, galle_gutachten INT UNSIGNED DEFAULT 0, galle_keine_rm INT UNSIGNED DEFAULT 0,
+    sd_anzahl INT UNSIGNED DEFAULT 0, sd_gutachten INT UNSIGNED DEFAULT 0, sd_keine_rm INT UNSIGNED DEFAULT 0,
+    onko_alternative INT UNSIGNED DEFAULT 0, onko_bestaetigung INT UNSIGNED DEFAULT 0,
+    kardio_alternative INT UNSIGNED DEFAULT 0, kardio_bestaetigung INT UNSIGNED DEFAULT 0,
+    intensiv_alternative INT UNSIGNED DEFAULT 0, intensiv_bestaetigung INT UNSIGNED DEFAULT 0,
+    galle_alternative INT UNSIGNED DEFAULT 0, galle_bestaetigung INT UNSIGNED DEFAULT 0,
+    sd_alternative INT UNSIGNED DEFAULT 0, sd_bestaetigung INT UNSIGNED DEFAULT 0,
+    ta_ja INT UNSIGNED DEFAULT 0,
+    ta_nein INT UNSIGNED DEFAULT 0,
+    ta_diagnosekorrektur INT UNSIGNED DEFAULT 0,
+    ta_unterversorgung INT UNSIGNED DEFAULT 0,
+    ta_uebertherapie INT UNSIGNED DEFAULT 0,
+    UNIQUE KEY uk_jahr_kw (jahr, kw)
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+-- ============================================
+-- LOGGING & NOTIFICATIONS
+-- ============================================
+
+CREATE TABLE IF NOT EXISTS import_log (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    filename VARCHAR(255) NOT NULL,
+    import_type VARCHAR(50) NOT NULL,
+    cases_imported INT UNSIGNED NOT NULL DEFAULT 0,
+    cases_skipped INT UNSIGNED NOT NULL DEFAULT 0,
+    cases_updated INT UNSIGNED NOT NULL DEFAULT 0,
+    errors TEXT DEFAULT NULL,
+    details JSON DEFAULT NULL,
+    imported_by INT UNSIGNED DEFAULT NULL,
+    imported_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    CONSTRAINT fk_imp_by FOREIGN KEY (imported_by) REFERENCES users(id),
+    CONSTRAINT chk_imp_type CHECK (import_type IN ('csv_crm', 'icd_xlsx', 'historical_excel', 'excel_sync'))
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS audit_log (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    user_id INT UNSIGNED DEFAULT NULL,
+    action VARCHAR(100) NOT NULL,
+    entity_type VARCHAR(50) DEFAULT NULL,
+    entity_id INT UNSIGNED DEFAULT NULL,
+    old_values JSON DEFAULT NULL,
+    new_values JSON DEFAULT NULL,
+    ip_address VARCHAR(45) DEFAULT NULL,
+    user_agent TEXT DEFAULT NULL,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    INDEX idx_user (user_id),
+    INDEX idx_entity (entity_type, entity_id),
+    INDEX idx_created (created_at),
+    CONSTRAINT fk_audit_user FOREIGN KEY (user_id) REFERENCES users(id)
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+
+CREATE TABLE IF NOT EXISTS notifications (
+    id INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
+    recipient_id INT UNSIGNED NOT NULL,
+    notification_type VARCHAR(50) NOT NULL,
+    title VARCHAR(255) NOT NULL,
+    message TEXT DEFAULT NULL,
+    related_entity_type VARCHAR(50) DEFAULT NULL,
+    related_entity_id INT UNSIGNED DEFAULT NULL,
+    is_read TINYINT(1) NOT NULL DEFAULT 0,
+    email_sent TINYINT(1) NOT NULL DEFAULT 0,
+    email_sent_at DATETIME DEFAULT NULL,
+    created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
+    INDEX idx_recipient (recipient_id, is_read),
+    CONSTRAINT fk_notif_user FOREIGN KEY (recipient_id) REFERENCES users(id),
+    CONSTRAINT chk_notif CHECK (notification_type IN (
+        'new_cases_uploaded', 'icd_entered', 'icd_uploaded',
+        'report_ready', 'coding_completed'))
+) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_unicode_ci;
+```
+
+---
+
+## 5. Backend Project Structure
+
+```
+dak.c2s/
+├── backend/
+│   ├── app/
+│   │   ├── __init__.py
+│   │   ├── main.py              # FastAPI app, middleware, CORS, lifespan
+│   │   ├── config.py            # Pydantic Settings (.env)
+│   │   ├── database.py          # SQLAlchemy engine (mysql+pymysql)
+│   │   │
+│   │   ├── models/
+│   │   │   ├── __init__.py      # Export all models
+│   │   │   ├── user.py          # User, RefreshToken, InvitationLink, AllowedDomain
+│   │   │   ├── case.py          # Case, CaseICDCode
+│   │   │   ├── report.py        # WeeklyReport, YearlySummary
+│   │   │   └── audit.py         # AuditLog, ImportLog, Notification
+│   │   │
+│   │   ├── schemas/
+│   │   │   ├── __init__.py
+│   │   │   ├── auth.py             # LoginRequest, TokenResponse, RegisterRequest, MFASetup
+│   │   │   ├── user.py             # UserResponse, UserCreate, UserUpdate
+│   │   │   ├── case.py             # CaseResponse, CaseList, CaseUpdate, ICDUpdate, CodingUpdate
+│   │   │   ├── import_schemas.py   # ImportPreview, ImportResult, ImportRow
+│   │   │   ├── report.py           # DashboardKPIs, WeeklyData, ReportMeta
+│   │   │   └── notification.py     # NotificationResponse, NotificationList
+│   │   │
+│   │   ├── api/
+│   │   │   ├── __init__.py
+│   │   │   ├── auth.py             # POST login, register, refresh, mfa/setup, mfa/verify
+│   │   │   ├── cases.py            # GET/PUT cases, pending-icd, pending-coding
+│   │   │   ├── import_router.py    # POST csv, icd-xlsx, historical
+│   │   │   ├── coding.py           # GET queue, PUT coding/{id}
+│   │   │   ├── reports.py          # GET dashboard, weekly, download; POST generate
+│   │   │   ├── notifications.py    # GET list, PUT mark-read
+│   │   │   └── admin.py            # Users CRUD, invitations, audit-log
+│   │   │
+│   │   ├── services/
+│   │   │   ├── __init__.py
+│   │   │   ├── auth_service.py     # JWT create/verify, password hash/verify, MFA
+│   │   │   ├── csv_parser.py       # CRM CSV parsing (pipe-delimited contacts)
+│   │   │   ├── import_service.py   # Import orchestration + duplicate detection
+│   │   │   ├── icd_service.py      # ICD normalize, split, validate, hauptgruppe
+│   │   │   ├── coding_service.py   # Coding queue, batch update
+│   │   │   ├── report_service.py   # All 5 sheet calculations (pandas)
+│   │   │   ├── excel_export.py     # openpyxl → Berichtswesen format
+│   │   │   ├── excel_import.py         # Abrechnung_DAK.xlsx → DB (historical)
+│   │   │   ├── vorjahr_service.py      # Year-over-year comparison
+│   │   │   ├── notification_service.py # SMTP + in-app notifications
+│   │   │   ├── audit_service.py        # Audit log helper
+│   │   │   └── excel_sync.py           # Bidirectional Excel ↔ DB sync
+│   │   │
+│   │   ├── core/
+│   │   │   ├── __init__.py
+│   │   │   ├── security.py      # JWT encode/decode, password helpers
+│   │   │   ├── dependencies.py  # get_current_user, require_admin, get_db
+│   │   │   └── exceptions.py    # Custom HTTP exceptions
+│   │   │
+│   │   └── utils/
+│   │       ├── __init__.py
+│   │       ├── fallgruppe_map.py  # Modul string → fallgruppe code
+│   │       ├── kw_utils.py        # ISO KW calculation, date helpers
+│   │       └── validators.py      # ICD regex, KVNR format, date validation
+│   │
+│   ├── alembic/
+│   │   ├── alembic.ini
+│   │   ├── env.py
+│   │   └── versions/
+│   │
+│   ├── scripts/
+│   │   ├── init_db.py               # Create schema + seed data
+│   │   ├── import_historical.py     # One-time: Abrechnung → DB
+│   │   ├── import_berichtswesen.py  # One-time: previous years' reports → yearly_summary
+│   │   └── create_admin.py          # Create the initial admin user
+│   │
+│   ├── tests/
+│   │   ├── conftest.py
+│   │   ├── test_csv_parser.py
+│   │   ├── test_import.py
+│   │   ├── test_icd_service.py
+│   │   ├── test_report_service.py
+│   │   └── test_auth.py
+│   │
+│   ├── requirements.txt
+│   ├── .env.example
+│   └── .env
+│
+├── frontend/
+│   ├── src/
+│   │   ├── components/
+│   │   │   ├── ui/              # shadcn/ui
+│   │   │   ├── layout/          # AppLayout, Sidebar, Header, ProtectedRoute
+│   │   │   ├── dashboard/       # KPICards, WeeklyChart, FallgruppenDonut, etc.
+│   │   │   ├── cases/           # CaseTable, CaseDetail, ICDInlineEdit
+│   │   │   ├── import/          # CSVUpload, ImportPreview, ICDUpload
+│   │   │   ├── coding/          # CodingQueue, CodingCard, CodingProgress
+│   │   │   ├── reports/         # ReportList, ReportDownload
+│   │   │   ├── notifications/   # NotificationBell, NotificationDropdown
+│   │   │   └── admin/           # UserManagement, InvitationLinks, AuditLog
+│   │   ├── pages/               # LoginPage, DashboardPage, CasesPage, etc.
+│   │   ├── services/            # api.ts, authService.ts, etc.
+│   │   ├── hooks/               # useAuth, useCases, useNotifications
+│   │   ├── types/               # TypeScript interfaces
+│   │   ├── context/             # AuthContext
+│   │   ├── App.tsx
+│   │   └── main.tsx
+│   ├── package.json
+│   ├── vite.config.ts
+│   ├── tailwind.config.js
+│   └── tsconfig.json
+│
+├── data/                        # Reference data (not in Git)
+│   ├── Abrechnung_DAK.xlsx
+│   ├── Berichtswesen_2024_29122024.xlsx
+│   ├── Berichtswesen_2023_31122023.xlsx
+│   └── sample_csv/
+│       └── 2026-02-06-0406.csv
+│
+├── .gitignore
+├── README.md
+└── SPEC.md                      # This file
+```
+
+---
+
+## 6. API Endpoints
+
+### Auth
+```
+POST /api/auth/login       → {access_token, refresh_token, user}
+POST /api/auth/register    → {user} (domain check or invitation token)
+POST /api/auth/refresh     → {access_token}
+POST /api/auth/mfa/setup   → {secret, qr_uri} (admin)
+POST /api/auth/mfa/verify  → {verified: true}
+POST /api/auth/logout      → Revoke refresh token
+```
+
+### Cases
+```
+GET /api/cases                  → Paginated list (filters: jahr, kw, fallgruppe, has_icd, has_coding)
+GET /api/cases/{id}             → Single case detail
+PUT /api/cases/{id}             → Update case fields
+GET /api/cases/pending-icd      → Cases without ICD (for DAK)
+PUT /api/cases/{id}/icd         → Set ICD (DAK or admin)
+GET /api/cases/pending-coding   → Cases with gutachten but no typ (for admin)
+PUT /api/cases/{id}/coding      → Set gutachten_typ + therapieaenderung (admin)
+GET /api/cases/coding-template  → Download .xlsx template for ICD entry
+```
+
+### Import
+```
+POST /api/import/csv          → Upload CRM CSV → preview
+POST /api/import/csv/confirm  → Confirm import from preview
+POST /api/import/icd-xlsx     → Upload ICD-coded Excel (DAK)
+POST /api/import/historical   → Import from Abrechnung_DAK.xlsx (admin, one-time)
+GET  /api/import/log          → Import history
+```
+
+### Reports
+```
+GET  /api/reports/dashboard           → Live KPIs + chart data
+GET  /api/reports/weekly/{jahr}/{kw}  → Specific week data
+POST /api/reports/generate            → Generate report .xlsx (admin)
+GET  /api/reports/download/{id}       → Download generated .xlsx
+GET  /api/reports/list                → All generated reports
+```
+
+### Notifications
+```
+GET /api/notifications            → Unread + recent
+PUT /api/notifications/{id}/read  → Mark as read
+PUT /api/notifications/read-all   → Mark all as read
+```
+
+### Admin
+```
+GET  /api/admin/users        → List users
+POST /api/admin/users        → Create user
+PUT  /api/admin/users/{id}   → Update user (role, active)
+POST /api/admin/invitations  → Create invitation link
+GET  /api/admin/invitations  → List invitation links
+GET  /api/admin/audit-log    → Paginated audit log
+POST /api/admin/excel-sync   → Trigger Excel ↔ DB sync
+```
+
+---
+
+## 7. Implementation Phases
+
+### Phase 1: Setup + DB + Auth (week 1)
+```
+Goal: working backend with auth on sv-frontend
+
+□ Create the GitHub repo complexcaresolutions/dak.c2s
+□ Clone the project on sv-frontend
+□ Set up MariaDB on sv-frontend (dev)
+□ Create the backend project structure
+□ Python venv + requirements.txt
+□ .env + config.py (Pydantic Settings)
+□ database.py (SQLAlchemy mysql+pymysql)
+□ All SQLAlchemy models
+□ Alembic init + first migration
+□ Deploy the schema (dev)
+□ Auth: JWT + bcrypt + MFA
+□ API: auth (login, register, refresh, mfa)
+□ API: admin (users CRUD)
+□ Domain whitelist + invitation links
+□ Audit log middleware
+□ CORS + error handling
+□ create_admin.py script
+□ First commit + push
+```
+
+### Phase 2: Import + ICD (week 2)
+```
+Goal: CSV import and ICD workflow functional
+
+□ csv_parser.py (pipe format, Modul mapping)
+□ import_service.py (duplicate check, Fall_ID)
+□ icd_service.py (normalization, validation)
+□ validators.py (ICD regex, KVNR, dates)
+□ API: import/csv (upload + preview + confirm)
+□ API: import/icd-xlsx (DAK returns)
+□ API: cases/pending-icd
+□ API: cases/{id}/icd
+□ Coding template generation (.xlsx)
+□ excel_import.py (Abrechnung → DB)
+□ import_historical.py script
+□ Run the historical import (2023-2026)
+□ Import logging
+□ notification_service.py (SMTP)
+□ Tests: csv_parser, import, icd
+```
+
+### Phase 3: Reporting + Coding (week 3)
+```
+Goal: reports can be generated, coding queue usable
+
+□ report_service.py (all 5 sheet calculations)
+□ vorjahr_service.py
+□ excel_export.py (exact Berichtswesen format)
+□ import_berichtswesen.py (previous-year cache)
+□ API: reports/dashboard
+□ API: reports/generate + download
+□ coding_service.py (queue logic)
+□ API: coding (queue, update)
+□ API: cases/pending-coding
+□ Notifications (all triggers)
+□ excel_sync.py (DB ↔ Abrechnung)
+□ Tests: reports, coding
+```
+
+### Phase 4: Frontend Core (weeks 4-5)
+```
+Goal: working web interface
+
+Week 4:
+□ React + Vite + TS + Tailwind + shadcn/ui setup
+□ API client (axios/fetch + JWT interceptor)
+□ AuthContext + ProtectedRoute
+□ Login + register pages
+□ AppLayout (sidebar + header)
+□ Dashboard: KPI cards
+□ Dashboard: WeeklyTrendChart (Recharts)
+□ Dashboard: FallgruppenDonut
+□ Dashboard: VorjahresComparison
+
+Week 5:
+□ CaseTable + filters + pagination
+□ ICD inline edit (DAK)
+□ CSVUpload + ImportPreview
+□ ICDUpload (Excel)
+□ CodingQueue + CodingCard
+□ ReportList + download
+□ NotificationBell + dropdown
+□ Admin: UserManagement
+□ Admin: InvitationLinks
+□ Admin: AuditLogViewer
+```
+
+### Phase 5: Deploy + Go-Live (week 6)
+```
+Goal: live at dak.complexcaresolutions.de
+
+□ Frontend production build
+□ Git push → GitHub
+□ Hetzner 1: git pull
+□ Python venv on the server
+□ Deploy the MariaDB schema to production
+□ Configure .env on the server
+□ Set up + start the systemd service
+□ Plesk: Additional nginx directives
+□ Test SMTP (production)
+□ Create the admin account
+□ Import historical data (production)
+□ Invite DAK staff
+□ Smoke tests
+□ Go-live ✓
+```
+
+---
+
+## 8. Security
+
+### Authentication
+- bcrypt (12 rounds) for password hashing
+- JWT access token: 15 min lifetime
+- JWT refresh token: 7 days, stored in the DB, revocable
+- MFA: TOTP (pyotp), can be enabled optionally
+- Account lockout after 5 failed attempts (30 min)
+- Domain whitelist: @dak.de for self-registration
+- Invitation links: token + expiry + optional e-mail binding
+
+### Authorization (RBAC)
+- admin: full access (import, coding, reports, users)
+- dak_mitarbeiter: dashboard, ICD entry, report download
+
+### Audit
+- Every data change is logged (entity, old/new values, user, IP)
+- Immutable (INSERT only)
+- Viewable via the admin UI
+
+### Transport
+- TLS/SSL (Plesk-managed)
+- CORS: dak.complexcaresolutions.de only
+
+### Data
+- MariaDB on localhost (no external access)
+- Encryption at rest: MariaDB encryption + file system
+- Uploads in a protected directory (not publicly served)
+- Sensitive data (.env, passwords) kept out of Git
+
+---
+
+## 9. Nginx Configuration (Plesk Additional Directives)
+
+```nginx
+# In Plesk → dak.complexcaresolutions.de → Apache & nginx Settings
+# → Additional nginx directives
+
+# API reverse proxy
+location /api/ {
+    proxy_pass http://127.0.0.1:8000;
+    proxy_http_version 1.1;
+    proxy_set_header Host $host;
+    proxy_set_header X-Real-IP $remote_addr;
+    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
+    proxy_set_header X-Forwarded-Proto $scheme;
+    proxy_read_timeout 120s;
+    client_max_body_size 20M;
+}
+
+# FastAPI docs
+location /docs {
+    proxy_pass http://127.0.0.1:8000;
+    proxy_set_header Host $host;
+}
+
+location /openapi.json {
+    proxy_pass http://127.0.0.1:8000;
+    proxy_set_header Host $host;
+}
+
+# React SPA fallback
+location / {
+    root /var/www/vhosts/dak.complexcaresolutions.de/frontend/dist;
+    index index.html;
+    try_files $uri $uri/ /index.html;
+}
+```
+
+---
+
+## 10. systemd Service
+
+```ini
+# /etc/systemd/system/dak-backend.service
+
+[Unit]
+Description=DAK Portal Backend (FastAPI/Gunicorn)
+After=network.target mariadb.service
+
+[Service]
+Type=simple
+User=www-data
+Group=www-data
+WorkingDirectory=/var/www/vhosts/dak.complexcaresolutions.de/backend
+Environment="PATH=/var/www/vhosts/dak.complexcaresolutions.de/backend/venv/bin"
+EnvironmentFile=/var/www/vhosts/dak.complexcaresolutions.de/backend/.env
+ExecStart=/var/www/vhosts/dak.complexcaresolutions.de/backend/venv/bin/gunicorn \
+    app.main:app \
+    --workers 2 \
+    --worker-class uvicorn.workers.UvicornWorker \
+    --bind 127.0.0.1:8000 \
+    --access-logfile /var/www/vhosts/dak.complexcaresolutions.de/logs/access.log \
+    --error-logfile /var/www/vhosts/dak.complexcaresolutions.de/logs/error.log
+Restart=always
+RestartSec=5
+StandardOutput=journal
+StandardError=journal
+
+[Install]
+WantedBy=multi-user.target
+```
+
+Management:
+```bash
+sudo systemctl daemon-reload
+sudo systemctl enable dak-backend
+sudo systemctl start dak-backend
+sudo systemctl status dak-backend
+sudo journalctl -u dak-backend -f   # live logs
+```
\ No newline at end of file
diff --git a/backend/.env.example b/backend/.env.example
new file mode 100644
index 0000000..1b1d1b4
--- /dev/null
+++ b/backend/.env.example
@@ -0,0 +1,24 @@
+# Database
+DB_HOST=
+DB_PORT=3306
+DB_NAME=dak_c2s
+DB_USER=dak_c2s_admin
+DB_PASSWORD=
+
+# JWT
+JWT_SECRET_KEY=
+JWT_ALGORITHM=HS256
+ACCESS_TOKEN_EXPIRE_MINUTES=15
+REFRESH_TOKEN_EXPIRE_DAYS=7
+
+# SMTP
+SMTP_HOST=smtp.complexcaresolutions.de
+SMTP_PORT=465
+SMTP_USER=noreply@complexcaresolutions.de
+SMTP_PASSWORD=
+SMTP_FROM=noreply@complexcaresolutions.de
+
+# App
+APP_NAME=DAK Zweitmeinungs-Portal
+CORS_ORIGINS=http://localhost:5173,https://dak.complexcaresolutions.de
+MAX_UPLOAD_SIZE=20971520
diff --git a/backend/alembic/.gitkeep b/backend/alembic/.gitkeep
new file mode 100644
index 0000000..e69de29
diff --git a/backend/app/__init__.py b/backend/app/__init__.py
new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/api/__init__.py b/backend/app/api/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/config.py b/backend/app/config.py new file mode 100644 index 0000000..46c080b --- /dev/null +++ b/backend/app/config.py @@ -0,0 +1,46 @@ +# backend/app/config.py +from pydantic_settings import BaseSettings +from functools import lru_cache + + +class Settings(BaseSettings): + # Database + DB_HOST: str = "localhost" + DB_PORT: int = 3306 + DB_NAME: str = "dak_c2s" + DB_USER: str = "dak_c2s_admin" + DB_PASSWORD: str = "" + + # JWT + JWT_SECRET_KEY: str = "change-me-in-production" + JWT_ALGORITHM: str = "HS256" + ACCESS_TOKEN_EXPIRE_MINUTES: int = 15 + REFRESH_TOKEN_EXPIRE_DAYS: int = 7 + + # SMTP + SMTP_HOST: str = "smtp.complexcaresolutions.de" + SMTP_PORT: int = 465 + SMTP_USER: str = "noreply@complexcaresolutions.de" + SMTP_PASSWORD: str = "" + SMTP_FROM: str = "noreply@complexcaresolutions.de" + + # App + APP_NAME: str = "DAK Zweitmeinungs-Portal" + CORS_ORIGINS: str = "http://localhost:5173,https://dak.complexcaresolutions.de" + MAX_UPLOAD_SIZE: int = 20971520 # 20MB + + @property + def database_url(self) -> str: + return ( + f"mysql+pymysql://{self.DB_USER}:{self.DB_PASSWORD}" + f"@{self.DB_HOST}:{self.DB_PORT}/{self.DB_NAME}?charset=utf8mb4" + ) + + class Config: + env_file = ".env" + env_file_encoding = "utf-8" + + +@lru_cache +def get_settings() -> Settings: + return Settings() diff --git a/backend/app/core/__init__.py b/backend/app/core/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/database.py b/backend/app/database.py new file mode 100644 index 0000000..2311a18 --- /dev/null +++ b/backend/app/database.py @@ -0,0 +1,24 @@ +# backend/app/database.py +from sqlalchemy import create_engine +from sqlalchemy.orm import sessionmaker, DeclarativeBase +from app.config import get_settings + +settings = get_settings() +engine = create_engine( + 
settings.database_url, + pool_pre_ping=True, + pool_recycle=3600, +) +SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine) + + +class Base(DeclarativeBase): + pass + + +def get_db(): + db = SessionLocal() + try: + yield db + finally: + db.close() diff --git a/backend/app/main.py b/backend/app/main.py new file mode 100644 index 0000000..ea87736 --- /dev/null +++ b/backend/app/main.py @@ -0,0 +1,21 @@ +# backend/app/main.py +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from app.config import get_settings + +settings = get_settings() + +app = FastAPI(title=settings.APP_NAME, docs_url="/docs") + +app.add_middleware( + CORSMiddleware, + allow_origins=settings.CORS_ORIGINS.split(","), + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], +) + + +@app.get("/api/health") +def health_check(): + return {"status": "ok"} diff --git a/backend/app/models/__init__.py b/backend/app/models/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/schemas/__init__.py b/backend/app/schemas/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/services/__init__.py b/backend/app/services/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/app/utils/__init__.py b/backend/app/utils/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/backend/requirements.txt b/backend/requirements.txt new file mode 100644 index 0000000..cd57d2e --- /dev/null +++ b/backend/requirements.txt @@ -0,0 +1,24 @@ +fastapi==0.115.6 +uvicorn[standard]==0.34.0 +gunicorn==23.0.0 +sqlalchemy==2.0.36 +alembic==1.14.1 +pymysql==1.1.1 +cryptography==44.0.0 +python-jose[cryptography]==3.3.0 +passlib[bcrypt]==1.7.4 +pyotp==2.9.0 +qrcode==8.0 +python-multipart==0.0.20 +pandas==2.2.3 +openpyxl==3.1.5 +pydantic==2.10.4 +pydantic-settings==2.7.1 +python-dotenv==1.0.1 +email-validator==2.2.0 +httpx==0.28.1 + +# Dev/Test +pytest==8.3.4 +pytest-asyncio==0.25.0 
+pytest-cov==6.0.0
diff --git a/backend/scripts/__init__.py b/backend/scripts/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/tests/__init__.py b/backend/tests/__init__.py
new file mode 100644
index 0000000..e69de29
diff --git a/backend/tests/conftest.py b/backend/tests/conftest.py
new file mode 100644
index 0000000..28465a6
--- /dev/null
+++ b/backend/tests/conftest.py
@@ -0,0 +1,16 @@
+# backend/tests/conftest.py
+import pytest
+
+
+@pytest.fixture
+def app_settings():
+    """Provide test settings."""
+    from app.config import Settings
+    return Settings(
+        DB_HOST="localhost",
+        DB_PORT=3306,
+        DB_NAME="dak_c2s_test",
+        DB_USER="test",
+        DB_PASSWORD="test",
+        JWT_SECRET_KEY="test-secret-key",
+    )
diff --git a/docs/plans/2026-02-23-dak-portal-design.md b/docs/plans/2026-02-23-dak-portal-design.md
new file mode 100644
index 0000000..e9e7025
--- /dev/null
+++ b/docs/plans/2026-02-23-dak-portal-design.md
@@ -0,0 +1,61 @@
+# DAK Zweitmeinungs-Portal — Design Document
+
+**Date:** 2026-02-23
+**Status:** Approved
+**Based on:** Dak_projekt_spezifikation_final.md (v4.0)
+
+---
+
+## Architecture
+
+FastAPI backend (Python) + React SPA frontend (Vite/TypeScript) + MariaDB on Hetzner 1.
+
+```
+Nginx (Plesk) → /api/* → FastAPI (Gunicorn+Uvicorn, Port 8000)
+             → /*     → React SPA (static build)
+             → MariaDB dak_c2s (localhost:3306 in prod)
+```
+
+## Decisions (deviating from the spec)
+
+| Topic | Spec | Change | Rationale |
+|-------|------|--------|-----------|
+| Dev DB | Local MariaDB on sv-frontend | Remote → Hetzner 1 | DB already exists, no duplicate setup |
+| Python (dev) | 3.11.2 | 3.13.5 (sv-frontend) | Compatible, no pinning to 3.11-only features |
+| gh CLI | Not mentioned | Installed on sv-frontend | For repo creation and PR workflow |
+
+All other decisions from the spec remain unchanged (13 items).
+
+## Tech Stack
+
+- **Backend:** Python 3.13 (dev) / 3.11 (prod), FastAPI, SQLAlchemy 2.0, Alembic, Pandas, openpyxl
+- **Frontend:** React 18, Vite, TypeScript, Tailwind CSS, shadcn/ui, Recharts
+- **DB:** MariaDB 10.11.14 (dak_c2s on Hetzner 1)
+- **Auth:** JWT (python-jose) + bcrypt (passlib) + TOTP (pyotp)
+- **E-mail:** SMTP smtp.complexcaresolutions.de:465 (SSL)
+- **Deploy:** systemd + Plesk-Nginx, GitHub webhook
+
+## Roles
+
+- `admin` — full access (Import, Coding, Reports, Users)
+- `dak_mitarbeiter` — Dashboard, ICD entry, report download
+
+## Dev Workflow
+
+- Development on sv-frontend `/home/frontend/dak_c2s/`
+- DB connection: remote to the Hetzner 1 MariaDB
+- Git: `develop` → `main`, GitHub `complexcaresolutions/dak.c2s`
+- Backend dev: `uvicorn app.main:app --reload`
+- Frontend dev: `pnpm dev` with an API proxy to the backend
+
+## Reference Data
+
+Provided by the user in `data/` (gitignored):
+- Abrechnung_DAK.xlsx
+- Berichtswesen_2024_29122024.xlsx
+- Berichtswesen_2023_31122023.xlsx
+- Sample CSVs
+
+## Implementation Phases
+
+5 phases over 6 weeks; details in the implementation plan.
diff --git a/docs/plans/2026-02-23-dak-portal-implementation.md b/docs/plans/2026-02-23-dak-portal-implementation.md
new file mode 100644
index 0000000..52dc952
--- /dev/null
+++ b/docs/plans/2026-02-23-dak-portal-implementation.md
@@ -0,0 +1,1738 @@
+# DAK Zweitmeinungs-Portal — Implementation Plan
+
+> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
+
+**Goal:** Build a full-stack portal for managing DAK second-opinion medical cases — from CSV import through ICD coding to weekly Excel reports.
+
+**Architecture:** FastAPI backend (Python, SQLAlchemy 2.0, MariaDB) serving a React SPA (Vite, TypeScript, shadcn/ui). JWT auth with RBAC (admin / dak_mitarbeiter). Deployed on Hetzner 1 via systemd + Plesk-Nginx.
+ +**Tech Stack:** Python 3.13/FastAPI/SQLAlchemy/Alembic/Pandas/openpyxl (Backend), React 18/Vite/TypeScript/Tailwind CSS/shadcn-ui/Recharts (Frontend), MariaDB 10.11.14 + +**Spec Reference:** `/home/frontend/dak_c2s/Dak_projekt_spezifikation_final.md` + +**DB Connection (Dev):** Remote to Hetzner 1 MariaDB `dak_c2s` / `dak_c2s_admin` + +--- + +## Phase 1: Project Setup + Database + Auth + +### Task 1: Project Scaffolding + +**Files:** +- Create: `backend/app/__init__.py` +- Create: `backend/app/config.py` +- Create: `backend/app/database.py` +- Create: `backend/app/main.py` +- Create: `backend/requirements.txt` +- Create: `backend/.env.example` +- Create: `backend/.env` +- Create: `.gitignore` + +**Step 1: Initialize git repo and create directory structure** + +```bash +cd /home/frontend/dak_c2s +git init +git checkout -b develop +``` + +Create full directory tree: +``` +backend/ + app/ + __init__.py + config.py + database.py + main.py + models/__init__.py + schemas/__init__.py + api/__init__.py + services/__init__.py + core/__init__.py + utils/__init__.py + alembic/ + scripts/ + tests/ + __init__.py + conftest.py +frontend/ +data/ +docs/ +``` + +**Step 2: Write requirements.txt** + +``` +# Backend dependencies +fastapi==0.115.6 +uvicorn[standard]==0.34.0 +gunicorn==23.0.0 +sqlalchemy==2.0.36 +alembic==1.14.1 +pymysql==1.1.1 +cryptography==44.0.0 +python-jose[cryptography]==3.3.0 +passlib[bcrypt]==1.7.4 +pyotp==2.9.0 +qrcode==8.0 +python-multipart==0.0.20 +pandas==2.2.3 +openpyxl==3.1.5 +pydantic==2.10.4 +pydantic-settings==2.7.1 +python-dotenv==1.0.1 +email-validator==2.2.0 +httpx==0.28.1 + +# Dev/Test +pytest==8.3.4 +pytest-asyncio==0.25.0 +pytest-cov==6.0.0 +``` + +**Step 3: Write config.py (Pydantic Settings)** + +```python +# backend/app/config.py +from pydantic_settings import BaseSettings +from functools import lru_cache + +class Settings(BaseSettings): + # Database + DB_HOST: str = "localhost" + DB_PORT: int = 3306 + DB_NAME: str = "dak_c2s" + DB_USER: 
str = "dak_c2s_admin" + DB_PASSWORD: str = "" + + # JWT + JWT_SECRET_KEY: str = "change-me-in-production" + JWT_ALGORITHM: str = "HS256" + ACCESS_TOKEN_EXPIRE_MINUTES: int = 15 + REFRESH_TOKEN_EXPIRE_DAYS: int = 7 + + # SMTP + SMTP_HOST: str = "smtp.complexcaresolutions.de" + SMTP_PORT: int = 465 + SMTP_USER: str = "noreply@complexcaresolutions.de" + SMTP_PASSWORD: str = "" + SMTP_FROM: str = "noreply@complexcaresolutions.de" + + # App + APP_NAME: str = "DAK Zweitmeinungs-Portal" + CORS_ORIGINS: str = "http://localhost:5173,https://dak.complexcaresolutions.de" + MAX_UPLOAD_SIZE: int = 20 * 1024 * 1024 # 20MB + + @property + def database_url(self) -> str: + return f"mysql+pymysql://{self.DB_USER}:{self.DB_PASSWORD}@{self.DB_HOST}:{self.DB_PORT}/{self.DB_NAME}?charset=utf8mb4" + + class Config: + env_file = ".env" + env_file_encoding = "utf-8" + +@lru_cache +def get_settings() -> Settings: + return Settings() +``` + +**Step 4: Write database.py** + +```python +# backend/app/database.py +from sqlalchemy import create_engine +from sqlalchemy.orm import sessionmaker, DeclarativeBase +from app.config import get_settings + +settings = get_settings() +engine = create_engine(settings.database_url, pool_pre_ping=True, pool_recycle=3600) +SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine) + +class Base(DeclarativeBase): + pass + +def get_db(): + db = SessionLocal() + try: + yield db + finally: + db.close() +``` + +**Step 5: Write main.py (minimal FastAPI app)** + +```python +# backend/app/main.py +from fastapi import FastAPI +from fastapi.middleware.cors import CORSMiddleware +from app.config import get_settings + +settings = get_settings() + +app = FastAPI(title=settings.APP_NAME, docs_url="/docs") + +app.add_middleware( + CORSMiddleware, + allow_origins=settings.CORS_ORIGINS.split(","), + allow_credentials=True, + allow_methods=["*"], + allow_headers=["*"], +) + +@app.get("/api/health") +def health_check(): + return {"status": "ok"} +``` + +**Step 
6: Write .env.example and .env, create .gitignore** + +`.env.example` — all keys without values. +`.env` — actual values with Hetzner 1 DB credentials. +`.gitignore` — Python, Node, `.env`, `data/`, `__pycache__/`, `venv/`, `node_modules/`, `dist/`. + +**Step 7: Create Python venv, install deps, test server starts** + +```bash +cd /home/frontend/dak_c2s/backend +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt +uvicorn app.main:app --reload --port 8000 +# Verify: GET http://localhost:8000/api/health → {"status": "ok"} +``` + +**Step 8: Commit** + +```bash +git add -A +git commit -m "feat: project scaffolding with FastAPI, config, database connection" +``` + +--- + +### Task 2: SQLAlchemy Models + +**Files:** +- Create: `backend/app/models/user.py` +- Create: `backend/app/models/case.py` +- Create: `backend/app/models/report.py` +- Create: `backend/app/models/audit.py` +- Modify: `backend/app/models/__init__.py` + +**Step 1: Write User models** + +`backend/app/models/user.py` — 4 models matching the SQL schema: +- `User` (id, username, email, password_hash, role, mfa_secret, mfa_enabled, is_active, must_change_password, last_login, failed_login_attempts, locked_until, created_at, updated_at) +- `RefreshToken` (id, user_id → FK users, token_hash, expires_at, revoked, created_at) +- `InvitationLink` (id, token, email, role, created_by → FK users, expires_at, used_at, used_by → FK users, is_active, created_at) +- `AllowedDomain` (id, domain, role, is_active, created_at) + +Use `mapped_column()` syntax (SQLAlchemy 2.0 declarative). All column types, defaults, indexes, and constraints must match the SQL schema in Spec Section 4 exactly. + +**Step 2: Write Case models** + +`backend/app/models/case.py` — 2 models: +- `Case` — 45+ columns matching spec exactly. Include all CHECK constraints as Python-level validation (SQLAlchemy `CheckConstraint`). 
Indexes: `idx_jahr_kw`, `idx_kvnr`, `idx_fallgruppe`, `idx_datum`, `idx_nachname_vorname`, `idx_pending_icd`, `idx_pending_coding`. +- `CaseICDCode` (id, case_id → FK cases ON DELETE CASCADE, icd_code, icd_hauptgruppe, created_at) + +**Step 3: Write Report models** + +`backend/app/models/report.py` — 2 models: +- `WeeklyReport` (id, jahr, kw, report_date, report_file_path, report_data [JSON], generated_by, generated_at) +- `YearlySummary` — all 40+ aggregation columns matching spec exactly (erstberatungen through ta_uebertherapie, per-fallgruppe breakdown) + +**Step 4: Write Audit models** + +`backend/app/models/audit.py` — 3 models: +- `ImportLog` (id, filename, import_type, cases_imported/skipped/updated, errors, details [JSON], imported_by, imported_at) — CHECK constraint on import_type +- `AuditLog` (id, user_id, action, entity_type, entity_id, old_values [JSON], new_values [JSON], ip_address, user_agent, created_at) +- `Notification` (id, recipient_id, notification_type, title, message, related_entity_type/id, is_read, email_sent, email_sent_at, created_at) — CHECK constraint on notification_type + +**Step 5: Wire up models/__init__.py** + +```python +# backend/app/models/__init__.py +from app.models.user import User, RefreshToken, InvitationLink, AllowedDomain +from app.models.case import Case, CaseICDCode +from app.models.report import WeeklyReport, YearlySummary +from app.models.audit import AuditLog, ImportLog, Notification + +__all__ = [ + "User", "RefreshToken", "InvitationLink", "AllowedDomain", + "Case", "CaseICDCode", + "WeeklyReport", "YearlySummary", + "AuditLog", "ImportLog", "Notification", +] +``` + +**Step 6: Verify models compile** + +```bash +cd /home/frontend/dak_c2s/backend +source venv/bin/activate +python -c "from app.models import *; print('All models loaded OK')" +``` + +**Step 7: Commit** + +```bash +git add backend/app/models/ +git commit -m "feat: SQLAlchemy models for users, cases, reports, audit" +``` + +--- + +### Task 3: Alembic 
Migrations
+
+**Files:**
+- Create: `backend/alembic.ini`
+- Create: `backend/alembic/env.py`
+- Create: `backend/alembic/versions/` (auto-generated)
+
+**Step 1: Initialize Alembic**
+
+```bash
+cd /home/frontend/dak_c2s/backend
+source venv/bin/activate
+alembic init alembic
+```
+
+**Step 2: Configure alembic/env.py**
+
+- Import `Base` from `app.database` and all models from `app.models`
+- Set `target_metadata = Base.metadata`
+- Read `sqlalchemy.url` from `app.config.get_settings().database_url`
+
+**Step 3: Generate initial migration**
+
+```bash
+alembic revision --autogenerate -m "initial schema"
+```
+
+**Step 4: Review generated migration, then apply**
+
+```bash
+alembic upgrade head
+```
+
+**Step 5: Seed allowed_domains**
+
+Write `backend/scripts/init_db.py`:
+- Insert `AllowedDomain(domain='dak.de', role='dak_mitarbeiter')` if not exists
+- Run: `python -m scripts.init_db`
+
+**Step 6: Verify tables exist on Hetzner DB**
+
+```bash
+python -c "
+from app.database import engine
+from sqlalchemy import inspect
+insp = inspect(engine)
+print(insp.get_table_names())
+"
+# Expected: ['users', 'refresh_tokens', 'invitation_links', 'allowed_domains', 'cases', 'case_icd_codes', 'weekly_reports', 'yearly_summary', 'import_log', 'audit_log', 'notifications']
+```
+
+**Step 7: Commit (from the repo root)**
+
+```bash
+git add backend/alembic.ini backend/alembic/ backend/scripts/
+git commit -m "feat: Alembic migrations, initial schema deployed"
+```
+
+---
+
+### Task 4: Core Security & Dependencies
+
+**Files:**
+- Create: `backend/app/core/security.py`
+- Create: `backend/app/core/dependencies.py`
+- Create: `backend/app/core/exceptions.py`
+
+**Step 1: Write security.py**
+
+```python
+# backend/app/core/security.py
+from datetime import datetime, timedelta, timezone
+from jose import jwt, JWTError
+from passlib.context import CryptContext
+import pyotp
+import secrets
+from app.config import get_settings
+
+settings = get_settings()
+pwd_context = CryptContext(schemes=["bcrypt"],
deprecated="auto") + +def hash_password(password: str) -> str: + return pwd_context.hash(password) + +def verify_password(plain: str, hashed: str) -> bool: + return pwd_context.verify(plain, hashed) + +def create_access_token(user_id: int, role: str) -> str: + expire = datetime.now(timezone.utc) + timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES) + return jwt.encode({"sub": str(user_id), "role": role, "exp": expire}, settings.JWT_SECRET_KEY, algorithm=settings.JWT_ALGORITHM) + +def create_refresh_token() -> str: + return secrets.token_urlsafe(64) + +def decode_access_token(token: str) -> dict: + return jwt.decode(token, settings.JWT_SECRET_KEY, algorithms=[settings.JWT_ALGORITHM]) + +def generate_mfa_secret() -> str: + return pyotp.random_base32() + +def verify_mfa_code(secret: str, code: str) -> bool: + totp = pyotp.TOTP(secret) + return totp.verify(code) + +def get_mfa_uri(secret: str, email: str) -> str: + totp = pyotp.TOTP(secret) + return totp.provisioning_uri(name=email, issuer_name=settings.APP_NAME) +``` + +**Step 2: Write dependencies.py** + +```python +# backend/app/core/dependencies.py +from fastapi import Depends, HTTPException, status +from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials +from sqlalchemy.orm import Session +from app.database import get_db +from app.core.security import decode_access_token +from app.models.user import User +from jose import JWTError + +security = HTTPBearer() + +def get_current_user( + credentials: HTTPAuthorizationCredentials = Depends(security), + db: Session = Depends(get_db), +) -> User: + try: + payload = decode_access_token(credentials.credentials) + user_id = int(payload["sub"]) + except (JWTError, KeyError, ValueError): + raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token") + user = db.query(User).filter(User.id == user_id, User.is_active == True).first() + if not user: + raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="User not found") 
+ return user + +def require_admin(user: User = Depends(get_current_user)) -> User: + if user.role != "admin": + raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required") + return user +``` + +**Step 3: Write exceptions.py** + +Custom exceptions: `CaseNotFound`, `DuplicateCase`, `InvalidImportFile`, `ICDValidationError`. Each maps to appropriate HTTP status codes. + +**Step 4: Write tests for security functions** + +`backend/tests/test_security.py`: +- `test_hash_and_verify_password` +- `test_create_and_decode_access_token` +- `test_invalid_token_raises` +- `test_mfa_secret_and_verify` + +```bash +cd /home/frontend/dak_c2s/backend +pytest tests/test_security.py -v +``` + +**Step 5: Commit** + +```bash +git add backend/app/core/ backend/tests/test_security.py +git commit -m "feat: JWT auth, bcrypt, MFA, dependency injection" +``` + +--- + +### Task 5: Auth Schemas & API + +**Files:** +- Create: `backend/app/schemas/auth.py` +- Create: `backend/app/schemas/user.py` +- Create: `backend/app/api/auth.py` +- Create: `backend/app/services/auth_service.py` +- Modify: `backend/app/main.py` (add router) + +**Step 1: Write auth schemas** + +`backend/app/schemas/auth.py`: +- `LoginRequest` (email, password, mfa_code: Optional) +- `TokenResponse` (access_token, refresh_token, token_type, user: UserResponse) +- `RegisterRequest` (username, email, password, invitation_token: Optional) +- `RefreshRequest` (refresh_token) +- `MFASetupResponse` (secret, qr_uri) +- `MFAVerifyRequest` (code) + +`backend/app/schemas/user.py`: +- `UserResponse` (id, username, email, role, mfa_enabled, is_active, last_login, created_at) +- `UserCreate` (username, email, password, role) +- `UserUpdate` (username, email, role, is_active — all Optional) + +**Step 2: Write auth_service.py** + +Business logic: +- `authenticate_user(db, email, password, mfa_code)` — check credentials, account lock (5 attempts, 30 min), MFA verification, update last_login +- `register_user(db, 
data)` — domain whitelist check OR invitation token validation +- `create_tokens(db, user)` — generate access + refresh, store refresh hash in DB +- `refresh_access_token(db, refresh_token)` — validate, return new access token +- `revoke_refresh_token(db, refresh_token)` — mark as revoked + +**Step 3: Write auth API router** + +`backend/app/api/auth.py`: +- `POST /api/auth/login` → authenticate, return tokens +- `POST /api/auth/register` → domain check or invitation, create user +- `POST /api/auth/refresh` → new access token +- `POST /api/auth/logout` → revoke refresh token +- `POST /api/auth/mfa/setup` → generate secret + QR URI (admin only) +- `POST /api/auth/mfa/verify` → verify TOTP code, enable MFA + +**Step 4: Register router in main.py** + +```python +from app.api.auth import router as auth_router +app.include_router(auth_router, prefix="/api/auth", tags=["auth"]) +``` + +**Step 5: Write auth tests** + +`backend/tests/test_auth.py`: +- `test_register_with_valid_domain` +- `test_register_with_invalid_domain_rejected` +- `test_register_with_invitation_token` +- `test_login_success` +- `test_login_wrong_password` +- `test_account_lockout_after_5_failures` +- `test_refresh_token_flow` +- `test_logout_revokes_token` + +Use `httpx.AsyncClient` with `app` for integration tests. Use a test fixture for DB session. 
+ +```bash +pytest tests/test_auth.py -v +``` + +**Step 6: Commit** + +```bash +git add backend/app/schemas/ backend/app/api/auth.py backend/app/services/auth_service.py backend/tests/test_auth.py +git commit -m "feat: auth system — login, register, refresh, MFA, domain whitelist" +``` + +--- + +### Task 6: Admin API + Audit Middleware + +**Files:** +- Create: `backend/app/api/admin.py` +- Create: `backend/app/services/audit_service.py` +- Create: `backend/app/api/notifications.py` +- Modify: `backend/app/main.py` (add routers + audit middleware) + +**Step 1: Write audit_service.py** + +```python +# Helper to log actions +def log_action(db, user_id, action, entity_type, entity_id, old_values, new_values, ip, user_agent): + entry = AuditLog(user_id=user_id, action=action, entity_type=entity_type, + entity_id=entity_id, old_values=old_values, new_values=new_values, + ip_address=ip, user_agent=user_agent) + db.add(entry) + db.commit() +``` + +**Step 2: Write admin API** + +`backend/app/api/admin.py`: +- `GET /api/admin/users` → list users (admin only) +- `POST /api/admin/users` → create user +- `PUT /api/admin/users/{id}` → update role, active status +- `POST /api/admin/invitations` → create invitation link (token + expiry + optional email) +- `GET /api/admin/invitations` → list invitations +- `GET /api/admin/audit-log` → paginated audit log + +**Step 3: Write notifications API** + +`backend/app/api/notifications.py`: +- `GET /api/notifications` → unread + recent for current user +- `PUT /api/notifications/{id}/read` → mark single as read +- `PUT /api/notifications/read-all` → mark all as read + +**Step 4: Write create_admin.py script** + +`backend/scripts/create_admin.py` — interactive script to create the first admin user (prompts for username, email, password). 
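
One possible shape for this bootstrap script, as a hypothetical sketch: the validation rules are illustrative assumptions, `hash_fn` stands in for `app.core.security.hash_password`, and the `SessionLocal` insert is omitted.

```python
# Hypothetical sketch of backend/scripts/create_admin.py. The minimum password
# length and the email regex are illustrative assumptions, not taken from the
# spec; hash_fn stands in for app.core.security.hash_password.
import re

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")


def build_admin(username: str, email: str, password: str, hash_fn) -> dict:
    """Validate the prompted inputs and return column values for the first admin."""
    if not username.strip():
        raise ValueError("username must not be empty")
    if not EMAIL_RE.match(email):
        raise ValueError(f"invalid email: {email!r}")
    if len(password) < 12:
        raise ValueError("password must be at least 12 characters")
    return {
        "username": username.strip(),
        "email": email.lower(),
        "role": "admin",
        "is_active": True,
        "must_change_password": False,
        "password_hash": hash_fn(password),
    }


# Non-interactive demonstration; the script itself would gather these values
# with input() / getpass.getpass() and then insert the row via SessionLocal.
admin = build_admin("admin", "Admin@Example.org", "correct-horse-battery",
                    hash_fn=lambda p: "<bcrypt-hash>")
print(admin["role"], admin["email"])
```

Run once after `alembic upgrade head` to bootstrap the first login; further users can then be managed through the admin API.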
+ +**Step 5: Register routers in main.py, test endpoints** + +```bash +uvicorn app.main:app --reload --port 8000 +# Test: POST /api/auth/register, POST /api/auth/login +# Test: GET /api/admin/users (with admin token) +``` + +**Step 6: Commit** + +```bash +git add backend/app/api/admin.py backend/app/api/notifications.py backend/app/services/audit_service.py backend/scripts/create_admin.py +git commit -m "feat: admin API, audit logging, notifications, create_admin script" +``` + +--- + +## Phase 2: Import & ICD Workflow + +### Task 7: Utility Functions + +**Files:** +- Create: `backend/app/utils/fallgruppe_map.py` +- Create: `backend/app/utils/kw_utils.py` +- Create: `backend/app/utils/validators.py` + +**Step 1: Write fallgruppe_map.py** + +```python +MODUL_TO_FALLGRUPPE = { + "Zweitmeinung Onkologie": "onko", + "Zweitmeinung Kardiologie": "kardio", + "Zweitmeinung Intensiv": "intensiv", + "Zweitmeinung Gallenblase": "galle", + "Zweitmeinung Schilddrüse": "sd", +} + +def map_modul_to_fallgruppe(modul: str) -> str: + modul = modul.strip() + if modul in MODUL_TO_FALLGRUPPE: + return MODUL_TO_FALLGRUPPE[modul] + modul_lower = modul.lower() + if "begutachtung" in modul_lower: + # Derive from context — check for keywords + for key, val in [("onko", "onko"), ("kardio", "kardio"), ("intensiv", "intensiv"), + ("galle", "galle"), ("schilddrüse", "sd")]: + if key in modul_lower: + return val + raise ValueError(f"Cannot map module: {modul}") +``` + +**Step 2: Write kw_utils.py** + +- `date_to_kw(d: date) -> int` — ISO calendar week +- `date_to_jahr(d: date) -> int` — ISO calendar year (can differ from d.year in week 1/53) +- `parse_german_date(s: str) -> date` — handles "DD.MM.YY" and "DD.MM.YYYY", edge cases like "29.08.0196" + +**Step 3: Write validators.py** + +- `validate_icd(code: str) -> str` — regex `^[A-Z]\d{2}(\.\d{1,2})?$`, normalize uppercase, strip +- `validate_kvnr(kvnr: str) -> str` — format check (letter + 9 digits) +- `normalize_icd_hauptgruppe(code: str) -> 
str` — extract letter + first 2 digits (e.g., "C50.1" → "C50") + +**Step 4: Write tests** + +`backend/tests/test_utils.py`: +- Test all mappings including "Begutachtung" edge cases +- Test KW calculation across year boundaries +- Test German date parsing with edge cases ("29.08.0196") +- Test ICD validation (valid, invalid, normalization) +- Test KVNR validation + +```bash +pytest tests/test_utils.py -v +``` + +**Step 5: Commit** + +```bash +git add backend/app/utils/ backend/tests/test_utils.py +git commit -m "feat: utility functions — fallgruppe mapping, KW calc, ICD/KVNR validation" +``` + +--- + +### Task 8: CRM CSV Parser + +**Files:** +- Create: `backend/app/services/csv_parser.py` +- Create: `backend/tests/test_csv_parser.py` + +**Step 1: Write failing tests for CSV parser** + +Test cases from spec: +- Normal row: `"Tonn | Regina | 28.04.1960 | D410126355"` → nachname=Tonn, vorname=Regina, geburtsdatum=1960-04-28, kvnr=D410126355 +- Missing KVNR: `"Daum | Luana | 05.02.2016 |"` → kvnr=None +- Bad date: `"Name | Vorname | 29.08.0196 | X123"` → geburtsdatum=None (log warning) +- Date format: `"02.02.26, 08:50"` → 2026-02-02 +- Modul mapping: `"Zweitmeinung Onkologie"` → `"onko"` + +```bash +pytest tests/test_csv_parser.py -v +# Expected: FAIL (csv_parser.py doesn't exist yet) +``` + +**Step 2: Implement csv_parser.py** + +```python +# backend/app/services/csv_parser.py +import csv +import io +from dataclasses import dataclass +from datetime import date +from typing import Optional +from app.utils.fallgruppe_map import map_modul_to_fallgruppe +from app.utils.kw_utils import parse_german_date, date_to_kw, date_to_jahr + +@dataclass +class ParsedCase: + nachname: str + vorname: Optional[str] + geburtsdatum: Optional[date] + kvnr: Optional[str] + thema: str + fallgruppe: str + datum: date + jahr: int + kw: int + crm_ticket_id: Optional[str] + +def parse_hauptkontakt(raw: str) -> dict: + """Parse pipe-delimited contact: 'Nachname | Vorname | Geburtsdatum | KVNR'""" 
+ parts = [p.strip() for p in raw.split("|")] + result = {"nachname": parts[0] if len(parts) > 0 else ""} + result["vorname"] = parts[1] if len(parts) > 1 and parts[1] else None + result["geburtsdatum"] = None + if len(parts) > 2 and parts[2]: + try: + result["geburtsdatum"] = parse_german_date(parts[2]) + except ValueError: + pass # Log warning, continue + result["kvnr"] = parts[3] if len(parts) > 3 and parts[3] else None + return result + +def parse_csv(content: bytes, filename: str) -> list[ParsedCase]: + """Parse CRM CSV file, return list of ParsedCase.""" + text = content.decode("utf-8-sig") # Handle BOM + reader = csv.DictReader(io.StringIO(text)) + cases = [] + for row in reader: + kontakt = parse_hauptkontakt(row.get("Hauptkontakt", "")) + datum_str = row.get("Erstellungsdatum", "") + datum = parse_german_date(datum_str.split(",")[0].strip()) if datum_str else date.today() + modul = row.get("Modul", "") + fallgruppe = map_modul_to_fallgruppe(modul) + cases.append(ParsedCase( + nachname=kontakt["nachname"], + vorname=kontakt["vorname"], + geburtsdatum=kontakt["geburtsdatum"], + kvnr=kontakt["kvnr"], + thema=row.get("Thema", ""), + fallgruppe=fallgruppe, + datum=datum, + jahr=date_to_jahr(datum), + kw=date_to_kw(datum), + crm_ticket_id=row.get("Name", None), + )) + return cases +``` + +**Step 3: Run tests, verify pass** + +```bash +pytest tests/test_csv_parser.py -v +``` + +**Step 4: Commit** + +```bash +git add backend/app/services/csv_parser.py backend/tests/test_csv_parser.py +git commit -m "feat: CRM CSV parser with pipe-delimited contact parsing" +``` + +--- + +### Task 9: Import Service + Duplicate Detection + +**Files:** +- Create: `backend/app/services/import_service.py` +- Create: `backend/app/schemas/import_schemas.py` +- Create: `backend/tests/test_import.py` + +**Step 1: Write import schemas** + +`backend/app/schemas/import_schemas.py`: +- `ImportRow` — parsed case data for preview +- `ImportPreview` (total, new_cases, duplicates, rows: 
list[ImportRow]) +- `ImportResult` (imported, skipped, updated, errors: list[str]) + +**Step 2: Write import_service.py** + +Key logic: +- `generate_fall_id(case)` — format: `{YYYY}-{KW:02d}-{fallgruppe}-{nachname}` (must be unique) +- `check_duplicate(db, parsed_case)` — match on (nachname, vorname, geburtsdatum, fallgruppe, datum) or fall_id +- `preview_import(db, parsed_cases)` → `ImportPreview` +- `confirm_import(db, parsed_cases, user_id)` → `ImportResult` — insert new cases, skip duplicates, log to import_log +- `import_icd_xlsx(db, file, user_id)` — parse Excel with ICD column, match cases, update icd field + +**Step 3: Write tests** + +`backend/tests/test_import.py`: +- `test_generate_fall_id_format` +- `test_duplicate_detection_exact_match` +- `test_duplicate_detection_no_match` +- `test_import_creates_cases_in_db` +- `test_import_skips_duplicates` +- `test_import_logs_created` + +**Step 4: Run tests** + +```bash +pytest tests/test_import.py -v +``` + +**Step 5: Commit** + +```bash +git add backend/app/services/import_service.py backend/app/schemas/import_schemas.py backend/tests/test_import.py +git commit -m "feat: import service with duplicate detection and fall_id generation" +``` + +--- + +### Task 10: ICD Service + +**Files:** +- Create: `backend/app/services/icd_service.py` +- Create: `backend/tests/test_icd_service.py` + +**Step 1: Write failing tests** + +Test ICD normalization: +- `"c50.1"` → `"C50.1"` (uppercase) +- `"C50.1, C79.5"` → `["C50.1", "C79.5"]` (split multi-ICD) +- `"C50.1;C79.5"` → `["C50.1", "C79.5"]` (semicolon separator) +- `"XYZ"` → validation error +- Hauptgruppe: `"C50.1"` → `"C50"` + +**Step 2: Implement icd_service.py** + +- `normalize_icd(raw: str) -> list[str]` — split by comma/semicolon, strip, uppercase, validate each +- `save_icd_for_case(db, case_id, icd_raw, user_id)` — update `cases.icd`, create `CaseICDCode` entries, set `icd_entered_by/at` +- `get_pending_icd_cases(db, filters)` — cases where `icd IS NULL` +- 
`generate_coding_template(db, filters) -> bytes` — openpyxl workbook with case ID, name, fallgruppe, empty ICD column
+
+**Step 3: Run tests**
+
+```bash
+pytest tests/test_icd_service.py -v
+```
+
+**Step 4: Commit**
+
+```bash
+git add backend/app/services/icd_service.py backend/tests/test_icd_service.py
+git commit -m "feat: ICD service — normalize, split, validate, coding template"
+```
+
+---
+
+### Task 11: Import & Cases API Routes
+
+**Files:**
+- Create: `backend/app/api/import_router.py`
+- Create: `backend/app/api/cases.py`
+- Create: `backend/app/schemas/case.py`
+- Modify: `backend/app/main.py` (add routers)
+
+**Step 1: Write case schemas**
+
+`backend/app/schemas/case.py`:
+- `CaseResponse` — full case representation
+- `CaseListResponse` (items: list[CaseResponse], total, page, per_page)
+- `CaseUpdate` — partial update fields
+- `ICDUpdate` (icd: str)
+- `CodingUpdate` (gutachten_typ: str, therapieaenderung: str, ta_diagnosekorrektur, ta_unterversorgung, ta_uebertherapie)
+
+**Step 2: Write import_router.py**
+
+- `POST /api/import/csv` → accept file upload, parse, return ImportPreview
+- `POST /api/import/csv/confirm` → confirm import from preview session
+- `POST /api/import/icd-xlsx` → upload ICD-coded Excel (DAK role)
+- `POST /api/import/historical` → one-time import from Abrechnung_DAK.xlsx (admin only)
+- `GET /api/import/log` → import history
+
+**Step 3: Write cases.py**
+
+Note: FastAPI matches routes in declaration order, so the static paths (`pending-icd`, `pending-coding`, `coding-template`) must be declared before `GET /api/cases/{id}`; otherwise `/{id}` captures them as path parameters:
+
+- `GET /api/cases` → paginated list with filters (jahr, kw, fallgruppe, has_icd, has_coding)
+- `GET /api/cases/pending-icd` → cases without ICD
+- `GET /api/cases/pending-coding` → gutachten without typ
+- `GET /api/cases/coding-template` → download .xlsx template
+- `GET /api/cases/{id}` → single case
+- `PUT /api/cases/{id}` → update case fields (admin)
+- `PUT /api/cases/{id}/icd` → set ICD (DAK or admin)
+- `PUT /api/cases/{id}/coding` → set gutachten_typ + therapieaenderung (admin)
+
+**Step 4: Register routers in main.py**
+
+**Step 5: Test endpoints manually**
+```bash +uvicorn app.main:app --reload --port 8000 +# Upload a CSV, check preview, confirm import +``` + +**Step 6: Commit** + +```bash +git add backend/app/api/import_router.py backend/app/api/cases.py backend/app/schemas/case.py +git commit -m "feat: import and cases API endpoints" +``` + +--- + +### Task 12: Historical Excel Import + +**Files:** +- Create: `backend/app/services/excel_import.py` +- Create: `backend/scripts/import_historical.py` + +**Step 1: Write excel_import.py** + +Parse `Abrechnung_DAK.xlsx`: +- Read sheets: "2026", "2025", "2024", "2023", "2020-2022" (skip "Tabelle1") +- Map 39 columns to Case model fields (column positions from spec) +- Handle "2020-2022" sheet which has extra "Jahr" column at position 2 +- Convert German date formats, boolean fields ("Ja"/"Nein"/1/0/empty) +- Handle empty rows, merged cells +- Generate fall_id for each imported case +- Deduplicate against existing DB records + +**Step 2: Write import_historical.py script** + +```python +# backend/scripts/import_historical.py +"""One-time script: Import all cases from Abrechnung_DAK.xlsx into DB.""" +# Usage: python -m scripts.import_historical /path/to/Abrechnung_DAK.xlsx +``` + +- Accept file path as argument +- Print progress per sheet +- Print summary (imported, skipped, errors) +- Log to import_log table + +**Step 3: Commit** (actual import runs when data files are provided) + +```bash +git add backend/app/services/excel_import.py backend/scripts/import_historical.py +git commit -m "feat: historical Excel import (Abrechnung_DAK.xlsx → DB)" +``` + +--- + +### Task 13: Notification Service + +**Files:** +- Create: `backend/app/services/notification_service.py` + +**Step 1: Implement notification_service.py** + +- `send_notification(db, recipient_id, type, title, message, entity_type, entity_id)` — create in-app notification + send email +- `send_email(to, subject, body)` — SMTP via `smtplib.SMTP_SSL` on port 465 +- Trigger points: + - `new_cases_uploaded` → notify all 
DAK users when admin uploads CSV + - `icd_entered` → notify admin when DAK user enters ICD + - `icd_uploaded` → notify admin when DAK user uploads ICD Excel + - `report_ready` → notify all users when report generated + - `coding_completed` → notify DAK users when coding done + +**Step 2: Commit** + +```bash +git add backend/app/services/notification_service.py +git commit -m "feat: notification service — in-app + SMTP email" +``` + +--- + +## Phase 3: Reports & Coding + +### Task 14: Report Service + +**Files:** +- Create: `backend/app/services/report_service.py` +- Create: `backend/app/services/vorjahr_service.py` +- Create: `backend/tests/test_report_service.py` + +**Step 1: Write report_service.py** + +Implement all 5 sheet calculations (using pandas queries against DB): + +**Sheet 1 "Auswertung KW gesamt":** +- Per KW: count Erstberatungen, Unterlagen (unterlagen=1), Ablehnungen (ablehnung=1), Keine Rückmeldung (NOT unterlagen AND NOT ablehnung AND NOT abbruch), Gutachten (gutachten=1) +- Totals row with percentages + +**Sheet 2 "Auswertung nach Fachgebieten":** +- Per KW, per fallgruppe (onko, kardio, intensiv, galle, sd): Anzahl, Gutachten, Keine RM/Ablehnung + +**Sheet 3 "Auswertung Gutachten":** +- Per KW, per fallgruppe + gesamt: Gutachten count, Alternative (gutachten_typ='Alternative'), Bestätigung (gutachten_typ='Bestätigung') + +**Sheet 4 "Auswertung Therapieänderungen":** +- Per KW: Gutachten, TA Ja, TA Nein, Diagnosekorrektur, Unterversorgung, Übertherapie + +**Sheet 5 "Auswertung ICD onko":** +- ICD codes from onko cases, normalized uppercase, sorted, with count + +**Step 2: Write vorjahr_service.py** + +- `get_vorjahr_data(db, jahr)` → aggregated data from previous year for comparison +- Reads from `yearly_summary` table (cached) or calculates live + +**Step 3: Write tests** + +`backend/tests/test_report_service.py`: +- Insert known test data, verify each sheet calculation returns correct values +- Test year-over-year comparison +- Test edge cases 
(empty weeks, KW 53) + +```bash +pytest tests/test_report_service.py -v +``` + +**Step 4: Commit** + +```bash +git add backend/app/services/report_service.py backend/app/services/vorjahr_service.py backend/tests/test_report_service.py +git commit -m "feat: report service — all 5 sheet calculations + year-over-year" +``` + +--- + +### Task 15: Excel Export (Berichtswesen Format) + +**Files:** +- Create: `backend/app/services/excel_export.py` +- Create: `backend/scripts/import_berichtswesen.py` + +**Step 1: Write excel_export.py** + +Using openpyxl, generate `.xlsx` matching the exact Berichtswesen format from spec section 3.3: + +- Sheet 1 layout: Row 1 "Gesamtübersicht", Row 2 year headers, Rows 3-7 summary with percentages, Row 10 column headers, Row 11+ data per KW +- Sheet 2 layout: Fallgruppen columns (5 groups × 3 columns) +- Sheet 3 layout: Gutachten breakdown (6 groups × 3 columns) +- Sheet 4 layout: Therapieänderungen (Gutachten, TA Ja/Nein, Diagnosekorrektur, Unterversorgung, Übertherapie) +- Sheet 5 layout: ICD onko (ICD | Anzahl) +- Apply formatting: headers bold, percentage columns formatted, column widths + +**Step 2: Write import_berichtswesen.py** + +One-time script to import previous years' Berichtswesen data into `yearly_summary` table for year-over-year comparisons. 
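Looking back at Task 10, the ICD normalization rules can be sketched in a few lines. This is a hedged sketch, not the final `icd_service.py`: the validation regex (letter, two digits, optional decimal part) is an assumption; the real service may validate against the full ICD-10-GM catalogue instead.

```python
import re

# Assumed code shape: one letter, two digits, optional ".d" or ".dd" suffix.
ICD_RE = re.compile(r"^[A-Z]\d{2}(\.\d{1,2})?$")

def normalize_icd(raw: str) -> list[str]:
    """Split on comma/semicolon, strip, uppercase, validate each code."""
    codes = [c.strip().upper() for c in re.split(r"[,;]", raw) if c.strip()]
    for code in codes:
        if not ICD_RE.match(code):
            raise ValueError(f"invalid ICD code: {code!r}")
    return codes

def hauptgruppe(code: str) -> str:
    """Reduce a full code to its Hauptgruppe: 'C50.1' -> 'C50'."""
    return code.split(".", 1)[0]
```

`normalize_icd("C50.1;C79.5")` yields `["C50.1", "C79.5"]`, matching the Task 10 test cases.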
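As a cross-check for the numbers this export lays out, the Sheet 1 per-KW tally from Task 14 can be sketched with stdlib Python. The flag names (`unterlagen`, `ablehnung`, `abbruch`, `gutachten`) are assumed to be 0/1 fields on the Case model, as described in the spec; the real report_service works against the DB instead of dicts.

```python
from collections import defaultdict

def tally_sheet1(cases: list[dict]) -> dict[int, dict[str, int]]:
    """Per-KW counts: Erstberatungen, Unterlagen, Ablehnungen, Keine RM, Gutachten."""
    weeks: dict[int, dict[str, int]] = defaultdict(lambda: {
        "erstberatungen": 0, "unterlagen": 0, "ablehnungen": 0,
        "keine_rm": 0, "gutachten": 0,
    })
    for c in cases:
        w = weeks[c["kw"]]
        w["erstberatungen"] += 1  # every imported case counts as an Erstberatung
        if c.get("unterlagen"):
            w["unterlagen"] += 1
        if c.get("ablehnung"):
            w["ablehnungen"] += 1
        # "Keine Rückmeldung" = NOT unterlagen AND NOT ablehnung AND NOT abbruch
        if not (c.get("unterlagen") or c.get("ablehnung") or c.get("abbruch")):
            w["keine_rm"] += 1
        if c.get("gutachten"):
            w["gutachten"] += 1
    return dict(weeks)
```

The totals row and percentages of Sheet 1 follow directly from these per-KW counts.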
+ +**Step 3: Commit** + +```bash +git add backend/app/services/excel_export.py backend/scripts/import_berichtswesen.py +git commit -m "feat: Excel export in exact Berichtswesen format + historical import" +``` + +--- + +### Task 16: Coding Service & Reports API + +**Files:** +- Create: `backend/app/services/coding_service.py` +- Create: `backend/app/api/coding.py` +- Create: `backend/app/api/reports.py` +- Create: `backend/app/schemas/report.py` +- Create: `backend/app/services/excel_sync.py` + +**Step 1: Write coding_service.py** + +- `get_coding_queue(db, filters)` — cases where `gutachten=1 AND gutachten_typ IS NULL` +- `update_coding(db, case_id, gutachten_typ, therapieaenderung, ta_*, user_id)` — set coding fields, log audit +- `batch_update_coding(db, updates: list, user_id)` — mass coding for historical data + +**Step 2: Write report schemas** + +`backend/app/schemas/report.py`: +- `DashboardKPIs` (total_cases, pending_icd, pending_coding, reports_generated, current_year_stats) +- `WeeklyData` (kw, erstberatungen, unterlagen, ablehnungen, keine_rm, gutachten) +- `ReportMeta` (id, jahr, kw, generated_at, download_url) + +**Step 3: Write coding API** + +`backend/app/api/coding.py`: +- `GET /api/coding/queue` → paginated coding queue (admin) +- `PUT /api/coding/{id}` → update single case coding (admin) + +**Step 4: Write reports API** + +`backend/app/api/reports.py`: +- `GET /api/reports/dashboard` → live KPIs + chart data +- `GET /api/reports/weekly/{jahr}/{kw}` → specific week data +- `POST /api/reports/generate` → generate .xlsx, save to disk + DB, return metadata +- `GET /api/reports/download/{id}` → serve generated .xlsx file +- `GET /api/reports/list` → all generated reports + +**Step 5: Write excel_sync.py** + +- `sync_db_to_excel(db)` → export current DB state to Abrechnung_DAK.xlsx format +- `sync_excel_to_db(db, file)` → import changes from edited Excel back to DB +- Triggered via `POST /api/admin/excel-sync` + +**Step 6: Register all new routers in 
main.py** + +**Step 7: Commit** + +```bash +git add backend/app/services/coding_service.py backend/app/services/excel_sync.py backend/app/api/coding.py backend/app/api/reports.py backend/app/schemas/report.py +git commit -m "feat: coding queue, reports API, Excel sync" +``` + +--- + +### Task 17: Push Backend to GitHub + +**Step 1: Create GitHub repo** + +```bash +gh repo create complexcaresolutions/dak.c2s --private --source=/home/frontend/dak_c2s --push +``` + +**Step 2: Push develop branch** + +```bash +cd /home/frontend/dak_c2s +git push -u origin develop +``` + +--- + +## Phase 4: Frontend + +### Task 18: Frontend Setup + +**Files:** +- Create: `frontend/` (Vite scaffold) +- Create: `frontend/vite.config.ts` +- Create: `frontend/tailwind.config.js` +- Create: `frontend/src/main.tsx` +- Create: `frontend/src/App.tsx` + +**Step 1: Scaffold React + Vite + TypeScript** + +```bash +cd /home/frontend/dak_c2s +pnpm create vite frontend --template react-ts +cd frontend +pnpm install +``` + +**Step 2: Install dependencies** + +```bash +pnpm add axios react-router-dom recharts +pnpm add -D tailwindcss @tailwindcss/vite +``` + +**Step 3: Configure Tailwind** + +Add Tailwind via `@tailwindcss/vite` plugin in `vite.config.ts`. Add `@import "tailwindcss"` in CSS. 
+
+**Step 4: Configure Vite API proxy**
+
+```typescript
+// frontend/vite.config.ts
+import { defineConfig } from 'vite';
+import react from '@vitejs/plugin-react';
+import tailwindcss from '@tailwindcss/vite';
+
+export default defineConfig({
+  plugins: [react(), tailwindcss()],
+  server: {
+    proxy: {
+      '/api': 'http://localhost:8000',
+    },
+  },
+});
+```
+
+**Step 5: Initialize shadcn/ui**
+
+```bash
+pnpm dlx shadcn@latest init
+# Select: TypeScript, default style, CSS variables
+```
+
+**Step 6: Verify dev server starts**
+
+```bash
+pnpm dev
+# Visit http://localhost:5173
+```
+
+**Step 7: Commit**
+
+```bash
+git add frontend/
+git commit -m "feat: frontend setup — React, Vite, TypeScript, Tailwind, shadcn/ui"
+```
+
+---
+
+### Task 19: Auth Context & API Client
+
+**Files:**
+- Create: `frontend/src/services/api.ts`
+- Create: `frontend/src/services/authService.ts`
+- Create: `frontend/src/context/AuthContext.tsx`
+- Create: `frontend/src/hooks/useAuth.ts`
+- Create: `frontend/src/types/index.ts`
+- Create: `frontend/src/components/layout/ProtectedRoute.tsx`
+
+**Step 1: Write TypeScript types**
+
+`frontend/src/types/index.ts`:
+- `User` (id, username, email, role, mfa_enabled, is_active, last_login)
+- `LoginRequest`, `RegisterRequest`, `TokenResponse`
+- `Case`, `CaseListResponse`, `ImportPreview`, `ImportResult`
+- `DashboardKPIs`, `WeeklyData`, `ReportMeta`
+- `Notification`
+
+**Step 2: Write API client with JWT interceptor**
+
+`frontend/src/services/api.ts`:
+- Axios instance with base URL `/api`
+- Request interceptor: attach access token from localStorage
+- Response interceptor: on 401, try refresh token, retry original request
+- If refresh fails, redirect to login
+
+**Step 3: Write authService.ts**
+
+- `login(email, password, mfaCode?)` → store tokens, return user
+- `register(data)` → create account
+- `logout()` → call API, clear tokens
+- `refreshToken()` → get new access token
+
+**Step 4: Write AuthContext + useAuth hook**
+
+- `AuthProvider` wraps app, stores user + loading state
+- `useAuth()` → `{ user, login, logout, register, isAdmin, isLoading }`
+- On 
mount: check stored token, try refresh
+
+**Step 5: Write ProtectedRoute**
+
+- If not authenticated → redirect to `/login`
+- If `requireAdmin` and user is not admin → show 403
+- Otherwise render children
+
+**Step 6: Commit**
+
+```bash
+git add frontend/src/
+git commit -m "feat: auth context, API client with JWT refresh, protected routes"
+```
+
+---
+
+### Task 20: Login & Register Pages
+
+**Files:**
+- Create: `frontend/src/pages/LoginPage.tsx`
+- Create: `frontend/src/pages/RegisterPage.tsx`
+- Modify: `frontend/src/App.tsx` (add routes)
+
+**Step 1: Install shadcn components**
+
+```bash
+pnpm dlx shadcn@latest add button input label card form
+```
+
+**Step 2: Write LoginPage**
+
+- Email + password form
+- Optional MFA code field (shown after first login attempt if MFA enabled)
+- Error display for invalid credentials / account locked
+- Link to register
+
+**Step 3: Write RegisterPage**
+
+- Username, email, password fields
+- Optional invitation token field (from URL param)
+- Domain validation feedback (@dak.de)
+
+**Step 4: Wire up React Router in App.tsx**
+
+```tsx
+<Routes>
+  <Route path="/login" element={<LoginPage />} />
+  <Route path="/register" element={<RegisterPage />} />
+  <Route path="/*" element={<ProtectedRoute><AppLayout /></ProtectedRoute>} />
+</Routes>
+```
+
+**Step 5: Commit**
+
+```bash
+git add frontend/src/pages/ frontend/src/App.tsx
+git commit -m "feat: login and register pages with MFA support"
+```
+
+---
+
+### Task 21: App Layout
+
+**Files:**
+- Create: `frontend/src/components/layout/AppLayout.tsx`
+- Create: `frontend/src/components/layout/Sidebar.tsx`
+- Create: `frontend/src/components/layout/Header.tsx`
+
+**Step 1: Install shadcn components**
+
+```bash
+pnpm dlx shadcn@latest add avatar dropdown-menu separator sheet badge
+```
+
+**Step 2: Write Sidebar**
+
+Navigation items (role-aware):
+- Dashboard (all)
+- Fälle (all)
+- Import (admin)
+- ICD-Eingabe (dak_mitarbeiter)
+- Coding (admin)
+- Berichte (all)
+- Admin (admin only): Users, Einladungen, Audit-Log
+
+**Step 3: Write Header**
+
+- App title
+- NotificationBell (placeholder for now)
+- User dropdown (profile, logout)
+
+**Step 4: 
Write AppLayout**
+
+- Sidebar + Header + `<Outlet />` for page content
+- Responsive: collapsible sidebar on mobile
+
+**Step 5: Commit**
+
+```bash
+git add frontend/src/components/layout/
+git commit -m "feat: app layout with role-aware sidebar and header"
+```
+
+---
+
+### Task 22: Dashboard Page
+
+**Files:**
+- Create: `frontend/src/pages/DashboardPage.tsx`
+- Create: `frontend/src/components/dashboard/KPICards.tsx`
+- Create: `frontend/src/components/dashboard/WeeklyChart.tsx`
+- Create: `frontend/src/components/dashboard/FallgruppenDonut.tsx`
+- Create: `frontend/src/components/dashboard/VorjahresComparison.tsx`
+- Create: `frontend/src/hooks/useDashboard.ts`
+
+**Step 1: Install shadcn components**
+
+```bash
+pnpm dlx shadcn@latest add card tabs
+```
+
+**Step 2: Write useDashboard hook**
+
+- Fetch `GET /api/reports/dashboard`
+- Return: kpis, weeklyData, loading, error
+
+**Step 3: Write KPICards**
+
+4 cards: Erstberatungen gesamt, Pending ICD, Pending Coding, Gutachten gesamt. Color-coded.
+
+**Step 4: Write WeeklyChart**
+
+Recharts BarChart showing weekly Erstberatungen + Gutachten trend.
+
+**Step 5: Write FallgruppenDonut**
+
+Recharts PieChart showing distribution across 5 Fallgruppen.
+
+**Step 6: Write VorjahresComparison**
+
+Table comparing current year vs previous year key metrics.
+
+**Step 7: Assemble DashboardPage**
+
+Layout: KPICards (top), WeeklyChart (left) + FallgruppenDonut (right), VorjahresComparison (bottom). 
+ +**Step 8: Commit** + +```bash +git add frontend/src/pages/DashboardPage.tsx frontend/src/components/dashboard/ frontend/src/hooks/useDashboard.ts +git commit -m "feat: dashboard with KPI cards, weekly chart, fallgruppen donut, year comparison" +``` + +--- + +### Task 23: Cases Page + +**Files:** +- Create: `frontend/src/pages/CasesPage.tsx` +- Create: `frontend/src/components/cases/CaseTable.tsx` +- Create: `frontend/src/components/cases/CaseDetail.tsx` +- Create: `frontend/src/components/cases/ICDInlineEdit.tsx` +- Create: `frontend/src/hooks/useCases.ts` + +**Step 1: Install shadcn components** + +```bash +pnpm dlx shadcn@latest add table dialog select pagination +``` + +**Step 2: Write useCases hook** + +- Fetch `GET /api/cases` with pagination + filters +- CRUD operations for individual cases + +**Step 3: Write CaseTable** + +- Columns: ID, KW, Nachname, Vorname, Fallgruppe, ICD, Gutachten, Status +- Filters: Jahr, KW, Fallgruppe, has_icd, has_coding +- Pagination +- Click row → CaseDetail dialog + +**Step 4: Write CaseDetail** + +- Full case view in dialog/sheet +- Editable fields (admin): all case fields +- Read-only for dak_mitarbeiter (except ICD) + +**Step 5: Write ICDInlineEdit** + +- Inline ICD editing in case table or detail view +- Validation feedback (regex check) +- For dak_mitarbeiter: only ICD field editable + +**Step 6: Commit** + +```bash +git add frontend/src/pages/CasesPage.tsx frontend/src/components/cases/ frontend/src/hooks/useCases.ts +git commit -m "feat: cases page with filterable table, detail view, inline ICD edit" +``` + +--- + +### Task 24: Import Pages + +**Files:** +- Create: `frontend/src/pages/ImportPage.tsx` +- Create: `frontend/src/components/import/CSVUpload.tsx` +- Create: `frontend/src/components/import/ImportPreview.tsx` +- Create: `frontend/src/components/import/ICDUpload.tsx` + +**Step 1: Write CSVUpload** + +- File dropzone (accept .csv) +- Upload → POST /api/import/csv → show ImportPreview + +**Step 2: Write 
ImportPreview** + +- Table showing parsed rows (new vs duplicate) +- Confirm/Cancel buttons +- On confirm → POST /api/import/csv/confirm → show ImportResult + +**Step 3: Write ICDUpload** + +- File dropzone (accept .xlsx) +- Upload → POST /api/import/icd-xlsx +- Show result (cases updated, errors) +- Template download link (GET /api/cases/coding-template) + +**Step 4: Assemble ImportPage** + +Tabs: "CSV Import" | "ICD Upload" | "Import-Log" + +**Step 5: Commit** + +```bash +git add frontend/src/pages/ImportPage.tsx frontend/src/components/import/ +git commit -m "feat: import pages — CSV upload with preview, ICD Excel upload" +``` + +--- + +### Task 25: Coding Queue Page + +**Files:** +- Create: `frontend/src/pages/CodingPage.tsx` +- Create: `frontend/src/components/coding/CodingQueue.tsx` +- Create: `frontend/src/components/coding/CodingCard.tsx` +- Create: `frontend/src/components/coding/CodingProgress.tsx` + +**Step 1: Write CodingQueue** + +- List of cases with gutachten=1 but no gutachten_typ +- Each shows: Name, Fallgruppe, Kurzbeschreibung, Fragestellung + +**Step 2: Write CodingCard** + +- Individual case coding form +- Fields: gutachten_typ (Bestätigung/Alternative), therapieaenderung (Ja/Nein), checkboxes for ta_diagnosekorrektur, ta_unterversorgung, ta_uebertherapie +- Save → PUT /api/coding/{id} + +**Step 3: Write CodingProgress** + +- Progress bar: X of Y cases coded +- Stats: Bestätigung vs Alternative ratio + +**Step 4: Commit** + +```bash +git add frontend/src/pages/CodingPage.tsx frontend/src/components/coding/ +git commit -m "feat: coding queue with progress tracking" +``` + +--- + +### Task 26: Reports Page + +**Files:** +- Create: `frontend/src/pages/ReportsPage.tsx` +- Create: `frontend/src/components/reports/ReportList.tsx` +- Create: `frontend/src/components/reports/ReportDownload.tsx` + +**Step 1: Write ReportList** + +- Table of generated reports (date, KW, generated_by) +- Download button per report → GET /api/reports/download/{id} +- 
Generate button (admin) → POST /api/reports/generate + +**Step 2: Commit** + +```bash +git add frontend/src/pages/ReportsPage.tsx frontend/src/components/reports/ +git commit -m "feat: reports page with list and download" +``` + +--- + +### Task 27: Notifications + +**Files:** +- Create: `frontend/src/components/notifications/NotificationBell.tsx` +- Create: `frontend/src/components/notifications/NotificationDropdown.tsx` +- Create: `frontend/src/hooks/useNotifications.ts` + +**Step 1: Write useNotifications hook** + +- Poll `GET /api/notifications` every 30 seconds +- Return: notifications, unreadCount, markAsRead, markAllAsRead + +**Step 2: Write NotificationBell + Dropdown** + +- Bell icon in header with unread badge count +- Dropdown: list of recent notifications with timestamps +- Click → mark as read + navigate to related entity + +**Step 3: Commit** + +```bash +git add frontend/src/components/notifications/ frontend/src/hooks/useNotifications.ts +git commit -m "feat: notification bell with dropdown and polling" +``` + +--- + +### Task 28: Admin Pages + +**Files:** +- Create: `frontend/src/pages/AdminUsersPage.tsx` +- Create: `frontend/src/pages/AdminInvitationsPage.tsx` +- Create: `frontend/src/pages/AdminAuditPage.tsx` +- Create: `frontend/src/components/admin/UserManagement.tsx` +- Create: `frontend/src/components/admin/InvitationLinks.tsx` +- Create: `frontend/src/components/admin/AuditLog.tsx` + +**Step 1: Write UserManagement** + +- Table: username, email, role, active, last_login +- Edit button → dialog with role/active toggle +- Create user button + +**Step 2: Write InvitationLinks** + +- Create invitation form (email optional, expiry date) +- List existing invitations with status (active/used/expired) +- Copy link button + +**Step 3: Write AuditLog** + +- Paginated table: timestamp, user, action, entity, old/new values (JSON expandable) +- Filter by user, action, date range + +**Step 4: Commit** + +```bash +git add frontend/src/pages/Admin* 
frontend/src/components/admin/ +git commit -m "feat: admin pages — user management, invitations, audit log" +``` + +--- + +## Phase 5: Integration & Deploy + +### Task 29: Frontend Production Build & Integration Test + +**Step 1: Build frontend** + +```bash +cd /home/frontend/dak_c2s/frontend +pnpm build +# Output: frontend/dist/ +``` + +**Step 2: Test full stack locally** + +- Backend: `uvicorn app.main:app --port 8000` +- Serve frontend build from dist/ +- Test all flows: login → dashboard → import CSV → view cases → enter ICD → coding → generate report → download + +**Step 3: Commit** + +```bash +git add -A +git commit -m "feat: frontend production build, integration tested" +``` + +--- + +### Task 30: GitHub & Deploy to Hetzner 1 + +**Step 1: Push to GitHub** + +```bash +cd /home/frontend/dak_c2s +git push origin develop +git checkout main +git merge develop +git push origin main +git checkout develop +``` + +**Step 2: Clone on Hetzner 1** + +```bash +ssh hetzner1 +# As appropriate user: +cd /var/www/vhosts/dak.complexcaresolutions.de/ +git clone git@github.com:complexcaresolutions/dak.c2s.git . +# Or: git init + git remote add origin + git pull +``` + +**Step 3: Setup Python venv on Hetzner** + +```bash +cd /var/www/vhosts/dak.complexcaresolutions.de/backend +python3 -m venv venv +source venv/bin/activate +pip install -r requirements.txt +``` + +**Step 4: Configure .env on Hetzner** + +- DB_HOST=localhost (MariaDB is local on Hetzner 1) +- All other production values + +**Step 5: Run Alembic migrations on production DB** + +```bash +cd /var/www/vhosts/dak.complexcaresolutions.de/backend +source venv/bin/activate +alembic upgrade head +``` + +**Step 6: Create systemd service** + +Copy service file from spec section 10 to `/etc/systemd/system/dak-backend.service`. 
+ +```bash +sudo systemctl daemon-reload +sudo systemctl enable dak-backend +sudo systemctl start dak-backend +sudo systemctl status dak-backend +``` + +**Step 7: Configure Plesk Nginx** + +Add directives from spec section 9 in Plesk → dak.complexcaresolutions.de → Additional nginx directives. + +**Step 8: Build frontend on Hetzner** + +```bash +cd /var/www/vhosts/dak.complexcaresolutions.de/frontend +npm install +npm run build +``` + +**Step 9: Create admin account** + +```bash +cd /var/www/vhosts/dak.complexcaresolutions.de/backend +source venv/bin/activate +python -m scripts.create_admin +``` + +**Step 10: Test SMTP** + +```bash +python -c " +from app.services.notification_service import send_email +send_email('test@example.com', 'Test', 'Portal SMTP works') +" +``` + +**Step 11: Smoke test production** + +- Visit https://dak.complexcaresolutions.de +- Login with admin +- Check dashboard loads +- Upload test CSV +- Verify notifications + +**Step 12: Invite DAK users** + +- Create invitation links via admin panel +- Send to DAK contacts + +--- + +## Dependency Graph + +``` +Task 1 (Scaffolding) → Task 2 (Models) → Task 3 (Alembic) + → Task 4 (Security) → Task 5 (Auth API) → Task 6 (Admin API) + → Task 7 (Utils) → Task 8 (CSV Parser) → Task 9 (Import Service) + → Task 10 (ICD Service) + → Task 11 (Import/Cases API) ← Task 9, 10 + → Task 12 (Historical Import) + → Task 13 (Notifications) + → Task 14 (Reports) → Task 15 (Excel Export) → Task 16 (Coding + Reports API) + → Task 17 (Push to GitHub) + → Task 18 (Frontend Setup) → Task 19 (Auth Context) → Task 20 (Login) + → Task 21 (Layout) + → Task 22 (Dashboard) + → Task 23 (Cases) + → Task 24 (Import Pages) + → Task 25 (Coding) + → Task 26 (Reports) + → Task 27 (Notifications) + → Task 28 (Admin Pages) + → Task 29 (Integration) → Task 30 (Deploy) +``` + +## Notes + +- **Referenzdaten:** User provides data files to `data/` — historical import (Task 12) runs when available +- **Python compatibility:** Code must work on 
both 3.13 (dev) and 3.11 (prod) — avoid 3.12+ syntax like `type` statements +- **Testing strategy:** Unit tests for services/utils (pytest), manual integration tests for API endpoints, visual testing for frontend +- **DB:** Single MariaDB instance on Hetzner 1 — dev connects remotely, prod connects locally diff --git a/frontend/.gitkeep b/frontend/.gitkeep new file mode 100644 index 0000000..e69de29