DAK Zweitmeinungs-Portal — Implementation Plan
For Claude: REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.
Goal: Build a full-stack portal for managing DAK second-opinion medical cases — from CSV import through ICD coding to weekly Excel reports.
Architecture: FastAPI backend (Python, SQLAlchemy 2.0, MariaDB) serving a React SPA (Vite, TypeScript, shadcn/ui). JWT auth with RBAC (admin / dak_mitarbeiter). Deployed on Hetzner 1 via systemd + Plesk-Nginx.
Tech Stack: Python 3.13/FastAPI/SQLAlchemy/Alembic/Pandas/openpyxl (Backend), React 18/Vite/TypeScript/Tailwind CSS/shadcn-ui/Recharts (Frontend), MariaDB 10.11.14
Spec Reference: /home/frontend/dak_c2s/Dak_projekt_spezifikation_final.md
DB Connection (Dev): Remote to Hetzner 1 MariaDB dak_c2s / dak_c2s_admin
Phase 1: Project Setup + Database + Auth
Task 1: Project Scaffolding
Files:
- Create: backend/app/__init__.py
- Create: backend/app/config.py
- Create: backend/app/database.py
- Create: backend/app/main.py
- Create: backend/requirements.txt
- Create: backend/.env.example
- Create: backend/.env
- Create: .gitignore
Step 1: Initialize git repo and create directory structure
cd /home/frontend/dak_c2s
git init
git checkout -b develop
Create full directory tree:
backend/
app/
__init__.py
config.py
database.py
main.py
models/__init__.py
schemas/__init__.py
api/__init__.py
services/__init__.py
core/__init__.py
utils/__init__.py
alembic/
scripts/
tests/
__init__.py
conftest.py
frontend/
data/
docs/
Step 2: Write requirements.txt
# Backend dependencies
fastapi==0.115.6
uvicorn[standard]==0.34.0
gunicorn==23.0.0
sqlalchemy==2.0.36
alembic==1.14.1
pymysql==1.1.1
cryptography==44.0.0
python-jose[cryptography]==3.3.0
passlib[bcrypt]==1.7.4
bcrypt==4.0.1  # pinned: passlib 1.7.4 reads bcrypt.__about__, removed in bcrypt >= 4.1
pyotp==2.9.0
qrcode==8.0
python-multipart==0.0.20
pandas==2.2.3
openpyxl==3.1.5
pydantic==2.10.4
pydantic-settings==2.7.1
python-dotenv==1.0.1
email-validator==2.2.0
httpx==0.28.1
# Dev/Test
pytest==8.3.4
pytest-asyncio==0.25.0
pytest-cov==6.0.0
Step 3: Write config.py (Pydantic Settings)
# backend/app/config.py
from pydantic_settings import BaseSettings
from functools import lru_cache
class Settings(BaseSettings):
# Database
DB_HOST: str = "localhost"
DB_PORT: int = 3306
DB_NAME: str = "dak_c2s"
DB_USER: str = "dak_c2s_admin"
DB_PASSWORD: str = ""
# JWT
JWT_SECRET_KEY: str = "change-me-in-production"
JWT_ALGORITHM: str = "HS256"
ACCESS_TOKEN_EXPIRE_MINUTES: int = 15
REFRESH_TOKEN_EXPIRE_DAYS: int = 7
# SMTP
SMTP_HOST: str = "smtp.complexcaresolutions.de"
SMTP_PORT: int = 465
SMTP_USER: str = "noreply@complexcaresolutions.de"
SMTP_PASSWORD: str = ""
SMTP_FROM: str = "noreply@complexcaresolutions.de"
# App
APP_NAME: str = "DAK Zweitmeinungs-Portal"
CORS_ORIGINS: str = "http://localhost:5173,https://dak.complexcaresolutions.de"
MAX_UPLOAD_SIZE: int = 20 * 1024 * 1024 # 20MB
@property
def database_url(self) -> str:
return f"mysql+pymysql://{self.DB_USER}:{self.DB_PASSWORD}@{self.DB_HOST}:{self.DB_PORT}/{self.DB_NAME}?charset=utf8mb4"
class Config:
env_file = ".env"
env_file_encoding = "utf-8"
@lru_cache
def get_settings() -> Settings:
return Settings()
Step 4: Write database.py
# backend/app/database.py
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker, DeclarativeBase
from app.config import get_settings
settings = get_settings()
engine = create_engine(settings.database_url, pool_pre_ping=True, pool_recycle=3600)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
class Base(DeclarativeBase):
pass
def get_db():
db = SessionLocal()
try:
yield db
finally:
db.close()
Step 5: Write main.py (minimal FastAPI app)
# backend/app/main.py
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from app.config import get_settings
settings = get_settings()
app = FastAPI(title=settings.APP_NAME, docs_url="/docs")
app.add_middleware(
CORSMiddleware,
allow_origins=settings.CORS_ORIGINS.split(","),
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
@app.get("/api/health")
def health_check():
return {"status": "ok"}
Step 6: Write .env.example and .env, create .gitignore
.env.example — all keys without values.
.env — actual values with Hetzner 1 DB credentials.
.gitignore — Python, Node, .env, data/, __pycache__/, venv/, node_modules/, dist/.
Step 7: Create Python venv, install deps, test server starts
cd /home/frontend/dak_c2s/backend
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
uvicorn app.main:app --reload --port 8000
# Verify: GET http://localhost:8000/api/health → {"status": "ok"}
Step 8: Commit
git add -A
git commit -m "feat: project scaffolding with FastAPI, config, database connection"
Task 2: SQLAlchemy Models
Files:
- Create: backend/app/models/user.py
- Create: backend/app/models/case.py
- Create: backend/app/models/report.py
- Create: backend/app/models/audit.py
- Modify: backend/app/models/__init__.py
Step 1: Write User models
backend/app/models/user.py — 4 models matching the SQL schema:
- User(id, username, email, password_hash, role, mfa_secret, mfa_enabled, is_active, must_change_password, last_login, failed_login_attempts, locked_until, created_at, updated_at)
- RefreshToken(id, user_id → FK users, token_hash, expires_at, revoked, created_at)
- InvitationLink(id, token, email, role, created_by → FK users, expires_at, used_at, used_by → FK users, is_active, created_at)
- AllowedDomain(id, domain, role, is_active, created_at)
Use mapped_column() syntax (SQLAlchemy 2.0 declarative). All column types, defaults, indexes, and constraints must match the SQL schema in Spec Section 4 exactly.
Step 2: Write Case models
backend/app/models/case.py — 2 models:
- Case — 45+ columns matching spec exactly. Include all CHECK constraints as Python-level validation (SQLAlchemy CheckConstraint). Indexes: idx_jahr_kw, idx_kvnr, idx_fallgruppe, idx_datum, idx_nachname_vorname, idx_pending_icd, idx_pending_coding.
- CaseICDCode(id, case_id → FK cases ON DELETE CASCADE, icd_code, icd_hauptgruppe, created_at)
Step 3: Write Report models
backend/app/models/report.py — 2 models:
- WeeklyReport(id, jahr, kw, report_date, report_file_path, report_data [JSON], generated_by, generated_at)
- YearlySummary — all 40+ aggregation columns matching spec exactly (erstberatungen through ta_uebertherapie, per-fallgruppe breakdown)
Step 4: Write Audit models
backend/app/models/audit.py — 3 models:
- ImportLog(id, filename, import_type, cases_imported/skipped/updated, errors, details [JSON], imported_by, imported_at) — CHECK constraint on import_type
- AuditLog(id, user_id, action, entity_type, entity_id, old_values [JSON], new_values [JSON], ip_address, user_agent, created_at)
- Notification(id, recipient_id, notification_type, title, message, related_entity_type/id, is_read, email_sent, email_sent_at, created_at) — CHECK constraint on notification_type
Step 5: Wire up models/init.py
# backend/app/models/__init__.py
from app.models.user import User, RefreshToken, InvitationLink, AllowedDomain
from app.models.case import Case, CaseICDCode
from app.models.report import WeeklyReport, YearlySummary
from app.models.audit import AuditLog, ImportLog, Notification
__all__ = [
"User", "RefreshToken", "InvitationLink", "AllowedDomain",
"Case", "CaseICDCode",
"WeeklyReport", "YearlySummary",
"AuditLog", "ImportLog", "Notification",
]
Step 6: Verify models compile
cd /home/frontend/dak_c2s/backend
source venv/bin/activate
python -c "from app.models import *; print('All models loaded OK')"
Step 7: Commit
git add backend/app/models/
git commit -m "feat: SQLAlchemy models for users, cases, reports, audit"
Task 3: Alembic Migrations
Files:
- Create: backend/alembic.ini
- Create: backend/alembic/env.py
- Create: backend/alembic/versions/ (auto-generated)
Step 1: Initialize Alembic
cd /home/frontend/dak_c2s/backend
source venv/bin/activate
alembic init alembic
Step 2: Configure alembic/env.py
- Import Base from app.database and all models from app.models
- Set target_metadata = Base.metadata
- Read sqlalchemy.url from app.config.get_settings().database_url
Step 3: Generate initial migration
alembic revision --autogenerate -m "initial schema"
Step 4: Review generated migration, then apply
alembic upgrade head
Step 5: Seed allowed_domains
Write backend/scripts/init_db.py:
- Insert AllowedDomain(domain='dak.de', role='dak_mitarbeiter') if not exists
- Run: python -m scripts.init_db
Step 6: Verify tables exist on Hetzner DB
python -c "
from app.database import engine
from sqlalchemy import inspect
insp = inspect(engine)
print(insp.get_table_names())
"
# Expected: ['users', 'refresh_tokens', 'invitation_links', 'allowed_domains', 'cases', 'case_icd_codes', 'weekly_reports', 'yearly_summary', 'import_log', 'audit_log', 'notifications']
Step 7: Commit
git add backend/alembic.ini backend/alembic/ backend/scripts/
git commit -m "feat: Alembic migrations, initial schema deployed"
Task 4: Core Security & Dependencies
Files:
- Create: backend/app/core/security.py
- Create: backend/app/core/dependencies.py
- Create: backend/app/core/exceptions.py
Step 1: Write security.py
# backend/app/core/security.py
from datetime import datetime, timedelta, timezone
from jose import jwt, JWTError
from passlib.context import CryptContext
import pyotp
import secrets
from app.config import get_settings
settings = get_settings()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")
def hash_password(password: str) -> str:
return pwd_context.hash(password)
def verify_password(plain: str, hashed: str) -> bool:
return pwd_context.verify(plain, hashed)
def create_access_token(user_id: int, role: str) -> str:
expire = datetime.now(timezone.utc) + timedelta(minutes=settings.ACCESS_TOKEN_EXPIRE_MINUTES)
return jwt.encode({"sub": str(user_id), "role": role, "exp": expire}, settings.JWT_SECRET_KEY, algorithm=settings.JWT_ALGORITHM)
def create_refresh_token() -> str:
return secrets.token_urlsafe(64)
def decode_access_token(token: str) -> dict:
return jwt.decode(token, settings.JWT_SECRET_KEY, algorithms=[settings.JWT_ALGORITHM])
def generate_mfa_secret() -> str:
return pyotp.random_base32()
def verify_mfa_code(secret: str, code: str) -> bool:
totp = pyotp.TOTP(secret)
return totp.verify(code)
def get_mfa_uri(secret: str, email: str) -> str:
totp = pyotp.TOTP(secret)
return totp.provisioning_uri(name=email, issuer_name=settings.APP_NAME)
Step 2: Write dependencies.py
# backend/app/core/dependencies.py
from fastapi import Depends, HTTPException, status
from fastapi.security import HTTPBearer, HTTPAuthorizationCredentials
from sqlalchemy.orm import Session
from app.database import get_db
from app.core.security import decode_access_token
from app.models.user import User
from jose import JWTError
security = HTTPBearer()
def get_current_user(
credentials: HTTPAuthorizationCredentials = Depends(security),
db: Session = Depends(get_db),
) -> User:
try:
payload = decode_access_token(credentials.credentials)
user_id = int(payload["sub"])
except (JWTError, KeyError, ValueError):
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="Invalid token")
user = db.query(User).filter(User.id == user_id, User.is_active == True).first()
if not user:
raise HTTPException(status_code=status.HTTP_401_UNAUTHORIZED, detail="User not found")
return user
def require_admin(user: User = Depends(get_current_user)) -> User:
if user.role != "admin":
raise HTTPException(status_code=status.HTTP_403_FORBIDDEN, detail="Admin access required")
return user
Step 3: Write exceptions.py
Custom exceptions: CaseNotFound, DuplicateCase, InvalidImportFile, ICDValidationError. Each maps to appropriate HTTP status codes.
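One workable shape (a sketch, not the mandated implementation): a base class that carries the HTTP status, so a single FastAPI handler registered via @app.exception_handler(AppError) can translate every domain error uniformly.

```python
# Sketch of app/core/exceptions.py: each domain exception declares the
# HTTP status it maps to; one exception handler does the translation.
class AppError(Exception):
    """Base class for domain errors; subclasses override status_code."""
    status_code = 500

    def __init__(self, detail: str):
        self.detail = detail
        super().__init__(detail)


class CaseNotFound(AppError):
    status_code = 404


class DuplicateCase(AppError):
    status_code = 409


class InvalidImportFile(AppError):
    status_code = 422


class ICDValidationError(AppError):
    status_code = 422
```

Service code then raises `DuplicateCase(...)` without importing anything from FastAPI, which keeps the services testable in isolation.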
Step 4: Write tests for security functions
backend/tests/test_security.py:
- test_hash_and_verify_password
- test_create_and_decode_access_token
- test_invalid_token_raises
- test_mfa_secret_and_verify
cd /home/frontend/dak_c2s/backend
pytest tests/test_security.py -v
Step 5: Commit
git add backend/app/core/ backend/tests/test_security.py
git commit -m "feat: JWT auth, bcrypt, MFA, dependency injection"
Task 5: Auth Schemas & API
Files:
- Create: backend/app/schemas/auth.py
- Create: backend/app/schemas/user.py
- Create: backend/app/api/auth.py
- Create: backend/app/services/auth_service.py
- Modify: backend/app/main.py (add router)
Step 1: Write auth schemas
backend/app/schemas/auth.py:
- LoginRequest(email, password, mfa_code: Optional)
- TokenResponse(access_token, refresh_token, token_type, user: UserResponse)
- RegisterRequest(username, email, password, invitation_token: Optional)
- RefreshRequest(refresh_token)
- MFASetupResponse(secret, qr_uri)
- MFAVerifyRequest(code)
backend/app/schemas/user.py:
- UserResponse(id, username, email, role, mfa_enabled, is_active, last_login, created_at)
- UserCreate(username, email, password, role)
- UserUpdate(username, email, role, is_active — all Optional)
Step 2: Write auth_service.py
Business logic:
- authenticate_user(db, email, password, mfa_code) — check credentials, account lock (5 attempts, 30 min), MFA verification, update last_login
- register_user(db, data) — domain whitelist check OR invitation token validation
- create_tokens(db, user) — generate access + refresh, store refresh hash in DB
- refresh_access_token(db, refresh_token) — validate, return new access token
- revoke_refresh_token(db, refresh_token) — mark as revoked
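The lockout rule (5 attempts, 30 minutes) is easy to get subtly wrong, so a minimal sketch of the bookkeeping inside authenticate_user is worth writing down. Helper names here are illustrative; `user` is anything with the failed_login_attempts / locked_until / last_login fields from the Task 2 model.

```python
# Sketch of the account-lock bookkeeping: 5 failed attempts lock the
# account for 30 minutes; a successful login resets the counters.
from datetime import datetime, timedelta, timezone

MAX_ATTEMPTS = 5
LOCK_MINUTES = 30


def is_locked(user, now=None) -> bool:
    now = now or datetime.now(timezone.utc)
    return user.locked_until is not None and user.locked_until > now


def register_failed_login(user, now=None) -> None:
    now = now or datetime.now(timezone.utc)
    user.failed_login_attempts += 1
    if user.failed_login_attempts >= MAX_ATTEMPTS:
        user.locked_until = now + timedelta(minutes=LOCK_MINUTES)


def register_successful_login(user, now=None) -> None:
    user.failed_login_attempts = 0
    user.locked_until = None
    user.last_login = now or datetime.now(timezone.utc)
```

authenticate_user would call is_locked() before checking the password, and one of the other two helpers after, committing the mutated user row in the same transaction.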
Step 3: Write auth API router
backend/app/api/auth.py:
- POST /api/auth/login → authenticate, return tokens
- POST /api/auth/register → domain check or invitation, create user
- POST /api/auth/refresh → new access token
- POST /api/auth/logout → revoke refresh token
- POST /api/auth/mfa/setup → generate secret + QR URI (admin only)
- POST /api/auth/mfa/verify → verify TOTP code, enable MFA
Step 4: Register router in main.py
from app.api.auth import router as auth_router
app.include_router(auth_router, prefix="/api/auth", tags=["auth"])
Step 5: Write auth tests
backend/tests/test_auth.py:
- test_register_with_valid_domain
- test_register_with_invalid_domain_rejected
- test_register_with_invitation_token
- test_login_success
- test_login_wrong_password
- test_account_lockout_after_5_failures
- test_refresh_token_flow
- test_logout_revokes_token
Use httpx.AsyncClient with app for integration tests. Use a test fixture for DB session.
pytest tests/test_auth.py -v
Step 6: Commit
git add backend/app/schemas/ backend/app/api/auth.py backend/app/services/auth_service.py backend/tests/test_auth.py
git commit -m "feat: auth system — login, register, refresh, MFA, domain whitelist"
Task 6: Admin API + Audit Middleware
Files:
- Create: backend/app/api/admin.py
- Create: backend/app/services/audit_service.py
- Create: backend/app/api/notifications.py
- Modify: backend/app/main.py (add routers + audit middleware)
Step 1: Write audit_service.py
# Helper to log actions
def log_action(db, user_id, action, entity_type, entity_id, old_values, new_values, ip, user_agent):
entry = AuditLog(user_id=user_id, action=action, entity_type=entity_type,
entity_id=entity_id, old_values=old_values, new_values=new_values,
ip_address=ip, user_agent=user_agent)
db.add(entry)
db.commit()
Step 2: Write admin API
backend/app/api/admin.py:
- GET /api/admin/users → list users (admin only)
- POST /api/admin/users → create user
- PUT /api/admin/users/{id} → update role, active status
- POST /api/admin/invitations → create invitation link (token + expiry + optional email)
- GET /api/admin/invitations → list invitations
- GET /api/admin/audit-log → paginated audit log
Step 3: Write notifications API
backend/app/api/notifications.py:
- GET /api/notifications → unread + recent for current user
- PUT /api/notifications/{id}/read → mark single as read
- PUT /api/notifications/read-all → mark all as read
Step 4: Write create_admin.py script
backend/scripts/create_admin.py — interactive script to create the first admin user (prompts for username, email, password).
Step 5: Register routers in main.py, test endpoints
uvicorn app.main:app --reload --port 8000
# Test: POST /api/auth/register, POST /api/auth/login
# Test: GET /api/admin/users (with admin token)
Step 6: Commit
git add backend/app/api/admin.py backend/app/api/notifications.py backend/app/services/audit_service.py backend/scripts/create_admin.py
git commit -m "feat: admin API, audit logging, notifications, create_admin script"
Phase 2: Import & ICD Workflow
Task 7: Utility Functions
Files:
- Create: backend/app/utils/fallgruppe_map.py
- Create: backend/app/utils/kw_utils.py
- Create: backend/app/utils/validators.py
Step 1: Write fallgruppe_map.py
MODUL_TO_FALLGRUPPE = {
"Zweitmeinung Onkologie": "onko",
"Zweitmeinung Kardiologie": "kardio",
"Zweitmeinung Intensiv": "intensiv",
"Zweitmeinung Gallenblase": "galle",
"Zweitmeinung Schilddrüse": "sd",
}
def map_modul_to_fallgruppe(modul: str) -> str:
modul = modul.strip()
if modul in MODUL_TO_FALLGRUPPE:
return MODUL_TO_FALLGRUPPE[modul]
modul_lower = modul.lower()
if "begutachtung" in modul_lower:
# Derive from context — check for keywords
for key, val in [("onko", "onko"), ("kardio", "kardio"), ("intensiv", "intensiv"),
("galle", "galle"), ("schilddrüse", "sd")]:
if key in modul_lower:
return val
raise ValueError(f"Cannot map module: {modul}")
Step 2: Write kw_utils.py
- date_to_kw(d: date) -> int — ISO calendar week
- date_to_jahr(d: date) -> int — ISO calendar year (can differ from d.year in week 1/53)
- parse_german_date(s: str) -> date — handles "DD.MM.YY" and "DD.MM.YYYY", edge cases like "29.08.0196"
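All three helpers can be sketched directly on top of date.isocalendar() and datetime.strptime. The year-plausibility cutoff of 1900 is an assumption made here to catch OCR artifacts like "29.08.0196"; the real cutoff is a judgment call.

```python
# Sketch of kw_utils. isocalendar() returns (iso_year, iso_week, iso_weekday);
# the ISO year differs from d.year around week 1/53, hence date_to_jahr.
from datetime import date, datetime


def date_to_kw(d: date) -> int:
    return d.isocalendar()[1]


def date_to_jahr(d: date) -> int:
    return d.isocalendar()[0]


def parse_german_date(s: str) -> date:
    """Parse 'DD.MM.YY' or 'DD.MM.YYYY'; reject implausible years."""
    day, month, year = s.strip().split(".")
    fmt = "%d.%m.%y" if len(year) == 2 else "%d.%m.%Y"
    parsed = datetime.strptime(s.strip(), fmt).date()
    if parsed.year < 1900:  # assumption: catches artifacts like "29.08.0196"
        raise ValueError(f"Implausible year in date: {s}")
    return parsed
```

Choosing the format by the length of the year part avoids a subtle strptime trap: "%d.%m.%Y" happily parses "02.02.26" as year 26, so trying the four-digit format first would silently mangle two-digit years.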
Step 3: Write validators.py
- validate_icd(code: str) -> str — regex ^[A-Z]\d{2}(\.\d{1,2})?$, normalize uppercase, strip
- validate_kvnr(kvnr: str) -> str — format check (letter + 9 digits)
- normalize_icd_hauptgruppe(code: str) -> str — extract letter + first 2 digits (e.g., "C50.1" → "C50")
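A sketch of the three validators under the rules above. Note the KVNR check is purely format-based (one letter + nine digits), not a checksum validation.

```python
# Sketch of validators.py: normalize-then-match against the spec regexes.
import re

ICD_RE = re.compile(r"^[A-Z]\d{2}(\.\d{1,2})?$")
KVNR_RE = re.compile(r"^[A-Z]\d{9}$")


def validate_icd(code: str) -> str:
    code = code.strip().upper()
    if not ICD_RE.match(code):
        raise ValueError(f"Invalid ICD code: {code}")
    return code


def validate_kvnr(kvnr: str) -> str:
    kvnr = kvnr.strip().upper()
    if not KVNR_RE.match(kvnr):
        raise ValueError(f"Invalid KVNR: {kvnr}")
    return kvnr


def normalize_icd_hauptgruppe(code: str) -> str:
    # Validated codes always start with letter + two digits, so a slice works.
    return validate_icd(code)[:3]  # "C50.1" -> "C50"
```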
Step 4: Write tests
backend/tests/test_utils.py:
- Test all mappings including "Begutachtung" edge cases
- Test KW calculation across year boundaries
- Test German date parsing with edge cases ("29.08.0196")
- Test ICD validation (valid, invalid, normalization)
- Test KVNR validation
pytest tests/test_utils.py -v
Step 5: Commit
git add backend/app/utils/ backend/tests/test_utils.py
git commit -m "feat: utility functions — fallgruppe mapping, KW calc, ICD/KVNR validation"
Task 8: CRM CSV Parser
Files:
- Create: backend/app/services/csv_parser.py
- Create: backend/tests/test_csv_parser.py
Step 1: Write failing tests for CSV parser
Test cases from spec:
- Normal row: "Tonn | Regina | 28.04.1960 | D410126355" → nachname=Tonn, vorname=Regina, geburtsdatum=1960-04-28, kvnr=D410126355
- Missing KVNR: "Daum | Luana | 05.02.2016 |" → kvnr=None
- Bad date: "Name | Vorname | 29.08.0196 | X123" → geburtsdatum=None (log warning)
- Date format: "02.02.26, 08:50" → 2026-02-02
- Modul mapping: "Zweitmeinung Onkologie" → "onko"
pytest tests/test_csv_parser.py -v
# Expected: FAIL (csv_parser.py doesn't exist yet)
Step 2: Implement csv_parser.py
# backend/app/services/csv_parser.py
import csv
import io
from dataclasses import dataclass
from datetime import date
from typing import Optional
from app.utils.fallgruppe_map import map_modul_to_fallgruppe
from app.utils.kw_utils import parse_german_date, date_to_kw, date_to_jahr
@dataclass
class ParsedCase:
nachname: str
vorname: Optional[str]
geburtsdatum: Optional[date]
kvnr: Optional[str]
thema: str
fallgruppe: str
datum: date
jahr: int
kw: int
crm_ticket_id: Optional[str]
def parse_hauptkontakt(raw: str) -> dict:
"""Parse pipe-delimited contact: 'Nachname | Vorname | Geburtsdatum | KVNR'"""
parts = [p.strip() for p in raw.split("|")]
result = {"nachname": parts[0] if len(parts) > 0 else ""}
result["vorname"] = parts[1] if len(parts) > 1 and parts[1] else None
result["geburtsdatum"] = None
if len(parts) > 2 and parts[2]:
try:
result["geburtsdatum"] = parse_german_date(parts[2])
except ValueError:
pass # Log warning, continue
result["kvnr"] = parts[3] if len(parts) > 3 and parts[3] else None
return result
def parse_csv(content: bytes, filename: str) -> list[ParsedCase]:
"""Parse CRM CSV file, return list of ParsedCase."""
text = content.decode("utf-8-sig") # Handle BOM
reader = csv.DictReader(io.StringIO(text))
cases = []
for row in reader:
kontakt = parse_hauptkontakt(row.get("Hauptkontakt", ""))
datum_str = row.get("Erstellungsdatum", "")
datum = parse_german_date(datum_str.split(",")[0].strip()) if datum_str else date.today()
modul = row.get("Modul", "")
fallgruppe = map_modul_to_fallgruppe(modul)
cases.append(ParsedCase(
nachname=kontakt["nachname"],
vorname=kontakt["vorname"],
geburtsdatum=kontakt["geburtsdatum"],
kvnr=kontakt["kvnr"],
thema=row.get("Thema", ""),
fallgruppe=fallgruppe,
datum=datum,
jahr=date_to_jahr(datum),
kw=date_to_kw(datum),
crm_ticket_id=row.get("Name", None),
))
return cases
Step 3: Run tests, verify pass
pytest tests/test_csv_parser.py -v
Step 4: Commit
git add backend/app/services/csv_parser.py backend/tests/test_csv_parser.py
git commit -m "feat: CRM CSV parser with pipe-delimited contact parsing"
Task 9: Import Service + Duplicate Detection
Files:
- Create: backend/app/services/import_service.py
- Create: backend/app/schemas/import_schemas.py
- Create: backend/tests/test_import.py
Step 1: Write import schemas
backend/app/schemas/import_schemas.py:
- ImportRow — parsed case data for preview
- ImportPreview(total, new_cases, duplicates, rows: list[ImportRow])
- ImportResult(imported, skipped, updated, errors: list[str])
Step 2: Write import_service.py
Key logic:
- generate_fall_id(case) — format: {YYYY}-{KW:02d}-{fallgruppe}-{nachname} (must be unique)
- check_duplicate(db, parsed_case) — match on (nachname, vorname, geburtsdatum, fallgruppe, datum) or fall_id
- preview_import(db, parsed_cases) → ImportPreview
- confirm_import(db, parsed_cases, user_id) → ImportResult — insert new cases, skip duplicates, log to import_log
- import_icd_xlsx(db, file, user_id) — parse Excel with ICD column, match cases, update icd field
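The fall_id format and the duplicate key are small enough to sketch here. Uniqueness enforcement against the DB (e.g. suffixing on collision) is deliberately left to the service; this only shows the format and the natural key.

```python
# Sketch of the fall_id format and the natural key used for duplicate
# detection. `case` is any object with the ParsedCase attributes.
def generate_fall_id(jahr: int, kw: int, fallgruppe: str, nachname: str) -> str:
    # Format from the plan: {YYYY}-{KW:02d}-{fallgruppe}-{nachname}
    return f"{jahr}-{kw:02d}-{fallgruppe}-{nachname.strip()}"


def duplicate_key(case) -> tuple:
    """Natural key for check_duplicate: same tuple means same case."""
    return (case.nachname, case.vorname, case.geburtsdatum,
            case.fallgruppe, case.datum)
```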
Step 3: Write tests
backend/tests/test_import.py:
- test_generate_fall_id_format
- test_duplicate_detection_exact_match
- test_duplicate_detection_no_match
- test_import_creates_cases_in_db
- test_import_skips_duplicates
- test_import_logs_created
Step 4: Run tests
pytest tests/test_import.py -v
Step 5: Commit
git add backend/app/services/import_service.py backend/app/schemas/import_schemas.py backend/tests/test_import.py
git commit -m "feat: import service with duplicate detection and fall_id generation"
Task 10: ICD Service
Files:
- Create: backend/app/services/icd_service.py
- Create: backend/tests/test_icd_service.py
Step 1: Write failing tests
Test ICD normalization:
- "c50.1" → "C50.1" (uppercase)
- "C50.1, C79.5" → ["C50.1", "C79.5"] (split multi-ICD)
- "C50.1;C79.5" → ["C50.1", "C79.5"] (semicolon separator)
- "XYZ" → validation error
- Hauptgruppe: "C50.1" → "C50"
Step 2: Implement icd_service.py
- normalize_icd(raw: str) -> list[str] — split by comma/semicolon, strip, uppercase, validate each
- save_icd_for_case(db, case_id, icd_raw, user_id) — update cases.icd, create CaseICDCode entries, set icd_entered_by/at
- get_pending_icd_cases(db, filters) — cases where icd IS NULL
- generate_coding_template(db, filters) -> bytes — openpyxl workbook with case ID, name, fallgruppe, empty ICD column
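A sketch of normalize_icd that satisfies the Step 1 test cases. The real service would reuse validate_icd from utils/validators.py rather than inlining the regex, as done here for self-containment.

```python
# Sketch of normalize_icd: split on comma/semicolon, strip, uppercase,
# then validate each fragment against the ICD pattern.
import re

ICD_RE = re.compile(r"^[A-Z]\d{2}(\.\d{1,2})?$")


def normalize_icd(raw: str) -> list[str]:
    codes = [c.strip().upper() for c in re.split(r"[,;]", raw) if c.strip()]
    for code in codes:
        if not ICD_RE.match(code):
            raise ValueError(f"Invalid ICD code: {code}")
    return codes


def hauptgruppe(code: str) -> str:
    return code[:3]  # "C50.1" -> "C50"
```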
Step 3: Run tests
pytest tests/test_icd_service.py -v
Step 4: Commit
git add backend/app/services/icd_service.py backend/tests/test_icd_service.py
git commit -m "feat: ICD service — normalize, split, validate, coding template"
Task 11: Import & Cases API Routes
Files:
- Create: backend/app/api/import_router.py
- Create: backend/app/api/cases.py
- Create: backend/app/schemas/case.py
- Modify: backend/app/main.py (add routers)
Step 1: Write case schemas
backend/app/schemas/case.py:
- CaseResponse — full case representation
- CaseListResponse(items: list[CaseResponse], total, page, per_page)
- CaseUpdate — partial update fields
- ICDUpdate(icd: str)
- CodingUpdate(gutachten_typ: str, therapieaenderung: str, ta_diagnosekorrektur, ta_unterversorgung, ta_uebertherapie)
Step 2: Write import_router.py
- POST /api/import/csv → accept file upload, parse, return ImportPreview
- POST /api/import/csv/confirm → confirm import from preview session
- POST /api/import/icd-xlsx → upload ICD-coded Excel (DAK role)
- POST /api/import/historical → one-time import from Abrechnung_DAK.xlsx (admin only)
- GET /api/import/log → import history
Step 3: Write cases.py
- GET /api/cases → paginated list with filters (jahr, kw, fallgruppe, has_icd, has_coding)
- GET /api/cases/{id} → single case
- PUT /api/cases/{id} → update case fields (admin)
- GET /api/cases/pending-icd → cases without ICD
- PUT /api/cases/{id}/icd → set ICD (DAK or admin)
- GET /api/cases/pending-coding → Gutachten without Typ
- PUT /api/cases/{id}/coding → set gutachten_typ + therapieaenderung (admin)
- GET /api/cases/coding-template → download .xlsx template
Step 4: Register routers in main.py
Step 5: Test endpoints manually
uvicorn app.main:app --reload --port 8000
# Upload a CSV, check preview, confirm import
Step 6: Commit
git add backend/app/api/import_router.py backend/app/api/cases.py backend/app/schemas/case.py
git commit -m "feat: import and cases API endpoints"
Task 12: Historical Excel Import
Files:
- Create: backend/app/services/excel_import.py
- Create: backend/scripts/import_historical.py
Step 1: Write excel_import.py
Parse Abrechnung_DAK.xlsx:
- Read sheets: "2026", "2025", "2024", "2023", "2020-2022" (skip "Tabelle1")
- Map 39 columns to Case model fields (column positions from spec)
- Handle "2020-2022" sheet which has extra "Jahr" column at position 2
- Convert German date formats, boolean fields ("Ja"/"Nein"/1/0/empty)
- Handle empty rows, merged cells
- Generate fall_id for each imported case
- Deduplicate against existing DB records
Step 2: Write import_historical.py script
# backend/scripts/import_historical.py
"""One-time script: Import all cases from Abrechnung_DAK.xlsx into DB."""
# Usage: python -m scripts.import_historical /path/to/Abrechnung_DAK.xlsx
- Accept file path as argument
- Print progress per sheet
- Print summary (imported, skipped, errors)
- Log to import_log table
Step 3: Commit (actual import runs when data files are provided)
git add backend/app/services/excel_import.py backend/scripts/import_historical.py
git commit -m "feat: historical Excel import (Abrechnung_DAK.xlsx → DB)"
Task 13: Notification Service
Files:
- Create: backend/app/services/notification_service.py
Step 1: Implement notification_service.py
- send_notification(db, recipient_id, type, title, message, entity_type, entity_id) — create in-app notification + send email
- send_email(to, subject, body) — SMTP via smtplib.SMTP_SSL on port 465
- Trigger points:
  - new_cases_uploaded → notify all DAK users when admin uploads CSV
  - icd_entered → notify admin when DAK user enters ICD
  - icd_uploaded → notify admin when DAK user uploads ICD Excel
  - report_ready → notify all users when report generated
  - coding_completed → notify DAK users when coding done
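A sketch of the SMTP path with stdlib smtplib/email; SMTP_SSL matches the implicit-TLS port 465 from the Task 1 config. Splitting message construction from sending is an assumption made here so the formatting can be unit-tested without a mail server.

```python
# Sketch of notification_service's email path. `settings` is the Task 1
# Settings object (SMTP_HOST, SMTP_PORT, SMTP_USER, SMTP_PASSWORD, SMTP_FROM).
import smtplib
from email.message import EmailMessage


def build_message(to: str, subject: str, body: str, sender: str) -> EmailMessage:
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = to
    msg["Subject"] = subject
    msg.set_content(body)
    return msg


def send_email(to: str, subject: str, body: str, settings) -> None:
    msg = build_message(to, subject, body, settings.SMTP_FROM)
    # SMTP_SSL = implicit TLS, appropriate for port 465 (vs. STARTTLS on 587).
    with smtplib.SMTP_SSL(settings.SMTP_HOST, settings.SMTP_PORT) as smtp:
        smtp.login(settings.SMTP_USER, settings.SMTP_PASSWORD)
        smtp.send_message(msg)
```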
Step 2: Commit
git add backend/app/services/notification_service.py
git commit -m "feat: notification service — in-app + SMTP email"
Phase 3: Reports & Coding
Task 14: Report Service
Files:
- Create: backend/app/services/report_service.py
- Create: backend/app/services/vorjahr_service.py
- Create: backend/tests/test_report_service.py
Step 1: Write report_service.py
Implement all 5 sheet calculations (using pandas queries against DB):
Sheet 1 "Auswertung KW gesamt":
- Per KW: count Erstberatungen, Unterlagen (unterlagen=1), Ablehnungen (ablehnung=1), Keine Rückmeldung (NOT unterlagen AND NOT ablehnung AND NOT abbruch), Gutachten (gutachten=1)
- Totals row with percentages
Sheet 2 "Auswertung nach Fachgebieten":
- Per KW, per fallgruppe (onko, kardio, intensiv, galle, sd): Anzahl, Gutachten, Keine RM/Ablehnung
Sheet 3 "Auswertung Gutachten":
- Per KW, per fallgruppe + gesamt: Gutachten count, Alternative (gutachten_typ='Alternative'), Bestätigung (gutachten_typ='Bestätigung')
Sheet 4 "Auswertung Therapieänderungen":
- Per KW: Gutachten, TA Ja, TA Nein, Diagnosekorrektur, Unterversorgung, Übertherapie
Sheet 5 "Auswertung ICD onko":
- ICD codes from onko cases, normalized uppercase, sorted, with count
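As an illustration of the Sheet 1 rules, here is a pandas aggregation over a DataFrame of case rows. Column names follow the flag fields used above (unterlagen, ablehnung, abbruch, gutachten as 0/1 values); treat this as a sketch, since the real service reads these flags from the Case model via the spec's exact column definitions.

```python
# Illustrative Sheet-1 aggregation: per-KW counts following the rules above.
# "Keine Rückmeldung" = NOT unterlagen AND NOT ablehnung AND NOT abbruch.
import pandas as pd


def sheet1_kw_gesamt(cases: pd.DataFrame) -> pd.DataFrame:
    df = cases.copy()
    df["keine_rm"] = (
        (df["unterlagen"] == 0) & (df["ablehnung"] == 0) & (df["abbruch"] == 0)
    ).astype(int)
    return df.groupby("kw").agg(
        erstberatungen=("kw", "size"),      # every row is one Erstberatung
        unterlagen=("unterlagen", "sum"),
        ablehnungen=("ablehnung", "sum"),
        keine_rm=("keine_rm", "sum"),
        gutachten=("gutachten", "sum"),
    ).reset_index()
```

The totals row with percentages would then be computed from this frame's column sums before handing the data to excel_export.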
Step 2: Write vorjahr_service.py
- get_vorjahr_data(db, jahr) → aggregated data from previous year for comparison
- Reads from yearly_summary table (cached) or calculates live
Step 3: Write tests
backend/tests/test_report_service.py:
- Insert known test data, verify each sheet calculation returns correct values
- Test year-over-year comparison
- Test edge cases (empty weeks, KW 53)
pytest tests/test_report_service.py -v
Step 4: Commit
git add backend/app/services/report_service.py backend/app/services/vorjahr_service.py backend/tests/test_report_service.py
git commit -m "feat: report service — all 5 sheet calculations + year-over-year"
Task 15: Excel Export (Berichtswesen Format)
Files:
- Create: backend/app/services/excel_export.py
- Create: backend/scripts/import_berichtswesen.py
Step 1: Write excel_export.py
Using openpyxl, generate .xlsx matching the exact Berichtswesen format from spec section 3.3:
- Sheet 1 layout: Row 1 "Gesamtübersicht", Row 2 year headers, Rows 3-7 summary with percentages, Row 10 column headers, Row 11+ data per KW
- Sheet 2 layout: Fallgruppen columns (5 groups × 3 columns)
- Sheet 3 layout: Gutachten breakdown (6 groups × 3 columns)
- Sheet 4 layout: Therapieänderungen (Gutachten, TA Ja/Nein, Diagnosekorrektur, Unterversorgung, Übertherapie)
- Sheet 5 layout: ICD onko (ICD | Anzahl)
- Apply formatting: headers bold, percentage columns formatted, column widths
Step 2: Write import_berichtswesen.py
One-time script to import previous years' Berichtswesen data into yearly_summary table for year-over-year comparisons.
Step 3: Commit
git add backend/app/services/excel_export.py backend/scripts/import_berichtswesen.py
git commit -m "feat: Excel export in exact Berichtswesen format + historical import"
Task 16: Coding Service & Reports API
Files:
- Create: backend/app/services/coding_service.py
- Create: backend/app/api/coding.py
- Create: backend/app/api/reports.py
- Create: backend/app/schemas/report.py
- Create: backend/app/services/excel_sync.py
Step 1: Write coding_service.py
- get_coding_queue(db, filters) — cases where gutachten=1 AND gutachten_typ IS NULL
- update_coding(db, case_id, gutachten_typ, therapieaenderung, ta_*, user_id) — set coding fields, log audit
- batch_update_coding(db, updates: list, user_id) — mass coding for historical data
Step 2: Write report schemas
backend/app/schemas/report.py:
- DashboardKPIs(total_cases, pending_icd, pending_coding, reports_generated, current_year_stats)
- WeeklyData(kw, erstberatungen, unterlagen, ablehnungen, keine_rm, gutachten)
- ReportMeta(id, jahr, kw, generated_at, download_url)
Step 3: Write coding API
backend/app/api/coding.py:
- GET /api/coding/queue → paginated coding queue (admin)
- PUT /api/coding/{id} → update single case coding (admin)
Step 4: Write reports API
backend/app/api/reports.py:
- GET /api/reports/dashboard → live KPIs + chart data
- GET /api/reports/weekly/{jahr}/{kw} → specific week data
- POST /api/reports/generate → generate .xlsx, save to disk + DB, return metadata
- GET /api/reports/download/{id} → serve generated .xlsx file
- GET /api/reports/list → all generated reports
Step 5: Write excel_sync.py
- sync_db_to_excel(db) → export current DB state to Abrechnung_DAK.xlsx format
- sync_excel_to_db(db, file) → import changes from edited Excel back to DB
- Triggered via POST /api/admin/excel-sync
Step 6: Register all new routers in main.py
Step 7: Commit
git add backend/app/services/coding_service.py backend/app/services/excel_sync.py backend/app/api/coding.py backend/app/api/reports.py backend/app/schemas/report.py
git commit -m "feat: coding queue, reports API, Excel sync"
Task 17: Push Backend to GitHub
Step 1: Create GitHub repo
gh repo create complexcaresolutions/dak.c2s --private --source=/home/frontend/dak_c2s --push
Step 2: Push develop branch
cd /home/frontend/dak_c2s
git push -u origin develop
Phase 4: Frontend
Task 18: Frontend Setup
Files:
- Create: frontend/ (Vite scaffold)
- Create: frontend/vite.config.ts
- Create: frontend/tailwind.config.js
- Create: frontend/src/main.tsx
- Create: frontend/src/App.tsx
Step 1: Scaffold React + Vite + TypeScript
cd /home/frontend/dak_c2s
pnpm create vite frontend --template react-ts
cd frontend
pnpm install
Step 2: Install dependencies
pnpm add axios react-router-dom recharts
pnpm add -D tailwindcss @tailwindcss/vite
Step 3: Configure Tailwind
Add Tailwind via @tailwindcss/vite plugin in vite.config.ts. Add @import "tailwindcss" in CSS.
Step 4: Configure Vite API proxy
// frontend/vite.config.ts
export default defineConfig({
plugins: [react(), tailwindcss()],
server: {
proxy: {
'/api': 'http://localhost:8000',
},
},
});
Step 5: Initialize shadcn/ui
pnpm dlx shadcn@latest init
# Select: TypeScript, default style, CSS variables
Step 6: Verify dev server starts
pnpm dev
# Visit http://localhost:5173
Step 7: Commit
git add frontend/
git commit -m "feat: frontend setup — React, Vite, TypeScript, Tailwind, shadcn/ui"
Task 19: Auth Context & API Client
Files:
- Create: frontend/src/services/api.ts
- Create: frontend/src/services/authService.ts
- Create: frontend/src/context/AuthContext.tsx
- Create: frontend/src/hooks/useAuth.ts
- Create: frontend/src/types/index.ts
- Create: frontend/src/components/layout/ProtectedRoute.tsx
Step 1: Write TypeScript types
frontend/src/types/index.ts:
- User (id, username, email, role, mfa_enabled, is_active, last_login)
- LoginRequest, RegisterRequest, TokenResponse
- Case, CaseListResponse, ImportPreview, ImportResult
- DashboardKPIs, WeeklyData, ReportMeta
- Notification
Step 2: Write API client with JWT interceptor
frontend/src/services/api.ts:
- Axios instance with base URL /api
- Request interceptor: attach access token from localStorage
- Response interceptor: on 401, try refresh token, retry original request
- If refresh fails, redirect to login
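The interceptor's 401-handling rule can be sketched as a pure decision function (names like `decide401` are illustrative, not part of the plan; the real interceptor wires this into Axios):

```typescript
// Decision logic for the response interceptor: one refresh attempt per
// request, never a retry loop, fall back to login when refresh is impossible.
type RetryDecision = "refresh-and-retry" | "redirect-to-login" | "pass-through";

function decide401(
  status: number,
  alreadyRetried: boolean,
  hasRefreshToken: boolean
): RetryDecision {
  if (status !== 401) return "pass-through"; // only 401 triggers refresh
  if (alreadyRetried || !hasRefreshToken) return "redirect-to-login";
  return "refresh-and-retry"; // refresh once, then replay the original request
}
```

Tracking `alreadyRetried` on the request config is what prevents an expired refresh token from causing an infinite retry loop.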
Step 3: Write authService.ts
- login(email, password, mfaCode?) → store tokens, return user
- register(data) → create account
- logout() → call API, clear tokens
- refreshToken() → get new access token
Step 4: Write AuthContext + useAuth hook
- AuthProvider wraps app, stores user + loading state
- useAuth() → { user, login, logout, register, isAdmin, isLoading }
- On mount: check stored token, try refresh
Step 5: Write ProtectedRoute
- If not authenticated → redirect to /login
- If requireAdmin and user is not admin → show 403
- Otherwise render children
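The guard rules above reduce to a small pure function that the component can branch on (a sketch with assumed names; the actual component renders `<Navigate />` / a 403 view accordingly):

```typescript
// Route-guard decision for ProtectedRoute. `guardRoute` and `Guard` are
// illustrative names; roles match the RBAC model (admin / dak_mitarbeiter).
type Guard = "render" | "login" | "forbidden";

interface SessionUser {
  role: "admin" | "dak_mitarbeiter";
}

function guardRoute(user: SessionUser | null, requireAdmin: boolean): Guard {
  if (!user) return "login"; // not authenticated → redirect to /login
  if (requireAdmin && user.role !== "admin") return "forbidden"; // 403 view
  return "render"; // authorized → render children
}
```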
Step 6: Commit
git add frontend/src/
git commit -m "feat: auth context, API client with JWT refresh, protected routes"
Task 20: Login & Register Pages
Files:
- Create: frontend/src/pages/LoginPage.tsx
- Create: frontend/src/pages/RegisterPage.tsx
- Modify: frontend/src/App.tsx (add routes)
Step 1: Install shadcn components
pnpm dlx shadcn@latest add button input label card form
Step 2: Write LoginPage
- Email + password form
- Optional MFA code field (shown after first login attempt if MFA enabled)
- Error display for invalid credentials / account locked
- Link to register
Step 3: Write RegisterPage
- Username, email, password fields
- Optional invitation token field (from URL param)
- Domain validation feedback (@dak.de)
Step 4: Wire up React Router in App.tsx
<Routes>
<Route path="/login" element={<LoginPage />} />
<Route path="/register" element={<RegisterPage />} />
<Route path="/*" element={<ProtectedRoute><AppLayout /></ProtectedRoute>} />
</Routes>
Step 5: Commit
git add frontend/src/pages/ frontend/src/App.tsx
git commit -m "feat: login and register pages with MFA support"
Task 21: App Layout
Files:
- Create: frontend/src/components/layout/AppLayout.tsx
- Create: frontend/src/components/layout/Sidebar.tsx
- Create: frontend/src/components/layout/Header.tsx
Step 1: Install shadcn components
pnpm dlx shadcn@latest add avatar dropdown-menu separator sheet badge
Step 2: Write Sidebar
Navigation items (role-aware):
- Dashboard (all)
- Fälle (all)
- Import (admin)
- ICD-Eingabe (dak_mitarbeiter)
- Coding (admin)
- Berichte (all)
- Admin (admin only): Users, Einladungen, Audit-Log
Step 3: Write Header
- App title
- NotificationBell (placeholder for now)
- User dropdown (profile, logout)
Step 4: Write AppLayout
- Sidebar + Header + <Outlet /> for page content
- Responsive: collapsible sidebar on mobile
Step 5: Commit
git add frontend/src/components/layout/
git commit -m "feat: app layout with role-aware sidebar and header"
Task 22: Dashboard Page
Files:
- Create: frontend/src/pages/DashboardPage.tsx
- Create: frontend/src/components/dashboard/KPICards.tsx
- Create: frontend/src/components/dashboard/WeeklyChart.tsx
- Create: frontend/src/components/dashboard/FallgruppenDonut.tsx
- Create: frontend/src/components/dashboard/VorjahresComparison.tsx
- Create: frontend/src/hooks/useDashboard.ts
Step 1: Install shadcn components
pnpm dlx shadcn@latest add card tabs
Step 2: Write useDashboard hook
- Fetch GET /api/reports/dashboard
- Return: kpis, weeklyData, loading, error
Step 3: Write KPICards
4 cards: Erstberatungen gesamt, Pending ICD, Pending Coding, Gutachten gesamt. Color-coded.
Step 4: Write WeeklyChart
Recharts BarChart showing weekly Erstberatungen + Gutachten trend.
Step 5: Write FallgruppenDonut
Recharts PieChart showing distribution across 5 Fallgruppen.
Step 6: Write VorjahresComparison
Table comparing current year vs previous year key metrics.
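The comparison column for each metric can be computed with a small helper (a hypothetical sketch; the function name and rounding are assumptions, and the zero-previous-year guard avoids division by zero for new metrics):

```typescript
// Year-over-year change in percent, rounded to one decimal place.
// Returns null when the previous year has no data to compare against.
function yoyChangePct(current: number, previous: number): number | null {
  if (previous === 0) return null; // no meaningful percentage
  return Math.round(((current - previous) / previous) * 1000) / 10;
}
```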
Step 7: Assemble DashboardPage
Layout: KPICards (top), WeeklyChart (left) + FallgruppenDonut (right), VorjahresComparison (bottom).
Step 8: Commit
git add frontend/src/pages/DashboardPage.tsx frontend/src/components/dashboard/ frontend/src/hooks/useDashboard.ts
git commit -m "feat: dashboard with KPI cards, weekly chart, fallgruppen donut, year comparison"
Task 23: Cases Page
Files:
- Create: frontend/src/pages/CasesPage.tsx
- Create: frontend/src/components/cases/CaseTable.tsx
- Create: frontend/src/components/cases/CaseDetail.tsx
- Create: frontend/src/components/cases/ICDInlineEdit.tsx
- Create: frontend/src/hooks/useCases.ts
Step 1: Install shadcn components
pnpm dlx shadcn@latest add table dialog select pagination
Step 2: Write useCases hook
- Fetch GET /api/cases with pagination + filters
- CRUD operations for individual cases
Step 3: Write CaseTable
- Columns: ID, KW, Nachname, Vorname, Fallgruppe, ICD, Gutachten, Status
- Filters: Jahr, KW, Fallgruppe, has_icd, has_coding
- Pagination
- Click row → CaseDetail dialog
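The filter state can be serialized into the GET /api/cases query string with a generic helper (a sketch; `buildCaseQuery` and the exact parameter names mirror the filters listed above but are assumptions about the backend's query API):

```typescript
// Turn the table's filter state into a query string, skipping unset filters.
interface CaseFilters {
  jahr?: number;
  kw?: number;
  fallgruppe?: string;
  has_icd?: boolean;
  has_coding?: boolean;
  page?: number;
}

function buildCaseQuery(f: CaseFilters): string {
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(f)) {
    if (value !== undefined) params.set(key, String(value));
  }
  const qs = params.toString();
  return qs ? `?${qs}` : "";
}
```

Keeping this in one place means the table, the CSV-style filters, and any "share this view" URL all agree on the parameter encoding.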
Step 4: Write CaseDetail
- Full case view in dialog/sheet
- Editable fields (admin): all case fields
- Read-only for dak_mitarbeiter (except ICD)
Step 5: Write ICDInlineEdit
- Inline ICD editing in case table or detail view
- Validation feedback (regex check)
- For dak_mitarbeiter: only ICD field editable
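The inline validation feedback can use a simple plausibility regex before the value is sent to the API (the exact pattern here is an assumption covering common ICD-10-GM shapes like "M54.5" or "E11.30"; the backend's regex check remains authoritative):

```typescript
// Client-side plausibility check for ICD-10 codes: one letter, two digits,
// optional dot with one or two more digits. Normalizes case and whitespace.
const ICD10_PATTERN = /^[A-Z][0-9]{2}(\.[0-9]{1,2})?$/;

function isValidICD(code: string): boolean {
  return ICD10_PATTERN.test(code.trim().toUpperCase());
}
```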
Step 6: Commit
git add frontend/src/pages/CasesPage.tsx frontend/src/components/cases/ frontend/src/hooks/useCases.ts
git commit -m "feat: cases page with filterable table, detail view, inline ICD edit"
Task 24: Import Pages
Files:
- Create: frontend/src/pages/ImportPage.tsx
- Create: frontend/src/components/import/CSVUpload.tsx
- Create: frontend/src/components/import/ImportPreview.tsx
- Create: frontend/src/components/import/ICDUpload.tsx
Step 1: Write CSVUpload
- File dropzone (accept .csv)
- Upload → POST /api/import/csv → show ImportPreview
Step 2: Write ImportPreview
- Table showing parsed rows (new vs duplicate)
- Confirm/Cancel buttons
- On confirm → POST /api/import/csv/confirm → show ImportResult
Step 3: Write ICDUpload
- File dropzone (accept .xlsx)
- Upload → POST /api/import/icd-xlsx
- Show result (cases updated, errors)
- Template download link (GET /api/cases/coding-template)
Step 4: Assemble ImportPage
Tabs: "CSV Import" | "ICD Upload" | "Import-Log"
Step 5: Commit
git add frontend/src/pages/ImportPage.tsx frontend/src/components/import/
git commit -m "feat: import pages — CSV upload with preview, ICD Excel upload"
Task 25: Coding Queue Page
Files:
- Create: frontend/src/pages/CodingPage.tsx
- Create: frontend/src/components/coding/CodingQueue.tsx
- Create: frontend/src/components/coding/CodingCard.tsx
- Create: frontend/src/components/coding/CodingProgress.tsx
Step 1: Write CodingQueue
- List of cases with gutachten=1 but no gutachten_typ
- Each shows: Name, Fallgruppe, Kurzbeschreibung, Fragestellung
Step 2: Write CodingCard
- Individual case coding form
- Fields: gutachten_typ (Bestätigung/Alternative), therapieaenderung (Ja/Nein), checkboxes for ta_diagnosekorrektur, ta_unterversorgung, ta_uebertherapie
- Save → PUT /api/coding/{id}
Step 3: Write CodingProgress
- Progress bar: X of Y cases coded
- Stats: Bestätigung vs Alternative ratio
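The two figures CodingProgress displays reduce to simple percentages (a minimal sketch with assumed names; zero guards keep an empty queue from producing NaN):

```typescript
// Progress figures for the coding queue: share of cases coded, and the
// Bestätigung share among those already coded.
interface CodingStats {
  codedPct: number; // X of Y cases coded, in percent
  bestaetigungPct: number; // Bestätigung vs Alternative ratio, in percent
}

function codingStats(total: number, coded: number, bestaetigung: number): CodingStats {
  const codedPct = total === 0 ? 0 : Math.round((coded / total) * 100);
  const bestaetigungPct = coded === 0 ? 0 : Math.round((bestaetigung / coded) * 100);
  return { codedPct, bestaetigungPct };
}
```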
Step 4: Commit
git add frontend/src/pages/CodingPage.tsx frontend/src/components/coding/
git commit -m "feat: coding queue with progress tracking"
Task 26: Reports Page
Files:
- Create: frontend/src/pages/ReportsPage.tsx
- Create: frontend/src/components/reports/ReportList.tsx
- Create: frontend/src/components/reports/ReportDownload.tsx
Step 1: Write ReportList
- Table of generated reports (date, KW, generated_by)
- Download button per report → GET /api/reports/download/{id}
- Generate button (admin) → POST /api/reports/generate
Step 2: Commit
git add frontend/src/pages/ReportsPage.tsx frontend/src/components/reports/
git commit -m "feat: reports page with list and download"
Task 27: Notifications
Files:
- Create: frontend/src/components/notifications/NotificationBell.tsx
- Create: frontend/src/components/notifications/NotificationDropdown.tsx
- Create: frontend/src/hooks/useNotifications.ts
Step 1: Write useNotifications hook
- Poll GET /api/notifications every 30 seconds
- Return: notifications, unreadCount, markAsRead, markAllAsRead
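The hook's read-state bookkeeping can be kept as pure functions so the component only re-renders on real changes (a sketch; the `Notification` shape here is a minimal assumption):

```typescript
// Read-state helpers for useNotifications: immutable updates so React
// state comparisons work, plus the unread count for the bell badge.
interface AppNotification {
  id: number;
  read: boolean;
}

function markAsRead(items: AppNotification[], id: number): AppNotification[] {
  return items.map((n) => (n.id === id ? { ...n, read: true } : n));
}

function unreadCount(items: AppNotification[]): number {
  return items.filter((n) => !n.read).length;
}
```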
Step 2: Write NotificationBell + Dropdown
- Bell icon in header with unread badge count
- Dropdown: list of recent notifications with timestamps
- Click → mark as read + navigate to related entity
Step 3: Commit
git add frontend/src/components/notifications/ frontend/src/hooks/useNotifications.ts
git commit -m "feat: notification bell with dropdown and polling"
Task 28: Admin Pages
Files:
- Create: frontend/src/pages/AdminUsersPage.tsx
- Create: frontend/src/pages/AdminInvitationsPage.tsx
- Create: frontend/src/pages/AdminAuditPage.tsx
- Create: frontend/src/components/admin/UserManagement.tsx
- Create: frontend/src/components/admin/InvitationLinks.tsx
- Create: frontend/src/components/admin/AuditLog.tsx
Step 1: Write UserManagement
- Table: username, email, role, active, last_login
- Edit button → dialog with role/active toggle
- Create user button
Step 2: Write InvitationLinks
- Create invitation form (email optional, expiry date)
- List existing invitations with status (active/used/expired)
- Copy link button
Step 3: Write AuditLog
- Paginated table: timestamp, user, action, entity, old/new values (JSON expandable)
- Filter by user, action, date range
Step 4: Commit
git add frontend/src/pages/Admin* frontend/src/components/admin/
git commit -m "feat: admin pages — user management, invitations, audit log"
Phase 5: Integration & Deploy
Task 29: Frontend Production Build & Integration Test
Step 1: Build frontend
cd /home/frontend/dak_c2s/frontend
pnpm build
# Output: frontend/dist/
Step 2: Test full stack locally
- Backend: uvicorn app.main:app --port 8000
- Serve frontend build from dist/
- Test all flows: login → dashboard → import CSV → view cases → enter ICD → coding → generate report → download
Step 3: Commit
git add -A
git commit -m "feat: frontend production build, integration tested"
Task 30: GitHub & Deploy to Hetzner 1
Step 1: Push to GitHub
cd /home/frontend/dak_c2s
git push origin develop
git checkout main
git merge develop
git push origin main
git checkout develop
Step 2: Clone on Hetzner 1
ssh hetzner1
# As appropriate user:
cd /var/www/vhosts/dak.complexcaresolutions.de/
git clone git@github.com:complexcaresolutions/dak.c2s.git .
# Or: git init + git remote add origin + git pull
Step 3: Setup Python venv on Hetzner
cd /var/www/vhosts/dak.complexcaresolutions.de/backend
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
Step 4: Configure .env on Hetzner
- DB_HOST=localhost (MariaDB is local on Hetzner 1)
- All other production values
Step 5: Run Alembic migrations on production DB
cd /var/www/vhosts/dak.complexcaresolutions.de/backend
source venv/bin/activate
alembic upgrade head
Step 6: Create systemd service
Copy service file from spec section 10 to /etc/systemd/system/dak-backend.service.
sudo systemctl daemon-reload
sudo systemctl enable dak-backend
sudo systemctl start dak-backend
sudo systemctl status dak-backend
Step 7: Configure Plesk Nginx
Add directives from spec section 9 in Plesk → dak.complexcaresolutions.de → Additional nginx directives.
Step 8: Build frontend on Hetzner
cd /var/www/vhosts/dak.complexcaresolutions.de/frontend
npm install
npm run build
Step 9: Create admin account
cd /var/www/vhosts/dak.complexcaresolutions.de/backend
source venv/bin/activate
python -m scripts.create_admin
Step 10: Test SMTP
python -c "
from app.services.notification_service import send_email
send_email('test@example.com', 'Test', 'Portal SMTP works')
"
Step 11: Smoke test production
- Visit https://dak.complexcaresolutions.de
- Login with admin
- Check dashboard loads
- Upload test CSV
- Verify notifications
Step 12: Invite DAK users
- Create invitation links via admin panel
- Send to DAK contacts
Dependency Graph
Task 1 (Scaffolding) → Task 2 (Models) → Task 3 (Alembic)
→ Task 4 (Security) → Task 5 (Auth API) → Task 6 (Admin API)
→ Task 7 (Utils) → Task 8 (CSV Parser) → Task 9 (Import Service)
→ Task 10 (ICD Service)
→ Task 11 (Import/Cases API) ← Task 9, 10
→ Task 12 (Historical Import)
→ Task 13 (Notifications)
→ Task 14 (Reports) → Task 15 (Excel Export) → Task 16 (Coding + Reports API)
→ Task 17 (Push to GitHub)
→ Task 18 (Frontend Setup) → Task 19 (Auth Context) → Task 20 (Login)
→ Task 21 (Layout)
→ Task 22 (Dashboard)
→ Task 23 (Cases)
→ Task 24 (Import Pages)
→ Task 25 (Coding)
→ Task 26 (Reports)
→ Task 27 (Notifications)
→ Task 28 (Admin Pages)
→ Task 29 (Integration) → Task 30 (Deploy)
Notes
- Referenzdaten: User provides data files to data/ — historical import (Task 12) runs when available
- Python compatibility: Code must work on both 3.13 (dev) and 3.11 (prod) — avoid 3.12+ syntax like `type` statements
- Testing strategy: Unit tests for services/utils (pytest), manual integration tests for API endpoints, visual testing for frontend
- DB: Single MariaDB instance on Hetzner 1 — dev connects remotely, prod connects locally