# Supabase Edge Functions Security Audit

You are a Senior DevOps Security Auditor. Your task is to verify whether the
Supabase Edge Functions have been correctly implemented according to the runbook (Article 3).

Execute the following checks in order. Use the available tools.
For each check: Report PASS, WARNING, or CRITICAL with a brief justification.
At the end, produce a summary with recommended actions.

---

## A1: Edge Functions Used Only for Integrations

Verify that Edge Functions serve as integration points, not as a second backend.

```bash
# List all Edge Functions
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

echo "=== Functions directory: $FUNCTIONS_DIR ==="
ls -d ${FUNCTIONS_DIR}/*/ 2>/dev/null | grep -v "main\|_shared"

# For each function: webhook/integration or business logic?
echo "=== Webhook/Integration Patterns ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  echo "--- $name ---"

  # Webhook patterns present?
  grep -l "signature\|webhook\|stripe\|github\|trigger\|event" "$dir"*.ts 2>/dev/null || true

  # Business-logic patterns (warning)?
  if grep -qE "getUser|session|createServerClient|\.from\(.*\)\.select\(.*\)\.eq\(" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: May contain business logic (user context, complex queries)"
  fi

  # CRUD patterns (warning)?
  if grep -qE "\.insert\(|\.update\(|\.delete\(" "$dir/index.ts" 2>/dev/null; then
    COUNT=$(grep -cE "\.insert\(|\.update\(|\.delete\(" "$dir/index.ts" 2>/dev/null)
    if [ "${COUNT:-0}" -gt 2 ]; then
      echo "WARNING: $COUNT DB mutations. Possibly business logic rather than an integration."
    fi
  fi
done
```

Expectation: Functions are webhooks, event handlers, or API integrations. Complex business logic (CRUD chains, user sessions, ownership checks) belongs in Next.js.
Risk: Duplicated logic in Edge Functions and Next.js leads to inconsistent validation and hard-to-debug errors.
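
The distinction is hard to capture with a grep alone, so a reference shape helps when reviewing hits by hand. Below is a minimal sketch of an integration-shaped handler: one event in, one narrow side effect, immediate acknowledgement. All names (`WebhookEvent`, `handleEvent`, the event types) are illustrative assumptions, not taken from the audited codebase.

```typescript
// Sketch of an integration-shaped handler: one event in, one narrow
// action out, no user sessions and no multi-step CRUD chains.
type WebhookEvent = { type: string; id: string };

// The single allowed side effect is injected, so the handler itself
// stays a thin dispatcher -- the shape check A1 is looking for.
function handleEvent(
  event: WebhookEvent,
  record: (e: WebhookEvent) => void,
): { status: number; body: string } {
  const known = ["payment.succeeded", "payment.failed"];
  if (!known.includes(event.type)) {
    // Unknown event types are acknowledged, not processed
    return { status: 200, body: "ignored" };
  }
  record(event); // e.g. insert into an events table; business rules stay in Next.js
  return { status: 200, body: "ok" };
}
```

A function that instead loads a user session, runs ownership checks, and chains several mutations is a sign that backend logic has leaked into the Edge layer.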

---

## A2: Webhook Endpoints Isolated

Verify that each webhook provider has its own dedicated function.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

echo "=== Webhook Functions ==="
ls -d ${FUNCTIONS_DIR}/*webhook*/ 2>/dev/null || echo "No webhook functions found"

# Check whether a single function serves multiple providers
for dir in ${FUNCTIONS_DIR}/*-webhook/; do
  [ -d "$dir" ] || continue
  name=$(basename "$dir")
  # Count distinct providers referenced, not matching lines
  providers=0
  for p in stripe github trigger slack sendgrid twilio; do
    grep -qi "$p" "$dir/index.ts" 2>/dev/null && providers=$((providers + 1))
  done
  if [ "$providers" -gt 1 ]; then
    echo "WARNING: $name references multiple providers ($providers)"
  else
    echo "OK: $name"
  fi
done
```

Expectation: One webhook provider per function (stripe-webhook, github-webhook, etc.).
Risk: Shared code paths mean shared failure modes: a malformed Stripe payload can take down the GitHub webhook.

---

## A3: CORS Correctly Configured

Verify that CORS is properly set on all functions.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Shared CORS config present?
echo "=== Shared CORS Config ==="
ls ${FUNCTIONS_DIR}/_shared/cors.ts 2>/dev/null || echo "WARNING: No shared cors.ts"

if [ -f "${FUNCTIONS_DIR}/_shared/cors.ts" ]; then
  cat "${FUNCTIONS_DIR}/_shared/cors.ts"
fi

# Wildcard CORS in production?
echo "=== Wildcard CORS ==="
grep -rn "'\\*'" ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | grep -i "origin" || \
  echo "No wildcard origin found (good)"

# Do all functions have an OPTIONS handler?
echo "=== OPTIONS Handler ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if ! grep -q "OPTIONS" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: $name has no OPTIONS handler"
  else
    echo "OK: $name"
  fi
done

# CORS headers in ALL responses (including errors)?
echo "=== CORS in Error Responses ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  RESPONSES=$(grep -c "new Response" "$dir/index.ts" 2>/dev/null)
  CORS_RESPONSES=$(grep -c "corsHeaders\|getCorsHeaders" "$dir/index.ts" 2>/dev/null)
  RESPONSES=${RESPONSES:-0}; CORS_RESPONSES=${CORS_RESPONSES:-0}
  if [ "$RESPONSES" -gt "$CORS_RESPONSES" ] && [ "$RESPONSES" -gt 0 ]; then
    echo "WARNING: $name has $RESPONSES responses but only $CORS_RESPONSES with CORS headers"
  fi
done
```

Expectation:
- Shared cors.ts present in _shared/
- No wildcard origin (`'*'`) in production
- All functions have an OPTIONS preflight handler
- CORS headers in all responses (including error responses)

Risk: Without CORS, browser requests fail. With a wildcard, any website can send requests.
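
As a reference while reviewing the `cat` output above, a passing `_shared/cors.ts` typically looks like the following sketch. The allow-list contents and the function name `getCorsHeaders` are illustrative assumptions.

```typescript
// Sketch of a shared _shared/cors.ts with an explicit origin allow-list.
const ALLOWED_ORIGINS = ["https://app.example.com", "http://localhost:3000"];

// Echo the origin back only when it is on the allow-list -- never "*".
// Unknown origins fall back to the primary app origin, so the browser
// blocks the cross-origin read.
function getCorsHeaders(origin: string | null): Record<string, string> {
  const allowed = origin !== null && ALLOWED_ORIGINS.includes(origin);
  return {
    "Access-Control-Allow-Origin": allowed ? origin! : ALLOWED_ORIGINS[0],
    "Access-Control-Allow-Headers": "authorization, content-type",
    "Access-Control-Allow-Methods": "POST, OPTIONS",
  };
}
```

Each function would answer OPTIONS preflights with these headers and spread them into every Response it constructs, including error responses.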

---

## B1: Webhook Signatures Verified

Verify that all webhook functions validate provider signatures.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

echo "=== Signature Verification ==="
for dir in ${FUNCTIONS_DIR}/*webhook*/; do
  [ -d "$dir" ] || continue
  name=$(basename "$dir")

  if grep -qE "signature|verify|hmac|crypto\.subtle" "$dir/index.ts" 2>/dev/null; then
    echo "OK: $name verifies signatures"

    # Timestamp check present (replay protection)?
    if grep -qE "timestamp|Date\.now|time" "$dir/index.ts" 2>/dev/null; then
      echo "  + Timestamp/replay protection present"
    else
      echo "  - WARNING: No timestamp check for replay protection"
    fi
  else
    echo "CRITICAL: $name has NO signature verification"
  fi
done

# Also check non-webhook functions that accept POST
echo "=== Non-Webhook Functions with POST ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" || "$name" == *"webhook"* ]] && continue
  if grep -q "POST" "$dir/index.ts" 2>/dev/null; then
    echo "INFO: $name accepts POST. Check its auth mechanism:"
    grep -n "auth\|token\|Bearer\|jwt\|verify" "$dir/index.ts" 2>/dev/null | head -3 || \
      echo "  WARNING: No auth mechanism detected"
  fi
done
```

Expectation: All webhook functions verify provider signatures (Stripe: stripe-signature + HMAC, GitHub: X-Hub-Signature-256).
Risk: Without signature verification, anyone can send forged events (e.g., fake payment confirmations).
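
The two properties this check greps for can be sketched as pure helpers. The HMAC digest itself would be computed with `crypto.subtle` against the raw request body; the helper names and the tolerance value below are illustrative assumptions, not provider requirements.

```typescript
// Replay protection: reject events whose signed timestamp is outside
// the tolerance window (Stripe's default recommendation is 5 minutes).
function isFresh(timestampSec: number, toleranceSec: number, nowMs = Date.now()): boolean {
  return Math.abs(nowMs / 1000 - timestampSec) <= toleranceSec;
}

// Constant-time comparison of two hex digests, so a signature check
// does not leak how many leading characters matched through timing.
function timingSafeEqualHex(a: string, b: string): boolean {
  if (a.length !== b.length) return false;
  let diff = 0;
  for (let i = 0; i < a.length; i++) {
    diff |= a.charCodeAt(i) ^ b.charCodeAt(i);
  }
  return diff === 0;
}
```

A webhook handler would recompute the HMAC of the raw body, compare it to the digest from the signature header with `timingSafeEqualHex`, and reject stale timestamps with `isFresh` before doing anything else.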

---

## B2: Supabase Client Correct (anon vs. service_role)

Verify that the Supabase client is correctly initialized in Edge Functions.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Shared client present?
echo "=== Shared Supabase Client ==="
ls ${FUNCTIONS_DIR}/_shared/supabase*.ts 2>/dev/null || echo "No shared client file"

# Where is service_role used?
echo "=== service_role Usage ==="
grep -rn "SERVICE_ROLE\|service_role" ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | \
  grep -v "_shared/"

# Which functions create the admin/service-role client?
echo "=== createClient with service_role ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if grep -qE "SERVICE_ROLE|createAdminClient|service_role" "$dir/index.ts" 2>/dev/null; then
    echo "$name: uses service_role"
    # Is user input used directly in queries?
    if grep -qE "payload\.|body\.|data\." "$dir/index.ts" 2>/dev/null; then
      if ! grep -qE "safeParse|z\.|validate|schema" "$dir/index.ts" 2>/dev/null; then
        echo "  WARNING: Uses service_role AND processes payload without detectable validation"
      fi
    fi
  fi
done
```

Expectation:
- Shared client factory in _shared/ (anon vs. admin separated)
- service_role only in webhook/integration functions (no user context)
- If service_role + user input: input validation must be present

Risk: service_role + unvalidated input = arbitrary DB operations without RLS.
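
The key-selection half of such a shared factory can be sketched as follows. The returned key would be passed to `createClient` from `@supabase/supabase-js`; the env var names and the function name `pickSupabaseKey` are assumptions about a typical setup, and in a real Edge Function the env lookup would go through `Deno.env.get`.

```typescript
// Sketch of the key selection inside a shared client factory
// (_shared/supabase.ts). Admin access must be an explicit, deliberate
// choice at the call site, never the default.
type ClientRole = "anon" | "service_role";

function pickSupabaseKey(
  env: Record<string, string | undefined>,
  role: ClientRole,
): string {
  const name = role === "service_role" ? "SUPABASE_SERVICE_ROLE_KEY" : "SUPABASE_ANON_KEY";
  const key = env[name];
  // Fail loudly at startup instead of silently running without auth
  if (!key) throw new Error(`${name} is not set`);
  return key;
}
```

Keeping both paths in one factory makes it greppable which functions opted into `service_role`, which is exactly what the check above relies on.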

---

## B3: Input Validation

Verify that all Edge Functions validate incoming data.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

echo "=== Input Validation ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue

  if grep -qE "zod|safeParse|z\.object|z\.string|validate|schema" "$dir/index.ts" 2>/dev/null; then
    echo "OK: $name has schema validation"
  elif grep -qE "JSON\.parse|req\.json\(\)" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: $name parses JSON/body without schema validation"
  else
    echo "INFO: $name may not process a body"
  fi
done

# Is zod imported via npm: (Deno-compatible)?
echo "=== Zod Import ==="
grep -rn "from.*zod\|import.*zod" ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | head -5
```

Expectation: All functions that process payloads have schema validation (e.g., zod).
Risk: Without validation, unexpected data structures can lead to undefined DB operations.
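
In the audited code this role is typically played by zod (`z.object(...).safeParse(body)`), which is what the grep above detects. A dependency-free sketch of the same idea, with an illustrative payload shape that is not taken from the audited codebase:

```typescript
// Validate shape and bounds before any DB access; reject everything else.
type PaymentEvent = { eventId: string; amountCents: number };

function parsePaymentEvent(body: unknown): PaymentEvent | null {
  if (typeof body !== "object" || body === null) return null;
  const b = body as Record<string, unknown>;
  if (typeof b.eventId !== "string" || b.eventId.length === 0) return null;
  if (
    typeof b.amountCents !== "number" ||
    !Number.isInteger(b.amountCents) ||
    b.amountCents < 0
  ) {
    return null;
  }
  // Only the validated fields are returned -- unknown keys are dropped
  return { eventId: b.eventId, amountCents: b.amountCents };
}
```

The important property, whether hand-rolled or via zod, is that the handler only ever touches the narrowed, validated object, never the raw body.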

---

## B4: Secrets Not in Code

Verify that secrets are correctly loaded via environment variables.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Hardcoded secrets
echo "=== Hardcoded Secrets ==="
grep -rn "sk_live\|sk_test\|whsec_\|ghsec_\|Bearer ey\|SG\.\|sk-proj\|sk-ant" \
  ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null || echo "No hardcoded secrets found (good)"

# Secrets loaded correctly via Deno.env?
echo "=== Deno.env.get Usage ==="
grep -rn "Deno.env.get" ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | head -10

# .env.functions file present and secured?
echo "=== .env.functions ==="
ENV_FUNC=$(find /opt -name ".env.functions" 2>/dev/null | head -1)
if [ -n "$ENV_FUNC" ]; then
  stat -c "%a %U" "$ENV_FUNC"
  # Tracked in Git?
  if git -C "$(dirname "$ENV_FUNC")" ls-files --error-unmatch .env.functions >/dev/null 2>&1; then
    echo "WARNING: .env.functions is tracked in Git"
  else
    echo "Not tracked in Git (good)"
  fi
else
  echo "No .env.functions found (secrets possibly provided via docker-compose)"
fi
```

Expectation:
- No hardcoded API keys, webhook secrets, or tokens in code
- Secrets loaded via Deno.env.get()
- .env.functions with permissions 600, not tracked in Git

Risk: Hardcoded secrets in the Git repo are exposed with every clone.

---

## B5: Timeouts and Long-Running Operations

Verify that Edge Functions do not contain long-running operations.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Long-running operation patterns
echo "=== Long-Running Patterns ==="
grep -rn "openai\|anthropic\|sharp\|ffmpeg\|puppeteer\|playwright\|pdf\|video\|transcode" \
  ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null || echo "No long-running patterns found (good)"

# Sleep/timeout patterns
echo "=== Sleep/Delay Patterns ==="
grep -rn "sleep\|setTimeout\|delay\|await new Promise" \
  ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | head -5

# External fetch calls without a timeout?
echo "=== Fetch without AbortSignal ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  FETCHES=$(grep -c "fetch(" "$dir/index.ts" 2>/dev/null)
  TIMEOUTS=$(grep -c "AbortSignal\|signal\|timeout" "$dir/index.ts" 2>/dev/null)
  FETCHES=${FETCHES:-0}; TIMEOUTS=${TIMEOUTS:-0}
  if [ "$FETCHES" -gt 0 ] && [ "$TIMEOUTS" -eq 0 ]; then
    echo "WARNING: $name has $FETCHES fetch calls without a timeout/AbortSignal"
  fi
done
```

Expectation:
- No AI inference, PDF generation, or video processing in Edge Functions
- External fetch calls have timeouts (AbortSignal)
- Long-running tasks are delegated to Trigger.dev

Risk: Long-running operations block worker slots. Subsequent webhook calls fail with timeouts.
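
The pattern the fetch check looks for can be sketched as a small wrapper using `AbortSignal.timeout` (available in Deno and other modern runtimes). The injectable `fetchImpl` parameter is an assumption added here purely so the helper can be exercised without a network; a real function would call the global `fetch` directly.

```typescript
// A fetch that cannot hang: the request is aborted after timeoutMs,
// freeing the worker slot for the next webhook call.
type FetchLike = (url: string, init?: { signal?: AbortSignal }) => Promise<Response>;

async function fetchWithTimeout(
  url: string,
  timeoutMs: number,
  fetchImpl: FetchLike = fetch,
): Promise<Response> {
  return fetchImpl(url, { signal: AbortSignal.timeout(timeoutMs) });
}
```

When the timeout fires, the promise rejects with a `TimeoutError`-flavored `DOMException`, which the function's try/catch should translate into a clean error response.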

---

## B6: Error Handling

Verify that Edge Functions handle errors securely.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Try/catch present?
echo "=== Try/Catch ==="
for dir in ${FUNCTIONS_DIR}/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if ! grep -q "try" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: $name has no try/catch"
  else
    echo "OK: $name"
  fi
done

# Error details in responses?
echo "=== Error Details Leaked to Client? ==="
grep -rn "error\.stack\|error\.message" ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null | \
  grep "Response" || echo "No error-detail leaks found (good)"

# Secrets in console.log?
echo "=== Secrets in Logs ==="
grep -rn "console\.log.*KEY\|console\.log.*SECRET\|console\.log.*token\|console\.log.*password" \
  ${FUNCTIONS_DIR}/ --include="*.ts" 2>/dev/null || echo "No secret logging found (good)"
```

Expectation:
- All functions have try/catch wrapping the main logic
- No error.stack or error.message in responses sent to the client
- No secrets in console.log

Risk: Stack traces in responses reveal internal paths and DB details. This facilitates targeted attacks.
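
The catch block shape these checks expect can be sketched as a single helper: full details go to the server logs, the client gets a generic message plus an opaque reference for support. The function name and response shape are illustrative assumptions.

```typescript
// Log the real error server-side; return only a generic message and a
// request ID the client can quote when reporting the problem.
function toClientError(err: unknown, requestId: string): { status: number; body: string } {
  // Full details stay in the server logs
  console.error(`[${requestId}]`, err);
  // The client never sees err.message or err.stack
  return {
    status: 500,
    body: JSON.stringify({ error: "Internal error", requestId }),
  };
}
```

In a real function the returned body would be wrapped in `new Response(...)` together with the shared CORS headers, so error responses pass the A3 check as well.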

---

## C1: Deployment Workflow

Verify that Edge Functions are correctly deployed.

```bash
FUNCTIONS_DIR=$(find /opt -path "*/volumes/functions" -type d 2>/dev/null | head -1)
FUNCTIONS_DIR=${FUNCTIONS_DIR:-"./supabase/functions"}

# Is the Edge Runtime container running?
echo "=== Edge Runtime Container ==="
docker ps --format '{{.Names}}\t{{.Image}}\t{{.Status}}' 2>/dev/null | grep -i "function\|edge"

# Functions present in the volume?
echo "=== Functions in the Volume ==="
ls -la ${FUNCTIONS_DIR}/ 2>/dev/null | head -20

# VERIFY_JWT setting
echo "=== VERIFY_JWT ==="
grep "VERIFY_JWT" /opt/supabase/.env /opt/supabase/docker-compose.yml 2>/dev/null
```

Expectation:
- Edge Runtime container running
- Functions present as files in the volume
- VERIFY_JWT consciously configured (false for webhooks that do not send JWTs)

Risk: With VERIFY_JWT=true, providers that send no JWT are rejected; with a blanket false, every function is publicly reachable and must enforce its own auth (signatures, tokens).

---

## Summary

Now produce a summary in the following format:

```
# Edge Functions Security Audit - [DATE]

## Result

PASS:     X of Y checks
WARNING:  X checks
CRITICAL: X checks

## Critical Findings (act immediately)
- ...

## Warnings (resolve this week)
- ...

## Passed
- ...

## Recommended Next Steps
1. ...
2. ...
3. ...
```

Prioritize strictly: Critical findings first, then warnings.
For each finding: What is the problem, why is it a risk, what is the fix.
