
Deploying Supabase Edge Functions Securely

DevOps runbook for Supabase Edge Functions: webhooks, signatures, CORS, Deno runtime, input validation, and Claude Code integration.

Mansoor Ahmed
Head of Engineering

Supabase Edge Functions are server-side TypeScript functions that run on the Deno runtime and are accessible through the Kong API Gateway.

They are particularly well suited for webhooks, integrations with external APIs, signed event endpoints, and lightweight server-side logic.

The most common mistake is using Edge Functions as a second backend system. This leads to duplicated business logic, unclear security boundaries, and architectures that are difficult to debug.

This runbook describes how to deploy Edge Functions securely in a self-hosted setup. Every step contains a concrete implementation with real Deno code, a verifiable condition, and a failure scenario.

At a Glance - Part 3 of 6 in the DevOps Runbook Series

  • Edge Functions serve as integration points only (webhooks, events), not as a second backend
  • Every webhook endpoint must verify the sender's signature
  • CORS explicitly configured (no wildcard in production)
  • Input validation with Zod on every incoming payload
  • Long-running tasks delegated to Trigger.dev instead of blocking Edge Functions

Note on the runtime: Supabase Edge Functions are based on Deno, not Node.js. This affects import syntax, the module system, and available APIs. All code examples in this article are Deno-compatible. Self-hosted Edge Functions run in the supabase/edge-runtime container and are currently still marked as beta.

Series Table of Contents

This guide is part of our DevOps runbook series for self-hosted app stacks.

  1. Supabase Self-Hosting Runbook
  2. Running Next.js on Supabase Securely
  3. Deploying Supabase Edge Functions Securely - this article
  4. Running Trigger.dev Background Jobs Securely
  5. Claude Code as Security Control in DevOps Workflows
  6. Security Baseline for the Entire Stack

Article 1 covers the platform. Article 2 covers the Next.js app layer. This article covers integrations and webhooks.

Architecture Overview

Browser
   |
Next.js (App Layer)
   |
   +-- Supabase Client (anon key)     -> for user requests
   |
Kong API Gateway
   |
   +-- PostgREST                       -> REST API with RLS
   +-- GoTrue                          -> Auth
   +-- Edge Runtime                    -> Edge Functions
   |     |
   |     +-- stripe-webhook
   |     +-- github-webhook
   |     +-- trigger-webhook
   |
PostgreSQL + RLS Policies

External Services
   |
   +-- Stripe / GitHub / Trigger.dev   -> call webhooks

Ground rule:

Edge Functions are integration points for external events.
Business logic belongs in Next.js (Server Actions / Route Handlers).
Long-running tasks belong in Trigger.dev.

Decision Criteria

If a function reacts to an external event (Stripe payment, GitHub push, Trigger.dev callback), it is an Edge Function. If it reacts to user input and transforms data, it belongs in Next.js. If it runs longer than 30 seconds, it belongs in Trigger.dev.

More than 40% of all webhook security incidents result from missing signature verification (GitGuardian State of Secrets Sprawl 2024).

Decision Table: Where Does Logic Belong?

Criterion     | Edge Function            | Next.js Server Action  | Trigger.dev Task
Trigger       | External event (webhook) | User input (browser)   | Programmatic / cron
Runtime       | Under 30 seconds         | Under 10 seconds       | Up to 5+ minutes
User context  | No (no JWT)              | Yes (session)          | No (service_role)
RLS active    | No (service_role)        | Yes (anon key)         | No (service_role)
Retry logic   | None (provider retries)  | None                   | Built-in (configurable)
Example       | Stripe webhook           | Save form data         | PDF generation
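
The decision criteria above can be condensed into a small routing helper. This is purely illustrative - the names routeWorkload, Trigger, and Target are assumptions, not part of any Supabase API:

```typescript
// Hypothetical routing helper encoding the decision table above
type Trigger = 'external-event' | 'user-input' | 'programmatic'
type Target = 'edge-function' | 'nextjs-server-action' | 'trigger.dev'

function routeWorkload(trigger: Trigger, expectedSeconds: number): Target {
  // Anything expected to run longer than 30 seconds is a background job
  if (expectedSeconds > 30) return 'trigger.dev'
  // External events (webhooks) land in Edge Functions
  if (trigger === 'external-event') return 'edge-function'
  // User input with session context belongs in Next.js
  if (trigger === 'user-input') return 'nextjs-server-action'
  // Programmatic / cron work goes to the job runner
  return 'trigger.dev'
}
```

For example, routeWorkload('external-event', 5) yields 'edge-function', while a two-minute PDF run (routeWorkload('programmatic', 120)) lands in Trigger.dev.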

How Edge Functions Run in a Self-Hosted Setup

In a self-hosted Docker Compose stack, the Edge Functions configuration looks like this:

# docker-compose.yml (excerpt)
functions:
  container_name: supabase-edge-functions
  image: supabase/edge-runtime:v1.66.4    # pin the version
  restart: unless-stopped
  depends_on:
    - analytics
  environment:
    JWT_SECRET: ${JWT_SECRET}
    SUPABASE_URL: http://kong:8000                    # internal Docker hostname
    SUPABASE_ANON_KEY: ${ANON_KEY}
    SUPABASE_SERVICE_ROLE_KEY: ${SERVICE_ROLE_KEY}
    SUPABASE_DB_URL: postgresql://postgres:${POSTGRES_PASSWORD}@${POSTGRES_HOST}:${POSTGRES_PORT}/${POSTGRES_DB}
    VERIFY_JWT: "${FUNCTIONS_VERIFY_JWT}"
  volumes:
    - ./volumes/functions:/home/deno/functions:Z
  command:
    - start
    - --main-service
    - /home/deno/functions/main

Edge Functions are stored as TypeScript files in the volume:

volumes/functions/
  main/
    index.ts              <- Router / Main Service
  _shared/
    cors.ts               <- CORS Headers (shared)
    supabase-client.ts    <- Supabase Client Factory (shared)
  stripe-webhook/
    index.ts
  github-webhook/
    index.ts

Deployment in a self-hosted setup: Functions are placed as files into the volume, and the container is restarted:

# Copy functions into the volume
cp -r supabase/functions/* /opt/supabase/volumes/functions/

# Restart the container
docker compose restart functions --no-deps

Part A - Architecture Decisions

These decisions are made once and rarely changed.

A1 - Edge Functions for Integrations Only, Not as a Second Backend

Implementation

Clear assignment of which logic belongs where:

Edge Functions (Deno):
  Webhook reception (Stripe, GitHub, external APIs)
  Event-based sync (DB webhook -> external service)
  Signature verification of incoming events
  Lightweight transformations (< 30 seconds)

Next.js (Server Actions / Route Handlers):
  User-facing mutations (CRUD)
  Session-based operations
  Business logic
  Input validation with user context

Trigger.dev (Background Jobs):
  PDF generation
  AI tasks
  Bulk email delivery
  Anything running longer than 30 seconds

Verifiable Condition

# How many Edge Functions exist?
ls -d volumes/functions/*/ | grep -v "main\|_shared" | wc -l

# For each function, check: is it an integration?
for dir in volumes/functions/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  echo "--- $name ---"
  # Does it contain webhook/integration patterns?
  grep -l "signature\|webhook\|stripe\|github\|trigger" "$dir"*.ts 2>/dev/null || \
    echo "WARNING: no webhook/integration patterns found"
done

Failure Scenario

If business logic is implemented in Edge Functions, a shadow architecture emerges: the same validation exists in Next.js Server Actions AND in Edge Functions, with subtle differences. Bugs become hard to reproduce because it is unclear which code path was active. Auth logic must be maintained in two places.

A2 - Isolate Webhook Endpoints: One Function per Webhook

Implementation

Each webhook provider gets its own function in its own directory:

volumes/functions/
  stripe-webhook/
    index.ts
  github-webhook/
    index.ts
  trigger-webhook/
    index.ts

Not like this:

volumes/functions/
  webhooks/
    index.ts     <- handles Stripe, GitHub, and Trigger in a single file

Verifiable Condition

# Each webhook function should serve exactly one provider
for dir in volumes/functions/*-webhook/; do
  name=$(basename "$dir")
  # Count distinct provider names, not matching lines
  providers=$(grep -oiE "stripe|github|trigger|slack|sendgrid" "$dir/index.ts" | tr '[:upper:]' '[:lower:]' | sort -u | wc -l)
  if [ "$providers" -gt 1 ]; then
    echo "WARNING: $name serves multiple providers ($providers)"
  fi
done

Failure Scenario

When multiple webhooks are handled in a single function, they share the same error handler. A malformed Stripe payload can block the GitHub webhook. Retry strategies differ per provider (Stripe retries with exponential backoff, GitHub only 3 times), and this is nearly impossible to handle cleanly in a shared function.

A3 - Configure CORS Correctly

Implementation

Edge Functions that are called from the browser (even indirectly) need CORS headers. Supabase does not provide automatic CORS configuration for Edge Functions.

Shared CORS config:

// volumes/functions/_shared/cors.ts

const allowedOrigins = [
  'https://app.example.com',
  ...(Deno.env.get('ENVIRONMENT') === 'development'
    ? ['http://localhost:3000']
    : [])
]

export function getCorsHeaders(req: Request) {
  const origin = req.headers.get('origin') ?? ''
  const corsOrigin = allowedOrigins.includes(origin) ? origin : ''

  return {
    'Access-Control-Allow-Origin': corsOrigin,
    'Access-Control-Allow-Headers':
      'authorization, x-client-info, apikey, content-type',
    'Access-Control-Allow-Methods': 'POST, GET, OPTIONS',
  }
}
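
The allow-list rule can also be expressed as a pure helper, which makes it easy to unit-test without a running Deno container. pickCorsOrigin is a hypothetical extraction of the logic above, not part of the shared module:

```typescript
// Hypothetical pure variant of the allow-list check above
function pickCorsOrigin(
  requestOrigin: string | null,
  allowedOrigins: string[],
): string {
  // Echo the origin back only if it is on the allow-list;
  // an empty value makes the browser reject the cross-origin response
  const origin = requestOrigin ?? ''
  return allowedOrigins.includes(origin) ? origin : ''
}
```

For an unlisted origin such as https://evil.example, the helper returns an empty string, so the browser blocks the response.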

Every Edge Function must handle CORS at the beginning:

import { getCorsHeaders } from '../_shared/cors.ts'

Deno.serve(async (req) => {
  const corsHeaders = getCorsHeaders(req)

  // CORS preflight must be the FIRST check
  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  // ... actual logic

  return new Response(JSON.stringify(data), {
    headers: { ...corsHeaders, 'Content-Type': 'application/json' },
  })
})

Verifiable Condition

# All functions must handle CORS
for dir in volumes/functions/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if ! grep -q "OPTIONS" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: $name has no OPTIONS handler"
  fi
done

# No wildcard CORS in production
grep -r "'\\*'" volumes/functions/ --include="*.ts" | grep -i "allow-origin"
# Expected: no matches (except in development branches)

Failure Scenario

Without CORS headers, all browser requests to Edge Functions fail. The browser blocks the response even if the function responds correctly. This manifests as a cryptic CORS error in the console. With wildcard CORS ('*') in production, any website can send requests to your Edge Functions and read the responses.

Part B - Implementation Checks

These checks apply to every Edge Function and must be verified with each deployment.

B1 - Verify Webhook Signatures

Implementation

Every webhook endpoint must verify the signature of the sending service. Without signature verification, anyone can send arbitrary payloads to the function.

Complete Stripe webhook example:

// volumes/functions/stripe-webhook/index.ts

import { getCorsHeaders } from '../_shared/cors.ts'

const STRIPE_WEBHOOK_SECRET = Deno.env.get('STRIPE_WEBHOOK_SECRET')
// Fail fast at startup if the secret is missing
if (!STRIPE_WEBHOOK_SECRET) throw new Error('STRIPE_WEBHOOK_SECRET is not set')

Deno.serve(async (req) => {
  const corsHeaders = getCorsHeaders(req)

  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  // 1. Only allow POST
  if (req.method !== 'POST') {
    return new Response(
      JSON.stringify({ error: 'Method not allowed' }),
      { status: 405, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )
  }

  // 2. Verify signature
  const signature = req.headers.get('stripe-signature')
  if (!signature) {
    return new Response(
      JSON.stringify({ error: 'Missing signature' }),
      { status: 401, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )
  }

  const body = await req.text()

  const isValid = await verifyStripeSignature(body, signature, STRIPE_WEBHOOK_SECRET!)
  if (!isValid) {
    return new Response(
      JSON.stringify({ error: 'Invalid signature' }),
      { status: 401, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )
  }

  // 3. Process the event
  const event = JSON.parse(body)

  switch (event.type) {
    case 'checkout.session.completed':
      await handleCheckoutCompleted(event.data.object)
      break
    case 'invoice.payment_failed':
      await handlePaymentFailed(event.data.object)
      break
    default:
      // Ignore unknown events, do not fail
      console.log(`Unhandled event type: ${event.type}`)
  }

  // 4. Always return 200 (Stripe retries otherwise)
  return new Response(
    JSON.stringify({ received: true }),
    { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
  )
})

// Stripe signature verification using the Deno Crypto API
async function verifyStripeSignature(
  payload: string,
  header: string,
  secret: string
): Promise<boolean> {
  const parts = header.split(',')
  const timestamp = parts.find(p => p.startsWith('t='))?.split('=')[1]
  const signature = parts.find(p => p.startsWith('v1='))?.split('=')[1]

  if (!timestamp || !signature) return false

  // Timing check: reject events older than 5 minutes
  const now = Math.floor(Date.now() / 1000)
  if (now - parseInt(timestamp, 10) > 300) return false

  const signedPayload = `${timestamp}.${payload}`
  const key = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(secret),
    { name: 'HMAC', hash: 'SHA-256' },
    false,
    ['sign']
  )

  const sig = await crypto.subtle.sign(
    'HMAC',
    key,
    new TextEncoder().encode(signedPayload)
  )

  const expectedSig = Array.from(new Uint8Array(sig))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')

  return expectedSig === signature
}

async function handleCheckoutCompleted(session: Record<string, unknown>) {
  console.log(`Checkout completed: ${session.id}`)
}

async function handlePaymentFailed(invoice: Record<string, unknown>) {
  console.log(`Payment failed: ${invoice.id}`)
}

GitHub webhook signature verification (different method):

// volumes/functions/github-webhook/index.ts

const GITHUB_WEBHOOK_SECRET = Deno.env.get('GITHUB_WEBHOOK_SECRET')

async function verifyGitHubSignature(
  payload: string,
  signatureHeader: string,
  secret: string
): Promise<boolean> {
  // GitHub uses sha256=<hex>
  const expected = signatureHeader.replace('sha256=', '')

  const key = await crypto.subtle.importKey(
    'raw',
    new TextEncoder().encode(secret),
    { name: 'HMAC', hash: 'SHA-256' },
    false,
    ['sign']
  )

  const sig = await crypto.subtle.sign(
    'HMAC',
    key,
    new TextEncoder().encode(payload)
  )

  const computed = Array.from(new Uint8Array(sig))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('')

  return computed === expected
}

Verifiable Condition

# Every webhook function must have signature verification
for dir in volumes/functions/*-webhook/; do
  name=$(basename "$dir")
  if ! grep -qE "signature|verify|hmac" "$dir/index.ts" 2>/dev/null; then
    echo "CRITICAL: $name has no signature verification"
  else
    echo "OK: $name verifies signatures"
  fi
done

# Check the VERIFY_JWT setting
# Webhooks need VERIFY_JWT=false because external services do not send a Supabase JWT
grep "VERIFY_JWT" .env

Failure Scenario

Without signature verification, an attacker can send arbitrary events to the webhook:

curl -X POST https://app.example.com/functions/v1/stripe-webhook \
  -H "Content-Type: application/json" \
  -d '{"type":"checkout.session.completed","data":{"object":{"customer":"cus_fake"}}}'

This request would trigger a forged checkout confirmation. The function would update the user status even though no payment was ever made.

B2 - Initialize the Supabase Client Correctly (anon vs. service_role)

Implementation

Edge Functions have access to all Supabase environment variables. Choosing between the anon key and the service_role key is the most important security decision per function.

Shared client factory:

// volumes/functions/_shared/supabase-client.ts

import { createClient, SupabaseClient } from 'npm:@supabase/supabase-js@2'

// Client WITH RLS (for user-context operations)
export function createAnonClient(authHeader?: string): SupabaseClient {
  const client = createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_ANON_KEY')!,
    {
      global: {
        headers: authHeader ? { Authorization: authHeader } : {},
      },
    }
  )
  return client
}

// Client WITHOUT RLS (for webhook processing, admin tasks)
// WARNING: Bypasses all Row Level Security policies
export function createAdminClient(): SupabaseClient {
  return createClient(
    Deno.env.get('SUPABASE_URL')!,
    Deno.env.get('SUPABASE_SERVICE_ROLE_KEY')!
  )
}

When to use which client:

createAnonClient(authHeader):
  User-facing functions (rare, usually belong in Next.js)
  When RLS should control access

createAdminClient():
  Webhook processing (Stripe, GitHub)
  Internal sync tasks
  When the function has NO user context

Rule of thumb: Webhooks have no user context,
so they need service_role.
But: Execute only the minimal required operations.

Verifiable Condition

# Where is service_role / admin client used?
grep -rn "SERVICE_ROLE\|createAdminClient\|service_role" \
  volumes/functions/ --include="*.ts" | grep -v "_shared/"

# Expected: only in webhook functions, not in user-facing functions

# Is the admin client fed with unchecked user input?
# Manual review: in every function that uses createAdminClient,
# check whether input is validated before the DB operation

Failure Scenario

If an Edge Function runs with service_role and passes user input directly into queries, it bypasses RLS entirely. A manipulated webhook payload could then read or write arbitrary data in any table, because the service_role key has no restrictions.

B3 - Input Validation

Implementation

Every Edge Function must validate incoming data before processing it. In Deno, Zod works via npm import:

// volumes/functions/stripe-webhook/index.ts (excerpt)

import { z } from 'npm:zod@3'

// Schema for the expected Stripe event
const stripeEventSchema = z.object({
  id: z.string().startsWith('evt_'),
  type: z.string(),
  data: z.object({
    object: z.record(z.unknown()),
  }),
})

// In the main function after signature verification:
const parsed = stripeEventSchema.safeParse(JSON.parse(body))

if (!parsed.success) {
  return new Response(
    JSON.stringify({ error: 'Invalid payload structure' }),
    { status: 400, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
  )
}

// From here, work with parsed.data (type-safe)
const event = parsed.data

Verifiable Condition

# Every function should have input validation
for dir in volumes/functions/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if grep -qE "zod|safeParse|z\.object|z\.string" "$dir/index.ts" 2>/dev/null; then
    echo "OK: $name has schema validation"
  elif grep -qE "JSON\.parse" "$dir/index.ts" 2>/dev/null; then
    echo "WARNING: $name parses JSON without schema validation"
  fi
done

Failure Scenario

Without input validation, the function accepts any payload that has a valid signature. A compromised API key at the webhook provider could then send unexpected data structures that lead to undefined database operations - for example, undefined as a user ID in an insert.

B4 - Secrets in Environment Variables, Never in Code

Implementation

In a self-hosted setup, secrets are passed to Edge Functions through the Docker Compose configuration:

# docker-compose.yml (excerpt)
functions:
  environment:
    JWT_SECRET: ${JWT_SECRET}
    SUPABASE_URL: http://kong:8000
    SUPABASE_ANON_KEY: ${ANON_KEY}
    SUPABASE_SERVICE_ROLE_KEY: ${SERVICE_ROLE_KEY}
  env_file:
    - .env.functions    # additional secrets for functions

# .env.functions (only on the server, not in Git)
STRIPE_WEBHOOK_SECRET=whsec_...
STRIPE_SECRET_KEY=sk_live_...
GITHUB_WEBHOOK_SECRET=ghsec_...
TRIGGER_DEV_API_KEY=tr_...

Accessing secrets in code:

// Correct: environment variable
const secret = Deno.env.get('STRIPE_WEBHOOK_SECRET')

// NEVER do this: hardcoded
const secret = 'whsec_abc123...'
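
It also pays to fail fast at startup when a required secret is missing, instead of discovering the gap on the first live webhook. A minimal sketch - assertRequiredKeys is a hypothetical helper; inside a function you would pass Deno.env.toObject():

```typescript
// Hypothetical startup guard: throws if any required env var is unset
function assertRequiredKeys(
  env: Record<string, string | undefined>,
  keys: string[],
): void {
  const missing = keys.filter((k) => !env[k])
  if (missing.length > 0) {
    // Name only the variables, never log their values
    throw new Error(`Missing required env vars: ${missing.join(', ')}`)
  }
}
```

Calling assertRequiredKeys(Deno.env.toObject(), ['STRIPE_WEBHOOK_SECRET', 'STRIPE_SECRET_KEY']) at module load time turns a silent misconfiguration into an immediate, visible crash.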

For the general fundamentals of secrets management, see Data Security in Enterprise AI Infrastructure.

Verifiable Condition

# Hardcoded secrets in the code?
grep -rn "sk_live\|sk_test\|whsec_\|ghsec_\|Bearer ey" \
  volumes/functions/ --include="*.ts"
# Expected: no matches

# .env.functions not in Git?
git ls-files .env.functions
# Expected: empty

# File permissions correct?
stat -c "%a" .env.functions
# Expected: 600

Failure Scenario

Hardcoded secrets in source code end up in the Git repository. Even if the repo is private, every developer with repo access also has access to production secrets. In the event of an accidental public push, the secrets are immediately compromised.

B5 - Avoid Timeouts and Long-Running Tasks

Implementation

Edge Functions are designed for short, idempotent operations. The self-hosted Edge Runtime has a configurable default timeout (typically 60 seconds). Long-running tasks block worker slots.

Suitable (< 30 seconds):
  Receive webhook and update DB
  Verify signature and forward event
  Short API calls to external services

NOT suitable (use Trigger.dev):
  PDF generation
  AI inference (LLM calls)
  Video/image processing
  Bulk database operations
  Email delivery to many recipients

Pattern for long-running tasks: Edge Function as trigger, job in Trigger.dev:

// volumes/functions/process-document/index.ts

// CORRECT: Edge Function only triggers the job
Deno.serve(async (req) => {
  // ... verify signature, validate input ...

  // Delegate long-running job to Trigger.dev
  const triggerResponse = await fetch(
    `${Deno.env.get('TRIGGER_DEV_URL')}/api/v1/tasks/process-document/trigger`,
    {
      method: 'POST',
      headers: {
        'Authorization': `Bearer ${Deno.env.get('TRIGGER_DEV_API_KEY')}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({
        payload: { documentId: parsed.data.documentId },
      }),
    }
  )

  // Respond immediately, the job runs in the background
  return new Response(
    JSON.stringify({ queued: true }),
    { status: 202, headers: { 'Content-Type': 'application/json' } }
  )
})

Verifiable Condition

# Search for patterns indicating long-running tasks
grep -rn "await.*fetch.*openai\|pdf\|sharp\|ffmpeg\|sleep\|setTimeout" \
  volumes/functions/ --include="*.ts" | grep -v "trigger"
# Expected: no matches (except short API calls)

# Are there timeouts in the code?
grep -rn "AbortSignal.timeout\|setTimeout" \
  volumes/functions/ --include="*.ts"
# Every external fetch should have a timeout
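
Such a timeout can be attached with the standard AbortSignal.timeout, available in Deno and recent Node. A sketch (the wrapper name is an assumption):

```typescript
// Hypothetical wrapper: abort any outbound request after `ms` milliseconds
// so a slow external API cannot occupy a worker slot indefinitely
async function fetchWithTimeout(
  url: string,
  ms: number,
  init: RequestInit = {},
): Promise<Response> {
  // AbortSignal.timeout() fires after the deadline and rejects the fetch
  return fetch(url, { ...init, signal: AbortSignal.timeout(ms) })
}
```

For example, await fetchWithTimeout('https://api.stripe.com/v1/events', 5000) fails with a timeout error after 5 seconds instead of hanging until the runtime kills the function.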

Failure Scenario

An Edge Function that waits 2 minutes for an AI API call blocks a worker slot in the Edge Runtime container. With multiple concurrent requests, the workers fill up and subsequent webhook calls (e.g., from Stripe) fail with timeouts. Stripe interprets this as an error and retries, which makes the situation worse. The correct pattern is to delegate these jobs to Trigger.dev with proper retry and concurrency configuration.

B6 - Error Handling and Safe Responses

Implementation

Edge Functions must not leak internal details to the caller on errors.

// Pattern for safe error handling

Deno.serve(async (req) => {
  const corsHeaders = getCorsHeaders(req)

  if (req.method === 'OPTIONS') {
    return new Response('ok', { headers: corsHeaders })
  }

  try {
    // ... actual logic ...

    return new Response(
      JSON.stringify({ success: true }),
      { status: 200, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )

  } catch (error) {
    // Narrow the unknown catch value before accessing message/stack
    const err = error instanceof Error ? error : new Error(String(error))

    // Log internally (details)
    console.error(`Function error: ${err.message}`, {
      stack: err.stack,
      // DO NOT log secrets or user data
    })

    // Respond externally (generic)
    return new Response(
      JSON.stringify({ error: 'Internal server error' }),
      { status: 500, headers: { ...corsHeaders, 'Content-Type': 'application/json' } }
    )
  }
})

What does NOT belong in responses or logs:

// WRONG: sending stack trace to the client
return new Response(JSON.stringify({ error: error.stack }), { status: 500 })

// WRONG: logging secrets
console.log(`Connecting with key: ${Deno.env.get('SERVICE_ROLE_KEY')}`)

// WRONG: logging complete user data
console.log(`Processing user: ${JSON.stringify(user)}`)

Verifiable Condition

# Are error details sent to the client?
grep -rn "error\.stack\|error\.message" volumes/functions/ --include="*.ts" | \
  grep "Response"
# Expected: no matches (stack/message only in console.error)

# Are secrets logged?
grep -rn "console\.log.*KEY\|console\.log.*SECRET\|console\.log.*token" \
  volumes/functions/ --include="*.ts"
# Expected: no matches

Failure Scenario

If an error stack trace is sent to the client, an attacker sees internal paths, module names, and database connection details. This significantly simplifies targeted attacks. If secrets end up in logs, they are visible to anyone with monitoring access.

Part C - Operations and Monitoring

C1 - Deployment Workflow

Implementation

In a self-hosted setup, there is no supabase functions deploy command. The workflow is file-based:

#!/bin/bash
# scripts/deploy-functions.sh

set -euo pipefail

FUNCTIONS_DIR="/opt/supabase/volumes/functions"
SOURCE_DIR="./supabase/functions"

echo "Deploying Edge Functions..."

# 1. Type and syntax check (Deno)
for file in $(find "$SOURCE_DIR" -name "*.ts" -not -path "*/_shared/*"); do
  deno check "$file" || {
    echo "ERROR: deno check failed for $file"
    exit 1
  }
done

# 2. Copy functions
rsync -av --delete \
  --exclude='*.test.ts' \
  "$SOURCE_DIR/" "$FUNCTIONS_DIR/"

# 3. Restart the container
docker compose restart functions --no-deps

# 4. Health check
sleep 5
HEALTH=$(curl -s -o /dev/null -w "%{http_code}" \
  http://localhost:8000/functions/v1/hello 2>/dev/null || echo "000")

if [ "$HEALTH" = "200" ] || [ "$HEALTH" = "401" ]; then
  echo "Edge Functions deployment successful"
else
  echo "WARNING: Health check status $HEALTH"
fi

Verifiable Condition

# Is the Edge Runtime container running?
docker compose ps functions --format '{{.State}}'
# Expected: running

# Are functions reachable?
curl -s -o /dev/null -w "%{http_code}" \
  -H "Authorization: Bearer ${ANON_KEY}" \
  http://localhost:8000/functions/v1/stripe-webhook
# Expected: 200 or 405 (Method Not Allowed, because GET instead of POST)

C2 - Claude Code Integration

Claude Code checks Edge Functions contextually as a supplement to deterministic checks.

Architecture

Git Push / PR
   |
   +-- Deterministic Checks (CI/CD)
   |   +-- grep for hardcoded secrets
   |   +-- grep for missing signature verification
   |   +-- grep for missing input validation
   |   +-- grep for error details in responses
   |   +-- Deno type check
   |
   +-- Claude Code Analysis (weekly or on PR)
       +-- New functions without signature verification?
       +-- service_role usage appropriate?
       +-- Long-running patterns detected?
       +-- Architecture drift (business logic in functions)?
       +-- CORS correct for new functions?

Concrete CI Script

#!/bin/bash
# scripts/check-edge-functions.sh

REPORT=""
FUNCTIONS_DIR="volumes/functions"

# 1. Hardcoded secrets
SECRETS=$(grep -rn "sk_live\|sk_test\|whsec_\|Bearer ey" \
  "$FUNCTIONS_DIR" --include="*.ts" 2>/dev/null)
if [ -n "$SECRETS" ]; then
  REPORT+="CRITICAL: Hardcoded secrets found:\n$SECRETS\n\n"
fi

# 2. Webhook functions without signature verification
for dir in "$FUNCTIONS_DIR"/*-webhook/; do
  [ -d "$dir" ] || continue
  name=$(basename "$dir")
  if ! grep -qE "signature|verify|hmac|crypto" "$dir/index.ts" 2>/dev/null; then
    REPORT+="CRITICAL: $name has no signature verification\n"
  fi
done

# 3. Functions without input validation
for dir in "$FUNCTIONS_DIR"/*/; do
  name=$(basename "$dir")
  [[ "$name" == "main" || "$name" == "_shared" ]] && continue
  if grep -q "JSON.parse" "$dir/index.ts" 2>/dev/null && \
     ! grep -qE "zod|safeParse|z\." "$dir/index.ts" 2>/dev/null; then
    REPORT+="WARNING: $name parses JSON without schema validation\n"
  fi
done

# 4. Error details in responses
LEAKS=$(grep -rn "error\.stack\|error\.message" "$FUNCTIONS_DIR" --include="*.ts" | \
  grep "Response" 2>/dev/null)
if [ -n "$LEAKS" ]; then
  REPORT+="WARNING: Error details in responses:\n$LEAKS\n\n"
fi

# 5. CORS wildcards
WILDCARDS=$(grep -rn "'\\*'" "$FUNCTIONS_DIR" --include="*.ts" | grep -i "origin" 2>/dev/null)
if [ -n "$WILDCARDS" ]; then
  REPORT+="WARNING: CORS wildcard found:\n$WILDCARDS\n\n"
fi

# 6. Long-running patterns
LONG=$(grep -rn "openai\|sharp\|ffmpeg\|puppeteer" "$FUNCTIONS_DIR" --include="*.ts" 2>/dev/null)
if [ -n "$LONG" ]; then
  REPORT+="WARNING: Possible long-running patterns:\n$LONG\n\n"
fi

# Output
if [ -n "$REPORT" ]; then
  echo -e "=== Edge Function Security Check ===\n"
  echo -e "$REPORT"
else
  echo "All Edge Function checks passed."
fi

Claude Code makes no automatic changes to production. The complete audit workflow with custom commands and headless mode is described in Claude Code as a Security Control in DevOps Workflows.

Deployment Checklist

Before every Edge Functions deployment, verify:

Architecture
  [ ] Function is an integration (webhook/event), not business logic
  [ ] One webhook provider per function
  [ ] Long-running tasks delegated to Trigger.dev

Security
  [ ] Webhook signature is verified
  [ ] Input is validated with a schema (e.g., Zod)
  [ ] Correct client choice (anon vs. service_role)
  [ ] service_role only when no user context is available

Secrets
  [ ] No hardcoded secrets in the code
  [ ] Secrets in .env.functions (not in Git)
  [ ] .env.functions with permissions 600

CORS
  [ ] OPTIONS handler present
  [ ] No wildcard origin in production
  [ ] CORS headers in ALL responses (including errors)

Error Handling
  [ ] Try/catch around the entire logic
  [ ] Generic error messages to the client
  [ ] Details only in console.error (without secrets)

Deployment
  [ ] Deno type check passed
  [ ] Container restarted after deployment
  [ ] Health check successful after deployment

Conclusion

Supabase Edge Functions are a powerful integration point when used as such: receiving webhooks, verifying signatures, processing events, and forwarding them.

The critical point is clear boundaries. Edge Functions are not a second backend alongside Next.js and not a job runner alongside Trigger.dev. Drawing this line clearly and implementing the security basics (signatures, input validation, correct client, CORS) allows you to operate Edge Functions securely in a self-hosted setup.

The combination of deterministic checks in CI and contextual Claude Code analysis covers both known patterns and new, unexpected risks. Teams that adopt Cert-Ready by Design from the start avoid retroactive audit rounds.

Audit Checklist Download

Prepared prompt for Claude Code. Upload the file to your server and start Claude Code in your Edge Functions project directory. Claude Code will automatically check all security points from this runbook and report PASS, WARNING, or CRITICAL.

claude -p "$(cat claude-check-artikel-3-edge-functions-en.md)" --allowedTools Read,Grep,Glob,Bash



In the next article, we show how to run Trigger.dev background jobs securely without introducing new security risks into the stack.

Bert Gogolin

CEO & Founder, Gosign

Frequently Asked Questions

What are Supabase Edge Functions best suited for?

Edge Functions are best suited for webhooks, integrations with external APIs, and signed event endpoints. They are not an alternative to Next.js Server Actions for business logic and not a replacement for Trigger.dev for long-running tasks.

Why must every webhook endpoint verify the request signature?

Without signature verification, anyone can send arbitrary payloads to the webhook. An attacker could trigger forged events - for example, a fake Stripe checkout confirmation - without any actual payment having been made.

Why do webhooks not need a JWT but still require signature verification?

Webhooks are called by external services like Stripe or GitHub that do not have a Supabase JWT. Instead, they sign the payload with a shared secret (HMAC). The Edge Function must verify this signature because otherwise anyone can send arbitrary payloads.