2026-03-28 · Samuel Olubukun · 5 min read

Anatomy of a Production-Grade SaaS Boilerplate

Every integration, every pattern, every decision — a deep dive into how this Next.js + Supabase + Creem starter ships a billing-aware SaaS from day one.

Most boilerplates give you auth and a blank dashboard. You still spend two weeks wiring up payments, webhooks, email, rate limiting, and background jobs before you write a single line of product logic.

We wanted to eliminate that entire phase. Clone the repo, set your env vars, and your first product feature starts at line one, not line two thousand.

This is how we built a starter that ships a billing-aware SaaS on the first git clone.

Foundation: Next.js 16 + Supabase

The stack starts with Next.js 16 (App Router) and Supabase for auth and Postgres. Supabase's @supabase/ssr package gives us cookie-based auth that works in Server Components, Route Handlers, and Client Components without token juggling.

Our Supabase integration splits into three clients:

  • Browser client (src/lib/supabase/client.ts) — client-side realtime subscriptions
  • Server client (src/lib/supabase/server.ts) — Server Components and Actions, with safe cookie read/write
  • Admin client (src/lib/supabase/admin.ts) — service-role key, bypasses RLS for server-side mutations

A thin proxy (src/proxy.ts) handles route protection. Unauthenticated users hitting /dashboard/* get bounced to /login. Authenticated users hitting /login get bounced to /dashboard.

We never trust getSession() on the server. Every protected route validates the JWT via getUser(). Cookie spoofing doesn't survive that check.
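The server-side check can be sketched as a small guard. This is illustrative: SupabaseLike mirrors only the slice of the real client that the guard touches, and requireUser is a hypothetical helper name, not code from the repo.

```typescript
// Minimal slice of the Supabase client surface this guard needs.
// (Illustrative type; the real client comes from @supabase/ssr.)
interface SupabaseLike {
  auth: {
    getUser(): Promise<{ data: { user: { id: string } | null } }>;
  };
}

// Validate the JWT against the auth server instead of trusting the
// locally decoded session cookie. Returns the user, or null when the
// caller should redirect to /login.
export async function requireUser(supabase: SupabaseLike) {
  const { data } = await supabase.auth.getUser();
  return data.user;
}
```

Because getUser() round-trips to the auth server, a forged cookie that decodes "successfully" on the edge still fails here.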

Database Schema

The SQL schema (supabase/db_schema.sql) bootstraps eleven tables with Row-Level Security enforced on every one. Client reads are scoped to auth.uid(). All writes go through the admin client.

  • profiles — User display name, email, LTV tracking
  • subscriptions — Creem subscription state, period dates, seats
  • credits — Wallet balance (integer or unlimited sentinel)
  • credit_transactions — Append-only ledger for every grant and spend
  • licenses — Product license keys, active / deactivated status
  • billing_events — Notification feed for checkout, renewal, refund
  • purchases — One-time purchase records
  • chats / chat_messages — AI assistant conversation persistence
  • webhook_events — Idempotency tracker for Creem ingestion
  • files — S3-backed file metadata

There's also a Drizzle schema mirror (src/db/schema.ts) for teams that prefer migration workflows.

Payments Engine: Creem

What Is Creem?

Creem is a merchant-of-record payment platform. It handles tax calculation, collection, and remittance across jurisdictions — so you never touch a tax engine or worry about EU VAT registration. For indie hackers and small teams, that alone saves months.

Beyond tax, Creem covers:

  • Checkout sessions — hosted payment pages with pre-filled customer info, discount codes, seat-based billing, and custom fields
  • Subscription lifecycle — activation, trialing, pausing, scheduled cancellation, immediate cancellation, upgrades with proration
  • Webhooks — signed, retried event delivery for every billing event (4 retries with progressive backoff: 30s, 1m, 5m, 1h)
  • Customer portal — self-service billing management links
  • License keys — generate, activate, validate, and deactivate software licenses

Our Integration

We use two Creem packages:

  • creem (v1.3.6+) — the core SDK, Speakeasy-generated, type-safe. Handles checkout creation, subscription mutations, license management, and product lookups.
  • @creem_io/nextjs (v0.6.0+) — the official Next.js adapter. We use it for the Webhook() handler that verifies signatures and dispatches typed lifecycle callbacks.

Checkout Flow

// 1. Client component — drop in a <CreemCheckout> button
<CreemCheckout productId="prod_starter" referenceId={user.id}>
  <Button>Upgrade to Starter</Button>
</CreemCheckout>
 
// 2. The component hits our checkout route
const session = await creem.checkouts.create({
  productId,
  successUrl: `${APP_URL}/dashboard?welcome=true`,
  metadata: { user_id: user.id },
});
 
// 3. User pays on Creem's hosted page
// 4. Creem fires a webhook → /webhooks/creem

Webhook Handler

// src/app/webhooks/creem/route.ts
export const POST = Webhook({
  webhookSecret: process.env.CREEM_WEBHOOK_SECRET!,
  onCheckoutCompleted: async ({ customer, product, metadata, subscription }) => {
    // upsert subscription record
    // grant credits
    // increment LTV
    // send payment confirmation email
  },
  onGrantAccess: async ({ customer, metadata }) => {
    // activate user entitlements
  },
  onRevokeAccess: async ({ customer, metadata }) => {
    // pause / cancel access
  },
});

Every webhook is wrapped in idempotency tracking: the webhook_events table records processed event IDs. The subscription upsert happens before the idempotency insert, so a crash between the two simply replays an idempotent upsert on the next retry. Nothing is lost or doubled.

Creem's official docs recommend using subscription.paid (not subscription.active) to activate user access. We handle both, plus subscription.past_due, subscription.expired, subscription.paused, subscription.canceled, and subscription.update for full lifecycle coverage.
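The ordering described above, mutation first and idempotency marker last, can be sketched with injected dependencies. The Deps interface and every name here are illustrative, not the repo's actual code:

```typescript
// Illustrative dependency surface; in the real handler these are
// Supabase admin-client queries against webhook_events and subscriptions.
interface Deps {
  alreadyProcessed(eventId: string): Promise<boolean>;
  upsertSubscription(payload: unknown): Promise<void>; // must be idempotent
  markProcessed(eventId: string): Promise<void>;
}

// Returns true if the event did work, false if it was a duplicate delivery.
export async function handleEvent(deps: Deps, eventId: string, payload: unknown): Promise<boolean> {
  if (await deps.alreadyProcessed(eventId)) return false;
  // Mutation first: if we crash after this line but before markProcessed,
  // the retry re-runs an idempotent upsert, so state stays correct.
  await deps.upsertSubscription(payload);
  await deps.markProcessed(eventId);
  return true;
}
```

The safety argument only holds because the upsert is idempotent; a non-idempotent mutation (say, an increment) would need to sit after the marker or inside one transaction with it.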

Credit System

We layered a credits wallet on top of Creem's billing. Each product maps to a credit amount in src/lib/credits-config.ts.

  • Starter — 100 credits / month — $10 / month
  • Creator — 500 credits / month — $29 / month
  • Professional — 2,000 credits / month — $79 / month
  • Nova Pro Max — Unlimited (sentinel) — $3,000 one-time

Grant logic:

  • Grants are atomic: insert into credit_transactions, then upsert credits via a Postgres RPC
  • Spending is atomic too: spend_credits() does a row-lock, validates balance, decrements, and logs — all in one transaction
  • The unlimited sentinel (-1) bypasses deduction entirely
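The spend path can be sketched in application code. In the repo this logic lives inside the spend_credits() Postgres function so the row lock, balance check, decrement, and ledger write happen in one transaction; the TypeScript below is a hypothetical mirror of that shape, not the real implementation:

```typescript
const UNLIMITED = -1; // sentinel balance for the unlimited tier

interface Wallet {
  balance: number;
}

// Validate, decrement, and log as one unit. Returns the new balance,
// or null when the wallet cannot cover the spend.
export function spendCredits(wallet: Wallet, amount: number, ledger: string[]): number | null {
  if (wallet.balance === UNLIMITED) {
    ledger.push(`spend ${amount} (unlimited tier, no deduction)`);
    return UNLIMITED;
  }
  if (wallet.balance < amount) return null; // insufficient balance
  wallet.balance -= amount;
  ledger.push(`spend ${amount}`);
  return wallet.balance;
}
```

Pushing this into a single SQL routine is what makes concurrent spends safe: two requests racing on the same wallet serialize on the row lock instead of both reading the same stale balance.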

Job Pipeline: BullMQ + Redis

Real work shouldn't block the request cycle. We use BullMQ with ioredis for async job processing.

  • email — concurrency 5 — welcome emails, payment confirmations
  • webhook-processing — concurrency 3 — heavy webhook business logic
  • audit — concurrency 10 — append-only audit log writes

Workers run as a standalone process (npm run workers). Jobs retry 3 times with exponential backoff. Graceful shutdown on SIGINT/SIGTERM drains in-flight jobs before exiting.

BullMQ uses a standard Redis TCP connection — not Upstash's HTTP Redis. If REDIS_URL is not set, the producer returns false and callers fall back to synchronous execution. Zero breaking changes.
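That producer contract, enqueue when Redis is configured and otherwise report failure so the caller runs the work inline, can be sketched as follows. tryEnqueue and its signature are assumptions for illustration, not the repo's API:

```typescript
type Enqueue = (name: string, data: unknown) => Promise<void>;

// Returns true if the job was handed to the queue, false if the caller
// should fall back to running the work synchronously.
export async function tryEnqueue(
  enqueue: Enqueue | null, // null when REDIS_URL is unset
  name: string,
  data: unknown,
): Promise<boolean> {
  if (!enqueue) return false;
  try {
    await enqueue(name, data);
    return true;
  } catch {
    return false; // Redis hiccup: degrade to sync execution
  }
}
```

The useful property is that callers never branch on infrastructure: they call tryEnqueue, and a false return means "do it now", whether Redis was absent or merely unreachable.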

Rate Limiting: Upstash Redis

Sensitive API routes are guarded with a sliding window algorithm:

const rateLimiter = new Ratelimit({
  redis: getRedisClient(),
  limiter: Ratelimit.slidingWindow(20, "60 s"),
});
  • /api/chat — 20 req / min — keyed by user ID
  • /api/checkout — 10 req / 5 min — keyed by user ID
  • /api/subscriptions/* — 10 req / 5 min — keyed by user ID
  • /api/auth/welcome — 3 req / hour — keyed by client IP
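The production limiter is @upstash/ratelimit, but the sliding-window idea is easy to see in a self-contained, in-memory sketch. This is purely illustrative: Upstash's actual algorithm approximates the window with a weighted count over two fixed buckets rather than storing raw timestamps.

```typescript
// Track request timestamps per key; a request is allowed while fewer
// than `limit` requests fall inside the trailing window.
export function makeSlidingWindow(limit: number, windowMs: number) {
  const hits = new Map<string, number[]>();
  return (key: string, now: number): boolean => {
    const recent = (hits.get(key) ?? []).filter((t) => now - t < windowMs);
    if (recent.length >= limit) {
      hits.set(key, recent);
      return false; // rate limited
    }
    recent.push(now);
    hits.set(key, recent);
    return true;
  };
}
```

The in-memory version breaks down across serverless instances, which is exactly why the real limiter keeps its counters in Redis.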

The same Upstash Redis instance also powers cache-aside patterns for expensive admin dashboard aggregations and blog fetches.

Email: Resend + React Email

Transactional emails use Resend for delivery and React Email for templating:

export function WelcomeEmail({ firstName }: { firstName?: string }) {
  return (
    <Html>
      <Body>
        <Heading>Welcome{firstName ? `, ${firstName}` : ""}!</Heading>
        <Text>Your account is ready. Let's build something.</Text>
      </Body>
    </Html>
  );
}

Each email renders to both HTML and plain text. In development (no RESEND_API_KEY), emails log to console. When BullMQ is active, emails are enqueued and processed by the worker.
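That fallback chain can be sketched as a transport picker. chooseTransport and the env shape are hypothetical names for illustration, not the mailer's real API:

```typescript
type Transport = "queue" | "resend" | "console";

// Pick the delivery path the way the mailer decides: BullMQ first,
// direct Resend second, console logging as the local-dev fallback.
export function chooseTransport(env: { REDIS_URL?: string; RESEND_API_KEY?: string }): Transport {
  if (env.REDIS_URL && env.RESEND_API_KEY) return "queue";
  if (env.RESEND_API_KEY) return "resend";
  return "console"; // no API key: log the rendered email instead
}
```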

Storage: S3-Compatible Presigned Uploads

Works with AWS S3, Cloudflare R2, or MinIO. The endpoint and forcePathStyle env vars make it provider-agnostic.

  1. Client requests a presigned URL from GET /api/storage/presign
  2. Server validates file type and size, signs a PUT URL (5-minute expiry)
  3. Browser uploads directly to S3
  4. Client calls POST /api/storage/complete to register metadata
  5. Downloads served through GET /api/storage/download with auth checks
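Step 2's validation is a pure check that runs before any URL is signed. A sketch, with an assumed allow-list and size cap rather than the repo's real limits:

```typescript
const MAX_BYTES = 10 * 1024 * 1024; // illustrative 10 MB cap
const ALLOWED = new Set(["image/png", "image/jpeg", "application/pdf"]); // illustrative allow-list

// Reject bad requests before a presigned PUT URL is ever minted.
// Returns an error message, or null when the upload is acceptable.
export function validateUpload(contentType: string, sizeBytes: number): string | null {
  if (!ALLOWED.has(contentType)) return "unsupported content type";
  if (sizeBytes <= 0 || sizeBytes > MAX_BYTES) return "file too large or empty";
  return null;
}
```

Validating before signing matters because the presigned URL is a bearer credential: once issued, the server no longer sits between the browser and the bucket.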

AI Assistant: Multi-Provider Chat

POST /api/chat supports three LLM providers. Each response costs 1 credit (unlimited-tier users bypass deduction). Conversations persist across sessions.

  • OpenAI — /v1/chat/completions — default model gpt-4o-mini
  • Anthropic — /v1/messages — default model claude-sonnet-4-20250514
  • Google Gemini — Generative AI API — default model gemini-1.5-flash
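The provider dispatch reduces to a lookup table keyed on the provider name. The config shape below is an illustrative sketch; the endpoints and default models are the ones listed above:

```typescript
type Provider = "openai" | "anthropic" | "gemini";

// Per-provider endpoint and default model.
export const PROVIDERS: Record<Provider, { endpoint: string; defaultModel: string }> = {
  openai: { endpoint: "/v1/chat/completions", defaultModel: "gpt-4o-mini" },
  anthropic: { endpoint: "/v1/messages", defaultModel: "claude-sonnet-4-20250514" },
  gemini: { endpoint: "Generative AI API", defaultModel: "gemini-1.5-flash" },
};

// Use the caller's requested model if given, else the provider default.
export function resolveModel(provider: Provider, requested?: string): string {
  return requested ?? PROVIDERS[provider].defaultModel;
}
```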

Observability

Structured logging uses Pino with a custom Better Stack transport. Every API route emits JSON logs with contextual metadata. Sensitive fields (auth headers, API keys, emails) are automatically redacted.
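Redaction in Pino is declarative: you list the paths to censor when constructing the logger. A sketch of the options object, with paths assumed from the fields mentioned above rather than taken from the repo:

```typescript
// Options in the shape pino() accepts; the `redact` paths below are
// assumptions matching the sensitive fields called out in the text.
export const loggerOptions = {
  redact: {
    paths: [
      "req.headers.authorization",
      "req.headers.cookie",
      "*.apiKey",
      "user.email",
    ],
    censor: "[REDACTED]",
  },
} as const;
```

Keeping redaction in logger config rather than in call sites means a new route can't accidentally log a bearer token just by passing the request object through.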

PostHog integration is optional. Client SDK uses person_profiles: "identified_only" with localStorage persistence. Server SDK flushes immediately — no batching, ideal for serverless.

Testing and CI

  • Unit / component — Vitest + Testing Library — validators, helpers, React components
  • E2E — Cypress — full user flows against a local dev server
  • Visual — Storybook — component documentation and visual regression

The GitHub Actions CI pipeline runs lint, tests with coverage, build, Cypress E2E, and Storybook build on every PR. Vercel handles preview deploys.

Why This Exists

The hardest part of shipping a SaaS isn't the product features. It's the infrastructure around them. Payments, auth, webhooks, email, rate limiting, background jobs, storage, logging — none of these are your product, but all of them are required before your product works.

We wanted to eliminate that entire phase.

Fork it. Ship with it. Tell us what you build.

Live demo: saasxcreem.vercel.app — use code CREEMSAAS2026 for 30% off.