Changelog
See what we've been working on. New features and improvements every month.
better-auth 1.4.17 → 1.5.5
Major auth framework upgrade with breaking changes: renamed email-confirmation callback, updated error-code types, and fixed session/user type inference for plugin-added fields (activeOrganizationId, twoFactorEnabled)
Worker Zod 3 → 4
Upgraded the enrichment worker's Zod dependency from v3.23 to v4.3.6, aligning it with the root app. The import path changed to `zod/v4` for explicit version pinning.
53 dependency upgrades
Next.js 16.1.6→16.2.1, tRPC 11.9→11.14, AI SDK 6.0.57→6.0.134, Sentry 10.37→10.45, motion 12.29→12.38, Tailwind CSS 4.1→4.2, Vitest 4.0→4.1, Biome 2.3→2.4, fumadocs 16.4→16.7, @shikijs/rehype 3.21→4.0, react-dropzone 14→15, AWS SDK 3.975→3.1014, plus ~30 patch bumps
Worker AI SDK sync
Enrichment worker @ai-sdk/openai and ai packages synced to match root versions
Vercel Analytics/Speed Insights v2
Upgraded from v1 to v2 (React API unchanged)
Dropzone rejection handling TODO
Tracked react-dropzone v15 follow-up for upload error feedback in TODOS.md
Upload dropzone rejection handling
Both avatar and logo upload flows now validate file type (images only) and size (5MB max), show toast errors for rejected files, and guard against empty drops opening the crop modal
Worker CI uses bun
`deploy-enrichment-worker.yml` now uses `oven-sh/setup-bun` and `bun install` instead of npm, matching the project's package manager
Removed stale `package-lock.json`
Deleted npm lockfile (58 mismatches with package.json) and added `packageManager: bun@1.3.11` to package.json for explicit bun selection
better-auth 1.4→1.5
**shipped in v1.5.4**
recharts 2→3 (complete rewrite, waiting for shadcn wrapper)
cropperjs 1→2 (react-cropper incompatible)
react-resizable-panels 3→4 (unused component)
Apify Console link
External link button on each Apify source row opens the actor's Apify Console page directly. Desktop icon button + mobile labeled button. Only visible for sources with an actor ID
Run Now button
Trigger Apify actor runs directly from the admin Sources page. Rocket icon button for Apify-type sources in both desktop table and mobile card layouts. Backend `triggerApifyRun` function with full error handling
6 new triggerApifyRun tests
Full coverage: not found, non-Apify rejection, missing token, success with runId, API error response, network failure
Bulletproof Apify webhook handler
Refactored to respond 200 immediately and process via Next.js `after()`, eliminating Apify's 30-second timeout risk. Includes paginated dataset fetch (1000 items/page), timeout tracking, and auto-retry once on failure
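The paginated dataset fetch can be sketched as a generic offset/limit loop. This is an illustrative shape, not the handler's actual code; `fetchPage` stands in for the Apify dataset API call, and the end-of-data check (a short page) is an assumption.

```typescript
// Hypothetical sketch of the 1000-items/page dataset fetch. fetchPage is
// injected so the pagination logic is independent of the Apify client.
type FetchPage<T> = (offset: number, limit: number) => Promise<T[]>;

async function fetchAllItems<T>(
  fetchPage: FetchPage<T>,
  pageSize = 1000,
): Promise<T[]> {
  const items: T[] = [];
  let offset = 0;
  for (;;) {
    const page = await fetchPage(offset, pageSize);
    items.push(...page);
    if (page.length < pageSize) break; // short page signals no more data
    offset += pageSize;
  }
  return items;
}
```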
Webhook idempotency
New `webhook_log` table with unique `apifyRunId` index prevents duplicate processing. Race conditions handled via insert constraint catch
Pipeline failure notifications
Dual-channel alerts (Resend email + Slack webhook) fire on FAILED/ABORTED/TIMED_OUT events and all-error ingests. Fire-and-forget with graceful degradation
Admin Sources page
Dashboard at `/dashboard/admin/sources` with source CRUD (create, edit, archive, toggle active/paused), 4 stats cards, webhook log table with status/source filters, retry button for failed runs. Includes toast notifications, AlertDialog confirmation for destructive actions, inline form validation, accessible icon buttons, and mobile card layout
DB-driven source resolution
Webhook handler resolves Apify actor IDs to source names from `job_source` table (with 60s in-memory TTL cache) instead of hardcoded map. Sources managed via admin UI
Apify run traceability
`apifyRunId` column on `jobTable` links every ingested job back to its exact Apify actor run
26 new pipeline tests
15 webhook handler tests (auth, validation, idempotency, event routing, source mapping) + 11 notification tests (channel selection, failure resilience, content formatting)
DESIGN.md
Formalized design system: Industrial/Utilitarian aesthetic, Geist Sans typography, sage accent, compact admin density, motion and component conventions
**Webhook handler** now handles ACTOR.RUN.FAILED, ACTOR.RUN.ABORTED, and ACTOR.RUN.TIMED_OUT events (previously ignored all non-success events)
**ingestJobBatch** accepts optional IngestOptions with apifyRunId passthrough for traceability
Enrichment pipeline now writes clean descriptions
AI extraction already produced a clean `description` field but the UPDATE query never wrote it back to the DB. Raw RSS/HTML from ingest stayed in the `description` column forever. Fixed in both CF Worker and lib pipeline
`stripHtml` hardened for double-encoded HTML
RSS feeds often contain `&lt;strong&gt;` (HTML tags encoded as entities inside HTML). Now runs a two-pass strip: decode entities, then strip the revealed tags. Also strips RSS metadata preambles (Title/Category/Posted headers)
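A minimal sketch of the two-pass idea, with an assumed (small) entity table; the real `stripHtml` also handles whitespace normalization and the metadata preambles:

```typescript
// Pass 1 decodes entities so &lt;strong&gt; becomes a real <strong> tag;
// pass 2 strips the tags that decoding revealed (plus any raw ones).
function decodeEntities(s: string): string {
  return s
    .replace(/&lt;/g, "<")
    .replace(/&gt;/g, ">")
    .replace(/&quot;/g, '"')
    .replace(/&#39;/g, "'")
    .replace(/&amp;/g, "&"); // decode &amp; last to avoid double-decoding
}

function stripHtmlTwoPass(input: string): string {
  const decoded = decodeEntities(input);
  return decoded.replace(/<[^>]*>/g, "").trim();
}
```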
Enrichment prompt updated
AI instructed to return description as clean plain text, preserving original wording verbatim — no summarizing or rephrasing, only stripping markup and encoding artifacts
Job detail views
Side panel sheet on row click in jobs browser + full-page detail view at `/dashboard/organization/jobs/[id]` with 100% API field parity (salary, location, career, skills, languages, description, dates & source)
Shared `DetailRow` component
Reusable label-value display component at `components/ui/custom/detail-row.tsx`, used across sheet and full page views
`stripHtml` utility
Converts HTML job descriptions to clean text with newline preservation, entity decoding, and whitespace normalization
`formatSalaryFull` utility
Formats salary ranges with currency/unit defaults (USD, /year)
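A hypothetical sketch of the defaulting behavior (field names and output format are assumptions; only the USD and /year defaults come from the note above):

```typescript
interface Salary {
  min?: number | null;
  max?: number | null;
  currency?: string | null;
  unit?: string | null; // e.g. "year", "month"
}

// Formats a salary range, falling back to USD and /year when unspecified.
function formatSalaryFull({ min, max, currency, unit }: Salary): string {
  if (min == null && max == null) return "Not specified";
  const cur = currency ?? "USD";
  const per = unit ?? "year";
  const fmt = (n: number) => n.toLocaleString("en-US");
  if (min != null && max != null) return `${cur} ${fmt(min)}-${fmt(max)} /${per}`;
  return `${cur} ${fmt((min ?? max)!)} /${per}`;
}
```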
18 new unit tests
Full coverage for `stripHtml` (11 tests) and `formatSalaryFull` (7 tests)
Feed cards
Added `Clock` icon next to delivery timestamps, skeleton loading states
Feed detail
Stat card icons (Send, Clock, AlertTriangle), skeleton loading, copy-to-clipboard button for filter JSON
API keys
Click-to-copy key prefix, standardized empty state icon size (h-12 w-12)
Jobs router
`get` procedure now filters by active status, consistent with `browse`
Cloudflare Workers enrichment pipeline
Replaces Vercel cron-based enrichment with a dedicated CF Worker at `workers/enrichment/`. Processes 50 jobs every 5 minutes (vs 10 jobs/15 min on Vercel). No 60s timeout limit
Atomic job claiming
`FOR UPDATE SKIP LOCKED` prevents race conditions between concurrent worker runs. `processingRunId` guard prevents stale overwrites
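The claim pattern above is commonly written as an UPDATE over a locking subquery. A sketch with assumed table and column names (the real schema uses camelCase Drizzle identifiers):

```typescript
// The inner SELECT takes row locks with SKIP LOCKED, so concurrent workers
// skip rows another run has already locked; the UPDATE stamps the claiming
// run's id ($1) for the stale-overwrite guard.
function buildClaimQuery(batchSize: number): string {
  return `
    UPDATE job
    SET status = 'processing', processing_run_id = $1
    WHERE id IN (
      SELECT id FROM job
      WHERE status = 'pending'
      ORDER BY created_at
      LIMIT ${batchSize}
      FOR UPDATE SKIP LOCKED
    )
    RETURNING id`;
}
```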
Configurable concurrency
Worker enriches jobs in parallel chunks (default 5 concurrent). Batch size and concurrency configurable via env vars
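The chunked-parallel shape can be sketched as below (illustrative names; the worker's default of 5 concurrent is the only detail taken from the entry above):

```typescript
// Runs fn over items in chunks: each chunk executes fully in parallel,
// chunks execute sequentially, bounding concurrency at chunk size.
async function mapInChunks<T, R>(
  items: T[],
  concurrency: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < items.length; i += concurrency) {
    const chunk = items.slice(i, i + concurrency);
    results.push(...(await Promise.all(chunk.map(fn))));
  }
  return results;
}
```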
Stale lock auto-cleanup
Crashed workers' locks are automatically released after a configurable timeout (default 10 min)
GitHub Action auto-deploy
Worker auto-deploys on push to `main` when `workers/enrichment/**` changes via `cloudflare/wrangler-action@v3`
AI SDK v6 token cleanup
Removed `as any` casts from token usage. Uses `inputTokens`/`outputTokens` directly. Captures new v6 fields: `reasoningTokens`, `cachedInputTokens`
Enrichment throughput
50 jobs/5 min (600/hour) vs previous 40 jobs/hour on Vercel cron. 15x improvement
Neon HTTP driver
Worker uses `@neondatabase/serverless` HTTP driver (not WebSocket) for Cloudflare Workers compatibility
Root tsconfig exclusion
`workers/` excluded from root TypeScript compilation so Vercel builds don't fail on worker-only dependencies
Live changelog
Homepage hero pill and `/changelog` page now show real release history from `CHANGELOG.md` instead of boilerplate placeholder content
Full date display
Changelog page now shows "March 19, 2026" instead of abbreviated "March 2026"
Changelog parser
Build-time parser at `lib/changelog.ts` handles 5 markdown bullet formats with `server-only` guard
Total tests
149 passing (13 suites), including 17 new changelog parser tests
CSV formula injection (CWE-1236)
CSV export now prefixes cells starting with `=`, `+`, `@`, `-` with a tab character to prevent Excel/Sheets formula injection
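The mitigation amounts to guarding the first character of each cell. A sketch (the quoting layer is an assumption on top of the tab-prefix described above):

```typescript
// CWE-1236 guard: any cell whose first character could start a spreadsheet
// formula (=, +, @, -) gets a leading tab so Excel/Sheets treats it as text.
function escapeCsvCell(value: string): string {
  const guarded = /^[=+@-]/.test(value) ? "\t" + value : value;
  // Standard CSV quoting on top of the formula guard.
  return /[",\n\t]/.test(guarded) ? `"${guarded.replace(/"/g, '""')}"` : guarded;
}
```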
URL scheme validation
HTML and Markdown formatters now block `javascript:` and `data:` URL schemes in job apply_url fields, preventing XSS in exports and emails
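Blocking dangerous schemes is safest as an allow-list. A sketch assuming http/https are the only permitted schemes (which rejects `javascript:` and `data:` among others):

```typescript
// Returns a normalized URL if the scheme is allowed, otherwise null.
function safeApplyUrl(raw: string): string | null {
  try {
    const url = new URL(raw.trim());
    return url.protocol === "http:" || url.protocol === "https:" ? url.href : null;
  } catch {
    return null; // not a parseable absolute URL
  }
}
```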
Webhook secret leak
tRPC feed endpoints no longer return the HMAC webhook secret to the frontend on any operation (list, get, create, update, email schedule)
Export credit timing
Export API now checks credit balance before fetching data, preventing charges on failed queries
Dashboard export cursor
Dashboard export no longer incorrectly filters by email delivery cursor, which caused it to return incomplete results after cron runs
Type-safe status enum
Replaced `"active" as any` casts with `JobStatus.active` enum constant across 3 files
Ingest dedup performance
Secondary title+company dedup now uses batched OR queries (chunks of 50) instead of N+1 per-job queries
Total tests
132 passing (12 suites)
Dashboard export + scheduled email delivery
Feed detail page with Export tab. Download matching jobs as HTML, Markdown, CSV, or JSON. Configure daily/weekly email delivery via Resend — Bordfeed emails formatted newsletter content to your inbox
Feed detail page
Click any feed card to see overview stats, filters, and the new Export tab. Breadcrumb navigation back to feeds list
Export API endpoint
`GET /v1/feeds/:id/export?format=json|csv|html|markdown` returns up to 1000 jobs in a single response without pagination. Supports `since=last_poll` for incremental exports
`since=last_poll` auto-cursor
Server-side cursor tracks your last poll/export. `GET /v1/feeds/:id/jobs?since=last_poll` returns only jobs added since your last API call, eliminating client-side state tracking
`include=html_snippet`
`GET /v1/feeds/:id/jobs?include=html_snippet` adds a pre-formatted HTML block per job for newsletter use, with XSS sanitization
Job formatter utility
`lib/feeds/job-formatter.ts` with HTML, Markdown, CSV formatters. All user-sourced fields sanitized to prevent XSS
Secondary dedup
Ingest pipeline now catches semantic duplicates via case-insensitive title + company match, in addition to existing content hash dedup
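The semantic match reduces to a normalized dedup key. A sketch (whitespace collapsing is an assumption beyond the case-insensitivity stated above):

```typescript
// Builds a case-insensitive title+company key so trivial formatting
// differences ("Senior  Engineer" vs "senior engineer") still collide.
function dedupKey(title: string, company: string): string {
  const norm = (s: string) => s.trim().toLowerCase().replace(/\s+/g, " ");
  return `${norm(title)}::${norm(company)}`;
}
```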
Email export cron
`POST /api/v1/cron/email-export` runs daily at 08:00 UTC via Vercel Cron. Sends formatted jobs to configured recipients
17 new tests
Job formatter test suite covering HTML, Markdown, CSV output, XSS sanitization, empty states, and edge cases
ROADMAP.md
Rewritten to reflect 4-phase plan: Data Flow (done) → Dogfood+DX (current) → Monetization → Scale. SDK removed from roadmap per CEO plan decision
OpenAPI spec
Updated to v1.3.0 with export endpoint, `since` and `include` parameters documented
Feed cards
Now clickable, navigate to feed detail page
Total tests
125 passing (12 suites)
OpenAPI 3.1 spec + Scalar interactive docs
Full API specification at `public/api/openapi.json` covering all 11 public endpoints. Interactive docs at `/docs/api` via `@scalar/api-reference-react` with kepler theme. API route serves spec with CORS headers
Dashboard metrics
Replaced achromatic boilerplate email stats with real Bordfeed metrics: Active Feeds, Credits Remaining, API Keys, Webhook Deliveries. New `organization.stats.dashboard` tRPC procedure with org-scoped parallel queries
OpenAPI route
`GET /api/openapi.json` serves the spec with caching and CORS for Scalar client-side rendering
Enrichment cron frequency
Now runs every 15 minutes instead of every 2 hours. Clears the enrichment backlog of 1,253 jobs in ~31 hours instead of ~10 days — jobs reach your feed much faster
Org selection cards
Cards now have a visible shadow and a cleaner hover effect so it's obvious which org you're selecting
Boilerplate dashboard charts
Removed `dashboard-demo-charts.tsx` (640 lines of fake email metrics data) replaced by real org-scoped stats
Avatar and org logo upload failing
S3 signed upload URL was generated with `ContentType: "image/jpeg"` hardcoded, but the image cropper outputs PNG. S3 rejected the PUT due to content type mismatch. Made `contentType` a required parameter throughout the upload chain (schema → tRPC → S3) so the signed URL always matches the actual upload
Dashboard empty — tRPC 503 errors
`getBaseUrl()` used stale per-deployment Vercel URL for tRPC batch requests. Added `NEXT_PUBLIC_VERCEL_PROJECT_PRODUCTION_URL` as stable fallback before the ephemeral deployment URL. Also set `NEXT_PUBLIC_SITE_URL` env var in Vercel production (#60)
Enrichment schema — OpenAI 100% failure
`status: z.string().default("active")` produced a JSON schema without `status` in the `required` array, causing OpenAI structured output to reject every request. Changed to `.nullable()` with default handled in `process-jobs.ts` (#56)
Feed templates returning 0 results
AI enrichment produces free-form categories ("Product Design", "Full-Stack Programming") but filter builder used exact `inArray` match against slug-style template values ("design", "full-stack"). Changed to case-insensitive `ilike` partial matching. Same fix for `workplace_type` (#57)
Homepage hero copy
replaced achromatic boilerplate with Bordfeed messaging: "The API for job data. Fresh, structured, delivered." CTAs: "Start Building" / "View Pricing". Pill links to `/changelog` (#58)
Features section copy
"Everything your feed needs" with 20+ Filter Fields and Webhook Delivery cards (#59)
Stats section copy
real product metrics: 11 endpoints, 20+ filters, 6 templates, 50+ fields (#59)
FAQ section copy
5 Bordfeed-specific questions replacing generic SaaS boilerplate (#59)
Apify webhook adapter
`POST /api/internal/apify-webhook` receives Apify run metadata, fetches dataset items from Apify API, passes raw data to shared ingest logic. Actor→source mapping for Workable, WWR, JobDataAPI
Shared ingest module
extracted `lib/pipeline/ingest-jobs.ts` used by both `/api/internal/ingest` and `/api/internal/apify-webhook`
Pipeline tests
60 new tests across 4 suites (ingest, process, monitor, push). Total: 108 tests / 11 suites
SSRF protection in monitor
blocks private/internal IPs before fetching source URLs
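A rough sketch of a private-host check in the spirit of the `isPrivateHost()` mentioned later in this changelog; the real implementation likely covers more cases (IPv6, DNS rebinding, decimal/hex IP encodings):

```typescript
// Rejects loopback, RFC 1918, and link-local (cloud metadata) addresses
// before the monitor fetches a source URL.
function isPrivateHost(host: string): boolean {
  if (host === "localhost" || host.endsWith(".local")) return true;
  const octets = host.split(".").map(Number);
  if (octets.length !== 4 || octets.some((o) => Number.isNaN(o))) return false;
  const [a, b] = octets;
  return (
    a === 10 ||                          // 10.0.0.0/8
    a === 127 ||                         // loopback
    (a === 172 && b >= 16 && b <= 31) || // 172.16.0.0/12
    (a === 192 && b === 168) ||          // 192.168.0.0/16
    (a === 169 && b === 254)             // link-local / cloud metadata
  );
}
```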
AI job classification in monitor
wires up `classifyJobStatus()` to detect filled/closed jobs returning HTTP 200
Batch ingest
single SELECT + INSERT replaces N+1 per-job queries
Max batch size guard
rejects payloads >200 jobs with 413
Pipeline reliability TODOs
concurrent push protection (P2), batch push at scale (P3)
`skills` column
migrated from `text` to `text[]` array with `arrayOverlaps` filter (migration: `20260319082018_damp_vulcan.sql`)
Content hash
JSON keys sorted before hashing for deterministic dedup across actor runs
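The determinism trick is serializing with sorted keys before hashing. A sketch for flat objects (the real hash may recurse into nested values and select specific fields):

```typescript
import { createHash } from "node:crypto";

// Sorting keys makes JSON.stringify deterministic regardless of the
// property order an actor run happened to emit.
function contentHash(job: Record<string, unknown>): string {
  const sorted = Object.fromEntries(
    Object.entries(job).sort(([a], [b]) => a.localeCompare(b)),
  );
  return createHash("sha256").update(JSON.stringify(sorted)).digest("hex");
}
```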
Processing lock
now filters by specific job IDs (`inArray`) instead of locking all pending jobs
Race condition in job processing lock
concurrent cron runs could process same jobs twice
isPrivateHost() exported from webhook-validation for reuse in monitor SSRF checks
Pre-push git hook (tsc --noEmit)
failed on generated content-collections types; Vercel build already gates type errors
Feed-Centric API
complete rewrite on achromatic SaaS boilerplate
Feeds API (public REST)
POST/GET/PATCH/DELETE `/v1/feeds`, GET `/v1/feeds/:id/jobs` (poll with cursor pagination), GET `/v1/health`
Feed filter builder
20+ strict filter fields (category, workplace_type, country_code, career_level, salary range, languages, skills, industry, visa_sponsorship, and more)
Feed templates
6 pre-built templates (Remote Writing, AI/ML Engineering, Growth Marketing EU, DevRel, Remote Design, Full-Stack Remote US)
Webhook delivery
HMAC-SHA256 signing, SSRF protection, retry with backoff
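The signing scheme can be sketched as below (header name and exact payload format are not specified here, so treat the details as assumptions):

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Sender signs the raw body with the feed's webhook secret.
function signPayload(secret: string, body: string): string {
  return createHmac("sha256", secret).update(body).digest("hex");
}

// Receiver recomputes and compares in constant time to avoid timing leaks.
function verifySignature(secret: string, body: string, signature: string): boolean {
  const expected = Buffer.from(signPayload(secret, body), "hex");
  const given = Buffer.from(signature, "hex");
  return expected.length === given.length && timingSafeEqual(expected, given);
}
```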
Feed tRPC router
dashboard management (list, create, update, delete, templates)
Job admin tRPC router
pipeline management (list, stats, status updates, bulk actions)
Dashboard pages
feed list with cards, admin jobs table with stats
Database schema
job table (50+ fields), feed table, webhook delivery table, feed template table
Zod schemas
strict validation with unknown field rejection
Unit tests
48 tests across 7 suites (content hash, salary, pagination, webhook validation, webhook signing, Zod schemas, utils)
Credit-based billing
feed poll and webhook delivery consume credits (Starter $9.99/1K, Growth $49/10K, Scale $299/100K)
Design doc
`docs/designs/feed-centric-api.md`
Foundation
replaced v0.5.4 codebase with achromatic SaaS boilerplate (Better Auth + orgs + Stripe + Resend + Sentry + marketing pages)
Product model
enrichment is now internal-only, feeds are THE product
API architecture
hybrid — tRPC for dashboard, REST for public `/v1/*` API
Billing
API keys and feeds scoped to organizations (multi-tenant)