A personal portfolio site for Ethan Hathaway showcasing things I make — woodworking, baking, cooking, and other crafts.
bun install
# Local dev without platform bindings
bun run dev:vite
# Local dev with Cloudflare bindings (D1, assets, etc.)
bun run dev
# Remote dev with Cloudflare bindings (writes to remote D1/R2 preview bucket)
bun run dev:remote
Environment files:
- .env (local dev): DATABASE_URL, PUBLIC_CLERK_PUBLISHABLE_KEY, CLERK_SECRET_KEY
- .env.local (wrangler dev): local bindings; leave PUBLIC_R2_BASE_URL empty to use proxy URLs
- .env.remote (wrangler dev --remote): set PUBLIC_R2_BASE_URL to the preview bucket public URL
- .env.test (tests): same Clerk dev keys and DATABASE_URL as .env
- .env.template: copy for new environments; includes E2E_BASE_URL and CLERK_TEST_TOKEN for integration tests

Example .env.local:
DATABASE_URL=file:local.db
PUBLIC_CLERK_PUBLISHABLE_KEY=pk_test_your_key
CLERK_SECRET_KEY=sk_test_your_key
PUBLIC_R2_BASE_URL=
Example .env.remote:
DATABASE_URL=file:local.db
PUBLIC_CLERK_PUBLISHABLE_KEY=pk_test_your_key
CLERK_SECRET_KEY=sk_test_your_key
PUBLIC_R2_BASE_URL=https://<account-id>.r2.cloudflarestorage.com/portfolio-artifacts-22-dev
bun run build
bun run preview
bun run deploy
Run a local worker, then execute the API integration test. This requires a real Clerk session token.
bunx wrangler dev --port 8787
E2E_BASE_URL=http://localhost:8787 CLERK_TEST_TOKEN=... bunx playwright test e2e/categories.api.spec.ts
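For orientation, the spec uses Playwright's request fixture; a minimal sketch is below. The endpoint, payload, and the assumption that the Clerk session token is passed as a Bearer header are illustrative, not taken from e2e/categories.api.spec.ts.

// Illustrative API test sketch — not the contents of e2e/categories.api.spec.ts.
import { test, expect } from '@playwright/test';

const baseURL = process.env.E2E_BASE_URL ?? 'http://localhost:8787';
const token = process.env.CLERK_TEST_TOKEN;

test('published projects are publicly readable', async ({ request }) => {
  test.skip(!token, 'CLERK_TEST_TOKEN is required for API integration tests');
  // Public read: no Authorization header needed for published items.
  const publicRes = await request.get(`${baseURL}/api/projects`);
  expect(publicRes.ok()).toBeTruthy();
  // Authenticated request: the Clerk session token is assumed to go in a Bearer header.
  const authedRes = await request.get(`${baseURL}/api/projects`, {
    headers: { Authorization: `Bearer ${token}` },
  });
  expect(authedRes.ok()).toBeTruthy();
});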
Secrets must be set via wrangler secret put — never in wrangler.toml.
bunx wrangler secret put CLERK_SECRET_KEY
bunx wrangler secret put GOOGLE_CLIENT_SECRET
bunx wrangler secret put GOOGLE_TOKEN_ENCRYPTION_KEY
For local dev, add secrets to both .env and .dev.vars.
Example .dev.vars:
CLERK_SECRET_KEY=sk_test_...
GOOGLE_CLIENT_ID=...
GOOGLE_CLIENT_SECRET=...
GOOGLE_TOKEN_ENCRYPTION_KEY=...
GOOGLE_TOKEN_ENCRYPTION_KEY is a self-generated key for encrypting Google OAuth tokens in D1. Generate one with:
openssl rand -base64 32
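The key is 32 random bytes, base64-encoded. As an illustration of how such a key can be used with the Web Crypto API available in Workers (the helper below is hypothetical; the repo's actual encryption code may differ):

// Hypothetical helper: encrypt an OAuth token with the base64 AES key before writing it to D1.
async function encryptToken(plaintext: string, base64Key: string): Promise<string> {
  const keyBytes = Uint8Array.from(atob(base64Key), (c) => c.charCodeAt(0));
  const key = await crypto.subtle.importKey('raw', keyBytes, 'AES-GCM', false, ['encrypt']);
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh 96-bit nonce per encryption
  const ciphertext = await crypto.subtle.encrypt(
    { name: 'AES-GCM', iv },
    key,
    new TextEncoder().encode(plaintext)
  );
  // Store the IV alongside the ciphertext so the token can be decrypted later.
  const combined = new Uint8Array(iv.length + ciphertext.byteLength);
  combined.set(iv, 0);
  combined.set(new Uint8Array(ciphertext), iv.length);
  return btoa(String.fromCharCode(...combined));
}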
The Google Photos integration imports photos and videos into projects via the Picker API.
Configuration:
- OAuth redirect URIs: http://localhost:8788/api/integrations/google-photos/callback (local) and https://erhathaway.com/api/integrations/google-photos/callback (production)
- GOOGLE_CLIENT_ID in wrangler.toml [vars]
- GOOGLE_CLIENT_SECRET and GOOGLE_TOKEN_ENCRYPTION_KEY as secrets (see above)
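The callback route itself is a standard OAuth 2.0 authorization-code exchange. A simplified, hypothetical sketch of such a handler follows; the real route at /api/integrations/google-photos/callback likely also encrypts the tokens and persists them to D1.

// Hypothetical sketch of src/routes/api/integrations/google-photos/callback/+server.ts
import type { RequestHandler } from '@sveltejs/kit';

export const GET: RequestHandler = async ({ url, platform }) => {
  const code = url.searchParams.get('code');
  if (!code) return new Response('Missing code', { status: 400 });
  const env = platform?.env as Record<string, string> | undefined;
  if (!env) return new Response('Platform bindings unavailable', { status: 500 });
  // Exchange the authorization code for tokens at Google's token endpoint.
  const res = await fetch('https://oauth2.googleapis.com/token', {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({
      code,
      client_id: env.GOOGLE_CLIENT_ID,
      client_secret: env.GOOGLE_CLIENT_SECRET,
      redirect_uri: `${url.origin}/api/integrations/google-photos/callback`,
      grant_type: 'authorization_code',
    }),
  });
  const tokens: { refresh_token?: string } = await res.json();
  // A real handler would encrypt the tokens with GOOGLE_TOKEN_ENCRYPTION_KEY, store them in D1,
  // and redirect back into the admin UI rather than returning JSON.
  return new Response(JSON.stringify({ ok: res.ok, hasRefreshToken: Boolean(tokens.refresh_token) }), {
    headers: { 'Content-Type': 'application/json' },
  });
};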
Images are optimized with a two-layer strategy:

On upload, images are converted to AVIF and WebP via Cloudflare Image Resizing and stored in R2. Artifacts track available formats in imageFormats (e.g. ['avif', 'webp']) and are rendered as <picture> elements with a <source> per format.
When PUBLIC_CF_IMAGE_RESIZING=true is set (production only), images are served at 400w, 800w, and 1200w via Cloudflare Image Resizing (/cdn-cgi/image/ URLs) in srcset attributes. The browser picks the best candidate based on the sizes attribute.
Important: AVIF cannot be used as an Image Resizing source (HTTP 415, error 9520). All resizing requests use the WebP variant as input, with format=auto for output so Cloudflare auto-negotiates AVIF or WebP based on the browser’s Accept header.
In local dev, Image Resizing is disabled and images fall back to plain src URLs with no srcset.
All responsive image logic lives in src/lib/utils/image-formats.ts.
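For orientation, the kind of helper that file contains looks roughly like the sketch below. The function names are hypothetical; the widths, the WebP-only source rule, and format=auto are as described above.

// Hypothetical sketch of the srcset logic described above; the real implementation
// lives in src/lib/utils/image-formats.ts and may differ in detail.
const WIDTHS = [400, 800, 1200];

function resizedUrl(webpUrl: string, width: number): string {
  // Cloudflare Image Resizing URL: /cdn-cgi/image/<options>/<source>.
  // The source must be the WebP variant (AVIF sources fail with 415 / error 9520);
  // format=auto lets the edge return AVIF or WebP based on the Accept header.
  return `/cdn-cgi/image/width=${width},format=auto/${webpUrl.replace(/^\//, '')}`;
}

export function buildSrcset(webpUrl: string, resizingEnabled: boolean): string | undefined {
  if (!resizingEnabled) return undefined; // local dev: plain src, no srcset
  return WIDTHS.map((w) => `${resizedUrl(webpUrl, w)} ${w}w`).join(', ');
}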
API:
- /api endpoints (no cross-origin)
- /api/projects endpoints are available for CRUD; reads are public for published items
- /api/uploads/artifacts handles image uploads to R2 and returns a URL for artifacts

Database (D1):
- DB binding in wrangler.toml
- Drizzle client: src/lib/server/db/index.ts (getDb)
- event.locals.db is set in src/hooks.server.ts (see the sketch after the migration commands below)
- Schema: src/lib/server/db/schema.ts
- API spec: docs/api-spec.md

Storage (R2):
- ARTIFACTS binding in wrangler.toml
- preview_bucket_name for remote dev (wrangler dev --remote)
- PUBLIC_R2_BASE_URL (leave empty in .env.local to use proxy URLs)
# Generate SQL from the schema
bun run db:generate
# Apply a generated migration to D1
bunx wrangler d1 execute portfolio-db --file=./drizzle/<migration>.sql
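For orientation, the getDb / locals.db wiring described above typically looks something like the sketch below (simplified and assumption-laden; the real code is in src/lib/server/db/index.ts and src/hooks.server.ts).

// Simplified sketch: wrap the D1 binding in a Drizzle client and expose it on locals.
import { drizzle } from 'drizzle-orm/d1';
import type { Handle } from '@sveltejs/kit';
import * as schema from '$lib/server/db/schema';

export function getDb(d1: D1Database) {
  return drizzle(d1, { schema });
}

export const handle: Handle = async ({ event, resolve }) => {
  const d1 = event.platform?.env.DB; // D1 binding from wrangler.toml
  if (d1) {
    // locals.db is assumed to be declared in src/app.d.ts.
    event.locals.db = getDb(d1);
  }
  return resolve(event);
};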