feat: script to notify users to migrate to v2 #3924
Conversation
Caution: Review failed. The pull request is closed.

📝 Walkthrough

Adds a new React Email template for API v1 migration, a Resend client method to render/send that email, a migration script that queries ClickHouse and MySQL and notifies workspace members via WorkOS/Resend, and updates the migrate tool dependencies.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    autonumber
    actor Operator
    participant Script as v1_deprecation.ts
    participant CH as ClickHouse
    participant DB as MySQL/Drizzle
    participant WorkOS as WorkOS API
    participant Resend as Resend Client
    Operator->>Script: run main()
    Script->>CH: query rows WHERE path LIKE "/v1/%"
    CH-->>Script: rows (workspace_id, path)
    Script->>Script: aggregate paths per workspace
    loop per workspace
        Script->>DB: find workspace by id
        alt found
            Script->>WorkOS: list org memberships
            WorkOS-->>Script: memberships
            loop per member
                Script->>WorkOS: getUser(member.userId)
                WorkOS-->>Script: user details
                Script->>Script: wait 500ms
                Script->>Resend: sendApiV1MigrationEmail({email, name, workspace, deprecatedEndpoints})
                Resend-->>Script: success / error
            end
        else not found
            Script->>Script: log missing workspace
        end
    end
    Script->>Script: log Emails sent: N
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
Suggested reviewers
Tip: 👮 Agentic pre-merge checks are now available in preview! Pro plan users can now enable pre-merge checks in their settings to enforce checklists before merging PRs.

Example:

```yaml
reviews:
  pre_merge_checks:
    custom_checks:
      - name: "Undocumented Breaking Changes"
        mode: "warning"
        instructions: |
          Flag potential breaking changes that are not documented:
          1. Identify changes to public APIs/exports, CLI flags, environment variables, configuration keys, database schemas, or HTTP/GraphQL endpoints (including removed/renamed items and changes to types, required params, return values, defaults, or behavior).
          2. Ignore purely internal/private changes (e.g., code not exported from package entry points or marked internal).
          3. Verify documentation exists: a "Breaking Change" section in the PR description and updates to CHANGELOG.md.
```

Please share your feedback with us on this Discord post.

📜 Recent review details

Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro

⛔ Files ignored due to path filters (1)
📒 Files selected for processing (1)
Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.
Thank you for following the naming conventions for pull request titles! 🙏
Actionable comments posted: 16
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
tools/migrate/package.json (1)
4-6: Expose a dedicated script entry to run this migration.

You introduced v1_deprecation.ts, but the package only exposes ./main.ts. Add a script for the new entrypoint. Apply:

```diff
 "scripts": {
-  "migrate": "dotenv -e .env npx tsx ./main.ts"
+  "migrate": "dotenv -e .env npx tsx ./main.ts",
+  "migrate:v1-deprecation": "dotenv -e .env npx tsx ./v1_deprecation.ts"
 },
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: ASSERTIVE
Plan: Pro
⛔ Files ignored due to path filters (1)
pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (4)
- internal/resend/emails/api_v1_migration.tsx (1 hunks)
- internal/resend/src/client.tsx (2 hunks)
- tools/migrate/package.json (1 hunks)
- tools/migrate/v1_deprecation.ts (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
- GitHub Check: Test API / API Test Local
- GitHub Check: Test Go API Local / Test
- GitHub Check: Build / Build
🔇 Additional comments (1)
internal/resend/src/client.tsx (1)
129-131: Verify sender domain is authorized for Resend.

The new sender andreas@updates.unkey.com must be verified in Resend to avoid bounces and DMARC failures. Please confirm in the Resend dashboard that updates.unkey.com is verified and has DKIM/SPF configured.
…ttps://github.com/unkeyed/unkey into 09-07-feat_script_to_notify_users_to_migrate_to_v2
chronark
left a comment
- excluded special customers
- slowed everything down to respect workos and resend limits
check it out @perkinsjr
Actionable comments posted: 2
♻️ Duplicate comments (7)
tools/migrate/v1_deprecation.ts (7)
7-24: Add upfront environment validation to fail fast with clear messages.

The script requires multiple environment variables but doesn't validate them upfront. Missing credentials will cause runtime failures deep in the execution flow.

Add environment validation at the start of the main function:

```diff
+import { z } from "zod";
+
+const envSchema = z.object({
+  CLICKHOUSE_URL: z.string().url(),
+  DATABASE_USERNAME: z.string().min(1),
+  DATABASE_PASSWORD: z.string().min(1),
+  DATABASE_HOST: z.string().min(1),
+  WORKOS_API_KEY: z.string().min(1),
+  RESEND_API_KEY: z.string().min(1),
+  DATABASE_SSL: z.string().optional(),
+  DRY_RUN: z.enum(["true", "false", "1", "0"]).optional(),
+  MIGRATION_EMAIL_ALLOWLIST: z.string().optional(),
+});
+
 async function main() {
+  const env = envSchema.parse(process.env);
   const ch = new ClickHouse({
-    url: process.env.CLICKHOUSE_URL,
+    url: env.CLICKHOUSE_URL,
   });
```
12-14: MySQL connection string is vulnerable to injection and misconfigures TLS.

The current approach has security and reliability issues:

- Interpolating credentials directly into the DSN can break with special characters
- ssl={} is non-standard and may silently fall back to plaintext connections

Use the options object pattern with proper TLS configuration:

```diff
-  const conn = await mysql.createConnection(
-    `mysql://${process.env.DATABASE_USERNAME}:${process.env.DATABASE_PASSWORD}@${process.env.DATABASE_HOST}:3306/unkey?ssl={}`,
-  );
+  const conn = await mysql.createConnection({
+    host: process.env.DATABASE_HOST!,
+    port: 3306,
+    user: process.env.DATABASE_USERNAME!,
+    password: process.env.DATABASE_PASSWORD!,
+    database: "unkey",
+    // Default to TLS unless explicitly disabled
+    ssl: process.env.DATABASE_SSL?.toLowerCase() === "false" ? undefined : {},
+  });
```
48-53: Add path deduplication to prevent duplicate entries.

While the GROUP BY should prevent duplicates, it's safer to deduplicate paths when building the map to handle any future query changes.

```diff
   const workspaceToPaths = new Map<string, string[]>();
   for (const row of rows.val) {
-    const paths = workspaceToPaths.get(row.workspace_id) || [];
-    paths.push(row.path);
-    workspaceToPaths.set(row.workspace_id, paths);
+    const current = workspaceToPaths.get(row.workspace_id) ?? [];
+    if (!current.includes(row.path)) {
+      current.push(row.path);
+    }
+    workspaceToPaths.set(row.workspace_id, current);
   }
```
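A Set-based variant of the same aggregation can be sketched standalone; the row shape here mirrors the ClickHouse result, but the names are illustrative, not the script's actual types:

```typescript
// Aggregate deprecated paths per workspace, with duplicates dropped for free.
// UsageRow is a hypothetical stand-in for the ClickHouse row type.
type UsageRow = { workspace_id: string; path: string };

function groupPathsByWorkspace(rows: UsageRow[]): Map<string, Set<string>> {
  const byWorkspace = new Map<string, Set<string>>();
  for (const row of rows) {
    const paths = byWorkspace.get(row.workspace_id) ?? new Set<string>();
    paths.add(row.path); // Set membership makes re-adding a no-op
    byWorkspace.set(row.workspace_id, paths);
  }
  return byWorkspace;
}
```

Using a `Set` keeps the dedup guarantee even if the query's GROUP BY changes later.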
55-66: Avoid logging PII in plaintext.

The script logs workspace IDs, paths, and workspace names, which could contain sensitive information. These should be gated behind a debug flag.

```diff
   for (const [workspaceId, paths] of workspaceToPaths.entries()) {
-    console.log(workspaceId, paths);
+    if (process.env.DEBUG) {
+      console.log(workspaceId, paths);
+    }
     const workspace = await db.query.workspaces.findFirst({
       where: (table, { eq }) => eq(table.id, workspaceId),
     });
     if (!workspace) {
       console.error(`Workspace ${workspaceId} not found`);
       continue;
     }
-    console.log(workspace.name);
+    if (process.env.DEBUG) {
+      console.log(workspace.name);
+    }
```
68-74: Handle WorkOS pagination and implement rate limiting.

The current implementation has two critical issues:

- Only fetches the first 10 organization members (default limit)
- No rate limiting when fetching user details in parallel, which could overwhelm the WorkOS API

```diff
-  const members = await workos.userManagement.listOrganizationMemberships({
-    organizationId: workspace.orgId,
-  });
-
-  const users = await Promise.all(
-    members.data.map(async (member) => workos.userManagement.getUser(member.userId)),
-  );
+  // Fetch all members with pagination
+  const allMembers = [];
+  let cursor = undefined;
+  do {
+    const members = await workos.userManagement.listOrganizationMemberships({
+      organizationId: workspace.orgId,
+      limit: 100,
+      after: cursor,
+    });
+    allMembers.push(...members.data);
+    cursor = members.listMetadata?.after;
+  } while (cursor);
+
+  // Rate-limited batch processing (5 concurrent requests)
+  const users = [];
+  const BATCH_SIZE = 5;
+  for (let i = 0; i < allMembers.length; i += BATCH_SIZE) {
+    const batch = allMembers.slice(i, i + BATCH_SIZE);
+    const batchUsers = await Promise.all(
+      batch.map((member) => workos.userManagement.getUser(member.userId))
+    );
+    users.push(...batchUsers);
+    // Add delay between batches to respect rate limits
+    if (i + BATCH_SIZE < allMembers.length) {
+      await new Promise(resolve => setTimeout(resolve, 100));
+    }
+  }
```
76-88: Remove hard-coded email filter and implement proper allowlist/dry-run mode.

The current implementation:

- Hard-codes a specific email address for testing
- Throws an error after sending to that email, which aborts the entire migration
- Doesn't provide a way to test without actually sending emails
- Logs user emails (PII) unconditionally

```diff
+  const allowlist = (process.env.MIGRATION_EMAIL_ALLOWLIST ?? "")
+    .split(",")
+    .map((s) => s.trim().toLowerCase())
+    .filter(Boolean);
+
+  const isDryRun = ["true", "1"].includes(process.env.DRY_RUN ?? "");
+
   for (const user of users) {
-    console.log(user.email);
+    const email = (user.email ?? "").toLowerCase();
+    if (!email) continue;
+
+    // Skip if allowlist is defined and email is not in it
+    if (allowlist.length > 0 && !allowlist.includes(email)) {
+      if (process.env.DEBUG) {
+        console.log(`Skipping ${email} (not in allowlist)`);
+      }
+      continue;
+    }
-    if (user.email === "andreas@unkey.com") {
+    if (isDryRun) {
+      console.log(`[DRY_RUN] Would send migration email to ${email} for workspace ${workspace.name} with ${paths.length} deprecated endpoints`);
+    } else {
       await resend.sendApiV1MigrationEmail({
-        email: user.email,
-        name: user.firstName,
+        email,
+        name: user.firstName ?? email.split("@")[0],
         workspace: workspace.name,
         deprecatedEndpoints: paths,
       });
-      throw new Error(`User ${user.email} is from unkey.com`);
+      console.log(`Sent migration email to ${email}`);
     }
   }
```
79-85: Implement rate limiting for Resend API calls.

Sending emails in a tight loop without rate limiting could trigger Resend's rate limits and cause failures. Add a delay between email sends to respect rate limits:

```diff
   } else {
     await resend.sendApiV1MigrationEmail({
       email,
       name: user.firstName ?? email.split("@")[0],
       workspace: workspace.name,
       deprecatedEndpoints: paths,
     });
     console.log(`Sent migration email to ${email}`);
+    // Add delay to respect Resend rate limits (adjust as needed)
+    await new Promise(resolve => setTimeout(resolve, 200));
   }
```
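The fixed-delay pattern can be factored into a small helper; this is a sketch, and the delay value is an operational guess, not a documented Resend limit:

```typescript
// Run async tasks strictly one at a time with a fixed pause between them.
// Useful for spacing out calls to rate-limited APIs like Resend or WorkOS.
async function runSpaced<T>(
  tasks: Array<() => Promise<T>>,
  delayMs: number,
): Promise<T[]> {
  const results: T[] = [];
  for (const [i, task] of tasks.entries()) {
    results.push(await task());
    // Pause between tasks, but not after the last one
    if (i < tasks.length - 1) {
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  return results;
}
```

Because the delay lives in one place, tuning it (or swapping in a token-bucket limiter later) touches a single function.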
📒 Files selected for processing (1)
tools/migrate/v1_deprecation.ts (1 hunks)
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2025-01-31T13:50:45.004Z
Learnt from: chronark
PR: unkeyed/unkey#2825
File: apps/dashboard/app/(app)/logs/components/table/logs-table.tsx:100-118
Timestamp: 2025-01-31T13:50:45.004Z
Learning: Sensitive data in logs is masked at the API layer (apps/api/src/pkg/middleware/metrics.ts) before being written to Clickhouse. This includes redacting API keys, plaintext values, and authorization headers using regex replacements.
Applied to files:
tools/migrate/v1_deprecation.ts
🧬 Code graph analysis (1)
tools/migrate/v1_deprecation.ts (1)
internal/resend/src/client.tsx (1)
Resend(11-143)
Actionable comments posted: 5
♻️ Duplicate comments (6)
tools/migrate/v1_deprecation.ts (6)
12-14: Use MySQL options object and enable TLS by default; DSN is brittle.

Interpolated DSNs break with special characters, and ssl={} is non-standard.

Apply:

```diff
-  const conn = await mysql.createConnection(
-    `mysql://${process.env.DATABASE_USERNAME}:${process.env.DATABASE_PASSWORD}@${process.env.DATABASE_HOST}:3306/unkey?ssl={}`,
-  );
+  const conn = await mysql.createConnection({
+    host: ENV.DATABASE_HOST,
+    port: 3306,
+    user: ENV.DATABASE_USERNAME,
+    password: ENV.DATABASE_PASSWORD,
+    database: "unkey",
+    // Default to TLS unless explicitly disabled
+    ssl: (process.env.DATABASE_SSL ?? "").toLowerCase() === "false" ? undefined : {},
+  });
```
7-7: Validate required environment at startup; fail fast.

Avoid partial runs and undefined behavior when credentials are missing.

Apply above this line:

```diff
+const ENV = z
+  .object({
+    CLICKHOUSE_URL: z.string().min(1),
+    DATABASE_HOST: z.string().min(1),
+    DATABASE_USERNAME: z.string().min(1),
+    DATABASE_PASSWORD: z.string().min(1),
+    WORKOS_API_KEY: z.string().min(1),
+    RESEND_API_KEY: z.string().min(1),
+    MIGRATION_EMAIL_ALLOWLIST: z.string().optional(),
+    DRY_RUN: z.enum(["1", "0", "true", "false"]).optional(),
+    MIGRATION_CONCURRENCY: z.string().regex(/^\d+$/).optional(),
+  })
+  .parse(process.env);
```

And use it:

```diff
-  const ch = new ClickHouse({
-    url: process.env.CLICKHOUSE_URL,
-  });
+  const ch = new ClickHouse({ url: ENV.CLICKHOUSE_URL });
 ...
-  const workos = new WorkOS(process.env.WORKOS_API_KEY);
+  const workos = new WorkOS(ENV.WORKOS_API_KEY);
 ...
-  const resend = new Resend({
-    apiKey: process.env.RESEND_API_KEY,
-  });
+  const resend = new Resend({ apiKey: ENV.RESEND_API_KEY });
```
55-60: Deduplicate endpoint paths per workspace.

Guard against dupes and keep email payloads lean.

Apply:

```diff
-  const workspaceToPaths = new Map<string, string[]>();
+  const workspaceToPaths = new Map<string, Set<string>>();
   for (const row of rows.val) {
-    const paths = workspaceToPaths.get(row.workspace_id) || [];
-    paths.push(row.path);
-    workspaceToPaths.set(row.workspace_id, paths);
+    const paths = workspaceToPaths.get(row.workspace_id) ?? new Set<string>();
+    paths.add(row.path);
+    workspaceToPaths.set(row.workspace_id, paths);
   }
-  for (const [workspaceId, paths] of workspaceToPaths.entries()) {
+  for (const [workspaceId, pathSet] of workspaceToPaths.entries()) {
```

And when sending:

```diff
-        deprecatedEndpoints: paths,
+        deprecatedEndpoints: Array.from(pathSet),
```

Also applies to: 62-62, 88-89
62-64: Avoid logging PII; gate verbose logs behind DEBUG.

Workspace IDs, names, and endpoint paths can be sensitive.

Apply:

```diff
-    console.log(workspaceId, paths);
+    if (process.env.DEBUG) console.log(workspaceId, Array.from(pathSet));
 ...
-    console.log(workspace.name);
+    if (process.env.DEBUG) console.log(`workspace: ${workspace.id}`);
```

Also applies to: 73-73
97-97: Catch top-level errors and exit non-zero.

Prevents silent failures in CI/cron.

Apply:

```diff
-main();
+main().catch((error) => {
+  console.error("Migration script failed:", error);
+  process.exit(1);
+});
```
61-61: Log accurate counts after building the map.

Report unique workspaces and total paths.

Apply:

```diff
+  console.log(`Found ${workspaceToPaths.size} workspaces across ${rows.val.length} paths`);
```
🪛 GitHub Actions: autofix.ci
tools/migrate/v1_deprecation.ts
[error] 46-98: Parse errors due to unresolved merge conflict markers (<<<<<<< HEAD, =======, >>>>>>> 938c58...) in ./tools/migrate/v1_deprecation.ts during the 'pnpm biome format . --write && pnpm biome check . --write' step; formatting aborted due to parsing errors.
🔇 Additional comments (1)
tools/migrate/v1_deprecation.ts (1)
65-71: Guard against missing workspace.orgId before invoking WorkOS API.

In tools/migrate/v1_deprecation.ts, before the call to listOrganizationMemberships (on line 76), add a null check for workspace.orgId and skip workspaces without it to prevent runtime errors.
```sql
  workspace_id,
  splitByChar('?', path, 1)[1] as path
FROM metrics.api_requests_per_day_v1
WHERE startsWith(path, '/v1/')
```
can we exclude people who used the analytics endpoint? since thats not yet available in v2?
perkinsjr
left a comment
Just remove v1 analytics and the rest looks good
ah lol I had that in mind and then forgot, I agree
…at_script_to_notify_users_to_migrate_to_v2
* feat: local first ratelimits
* wip
* feat: sync drizzle to go/pkg/db/schema.sql (#3920)
  * feat: sync drizzle to go/pkg/db/schema.sql
  * fix: upgrade drizzle-kit to use required flag
  * fix: autogeneration notice
  Co-authored-by: Flo <53355483+Flo4604@users.noreply.github.com>
* feat: script to notify users to migrate to v2 (#3924)
  * feat: script to notify users to migrate to v2
  * Apply suggestions from code review
  * [autofix.ci] apply automated fixes
  * fix: address james' feedback
  * merge
  * fix: exclude analytics
  Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
* fix: prevent duplicates
* fix: make reset async
* fix: 404
* fix: rename variable for clarity
* fix: reset in parallel
* chore: revert zod version
* fix: I don't even know, this restacking is fucking with my breain
* chore: match zod schema

Co-authored-by: Flo <53355483+Flo4604@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
* feat: local first ratelimits
* wip
* feat: sync drizzle to go/pkg/db/schema.sql (#3920)
  * feat: sync drizzle to go/pkg/db/schema.sql
  * fix: upgrade drizzle-kit to use required flag
  * fix: autogeneration notice
  Co-authored-by: Flo <53355483+Flo4604@users.noreply.github.com>
* feat: script to notify users to migrate to v2 (#3924)
  * feat: script to notify users to migrate to v2
  * Apply suggestions from code review
  * [autofix.ci] apply automated fixes
  * fix: address james' feedback
  * merge
  * fix: exclude analytics
  Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
* fix: prevent duplicates
* fix: make reset async
* fix: 404
* fix: rename variable for clarity
* fix: reset in parallel
* chore: revert zod version
* fix: I don't even know, this restacking is fucking with my breain
* chore: match zod schema
* refactor: migrate projects to TanStack React DB
* wip
* revert: uncommit files
* wip: it builds again, I gotta go
* fix: deployment filters
* fix: make it build
* Apply suggestions from code review

Co-authored-by: Flo <53355483+Flo4604@users.noreply.github.com>
Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>

What does this PR do?
Add API v1 migration email notification system to inform users about the upcoming deprecation of v1 endpoints. This includes:
Type of change
How should this be tested?
Checklist
Required
pnpm build
pnpm fmt
console.logs
git pull origin main

Appreciated
Summary by CodeRabbit
New Features
Chores