Add upstream to my fork #684
apps/web/app/layout.tsx

```diff
@@ -12,6 +12,7 @@ import { env } from "@/env";
 import { GlobalProviders } from "@/providers/GlobalProviders";
 import { UTM } from "@/app/utm";
 import { startupImage } from "@/app/startup-image";
+import { StartupMigrations } from "@/components/StartupMigrations";
```
Contributor

💡 Verification agent 🧩 Analysis chain

Avoid running startup migrations in the layout render path.

Running migrations during RootLayout render can add latency per request, trigger multiple concurrent runs under load, and entangle UI with ops logic. Prefer Next.js `instrumentation.ts` to run once per server boot.

Apply this change:

```diff
-import { StartupMigrations } from "@/components/StartupMigrations";
@@
-        <StartupMigrations />
```

```typescript
// apps/web/instrumentation.ts
import { runStartupMigrations } from "@/utils/startup";

export async function register() {
  await runStartupMigrations();
}
```

Ensure runStartupMigrations is idempotent and protected by a distributed lock (e.g., a Postgres advisory lock or a Redis lock) to prevent concurrent execution across replicas.

Also applies to: 88-88

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail

echo "Searching for StartupMigrations usages..."
rg -n --hidden --no-ignore --smart-case "StartupMigrations" -S -C3 || true
echo
echo "Searching for runStartupMigrations usages/definitions..."
rg -n --hidden --no-ignore --smart-case "runStartupMigrations" -S -C3 || true
echo
echo "Looking for apps/web/app/layout.tsx file..."
rg -n --hidden --no-ignore --smart-case --glob "apps/web/**" "layout.tsx" -S -C3 || true

# If apps/web/app/layout.tsx exists, print it
if [ -f "apps/web/app/layout.tsx" ]; then
  echo
  echo "----- apps/web/app/layout.tsx -----"
  sed -n '1,240p' apps/web/app/layout.tsx || true
else
  echo
  echo "apps/web/app/layout.tsx not found at that exact path. Searching for any layout.tsx..."
  rg -n --hidden --no-ignore --smart-case "layout.tsx" -S -C3 || true
fi
echo
echo "Searching for instrumentation.ts files..."
rg -n --hidden --no-ignore --smart-case "instrumentation.ts" -S -C3 || true
echo
echo "Searching for utils/startup or similar startup utilities..."
rg -n --hidden --no-ignore --smart-case "startup" -S -C3 || true
```

Length of output: 20456

Move startup migrations out of RootLayout: run once on server boot via instrumentation.register.

Verified: apps/web/app/layout.tsx imports and renders StartupMigrations (which calls runStartupMigrations in apps/web/utils/startup.ts). No apps/web/instrumentation.ts exists. Running migrations in RootLayout causes per-request latency and concurrent-run risk.

Files to change:

Apply these edits. Remove from apps/web/app/layout.tsx:

```diff
-import { StartupMigrations } from "@/components/StartupMigrations";
@@
-        <StartupMigrations />
```

Add apps/web/instrumentation.ts:

```typescript
// apps/web/instrumentation.ts
import { runStartupMigrations } from "@/utils/startup";

export async function register() {
  await runStartupMigrations();
}
```

Ensure runStartupMigrations is idempotent and guarded by a distributed lock (Postgres advisory lock or Redis lock) to prevent concurrent execution across replicas.
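The distributed-lock guard both versions of the comment call for can be sketched as follows. This is a minimal illustration, not code from the PR: the `PgLikeClient` shape, the lock key, and the `runWithAdvisoryLock` name are assumptions; only `pg_try_advisory_lock`/`pg_advisory_unlock` are real Postgres functions.

```typescript
// Sketch: let exactly one replica run the migration. pg_try_advisory_lock
// returns false immediately when another session already holds the lock,
// so losing replicas skip the work instead of blocking or double-running.
type QueryResult = { rows: Array<Record<string, unknown>> };
type PgLikeClient = {
  query: (text: string, values?: unknown[]) => Promise<QueryResult>;
};

async function runWithAdvisoryLock(
  client: PgLikeClient,
  lockKey: number,
  task: () => Promise<void>,
): Promise<boolean> {
  const res = await client.query(
    "SELECT pg_try_advisory_lock($1) AS locked",
    [lockKey],
  );
  if (!res.rows[0]?.locked) return false; // another instance is migrating
  try {
    await task();
    return true;
  } finally {
    await client.query("SELECT pg_advisory_unlock($1)", [lockKey]);
  }
}
```

`register()` in instrumentation.ts could then wrap `runStartupMigrations` in this guard, with idempotent migrations as a second line of defense.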
```diff
 const inter = Inter({
   subsets: ["latin"],
@@ -84,6 +85,7 @@ export default async function RootLayout({
       <body
         className={`h-full ${inter.variable} ${calFont.variable} font-sans antialiased`}
       >
+        <StartupMigrations />
         <PostHogProvider>
           <Suspense>
             <PostHogPageview />
```
components/StartupMigrations.tsx (new file)

```diff
@@ -0,0 +1,13 @@
+import { runStartupMigrations } from "@/utils/startup";
+
+/**
+ * Server component that runs startup migrations
+ * This component will run on the server side when the app loads
+ */
+export async function StartupMigrations() {
+  // Run migrations on server side during page load
+  await runStartupMigrations();
+
+  // Return nothing - this is just for side effects
+  return null;
+}
```

Comment on lines +7 to +13
Contributor

🛠️ Refactor suggestion

Running migrations on every layout render will add latency and DB load.

Because this server component is rendered with the root layout, it'll execute on most requests. Prefer a once-per-process trigger (or Next.js instrumentation.ts) to avoid per-request overhead.

Apply this diff to execute once per server instance:

```diff
 import { runStartupMigrations } from "@/utils/startup";

 /**
  * Server component that runs startup migrations
  * This component will run on the server side when the app loads
  */
 export async function StartupMigrations() {
-  // Run migrations on server side during page load
-  await runStartupMigrations();
+  // Ensure migrations run only once per server instance
+  // Module scope guarantees persistence across requests in the same process
+  let startupMigrationsPromise: Promise<void> | null | undefined = (
+    globalThis as unknown as { __IZ_STARTUP_MIGRATIONS_PROMISE?: Promise<void> }
+  ).__IZ_STARTUP_MIGRATIONS_PROMISE;
+  if (!startupMigrationsPromise) {
+    startupMigrationsPromise = runStartupMigrations();
+    (
+      globalThis as unknown as { __IZ_STARTUP_MIGRATIONS_PROMISE?: Promise<void> }
+    ).__IZ_STARTUP_MIGRATIONS_PROMISE = startupMigrationsPromise;
+  }
+  await startupMigrationsPromise;

   // Return nothing - this is just for side effects
   return null;
 }
```

Alternatively, move this into apps/web/instrumentation.ts to run once at server start. I can provide the instrumentation.ts variant if you prefer.
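Stripped of the component wrapper, the suggested pattern is a promise memoized on `globalThis`; a runnable toy version of the same idea (the counter and names are illustrative, not part of the suggestion):

```typescript
// Sketch: memoize the migration promise on globalThis so that repeated
// renders within one server process trigger exactly one run. Concurrent
// callers all await the same promise.
let migrationRuns = 0;

function runStartupMigrationsOnce(): Promise<void> {
  const g = globalThis as unknown as {
    __STARTUP_MIGRATIONS_PROMISE?: Promise<void>;
  };
  if (!g.__STARTUP_MIGRATIONS_PROMISE) {
    migrationRuns += 1; // stand-in for the real migration work
    g.__STARTUP_MIGRATIONS_PROMISE = Promise.resolve();
  }
  return g.__STARTUP_MIGRATIONS_PROMISE;
}
```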
env.ts

```diff
@@ -18,7 +18,8 @@ export const env = createEnv({
     NODE_ENV: z.enum(["development", "production", "test"]),
     DATABASE_URL: z.string().url(),

-    NEXTAUTH_SECRET: z.string().min(1),
+    NEXTAUTH_SECRET: z.string().min(1).optional(), // Legacy support for migration
+    BETTER_AUTH_SECRET: z.string().min(1).optional(),
```

Comment on lines +21 to +22
Contributor

💡 Verification agent 🧩 Analysis chain

Scan for accidental hardcoded provider tokens.

Given the sensitive nature of the auth and queue/redis providers, ensure no Upstash/QStash secrets are hardcoded in source files. Use this secret scan to flag suspicious assignments (excludes .env files):

🏁 Script executed:

```shell
#!/bin/bash
# Grep for potentially hardcoded UPSTASH/QSTASH tokens in code (not env files)
rg -n -C1 -P '(UPSTASH|QSTASH)[A-Z0-9_]*\s*[:=]\s*["\'][^"\']{8,}["\']' -g '!**/.env*' -g '!**/node_modules/**'
```

Length of output: 116

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail

echo "Searching for provider names (upstash/qstash)..."
rg -n -C1 -P "(?i)\b(upstash|qstash)\b" -g '!**/.env*' -g '!**/node_modules/**' || echo "No matches for provider names."

echo "Searching for secret/token/key assignments (>=8 chars)..."
rg -n -C1 -P "(?i)\b(secret|token|key)\b\s*[:=]\s*['\"][A-Za-z0-9_\-+/=]{8,}['\"]" -g '!**/.env*' -g '!**/node_modules/**' || echo "No matches for secret assignments."

echo "Searching for long string literals (>=20 chars)..."
rg -n -C1 -P "['\"][A-Za-z0-9_\-+/=]{20,}['\"]" -g '!**/.env*' -g '!**/node_modules/**' || echo "No long string literals found."
```

Length of output: 240467

Hardcoded Upstash/QStash tokens found: please remove and rotate immediately.

Scan results show committed token literals in the repo. Actionable locations:

Recommended fixes (concise):
```diff
     NEXTAUTH_URL: z.string().optional(),
     AUTH_TRUST_HOST: z.coerce.boolean().optional(),
```
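With both secrets optional, nothing in the schema enforces that at least one is actually set. A cross-field check closes that gap; the sketch below is framework-free for clarity (in the real env.ts it could live in a zod `.refine()` on the schema), and `assertAuthSecretPresent` is a hypothetical name:

```typescript
// Sketch: require at least one auth secret at startup so a misconfigured
// deploy fails loudly rather than at the first sign-in attempt.
type AuthSecretEnv = {
  NEXTAUTH_SECRET?: string;
  BETTER_AUTH_SECRET?: string;
};

function assertAuthSecretPresent(env: AuthSecretEnv): string {
  const secret = env.BETTER_AUTH_SECRET ?? env.NEXTAUTH_SECRET;
  if (!secret) {
    throw new Error(
      "Set BETTER_AUTH_SECRET (preferred) or NEXTAUTH_SECRET before starting the app.",
    );
  }
  return secret;
}
```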
```diff
@@ -15,6 +15,9 @@ const withMDX = nextMdx();

 const nextConfig: NextConfig = {
   reactStrictMode: true,
+  typescript: {
+    ignoreBuildErrors: true,
+  },
```

Comment on lines +18 to +20
Contributor

Don't ship with TypeScript errors suppressed in production.

Apply this diff to only ignore in non-production:

```diff
   typescript: {
-    ignoreBuildErrors: true,
+    ignoreBuildErrors: env.NODE_ENV !== "production",
   },
```
```diff
   serverExternalPackages: ["@sentry/nextjs", "@sentry/node"],
   turbopack: {
     rules: {
```
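If the escape hatch is meant only for local iteration, the flag can be derived from the environment rather than hardcoded, as the review comment above suggests; a sketch of that conditional (the helper name is illustrative):

```typescript
// Sketch: suppress TypeScript build errors only outside production builds,
// so CI and production deploys still fail on type errors.
function shouldIgnoreBuildErrors(nodeEnv: string | undefined): boolean {
  return nodeEnv !== "production";
}

const typescriptOptions = {
  ignoreBuildErrors: shouldIgnoreBuildErrors(process.env.NODE_ENV),
};
```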
```diff
@@ -0,0 +1,14 @@
+/*
+  Warnings:
+
+  - You are about to drop the column `digestScheduleId` on the `EmailAccount` table. All the data in the column will be lost.
+
+*/
+-- DropForeignKey
+ALTER TABLE "EmailAccount" DROP CONSTRAINT "EmailAccount_digestScheduleId_fkey";
+
+-- DropIndex
+DROP INDEX "EmailAccount_digestScheduleId_key";
+
+-- AlterTable
+ALTER TABLE "EmailAccount" DROP COLUMN "digestScheduleId";
```
```diff
@@ -19,7 +19,9 @@ model Account {
   providerAccountId String
   refresh_token     String?   @db.Text
   access_token      String?   @db.Text
-  expires_at        DateTime? @default(now())
+  expires_at            DateTime?
+  accessTokenExpiresAt  DateTime?
+  refreshTokenExpiresAt DateTime?
```

Comment on lines +22 to +24
Contributor

💡 Verification agent 🧩 Analysis chain

Verify data migration and usage consistency for the new token expiry fields.

You added accessTokenExpiresAt and refreshTokenExpiresAt and left expires_at as optional. Ensure:

Run this script to check for remaining code references and migration presence. If gaps are found, I can generate a safe, idempotent Prisma migration to backfill accessTokenExpiresAt from expires_at.

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail

echo "Searching for legacy 'expires_at' usage in TS/TSX..."
rg -nP --type=ts --type=tsx -C2 '\bexpires_at\b'
echo
echo "Searching for new fields usage..."
rg -nP --type=ts --type=tsx -C2 '\b(accessTokenExpiresAt|refreshTokenExpiresAt)\b'
echo
echo "Checking Prisma migrations mentioning the new fields..."
fd -H migrations | xargs -I{} rg -nP -C2 '(accessTokenExpiresAt|refreshTokenExpiresAt|expires_at)' {}
```

Length of output: 219

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail

echo "1) Showing apps/web/prisma/schema.prisma (if present)..."
if [ -f "apps/web/prisma/schema.prisma" ]; then
  sed -n '1,240p' apps/web/prisma/schema.prisma || true
else
  echo "apps/web/prisma/schema.prisma not found"
fi
echo
echo "2) Searching for 'expires_at' (snake_case) across repository..."
rg -n -C2 '\bexpires_at\b' --hidden --glob '!**/node_modules/**' || true
echo
echo "3) Searching for 'expiresAt' (camelCase) across repository..."
rg -n -C2 '\bexpiresAt\b' --hidden --glob '!**/node_modules/**' || true
echo
echo "4) Searching for new fields 'accessTokenExpiresAt' or 'refreshTokenExpiresAt'..."
rg -n -C2 '\b(accessTokenExpiresAt|refreshTokenExpiresAt)\b' --hidden --glob '!**/node_modules/**' || true
echo
echo "5) Searching .prisma files and migrations for any mentions..."
rg -n -C2 '(accessTokenExpiresAt|refreshTokenExpiresAt|expires_at)' --hidden --glob '**/*.prisma' --glob '!**/node_modules/**' || true
rg -n -C2 '(accessTokenExpiresAt|refreshTokenExpiresAt|expires_at)' --hidden --glob '**/migrations/**' --glob '!**/node_modules/**' || true
```

Length of output: 45170

🏁 Script executed:

```shell
#!/bin/bash
set -euo pipefail
files=(
  "apps/web/utils/migration/fix-encrypted-tokens.ts"
  "apps/web/utils/auth.ts"
  "apps/web/utils/api-auth.ts"
  "apps/web/prisma/migrations/20250804163003_better_auth/migration.sql"
  "apps/web/prisma/migrations/20230730073019_init/migration.sql"
)
for f in "${files[@]}"; do
  echo "---- $f ----"
  if [ -f "$f" ]; then
    sed -n '1,240p' "$f" || true
  else
    echo "MISSING: $f"
  fi
  echo
done
```

Length of output: 21914

Action required: migrate expires_at to accessTokenExpiresAt/refreshTokenExpiresAt and update codepaths.

Short summary: I searched the repo. accessTokenExpiresAt/refreshTokenExpiresAt were added to schema.prisma, but many code paths still read/write the old expires_at column, and the migrations do not add or backfill the new columns. You need a migration plus code updates.

Files/locations that still reference expires_at (representative):

Places already using the new fields:

Required fixes (high level):

If you want, I can generate the idempotent Prisma migration SQL (ADD COLUMN + UPDATE backfill) and a small codemod to replace common selects/usages, or a checklist of exact files/lines to change.
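The new accessTokenExpiresAt field is populated from OAuth token payloads in which expires_at is epoch seconds, while the column stores a Date. A small helper isolating the conversion used in saveTokens (the function name is hypothetical) makes the backfill logic easy to test:

```typescript
// Sketch: convert an OAuth-style epoch-seconds expiry into the Date stored
// in accessTokenExpiresAt. A missing (or zero/falsy) value maps to null,
// matching the ternary in saveTokens, so no expiry is ever fabricated.
function toAccessTokenExpiresAt(
  expiresAt: number | null | undefined,
): Date | null {
  return expiresAt ? new Date(expiresAt * 1000) : null;
}
```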
```diff
   token_type        String?
   scope             String?
   id_token          String?   @db.Text
```
```diff
@@ -1,7 +1,6 @@
 "use client";

 import { useChat as useAiChat } from "@ai-sdk/react";
-import { DefaultChatTransport } from "ai";
 import { parseAsString, useQueryState } from "nuqs";
 import {
   createContext,
@@ -47,20 +46,14 @@ export function ChatProvider({ children }: { children: React.ReactNode }) {

   const chat = useAiChat<ChatMessage>({
     id: chatId ?? undefined,
-    transport: new DefaultChatTransport({
-      api: "/api/chat",
-      headers: {
-        [EMAIL_ACCOUNT_HEADER]: emailAccountId,
-      },
-      prepareSendMessagesRequest({ messages, id, body }) {
-        return {
-          body: {
-            id,
-            message: messages.at(-1),
-            ...body,
-          },
-        };
-      },
+    api: "/api/chat",
+    headers: {
+      [EMAIL_ACCOUNT_HEADER]: emailAccountId,
+    },
+    experimental_prepareRequestBody: ({ id, messages, requestBody }) => ({
```

Comment on lines +49 to +53
Contributor

🛠️ Refactor suggestion

Don't send undefined header values.

If emailAccountId is null/undefined, you'll set an invalid header. Omit the header when it's not available.

Apply this diff:

```diff
-    api: "/api/chat",
-    headers: {
-      [EMAIL_ACCOUNT_HEADER]: emailAccountId,
-    },
+    api: "/api/chat",
+    headers: emailAccountId
+      ? {
+          [EMAIL_ACCOUNT_HEADER]: emailAccountId,
+        }
+      : undefined,
```
```diff
+      id,
+      message: messages.at(-1),
+      ...requestBody,
+    }),
     // TODO: couldn't get this to work
     // messages: initialMessages,
@@ -83,7 +76,7 @@ export function ChatProvider({ children }: { children: React.ReactNode }) {
   }, [chat.setMessages, data]);

   const handleSubmit = useCallback(() => {
-    chat.sendMessage({
+    chat.append({
       role: "user",
       parts: [
         {
@@ -94,7 +87,7 @@ export function ChatProvider({ children }: { children: React.ReactNode }) {
     });

     setInput("");
-  }, [chat.sendMessage, input]);
+  }, [chat.append, input]);

   return (
     <ChatContext.Provider
```
scripts/fix-corrupted-tokens.ts (new file)

```diff
@@ -0,0 +1,39 @@
+#!/usr/bin/env tsx
+
+/**
+ * Standalone script to fix corrupted tokens from NextAuth to Better-Auth migration
+ * Usage: pnpm exec tsx scripts/fix-corrupted-tokens.ts
+ */
+
+import { fixCorruptedTokens } from "@/utils/migration/fix-encrypted-tokens";
+
+async function main() {
+  console.log("🔧 Starting corrupted token cleanup...");
+
+  try {
+    const result = await fixCorruptedTokens();
+
+    console.log("✅ Token cleanup completed successfully!");
+    console.log("📊 Results:");
+    console.log(`   - Total accounts checked: ${result.total}`);
+    console.log(`   - Corrupted tokens fixed: ${result.fixed}`);
+
+    if (result.fixed > 0) {
+      console.log(
+        "\n⚠️ Users with affected accounts will need to re-authenticate on their next login.",
+      );
+      console.log(
+        "   This will happen automatically - no action required from users.",
+      );
+    } else {
+      console.log("🎉 No corrupted tokens found - all accounts are healthy!");
+    }
+
+    process.exit(0);
+  } catch (error) {
+    console.error("❌ Error during token cleanup:", error);
+    process.exit(1);
+  }
+}
+
+main();
```
```diff
@@ -43,7 +43,11 @@ export const betterAuthConfig = betterAuth({
   },
   baseURL: env.NEXT_PUBLIC_BASE_URL,
   trustedOrigins: [env.NEXT_PUBLIC_BASE_URL],
-  secret: process.env.NEXTAUTH_SECRET,
+  secret:
+    env.BETTER_AUTH_SECRET ||
+    env.NEXTAUTH_SECRET ||
+    process.env.BETTER_AUTH_SECRET ||
+    process.env.NEXTAUTH_SECRET,
```

Comment on lines +46 to +50
Contributor

🛠️ Refactor suggestion

Fail fast if the auth secret is missing to avoid runtime misconfiguration.

The OR chain can evaluate to undefined when both secrets are absent. better-auth expects a secret; proceed with an explicit check and a clear error.

Apply this change near the top of the file (before betterAuthConfig):

```typescript
const AUTH_SECRET =
  env.BETTER_AUTH_SECRET ??
  env.NEXTAUTH_SECRET ??
  process.env.BETTER_AUTH_SECRET ??
  process.env.NEXTAUTH_SECRET;

if (!AUTH_SECRET) {
  throw new Error(
    "Auth secret missing. Set BETTER_AUTH_SECRET (preferred) or NEXTAUTH_SECRET."
  );
}
```

Then update the config:

```diff
-  secret:
-    env.BETTER_AUTH_SECRET ||
-    env.NEXTAUTH_SECRET ||
-    process.env.BETTER_AUTH_SECRET ||
-    process.env.NEXTAUTH_SECRET,
+  secret: AUTH_SECRET,
```

I can send a full diff if you prefer it inline.
```diff
   emailAndPassword: {
     enabled: false,
   },
@@ -71,7 +75,8 @@ export const betterAuthConfig = betterAuth({
       providerId: "provider",
       refreshToken: "refresh_token",
       accessToken: "access_token",
-      accessTokenExpiresAt: "expires_at",
+      accessTokenExpiresAt: "accessTokenExpiresAt",
+      refreshTokenExpiresAt: "refreshTokenExpiresAt",
       idToken: "id_token",
     },
   },
@@ -430,7 +435,9 @@ export async function saveTokens({

   const data = {
     access_token: tokens.access_token,
-    expires_at: tokens.expires_at ? new Date(tokens.expires_at * 1000) : null,
+    accessTokenExpiresAt: tokens.expires_at
+      ? new Date(tokens.expires_at * 1000)
+      : null,
     refresh_token: refreshToken,
   };
```
Comment on lines 436 to 442

Contributor

Avoid clobbering refresh_token on encryption failure; skip the field update instead.

Currently, if encryptToken fails, refresh_token is set to an empty string, which can irreversibly break the account. Treat encryption failures as non-updates for that field, mirroring the access_token handling.

Apply this diff:

```diff
-  const data = {
-    access_token: tokens.access_token,
-    accessTokenExpiresAt: tokens.expires_at
-      ? new Date(tokens.expires_at * 1000)
-      : null,
-    refresh_token: refreshToken,
-  };
+  const data: {
+    access_token?: string;
+    accessTokenExpiresAt: Date | null;
+    refresh_token?: string;
+  } = {
+    accessTokenExpiresAt: tokens.expires_at
+      ? new Date(tokens.expires_at * 1000)
+      : null,
+  };
+  if (tokens.access_token) data.access_token = tokens.access_token;
+  if (refreshToken) data.refresh_token = refreshToken;
@@
-  if (data.access_token)
-    data.access_token = encryptToken(data.access_token) || undefined;
-  if (data.refresh_token)
-    data.refresh_token = encryptToken(data.refresh_token) || "";
+  if (data.access_token) {
+    const enc = encryptToken(data.access_token);
+    data.access_token = enc ?? undefined; // skip update on failure
+  }
+  if (data.refresh_token) {
+    const enc = encryptToken(data.refresh_token);
+    if (enc) {
+      data.refresh_token = enc;
+    } else {
+      // Skip refresh token update to avoid data loss
+      delete data.refresh_token;
+      logger.error("Failed to encrypt refresh token; skipping update", {
+        emailAccountId,
+        providerAccountId,
+        provider,
+      });
+    }
+  }
```

This keeps existing DB values intact when encryption fails, preventing accidental logout/breakage.

Also applies to: 448-452
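The skip-on-failure rule generalizes to a small pure helper that takes the encryptor as a parameter, which also makes the behavior easy to unit-test; a sketch with hypothetical names (the real code updates the Prisma data object inline):

```typescript
// Sketch: build a partial token update, omitting any field whose
// encryption failed, so existing DB values are never overwritten with
// an empty string during an encryption outage.
type Encryptor = (plain: string) => string | null;

function buildTokenUpdate(
  encrypt: Encryptor,
  tokens: { access_token?: string; refresh_token?: string },
): { access_token?: string; refresh_token?: string } {
  const update: { access_token?: string; refresh_token?: string } = {};
  if (tokens.access_token) {
    const enc = encrypt(tokens.access_token);
    if (enc) update.access_token = enc; // skip the field on failure
  }
  if (tokens.refresh_token) {
    const enc = encrypt(tokens.refresh_token);
    if (enc) update.refresh_token = enc; // never write "" over a good token
  }
  return update;
}
```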
```diff
@@ -60,6 +60,12 @@ export function decryptToken(encryptedText: string | null): string | null {
   try {
     const buffer = Buffer.from(encryptedText, "hex");

+    // Validate buffer length - must contain at least IV + Auth Tag
+    if (buffer.length < IV_LENGTH + AUTH_TAG_LENGTH) {
+      logger.warn("Encrypted token too short, likely corrupted");
+      return null;
+    }
```

Comment on lines +63 to +67
Contributor

🛠️ Refactor suggestion

Add hex-format validation before/around decryption to prevent truncated/invalid buffers.

Buffer.from with "hex" will silently drop an odd trailing nibble or non-hex chars, which can pass the length check and cause misleading errors. Validate that the input is even-length and hex-only before proceeding.

Apply this diff to harden the validator:

```diff
-    // Validate buffer length - must contain at least IV + Auth Tag
-    if (buffer.length < IV_LENGTH + AUTH_TAG_LENGTH) {
-      logger.warn("Encrypted token too short, likely corrupted");
-      return null;
-    }
+    // Validate hex format (avoid truncated bytes) and minimum length
+    if (
+      typeof encryptedText !== "string" ||
+      encryptedText.length % 2 !== 0 ||
+      !/^[0-9a-fA-F]+$/.test(encryptedText)
+    ) {
+      logger.warn("Encrypted token has invalid hex format");
+      return null;
+    }
+
+    // Validate buffer length - must contain at least IV + Auth Tag
+    if (buffer.length < IV_LENGTH + AUTH_TAG_LENGTH) {
+      logger.warn("Encrypted token too short, likely corrupted", {
+        byteLength: buffer.length,
+      });
+      return null;
+    }
```
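The proposed checks amount to a standalone predicate that can run before `Buffer.from`; a sketch (the IV and auth-tag lengths are assumed to match the constants in the surrounding file, 16 bytes each for AES-GCM):

```typescript
// Sketch: reject non-hex or truncated ciphertext up front. Buffer.from(s,
// "hex") silently drops an odd trailing nibble and stops at invalid
// characters, so validating the string first avoids misleading failures.
const IV_LENGTH = 16; // bytes; matches "first 16 bytes" in the diff
const AUTH_TAG_LENGTH = 16; // bytes; assumed GCM tag length

function isDecryptableCiphertext(encryptedText: string): boolean {
  if (encryptedText.length % 2 !== 0) return false; // odd nibble => truncated
  if (!/^[0-9a-fA-F]+$/.test(encryptedText)) return false; // non-hex debris
  const byteLength = encryptedText.length / 2;
  return byteLength >= IV_LENGTH + AUTH_TAG_LENGTH; // must hold IV + tag
}
```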
```diff

     // Extract IV (first 16 bytes)
     const iv = buffer.subarray(0, IV_LENGTH);
@@ -79,7 +85,15 @@ export function decryptToken(encryptedText: string | null): string | null {

     return decrypted.toString("utf8");
   } catch (error) {
-    logger.error("Decryption failed", { error });
+    // Reduce log noise - only log detailed errors in development
+    if (process.env.NODE_ENV === "development") {
+      logger.error("Decryption failed", {
+        error,
+        encryptedLength: encryptedText?.length,
+      });
+    } else {
+      logger.warn("Token decryption failed - token may need refresh");
+    }
     return null;
   }
 }
```
Contributor

Ensure encryption envs match env.ts expectations.

env.ts requires EMAIL_ENCRYPT_SECRET and EMAIL_ENCRYPT_SALT, but this example file lists GOOGLE_ENCRYPT_* and MICROSOFT_ENCRYPT_*. Add the required EMAIL_ENCRYPT_* envs here to avoid runtime validation failures.

Proposed insertion near the "Gmail" or general auth section:

Follow-up: if the system actually expects provider-specific secrets (GOOGLE_/MICROSOFT_), either keep both sets or align env.ts to match the provider-specific keys. I can generate a follow-up PR once you confirm the intended keys.

🧰 Tools

🪛 dotenv-linter (3.3.0)

[warning] 4-4: [SpaceCharacter] The line has spaces around equal sign
[warning] 4-4: [ValueWithoutQuotes] This value needs to be surrounded in quotes
[warning] 5-5: [SpaceCharacter] The line has spaces around equal sign
[warning] 5-5: [ValueWithoutQuotes] This value needs to be surrounded in quotes