fix(integrations): handle Linear OAuth refresh token rotation #4002
Conversation
📝 Walkthrough

Implements Linear OAuth refresh: store refresh tokens and expiry, add disconnected state on connections, provide single-flight refresh with DB advisory locks, proactively refresh in client acquisition, wrap Linear calls to retry on auth errors, add a Qstash-triggered refresh job, and surface reconnect UI in the dashboard.

Changes: Linear OAuth Token Refresh & Connection State Management
Sequence Diagrams

```mermaid
sequenceDiagram
    participant Q as Qstash/Upstash
    participant API as Refresh-Job API
    participant DB as Database
    participant Linear as Linear OAuth
    Q->>API: POST (signed payload)
    API->>DB: SELECT stale integration_connections
    DB-->>API: [ids]
    par per-connection
        API->>DB: BEGIN transaction + advisory lock
        DB-->>API: lock acquired
        API->>DB: SELECT tokens for id
        DB-->>API: accessToken, refreshToken, tokenExpiresAt
        API->>Linear: POST /oauth/token (refresh_token)
        Linear-->>API: token response (access, refresh, expires_in) / error
        API->>DB: UPDATE tokens, set token_expires_at, clear disconnected fields
        DB-->>API: COMMIT + release lock
    end
    API-->>Q: { candidates: N, succeeded: M }
```

```mermaid
sequenceDiagram
    participant Client
    participant tRPC
    participant Utils as getLinearClient
    participant Refresh as refreshLinearToken
    participant DB
    participant LinearAPI
    Client->>tRPC: callLinear(organizationId, fn)
    tRPC->>Utils: getLinearClient(organizationId)
    Utils->>DB: SELECT connection, tokenExpiresAt, disconnectedAt
    DB-->>Utils: connection row
    alt expires soon
        Utils->>Refresh: refreshLinearToken(connectionId)
        Refresh->>DB: withConnectionLock -> SELECT refreshToken
        DB-->>Refresh: refreshToken
        Refresh->>LinearAPI: POST /oauth/token
        LinearAPI-->>Refresh: new tokens
        Refresh->>DB: UPDATE tokens, clear disconnected fields
        DB-->>Refresh: updated
        Refresh-->>Utils: accessToken
    end
    Utils-->>tRPC: LinearClient(accessToken)
    tRPC->>LinearAPI: fn(client)
    alt auth error (401)
        LinearAPI-->>tRPC: auth error
        tRPC->>Refresh: refreshLinearToken(connectionId)
        Refresh->>LinearAPI: POST /oauth/token
        LinearAPI-->>Refresh: new tokens
        Refresh-->>tRPC: accessToken
        tRPC->>LinearAPI: retry fn(newClient)
        LinearAPI-->>tRPC: result
    end
    tRPC-->>Client: result / null
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks: ✅ 4 passed, ❌ 1 failed

❌ Failed checks (1 warning)
✅ Passed checks (4 passed)
Greptile Summary

This PR fixes Linear's OAuth token lifecycle for the short-lived (24h) / rotating-refresh-token model that went live on 2026-04-01. It adds schema columns for disconnect state, a …

Confidence Score: 4/5. Safe to merge — all findings are P2 (style/defensive hardening); core refresh, locking, and disconnect logic is correct. No P0/P1 issues found. The lock-based single-flight refresh, re-check inside the transaction, atomic token writes, and connection disconnect state are all implemented correctly. Three P2 items: an unguarded second fn() call in callLinear that could surface auth errors to callers instead of returning null, a duplicated REFRESH_BUFFER_MS constant, and an unsafe type-cast in the one-time migration script. See packages/trpc/src/router/integration/linear/refresh.ts (unguarded retry) and packages/db/src/scripts/migrate-linear-tokens.ts (unsafe response cast).
| Filename | Overview |
|---|---|
| packages/trpc/src/router/integration/linear/refresh.ts | New module with refreshLinearToken (under advisory lock with re-check), callLinear (401-retry wrapper), and isLinearAuthError. Core refresh logic is sound; the second retry call is unguarded and can propagate auth errors to callers. |
| packages/trpc/src/router/integration/linear/utils.ts | Adds getLinearClient (proactive refresh within 5-min buffer) and markConnectionDisconnected. REFRESH_BUFFER_MS is duplicated from refresh.ts; otherwise logic is correct. |
| packages/db/src/utils/sql.ts | Adds withConnectionLock using pg_advisory_xact_lock(hashtextextended(...)) inside a DB transaction for single-flight token refresh serialization. Correct use of transaction-scoped advisory lock. |
| apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts | New hourly cron endpoint refreshing tokens expiring within 90 min. QStash signature verification matches existing pattern. Promise.allSettled is misleading since inner try/catch prevents any rejection, but the counting logic is still functionally correct. |
| apps/api/src/app/api/integrations/linear/callback/route.ts | Now parses OAuth response with linearTokenResponseSchema, stores refresh_token, and resets disconnectedAt/disconnectReason on conflict (re-auth heals a broken connection). Clean improvement. |
| packages/db/src/scripts/migrate-linear-tokens.ts | One-time migration script to upgrade legacy long-lived tokens via Linear's /oauth/migrate_old_token. Uses an unsafe as MigrationResponse cast — malformed success responses could write NaN expiry or undefined token values to the DB. |
| packages/db/drizzle/0042_linear_disconnect_state.sql | Additive migration adding nullable disconnected_at (timestamp) and disconnect_reason (text) to integration_connections. No backfill, no breaking changes. |
| packages/trpc/src/router/integration/linear/linear.ts | Routes getTeams and disconnect now use callLinear; getConnection exposes needsReconnect and disconnectReason. Correct — disconnect wraps logout in try/catch so DB cleanup always runs. |
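The advisory lock in `packages/db/src/utils/sql.ts` serializes refreshes across processes. The same single-flight idea can be sketched in-process with a promise map; the names below are illustrative, not the PR's API:

```typescript
// In-process single-flight: concurrent refreshes for the same connection id
// share one promise, so the token endpoint is hit once per id at a time.
// The PR's withConnectionLock does the cross-process equivalent with
// pg_advisory_xact_lock; this sketch only illustrates the concept.
const inFlight = new Map<string, Promise<string>>();

function singleFlight(
  id: string,
  refresh: () => Promise<string>,
): Promise<string> {
  const existing = inFlight.get(id);
  if (existing) return existing; // join the refresh already in progress
  const p = refresh().finally(() => inFlight.delete(id));
  inFlight.set(id, p);
  return p;
}
```

An in-process map alone would not be enough here, since the cron route and tRPC handlers can run in separate processes — hence the DB-level lock.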
Sequence Diagram

```mermaid
sequenceDiagram
    participant UI as Web UI
    participant TRPC as tRPC (callLinear)
    participant GLC as getLinearClient
    participant RT as refreshLinearToken
    participant DB as Postgres (withConnectionLock)
    participant LIN as Linear API
    UI->>TRPC: getTeams(orgId)
    TRPC->>GLC: getLinearClient(orgId)
    GLC->>DB: query connection
    DB-->>GLC: connection (expiresAt, refreshToken)
    alt token expires within 5 min
        GLC->>RT: refreshLinearToken(id)
        RT->>DB: BEGIN + pg_advisory_xact_lock
        DB-->>RT: re-read (double-check)
        RT->>LIN: POST /oauth/token (refresh_token)
        LIN-->>RT: new access_token + refresh_token
        RT->>DB: UPDATE tokens, COMMIT
        RT-->>GLC: { disconnected:false, accessToken }
    end
    GLC-->>TRPC: LinearClient
    TRPC->>LIN: client.teams()
    alt 401 received
        TRPC->>RT: refreshLinearToken(id)
        RT->>DB: BEGIN + lock + re-read + refresh
        RT-->>TRPC: { disconnected:false, accessToken }
        TRPC->>LIN: retry client.teams()
    end
    LIN-->>TRPC: teams
    TRPC-->>UI: teams[]
    Note over DB,LIN: Hourly cron also calls refreshLinearToken for connections expiring within 90 min
```
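The "token expires within 5 min" branch boils down to a buffer comparison. A minimal sketch — `REFRESH_BUFFER_MS`, the helper name, and the null-expiry behavior are assumptions, not the PR's exact code:

```typescript
const REFRESH_BUFFER_MS = 5 * 60 * 1000; // 5-minute proactive-refresh window

// Returns true when the access token should be refreshed before use.
function expiresSoon(tokenExpiresAt: Date | null, now = Date.now()): boolean {
  if (!tokenExpiresAt) return true; // unknown expiry: refresh defensively
  return tokenExpiresAt.getTime() - now <= REFRESH_BUFFER_MS;
}
```

Refreshing slightly before expiry means in-flight requests never race the expiry instant; the hourly cron uses the same idea with a wider 90-minute window.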
Comments Outside Diff (1)
- apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts, lines 151-168 (link): **`Promise.allSettled` is misleading here — promises never reject.** The mapper always resolves (the `try/catch` returns `{ ok: false }` on error, never throws). `Promise.allSettled` therefore never observes a rejected promise; `result.status` is always `"fulfilled"`, so the `&& result.value.ok` check is the only relevant predicate. Either remove the try/catch and let real errors produce rejected promises (so `allSettled` behaves as intended), or replace `allSettled` with `Promise.all` and keep the try/catch — the current mix is semantically confusing.
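The point is easy to demonstrate with a stand-in refresh function (hypothetical names, not the route's real code): a mapper whose try/catch never rethrows yields only fulfilled settlements.

```typescript
// Stand-in for refreshLinearToken: fails for id "bad".
async function refresh(id: string): Promise<void> {
  if (id === "bad") throw new Error("refresh failed");
}

async function run(ids: string[]) {
  const results = await Promise.allSettled(
    ids.map(async (id) => {
      try {
        await refresh(id);
        return { id, ok: true };
      } catch {
        return { id, ok: false }; // swallows the rejection
      }
    }),
  );
  // Every settlement is "fulfilled"; only value.ok distinguishes outcomes.
  return {
    statuses: results.map((r) => r.status),
    succeeded: results.filter(
      (r) => r.status === "fulfilled" && r.value.ok,
    ).length,
  };
}
```

Because no settlement can be `"rejected"`, `Promise.all` plus the try/catch would express the same behavior with less indirection.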
Prompt To Fix All With AI
Fix the following 4 code review issues. Work through them one at a time, proposing concise fixes.
---
### Issue 1 of 4
packages/trpc/src/router/integration/linear/utils.ts:9
**`REFRESH_BUFFER_MS` defined in two places**
`REFRESH_BUFFER_MS = 5 * 60 * 1000` is declared identically in both `utils.ts` (line 9) and `refresh.ts` (line 20). If one is updated independently they'll silently diverge — `getLinearClient`'s proactive-refresh window would differ from `refreshLinearToken`'s re-check guard. Extract it to a shared constant (e.g., in a `constants.ts` within this directory) and import it in both files.
### Issue 2 of 4
packages/db/src/scripts/migrate-linear-tokens.ts:54-55
**Unsafe cast on migration API response**
`(await response.json()) as MigrationResponse` skips runtime validation. If Linear's migration endpoint returns a shape that differs from the interface (e.g., a missing `refresh_token` or `expires_in`), `tokenExpiresAt` becomes `new Date(NaN)` and `refreshToken` gets stored as `undefined` — leaving the row in a silently corrupt state rather than triggering the `disconnectedAt` failure path. Consider reusing `linearTokenResponseSchema.parse()` (already available in `@superset/trpc/integrations/linear`) or defining a minimal Zod schema for this response.
### Issue 3 of 4
apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts:151-168
**`Promise.allSettled` is misleading here — promises never reject**
The mapper always resolves (the `try/catch` returns `{ ok: false }` on error, never throws). `Promise.allSettled` therefore never observes a rejected promise; `result.status` is always `"fulfilled"`, so the `&& result.value.ok` is the only relevant predicate. Either remove the try/catch and let real errors produce rejected promises (so `allSettled` behaves as intended), or replace `allSettled` with `Promise.all` and keep the try/catch — the current mix is semantically confusing.
### Issue 4 of 4
packages/trpc/src/router/integration/linear/refresh.ts:123-127
**Unguarded retry can surface auth errors to callers**
The second `fn(...)` call — executed after a successful token refresh — is not wrapped in a try/catch. If the freshly-rotated token still produces a 401 (network glitch, clock skew, immediate revocation), the auth error propagates as an unhandled exception instead of returning `null`. In the `getTeams` tRPC query, for example, this becomes a 500 to the client rather than a graceful empty list.
Wrapping the retry in a try/catch that checks `isLinearAuthError` and returns `null` (re-throwing non-auth errors) would keep the return contract consistent with the happy-path.
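Issue 4's contract can be sketched generically — `AuthError`, `refreshToken`, and the wrapper name are illustrative stand-ins, not the PR's real `callLinear`. Auth failures on the first call trigger a refresh-and-retry; an auth failure on the retry returns `null`; non-auth errors still propagate:

```typescript
class AuthError extends Error {}

const isAuthError = (e: unknown): e is AuthError => e instanceof AuthError;

async function callWithAuthRetry<T>(
  fn: (token: string) => Promise<T>,
  refreshToken: () => Promise<string>,
  token: string,
): Promise<T | null> {
  try {
    return await fn(token);
  } catch (error) {
    if (!isAuthError(error)) throw error; // non-auth errors propagate
  }
  const fresh = await refreshToken();
  try {
    return await fn(fresh);
  } catch (error) {
    if (isAuthError(error)) return null; // guarded retry: no 500 to the client
    throw error;
  }
}
```

This keeps the `T | null` return contract symmetric across both attempts, which is what the review asks for.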
Reviews (1): Last reviewed commit: "fix(integrations): handle Linear OAuth r..."
🧹 Preview Cleanup Complete

The following preview resources have been cleaned up.

Thank you for your contribution! 🎉
```ts
type Priority = "urgent" | "high" | "medium" | "low" | "none";
```

```ts
const REFRESH_BUFFER_MS = 5 * 60 * 1000;
```
REFRESH_BUFFER_MS defined in two places
REFRESH_BUFFER_MS = 5 * 60 * 1000 is declared identically in both utils.ts (line 9) and refresh.ts (line 20). If one is updated independently they'll silently diverge — getLinearClient's proactive-refresh window would differ from refreshLinearToken's re-check guard. Extract it to a shared constant (e.g., in a constants.ts within this directory) and import it in both files.
```ts
const data = (await response.json()) as MigrationResponse;
const tokenExpiresAt = new Date(Date.now() + data.expires_in * 1000);
```
Unsafe cast on migration API response
(await response.json()) as MigrationResponse skips runtime validation. If Linear's migration endpoint returns a shape that differs from the interface (e.g., a missing refresh_token or expires_in), tokenExpiresAt becomes new Date(NaN) and refreshToken gets stored as undefined — leaving the row in a silently corrupt state rather than triggering the disconnectedAt failure path. Consider reusing linearTokenResponseSchema.parse() (already available in @superset/trpc/integrations/linear) or defining a minimal Zod schema for this response.
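A dependency-free stand-in for the suggested check — the real fix would reuse `linearTokenResponseSchema` from `@superset/trpc/integrations/linear`; this sketch only shows the fail-loud shape, and the function name is hypothetical:

```typescript
interface MigrationResponse {
  access_token: string;
  refresh_token: string;
  expires_in: number; // seconds
}

// Validates the response body before it touches the DB, so a malformed
// payload throws instead of producing new Date(NaN) or storing undefined.
function parseMigrationResponse(data: unknown): MigrationResponse {
  if (typeof data !== "object" || data === null) {
    throw new Error("malformed token migration response");
  }
  const d = data as Record<string, unknown>;
  if (
    typeof d.access_token !== "string" ||
    typeof d.refresh_token !== "string" ||
    typeof d.expires_in !== "number" ||
    !Number.isFinite(d.expires_in)
  ) {
    throw new Error("malformed token migration response");
  }
  return d as unknown as MigrationResponse;
}
```

A thrown error here lands in the script's existing failure path, which is exactly the `disconnectedAt` handling the review wants triggered.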
```ts
const result = await refreshLinearToken(connection.id);
if (result.disconnected) return null;

return await fn(new LinearClient({ accessToken: result.accessToken }));
}
```
Unguarded retry can surface auth errors to callers
The second fn(...) call — executed after a successful token refresh — is not wrapped in a try/catch. If the freshly-rotated token still produces a 401 (network glitch, clock skew, immediate revocation), the auth error propagates as an unhandled exception instead of returning null. In the getTeams tRPC query, for example, this becomes a 500 to the client rather than a graceful empty list.
Wrapping the retry in a try/catch that checks isLinearAuthError and returns null (re-throwing non-auth errors) would keep the return contract consistent with the happy-path.
3 issues found across 17 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="packages/db/src/scripts/migrate-linear-tokens.ts">
<violation number="1" location="packages/db/src/scripts/migrate-linear-tokens.ts:47">
P1: Guard migration writes with the same legacy-state predicates used for candidate selection; updating by `id` alone allows stale writes that can incorrectly disconnect or overwrite concurrently-updated connections.</violation>
</file>
<file name="packages/trpc/src/router/integration/linear/linear.ts">
<violation number="1" location="packages/trpc/src/router/integration/linear/linear.ts:48">
P2: Do not silently swallow logout failures in `disconnect`; handle or log the error so failures are observable.
(Based on your team's feedback about handling async errors and avoiding silent catch blocks.) [FEEDBACK_USED]</violation>
</file>
<file name="apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts">
<violation number="1" location="apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts:24">
P2: Wrap `receiver.verify` in `try/catch`; invalid signatures currently throw and can return 500 instead of 401.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
```ts
    disconnectedAt: new Date(),
    disconnectReason: reason,
  })
  .where(eq(integrationConnections.id, connectionId));
```
P1: Guard migration writes with the same legacy-state predicates used for candidate selection; updating by id alone allows stale writes that can incorrectly disconnect or overwrite concurrently-updated connections.
```ts
}
try {
  await callLinear(input.organizationId, (client) => client.logout());
} catch {}
```
P2: Do not silently swallow logout failures in disconnect; handle or log the error so failures are observable.
(Based on your team's feedback about handling async errors and avoiding silent catch blocks.)
Suggested change:

```diff
-} catch {}
+} catch (error) {
+  console.warn("Failed to logout from Linear during disconnect", error);
+}
```
Actionable comments posted: 6
🧹 Nitpick comments (1)
packages/db/src/utils/sql.ts (1)

Lines 36-39: **Extract a `DbWsTransaction` type and replace raw `any`-typed transaction signatures.** Both `getCurrentTxid` and `withConnectionLock` should use a typed transaction alias instead of `PgTransaction<any, any, any>`. The codebase already establishes this pattern elsewhere (in `packages/trpc/src/router/task/task.ts` and `packages/db/src/seed-default-statuses.ts`):

```ts
type DbWsTransaction = Parameters<Parameters<typeof dbWs.transaction>[0]>[0];
```

Define this in `sql.ts` and replace both occurrences, allowing you to remove the `biome-ignore` comments and gain full type safety for transaction callbacks.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts`:
- Around line 48-61: The current Promise.allSettled over stale.map causes an
all-at-once fan-out when calling refreshLinearToken for each connection; change
it to limit concurrency (e.g., process in batches or use a concurrency limiter)
so the cron does not stampede Linear or DB locks. Modify the logic around
Promise.allSettled / stale.map and results to iterate over stale in controlled
chunks or use a concurrency utility to call refreshLinearToken(connection.id)
with a max concurrent workers value (e.g., 5-10), ensuring the same
per-connection try/catch and the returned shape { id, ok } is preserved so
downstream code using results remains unchanged.
In `@packages/db/drizzle/0042_linear_disconnect_state.sql`:
- Around line 1-2: This migration was hand-edited; instead update the schema for
the "integration_connections" table (add the disconnected_at timestamp and
disconnect_reason text columns) under packages/db/src/schema/ and then
regenerate the migration rather than editing
packages/db/drizzle/0042_linear_disconnect_state.sql directly by running bunx
drizzle-kit generate --name="<descriptive_snake_case_name>"; commit the
generated migration so Drizzle's journal/snapshot remain in sync and remove the
manual SQL edit.
In `@packages/db/src/scripts/migrate-linear-tokens.ts`:
- Around line 100-106: The catch after the fetch/response parsing only
increments failed and logs error but leaves the Connection row with refreshToken
= NULL and disconnectedAt = NULL; update the catch block for the try around
fetch()/response parsing to mark the connection as disconnected: set
connection.disconnectedAt to the current time (and ensure refreshToken remains
NULL), persist that update to the DB (same codepath used elsewhere for marking
disconnected), and then increment failed and log the error (referencing the
failed variable, connection.id, refreshToken, and disconnectedAt to locate the
correct variables/fields).
In `@packages/trpc/src/router/integration/linear/refresh.ts`:
- Around line 51-60: The fetch to "https://api.linear.app/oauth/token" inside
refreshLinearToken (which runs under withConnectionLock) must be bounded to
avoid holding the advisory lock indefinitely; wrap the request with an
AbortController, pass its signal into fetch, start a timer (e.g. 3–10s or
configurable via env) to call controller.abort() on timeout, clear the timer
after fetch completes, and handle the abort/timeout error path so the lock can
be released and the caller gets a sensible error; reference refreshLinearToken,
withConnectionLock, connection.refreshToken and the fetch call in your changes.
In `@packages/trpc/src/router/integration/linear/utils.ts`:
- Around line 59-66: The proactive refresh branch currently calls
refreshLinearToken() and fails hard on transient errors; wrap the refresh call
in a try/catch inside the expiresSoon branch (where you check
connection.refreshToken) and only treat auth-type failures as disconnection
(calling markConnectionDisconnected(connection.id, "no_refresh_token") or
similar) — for non-auth/transient errors, log the error and return new
LinearClient({ accessToken: connection.accessToken }) so the still-valid token
is used until connection.tokenExpiresAt actually passes; keep the existing
behavior if refreshLinearToken() indicates a disconnected result or if the error
clearly indicates invalid/expired refresh credentials.
In `@plans/20260501-linear-team-entity.md`:
- Around line 209-224: The fenced code block containing the task lookup
pseudocode (starting with the line "input = \"SUPER-103\" or UUID" and the
following numbered steps and SQL snippets) is missing a language tag and should
be marked to satisfy MD040; update that triple-backtick fence to include a
language (e.g., text) so it becomes ```text, leaving the block contents
unchanged; ensure you modify the opening fence only and not the inner pseudocode
or backticks at the end.
---
Nitpick comments:
In `@packages/db/src/utils/sql.ts`:
- Around line 36-39: Define a typed transaction alias inside sql.ts: add type
DbWsTransaction = Parameters<Parameters<typeof dbWs.transaction>[0]>[0]; then
replace the raw PgTransaction<any, any, any> signatures in getCurrentTxid and
withConnectionLock with DbWsTransaction and remove the biome-ignore comments;
ensure both functions import/see dbWs so the new type resolves correctly.
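The AbortController guidance for `refresh.ts` above can be sketched as a generic timeout wrapper; the helper name and the timeout value are assumptions. In the real code the signal would be passed to the fetch against Linear's token endpoint so a hung request cannot hold the advisory lock:

```typescript
// Runs an abortable operation with a deadline; the timer is always cleared
// so a completed call leaves no dangling timeout.
async function withTimeout<T>(
  run: (signal: AbortSignal) => Promise<T>,
  timeoutMs: number,
): Promise<T> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await run(controller.signal);
  } finally {
    clearTimeout(timer);
  }
}
```

Usage would look like `withTimeout((signal) => fetch(url, { ...init, signal }), 5_000)`, with the abort error handled so the transaction rolls back and the lock is released.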
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 3fcb5805-f15b-4137-9c4c-28aef8928400
📒 Files selected for processing (17)
apps/api/src/app/api/integrations/linear/callback/route.ts
apps/api/src/app/api/integrations/linear/jobs/initial-sync/route.ts
apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts
apps/web/src/app/(dashboard-legacy)/integrations/linear/components/ConnectionControls/ConnectionControls.tsx
apps/web/src/app/(dashboard-legacy)/integrations/linear/page.tsx
packages/db/drizzle/0042_linear_disconnect_state.sql
packages/db/drizzle/meta/0042_snapshot.json
packages/db/drizzle/meta/_journal.json
packages/db/src/schema/schema.ts
packages/db/src/scripts/migrate-linear-tokens.ts
packages/db/src/utils/sql.ts
packages/trpc/src/env.ts
packages/trpc/src/lib/integrations/linear/index.ts
packages/trpc/src/router/integration/linear/linear.ts
packages/trpc/src/router/integration/linear/refresh.ts
packages/trpc/src/router/integration/linear/utils.ts
plans/20260501-linear-team-entity.md
```ts
const results = await Promise.allSettled(
  stale.map(async (connection) => {
    try {
      await refreshLinearToken(connection.id);
      return { id: connection.id, ok: true };
    } catch (error) {
      console.error(
        `[linear-refresh-cron] failed for ${connection.id}:`,
        error,
      );
      return { id: connection.id, ok: false };
    }
  }),
);
```
Cap the cron's refresh fan-out.
Promise.allSettled(stale.map(...)) refreshes every candidate at once. Since this route is the primary refresh path, a larger tenant set will stampede Linear's token endpoint and your advisory-lock/DB path on the same tick. Batch or limit concurrency so the hourly run stays predictable.
Suggested change:

```diff
+const REFRESH_CONCURRENCY = 20;
+
-  const results = await Promise.allSettled(
-    stale.map(async (connection) => {
-      try {
-        await refreshLinearToken(connection.id);
-        return { id: connection.id, ok: true };
-      } catch (error) {
-        console.error(
-          `[linear-refresh-cron] failed for ${connection.id}:`,
-          error,
-        );
-        return { id: connection.id, ok: false };
-      }
-    }),
-  );
+  const results = [];
+  for (let i = 0; i < stale.length; i += REFRESH_CONCURRENCY) {
+    const batch = stale.slice(i, i + REFRESH_CONCURRENCY);
+    results.push(
+      ...(await Promise.allSettled(
+        batch.map(async (connection) => {
+          try {
+            await refreshLinearToken(connection.id);
+            return { id: connection.id, ok: true };
+          } catch (error) {
+            console.error(
+              `[linear-refresh-cron] failed for ${connection.id}:`,
+              error,
+            );
+            return { id: connection.id, ok: false };
+          }
+        }),
+      )),
+    );
+  }
```
```sql
ALTER TABLE "integration_connections" ADD COLUMN "disconnected_at" timestamp;--> statement-breakpoint
ALTER TABLE "integration_connections" ADD COLUMN "disconnect_reason" text;
```
Regenerate this migration instead of hand-editing it.
packages/db/drizzle/* is supposed to be generated from the schema files. Checking in manual SQL here can drift from Drizzle’s journal/snapshot state and makes the next generate unreliable.
As per coding guidelines, "Create database migrations by modifying schema files in packages/db/src/schema/ and running bunx drizzle-kit generate --name=\"<sample_name_snake_case>\". Never manually edit migration files in packages/db/drizzle/, including .sql files, meta/_journal.json, and snapshot files."
```ts
} catch (error) {
  failed += 1;
  console.error(
    `[migrate-linear] ${connection.id} unexpected error:`,
    error,
  );
}
```
Unexpected migration failures still leave the connection marked connected.
This catch only logs and increments failed. If fetch() throws or the response body can't be parsed, the row keeps refreshToken = NULL and disconnectedAt = NULL, so the reconnect UI never activates even though this one-shot migration failed. Mark the connection disconnected in this path too.
🛠️ Proposed fix

```diff
 } catch (error) {
   failed += 1;
+  await db
+    .update(integrationConnections)
+    .set({
+      disconnectedAt: new Date(),
+      disconnectReason: "migration_failed:unexpected_error",
+    })
+    .where(eq(integrationConnections.id, connection.id));
   console.error(
     `[migrate-linear] ${connection.id} unexpected error:`,
     error,
   );
 }
```

📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```typescript
} catch (error) {
  failed += 1;
  await db
    .update(integrationConnections)
    .set({
      disconnectedAt: new Date(),
      disconnectReason: "migration_failed:unexpected_error",
    })
    .where(eq(integrationConnections.id, connection.id));
  console.error(
    `[migrate-linear] ${connection.id} unexpected error:`,
    error,
  );
}
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@packages/db/src/scripts/migrate-linear-tokens.ts` around lines 100 - 106, The
catch after the fetch/response parsing only increments failed and logs error but
leaves the Connection row with refreshToken = NULL and disconnectedAt = NULL;
update the catch block for the try around fetch()/response parsing to mark the
connection as disconnected: set connection.disconnectedAt to the current time
(and ensure refreshToken remains NULL), persist that update to the DB (same
codepath used elsewhere for marking disconnected), and then increment failed and
log the error (referencing the failed variable, connection.id, refreshToken, and
disconnectedAt to locate the correct variables/fields).
```
input = "SUPER-103" or UUID

1. UUID? → tasks.id lookup.
2. Match /^([A-Za-z][A-Za-z0-9]*)-(\d+)$/i:
   a. SELECT t.* FROM tasks t
      JOIN team_keys tk ON tk.team_id = t.team_id
      WHERE tk.organization_id = $org AND tk.key = $prefix AND t.number = $number;
      → if hit and tk.retired_at IS NULL, return.
      → if hit and tk.retired_at IS NOT NULL, return with redirected: true plus
        the canonical identifier.
   b. If no match, fallback: SELECT * FROM tasks
      WHERE organization_id = $org AND external_key = $input;
      → handles old `@task:ENG-42` mentions where ENG-42 is Linear's identifier.
3. Else: not found.
```
Add a language to this fenced block.
This is the block markdownlint is flagging with MD040. Mark it as text (or another appropriate language) so the doc stays lint-clean.
📝 Proposed fix

````diff
-```
+```text
 input = "SUPER-103" or UUID
@@
-```
+```
````

📝 Committable suggestion
````markdown
```text
input = "SUPER-103" or UUID
1. UUID? → tasks.id lookup.
2. Match /^([A-Za-z][A-Za-z0-9]*)-(\d+)$/i:
   a. SELECT t.* FROM tasks t
      JOIN team_keys tk ON tk.team_id = t.team_id
      WHERE tk.organization_id = $org AND tk.key = $prefix AND t.number = $number;
      → if hit and tk.retired_at IS NULL, return.
      → if hit and tk.retired_at IS NOT NULL, return with redirected: true plus
        the canonical identifier.
   b. If no match, fallback: SELECT * FROM tasks
      WHERE organization_id = $org AND external_key = $input;
      → handles old `@task:ENG-42` mentions where ENG-42 is Linear's identifier.
3. Else: not found.
```
````
🧰 Tools
🪛 markdownlint-cli2 (0.22.1)
[warning] 209-209: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@plans/20260501-linear-team-entity.md` around lines 209 - 224, The fenced code
block containing the task lookup pseudocode (starting with the line "input =
\"SUPER-103\" or UUID" and the following numbered steps and SQL snippets) is
missing a language tag and should be marked to satisfy MD040; update that
triple-backtick fence to include a language (e.g., text) so it becomes ```text,
leaving the block contents unchanged; ensure you modify the opening fence only
and not the inner pseudocode or backticks at the end.
Linear migrated to short-lived (24h) access tokens with rotating refresh tokens on 2026-04-01. Our integration was written for the old long-lived model — refresh_token was never read from the OAuth callback, tokenExpiresAt was never checked, and no refresh logic existed. Result: any connection re-authed since 2026-04-01 silently breaks within 24h.

This change adds:

- disconnectedAt/disconnectReason columns on integrationConnections
- withConnectionLock helper using pg_advisory_xact_lock for single-flight refresh (Linear rotates refresh tokens; parallel refreshes both fail with invalid_grant)
- refreshLinearToken with under-lock re-check + invalid_grant handling
- callLinear retry wrapper for 401-then-refresh-then-retry
- getLinearClient refreshes proactively (within 5 min of expiry)
- Hourly QStash-Schedule-triggered background refresh as the primary refresh path (refreshes anything within 90 min of expiry)
- OAuth callback now stores refresh_token and clears disconnect state on re-auth
- One-time migration script for legacy long-lived tokens via Linear's /oauth/migrate_old_token endpoint
- Three-state UI badge + "Reconnect Linear" CTA for the integrations page when disconnectedAt is set

Also includes the broader teams-entity plan (separate workstream).
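The 401-then-refresh-then-retry wrapper described above can be sketched roughly as follows. This is a hedged illustration, not the PR's implementation: the `callLinear`/`isLinearAuthError` names follow the commit message, but the signatures, the injected `refresh` callback, and the assumption that Linear SDK auth errors carry a `status: 401` field are all illustrative.

```typescript
// Hypothetical sketch of the retry wrapper from the commit message above.
// Signatures and the error shape are assumptions, not the real API.

type LinearCall<T> = (accessToken: string) => Promise<T>;

export function isLinearAuthError(error: unknown): boolean {
  // Assumption: auth failures surface as an error object carrying a 401 status.
  return (
    typeof error === "object" &&
    error !== null &&
    (error as { status?: number }).status === 401
  );
}

export async function callLinear<T>(
  fn: LinearCall<T>,
  accessToken: string,
  // Stand-in for the single-flight refreshLinearToken; null models the case
  // where no refresh token exists and the connection was marked disconnected.
  refresh: () => Promise<string | null>,
): Promise<T | null> {
  try {
    return await fn(accessToken);
  } catch (error) {
    if (!isLinearAuthError(error)) throw error; // non-auth errors propagate
    const rotated = await refresh();
    if (rotated === null) return null; // no refresh token: caller sees null
    try {
      return await fn(rotated); // retry exactly once with the fresh token
    } catch {
      return null; // still 401 on a fresh token: keep the null contract
    }
  }
}
```

A first 401 triggers one refresh and one retry; a second 401 returns null rather than throwing, which matches the "keeps the return contract consistent with the happy path" note in a later commit.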
75fb66e to 361f525 (force-push)
🧹 Nitpick comments (1)
apps/api/src/app/api/integrations/linear/jobs/initial-sync/route.ts (1)
55-61: 💤 Low value — Consider making the HTTP 200 explicit on the skip path.

`Response.json({...})` with no status defaults to 200, which is the correct behavior here — QStash treats 2xx as success so it won't retry permanently-disconnected connections. All null-returning paths in `getLinearClient` are permanent states; transient errors throw and bubble up as 500 (triggering retries). The implicit 200 is functionally correct, but an explicit `{ status: 200 }` would make the intent clearer alongside the `error` field.

💡 Optional clarification

```diff
-    return Response.json({
-      error: "No Linear connection or connection disconnected",
-      skipped: true,
-    });
+    return Response.json(
+      {
+        error: "No Linear connection or connection disconnected",
+        skipped: true,
+      },
+      { status: 200 },
+    );
```

🤖 Prompt for AI Agents

Verify each finding against the current code and only fix it if needed. In `@apps/api/src/app/api/integrations/linear/jobs/initial-sync/route.ts` around lines 55 - 61: the skip path currently returns `Response.json({...})` implicitly with status 200; make the success intent explicit by adding an explicit `status: 200` to that `Response.json` call (the block after awaiting `getLinearClient(organizationId)` where `client` is checked) so the handler clearly signals a non-retryable skip for permanently-disconnected connections.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@apps/api/src/app/api/integrations/linear/jobs/initial-sync/route.ts`:
- Around line 55-61: The skip path currently returns Response.json({...})
implicitly with status 200; make the success intent explicit by adding an
explicit status: 200 to that Response.json call (the block after awaiting
getLinearClient(organizationId) where client is checked) so the handler clearly
signals a non-retryable skip for permanently-disconnected connections.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: d0378f34-f00e-4428-90c8-a2ffd3c7ca92
📒 Files selected for processing (16)
- apps/api/src/app/api/integrations/linear/callback/route.ts
- apps/api/src/app/api/integrations/linear/jobs/initial-sync/route.ts
- apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts
- apps/web/src/app/(dashboard-legacy)/integrations/linear/components/ConnectionControls/ConnectionControls.tsx
- apps/web/src/app/(dashboard-legacy)/integrations/linear/page.tsx
- packages/db/drizzle/0042_linear_disconnect_state.sql
- packages/db/drizzle/meta/0042_snapshot.json
- packages/db/drizzle/meta/_journal.json
- packages/db/src/schema/schema.ts
- packages/db/src/utils/sql.ts
- packages/trpc/src/env.ts
- packages/trpc/src/lib/integrations/linear/index.ts
- packages/trpc/src/router/integration/linear/linear.ts
- packages/trpc/src/router/integration/linear/refresh.ts
- packages/trpc/src/router/integration/linear/utils.ts
- plans/20260501-linear-team-entity.md
✅ Files skipped from review due to trivial changes (8)
- packages/trpc/src/env.ts
- packages/db/drizzle/0042_linear_disconnect_state.sql
- packages/trpc/src/lib/integrations/linear/index.ts
- packages/db/src/utils/sql.ts
- apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts
- packages/db/drizzle/meta/_journal.json
- packages/trpc/src/router/integration/linear/utils.ts
- packages/trpc/src/router/integration/linear/refresh.ts
🚧 Files skipped from review as they are similar to previous changes (5)
- apps/web/src/app/(dashboard-legacy)/integrations/linear/components/ConnectionControls/ConnectionControls.tsx
- apps/api/src/app/api/integrations/linear/callback/route.ts
- apps/web/src/app/(dashboard-legacy)/integrations/linear/page.tsx
- packages/db/src/schema/schema.ts
- packages/trpc/src/router/integration/linear/linear.ts
- Extract shared REFRESH_BUFFER_MS + REFRESH_TOKEN_TIMEOUT_MS into constants.ts so utils.ts and refresh.ts can't drift
- AbortController + 10s timeout on Linear /oauth/token POST so a hung upstream call doesn't wedge the per-connection advisory lock
- Try/catch around receiver.verify in the cron route — invalid signatures now return 401 instead of bubbling up as 500 (matches the existing sync-task pattern)
- Wrap callLinear's post-refresh retry in try/catch so a 401 on the freshly-rotated token returns null instead of throwing — keeps the return contract consistent with the happy path
- Proactive refresh in getLinearClient now falls back to the still-valid access token when refresh hits a transient (non-auth) failure during the 5-minute pre-expiry window
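The AbortController timeout pattern from the second bullet can be sketched as below. This is an illustrative stand-in, not the PR's code: the real call POSTs to Linear's `/oauth/token` inside `refreshLinearToken`; the injectable `fetchImpl` parameter here is an assumption added purely to make the sketch testable.

```typescript
// Sketch of a timed-out token POST, assuming a 10s budget as in the PR.
// The fetchImpl parameter is a hypothetical injection point for testing.

export const REFRESH_TOKEN_TIMEOUT_MS = 10_000;

type FetchLike = (
  url: string,
  init: { method: string; body: URLSearchParams; signal: AbortSignal },
) => Promise<unknown>;

export async function postTokenWithTimeout(
  url: string,
  body: URLSearchParams,
  timeoutMs: number = REFRESH_TOKEN_TIMEOUT_MS,
  fetchImpl: FetchLike = fetch as unknown as FetchLike,
): Promise<unknown> {
  const controller = new AbortController();
  // If the upstream hangs, abort() rejects the request instead of leaving
  // the per-connection advisory lock held for the life of the transaction.
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    return await fetchImpl(url, {
      method: "POST",
      body,
      signal: controller.signal,
    });
  } finally {
    clearTimeout(timer); // always clear so the timer can't fire later
  }
}
```

The `finally` block matters: clearing the timer on both success and failure prevents a stray abort after the response has already been consumed.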
🧹 Nitpick comments (1)
packages/trpc/src/router/integration/linear/utils.ts (1)
81-84: ⚡ Quick win — Consider narrowing `reason` to a union type for compile-time safety.

The known disconnect reasons in the codebase are `"no_refresh_token"` and `"invalid_grant"`. Typing the parameter as a string literal union prevents arbitrary strings from being written to the `disconnect_reason` column.

♻️ Proposed refactor

In `constants.ts`, export the union:

```diff
+export type DisconnectReason = "no_refresh_token" | "invalid_grant";
```

Then tighten the function signature:

```diff
+import type { DisconnectReason } from "./constants";
+
 export async function markConnectionDisconnected(
   connectionId: string,
-  reason: string,
+  reason: DisconnectReason,
 ): Promise<void> {
```
Verify each finding against the current code and only fix it if needed. In `@packages/trpc/src/router/integration/linear/utils.ts` around lines 81 - 84, Create a string-literal union for the known disconnect reasons (e.g., export type DisconnectReason = "no_refresh_token" | "invalid_grant" in constants.ts) and export it; then change the markConnectionDisconnected(connectionId: string, reason: string) signature to use that union (reason: DisconnectReason) so only the allowed values can be passed. Update any callers that pass dynamic strings to use one of the union literals (or map/validate before calling) and ensure any tests/types importing the constant type use the new DisconnectReason type for consistency.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Nitpick comments:
In `@packages/trpc/src/router/integration/linear/utils.ts`:
- Around line 81-84: Create a string-literal union for the known disconnect
reasons (e.g., export type DisconnectReason = "no_refresh_token" |
"invalid_grant" in constants.ts) and export it; then change the
markConnectionDisconnected(connectionId: string, reason: string) signature to
use that union (reason: DisconnectReason) so only the allowed values can be
passed. Update any callers that pass dynamic strings to use one of the union
literals (or map/validate before calling) and ensure any tests/types importing
the constant type use the new DisconnectReason type for consistency.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 954dc52d-8e89-4050-97f5-add674cee86f
📒 Files selected for processing (4)
- apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts
- packages/trpc/src/router/integration/linear/constants.ts
- packages/trpc/src/router/integration/linear/refresh.ts
- packages/trpc/src/router/integration/linear/utils.ts
✅ Files skipped from review due to trivial changes (3)
- packages/trpc/src/router/integration/linear/constants.ts
- apps/api/src/app/api/integrations/linear/jobs/refresh-tokens/route.ts
- packages/trpc/src/router/integration/linear/refresh.ts
Brings back plans/20260501-linear-team-entity.md (deleted incidentally in #3893). Updated to drop the OAuth refresh workstream (already shipped via #4002), collapse the three-table teams design into a single teams entity with the counter on the row, and switch disconnect/reconnect to soft-delete so identifiers survive Linear disconnect/reconnect cycles.
…4386)

* fix(integrations): prevent duplicate workspace linkage across orgs

  Two different Superset orgs could OAuth to the same Linear/Slack workspace, leaving the webhook handler to pick a winner non-deterministically via `findFirst` — so each webhook landed in a random org. Affected 153 orgs and 167 connection rows in prod (worst case: 13 active orgs claiming one Linear workspace).

  - Schema: partial unique `(provider, external_org_id) WHERE disconnected_at IS NULL` on `integration_connections`.
  - Migration: backfills duplicate groups by keeping the most recently updated active row per `(provider, external_org_id)` and soft-disconnecting the rest with reason `duplicate_resolved`, then adds the constraint in the same file.
  - OAuth callbacks (Linear, Slack): pre-check for active linkage in another org and short-circuit with `?error=workspace_already_linked&owner=<connector email>`; also catch `23505` for the race window.
  - Webhook + Slack event handlers: filter `isNull(disconnectedAt)` and `orderBy desc(updatedAt)` so even if a stray duplicate slips in, routing is deterministic.
  - Error UI: surfaces the connector's email so the user knows who to ping to disconnect; toast call deferred via `setTimeout(0)` so it fires after Sonner's `<Toaster />` subscribes on hydration.

* docs(plans): restore linear teams + per-team numbering plan

  Brings back plans/20260501-linear-team-entity.md (deleted incidentally in #3893). Updated to drop the OAuth refresh workstream (already shipped via #4002), collapse the three-table teams design into a single teams entity with the counter on the row, and switch disconnect/reconnect to soft-delete so identifiers survive Linear disconnect/reconnect cycles.

* chore(email): one-shot notifier for dup-linkage cleanup

  Adds the email template and a runnable script for notifying the 92 users whose Linear/Slack connections were soft-disconnected by the 0048 migration. Bullets out each affected (org, provider, workspace, new-owner-email) tuple so the recipient can see exactly what changed and reach out to the current owner if they want to take over. Script supports --dry-run, --test[=email], and --send. Requires NEXT_PUBLIC_MARKETING_URL=https://superset.sh on the command line so the logo/social icons in the email body resolve from prod rather than the localhost URL baked into root .env.

* fix(integrations): address pr review nits

  - Slack callback UPSERT now clears disconnectedAt + disconnectReason on reconnect, matching the Linear callback. Without this, a Slack reconnect after a soft-disconnect would leave the row hidden from the new isNull(disconnectedAt) filter in event handlers and webhooks would silently no-op.
  - Webhook + 6 Slack lookup sites now order by (updatedAt desc, id desc) so findFirst is deterministic on equal timestamps. Belt-and-suspenders; the partial unique constraint makes ties impossible in practice.
  - Tag the resolver pseudo-code fence in the slug plan as `text` for markdownlint.
Summary
Linear migrated all OAuth apps to short-lived (24h) access tokens with rotating refresh tokens on 2026-04-01. Our integration was written for the previous long-lived-token model and was never updated:

- `refresh_token` is never read from the OAuth callback (the field was even typed off the response in apps/api/.../linear/callback/route.ts:76)
- `tokenExpiresAt` is stored but never checked before using the access token
- failures only surface as `tasks.syncError`; the connection appears "Connected" indefinitely

Result: every Linear connection re-authed since 2026-04-01 silently breaks within 24 hours. Confirmed in dev DB — 519 legacy connections, all with `refresh_token = NULL` and `disconnected_at = NULL`.

What this changes

Schema (`0042_linear_disconnect_state.sql` — additive, no backfill needed):

- `integrationConnections.disconnectedAt` (timestamp, nullable)
- `integrationConnections.disconnectReason` (text, nullable)

Refresh plumbing in `packages/trpc/src/router/integration/linear/`:

- `refresh.ts` — new module exporting `refreshLinearToken`, `callLinear`, `linearTokenResponseSchema`, `isLinearAuthError`
- `withConnectionLock` helper in `packages/db/src/utils/sql.ts` using `pg_advisory_xact_lock(hashtextextended(uuid))` for single-flight refresh — Linear rotates refresh tokens, so parallel refreshes both fail with `invalid_grant` without serialization
- `refreshLinearToken` does an under-lock re-check (so a second concurrent caller observes the just-rotated token and returns immediately without a wasted Linear call), an atomic `(access_token, refresh_token, expires_at)` write, and marks `disconnectedAt` on `invalid_grant`
- `getLinearClient` refreshes proactively (≤5 min from expiry); if expired and no refresh token exists (legacy connections), marks disconnected and returns null
- `callLinear` wraps API calls — on 401, refreshes once and retries; if no refresh token exists, marks disconnected

Hourly scheduled refresh at `apps/api/.../linear/jobs/refresh-tokens/route.ts`:

- `Promise.allSettled` so one connection's failure doesn't block others
- signature verification following the existing `sync-task`/`initial-sync` pattern (dev bypass too)

Callback (`apps/api/.../linear/callback/route.ts`):

- validates the token response with `linearTokenResponseSchema`
- stores `refresh_token`
- clears `disconnectedAt`/`disconnectReason` on conflict — re-auth heals a previously-broken connection

One-time legacy-token migration (`packages/db/src/scripts/migrate-linear-tokens.ts`):

- selects rows with `provider='linear' AND refreshToken IS NULL AND disconnectedAt IS NULL`
- calls Linear's `/oauth/migrate_old_token` endpoint to upgrade each

UI (`apps/web/.../integrations/linear/`):

- `getConnection` returns `needsReconnect` and `disconnectReason`
- `ConnectionControls` adds a destructive callout + "Reconnect Linear" button when `needsReconnect=true`

Other call-site updates:

- `apps/api/.../linear/jobs/initial-sync/route.ts` was bypassing `getLinearClient` and constructing `new LinearClient` directly — now uses `getLinearClient` so it benefits from proactive refresh
- `getTeams` and the `disconnect` flow's `client.logout()` now use `callLinear` for 401 retry
- `sync-task` left as-is (uses `getLinearClient` so it gets proactive refresh, but doesn't go through `callLinear` because `createIssue` isn't idempotent and the QStash retry is the existing safety net)

Also includes the broader teams-entity plan in plans/20260501-linear-team-entity.md (separate, larger workstream — referenced for context).

Decisions worth flagging

- `disconnect` flow, where the user explicitly chooses.
- `actor=app` switch and integrations UI revamp are deferred to the broader teams-entity PR.

Deploy sequence

- Run `bun run packages/db/src/scripts/migrate-linear-tokens.ts` against prod once (requires `LINEAR_CLIENT_ID` + `LINEAR_CLIENT_SECRET` in env)
- Schedule `/api/integrations/linear/jobs/refresh-tokens`, cron `0 * * * *`

Test plan

- `bun run typecheck` clean across 29 packages
- `bun run lint:fix` clean
- `refresh_token` is populated in `integration_connections`
- force expiry (`UPDATE integration_connections SET token_expires_at = now() - interval '1 hour' WHERE provider='linear'`) → trigger `getTeams` (visit integrations page) → confirm token rotates and `tokenExpiresAt` jumps forward
- `refresh_token=NULL`, expired token → trigger `getTeams` → confirm `disconnectedAt` populated, badge flips to "Reconnect required"
- `Promise.all([refresh, refresh, refresh])` against same connection → confirm exactly one POST to `/oauth/token`
- corrupt the stored token (`UPDATE … SET refresh_token = 'garbage'`) → trigger refresh → confirm `disconnectReason = 'invalid_grant'`
- re-auth clears `disconnectedAt`/`disconnectReason`
- hit `/api/integrations/linear/jobs/refresh-tokens` with dev bypass → confirm only stale connections (within 90min of expiry) get refreshed

Summary by cubic
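The advisory-lock helper at the heart of the single-flight refresh can be sketched roughly as below. This is a hedged illustration: the real helper lives in `packages/db/src/utils/sql.ts` and is built on Drizzle; the `Tx` interface and `runInTransaction` parameter here are stand-ins invented for the sketch, not the project's actual API.

```typescript
// Hedged sketch of withConnectionLock: take pg_advisory_xact_lock keyed on
// the connection UUID, run the callback, and let COMMIT/ROLLBACK release the
// lock. Tx and runInTransaction are illustrative stand-ins, not the real API.

interface Tx {
  execute(query: string, params: unknown[]): Promise<void>;
}

export async function withConnectionLock<T>(
  runInTransaction: <R>(fn: (tx: Tx) => Promise<R>) => Promise<R>,
  connectionId: string,
  fn: (tx: Tx) => Promise<T>,
): Promise<T> {
  return runInTransaction(async (tx) => {
    // hashtextextended maps the UUID text to the bigint key the advisory-lock
    // API expects; the lock blocks concurrent refreshers for the same
    // connection and releases automatically when the transaction ends.
    await tx.execute(
      "SELECT pg_advisory_xact_lock(hashtextextended($1, 0))",
      [connectionId],
    );
    // The caller re-checks token state here, under the lock: a second
    // concurrent refresher sees the just-rotated token and returns without
    // making a wasted call to Linear.
    return fn(tx);
  });
}
```

Because the lock is transaction-scoped (`_xact_`), there is no explicit unlock to forget; this is also why the 10s fetch timeout matters — a hung token POST inside the callback would otherwise hold the lock for the connection indefinitely.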
Fixes Linear OAuth breakage by adding rotating refresh-token support with proactive refresh, 401 retry, and a clear “Reconnect required” UI. Keeps tokens fresh via an hourly job, safely migrates legacy tokens, and hardens refresh with timeouts and safer retries.

Bug Fixes

- Store `refresh_token` in the OAuth callback and set `tokenExpiresAt` using a validated response schema.
- Mark broken connections with `disconnectedAt`/`disconnectReason`; tRPC returns `needsReconnect`; web shows a “Reconnect required” badge, callout, and CTA.
- Add `refreshLinearToken` with single-flight locking and atomic writes; proactively refresh when ≤5 min to expiry; add a 10s timeout via `AbortController`; extract `REFRESH_BUFFER_MS`/`REFRESH_TOKEN_TIMEOUT_MS` constants.
- Add `callLinear` to retry once on 401; if the post-refresh call still 401s, return null; use refresh-aware clients in initial sync, `getTeams`, and `disconnect`.

Migration

- Run `packages/db/src/scripts/migrate-linear-tokens.ts` once to upgrade legacy tokens via Linear’s `/oauth/migrate_old_token`.
- Schedule `/api/integrations/linear/jobs/refresh-tokens` with cron `0 * * * *`.
- Ensure `LINEAR_CLIENT_ID` and `LINEAR_CLIENT_SECRET` are set in environment.

Written for commit 71fc03f. Summary will update on new commits.
Summary by CodeRabbit
New Features
Improvements