
feat(desktop): redesign environment variables table#1420

Merged: saddlepaddle merged 7 commits into main from sandbox-integration on Feb 13, 2026

Conversation

saddlepaddle (Collaborator) commented Feb 12, 2026

Summary

  • Redesign the secrets/environment variables table UI to match the reference design
  • Add createdByUserId column to secrets schema with user relation, return creator info (name, avatar) in getDecrypted query
  • Add sort dropdown (Last Updated, Name, Type) and click-to-copy with hover tooltip on revealed values
  • Show "Empty" for blank values, neutral badge colors, circled row icons, date + avatar, always-visible tridot menu
  • Add toast notifications on edit success/failure
  • Fix migration ordering (DROP CONSTRAINT before DROP TABLE CASCADE)

Test plan

  • bun run lint:fix — no issues
  • bun run typecheck --filter=@superset/desktop --filter=@superset/trpc — passes
  • bun test — 1227 pass, 0 fail
  • Manual: verify secret rows render correctly with new layout
  • Manual: verify sort dropdown works (Last Updated, Name, Type)
  • Manual: verify click-to-copy on revealed values shows tooltip
  • Manual: verify toast on add/edit secret
  • Run drizzle-kit generate if schema changes need a fresh migration

Summary by CodeRabbit

Release Notes

  • New Features

    • Added cloud environment variables management for projects with encryption support.
    • Introduced cloud settings section for managing project secrets.
    • Added ability to link projects to external cloud infrastructure.
  • Refactor

    • Reorganized project settings from workspace-centric to project-centric structure.
    • Streamlined settings navigation with collapsible project sections.

coderabbitai (bot, Contributor) commented Feb 12, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough


This PR introduces a comprehensive refactor from a repository-centric model to a projects-based architecture. It adds cloud workspace support, environment secrets management with encryption, restructures desktop settings navigation, and propagates encryption keys through deployment workflows. Database schema migration includes new projects, workspaces, sandbox images, and secrets tables while removing the repositories table.

Changes

  • Deployment Workflows (.github/workflows/deploy-preview.yml, .github/workflows/deploy-production.yml): Propagate the SECRETS_ENCRYPTION_KEY environment variable across all deploy steps and Vercel deployment invocations for the API, Web, Marketing, Admin, Docs, and Streams services.
  • Database Schema & Migrations (packages/db/drizzle/0018_sandbox_start.sql, packages/db/drizzle/meta/..., packages/local-db/drizzle/0025_...): Add projects, workspaces, sandbox_images, and secrets tables; drop the repositories table; establish foreign keys and indexes; add a workspace_type enum; add a neon_project_id column to local projects.
  • Database Schema Definitions (packages/db/src/schema/schema.ts, packages/db/src/schema/relations.ts, packages/db/src/schema/enums.ts, packages/db/src/schema/zod.ts, packages/local-db/src/schema/schema.ts): Define projects, secrets, sandbox_images, and workspaces entities with Insert/Select types; create workspace and sandbox status enums; establish all table relations; add configuration schemas for local/cloud workspaces.
  • TRPC Routers & Routing (packages/trpc/src/root.ts, packages/trpc/src/router/project/*, packages/trpc/src/router/workspace/*, packages/trpc/src/router/repository/*, packages/trpc/src/router/organization/...): Replace repositoryRouter with projectRouter; add projectRouter create/update/delete mutations with a nested secretsRouter; add workspaceRouter with create/delete mutations; remove repositoryRouter entirely.
  • Secrets Management (packages/trpc/src/router/project/secrets/secrets.ts, packages/trpc/src/router/project/secrets/utils/crypto.ts, packages/trpc/src/router/project/secrets/utils/secrets-validation.ts): Implement secretsRouter with upsert/delete/getDecrypted operations; add AES-256-GCM encryption/decryption utilities; add secret key/value validation with size and count limits.
  • API Electric Utils & Env (apps/api/src/app/api/electric/[...path]/utils.ts, apps/api/src/env.ts, packages/trpc/src/env.ts): Replace repositories with projects in the AllowedTable type; add SECRETS_ENCRYPTION_KEY and STREAMS_URL environment validations.
  • Desktop Collections Migration (apps/desktop/src/renderer/routes/_authenticated/providers/CollectionsProvider/collections.ts, apps/mobile/lib/collections/collections.ts, apps/desktop/src/lib/trpc/routers/projects/projects.ts): Migrate OrgCollections from SelectRepository to SelectProject; add a linkToNeon procedure for Neon project linking; remove repository CRUD handlers.
  • Desktop Settings Navigation Restructure (apps/desktop/src/renderer/routes/_authenticated/settings/layout.tsx, apps/desktop/src/renderer/routes/_authenticated/settings/utils/settings-search/settings-search.ts, apps/desktop/src/renderer/stores/settings-state.ts): Remove the workspace section from settings; replace workspace settings items with PROJECT_ENV_VARS; remove workspace state management (activeWorkspaceId, useActiveWorkspaceId).
  • Desktop Settings Routes (apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/page.tsx, apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/general/page.tsx, apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/page.tsx, apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/page.tsx): Replace the project settings loader with a redirect; add a general settings page with a ProjectSettings loader; add a cloud settings redirect page; add a cloud secrets page with a project data loader.
  • Desktop Project Settings Components (apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/components/ProjectSettings/*, apps/desktop/src/renderer/routes/_authenticated/settings/components/SettingsSidebar/ProjectsSettings.tsx): Export SettingsSection from ProjectSettings; restructure the ProjectsSettings sidebar with a Collapsible UI; add a nested Cloud section; hide projects during search if no matches.
  • Desktop Secrets UI Components (apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/...): Implement the SecretsSettings component with a cloud project creation/linking flow; add AddSecretSheet for bulk secret import/editing; add EditSecretDialog for inline editing; implement EnvironmentVariablesList with search/sort; add SecretRow with reveal/copy/delete actions.
  • Secrets Utilities (apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/utils/env-file/env-file.ts): Provide environment file parsing and validation utilities supporting the .env format with multiline quoted values and export prefixes.
  • Workspace Settings Removal (apps/desktop/src/renderer/routes/_authenticated/settings/workspace/$workspaceId/page.tsx): Remove the entire workspace settings page route and WorkspaceSettingsPage component.
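The AES-256-GCM helpers in the Secrets Management cohort might look roughly like the following sketch. The function names mirror the walkthrough; the packed iv | authTag | ciphertext layout, the base64 output, and the key handling are assumptions for illustration, not the PR's actual implementation:

```typescript
import { createCipheriv, createDecipheriv, randomBytes } from "node:crypto";

const IV_LENGTH = 12; // standard GCM nonce size
const AUTH_TAG_LENGTH = 16; // GCM authentication tag size

// Encrypt a secret value; the output packs iv | authTag | ciphertext as
// base64 so decryption needs nothing beyond the payload and the key.
export function encryptSecret(plaintext: string, key: Buffer): string {
	const iv = randomBytes(IV_LENGTH);
	const cipher = createCipheriv("aes-256-gcm", key, iv);
	const ciphertext = Buffer.concat([
		cipher.update(plaintext, "utf8"),
		cipher.final(),
	]);
	return Buffer.concat([iv, cipher.getAuthTag(), ciphertext]).toString("base64");
}

// Decrypt a payload produced by encryptSecret; throws if the auth tag
// doesn't verify (wrong key or tampered ciphertext).
export function decryptSecret(payload: string, key: Buffer): string {
	const raw = Buffer.from(payload, "base64");
	const iv = raw.subarray(0, IV_LENGTH);
	const tag = raw.subarray(IV_LENGTH, IV_LENGTH + AUTH_TAG_LENGTH);
	const ciphertext = raw.subarray(IV_LENGTH + AUTH_TAG_LENGTH);
	const decipher = createDecipheriv("aes-256-gcm", key, iv);
	decipher.setAuthTag(tag);
	return Buffer.concat([
		decipher.update(ciphertext),
		decipher.final(),
	]).toString("utf8");
}
```

Note that GCM authenticates as well as encrypts: a rotated or wrong SECRETS_ENCRYPTION_KEY makes decryptSecret throw rather than return garbage, which is the failure mode the getDecrypted error-handling comment below addresses.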

Sequence Diagram(s)

sequenceDiagram
    actor User
    participant Desktop as Desktop App
    participant ElectronTRPC as Electron TRPC
    participant API as API Server
    participant DB as Database
    participant Crypto as Encryption Service

    User->>Desktop: Click "Connect Cloud Project"
    Desktop->>API: Create cloud project (repoOwner, repoName, slug, repoUrl)
    API->>DB: Insert projects record
    DB-->>API: Return project record
    API-->>Desktop: Return project details
    Desktop->>API: Link to Neon (projectId, neonProjectId)
    API->>DB: Update projects.neonProjectId
    DB-->>API: Confirm update
    API-->>Desktop: Success
    Desktop->>Desktop: Load EnvironmentVariablesList for cloud project
sequenceDiagram
    actor User
    participant Desktop as Desktop App
    participant Sheet as AddSecretSheet
    participant API as API Server
    participant Crypto as Encryption Service
    participant DB as Database

    User->>Sheet: Open Add Secret Sheet
    User->>Sheet: Enter key & value pairs
    User->>Sheet: Click Save
    Sheet->>API: Upsert secret (projectId, key, value, sensitive)
    API->>Crypto: Encrypt value with SECRETS_ENCRYPTION_KEY
    Crypto-->>API: Return encrypted_value
    API->>DB: Upsert secrets record (encrypted_value)
    DB-->>API: Confirm insert/update
    API-->>Sheet: Success response
    Sheet->>Sheet: Show success toast
    Sheet->>Desktop: Trigger refresh
    Desktop->>API: Fetch decrypted secrets
    API->>DB: Query secrets for project
    DB-->>API: Return encrypted secrets
    API->>Crypto: Decrypt values
    Crypto-->>API: Return plaintext values
    API-->>Desktop: Return decrypted secrets
    Desktop->>Desktop: Update EnvironmentVariablesList UI

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~75 minutes


Poem

🐰 Hoppy refactoring brings projects to light,
Secrets encrypted, kept safe and tight,
Workspaces spring forth, where code can run free,
From repositories old to clouds that we see! 🌥️✨

🚥 Pre-merge checks: 3 passed, 1 failed

❌ Failed checks (1 warning)
  • Docstring Coverage ⚠️ Warning — Docstring coverage is 2.13%, below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them.

✅ Passed checks (3)
  • Title check — The title "feat(desktop): redesign environment variables table" clearly and accurately describes the primary change: a UI redesign of the environment variables/secrets table in the desktop application.
  • Description check — The PR description is largely complete, providing a summary of key changes, a test plan with verification items, and migration notes. It covers the primary objectives: the UI redesign, schema updates (createdByUserId), and new features (sort, copy-to-clipboard, notifications).
  • Merge Conflict Detection — No merge conflicts detected when merging into main.



coderabbitai (bot) left a comment

Actionable comments posted: 15

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/desktop/src/renderer/stores/settings-state.ts (1)

4-7: ⚠️ Potential issue | 🟡 Minor

Stale comment: "workspace" is no longer a dynamic section.

The comment on line 6 still references "project/workspace are dynamic," but "workspace" has been removed from the SettingsSection type. Update the comment to reflect only "project".

📝 Proposed fix
 /**
  * Settings sections available in the settings view.
- * General sections are the main categories, project/workspace are dynamic.
+ * General sections are the main categories, project is dynamic.
  */
🤖 Fix all issues with AI agents
In @.github/workflows/deploy-preview.yml:
- Around line 678-687: Remove SECRETS_ENCRYPTION_KEY from the vercel deploy
invocations that build static Docs/Marketing sites: locate the run block that
sets VERCEL_URL via the vercel deploy --prebuilt command (the step that
constructs VERCEL_URL) and delete the --env
SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY argument there (and the
equivalent argument in the Marketing deploy block). Leave SECRETS_ENCRYPTION_KEY
only on deployments that actually require it (do not remove the top-level
secrets definition), ensuring vercel deploy --prebuilt and VERCEL_URL remain
unchanged otherwise.
- Around line 253-254: The workflow is missing a value for STREAMS_URL (used
later alongside SECRETS_ENCRYPTION_KEY) so add/emit it before the API
build/deploy: construct the URL from the Fly app naming convention (e.g.
https://superset-stream-pr-${{ github.event.pull_request.number }}.fly.dev) and
set STREAMS_URL in the top-level env: block or export it to GITHUB_ENV after the
Fly deploy step that creates the app; ensure the generated value is available to
the job that runs apps/api (apps/api/src/env.ts expects STREAMS_URL as a
required z.string().url()).

In @.github/workflows/deploy-production.yml:
- Around line 504-513: The SECRETS_ENCRYPTION_KEY is being passed into the
production vercel deploy step where Docs/Marketing don't need it; remove the
--env SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY entry from the vercel
deploy invocation (the run block containing vercel pull/build/deploy) so the
deploy command no longer injects SECRETS_ENCRYPTION_KEY into the Docs/Marketing
production deployment.

In `@apps/desktop/src/lib/trpc/routers/projects/projects.ts`:
- Around line 1004-1013: The linkToNeon mutation currently updates localDb
without verifying the project exists and allows empty neonProjectId; change the
input schema to neonProjectId: z.string().min(1) and, following the pattern used
by update/close/refreshDefaultBranch, first fetch the project by id (using the
same project lookup logic) and if not found throw a TRPCError with code
'NOT_FOUND'; only then perform the localDb.update(projects).set(...).where(...)
and await the .run() result before returning { success: true } so the procedure
behaves consistently with other id-based mutations.

In
`@apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts`:
- Around line 46-53: The onError handler for linkMutation.mutate is swallowing
errors — modify the onError callback in useNeonProjectLink so it accepts the
error parameter (e.g., (error) => { ... }), keeps the existing cleanup calling
linkingRef.current.delete(local.id), and also surface the failure by logging the
error (console.error or app logger) and notifying the user (use the app's
toast/notification utility or set an error state) so failures aren't silent;
keep the cleanup logic intact and include the error details in the
log/notification for debugging.

In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EditSecretDialog/EditSecretDialog.tsx`:
- Around line 40-45: The useEffect in EditSecretDialog currently depends on the
whole secret object which can cause in-progress edits to be reset when parent
passes a new secret reference; change the dependency to a stable identifier
(e.g., secret.id or secret.key) so the effect only runs when the actual secret
changes, and keep the existing logic that sets value via
setValue(secret.sensitive ? "" : secret.value) inside the effect that now lists
[open, secret.id] (or [open, secret.key]) instead of [open, secret].

In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/EnvironmentVariablesList.tsx`:
- Around line 72-84: The fetchSecrets callback currently logs errors to console
and only clears loading in finally, so failures are invisible to users and
subsequent refreshes don't show loading; update fetchSecrets to
setIsLoading(true) at the start of the function, catch the error and set a
visible error state (e.g., setFetchError or reuse an existing error state) when
apiTrpcClient.project.secrets.getDecrypted.query fails, and ensure you
clear/setSecrets only on success; display that error state in the
EnvironmentVariablesList UI so users see the failure and use setIsLoading(false)
in finally as you already do.

In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/SecretsSettings.tsx`:
- Around line 185-206: The AddSecretSheet and EditSecretDialog are being passed
an empty string for projectId via project.neonProjectId ?? "" which allows these
components to render (and potentially call APIs) when no cloud project is
linked; update the render guards so both AddSecretSheet and EditSecretDialog
only render when project.neonProjectId is truthy (e.g., replace the current
organizationId checks with organizationId && project.neonProjectId or add an
explicit check) so that projectId is never an empty string, ensuring the
components receive a valid project.neonProjectId before invoking APIs.
- Around line 74-98: The call to linkToNeon.mutate inside
handleCreateCloudProject is fire-and-forget and any error from that mutation is
swallowed; update the code to surface failures by either adding an onError
handler to the linkToNeon mutation definition (attach logging/user-facing error
state in its onError) or await the mutation and wrap it in its own try/catch so
you can log the error and set appropriate UI state (e.g., call
setIsCreatingCloud(false) if needed and show an error toast). Reference
linkToNeon.mutate and handleCreateCloudProject (and keep
apiTrpcClient.project.create.mutate behavior unchanged) when making the change.

In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/utils/env-file/env-file.ts`:
- Around line 27-32: The current loop that skips to the closing quote using
variables quote, afterEq, lines and index i can run off the end and treat an
unclosed multiline quote as valid; update the logic in the validate function
(and mirror in parseEnvContent) so after the inner while that advances i you
check if i >= lines.length (meaning no closing quote was found) and return a
validation error (or otherwise signal failure) instead of incrementing past the
end and continuing; ensure parseEnvContent uses the same detection to either
throw or return an error for unclosed quotes so values are not silently
truncated.
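A minimal sketch of the unclosed-quote guard being requested. This is hypothetical code — the real env-file.ts isn't shown here and certainly differs in shape — but it illustrates checking `i >= lines.length` after scanning for the closing quote and failing loudly instead of silently truncating:

```typescript
// Hypothetical .env parser illustrating the unclosed-quote guard.
// Supports `export` prefixes and multiline quoted values, per the PR's
// stated feature set; everything else is simplified.
export function parseEnvContent(content: string): Record<string, string> {
	const lines = content.split("\n");
	const result: Record<string, string> = {};
	let i = 0;
	while (i < lines.length) {
		const line = lines[i].trim();
		if (!line || line.startsWith("#")) {
			i++;
			continue;
		}
		const withoutExport = line.replace(/^export\s+/, "");
		const eq = withoutExport.indexOf("=");
		if (eq === -1) {
			i++;
			continue;
		}
		const key = withoutExport.slice(0, eq).trim();
		const afterEq = withoutExport.slice(eq + 1).trim();
		const quote = afterEq[0];
		const isQuoted = quote === '"' || quote === "'";
		if (isQuoted && !(afterEq.length > 1 && afterEq.endsWith(quote))) {
			// Multiline quoted value: advance until the closing quote
			const parts = [afterEq.slice(1)];
			i++;
			while (i < lines.length && !lines[i].includes(quote)) {
				parts.push(lines[i]);
				i++;
			}
			if (i >= lines.length) {
				// No closing quote found: signal failure instead of truncating
				throw new Error(`Unclosed quote in value for "${key}"`);
			}
			const closing = lines[i];
			parts.push(closing.slice(0, closing.indexOf(quote)));
			result[key] = parts.join("\n");
		} else if (isQuoted) {
			result[key] = afterEq.slice(1, -1);
		} else {
			result[key] = afterEq;
		}
		i++;
	}
	return result;
}
```

The same `i >= lines.length` check works for a validate() variant that returns an error object rather than throwing.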

In `@packages/trpc/src/router/project/project.ts`:
- Around line 64-75: The mutation currently calls .set(data) even when data is
empty and returns updated without checking if a row was found; before calling
dbWs.update(projects).set(data) ensure there is at least one updatable field
(e.g., if Object.keys(data).length === 0) and throw a BAD_REQUEST TRPCError like
"No fields to update", and after performing the update check if the returned
updated is undefined and if so throw a NOT_FOUND TRPCError (use the same id and
organizationId context); reference verifyOrgMembership, the local variables
id/organizationId/data, the dbWs.update(projects).set(data)...returning() call,
and the updated result when adding these guards.
- Around line 27-53: The project insert and sandboxImages insert must be atomic
to avoid orphaned projects; wrap both operations in a single transaction using
the DB transaction API (e.g., dbWs.transaction) so both inserts run on the same
transaction context: perform the insert into projects (the existing
dbWs.insert(projects).values(...).returning()) inside the transaction, then
insert into sandboxImages using the transaction handle (use
tx.insert(sandboxImages).values(...)), check the returned project and throw the
same TRPCError if missing, and return the project from the transaction; this
ensures either both rows are committed or neither are.

In `@packages/trpc/src/router/project/secrets.ts`:
- Around line 97-112: The delete protectedProcedure currently always returns {
success: true } even when no row is deleted; modify the mutation to capture the
result of dbWs.delete(secrets).where(...).returning({id: secrets.id}) (or check
the affected row count from the executed query) and if the returned array is
empty throw a TRPCError with code 'NOT_FOUND' (after verifyOrgMembership).
Ensure you only return { success: true } when a row was actually deleted and add
an import for TRPCError if not present.
- Around line 28-29: Add an explicit check that the provided projectId belongs
to the provided organizationId before performing the mutation or getDecrypted
read: after verifyOrgMembership(ctx.session.user.id, input.organizationId) (in
the secrets create mutation) and at the start of the getDecrypted procedure,
query the projects table (or add a helper like
verifyProjectOrganization(projectId, organizationId)) to confirm that
input.projectId is associated with input.organizationId; if not, throw an
authorization error and halt; this ensures the insert/read cannot be made for a
project in a different org even when the user is a member of the provided
organizationId.
- Around line 47-52: The count check uses db.query.secrets.findMany (assigning
to existing) while the insert uses dbWs, creating a TOCTOU risk; change the read
that enforces MAX_SECRETS_PER_PROJECT to use the same client
(dbWs.query.secrets.findMany or equivalent) so the check and subsequent write
use the same transport, referencing the existing variable name existing, the
constant MAX_SECRETS_PER_PROJECT, and the write code that uses dbWs to keep the
operation atomic from the client-transport perspective (or explicitly document
this as an intentional soft limit if you choose not to change it).
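The verifyProjectOrganization helper suggested in the first bullet above could take roughly this shape. This is a hedged sketch: the lookup function stands in for a drizzle query such as a findFirst on the projects table, and the real router would throw a TRPCError with code "FORBIDDEN" rather than a plain Error:

```typescript
interface ProjectRow {
	id: string;
	organizationId: string;
}

// Hypothetical guard: confirm the project belongs to the organization the
// caller named, before any secret read or write proceeds.
export async function verifyProjectOrganization(
	findProjectById: (id: string) => Promise<ProjectRow | undefined>,
	projectId: string,
	organizationId: string,
): Promise<void> {
	const project = await findProjectById(projectId);
	if (!project || project.organizationId !== organizationId) {
		// Same error for "missing" and "wrong org", so callers can't probe
		// which project ids exist in other organizations
		throw new Error("FORBIDDEN: project not found in this organization");
	}
}
```

Collapsing "not found" and "wrong org" into one error is deliberate: it prevents a member of one org from enumerating project ids that belong to another.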
🧹 Nitpick comments (18)
apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts (2)

29-56: linkMutation in useCallback deps causes unnecessary effect re-fires.

useMutation returns a new object reference whenever mutation state changes (e.g., isPending flips). Including linkMutation in the dependency array means linkProjects is recreated on every state transition, re-triggering the useEffect. The linkingRef guard prevents duplicate mutations, but the churn is wasteful and makes the data flow harder to reason about.

A cleaner approach: store the mutate function in a ref, or extract only linkMutation.mutate (which has a stable identity in recent TanStack Query versions) as the dependency.

Suggested fix
+	const mutateRef = useRef(linkMutation.mutate);
+	mutateRef.current = linkMutation.mutate;
+
 	const linkProjects = useCallback(() => {
 		if (!localProjects || !cloudProjects) return;
 
 		for (const local of localProjects) {
 			if (local.neonProjectId || !local.githubOwner) continue;
 			if (linkingRef.current.has(local.id)) continue;
 
 			const repoName = local.mainRepoPath.split("/").pop();
 			if (!repoName) continue;
 
 			const match = cloudProjects.find(
 				(cloud) =>
 					cloud.repoOwner === local.githubOwner && cloud.repoName === repoName,
 			);
 
 			if (match) {
 				linkingRef.current.add(local.id);
-				linkMutation.mutate(
+				mutateRef.current(
 					{ id: local.id, neonProjectId: match.id },
 					{
 						onError: () => {
 							linkingRef.current.delete(local.id);
 						},
 					},
 				);
 			}
 		}
-	}, [localProjects, cloudProjects, linkMutation]);
+	}, [localProjects, cloudProjects]);

36-36: Fragile repo name extraction from path.

local.mainRepoPath.split("/").pop() works for simple paths but would return an empty string for a trailing-slash path (e.g., "owner/repo/"). The guard on Line 37 catches the empty-string case, so this isn't a bug, but a filter(Boolean) or explicit trim could make intent clearer.
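For illustration, a trailing-slash-safe variant of that extraction (the helper name is hypothetical):

```typescript
// filter(Boolean) drops the empty segments produced by trailing or
// doubled slashes, so pop() returns the last real path segment.
function repoNameFromPath(path: string): string | undefined {
	return path.split("/").filter(Boolean).pop();
}
```

With this, "owner/repo/" and "owner/repo" both yield "repo", and the empty-string guard becomes unnecessary.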

apps/desktop/src/renderer/routes/_authenticated/settings/components/SettingsSidebar/ProjectsSettings.tsx (1)

95-126: Consider: nested Collapsible inside CollapsibleContent may feel heavy for a single child.

Currently the "Cloud" section nests a Collapsible with only one sub-item ("Environment Variables"). This adds visual and interaction overhead for a single link. If more cloud sub-items are planned, this is fine as-is. Otherwise, a flat link would simplify the sidebar.

This is a design observation — no code change needed unless the team prefers simplification.

packages/trpc/src/lib/crypto.ts (2)

7-14: Use the validated env object instead of raw process.env.

getKey() reads process.env.SECRETS_ENCRYPTION_KEY directly, bypassing the Zod validation already defined in ../env.ts. This means the runtime check here (lines 9, 11-12) duplicates what the env schema should guarantee, and a typo or drift between the two won't be caught.

Suggested fix
+import { env } from "../env";
+
 function getKey(): Buffer {
-	const raw = process.env.SECRETS_ENCRYPTION_KEY;
-	if (!raw) throw new Error("SECRETS_ENCRYPTION_KEY not set");
+	const raw = env.SECRETS_ENCRYPTION_KEY;
 	const key = Buffer.from(raw, "base64");
 	if (key.length !== 32)
 		throw new Error("SECRETS_ENCRYPTION_KEY must be 32 bytes");
 	return key;
 }

30-41: Mixed encoding in decryptSecret — specify "utf8" consistently.

On Line 40, decipher.update(ciphertext) returns a Buffer (no output encoding specified), which is then concatenated with decipher.final("utf8") (a string). This relies on implicit Buffer.toString() coercion during + concatenation. While it works because the default encoding is UTF-8, it's fragile and inconsistent with the explicit "utf8" on final().

Suggested fix
-	return decipher.update(ciphertext) + decipher.final("utf8");
+	return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");
packages/trpc/src/router/project/secrets.ts (1)

132-140: No error handling if decryptSecret throws.

If the encryption key rotates or data is corrupted, decryptSecret will throw and the entire getDecrypted query fails for all secrets. Consider wrapping the decryption in a try/catch per row, returning an error indicator for the affected secret while still serving the rest.

Proposed approach
 return rows.map((row) => ({
   id: row.id,
   key: row.key,
-  value: row.sensitive ? "" : decryptSecret(row.encryptedValue),
+  value: row.sensitive
+    ? ""
+    : (() => {
+        try {
+          return decryptSecret(row.encryptedValue);
+        } catch (e) {
+          console.error(`[secrets/getDecrypted] Failed to decrypt secret ${row.id}`, e);
+          return "";
+        }
+      })(),
   sensitive: row.sensitive,
   createdAt: row.createdAt,
   updatedAt: row.updatedAt,
   createdBy: row.createdBy ?? null,
 }));
packages/db/drizzle/0018_sandbox_start.sql (1)

57-58: Destructive migration: repositories table is dropped with data loss.

DROP TABLE "repositories" CASCADE permanently removes this table and all its data. Ensure a data migration or backfill plan exists if there's any production data in repositories that needs to be preserved or migrated to projects.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/page.tsx (1)

23-26: String-based error detection is brittle.

error.message.includes("not found") can match unrelated errors. If the server uses TRPCError with code: "NOT_FOUND", prefer checking the structured error code instead.

Suggested improvement
-		} catch (error) {
-			if (error instanceof Error && error.message.includes("not found")) {
-				throw notFound();
-			}
-			throw error;
+		} catch (error) {
+			if (
+				error instanceof Error &&
+				"data" in error &&
+				(error as any).data?.code === "NOT_FOUND"
+			) {
+				throw notFound();
+			}
+			throw error;

Alternatively, check if a TRPCClientError type is available for a type-safe check.

packages/trpc/src/router/workspace/workspace.ts (2)

26-37: workspace could be undefined if .returning() yields no rows.

The destructured [workspace] is returned directly without a guard. While unlikely for an INSERT ... RETURNING, if it ever happens the caller receives undefined. Other routers in this codebase (e.g., organization create) have the same pattern, so this may be accepted — but a defensive check would be more robust.

Optional guard
 			const [workspace] = await dbWs
 				.insert(workspaces)
 				.values({ ... })
 				.returning();
+			if (!workspace) {
+				throw new TRPCError({
+					code: "INTERNAL_SERVER_ERROR",
+					message: "Failed to create workspace",
+				});
+			}
 			return workspace;

44-54: Delete succeeds silently when workspace doesn't exist.

The mutation returns { success: true } regardless of whether any row was actually deleted. If the caller expects idempotent deletes this is fine; otherwise consider checking the result count or throwing NOT_FOUND.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EditSecretDialog/EditSecretDialog.tsx (1)

47-68: Consider using tRPC React hooks instead of the raw client for consistency.

handleSave uses apiTrpcClient directly with manual isSaving state. Using useMutation from the tRPC React hooks would provide automatic loading/error state and integrate with React Query's cache invalidation, which is the pattern used elsewhere (e.g., SecretsSettings.tsx uses hook-based queries).

This is a minor consistency suggestion — the current approach works fine.

packages/trpc/src/router/project/project.ts (1)

77-92: Delete returns success even if no row was deleted.

If the id + organizationId combination doesn't match any row, the mutation still returns { success: true }. Consider checking the affected row count or returning the deleted record.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx (2)

48-62: Delete error is logged but not surfaced to the user.

Other mutation flows (add/edit) display toast.error(...) on failure, but delete silently catches the error at line 58. Also, confirm() is used here while alert.destructive() is used in AddSecretSheet for similar destructive confirmations — consider aligning the patterns.

🛡️ Proposed fix
+import { toast } from "@superset/ui/sonner";
+import { alert } from "@superset/ui/atoms/Alert";
 // ...

 const handleDelete = useCallback(async () => {
-	if (!confirm(`Delete environment variable "${secret.key}"?`)) return;
-	setIsDeleting(true);
-	try {
-		await apiTrpcClient.project.secrets.delete.mutate({
-			id: secret.id,
-			organizationId,
-		});
-		onDeleted();
-	} catch (err) {
-		console.error("[secrets/delete] Failed to delete:", err);
-	} finally {
-		setIsDeleting(false);
-	}
+	alert.destructive({
+		title: `Delete "${secret.key}"?`,
+		description: "This environment variable will be permanently removed.",
+		confirmText: "Delete",
+		onConfirm: async () => {
+			setIsDeleting(true);
+			try {
+				await apiTrpcClient.project.secrets.delete.mutate({
+					id: secret.id,
+					organizationId,
+				});
+				onDeleted();
+			} catch (err) {
+				console.error("[secrets/delete] Failed to delete:", err);
+				toast.error("Failed to delete environment variable");
+			} finally {
+				setIsDeleting(false);
+			}
+		},
+	});
 }, [secret.id, secret.key, organizationId, onDeleted]);

64-68: navigator.clipboard.writeText can reject — add error handling.

While clipboard failures are rare in Electron, wrapping in try/catch avoids an unhandled promise rejection.

🛡️ Proposed fix
 const handleCopy = useCallback(async () => {
-	await navigator.clipboard.writeText(secret.value);
-	setCopied(true);
-	setTimeout(() => setCopied(false), 1500);
+	try {
+		await navigator.clipboard.writeText(secret.value);
+		setCopied(true);
+		setTimeout(() => setCopied(false), 1500);
+	} catch {
+		console.error("[secrets/copy] Failed to copy to clipboard");
+	}
 }, [secret.value]);
apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx (2)

148-148: Move MAX_FILE_SIZE to module-level.

As per coding guidelines, "Extract hardcoded magic numbers, strings, and enums to named constants at module top instead of leaving them inline in logic."

+const MAX_FILE_SIZE = 256 * 1024; // 256 KB
+
 export function AddSecretSheet({

198-226: Sequential upserts risk partial saves on failure.

If an error occurs on the 3rd of 5 entries, the first two are already persisted but the user sees "Failed to save environment variables" with no indication of which succeeded. This could leave the user confused about the actual state.

Consider either:

  1. Using a batch/bulk endpoint if available.
  2. Collecting results and reporting partial success/failure:
 try {
-	for (const entry of validEntries) {
-		await apiTrpcClient.project.secrets.upsert.mutate({
+	await Promise.all(
+		validEntries.map((entry) =>
+			apiTrpcClient.project.secrets.upsert.mutate({
 			projectId,
 			organizationId,
 			key: entry.key.trim(),
 			value: entry.value.trim(),
 			sensitive,
-		});
-	}
+			}),
+		),
+	);

Promise.all would at least make the operation faster, but concurrent execution does not eliminate the partial-save risk; a true fix would need a server-side batch endpoint.
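A dependency-free sketch of the partial-success reporting idea with `Promise.allSettled` (here `upsertOne` is a stand-in for `apiTrpcClient.project.secrets.upsert.mutate`, and the `BAD_KEY` failure is contrived for illustration):

```typescript
type Entry = { key: string; value: string };

// Stand-in for the real tRPC mutation; rejects for one key so the
// failure path is visible in this sketch.
async function upsertOne(entry: Entry): Promise<void> {
  if (entry.key === "BAD_KEY") throw new Error("server rejected entry");
}

// Fire all upserts concurrently, then tally per-entry outcomes so the UI
// can report "N added, M failed" and keep only the failed rows for retry.
async function saveEntries(entries: Entry[]) {
  const results = await Promise.allSettled(entries.map((e) => upsertOne(e)));
  const failedKeys = entries
    .filter((_, i) => results[i].status === "rejected")
    .map((e) => e.key);
  return { saved: entries.length - failedKeys.length, failedKeys };
}
```

Unlike `Promise.all`, `allSettled` never short-circuits, so every entry's outcome is known when deciding whether to close the sheet.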

packages/db/src/schema/zod.ts (1)

23-27: workspaceConfigSchema union lacks a discriminator — consider adding a type field.

The union of localWorkspaceConfigSchema and cloudWorkspaceConfigSchema works because the shapes are distinct, but Zod must try parsing each variant sequentially. If a type: "local" | "cloud" discriminator field were added, you could use z.discriminatedUnion("type", [...]) for more efficient parsing and clearer error messages. This aligns with the workspaceType enum already defined in the DB schema.
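A hand-rolled illustration of what the discriminator buys (the `path` and `region` fields are stand-ins, not the real shapes from `zod.ts`):

```typescript
// Illustrative shapes only — the real local/cloud config fields live in
// packages/db/src/schema/zod.ts.
type LocalConfig = { type: "local"; path: string };
type CloudConfig = { type: "cloud"; region: string };
type WorkspaceConfig = LocalConfig | CloudConfig;

// With a discriminant, one look at "type" picks the variant, so the error
// names the failing variant's field instead of concatenating every
// branch's failures — the same win z.discriminatedUnion("type", [...])
// gives over a plain z.union.
function parseWorkspaceConfig(raw: unknown): WorkspaceConfig {
  if (typeof raw !== "object" || raw === null || !("type" in raw)) {
    throw new Error("config must be an object with a 'type' discriminator");
  }
  const obj = raw as Record<string, unknown>;
  switch (obj.type) {
    case "local":
      if (typeof obj.path !== "string") throw new Error("local config: missing 'path'");
      return { type: "local", path: obj.path };
    case "cloud":
      if (typeof obj.region !== "string") throw new Error("cloud config: missing 'region'");
      return { type: "cloud", region: obj.region };
    default:
      throw new Error(`unknown config type: ${String(obj.type)}`);
  }
}
```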

packages/db/src/schema/schema.ts (1)

398-425: Workspaces table stores config as unvalidated jsonb.

The config column uses $type<WorkspaceConfig>() which only provides compile-time type hints. Consider adding a note or documenting that application-layer validation (via workspaceConfigSchema) should be applied when reading this data, especially since the config shape may evolve over time and existing rows may not match newer schemas.
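A minimal sketch of the validate-on-read pattern, assuming a `safeParse`-style validator; the `{ version }` shape and the fallback default are hypothetical, not the real `WorkspaceConfig`:

```typescript
// Hypothetical config shape and validator — stand-ins for WorkspaceConfig
// and workspaceConfigSchema.safeParse.
type Config = { version: number };
type ParseResult =
  | { success: true; data: Config }
  | { success: false; error: string };

function safeParseConfig(raw: unknown): ParseResult {
  if (
    typeof raw === "object" && raw !== null &&
    typeof (raw as { version?: unknown }).version === "number"
  ) {
    return { success: true, data: { version: (raw as { version: number }).version } };
  }
  return { success: false, error: "config does not match current schema" };
}

// $type<WorkspaceConfig>() is compile-time only, so validate jsonb at the
// application layer when reading; legacy rows that predate the current
// schema fall back to a default instead of crashing callers.
function readStoredConfig(row: { config: unknown }): Config {
  const parsed = safeParseConfig(row.config);
  return parsed.success ? parsed.data : { version: 1 };
}
```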

Comment thread .github/workflows/deploy-preview.yml Outdated
Comment on lines +253 to +254
STREAMS_URL: ${{ env.STREAMS_URL }}
SECRETS_ENCRYPTION_KEY: ${{ secrets.SECRETS_ENCRYPTION_KEY }}
Copy link
Copy Markdown
Contributor

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

⚠️ Potential issue | 🔴 Critical

STREAMS_URL will be empty — it's never set in the preview workflow.

STREAMS_URL is referenced as ${{ env.STREAMS_URL }} but no step in this workflow defines or exports it to $GITHUB_ENV. The streams preview deploys to Fly.io (Line 145) but its URL is never captured. Since apps/api/src/env.ts validates STREAMS_URL as z.string().url() (required), the API preview deployment will fail env validation at build or runtime.

You likely need to construct the URL from the Fly app name (e.g., https://superset-stream-pr-${PR_NUMBER}.fly.dev) and either add it to the top-level env: block or load it from the streams status artifact.

Suggested fix: add STREAMS_URL to the top-level env block
 env:
   PR_NUMBER: ${{ github.event.pull_request.number }}
   API_ALIAS: api-pr-${{ github.event.pull_request.number }}-superset.vercel.app
   WEB_ALIAS: web-pr-${{ github.event.pull_request.number }}-superset.vercel.app
   MARKETING_ALIAS: marketing-pr-${{ github.event.pull_request.number }}-superset.vercel.app
   ADMIN_ALIAS: admin-pr-${{ github.event.pull_request.number }}-superset.vercel.app
   DOCS_ALIAS: docs-pr-${{ github.event.pull_request.number }}-superset.vercel.app
   ELECTRIC_URL: https://superset-electric-pr-${{ github.event.pull_request.number }}.fly.dev/v1/shape
+  STREAMS_URL: https://superset-stream-pr-${{ github.event.pull_request.number }}.fly.dev
🤖 Prompt for AI Agents
In @.github/workflows/deploy-preview.yml around lines 253 - 254, The workflow is
missing a value for STREAMS_URL (used later alongside SECRETS_ENCRYPTION_KEY) so
add/emit it before the API build/deploy: construct the URL from the Fly app
naming convention (e.g. https://superset-stream-pr-${{
github.event.pull_request.number }}.fly.dev) and set STREAMS_URL in the
top-level env: block or export it to GITHUB_ENV after the Fly deploy step that
creates the app; ensure the generated value is available to the job that runs
apps/api (apps/api/src/env.ts expects STREAMS_URL as a required
z.string().url()).

Comment on lines +678 to +687
SECRETS_ENCRYPTION_KEY: ${{ secrets.SECRETS_ENCRYPTION_KEY }}
run: |
vercel pull --yes --environment=preview --token=$VERCEL_TOKEN
vercel build --token=$VERCEL_TOKEN
VERCEL_URL=$(vercel deploy --prebuilt --token=$VERCEL_TOKEN \
--env NEXT_PUBLIC_POSTHOG_KEY=$NEXT_PUBLIC_POSTHOG_KEY \
--env NEXT_PUBLIC_POSTHOG_HOST=$NEXT_PUBLIC_POSTHOG_HOST \
--env NEXT_PUBLIC_SENTRY_DSN_DOCS=$NEXT_PUBLIC_SENTRY_DSN_DOCS \
- --env NEXT_PUBLIC_SENTRY_ENVIRONMENT=$NEXT_PUBLIC_SENTRY_ENVIRONMENT)
+ --env NEXT_PUBLIC_SENTRY_ENVIRONMENT=$NEXT_PUBLIC_SENTRY_ENVIRONMENT \
+ --env SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY)

⚠️ Potential issue | 🟠 Major

SECRETS_ENCRYPTION_KEY is unnecessarily propagated to Docs (and Marketing).

The Docs site is a static documentation deployment — it has no server-side secrets management logic. Passing SECRETS_ENCRYPTION_KEY here violates the principle of least privilege. The same applies to the Marketing deployment (Line 483/507).

Consider removing SECRETS_ENCRYPTION_KEY from deployments that don't use it (Docs, Marketing) to reduce the secret's exposure surface.

🤖 Prompt for AI Agents
In @.github/workflows/deploy-preview.yml around lines 678 - 687, Remove
SECRETS_ENCRYPTION_KEY from the vercel deploy invocations that build static
Docs/Marketing sites: locate the run block that sets VERCEL_URL via the vercel
deploy --prebuilt command (the step that constructs VERCEL_URL) and delete the
--env SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY argument there (and the
equivalent argument in the Marketing deploy block). Leave SECRETS_ENCRYPTION_KEY
only on deployments that actually require it (do not remove the top-level
secrets definition), ensuring vercel deploy --prebuilt and VERCEL_URL remain
unchanged otherwise.

Comment on lines +504 to +513
SECRETS_ENCRYPTION_KEY: ${{ secrets.SECRETS_ENCRYPTION_KEY }}
run: |
vercel pull --yes --environment=production --token=$VERCEL_TOKEN
vercel build --prod --token=$VERCEL_TOKEN
vercel deploy --prod --prebuilt --token=$VERCEL_TOKEN \
--env NEXT_PUBLIC_POSTHOG_KEY=$NEXT_PUBLIC_POSTHOG_KEY \
--env NEXT_PUBLIC_POSTHOG_HOST=$NEXT_PUBLIC_POSTHOG_HOST \
--env NEXT_PUBLIC_SENTRY_DSN_DOCS=$NEXT_PUBLIC_SENTRY_DSN_DOCS \
- --env NEXT_PUBLIC_SENTRY_ENVIRONMENT=$NEXT_PUBLIC_SENTRY_ENVIRONMENT
+ --env NEXT_PUBLIC_SENTRY_ENVIRONMENT=$NEXT_PUBLIC_SENTRY_ENVIRONMENT \
+ --env SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY

⚠️ Potential issue | 🟠 Major

Same least-privilege concern: SECRETS_ENCRYPTION_KEY on Docs and Marketing in production.

Same issue as flagged in the preview workflow — Docs and Marketing don't need SECRETS_ENCRYPTION_KEY. Remove it from these deployments to minimize secret exposure.

🤖 Prompt for AI Agents
In @.github/workflows/deploy-production.yml around lines 504 - 513, The
SECRETS_ENCRYPTION_KEY is being passed into the production vercel deploy step
where Docs/Marketing don't need it; remove the --env
SECRETS_ENCRYPTION_KEY=$SECRETS_ENCRYPTION_KEY entry from the vercel deploy
invocation (the run block containing vercel pull/build/deploy) so the deploy
command no longer injects SECRETS_ENCRYPTION_KEY into the Docs/Marketing
production deployment.

Comment on lines +1004 to +1013
linkToNeon: publicProcedure
.input(z.object({ id: z.string(), neonProjectId: z.string() }))
.mutation(({ input }) => {
localDb
.update(projects)
.set({ neonProjectId: input.neonProjectId })
.where(eq(projects.id, input.id))
.run();
return { success: true };
}),

⚠️ Potential issue | 🟠 Major

Missing project existence check — silently succeeds for invalid IDs.

Every other mutation in this router that updates by id (e.g., update, close, refreshDefaultBranch) fetches the project first and throws NOT_FOUND / an error when it doesn't exist. This procedure skips that, so calling linkToNeon with a bogus id returns { success: true } without actually linking anything.

Also, z.string() on neonProjectId permits empty strings — consider adding .min(1).

Proposed fix
 		linkToNeon: publicProcedure
-			.input(z.object({ id: z.string(), neonProjectId: z.string() }))
+			.input(z.object({ id: z.string(), neonProjectId: z.string().min(1) }))
 			.mutation(({ input }) => {
+				const project = localDb
+					.select()
+					.from(projects)
+					.where(eq(projects.id, input.id))
+					.get();
+
+				if (!project) {
+					throw new TRPCError({
+						code: "NOT_FOUND",
+						message: `Project ${input.id} not found`,
+					});
+				}
+
 				localDb
 					.update(projects)
 					.set({ neonProjectId: input.neonProjectId })
 					.where(eq(projects.id, input.id))
 					.run();
 				return { success: true };
 			}),

As per coding guidelines, "Use appropriate TRPCError codes consistently: NOT_FOUND (resource doesn't exist)" and "Follow existing patterns in the codebase and match the codebase style rather than introducing novel patterns".

🤖 Prompt for AI Agents
In `@apps/desktop/src/lib/trpc/routers/projects/projects.ts` around lines 1004 -
1013, The linkToNeon mutation currently updates localDb without verifying the
project exists and allows empty neonProjectId; change the input schema to
neonProjectId: z.string().min(1) and, following the pattern used by
update/close/refreshDefaultBranch, first fetch the project by id (using the same
project lookup logic) and if not found throw a TRPCError with code 'NOT_FOUND';
only then perform the localDb.update(projects).set(...).where(...) and await the
.run() result before returning { success: true } so the procedure behaves
consistently with other id-based mutations.

Comment on lines +46 to +53
linkMutation.mutate(
{ id: local.id, neonProjectId: match.id },
{
onError: () => {
linkingRef.current.delete(local.id);
},
},
);

⚠️ Potential issue | 🟡 Minor

Silent error swallowing in onError.

The onError callback removes the project from the linking set but discards the error entirely — no logging, no user feedback. Per coding guidelines, errors should never be swallowed silently.

Suggested fix
 				onError: () => {
+					console.error(
+						`[neon-project-link] Failed to link project ${local.id} to neon project ${match.id}`,
+					);
 					linkingRef.current.delete(local.id);
 				},
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
linkMutation.mutate(
{ id: local.id, neonProjectId: match.id },
{
onError: () => {
linkingRef.current.delete(local.id);
},
},
);
linkMutation.mutate(
{ id: local.id, neonProjectId: match.id },
{
onError: () => {
console.error(
`[neon-project-link] Failed to link project ${local.id} to neon project ${match.id}`,
);
linkingRef.current.delete(local.id);
},
},
);
🤖 Prompt for AI Agents
In
`@apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts`
around lines 46 - 53, The onError handler for linkMutation.mutate is swallowing
errors — modify the onError callback in useNeonProjectLink so it accepts the
error parameter (e.g., (error) => { ... }), keeps the existing cleanup calling
linkingRef.current.delete(local.id), and also surface the failure by logging the
error (console.error or app logger) and notifying the user (use the app's
toast/notification utility or set an error state) so failures aren't silent;
keep the cleanup logic intact and include the error details in the
log/notification for debugging.

Comment on lines +27 to +53
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const [project] = await dbWs
.insert(projects)
.values({
organizationId: input.organizationId,
name: input.name,
slug: input.slug,
repoOwner: input.repoOwner,
repoName: input.repoName,
repoUrl: input.repoUrl,
defaultBranch: input.defaultBranch ?? "main",
githubRepositoryId: input.githubRepositoryId,
})
.returning();
if (!project) {
throw new TRPCError({
code: "INTERNAL_SERVER_ERROR",
message: "Failed to create project",
});
}
await dbWs.insert(sandboxImages).values({
organizationId: input.organizationId,
projectId: project.id,
});
return project;
}),

🛠️ Refactor suggestion | 🟠 Major

Project creation and sandbox image insert are not atomic.

If dbWs.insert(sandboxImages) (line 48) fails after the project is successfully created (line 29-41), the database will contain an orphaned project with no sandbox image entry. Consider wrapping both inserts in a transaction:

🛡️ Proposed fix
 .mutation(async ({ ctx, input }) => {
 	await verifyOrgMembership(ctx.session.user.id, input.organizationId);
-	const [project] = await dbWs
-		.insert(projects)
-		.values({
-			organizationId: input.organizationId,
-			name: input.name,
-			slug: input.slug,
-			repoOwner: input.repoOwner,
-			repoName: input.repoName,
-			repoUrl: input.repoUrl,
-			defaultBranch: input.defaultBranch ?? "main",
-			githubRepositoryId: input.githubRepositoryId,
-		})
-		.returning();
-	if (!project) {
-		throw new TRPCError({
-			code: "INTERNAL_SERVER_ERROR",
-			message: "Failed to create project",
-		});
-	}
-	await dbWs.insert(sandboxImages).values({
-		organizationId: input.organizationId,
-		projectId: project.id,
+	const project = await dbWs.transaction(async (tx) => {
+		const [created] = await tx
+			.insert(projects)
+			.values({
+				organizationId: input.organizationId,
+				name: input.name,
+				slug: input.slug,
+				repoOwner: input.repoOwner,
+				repoName: input.repoName,
+				repoUrl: input.repoUrl,
+				defaultBranch: input.defaultBranch ?? "main",
+				githubRepositoryId: input.githubRepositoryId,
+			})
+			.returning();
+		if (!created) {
+			throw new TRPCError({
+				code: "INTERNAL_SERVER_ERROR",
+				message: "Failed to create project",
+			});
+		}
+		await tx.insert(sandboxImages).values({
+			organizationId: input.organizationId,
+			projectId: created.id,
+		});
+		return created;
 	});
 	return project;
 }),
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/project.ts` around lines 27 - 53, The
project insert and sandboxImages insert must be atomic to avoid orphaned
projects; wrap both operations in a single transaction using the DB transaction
API (e.g., dbWs.transaction) so both inserts run on the same transaction
context: perform the insert into projects (the existing
dbWs.insert(projects).values(...).returning()) inside the transaction, then
insert into sandboxImages using the transaction handle (use
tx.insert(sandboxImages).values(...)), check the returned project and throw the
same TRPCError if missing, and return the project from the transaction; this
ensures either both rows are committed or neither are.

Comment on lines +64 to +75
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const { id, organizationId, ...data } = input;
const [updated] = await dbWs
.update(projects)
.set(data)
.where(
and(eq(projects.id, id), eq(projects.organizationId, organizationId)),
)
.returning();
return updated;
}),

⚠️ Potential issue | 🟡 Minor

Update with no fields performs an empty SET and doesn't check for NOT_FOUND.

If the caller sends only id and organizationId (both optional fields omitted), data is {} and .set({}) may produce invalid SQL or a no-op. Additionally, if no row matches the WHERE clause, updated is undefined but is returned without error.

🛡️ Proposed fix
 .mutation(async ({ ctx, input }) => {
 	await verifyOrgMembership(ctx.session.user.id, input.organizationId);
 	const { id, organizationId, ...data } = input;
+	if (Object.keys(data).length === 0) {
+		throw new TRPCError({ code: "BAD_REQUEST", message: "No fields to update" });
+	}
 	const [updated] = await dbWs
 		.update(projects)
 		.set(data)
 		.where(
 			and(eq(projects.id, id), eq(projects.organizationId, organizationId)),
 		)
 		.returning();
+	if (!updated) {
+		throw new TRPCError({ code: "NOT_FOUND", message: "Project not found" });
+	}
 	return updated;
 }),
📝 Committable suggestion


Suggested change
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const { id, organizationId, ...data } = input;
const [updated] = await dbWs
.update(projects)
.set(data)
.where(
and(eq(projects.id, id), eq(projects.organizationId, organizationId)),
)
.returning();
return updated;
}),
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const { id, organizationId, ...data } = input;
if (Object.keys(data).length === 0) {
throw new TRPCError({ code: "BAD_REQUEST", message: "No fields to update" });
}
const [updated] = await dbWs
.update(projects)
.set(data)
.where(
and(eq(projects.id, id), eq(projects.organizationId, organizationId)),
)
.returning();
if (!updated) {
throw new TRPCError({ code: "NOT_FOUND", message: "Project not found" });
}
return updated;
}),
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/project.ts` around lines 64 - 75, The
mutation currently calls .set(data) even when data is empty and returns updated
without checking if a row was found; before calling
dbWs.update(projects).set(data) ensure there is at least one updatable field
(e.g., if Object.keys(data).length === 0) and throw a BAD_REQUEST TRPCError like
"No fields to update", and after performing the update check if the returned
updated is undefined and if so throw a NOT_FOUND TRPCError (use the same id and
organizationId context); reference verifyOrgMembership, the local variables
id/organizationId/data, the dbWs.update(projects).set(data)...returning() call,
and the updated result when adding these guards.

Comment on lines +28 to +29
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);

⚠️ Potential issue | 🔴 Critical

Missing project-organization ownership validation — potential authorization bypass.

verifyOrgMembership confirms the user belongs to the organization, but there's no check that input.projectId actually belongs to input.organizationId. The secrets table has separate FKs for organization_id and project_id without a composite constraint tying them together. A user who is a member of Org B could pass a projectId belonging to Org A with organizationId of Org B, and the insert would succeed — writing secrets scoped to a project they don't own.

The same gap exists in getDecrypted (Line 121). The delete procedure is safe because it filters on secrets.organizationId directly.

Add a check that the project belongs to the given organization before proceeding:

Proposed fix
 .mutation(async ({ ctx, input }) => {
   await verifyOrgMembership(ctx.session.user.id, input.organizationId);
+
+  const project = await db.query.projects.findFirst({
+    where: and(
+      eq(projects.id, input.projectId),
+      eq(projects.organizationId, input.organizationId),
+    ),
+  });
+  if (!project) {
+    throw new TRPCError({
+      code: "NOT_FOUND",
+      message: "Project not found in organization",
+    });
+  }

Apply the same guard to getDecrypted.

📝 Committable suggestion


Suggested change
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const project = await db.query.projects.findFirst({
where: and(
eq(projects.id, input.projectId),
eq(projects.organizationId, input.organizationId),
),
});
if (!project) {
throw new TRPCError({
code: "NOT_FOUND",
message: "Project not found in organization",
});
}
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/secrets.ts` around lines 28 - 29, Add an
explicit check that the provided projectId belongs to the provided
organizationId before performing the mutation or getDecrypted read: after
verifyOrgMembership(ctx.session.user.id, input.organizationId) (in the secrets
create mutation) and at the start of the getDecrypted procedure, query the
projects table (or add a helper like verifyProjectOrganization(projectId,
organizationId)) to confirm that input.projectId is associated with
input.organizationId; if not, throw an authorization error and halt; this
ensures the insert/read cannot be made for a project in a different org even
when the user is a member of the provided organizationId.

Comment on lines +47 to +52
const existing = await db.query.secrets.findMany({
where: and(
eq(secrets.projectId, input.projectId),
eq(secrets.organizationId, input.organizationId),
),
});

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

# First, let's read the secrets.ts file to see the actual code at lines 47-52 and around line 76
head -n 80 packages/trpc/src/router/project/secrets.ts | tail -n +40

Repository: superset-sh/superset

Length of output: 1169


🏁 Script executed:

# Search for db and dbWs definitions/exports
fd --type f --extension ts --extension tsx | xargs rg "export const db(Ws)?" -A 2 | head -n 50

Repository: superset-sh/superset

Length of output: 373


🏁 Script executed:

# Look at the entire secrets.ts file structure (check file size first)
wc -l packages/trpc/src/router/project/secrets.ts

Repository: superset-sh/superset

Length of output: 111


🏁 Script executed:

# Look at the client.ts file to understand what sql is
cat -n packages/db/src/client.ts | head -n 30

Repository: superset-sh/superset

Length of output: 779


🏁 Script executed:

# Search for imports and context
rg "drizzle|drizzleWs" packages/db/src/client.ts -B 5 -A 5

Repository: superset-sh/superset

Length of output: 618


🏁 Script executed:

# Check how MAX_SECRETS_PER_PROJECT is used across the file
rg "MAX_SECRETS_PER_PROJECT" packages/trpc/src/router/project/secrets.ts -B 2 -A 2

Repository: superset-sh/superset

Length of output: 473


🏁 Script executed:

# Search for comments or configuration indicating read-replica setup
rg "read.replica|replica|read-only" packages/db/src/ -i

Repository: superset-sh/superset

Length of output: 46


🏁 Script executed:

# Check how db vs dbWs are used throughout the codebase to understand the pattern
rg "await db\." packages/trpc/src/ | head -20

Repository: superset-sh/superset

Length of output: 2195


🏁 Script executed:

# Check how dbWs is used
rg "await dbWs\." packages/trpc/src/ | head -20

Repository: superset-sh/superset

Length of output: 437


🏁 Script executed:

# Look for any documentation or comments about Neon setup
find . -name "*.md" -o -name "*.txt" | xargs rg "neon|DATABASE_URL" | head -20

Repository: superset-sh/superset

Length of output: 1412


Use consistent database client for the count check and write operation.

Line 47 checks existing.length >= MAX_SECRETS_PER_PROJECT using db, while line 76 writes via dbWs. Although both connect to the same database URL, using different database clients for a check-then-write sequence creates a TOCTOU window where another process could insert secrets between the check and the insert, potentially bypassing the limit. Use dbWs for the count check as well, or document if this is intentionally treated as a soft limit. (Note: both db and dbWs can perform reads and writes; they differ in transport mechanism—HTTP vs WebSocket—rather than representing a read-replica/primary architecture.)

🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/secrets.ts` around lines 47 - 52, The count
check uses db.query.secrets.findMany (assigning to existing) while the insert
uses dbWs, creating a TOCTOU risk; change the read that enforces
MAX_SECRETS_PER_PROJECT to use the same client (dbWs.query.secrets.findMany or
equivalent) so the check and subsequent write use the same transport,
referencing the existing variable name existing, the constant
MAX_SECRETS_PER_PROJECT, and the write code that uses dbWs to keep the operation
atomic from the client-transport perspective (or explicitly document this as an
intentional soft limit if you choose not to change it).
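A toy model of the TOCTOU window (an in-memory array stands in for the database; the `await` models the gap between the `findMany` count and the insert going over separate round-trips or clients — not the actual Drizzle/Neon code):

```typescript
const MAX_SECRETS_PER_PROJECT = 2;

// Check-then-write with a suspension point in between: another writer can
// run during the yield, so the count used for the limit check goes stale.
async function racyInsert(store: string[], key: string): Promise<void> {
  const count = store.length;              // check
  await Promise.resolve();                 // yield: another writer runs here
  if (count >= MAX_SECRETS_PER_PROJECT) throw new Error("limit reached");
  store.push(key);                         // write based on a stale count
}

// Check and write with no gap — the in-memory analogue of enforcing the
// limit inside a single SQL statement or server-side transaction.
function atomicInsert(store: string[], key: string): void {
  if (store.length >= MAX_SECRETS_PER_PROJECT) throw new Error("limit reached");
  store.push(key);
}
```

Running two `racyInsert` calls concurrently from a store of one entry lets both pass the check and exceeds the limit; `atomicInsert` cannot be interleaved that way.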

Comment on lines +97 to +112
delete: protectedProcedure
.input(
z.object({ id: z.string().uuid(), organizationId: z.string().uuid() }),
)
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
await dbWs
.delete(secrets)
.where(
and(
eq(secrets.id, input.id),
eq(secrets.organizationId, input.organizationId),
),
);
return { success: true };
}),

⚠️ Potential issue | 🟡 Minor

delete silently succeeds when the target secret doesn't exist.

If the id is invalid or already deleted, zero rows are affected but the procedure returns { success: true }. Consider checking the affected row count or using .returning() to confirm deletion, then throwing NOT_FOUND when nothing was deleted.

🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/secrets.ts` around lines 97 - 112, The
delete protectedProcedure currently always returns { success: true } even when
no row is deleted; modify the mutation to capture the result of
dbWs.delete(secrets).where(...).returning({id: secrets.id}) (or check the
affected row count from the executed query) and if the returned array is empty
throw a TRPCError with code 'NOT_FOUND' (after verifyOrgMembership). Ensure you
only return { success: true } when a row was actually deleted and add an import
for TRPCError if not present.
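An in-memory sketch of the pattern the prompt describes (the array stands in for the `secrets` table, and `NotFoundError` for a `TRPCError` with code `NOT_FOUND`; names are illustrative):

```typescript
class NotFoundError extends Error {}

type SecretRow = { id: string; organizationId: string };

// Mirrors delete(...).where(...).returning({ id }): mutates the table and
// returns the ids of rows actually removed, so "zero rows" is observable.
function deleteWhere(table: SecretRow[], id: string, organizationId: string) {
  const removed = table.filter(
    (r) => r.id === id && r.organizationId === organizationId,
  );
  for (const row of removed) table.splice(table.indexOf(row), 1);
  return removed.map((r) => ({ id: r.id }));
}

// Only report success when a row was actually deleted.
function deleteSecretOrThrow(table: SecretRow[], id: string, organizationId: string) {
  const removed = deleteWhere(table, id, organizationId);
  if (removed.length === 0) {
    throw new NotFoundError(`secret ${id} not found in organization`);
  }
  return { success: true };
}
```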

@github-actions

github-actions Bot commented Feb 12, 2026

🧹 Preview Cleanup Complete

The following preview resources have been cleaned up:

  • ✅ Neon database branch
  • ✅ Electric Fly.io app
  • ✅ Streams Fly.io app

Thank you for your contribution! 🎉

@coderabbitai coderabbitai Bot left a comment


Actionable comments posted: 7

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/desktop/src/renderer/stores/settings-state.ts (1)

4-7: ⚠️ Potential issue | 🟡 Minor

Stale comment: still references "workspace".

The doc comment says "project/workspace are dynamic" but workspace has been removed from the union type.

Suggested fix
 /**
  * Settings sections available in the settings view.
- * General sections are the main categories, project/workspace are dynamic.
+ * General sections are the main categories, project is dynamic.
  */
🤖 Fix all issues with AI agents
In `@apps/api/src/env.ts`:
- Line 44: The SECRETS_ENCRYPTION_KEY zod validation is too lax
(z.string().min(1)) even though the key is base64 and must decode to exactly 32
bytes; update the validation for SECRETS_ENCRYPTION_KEY to require the correct
base64 length by using z.string().min(44) (and consider adding a max(44) or a
custom refinement if you want to enforce exact length), and make the identical
change to the other env module that also defines SECRETS_ENCRYPTION_KEY so both
places validate the base64 32-byte key consistently.

In
`@apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts`:
- Around line 36-37: The path split logic in useNeonProjectLink (where you
compute repoName via local.mainRepoPath.split("/").pop()) fails on Windows
because it only splits on forward slashes; change the extraction to handle both
separators by splitting on both "/" and "\" (or use a cross-platform approach
like path.basename) when deriving repoName from local.mainRepoPath, update the
code that checks repoName accordingly, and ensure any places referencing
repoName still behave the same after this normalization.
- Around line 29-60: The effect re-runs unnecessarily because linkMutation (from
useMutation) is not stable; change linkProjects to depend on the stable mutate
function instead of the whole linkMutation object: extract the mutate function
(e.g., const { mutate: linkMutate } = linkMutation) and use linkMutate inside
linkProjects (and in its mutate call), then update useCallback and useEffect
dependency arrays to include linkMutate (and localProjects, cloudProjects)
rather than linkMutation; keep linkingRef as-is to avoid duplicate requests.

In
`@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx`:
- Around line 198-226: handleSave currently upserts each validEntries item
sequentially so a mid-loop failure leaves earlier secrets saved and later ones
not; change handleSave to run all upserts in parallel with Promise.allSettled
against apiTrpcClient.project.secrets.upsert.mutate for each entry (using
entry.key.trim()/entry.value.trim() and sensitive/projectId/organizationId),
then inspect results to separate successes from failures, show a toast that
summarizes partial success (e.g., N added, M failed) and only call onSaved() and
onOpenChange(false) if all upserts succeeded; if some failed, keep the sheet
open and optionally remove the successfully saved entries from
entries/validEntries so retry only attempts failures.

In
`@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx`:
- Around line 64-68: The handleCopy callback currently calls
navigator.clipboard.writeText(secret.value) without error handling; wrap the
async clipboard call in a try/catch inside handleCopy (the function defined in
SecretRow.tsx) to catch rejections and avoid unhandled promise rejections, and
on error optionally log or silently handle it (e.g.,
processLogger/console.error) while still setting the copied state only on
success; ensure the dependency on secret.value remains unchanged.
- Around line 48-62: The catch block in handleDelete currently only
console.errors the failure from apiTrpcClient.project.secrets.delete.mutate and
leaves the user without feedback; add a user-visible error by calling
toast.error(...) inside the catch (including context like the secret.key and the
caught error message), and import toast from "@superset/ui/sonner" at the top of
the file; keep the existing setIsDeleting(false) in the finally block and do not
change onDeleted behavior.

In `@packages/trpc/src/router/project/project.ts`:
- Around line 77-92: The delete mutation (the protectedProcedure block that
calls verifyOrgAdmin and dbWs.delete(projects).where(...)) currently always
returns { success: true } even if no row was deleted; update it to check the
deletion result (e.g., use dbWs.delete(projects).where(...).returning() or
inspect the result count) and only return success when a row was actually
removed—otherwise return a failure (or throw a 404 / not-found error) so callers
can distinguish “no-op” from a real deletion.
🧹 Nitpick comments (8)
apps/desktop/src/renderer/routes/_authenticated/settings/utils/settings-search/settings-search.ts (1)

649-656: Consider adding "project" to the keywords for consistency.

Every other item in the "project" section includes "project" as a keyword (e.g., lines 584, 591, 608, 629). Without it, a user searching "project" won't see the Environment Variables entry in results.

Suggested fix
 		keywords: [
+			"project",
 			"environment",
 			"variables",
 			"secrets",
 			"env",
 			"cloud",
 			"sandbox",
 		],
packages/trpc/src/lib/secrets-validation.ts (1)

21-35: Consider checking length before running the regex.

Currently the regex runs on the full input before the length guard. Swapping the order avoids running a regex match against an arbitrarily long string and gives users the more actionable "too long" error first.

♻️ Suggested reorder
 export function validateSecretKey(
 	key: string,
 ): { valid: true } | { valid: false; error: string } {
 	const normalized = key.toUpperCase();
+	if (key.length > MAX_KEY_LENGTH)
+		return {
+			valid: false,
+			error: `Key must be <= ${MAX_KEY_LENGTH} characters`,
+		};
 	if (!KEY_PATTERN.test(key))
 		return { valid: false, error: "Key must match [A-Za-z_][A-Za-z0-9_]*" };
-	if (key.length > MAX_KEY_LENGTH)
-		return {
-			valid: false,
-			error: `Key must be <= ${MAX_KEY_LENGTH} characters`,
-		};
 	if (RESERVED_KEYS.has(normalized))
 		return { valid: false, error: `${normalized} is a reserved key` };
 	return { valid: true };
 }
apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/general/page.tsx (1)

12-20: Manually-constructed query keys may drift from tRPC-generated keys.

The query keys on lines 12-20 are hand-crafted to match what @trpc/react-query generates internally. If tRPC changes its key format across versions, these will silently stop matching and the loader won't populate the cache for the component's useQuery hooks.

Consider adding a comment noting this coupling, or extracting the key construction into a shared helper.
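One way to contain the coupling is a single helper that both the loader and any cache-priming code import — a minimal sketch with hypothetical names, mirroring the array-based key shape `@trpc/react-query` generated at the time of writing (this shape must be revalidated on tRPC upgrades):

```typescript
// Hypothetical shared helper: centralizes the hand-built tRPC query key
// so the route loader and component hooks agree on one construction.
// The [pathSegments, { input, type }] shape mirrors @trpc/react-query's
// internal format and is an assumption, not a stable public API.
export function trpcQueryKey(
	path: string[],
	input: unknown,
): [string[], { input: unknown; type: "query" }] {
	return [path, { input, type: "query" }];
}
```

With this in place, a version bump that changes the key format requires updating exactly one function instead of hunting down every hand-crafted key.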

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/SecretsSettings.tsx (1)

100-102: refreshKey remount causes full loading flash on every save.

Incrementing refreshKey forces a complete unmount/remount of EnvironmentVariablesList, which resets its internal state (including isLoading = true). This means the user sees the "Loading..." placeholder every time they add or edit a secret.

Consider passing a refetch callback or lifting the fetch logic up so the list can refresh in-place without a full remount.

Also applies to: 72-72, 117-124

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx (1)

48-49: confirm() is a native browser dialog — consider using the custom alert pattern.

The rest of the UI uses alert.destructive(...) from @superset/ui/atoms/Alert (as seen in AddSecretSheet.tsx line 76). Using window.confirm() here is inconsistent and blocks the main thread.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx (2)

95-105: updateEntry value parameter typed string | boolean but SecretEntry fields are all string.

The boolean in the union is unnecessary and could mask type errors at call sites. Narrow to string.

♻️ Proposed fix
 const updateEntry = (
 	index: number,
 	field: keyof SecretEntry,
-	value: string | boolean,
+	value: string,
 ) => {

32-35: Module-level mutable counter for entry IDs.

entryIdCounter is module-level mutable state that never resets and is shared across all AddSecretSheet instances. This works because IDs only need uniqueness for React keys, but it's unconventional. If you ever need deterministic IDs (e.g., for testing), consider crypto.randomUUID() or useId() instead.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/EnvironmentVariablesList.tsx (1)

61-163: Data fetching is manual — consider using tRPC's useQuery or React Query directly.

EnvironmentVariablesList manages its own secrets state, isLoading, and manual fetchSecrets with useEffect. This reimplements what React Query (via the tRPC wrapper) provides out of the box — automatic loading/error states, refetching, cache invalidation, and stale-while-revalidate.

Using electronTrpc.project.secrets.getDecrypted.useQuery(...) (or the API tRPC equivalent if this must go through the cloud API) would eliminate the manual state management and the multiple issues identified around loading/error handling.

Comment thread: `apps/api/src/env.ts`
STRIPE_PRO_YEARLY_PRICE_ID: z.string(),
SLACK_BILLING_WEBHOOK_URL: z.string().url(),
STREAMS_URL: z.string().url(),
SECRETS_ENCRYPTION_KEY: z.string().min(1),

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

🏁 Script executed:

find . -name "crypto.ts" -o -name "crypto.js" | head -20

Repository: superset-sh/superset

Length of output: 97


🏁 Script executed:

rg "SECRETS_ENCRYPTION_KEY" -A 3 -B 3

Repository: superset-sh/superset

Length of output: 1498


🏁 Script executed:

rg "AES.*256.*GCM|createCipheriv|createDecipheriv" -A 2 -B 2 --type ts

Repository: superset-sh/superset

Length of output: 2732


🏁 Script executed:

python3 <<'EOF'
import base64

# 32 bytes as base64
test_bytes = b'x' * 32
encoded = base64.b64encode(test_bytes).decode('ascii')
print(f"32 bytes in base64: {len(encoded)} characters")
print(f"Example: {encoded}")
EOF

Repository: superset-sh/superset

Length of output: 151


SECRETS_ENCRYPTION_KEY validation is too permissive, and the previously suggested minimum length was incorrect.

Since the key is base64-encoded and must decode to exactly 32 bytes, the minimum length should be 44 characters (32 bytes base64-encoded), not 32. A string like "abc" would currently pass min(1) validation but fail at runtime.

Additionally, the same issue exists in packages/trpc/src/env.ts.

Suggested fix
-		SECRETS_ENCRYPTION_KEY: z.string().min(1),
+		SECRETS_ENCRYPTION_KEY: z.string().min(44),

Apply the same fix to both apps/api/src/env.ts and packages/trpc/src/env.ts.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
SECRETS_ENCRYPTION_KEY: z.string().min(1),
SECRETS_ENCRYPTION_KEY: z.string().min(44),
🤖 Prompt for AI Agents
In `@apps/api/src/env.ts` at line 44, The SECRETS_ENCRYPTION_KEY zod validation is
too lax (z.string().min(1)) even though the key is base64 and must decode to
exactly 32 bytes; update the validation for SECRETS_ENCRYPTION_KEY to require
the correct base64 length by using z.string().min(44) (and consider adding a
max(44) or a custom refinement if you want to enforce exact length), and make
the identical change to the other env module that also defines
SECRETS_ENCRYPTION_KEY so both places validate the base64 32-byte key
consistently.
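For reference, the 44-character minimum corresponds to the base64 encoding of 32 bytes; a standalone sketch of an exact runtime check (separate from the zod schema, shown here only to illustrate the length math) could look like:

```typescript
// Exact check: the env value must be base64 that decodes to 32 bytes.
// min(44) catches too-short strings; this also rejects 44-char strings
// that do not decode to a full 32-byte key.
function isValidEncryptionKey(raw: string): boolean {
	return Buffer.from(raw, "base64").length === 32;
}
```

A zod `.refine(...)` wrapping this predicate would enforce the exact length rather than just the minimum.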

Comment on lines +29 to +60
const linkProjects = useCallback(() => {
if (!localProjects || !cloudProjects) return;

for (const local of localProjects) {
if (local.neonProjectId || !local.githubOwner) continue;
if (linkingRef.current.has(local.id)) continue;

const repoName = local.mainRepoPath.split("/").pop();
if (!repoName) continue;

const match = cloudProjects.find(
(cloud) =>
cloud.repoOwner === local.githubOwner && cloud.repoName === repoName,
);

if (match) {
linkingRef.current.add(local.id);
linkMutation.mutate(
{ id: local.id, neonProjectId: match.id },
{
onError: () => {
linkingRef.current.delete(local.id);
},
},
);
}
}
}, [localProjects, cloudProjects, linkMutation]);

useEffect(() => {
linkProjects();
}, [linkProjects]);

⚠️ Potential issue | 🟠 Major

linkMutation in dependency array causes unnecessary re-execution of the effect.

The linkMutation object returned by useMutation is not referentially stable — it gets a new identity whenever mutation state changes (e.g., isLoading, isSuccess). Including it in the useCallback dependency array means linkProjects is recreated on every state change, which re-triggers the useEffect. While linkingRef prevents duplicate mutations for the same project, this still causes redundant iteration over all projects on every mutation state transition.

Use only the stable mutate function reference instead:

Proposed fix
-	const linkProjects = useCallback(() => {
+	const mutate = linkMutation.mutate;
+
+	const linkProjects = useCallback(() => {
 		if (!localProjects || !cloudProjects) return;
 
 		for (const local of localProjects) {
 			if (local.neonProjectId || !local.githubOwner) continue;
 			if (linkingRef.current.has(local.id)) continue;
 
 			const repoName = local.mainRepoPath.split("/").pop();
 			if (!repoName) continue;
 
 			const match = cloudProjects.find(
 				(cloud) =>
 					cloud.repoOwner === local.githubOwner && cloud.repoName === repoName,
 			);
 
 			if (match) {
 				linkingRef.current.add(local.id);
-				linkMutation.mutate(
+				mutate(
 					{ id: local.id, neonProjectId: match.id },
 					{
 						onError: () => {
 							linkingRef.current.delete(local.id);
 						},
 					},
 				);
 			}
 		}
-	}, [localProjects, cloudProjects, linkMutation]);
+	}, [localProjects, cloudProjects, mutate]);
🤖 Prompt for AI Agents
In
`@apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts`
around lines 29 - 60, The effect re-runs unnecessarily because linkMutation
(from useMutation) is not stable; change linkProjects to depend on the stable
mutate function instead of the whole linkMutation object: extract the mutate
function (e.g., const { mutate: linkMutate } = linkMutation) and use linkMutate
inside linkProjects (and in its mutate call), then update useCallback and
useEffect dependency arrays to include linkMutate (and localProjects,
cloudProjects) rather than linkMutation; keep linkingRef as-is to avoid
duplicate requests.

Comment on lines +36 to +37
const repoName = local.mainRepoPath.split("/").pop();
if (!repoName) continue;

⚠️ Potential issue | 🟠 Major

Path splitting breaks on Windows — backslash separators are not handled.

mainRepoPath on Windows will use backslashes (e.g., C:\Users\dev\my-repo). Splitting on "/" alone will return the entire path as a single element, so repoName will be the full path, never matching a cloud project.

Proposed fix
-			const repoName = local.mainRepoPath.split("/").pop();
+			const repoName = local.mainRepoPath.split(/[\\/]/).pop();
📝 Committable suggestion


Suggested change
const repoName = local.mainRepoPath.split("/").pop();
if (!repoName) continue;
const repoName = local.mainRepoPath.split(/[\\/]/).pop();
if (!repoName) continue;
🤖 Prompt for AI Agents
In
`@apps/desktop/src/renderer/routes/_authenticated/hooks/useNeonProjectLink/useNeonProjectLink.ts`
around lines 36 - 37, The path split logic in useNeonProjectLink (where you
compute repoName via local.mainRepoPath.split("/").pop()) fails on Windows
because it only splits on forward slashes; change the extraction to handle both
separators by splitting on both "/" and "\" (or use a cross-platform approach
like path.basename) when deriving repoName from local.mainRepoPath, update the
code that checks repoName accordingly, and ensure any places referencing
repoName still behave the same after this normalization.

Comment on lines +198 to +226
const handleSave = async () => {
const validEntries = entries.filter((e) => e.key.trim() && e.value.trim());
if (validEntries.length === 0) return;

setIsSaving(true);
try {
for (const entry of validEntries) {
await apiTrpcClient.project.secrets.upsert.mutate({
projectId,
organizationId,
key: entry.key.trim(),
value: entry.value.trim(),
sensitive,
});
}
toast.success(
validEntries.length === 1
? `Added ${validEntries[0].key.trim()}`
: `Added ${validEntries.length} environment variables`,
);
onSaved();
onOpenChange(false);
} catch (err) {
console.error("[secrets/upsert] Failed to save:", err);
toast.error("Failed to save environment variables");
} finally {
setIsSaving(false);
}
};

⚠️ Potential issue | 🟡 Minor

Partial save on mid-sequence failure — some secrets saved, others lost.

The sequential for loop (lines 204-212) calls upsert one at a time. If the 3rd of 5 entries fails, entries 1-2 are saved, 3-5 are not, and the sheet shows an error toast. However, the sheet closes only on full success (line 219), so the user can retry — and since it's an upsert, re-saving already-saved entries is idempotent. Still, the UX could be confusing: the user retries with all 5 entries visible, but 2 were already committed.

Consider filtering out already-saved entries on retry, or using Promise.allSettled to attempt all and report partial results.
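The `Promise.allSettled` approach can be sketched as a small helper (names hypothetical; the real `handleSave` would pass a wrapper around `apiTrpcClient.project.secrets.upsert.mutate` as the `upsert` argument):

```typescript
type SecretEntry = { key: string; value: string };

interface SaveResult {
	succeeded: SecretEntry[];
	failed: { entry: SecretEntry; error: unknown }[];
}

// Attempt every upsert, collecting per-entry outcomes instead of
// aborting at the first rejection. allSettled never rejects, so all
// entries are tried and the caller can report partial success.
async function saveAll(
	entries: SecretEntry[],
	upsert: (entry: SecretEntry) => Promise<void>,
): Promise<SaveResult> {
	const results = await Promise.allSettled(entries.map((e) => upsert(e)));
	const out: SaveResult = { succeeded: [], failed: [] };
	results.forEach((r, i) => {
		if (r.status === "fulfilled") out.succeeded.push(entries[i]);
		else out.failed.push({ entry: entries[i], error: r.reason });
	});
	return out;
}
```

The caller closes the sheet only when `failed.length === 0`, and can repopulate the form with just the failed entries so a retry doesn't re-send already-saved (though idempotent) upserts.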

🤖 Prompt for AI Agents
In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx`
around lines 198 - 226, handleSave currently upserts each validEntries item
sequentially so a mid-loop failure leaves earlier secrets saved and later ones
not; change handleSave to run all upserts in parallel with Promise.allSettled
against apiTrpcClient.project.secrets.upsert.mutate for each entry (using
entry.key.trim()/entry.value.trim() and sensitive/projectId/organizationId),
then inspect results to separate successes from failures, show a toast that
summarizes partial success (e.g., N added, M failed) and only call onSaved() and
onOpenChange(false) if all upserts succeeded; if some failed, keep the sheet
open and optionally remove the successfully saved entries from
entries/validEntries so retry only attempts failures.

Comment on lines +48 to +62
const handleDelete = useCallback(async () => {
if (!confirm(`Delete environment variable "${secret.key}"?`)) return;
setIsDeleting(true);
try {
await apiTrpcClient.project.secrets.delete.mutate({
id: secret.id,
organizationId,
});
onDeleted();
} catch (err) {
console.error("[secrets/delete] Failed to delete:", err);
} finally {
setIsDeleting(false);
}
}, [secret.id, secret.key, organizationId, onDeleted]);

⚠️ Potential issue | 🟡 Minor

Delete failure is logged but not surfaced to the user.

When apiTrpcClient.project.secrets.delete.mutate fails, the error is logged to the console (line 58) but the user sees no feedback — the row just stops showing the deleting state. Add a toast.error(...) to inform the user.

🛡️ Proposed fix
 	} catch (err) {
 		console.error("[secrets/delete] Failed to delete:", err);
+		toast.error(`Failed to delete "${secret.key}"`);
 	} finally {

(Add import { toast } from "@superset/ui/sonner"; at the top.)

As per coding guidelines, "Never swallow errors silently; at minimum log errors with context before rethrowing or handling them explicitly."

📝 Committable suggestion


Suggested change
const handleDelete = useCallback(async () => {
if (!confirm(`Delete environment variable "${secret.key}"?`)) return;
setIsDeleting(true);
try {
await apiTrpcClient.project.secrets.delete.mutate({
id: secret.id,
organizationId,
});
onDeleted();
} catch (err) {
console.error("[secrets/delete] Failed to delete:", err);
} finally {
setIsDeleting(false);
}
}, [secret.id, secret.key, organizationId, onDeleted]);
const handleDelete = useCallback(async () => {
if (!confirm(`Delete environment variable "${secret.key}"?`)) return;
setIsDeleting(true);
try {
await apiTrpcClient.project.secrets.delete.mutate({
id: secret.id,
organizationId,
});
onDeleted();
} catch (err) {
console.error("[secrets/delete] Failed to delete:", err);
toast.error(`Failed to delete "${secret.key}"`);
} finally {
setIsDeleting(false);
}
}, [secret.id, secret.key, organizationId, onDeleted]);
🤖 Prompt for AI Agents
In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx`
around lines 48 - 62, The catch block in handleDelete currently only
console.errors the failure from apiTrpcClient.project.secrets.delete.mutate and
leaves the user without feedback; add a user-visible error by calling
toast.error(...) inside the catch (including context like the secret.key and the
caught error message), and import toast from "@superset/ui/sonner" at the top of
the file; keep the existing setIsDeleting(false) in the finally block and do not
change onDeleted behavior.

Comment on lines +64 to +68
const handleCopy = useCallback(async () => {
await navigator.clipboard.writeText(secret.value);
setCopied(true);
setTimeout(() => setCopied(false), 1500);
}, [secret.value]);

⚠️ Potential issue | 🟡 Minor

navigator.clipboard.writeText can reject — no error handling.

The clipboard API can fail (e.g., in certain Electron contexts or when the document isn't focused). Wrap in try/catch to avoid an unhandled promise rejection.

🛡️ Proposed fix
 const handleCopy = useCallback(async () => {
-	await navigator.clipboard.writeText(secret.value);
-	setCopied(true);
-	setTimeout(() => setCopied(false), 1500);
+	try {
+		await navigator.clipboard.writeText(secret.value);
+		setCopied(true);
+		setTimeout(() => setCopied(false), 1500);
+	} catch (err) {
+		console.error("[secrets/copy] Clipboard write failed:", err);
+	}
 }, [secret.value]);
🤖 Prompt for AI Agents
In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx`
around lines 64 - 68, The handleCopy callback currently calls
navigator.clipboard.writeText(secret.value) without error handling; wrap the
async clipboard call in a try/catch inside handleCopy (the function defined in
SecretRow.tsx) to catch rejections and avoid unhandled promise rejections, and
on error optionally log or silently handle it (e.g.,
processLogger/console.error) while still setting the copied state only on
success; ensure the dependency on secret.value remains unchanged.

Comment on lines +77 to +92
delete: protectedProcedure
.input(
z.object({ id: z.string().uuid(), organizationId: z.string().uuid() }),
)
.mutation(async ({ ctx, input }) => {
await verifyOrgAdmin(ctx.session.user.id, input.organizationId);
await dbWs
.delete(projects)
.where(
and(
eq(projects.id, input.id),
eq(projects.organizationId, input.organizationId),
),
);
return { success: true };
}),

⚠️ Potential issue | 🟡 Minor

Delete silently succeeds when the project doesn't exist.

If no row matches the WHERE clause, the mutation still returns { success: true }, which misleads the caller. Consider checking the deletion result or using .returning() to confirm a row was actually removed.

🛡️ Proposed fix
-		await dbWs
+		const [deleted] = await dbWs
 			.delete(projects)
 			.where(
 				and(
 					eq(projects.id, input.id),
 					eq(projects.organizationId, input.organizationId),
 				),
-			);
+			)
+			.returning();
+		if (!deleted) {
+			throw new TRPCError({
+				code: "NOT_FOUND",
+				message: "Project not found",
+			});
+		}
 		return { success: true };
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/project.ts` around lines 77 - 92, The delete
mutation (the protectedProcedure block that calls verifyOrgAdmin and
dbWs.delete(projects).where(...)) currently always returns { success: true }
even if no row was deleted; update it to check the deletion result (e.g., use
dbWs.delete(projects).where(...).returning() or inspect the result count) and
only return success when a row was actually removed—otherwise return a failure
(or throw a 404 / not-found error) so callers can distinguish “no-op” from a
real deletion.

@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 3

🤖 Fix all issues with AI agents
In `@.github/workflows/deploy-preview.yml`:
- Line 590: Remove the unused SECRETS_ENCRYPTION_KEY environment variable from
the Admin app deployment job in the workflow: delete the line setting
SECRETS_ENCRYPTION_KEY: ${{ secrets.SECRETS_ENCRYPTION_KEY }} from the Admin
app's env block (the deploy-preview/admin job), ensuring no code in apps/admin
(e.g., apps/admin/src/env.ts) references it; verify by searching for
SECRETS_ENCRYPTION_KEY and confirm only the API app's secrets router uses it.

In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx`:
- Line 148: MAX_FILE_SIZE is declared inside the AddSecretSheet component
causing it to be recreated on every render; move the declaration to module scope
(top of the file) with the other utility functions/constants so it becomes a
single module-level constant (e.g., declare const MAX_FILE_SIZE = 256 * 1024;
alongside existing helpers) and remove the in-component definition so
AddSecretSheet uses the module-level MAX_FILE_SIZE.

In `@packages/trpc/src/router/project/secrets/secrets.ts`:
- Around line 132-140: The per-row mapping in the getDecrypted query uses
decryptSecret(row.encryptedValue) directly so a thrown error on one secret will
abort the whole query; update the rows.map callback (where decryptSecret is
called) to wrap decryption in a try-catch for each row, returning a safe
fallback (e.g., empty string or null) for value and optionally flagging the row
(e.g., add a decryptionError boolean) while still returning the other fields
(id, key, sensitive, createdAt, updatedAt, createdBy) so a single corrupted
secret does not break the entire response.
🧹 Nitpick comments (5)
apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/components/EnvironmentVariablesList/components/SecretRow/SecretRow.tsx (1)

64-68: setTimeout not cleared on unmount — potential state update on unmounted component.

If the component unmounts within the 1500 ms window, setCopied(false) fires on a stale component. Consider using a ref-based cleanup or clearing the timeout on unmount.

Proposed fix
+import { useCallback, useRef, useState, useEffect } from "react";
 ...
 export function SecretRow(...) {
+	const copyTimeoutRef = useRef<ReturnType<typeof setTimeout>>();
+
+	useEffect(() => {
+		return () => {
+			if (copyTimeoutRef.current) clearTimeout(copyTimeoutRef.current);
+		};
+	}, []);
+
 	const handleCopy = useCallback(async () => {
 		await navigator.clipboard.writeText(secret.value);
 		setCopied(true);
-		setTimeout(() => setCopied(false), 1500);
+		copyTimeoutRef.current = setTimeout(() => setCopied(false), 1500);
 	}, [secret.value]);
apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/components/EnvironmentVariablesList/EnvironmentVariablesList.tsx (1)

19-27: The Secret interface is duplicated across components.

This same shape is defined here and implicitly in SecretRow. Consider extracting a shared type (e.g., in a types.ts co-located with the secrets feature) to keep them in sync.
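A co-located `types.ts` could carry the shared shape — the scalar fields below follow the `getDecrypted` return shape visible in this PR, while the creator sub-fields (`name`, avatar `image`) are assumptions based on the PR description:

```typescript
// Hypothetical shared types.ts for the secrets feature, so the list and
// row components import one definition instead of re-declaring it.
export interface SecretCreator {
	name: string | null;
	image: string | null; // avatar URL
}

export interface Secret {
	id: string;
	key: string;
	value: string; // empty string for sensitive secrets (never decrypted)
	sensitive: boolean;
	createdAt: string;
	updatedAt: string;
	createdBy: SecretCreator | null;
}
```

Both `EnvironmentVariablesList` and `SecretRow` would then import `Secret` from this module, and any schema change fails typechecking in one place.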

packages/trpc/src/router/project/secrets/utils/crypto.ts (2)

39-40: Inconsistent encoding specification between update and final.

decipher.update(ciphertext) returns a Buffer (no output encoding specified), while decipher.final("utf8") returns a string. The + operator coerces the Buffer via its default .toString() (which is UTF-8), so it works — but it's clearer to be explicit on both calls.

Suggested improvement
-	return decipher.update(ciphertext) + decipher.final("utf8");
+	return Buffer.concat([decipher.update(ciphertext), decipher.final()]).toString("utf8");

7-14: getKey() re-reads and re-allocates the key buffer on every call.

For a hot path (e.g., decrypting many secrets in getDecrypted), consider caching the derived key lazily. Not a correctness issue at current scale, but worth noting.

Optional caching pattern
+let _cachedKey: Buffer | null = null;
+
 function getKey(): Buffer {
+	if (_cachedKey) return _cachedKey;
 	const raw = process.env.SECRETS_ENCRYPTION_KEY;
 	if (!raw) throw new Error("SECRETS_ENCRYPTION_KEY not set");
 	const key = Buffer.from(raw, "base64");
 	if (key.length !== 32)
 		throw new Error("SECRETS_ENCRYPTION_KEY must be 32 bytes");
+	_cachedKey = key;
 	return key;
 }
apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/SecretsSettings.tsx (1)

50-58: suggestedMatch triggers recomputation on the entire project object.

The dependency array uses [project, cloudProjects]. Since project is a query result object that may be a new reference on every render, this memo may recompute more often than necessary. Consider narrowing to the specific fields used:

Suggested improvement
 }, [
-	project,
-	cloudProjects,
+	project?.neonProjectId,
+	project?.mainRepoPath,
+	project?.githubOwner,
+	cloudProjects,
 ]);

STRIPE_PRO_MONTHLY_PRICE_ID: ${{ secrets.STRIPE_PRO_MONTHLY_PRICE_ID }}
STRIPE_PRO_YEARLY_PRICE_ID: ${{ secrets.STRIPE_PRO_YEARLY_PRICE_ID }}
SLACK_BILLING_WEBHOOK_URL: ${{ secrets.SLACK_BILLING_WEBHOOK_URL }}
SECRETS_ENCRYPTION_KEY: ${{ secrets.SECRETS_ENCRYPTION_KEY }}

⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Check if the admin app references SECRETS_ENCRYPTION_KEY in its source
echo "=== Searching in admin app ==="
rg 'SECRETS_ENCRYPTION_KEY' --type=ts --type=js -g 'apps/admin/**' -l

echo ""
echo "=== Searching in packages ==="
rg 'SECRETS_ENCRYPTION_KEY' --type=ts --type=js -g 'packages/**' -l

echo ""
echo "=== Full search across codebase ==="
rg 'SECRETS_ENCRYPTION_KEY' --type=ts --type=js -l

Repository: superset-sh/superset

Length of output: 346


🏁 Script executed:

#!/bin/bash
# Check Admin app's dependencies and imports
echo "=== Admin app package.json ==="
cat apps/admin/package.json | grep -A 20 -B 5 '"dependencies"'

echo ""
echo "=== Admin app env.ts (if exists) ==="
if [ -f apps/admin/src/env.ts ]; then
  cat apps/admin/src/env.ts | head -50
else
  echo "No env.ts found in admin app"
fi

echo ""
echo "=== Admin app imports of @superset-sh/trpc ==="
rg '@superset-sh/trpc|packages/trpc' apps/admin/ --type=ts --type=js -l

echo ""
echo "=== Check deploy-preview.yml around line 590 ==="
sed -n '580,610p' .github/workflows/deploy-preview.yml

Repository: superset-sh/superset

Length of output: 4734


Remove SECRETS_ENCRYPTION_KEY from Admin app deployment environment.

The Admin app's env schema (apps/admin/src/env.ts) does not include SECRETS_ENCRYPTION_KEY, and no code in the Admin app references it. The key is only used by the API app's secrets management router. Passing unused secrets violates the least-privilege principle.

🤖 Prompt for AI Agents
In `@.github/workflows/deploy-preview.yml` at line 590, Remove the unused
SECRETS_ENCRYPTION_KEY environment variable from the Admin app deployment job in
the workflow: delete the line setting SECRETS_ENCRYPTION_KEY: ${{
secrets.SECRETS_ENCRYPTION_KEY }} from the Admin app's env block (the
deploy-preview/admin job), ensuring no code in apps/admin (e.g.,
apps/admin/src/env.ts) references it; verify by searching for
SECRETS_ENCRYPTION_KEY and confirm only the API app's secrets router uses it.

});
}, []);

const MAX_FILE_SIZE = 256 * 1024; // 256 KB

🛠️ Refactor suggestion | 🟠 Major

MAX_FILE_SIZE should be a module-level constant.

This constant is defined inside the component body, causing it to be re-created on every render. Move it to module scope alongside the other utility functions.

Proposed fix
+const MAX_FILE_SIZE = 256 * 1024; // 256 KB
+
 export function AddSecretSheet({
-	const MAX_FILE_SIZE = 256 * 1024; // 256 KB
-
 	const validateAndReadFile = useCallback(

As per coding guidelines, "Extract hardcoded magic numbers, strings, and enums to named constants at module top instead of leaving them inline in logic."

🤖 Prompt for AI Agents
In `@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/components/AddSecretSheet/AddSecretSheet.tsx`
at line 148, MAX_FILE_SIZE is declared inside the AddSecretSheet component
causing it to be recreated on every render; move the declaration to module scope
(top of the file) with the other utility functions/constants so it becomes a
single module-level constant (e.g., declare const MAX_FILE_SIZE = 256 * 1024;
alongside existing helpers) and remove the in-component definition so
AddSecretSheet uses the module-level MAX_FILE_SIZE.

Comment on lines +132 to +140
return rows.map((row) => ({
id: row.id,
key: row.key,
value: row.sensitive ? "" : decryptSecret(row.encryptedValue),
sensitive: row.sensitive,
createdAt: row.createdAt,
updatedAt: row.updatedAt,
createdBy: row.createdBy ?? null,
}));

⚠️ Potential issue | 🟠 Major

A single corrupted secret breaks the entire getDecrypted query.

If decryptSecret throws on any row (e.g., corrupted ciphertext, key rotation), the unhandled error propagates and the user sees zero secrets. Consider wrapping decryption in a try-catch per row so the remaining secrets are still returned.

Proposed fix
 return rows.map((row) => ({
 	id: row.id,
 	key: row.key,
-	value: row.sensitive ? "" : decryptSecret(row.encryptedValue),
+	value: row.sensitive
+		? ""
+		: (() => {
+				try {
+					return decryptSecret(row.encryptedValue);
+				} catch (err) {
+					console.error(`[secrets/decrypt] Failed to decrypt key="${row.key}":`, err);
+					return "";
+				}
+			})(),
 	sensitive: row.sensitive,
 	createdAt: row.createdAt,
 	updatedAt: row.updatedAt,
 	createdBy: row.createdBy ?? null,
 }));
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/project/secrets/secrets.ts` around lines 132-140,
the per-row mapping in the getDecrypted query uses
decryptSecret(row.encryptedValue) directly so a thrown error on one secret will
abort the whole query; update the rows.map callback (where decryptSecret is
called) to wrap decryption in a try-catch for each row, returning a safe
fallback (e.g., empty string or null) for value and optionally flagging the row
(e.g., add a decryptionError boolean) while still returning the other fields
(id, key, sensitive, createdAt, updatedAt, createdBy) so a single corrupted
secret does not break the entire response.
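The per-row guard can also be factored into a small standalone wrapper. This is an illustrative sketch only; `safeDecrypt` is a hypothetical helper name, and the decrypt function is stubbed here rather than taken from the PR:

```typescript
// Hypothetical wrapper: returns a fallback instead of throwing, so one
// corrupted ciphertext cannot take down the whole getDecrypted response.
function safeDecrypt(
	decrypt: (ciphertext: string) => string,
	ciphertext: string,
	fallback = "",
): { value: string; decryptionError: boolean } {
	try {
		return { value: decrypt(ciphertext), decryptionError: false };
	} catch {
		// Swallow the error per row; callers can surface decryptionError in the UI.
		return { value: fallback, decryptionError: true };
	}
}
```

The `decryptionError` flag lets the UI distinguish "intentionally blank" from "could not decrypt" instead of showing both as empty.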

@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In
`@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/SecretsSettings.tsx`:
- Around line 59-66: The useEffect closes over linkToNeon but lists
linkToNeon.mutate as a dependency and lacks an in-flight guard; change the
dependency to linkToNeon (not linkToNeon.mutate) in the useEffect dependency
array and add a guard so the effect only calls linkToNeon.mutate when
suggestedMatch is truthy and the mutation is not already in-flight (e.g., check
linkToNeon.isLoading / linkToNeon.isMutating or equivalent) while still passing
id: projectId and neonProjectId: suggestedMatch.id to linkToNeon.mutate; keep
references to suggestedMatch and projectId in the dependency array.

Comment on lines +59 to +66
useEffect(() => {
if (suggestedMatch) {
linkToNeon.mutate({
id: projectId,
neonProjectId: suggestedMatch.id,
});
}
}, [suggestedMatch, linkToNeon.mutate, projectId]);

⚠️ Potential issue | 🟠 Major

Incorrect useEffect dependency: linkToNeon.mutate instead of linkToNeon.

Biome correctly flags this: the effect closes over linkToNeon but the dependency list specifies the more-specific linkToNeon.mutate. This can cause stale closure bugs if the mutation object reference updates but .mutate doesn't (or vice-versa). Additionally, there's no guard against re-firing while a mutation is already in-flight.

🔧 Proposed fix
 useEffect(() => {
-	if (suggestedMatch) {
+	if (suggestedMatch && !linkToNeon.isPending) {
 		linkToNeon.mutate({
 			id: projectId,
 			neonProjectId: suggestedMatch.id,
 		});
 	}
-}, [suggestedMatch, linkToNeon.mutate, projectId]);
+}, [suggestedMatch, linkToNeon, projectId]);
🧰 Tools
🪛 Biome (2.3.14)

[error] 59-59: This hook specifies a dependency more specific than its captures: linkToNeon.mutate

This capture is more generic than...

...this dependency.

(lint/correctness/useExhaustiveDependencies)


[error] 59-59: This hook does not specify its dependency on linkToNeon.

This dependency is being used here, but is not specified in the hook dependency list.

React relies on hook dependencies to determine when to re-compute Effects.
Failing to specify dependencies can result in Effects not updating correctly when state changes.
These "stale closures" are a common source of surprising bugs.
Unsafe fix: Add the missing dependency to the list.

(lint/correctness/useExhaustiveDependencies)

🤖 Prompt for AI Agents
In
`@apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/cloud/secrets/components/SecretsSettings/SecretsSettings.tsx`
around lines 59-66, the useEffect closes over linkToNeon but lists
linkToNeon.mutate as a dependency and lacks an in-flight guard; change the
dependency to linkToNeon (not linkToNeon.mutate) in the useEffect dependency
array and add a guard so the effect only calls linkToNeon.mutate when
suggestedMatch is truthy and the mutation is not already in-flight (e.g., check
linkToNeon.isLoading / linkToNeon.isMutating or equivalent) while still passing
id: projectId and neonProjectId: suggestedMatch.id to linkToNeon.mutate; keep
references to suggestedMatch and projectId in the dependency array.

Move environment variables route from secrets/ to cloud/secrets/,
add Cloud collapsible section in sidebar, and add env vars to
settings search.
…and creator info

Overhaul the secrets/environment variables UI to match the reference design:
- Add createdByUserId to secrets schema with user relation
- Return creator info (name, avatar) from getDecrypted query
- Redesign SecretRow with circled icons, centered value reveal, date + avatar
- Add sort dropdown (Last Updated, Name, Type) to EnvironmentVariablesList
- Click-to-copy revealed values with hover tooltip
- Show "Empty" for blank values, neutral badge colors for sensitive
- Add toast notifications on edit success/failure
- Fix migration ordering for DROP CONSTRAINT before DROP TABLE CASCADE
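The sort dropdown listed above reduces to one comparator per mode. The following is a minimal sketch, not the PR's actual implementation; the `Secret` shape, the mode names, and `sortSecrets` are assumptions for illustration:

```typescript
// Hypothetical row shape; the real type comes from the tRPC getDecrypted output.
type Secret = { key: string; sensitive: boolean; updatedAt: Date };
type SortMode = "lastUpdated" | "name" | "type";

function sortSecrets(rows: Secret[], mode: SortMode): Secret[] {
	const sorted = [...rows]; // copy so the cached query result is not mutated
	switch (mode) {
		case "lastUpdated":
			// Newest first
			sorted.sort((a, b) => b.updatedAt.getTime() - a.updatedAt.getTime());
			break;
		case "name":
			sorted.sort((a, b) => a.key.localeCompare(b.key));
			break;
		case "type":
			// Plain values before sensitive ones
			sorted.sort((a, b) => Number(a.sensitive) - Number(b.sensitive));
			break;
	}
	return sorted;
}
```

Copying before sorting matters in React: sorting the query result in place would mutate cached data and can mask re-render bugs.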
@coderabbitai coderabbitai Bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
apps/desktop/src/renderer/stores/settings-state.ts (1)

4-7: ⚠️ Potential issue | 🟡 Minor

Stale JSDoc: still references "workspace".

The comment on line 6 mentions "project/workspace are dynamic" but workspace has been removed from SettingsSection.

Proposed fix
 /**
  * Settings sections available in the settings view.
- * General sections are the main categories, project/workspace are dynamic.
+ * General sections are the main categories, project is dynamic.
  */
🤖 Fix all issues with AI agents
In `@packages/trpc/src/router/workspace/workspace.ts`:
- Around line 24-38: The mutation currently calls verifyOrgMembership(...) but
does not ensure input.projectId belongs to input.organizationId, allowing
cross-org project references; before inserting via
dbWs.insert(workspaces).values(...).returning(), query the projects table (e.g.,
select from projects where id = input.projectId) and assert its organizationId
matches input.organizationId (throw an unauthorized/error if not), then proceed
with the insert; update the mutation to perform this ownership check using the
same DB client used for workspaces so the check is atomic and uses symbols
verifyOrgMembership, dbWs, workspaces and the projects table/selector.
- Around line 36-37: The variable workspace returned after calling .returning()
can be undefined if the returned array is empty; update the code that uses
workspace (the variable named workspace returned from the query that calls
.returning()) to guard against undefined—e.g., check if workspace is falsy and
either throw a descriptive error or return a safe default value before returning
workspace so callers don't receive undefined; ensure the check is colocated with
the .returning() result handling and use the same variable name workspace in the
guard.
🧹 Nitpick comments (3)
apps/desktop/src/renderer/routes/_authenticated/settings/components/SettingsSidebar/ProjectsSettings.tsx (1)

83-127: Consider extracting the repeated nav-link class logic.

The active/inactive className pattern appears three times (Lines 86–91, 97–101, 113–117) with only minor variations. A small helper would reduce duplication and make future style tweaks easier:

const navLinkClass = (active: boolean) =>
  cn(
    "flex items-center gap-2 px-2 py-1 text-sm w-full text-left rounded-md transition-colors",
    active
      ? "bg-accent text-accent-foreground"
      : "text-muted-foreground hover:bg-accent/50 hover:text-accent-foreground",
  );

The Cloud trigger variant drops the bg-accent when active — that one difference can be handled with an optional parameter or kept inline.

apps/desktop/src/renderer/routes/_authenticated/settings/project/$projectId/components/ProjectSettings/ProjectSettings.tsx (1)

27-27: Rename the SettingsSection component to avoid confusion with the SettingsSection type in settings-state.ts.

The component SettingsSection in ProjectSettings and its barrel export share a name with the type SettingsSection (a string union of app settings sections like "account", "organization", etc.). While TypeScript's separate type/value namespaces prevent compile errors, this naming overlap can cause developer confusion and complicate auto-import suggestions. The type is used throughout settings modules for state management, while the component is a UI panel wrapper. Renaming the component to something like SettingsSectionPanel or SettingsPanelCard would clarify their distinct purposes.

packages/trpc/src/router/workspace/workspace.ts (1)

40-55: delete silently succeeds for non-existent workspaces and also lacks project-org authorization.

Similarly to create, the delete mutation should verify the workspace exists under the correct organization. Currently it returns { success: true } even if no rows were deleted. Consider checking the delete result or using .returning() to confirm a row was actually removed.

Comment on lines +24 to +38
.mutation(async ({ ctx, input }) => {
await verifyOrgMembership(ctx.session.user.id, input.organizationId);
const [workspace] = await dbWs
.insert(workspaces)
.values({
projectId: input.projectId,
organizationId: input.organizationId,
name: input.name,
type: input.type,
config: input.config,
createdByUserId: ctx.session.user.id,
})
.returning();
return workspace;
}),

⚠️ Potential issue | 🟠 Major

Missing authorization: projectId is not verified to belong to the given organizationId.

A member of organization A could pass a projectId belonging to organization B. The DB foreign keys validate each column independently (projects.id exists, organizations.id exists) but don't enforce that the project belongs to the specified organization.

Add a check that the project belongs to the organization before inserting, e.g.:

Proposed fix
 .mutation(async ({ ctx, input }) => {
   await verifyOrgMembership(ctx.session.user.id, input.organizationId);
+
+  const project = await dbWs.query.projects.findFirst({
+    where: and(
+      eq(projects.id, input.projectId),
+      eq(projects.organizationId, input.organizationId),
+    ),
+  });
+  if (!project) {
+    throw new TRPCError({
+      code: "NOT_FOUND",
+      message: "Project not found in this organization",
+    });
+  }
+
   const [workspace] = await dbWs
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/workspace/workspace.ts` around lines 24-38, the
mutation currently calls verifyOrgMembership(...) but does not ensure
input.projectId belongs to input.organizationId, allowing cross-org project
references; before inserting via
dbWs.insert(workspaces).values(...).returning(), query the projects table (e.g.,
select from projects where id = input.projectId) and assert its organizationId
matches input.organizationId (throw an unauthorized/error if not), then proceed
with the insert; update the mutation to perform this ownership check using the
same DB client used for workspaces so the check is atomic and uses symbols
verifyOrgMembership, dbWs, workspaces and the projects table/selector.

Comment on lines +36 to +37
.returning();
return workspace;

⚠️ Potential issue | 🟡 Minor

workspace may be undefined if .returning() yields an empty array.

While unlikely, destructuring from an empty array silently returns undefined to the caller. Consider adding a guard.

Proposed fix
   const [workspace] = await dbWs
     .insert(workspaces)
     .values({ ... })
     .returning();
+  if (!workspace) {
+    throw new TRPCError({
+      code: "INTERNAL_SERVER_ERROR",
+      message: "Failed to create workspace",
+    });
+  }
   return workspace;
🤖 Prompt for AI Agents
In `@packages/trpc/src/router/workspace/workspace.ts` around lines 36-37, the
variable workspace returned after calling .returning() can be undefined if the
returned array is empty; update the code that uses workspace (the variable named
workspace returned from the query that calls .returning()) to guard against
undefined—e.g., check if workspace is falsy and either throw a descriptive error
or return a safe default value before returning workspace so callers don't
receive undefined; ensure the check is colocated with the .returning() result
handling and use the same variable name workspace in the guard.

@saddlepaddle saddlepaddle merged commit 2909c4a into main Feb 13, 2026
14 of 15 checks passed
@Kitenite Kitenite deleted the sandbox-integration branch February 16, 2026 00:04
