Conversation
…up isn't blocked on polar set up
…ries

- Add autofill button (definition, root, morphology) on both add and edit pages
- Add separate "Generate example" button in the examples section
- Use gpt-4o via Cloudflare AI Gateway with caching for autofill, no-cache for examples
- Add layered rate limiting (5/min, 50/hr, 150/day) with array support in middleware
- Gate AI features behind Pro plan with tooltips for non-Pro users
- Add error toasts for rate limits and failures
- Fix inflection/gender Select components to use controlled `value` prop
- Improve AI prompt: Arabic-only definitions, attested plurals only, omit gender for masadir
- Remove unused AI adapter packages (anthropic, gemini, openai)
- Add shimmer loading animation
- Translate all new strings to Arabic

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
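The layered rate limiting described above (5/min, 50/hr, 150/day) can be sketched as a set of sliding windows that a request must clear simultaneously. This is an illustrative stand-alone helper; `createLayeredLimiter` and its shape are assumptions, not the middleware's actual API:

```typescript
// Sketch of layered rate limiting: a request passes only if every window
// still has headroom, and each window keeps its own sliding log of hits.
type Window = { limit: number; ms: number; hits: number[] };

const createLayeredLimiter = (layers: { limit: number; ms: number }[]) => {
  const windows: Window[] = layers.map((l) => ({ ...l, hits: [] }));
  return (now: number): boolean => {
    // Drop timestamps that have aged out of each window.
    for (const w of windows) {
      w.hits = w.hits.filter((t) => now - t < w.ms);
    }
    // Fail closed if any layer is already at its limit.
    if (windows.some((w) => w.hits.length >= w.limit)) return false;
    for (const w of windows) w.hits.push(now);
    return true;
  };
};

// The limits named in the commit: 5/min, 50/hr, 150/day.
const allow = createLayeredLimiter([
  { limit: 5, ms: 60_000 },
  { limit: 50, ms: 3_600_000 },
  { limit: 150, ms: 86_400_000 },
]);
```

Checking all layers before recording the hit means a rejected request does not consume quota in the longer windows.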
Add revlog recording for flashcard reviews and backlog clearing via new stats API routes. Fix Eden Treaty input types resolving as `any` by installing @types/bun in the web app (elysiajs/eden#189) and pinning elysia to 1.4.27 to deduplicate the package. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Replace raw SQL queries and prepared statements with Drizzle's query builder for the clearBacklog generator. This gives us proper type inference on query results (e.g. direction is now correctly typed without manual casting). Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The fuzzy tolerance was calculated from the raw input length, but Orama applies it against stemmed tokens in the index. Arabic words that stem shorter (e.g. مهرجان→مهرج) got tolerance=2 on a 4-char stem, matching nearly everything. Also replaced exact:true with tolerance:0 since exact:true bypasses stemming on the query and never matched stemmed index tokens. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
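One way to express the fix is to size the tolerance from the token Orama will actually match against (the stemmed form) rather than the raw input. A minimal sketch, in which `toleranceFor` and its length thresholds are illustrative assumptions and `stem` stands in for whatever stemmer the index is configured with:

```typescript
// Derive fuzzy tolerance from the stemmed token length, not the raw input.
// مهرجان (6 chars) stems to مهرج (4 chars), so tolerance is sized for 4.
const toleranceFor = (term: string, stem: (word: string) => string): number => {
  const stemmed = stem(term);
  if (stemmed.length <= 4) return 0; // short stems: require an exact token match
  if (stemmed.length <= 7) return 1;
  return 2;
};
```

Returning `tolerance: 0` for short stems keeps the query on the same stemming path as the index, unlike `exact: true`, which bypasses stemming and never matches stemmed index tokens.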
sync-wasm doesn't support ON DELETE CASCADE, so flashcard rows were orphaned when a dictionary entry was deleted. This caused empty flashcards to appear in the review queue until the next remote sync. Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
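The workaround amounts to deleting child rows by hand before the parent. A stand-alone sketch, where `deleteEntryWithFlashcards` and the minimal `db.run` interface are illustrative rather than the app's actual data layer:

```typescript
// Explicit cascade: since the sync-wasm driver ignores ON DELETE CASCADE,
// child flashcards are deleted by hand before the parent dictionary entry
// so no orphaned cards reach the review queue.
type MiniDb = { run: (sql: string, params: unknown[]) => Promise<void> };

async function deleteEntryWithFlashcards(db: MiniDb, entryId: string): Promise<void> {
  // Children first: flashcards reference dictionary_entries.id.
  await db.run("DELETE FROM flashcards WHERE dictionary_entry_id = ?;", [entryId]);
  await db.run("DELETE FROM dictionary_entries WHERE id = ?;", [entryId]);
}
```

Deleting children first means a failure partway through leaves an intact parent rather than orphaned flashcards.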
Note: Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. This behavior can be configured in the review settings.
📝 Walkthrough

Adds Polar payments (checkout/portal/webhooks) and subscription fields/migrations; introduces AI autocomplete/examples endpoints with per-user rate limits; records revlogs via new stats endpoints and client calls; invalidates sessions in Redis on subscription changes; extends UI for billing/pricing and AI features; updates i18n and dependencies.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User as User
    participant Web as Web Client
    participant Auth as Auth Client
    participant API as API Server
    participant DB as Database
    participant Redis as Redis
    participant Polar as Polar Service

    User->>Web: Click "Upgrade to Pro" / open checkout
    Web->>Auth: authClient.checkout("pro")
    Auth->>Polar: Create checkout session
    Polar->>User: Checkout flow (redirect)
    User->>Web: Return after checkout
    Polar->>API: POST /webhooks/polar (subscription.updated)
    API->>API: webhook handler parses payload, maps product→plan/status
    API->>DB: UPDATE users SET plan, subscriptionStatus WHERE id=...
    API->>Redis: Delete user's session tokens
    API->>Web: (client refetch) session invalidated -> refreshed
    Web->>Web: Navigate to /checkout-success and show confirmation
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)
✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Actionable comments posted: 8
🧹 Nitpick comments (9)
.gitignore (1)
82-87: Consider removing developer-specific entries from the project gitignore.

These additions (`.claude/skills/socratic-tutor/`, `.memelord/`, `.mcp.json`) appear unrelated to the Polar payments feature and seem to be personal development tooling. Personal or local development files are typically better managed in:

- `.git/info/exclude` (local to your repository clone)
- Your global gitignore (`~/.gitignore_global`)

This keeps the project's `.gitignore` focused on files that all contributors need to ignore. Additionally, `.memelord/` uses informal naming that may not be appropriate for a production codebase.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.gitignore around lines 82 - 87, The .gitignore contains developer-specific entries (.claude/skills/socratic-tutor/, .memelord/, .mcp.json) that are not part of the Polar payments feature; remove these personal/local entries from .gitignore and instead add them to the developer's local excludes (.git/info/exclude) or global gitignore (~/.gitignore_global), leaving only project-wide ignores (e.g., .astro/) in the repository .gitignore to keep it focused and professional.

.zed/settings.json (1)
1-22: Configuration unrelated to PR objectives.

This Zed editor configuration is not directly related to the Polar payments integration described in the PR objectives. While the configuration itself is valid and uses Biome appropriately, editor-specific settings like these are typically either:

- Excluded from version control (via `.gitignore`) as they represent personal developer preferences, or
- Documented in a team decision if standardizing tooling across all contributors

Consider whether this file should be part of this PR or moved to a separate housekeeping/tooling PR.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.zed/settings.json around lines 1 - 22, The .zed/settings.json editor config is unrelated to the Polar payments work; remove it from this PR by reverting the addition of ".zed/settings.json" (or move the changes into a separate housekeeping/tooling PR), and if you want to keep it for the repo-wide standard, add the file via a dedicated tooling PR with team agreement and/or add an entry to .gitignore to keep personal Zed settings out of VCS; locate the file by its root filename and the keys like "languages" and "code_actions_on_format" in the diff to make the change.

apps/api/src/middleware/http-logger.ts (1)
52-63: Consider refactoring nested ternaries into a mapping object for clarity.

The deeply nested ternary is difficult to read and maintain. A lookup object would be clearer and easier to extend:

♻️ Suggested refactor

```diff
-  // Map Elysia error codes to HTTP status codes
-  const statusCode =
-    code === "NOT_FOUND"
-      ? 404
-      : code === "VALIDATION"
-        ? 400
-        : code === "PARSE"
-          ? 400
-          : code === "INVALID_COOKIE_SIGNATURE"
-            ? 400
-            : typeof set.status === "number"
-              ? set.status
-              : 500;
+  // Map Elysia error codes to HTTP status codes
+  const errorCodeToStatus: Record<string, number> = {
+    NOT_FOUND: 404,
+    VALIDATION: 400,
+    PARSE: 400,
+    INVALID_COOKIE_SIGNATURE: 400,
+  };
+
+  const statusCode =
+    errorCodeToStatus[code] ??
+    (typeof set.status === "number" ? set.status : 500);
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/middleware/http-logger.ts` around lines 52 - 63, The nested ternary computing statusCode using variables code and set.status is hard to read; replace it with a clear mapping lookup (e.g., a constant object mapping error codes like "NOT_FOUND","VALIDATION","PARSE","INVALID_COOKIE_SIGNATURE" to their numeric HTTP statuses) and then compute statusCode by checking that map first and falling back to typeof set.status === "number" ? set.status : 500; update the code site that references statusCode accordingly; ensure you keep the same precedence (explicit code mappings override set.status only when present).

apps/web/src/routes/_unauthorized-layout/login/route.lazy.tsx (1)
70-83: Consider edge case when redirect starts with `#`.

If `redirect` is somehow just a hash fragment (e.g., `#flashcards`), `split("#")` will return `["", "flashcards"]`, making `to` an empty string. While unlikely given the route guard context snippets show `location.href` is used, TanStack Router may behave unexpectedly with an empty `to`. A safer approach would validate `to` before navigating:

🛡️ Optional defensive check

```diff
 const [to, hash] = definedRedirect.split("#");

 return (
   <Form {...form}>
     <OTPForm
       onVerifyOTP={() => {
         navigate({
-          to,
+          to: to || "/",
           hash,
           replace: true,
           resetScroll: true,
         });
       }}
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/routes/_unauthorized-layout/login/route.lazy.tsx` around lines 70 - 83, The redirect parsing can produce an empty `to` when `definedRedirect` is only a hash (e.g., "#flashcards"); update the OTPForm onVerifyOTP handler to validate the parsed values: keep using `definedRedirect.split("#")` to extract `to` and `hash`, but if `to` is an empty string set a safe fallback (e.g., "/") before calling `navigate`; adjust the `onVerifyOTP` closure where `navigate({ to, hash, replace: true, resetScroll: true })` is invoked to use the validated `to`.

biome.jsonc (1)
69-70: Disabling these rules may mask legitimate issues.

- `useAwait: "off"` — Async functions that don't await anything may indicate forgotten awaits or unnecessary async declarations. Consider keeping this at `"info"` to catch accidental omissions.
- `useIterableCallbackReturn: "off"` — Array methods like `map`/`filter` expecting return values could silently produce `undefined` results.

If these are disabled to address specific false positives, consider using inline `// biome-ignore` comments at those locations instead of globally disabling the rules.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@biome.jsonc` around lines 69 - 70, The biome config globally disables two useful rules—useAwait and useIterableCallbackReturn—which can hide real bugs; instead re-enable them (set them to "info" or remove the "off" entries) and if you had specific false positives suppress only those sites with inline biome-ignore comments; target the rule names useAwait and useIterableCallbackReturn in your change, or add localized // biome-ignore for the exact functions/lines that triggered false positives.

README.md (1)
25-25: Consider using more descriptive link text.

The link text "here" is less accessible and SEO-friendly than descriptive text.

📝 Suggested improvement

```diff
-7. If you need to test payments, make sure to set up a Cloudflare Tunnel by reading the instructions [here](./apps/api/README.md#local-development).
+7. If you need to test payments, make sure to set up a Cloudflare Tunnel by reading the [Local Development instructions in the API README](./apps/api/README.md#local-development).
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@README.md` at line 25, Replace the non-descriptive link text "here" with a descriptive phrase that communicates the destination; update the README line that currently reads "reading the instructions [here](./apps/api/README.md#local-development)" to something like "reading the local development instructions for Cloudflare Tunnel" (or similar) while keeping the same target URL, so the link text is accessible and SEO-friendly.

apps/marketing/src/components/PricingCards.tsx (1)
120-128: Consider using unique identifiers for list keys instead of feature text.

Using the feature string as a key works but could cause issues if features have duplicate text:

```tsx
// Current: Using feature text as key
{free.features.map((feature) => (
  <li key={feature}>
```

For robustness, consider using the index (acceptable here since the list is static) or pass unique IDs with features.

Also applies to: 152-165

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/marketing/src/components/PricingCards.tsx` around lines 120 - 128, The list mapping in PricingCards uses the feature string as the React key (e.g., the free.features.map callback rendering <li key={feature}>), which can cause duplicate-key issues; update both occurrences (the free.features mapping and the other plan.features mapping around the later block) to use a stable unique key instead — either include an id on each feature object and use that (preferred) or, if the feature arrays are static, use the array index from map((feature, idx) => ...) as the key; ensure you update the key prop in the <li> elements and adjust the data shape (feature -> { id, label }) where applicable.

apps/api/drizzle/0013_damp_scarecrow.sql (1)
1-2: LGTM! Safe additive migration for subscription fields.

The migration correctly adds nullable columns for subscription data. Consider whether an index on `plan` or `subscriptionStatus` would benefit query performance if you'll be filtering users by subscription status frequently.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/drizzle/0013_damp_scarecrow.sql` around lines 1 - 2, Add optional indexes for faster filtering on subscription fields: if you expect frequent queries filtering by users.plan or users.subscriptionStatus, update the migration that alters the users table to also create indexes on the new columns (e.g., CREATE INDEX on plan and on subscriptionStatus) or add a small follow-up migration that creates these indexes; reference the table name users and the new column names plan and subscriptionStatus when implementing the index creation to ensure queries benefit from improved performance.

apps/api/src/auth.ts (1)
9-9: Replace deep import from SDK's build output with the public export.

Change:

```ts
import type { WebhookSubscriptionUpdatedPayload } from "@polar-sh/sdk/dist/commonjs/models/components/webhooksubscriptionupdatedpayload";
```

To:

```ts
import type { WebhookSubscriptionUpdatedPayload } from "@polar-sh/sdk/models/components/webhooksubscriptionupdatedpayload";
```

The public `models/components/` namespace is the intended export path; importing from `dist/commonjs/` reaches into build artifacts and risks breakage from packaging changes.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/auth.ts` at line 9, Replace the deep build-artifact import of the WebhookSubscriptionUpdatedPayload type with the SDK's public export: locate the import for WebhookSubscriptionUpdatedPayload and change the module specifier that points into "@polar-sh/sdk/dist/commonjs/..." to the public models/components namespace (i.e. the "@polar-sh/sdk/models/components/webhooksubscriptionupdatedpayload" export) so the code imports the type from the SDK's supported public API rather than its build output.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/auth.ts`:
- Around line 338-347: The code sets plan via a small IIFE using product.id and
returns null for unknown SKUs which allows updates to silently strip
entitlements; change this to fail closed by validating product.id before
proceeding: inside the block that computes plan (reference the variable plan and
product.id) if product.id is not one of config.POLAR_PRO_PRODUCT_ID or
config.POLAR_PRO_ANNUAL_PRODUCT_ID then throw or return an error (or log and
abort the update) instead of returning null so the update path is not executed
for unrecognized SKUs; apply the same guard to the other occurrence around the
375-380 logic so any unknown product.id aborts the update and surfaces a clear
error.
- Around line 324-325: The successUrl is pinned to allowedDomains[0] which is
always the apex host; update the successUrl construction in apps/api/src/auth.ts
(the successUrl setting where authenticatedUsersOnly is set) to use the actual
request origin or a canonical web origin config instead of allowedDomains[0].
Concretely: derive origin from the incoming request (or a dedicated canonical
config / env var like WEB_CANONICAL_ORIGIN) and build
`${origin}/checkout-success`, falling back to allowedDomains[0] only if no
request origin or canonical value is available so host-only auth cookies from
www vs apex aren’t lost.
- Around line 349-355: The webhook logger is including long-lived PII by adding
customer.email to the child logger context; remove the email from the
logger.child call in the code that builds childLogger (the logger.child
invocation with LogCategory.APPLICATION and fields customerId, checkoutId,
timestamp, email, plan) so that only non-PII identifiers (customerId,
checkoutId, timestamp, plan) remain in the context; ensure no other places
nearby resurrect customer.email into logs for the same webhook flow.
In `@apps/web/src/components/features/settings/BillingSettingsCard.tsx`:
- Around line 33-35: The billing action handlers (e.g.,
handleManageSubscription) must be made resilient: add a local pending state
(e.g., const [isPaymentPending, setIsPaymentPending] = useState(false)) and
early-return if pending to de-duplicate clicks; wrap the async call
(authClient.customer.portal()) in try/catch/finally, set isPaymentPending true
before the call and false in finally, and handle errors by mapping to a
structured error object and logging/reporting it (e.g., console.error or toast)
with contextual text; apply the same pattern to the other payment-related
handlers referenced in this file (the checkout/portal handlers around lines
~75-97) so all payment actions use guarded async handling and click
de-duplication.
In
`@apps/web/src/routes/_authorized-layout/_app-layout/checkout-success/route.lazy.tsx`:
- Around line 19-21: The unguarded async refetch in the useEffect should be
wrapped in an async function with a try/catch so failures are handled
explicitly; inside the effect, create and immediately invoke an async function
that awaits refetch({ query: { disableCookieCache: true } }) and on error catch
the thrown value (declare it as unknown), narrow it (e.g., if (err instanceof
Error) use err.message else String(err)), then log or report the structured
error (console.error or the app logger) and ensure the catch does not rethrow so
checkout success rendering isn’t blocked; update the useEffect block referencing
useEffect and refetch accordingly.
In `@apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx`:
- Around line 61-64: The useEffect that reads window.location.hash currently
uses document.querySelector(window.location.hash) which can throw for hashes
that are not valid CSS selectors; change it to extract the id by removing the
leading '#' and decoding the hash (e.g., const id =
decodeURIComponent(window.location.hash.slice(1))), then use
document.getElementById(id) to locate the element and call el?.scrollIntoView({
behavior: "smooth" }); in the same useEffect (refer to the existing
useEffect/block where window.location.hash is used).
In `@packages/i18n/locales/en.po`:
- Around line 758-760: Update the user-facing string in the settings route
source: locate the msgid in
apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx (the
export/import dictionary help text) and correct the typo "flaschards" to
"flashcards", then re-run the i18n extraction to update
packages/i18n/locales/en.po so the msgid and msgstr reflect the corrected text.
In `@packages/web-ui/package.json`:
- Line 27: Update the dependency entry for "@radix-ui/react-slot" in
package.json from "^1.2.3" to "^1.2.4": locate the package.json dependency list,
find the "@radix-ui/react-slot" key and change its version string to "^1.2.4",
then run your package manager (npm/yarn/pnpm) to install and update the
lockfile.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: b13eef6f-1e2a-42c6-ab88-0d962c94f1d3
⛔ Files ignored due to path filters (1)
`pnpm-lock.yaml` is excluded by `!**/pnpm-lock.yaml`
📒 Files selected for processing (41)
- .gitignore
- .zed/settings.json
- README.md
- apps/api/.example.env
- apps/api/README.md
- apps/api/drizzle/0013_damp_scarecrow.sql
- apps/api/drizzle/meta/0013_snapshot.json
- apps/api/drizzle/meta/_journal.json
- apps/api/package.json
- apps/api/src/auth.ts
- apps/api/src/clients/polar.ts
- apps/api/src/db/schema/auth.ts
- apps/api/src/middleware/http-logger.ts
- apps/api/src/utils/config.ts
- apps/marketing/src/components/Features.astro
- apps/marketing/src/components/Header.astro
- apps/marketing/src/components/Pricing.astro
- apps/marketing/src/components/PricingCards.tsx
- apps/marketing/src/i18n/translations/ar.json
- apps/marketing/src/i18n/translations/en.json
- apps/marketing/src/pages/ar/index.astro
- apps/marketing/src/pages/index.astro
- apps/web/package.json
- apps/web/src/components/LanguageMenu.tsx
- apps/web/src/components/ThemeMenu.tsx
- apps/web/src/components/features/settings/BillingSettingsCard.tsx
- apps/web/src/components/features/settings/FlashcardSettingsCardSection.tsx
- apps/web/src/lib/auth-client.ts
- apps/web/src/routeTree.gen.ts
- apps/web/src/routes/_authorized-layout/_app-layout/checkout-success/route.lazy.tsx
- apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx
- apps/web/src/routes/_unauthorized-layout/login/route.lazy.tsx
- bahar.pen
- biome.jsonc
- marketing.pen
- packages/drizzle-user-db-schemas/package.json
- packages/i18n/locales/ar.po
- packages/i18n/locales/ar.ts
- packages/i18n/locales/en.po
- packages/i18n/locales/en.ts
- packages/web-ui/package.json
- ts-ignore Polar SDK deep import that fails under Bundler moduleResolution
- Add explicit return in userRateLimit middleware to satisfy noImplicitReturns
- Cast response type in http-logger to access status property
- Guard against undefined type in autofill mutation for add/edit routes

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Deploying bahar with Cloudflare Pages

| | |
|---|---|
| Latest commit: | c44e81e |
| Status: | ✅ Deploy successful! |
| Preview URL: | https://52bd2489.bahar-5xu.pages.dev |
| Branch Preview URL: | https://feat-polar-payments.bahar-5xu.pages.dev |
Deploying with Cloudflare Workers

| Status | Name | Latest Commit | Preview URL | Updated (UTC) |
|---|---|---|---|---|
| ✅ Deployment successful! View logs | bahar-marketing | bf33449 | Commit Preview URL | Apr 02 2026, 12:55 AM |
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
apps/web/src/components/features/flashcards/FlashcardDrawer/FlashcardDrawer.tsx (1)
195-208: ⚠️ Potential issue | 🟠 Major

Keep grading local-first; this revlog POST can desync reviews.

This new request now sits between the optimistic `setCards(...)` update and `updateFlashcard(...)`, but its result is never checked. A rejected request can leave the current card removed from local state without ever being rescheduled, and a resolved `{ error }` response can still advance the flashcard while silently dropping the revlog. Please either make the revlog write best-effort after the local update, or gate the local mutation on an explicit success check.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/components/features/flashcards/FlashcardDrawer/FlashcardDrawer.tsx` around lines 195 - 208, The revlog POST (api.stats.revlogs.post) is currently between the optimistic local state change (setCards) and the persistent update (updateFlashcard), which can desync reviews on network failure; change the flow so grading is local-first: call setShowAnswer(false) and setCards(...) immediately, then await updateFlashcard({ id: currentCard.id, updates: localUpdates }) to persist the flashcard change, and only after that fire api.stats.revlogs.post(...) as a best-effort (wrap in try/catch and log errors) so a failing revlog does not leave the card removed or silently advance state; reference setShowAnswer, setCards, updateFlashcard, api.stats.revlogs.post, log, and currentCard when making the change.
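The local-first ordering this comment asks for can be sketched generically: persist the grade first, then record the revlog as best-effort. This is an illustrative helper; `gradeCard` and its parameters are assumptions, not the component's actual API:

```typescript
// Local-first grading: the flashcard update is persisted before the revlog
// POST, and a failed revlog write is logged but never rolls back grading.
async function gradeCard(
  persistFlashcard: () => Promise<void>,
  postRevlog: () => Promise<void>,
  logError: (err: unknown) => void,
): Promise<void> {
  // Reschedule/persist the card before anything network-bound.
  await persistFlashcard();
  try {
    // Best-effort: stats can be rebuilt later from the source of truth.
    await postRevlog();
  } catch (err) {
    logError(err); // swallow: a revlog failure must not desync the queue
  }
}
```

The alternative the reviewer mentions, gating the local mutation on an explicit success check, trades offline resilience for stricter stats consistency.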
♻️ Duplicate comments (2)
packages/i18n/locales/en.po (1)
770-772: ⚠️ Potential issue | 🟡 Minor

The export copy still says “flaschards”.

This typo is still present in the English catalog, so it will ship again in settings. Please fix the source string and re-extract the messages so `msgid`/`msgstr` stay in sync.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/i18n/locales/en.po` around lines 770 - 772, Fix the typo "flaschards" to "flashcards" in the source string used by the settings route (the string shown in ../../apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx) by updating the original source text so that the msgid reads "Exporting dictionary with flashcards..." and then re-run the extraction/localization pipeline so msgid and msgstr remain in sync across packages/i18n/locales/en.po.apps/web/src/routes/_authorized-layout/_app-layout/dictionary/edit/$wordId.tsx (1)
350-445: ⚠️ Potential issue | 🟡 Minor

Same in-flight `type` race as the add route autofill flow.

This mutation posts one `type` and later applies morphology based on a fresh `form.getValues("type")`. If the user changes category before the response returns, the data can land in the wrong branch or be discarded.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/routes/_authorized-layout/_app-layout/dictionary/edit/`$wordId.tsx around lines 350 - 445, The autofill mutation reads type twice causing a race: capture the type at request time and use that captured value in onSuccess instead of calling form.getValues("type") again; e.g., in mutationFn save const requestedType = type (or return { data, requestedType }) and then in onSuccess reference requestedType to decide whether to apply ism or fi'l morphology and set form values, ensuring the morphology is applied to the same category that was sent to api.ai.autocomplete.post rather than the current form state.
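The race described here has a small general shape: snapshot the mutable value before the request and thread it into the success handler, instead of re-reading form state after the response arrives. A minimal sketch, with `autofillWithSnapshot` and its parameters as illustrative names rather than the route's real code:

```typescript
// Snapshot-then-apply: the type is captured once, before the await, so a
// mid-flight change to the form cannot route the response to the wrong branch.
async function autofillWithSnapshot<T, R>(
  readType: () => T,
  request: (type: T) => Promise<R>,
  applyResult: (result: R, requestedType: T) => void,
): Promise<void> {
  const requestedType = readType(); // captured at request time
  const result = await request(requestedType);
  // Use the captured type, not readType() again: the user may have
  // changed the category while the request was in flight.
  applyResult(result, requestedType);
}
```

The success handler can then compare `requestedType` against the current form value and discard the response if they no longer match.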
🧹 Nitpick comments (7)
apps/web/src/hooks/db/index.ts (1)
79-86: Inconsistent async/await pattern with other hooks in this file.

The `onSuccess` callback doesn't await `invalidateQueries`, unlike `useAddDictionaryEntry` (line 20) and `useEditDictionaryEntry` (line 112), which both use `async` and `await`. For consistency and predictable cache invalidation timing:

♻️ Proposed fix for consistency

```diff
-      onSuccess: () => {
-        queryClient.invalidateQueries({
+      onSuccess: async () => {
+        await queryClient.invalidateQueries({
           queryKey: flashcardsTable.today.cacheOptions.queryKey,
         });
-        queryClient.invalidateQueries({
+        await queryClient.invalidateQueries({
           queryKey: flashcardsTable.counts.cacheOptions.queryKey,
         });
       },
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/hooks/db/index.ts` around lines 79 - 86, The onSuccess callback that currently calls queryClient.invalidateQueries twice should follow the same async/await pattern used in useAddDictionaryEntry and useEditDictionaryEntry: make the onSuccess handler async and await both queryClient.invalidateQueries calls (the ones using flashcardsTable.today.cacheOptions.queryKey and flashcardsTable.counts.cacheOptions.queryKey) so cache invalidation timing is consistent and predictable with the other hooks.
apps/web/src/lib/db/operations/dictionary-entries.ts (1)
277-285: Consider wrapping both deletes in a transaction for atomicity.
The explicit flashcard deletion is correctly implemented given the sync-wasm CASCADE limitation. However, if the dictionary entry delete fails after flashcards are deleted, the database would be left in an inconsistent state (flashcards deleted but dictionary entry still exists).
Based on the drizzle database instance (see apps/web/src/lib/db/index.ts), transaction control is supported via `.run()`.
♻️ Proposed fix to wrap operations in a transaction
  if (!res) {
    throw new Error(`Dictionary entry not found: ${id}`);
  }
  // Explicitly delete flashcards since sync-wasm doesn't support
  // ON DELETE CASCADE (foreign key actions other than NO ACTION)
+ await db.run("BEGIN TRANSACTION;");
+ try {
    await db
      .prepare("DELETE FROM flashcards WHERE dictionary_entry_id = ?;")
      .run([id]);
    await db
      .prepare("DELETE FROM dictionary_entries WHERE id = ?;")
      .run([id]);
+   await db.run("COMMIT;");
+ } catch (txErr) {
+   await db.run("ROLLBACK;");
+   throw txErr;
+ }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/lib/db/operations/dictionary-entries.ts` around lines 277 - 285, Wrap the two DELETEs in a single atomic transaction so both succeed or neither does: execute the flashcard delete (the prepare("DELETE FROM flashcards WHERE dictionary_entry_id = ?;").run([id])) and the dictionary_entry delete (the prepare("DELETE FROM dictionary_entries WHERE id = ?;").run([id])) inside a single db transaction call (use the db.run transaction wrapper supported by the drizzle instance) so that failures roll back and leave the DB consistent.
packages/search/src/database.ts (1)
162-165: Consider reducing explanatory comments by extracting a helper.
The long inline comments can be replaced with a small helper (e.g., `getStemmedLengthForTolerance`) to keep intent in code and reduce comment density.
As per coding guidelines, "Write self-documenting code and avoid overuse of comments".
Also applies to: 179-181
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/search/src/database.ts` around lines 162 - 165, Extract the stemmed-length-for-tolerance logic into a small, well-named helper (e.g., getStemmedLengthForTolerance) and replace the long explanatory comments with calls to that helper where tolerance is computed; update both occurrences around the current inline comment block and the later similar block (the ones calculating tolerance based on stemmed token length) to call getStemmedLengthForTolerance(tokenOrWord) and let the helper encapsulate the explanation and behavior so the call site stays self-documenting.
apps/web/src/lib/api.ts (1)
5-5: Consider validating the environment variable instead of using non-null assertion.
Using `!` suppresses TypeScript's null check but doesn't prevent runtime issues if `VITE_API_BASE_URL` is undefined. If the variable is missing, `treaty()` will receive `undefined`, potentially causing unclear errors.
💡 Suggested validation pattern
-export const api = treaty<App>(import.meta.env.VITE_API_BASE_URL!, {
+const apiBaseUrl = import.meta.env.VITE_API_BASE_URL;
+if (!apiBaseUrl) {
+  throw new Error("VITE_API_BASE_URL environment variable is required");
+}
+
+export const api = treaty<App>(apiBaseUrl, {
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/web/src/lib/api.ts` at line 5, The code uses a non-null assertion on import.meta.env.VITE_API_BASE_URL when creating the api via treaty<App>, which can hide missing env vars at runtime; update the api initialization to validate that import.meta.env.VITE_API_BASE_URL is present (e.g., check the variable and throw a clear error or fall back to a safe default) before calling treaty<App>, remove the `!` non-null assertion, and ensure the created api (export const api) only uses the validated URL so runtime failures are explicit and easier to debug.
apps/api/src/routers/stats.ts (1)
41-64: Consider limiting batch size to prevent large insert operations.
The `/revlogs/batch` endpoint accepts an unbounded array. A malicious or buggy client could send thousands of entries, impacting database performance. Consider adding a reasonable limit.
💡 Suggested limit
  {
    auth: "user",
    body: z.object({
-     entries: z.array(RevlogBodySchema),
+     entries: z.array(RevlogBodySchema).max(1000),
    }),
  }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/routers/stats.ts` around lines 41 - 64, The /revlogs/batch route accepts an unbounded entries array (body.entries) which can cause huge inserts; update the validation and runtime checks to enforce a reasonable maximum (e.g., 100-1000) by changing the Zod schema for entries (the body z.object that uses z.array(RevlogBodySchema)) to z.array(RevlogBodySchema).max(<limit>) and additionally assert the length of body.entries before building values (in the POST handler that builds values and calls db.insert(revlogs).values) to return a 4xx error if exceeded; reference the route handler for "/revlogs/batch", the body variable, entries, RevlogBodySchema, values, and the db.insert(revlogs).values call when applying the change.
apps/api/src/middleware.ts (1)
84-97: Minor: TOCTOU race in rate limiting allows slight over-limit.
Between `get` and `incr`, concurrent requests could exceed the limit by a small margin. For this use case (API rate limiting), it's acceptable, just noting it's not strictly atomic.
For stricter enforcement, consider using Redis `INCR` with a `SETNX`/`SETEX` pattern or Lua scripting, but this is likely overkill for the current requirements.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/middleware.ts` around lines 84 - 97, TOCTOU happens because the code calls redisClient.get before redisClient.incr; replace the get->incr flow with an atomic increment-first approach: call redisClient.incr(key) to get the new count, and if the returned count is 1 then call redisClient.expire(key, windowSecs) (or use a MULTI/EXEC transaction) to set the TTL; then compare the incr-returned count against maxReqs to decide 429. Update references in the rate-limiting loop that use limitsArray, key, user.id, redisClient.get, and redisClient.incr accordingly so you no longer rely on a separate get() read before incrementing.
apps/api/src/routers/ai.ts (1)
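The increment-first flow described above can be sketched as follows. This is a minimal illustration, not the middleware's actual code: `RateStore`, `memoryStore`, and `checkLimit` are hypothetical names, and a `Map` stands in for Redis `INCR`/`EXPIRE` semantics (TTL expiry itself is omitted).

```typescript
// Sketch of the increment-first pattern suggested in the comment above.
// RateStore/memoryStore/checkLimit are illustrative names; a Map stands
// in for Redis INCR/EXPIRE.
interface RateStore {
  incr(key: string): Promise<number>; // returns the count after increment
  expire(key: string, ttlSecs: number): Promise<void>;
}

function memoryStore(): RateStore {
  const counts = new Map<string, number>();
  return {
    async incr(key) {
      const next = (counts.get(key) ?? 0) + 1;
      counts.set(key, next);
      return next;
    },
    async expire(_key, _ttlSecs) {
      // Redis would start the window TTL here; a no-op in this stub.
    },
  };
}

// Increment first, set the TTL only on the window's first hit, then
// compare the returned count. Because the increment is atomic, two
// concurrent requests can never both observe "under the limit".
async function checkLimit(
  store: RateStore,
  key: string,
  maxReqs: number,
  windowSecs: number,
): Promise<boolean> {
  const count = await store.incr(key);
  if (count === 1) {
    await store.expire(key, windowSecs);
  }
  return count <= maxReqs;
}
```

The key difference from the `get`-then-`incr` flow is that the count returned by `incr` is authoritative, so the decision and the write are a single operation.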
128-136: Rate limit prefix naming inconsistency for `/examples` endpoint.
The rate limit configuration uses prefixes containing `autocomplete` (e.g., `ratelimit:ai:autocomplete:min`), but this same configuration is applied to both the `/autocomplete` and `/examples` endpoints.
If both endpoints intentionally share the same rate limit bucket, the prefix naming should reflect this (e.g., `ratelimit:ai:min`). If they should have separate limits, the `/examples` endpoint needs its own configuration.
Option 1: Rename to generic AI prefix if shared limits are intentional
  const aiRateLimits: RateLimiterOpts[] = [
-   { prefix: "ratelimit:ai:autocomplete:min", maxReqs: 5, windowSecs: 60 },
-   { prefix: "ratelimit:ai:autocomplete:hr", maxReqs: 50, windowSecs: 60 * 60 },
+   { prefix: "ratelimit:ai:min", maxReqs: 5, windowSecs: 60 },
+   { prefix: "ratelimit:ai:hr", maxReqs: 50, windowSecs: 60 * 60 },
    {
-     prefix: "ratelimit:ai:autocomplete:day",
+     prefix: "ratelimit:ai:day",
      maxReqs: 150,
      windowSecs: 60 * 60 * 24,
    },
  ];
Option 2: Create separate limits for each endpoint
+const autocompleteRateLimits: RateLimiterOpts[] = [
+  { prefix: "ratelimit:ai:autocomplete:min", maxReqs: 5, windowSecs: 60 },
+  { prefix: "ratelimit:ai:autocomplete:hr", maxReqs: 50, windowSecs: 60 * 60 },
+  { prefix: "ratelimit:ai:autocomplete:day", maxReqs: 150, windowSecs: 60 * 60 * 24 },
+];
+
+const examplesRateLimits: RateLimiterOpts[] = [
+  { prefix: "ratelimit:ai:examples:min", maxReqs: 5, windowSecs: 60 },
+  { prefix: "ratelimit:ai:examples:hr", maxReqs: 50, windowSecs: 60 * 60 },
+  { prefix: "ratelimit:ai:examples:day", maxReqs: 150, windowSecs: 60 * 60 * 24 },
+];
Then use `autocompleteRateLimits` for `/autocomplete` and `examplesRateLimits` for `/examples`.
Also applies to: 181-189
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/routers/ai.ts` around lines 128 - 136, The aiRateLimits array uses prefixes with "autocomplete" but is applied to both the /autocomplete and /examples endpoints; either rename the prefixes to a generic name (e.g., change "ratelimit:ai:autocomplete:..." to "ratelimit:ai:...") if they should share a bucket, or split into two distinct arrays (e.g., autocompleteRateLimits and examplesRateLimits) with appropriate prefixes ("ratelimit:ai:autocomplete:..." and "ratelimit:ai:examples:...") and ensure the router uses the correct array for each endpoint (/autocomplete uses autocompleteRateLimits, /examples uses examplesRateLimits). Ensure to update all uses (also the block around lines referenced later) so prefix names and variable references are consistent.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/middleware.ts`:
- Around line 79-102: The userRateLimit resolver currently calls
redisClient.get/set/incr without error handling; wrap the Redis interactions
inside a try/catch block around the loop in userRateLimit (referencing
userRateLimit, limitsArray, and redisClient.get/set/incr) and on any Redis error
log it and gracefully degrade by allowing the request to proceed (i.e., do not
return a 429 or throw), while preserving normal rate-limit behavior when Redis
calls succeed; ensure the catch returns nothing (allows request) and does not
leak the raw exception.
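The fail-open behavior described above can be sketched as a small wrapper. This is illustrative only: `rateLimitOrAllow` is a hypothetical name, and the `check` callback stands in for whatever Redis-backed limit check the resolver performs.

```typescript
// Illustrative fail-open wrapper: if the limiter's backing store throws
// (e.g. Redis is unreachable), log the error and allow the request
// rather than turning a store outage into user-facing failures.
async function rateLimitOrAllow(
  check: () => Promise<boolean>, // true = request is within the limit
  log: (msg: string) => void,
): Promise<boolean> {
  try {
    return await check();
  } catch (err) {
    log(`rate limiter unavailable, failing open: ${String(err)}`);
    return true; // degrade gracefully: let the request proceed
  }
}
```

Normal rate-limit decisions pass through unchanged; only store errors are swallowed, and the raw exception never reaches the client.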
In `@apps/api/src/routers/ai.ts`:
- Around line 143-153: The code uses unsafe `as never` assertions for `adapter:
aiAdapter as never` and `adapter: aiAdapterNoCache as never` inside the chat
calls, which bypasses TypeScript checks; replace these by either upgrading to a
compatible `@cloudflare/tanstack-ai`/@tanstack/ai version or creating a small
typed wrapper that maps the runtime adapter to the expected type (e.g.,
implement a function like `createTypedAdapter(adapter)` that returns the correct
`Adapter`/`AIAdapter` shape expected by `chat`) and use that wrapper for both
`aiAdapter` and `aiAdapterNoCache`, or if you must keep it temporary add a
concise TODO comment and file an upstream issue; ensure you remove both `as
never` assertions and reference the wrapper or upgraded types in the `chat`
calls where `aiAdapter` and `aiAdapterNoCache` are passed.
In `@apps/api/src/routers/stats.ts`:
- Around line 23-24: The code converts body.due and body.review to timestamps
(due_timestamp_ms and review_timestamp_ms) using new Date(...).getTime(), which
yields NaN for invalid date strings; update the logic to validate/parsing before
inserting: either strengthen RevlogBodySchema to require/parse ISO/epoch dates
(or add custom validators) or explicitly parse with Date.parse (or a trusted
date lib) and check isNaN(result) for body.due and body.review, returning a
400/validation error when invalid; ensure the checked/parsed values are used to
set due_timestamp_ms and review_timestamp_ms so no NaN is written.
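A minimal sketch of the validate-before-insert idea (a hypothetical helper, not the actual schema change): `Date.parse` returns `NaN` for unparseable strings, which is exactly the value the comment above wants kept out of `due_timestamp_ms` / `review_timestamp_ms`.

```typescript
// Reject invalid date strings before computing epoch milliseconds,
// instead of letting NaN flow into the insert.
function parseTimestampMs(value: string): number {
  const ms = Date.parse(value);
  if (Number.isNaN(ms)) {
    throw new RangeError(`Invalid date string: ${JSON.stringify(value)}`);
  }
  return ms;
}
```

In the route this would map to returning a 400 validation error rather than throwing, but the NaN check is the essential step either way.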
In `@apps/web/src/lib/db/operations/flashcards.ts`:
- Around line 657-672: The current call to api.stats.revlogs.batch.post inside
the clearBacklog flow only captures network rejections but ignores
application-level failures returned in the resolved response; update the
handling around api.stats.revlogs.batch.post (the call that posts
revlogEntries.map(...) with due/review ISO strings and source:"clear_backlog")
to await the promise, inspect the resolved payload for an error or failure
flag/structure (e.g., response.success, response.errors or similar), and when
the API returns an application error record the failure with
Sentry.captureException including the response body and tags { operation:
"clearBacklog.revlogs" } so batch errors are surfaced even when the promise does
not reject. Ensure you still keep the existing .catch for transport errors.
- Around line 608-651: The generator yields while an open SQLite transaction is
active (BEGIN via drizzleDb.run) which can be abandoned and never
COMMIT/ROLLBACK; fix by ensuring the transaction is always finalized before any
yield or by wrapping the transactional block in a try/finally that always issues
COMMIT or ROLLBACK on early exit: locate the loop that updates flashcards (uses
drizzleDb.run(sql`BEGIN TRANSACTION`),
update(flashcards).where(eq(flashcards.id, row.id)) and the yield { cleared: i +
1, total } and either move the BEGIN/COMMIT outside the generator so yields
occur after commit, or add a surrounding try { ... } finally { if transaction
still open call drizzleDb.run(sql`ROLLBACK`) } to guarantee cleanup.
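The try/finally variant can be sketched as follows. The shape is illustrative: `Db` and `clearWithTx` stand in for the real drizzle generator, and the point is that abandoning a generator (via `.return()`) still runs its `finally` block, so the transaction is always finalized.

```typescript
// Illustrative generator that opens a transaction, yields progress, and
// guarantees COMMIT or ROLLBACK even if the consumer stops iterating.
interface Db {
  run(sql: string): void;
}

function* clearWithTx(db: Db, ids: number[]): Generator<{ cleared: number }> {
  db.run("BEGIN TRANSACTION");
  let committed = false;
  try {
    for (let i = 0; i < ids.length; i++) {
      db.run(`UPDATE flashcards SET due = due WHERE id = ${ids[i]}`);
      yield { cleared: i + 1 };
    }
    db.run("COMMIT");
    committed = true;
  } finally {
    // Runs on normal completion, on throw, and when the consumer
    // abandons the generator mid-iteration.
    if (!committed) db.run("ROLLBACK");
  }
}
```

The alternative the comment mentions — committing before any `yield` — avoids holding the transaction open across consumer code entirely, at the cost of losing per-row progress reporting inside the transaction.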
In
`@apps/web/src/routes/_authorized-layout/_app-layout/dictionary/add/route.lazy.tsx`:
- Around line 193-205: The mutation currently posts the form's type inside
mutationFn but re-reads form.getValues("type") in onSuccess, which can misapply
autofill if the user changes type mid-request; update the mutation to return an
object containing both the API response and the submitted type (capture const
submittedType = form.getValues("type") before the api.ai.autocomplete.post call
inside mutationFn and include it in the returned value), then in onSuccess use
that returned submittedType instead of calling form.getValues("type"); apply the
same change to the analogous mutation in edit/$wordId.tsx (look for mutationFn
and onSuccess in both files).
In `@packages/drizzle-user-db-schemas/src/dictionary.ts`:
- Around line 51-55: The project currently uses drizzle-zod@0.8.3 which is
incompatible with Zod v4; update the package to either remove/replace
drizzle-zod or align Zod to a supported version: either (A) remove drizzle-zod
and switch schema generation to Drizzle ORM's built-in Zod support (import from
"drizzle-orm/zod") and update usages around InsertDictionaryEntrySchema and
SelectDictionaryEntrySchema in dictionary.ts to use the new import, or (B)
downgrade Zod to a v3-compatible release and change the package.json constraint
(remove "zod: >=4.0.0") so drizzle-zod@0.8.3 peer deps are satisfied; pick one
approach and update package.json, imports, and any
createInsertSchema/createSelectSchema usages accordingly.
In `@packages/search/src/database.ts`:
- Around line 166-174: The non-Arabic path currently treats the whole phrase as
a single token via [stripArabicDiacritics(term)], which inflates stemmedLen for
multi-word queries and makes fuzzy tolerance too permissive; update the logic so
that when language !== "arabic" you split the input term into individual tokens
(e.g., by whitespace), apply stripArabicDiacritics to each token to produce
stemmedTokens, then compute stemmedLen = Math.max(...stemmedTokens.map(t =>
t.length), 1). Apply the same change to the other occurrence that computes
stemmedTokens/stemmedLen (the similar block at the second location).
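A sketch of the per-token approach described above. Names and the tolerance thresholds are illustrative assumptions, not the values in `packages/search/src/database.ts`; the point is that tolerance derives from the longest normalized token, not from the raw phrase length.

```typescript
// Split the query into whitespace tokens, normalize each (e.g. strip
// diacritics or stem), and measure the longest token. Multi-word
// queries no longer inflate the length used for fuzzy tolerance.
function stemmedLenForTolerance(
  term: string,
  normalize: (token: string) => string,
): number {
  const tokens = term.split(/\s+/).filter(Boolean).map(normalize);
  return Math.max(...tokens.map((t) => t.length), 1);
}

// Hypothetical banding: short stems get no fuzz, medium stems one edit,
// long stems two. The actual cutoffs are an assumption here.
function toleranceFor(stemmedLen: number): number {
  if (stemmedLen <= 4) return 0;
  if (stemmedLen <= 7) return 1;
  return 2;
}
```

This mirrors the Arabic-stemming bug in the PR description: a 4-character stem like مهرج gets tolerance 0 instead of 2.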
---
Outside diff comments:
In
`@apps/web/src/components/features/flashcards/FlashcardDrawer/FlashcardDrawer.tsx`:
- Around line 195-208: The revlog POST (api.stats.revlogs.post) is currently
between the optimistic local state change (setCards) and the persistent update
(updateFlashcard), which can desync reviews on network failure; change the flow
so grading is local-first: call setShowAnswer(false) and setCards(...)
immediately, then await updateFlashcard({ id: currentCard.id, updates:
localUpdates }) to persist the flashcard change, and only after that fire
api.stats.revlogs.post(...) as a best-effort (wrap in try/catch and log errors)
so a failing revlog does not leave the card removed or silently advance state;
reference setShowAnswer, setCards, updateFlashcard, api.stats.revlogs.post, log,
and currentCard when making the change.
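The local-first ordering above can be sketched with hypothetical shapes: `persist` and `postRevlog` stand in for `updateFlashcard` and `api.stats.revlogs.post`, and the return value stands in for advancing the card state.

```typescript
// Illustrative grading flow: commit the authoritative update first,
// then treat the revlog POST as best-effort telemetry. A failing POST
// is logged but never rolls back or blocks the grade.
async function gradeCard(
  persist: () => Promise<void>,
  postRevlog: () => Promise<void>,
  log: (msg: string) => void,
): Promise<"persisted" | "failed"> {
  try {
    await persist(); // authoritative update happens first
  } catch {
    return "failed"; // surface persistence failures to the caller
  }
  try {
    await postRevlog(); // best-effort: errors are logged, not thrown
  } catch (err) {
    log(`revlog post failed: ${String(err)}`);
  }
  return "persisted";
}
```

The ordering guarantees the card state and the persisted flashcard never desync because of a network failure on the stats call.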
---
Duplicate comments:
In
`@apps/web/src/routes/_authorized-layout/_app-layout/dictionary/edit/`$wordId.tsx:
- Around line 350-445: The autofill mutation reads type twice causing a race:
capture the type at request time and use that captured value in onSuccess
instead of calling form.getValues("type") again; e.g., in mutationFn save const
requestedType = type (or return { data, requestedType }) and then in onSuccess
reference requestedType to decide whether to apply ism or fi'l morphology and
set form values, ensuring the morphology is applied to the same category that
was sent to api.ai.autocomplete.post rather than the current form state.
In `@packages/i18n/locales/en.po`:
- Around line 770-772: Fix the typo "flaschards" to "flashcards" in the source
string used by the settings route (the string shown in
../../apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx)
by updating the original source text so that the msgid reads "Exporting
dictionary with flashcards..." and then re-run the extraction/localization
pipeline so msgid and msgstr remain in sync across packages/i18n/locales/en.po.
---
Nitpick comments:
In `@apps/api/src/middleware.ts`:
- Around line 84-97: TOCTOU happens because the code calls redisClient.get
before redisClient.incr; replace the get->incr flow with an atomic
increment-first approach: call redisClient.incr(key) to get the new count, and
if the returned count is 1 then call redisClient.expire(key, windowSecs) (or use
a MULTI/EXEC transaction) to set the TTL; then compare the incr-returned count
against maxReqs to decide 429. Update references in the rate-limiting loop that
use limitsArray, key, user.id, redisClient.get, and redisClient.incr accordingly
so you no longer rely on a separate get() read before incrementing.
In `@apps/api/src/routers/ai.ts`:
- Around line 128-136: The aiRateLimits array uses prefixes with "autocomplete"
but is applied to both the /autocomplete and /examples endpoints; either rename
the prefixes to a generic name (e.g., change "ratelimit:ai:autocomplete:..." to
"ratelimit:ai:...") if they should share a bucket, or split into two distinct
arrays (e.g., autocompleteRateLimits and examplesRateLimits) with appropriate
prefixes ("ratelimit:ai:autocomplete:..." and "ratelimit:ai:examples:...") and
ensure the router uses the correct array for each endpoint (/autocomplete uses
autocompleteRateLimits, /examples uses examplesRateLimits). Ensure to update all
uses (also the block around lines referenced later) so prefix names and variable
references are consistent.
In `@apps/api/src/routers/stats.ts`:
- Around line 41-64: The /revlogs/batch route accepts an unbounded entries array
(body.entries) which can cause huge inserts; update the validation and runtime
checks to enforce a reasonable maximum (e.g., 100-1000) by changing the Zod
schema for entries (the body z.object that uses z.array(RevlogBodySchema)) to
z.array(RevlogBodySchema).max(<limit>) and additionally assert the length of
body.entries before building values (in the POST handler that builds values and
calls db.insert(revlogs).values) to return a 4xx error if exceeded; reference
the route handler for "/revlogs/batch", the body variable, entries,
RevlogBodySchema, values, and the db.insert(revlogs).values call when applying
the change.
In `@apps/web/src/hooks/db/index.ts`:
- Around line 79-86: The onSuccess callback that currently calls
queryClient.invalidateQueries twice should follow the same async/await pattern
used in useAddDictionaryEntry and useEditDictionaryEntry: make the onSuccess
handler async and await both queryClient.invalidateQueries calls (the ones using
flashcardsTable.today.cacheOptions.queryKey and
flashcardsTable.counts.cacheOptions.queryKey) so cache invalidation timing is
consistent and predictable with the other hooks.
In `@apps/web/src/lib/api.ts`:
- Line 5: The code uses a non-null assertion on
import.meta.env.VITE_API_BASE_URL when creating the api via treaty<App>, which
can hide missing env vars at runtime; update the api initialization to validate
that import.meta.env.VITE_API_BASE_URL is present (e.g., check the variable and
throw a clear error or fall back to a safe default) before calling treaty<App>,
remove the `!` non-null assertion, and ensure the created api (export const api)
only uses the validated URL so runtime failures are explicit and easier to
debug.
In `@apps/web/src/lib/db/operations/dictionary-entries.ts`:
- Around line 277-285: Wrap the two DELETEs in a single atomic transaction so
both succeed or neither does: execute the flashcard delete (the prepare("DELETE
FROM flashcards WHERE dictionary_entry_id = ?;").run([id])) and the
dictionary_entry delete (the prepare("DELETE FROM dictionary_entries WHERE id =
?;").run([id])) inside a single db transaction call (use the db.run transaction
wrapper supported by the drizzle instance) so that failures roll back and leave
the DB consistent.
In `@packages/search/src/database.ts`:
- Around line 162-165: Extract the stemmed-length-for-tolerance logic into a
small, well-named helper (e.g., getStemmedLengthForTolerance) and replace the
long explanatory comments with calls to that helper where tolerance is computed;
update both occurrences around the current inline comment block and the later
similar block (the ones calculating tolerance based on stemmed token length) to
call getStemmedLengthForTolerance(tokenOrWord) and let the helper encapsulate
the explanation and behavior so the call site stays self-documenting.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: c199977d-24d2-44ea-a37e-f47dc0070448
⛔ Files ignored due to path filters (3)
- apps/marketing/public/og-image.png is excluded by !**/*.png
- design/logo.png is excluded by !**/*.png
- pnpm-lock.yaml is excluded by !**/pnpm-lock.yaml
📒 Files selected for processing (40)
- apps/api/.example.env
- apps/api/drizzle/0014_omniscient_rumiko_fujikawa.sql
- apps/api/drizzle/meta/0014_snapshot.json
- apps/api/drizzle/meta/_journal.json
- apps/api/package.json
- apps/api/src/auth.ts
- apps/api/src/db/schema/revlogs.ts
- apps/api/src/index.ts
- apps/api/src/middleware.ts
- apps/api/src/middleware/http-logger.ts
- apps/api/src/routers/ai.ts
- apps/api/src/routers/stats.ts
- apps/api/src/utils/config.ts
- apps/marketing/src/layouts/BaseLayout.astro
- apps/mobile/package.json
- apps/web/README.md
- apps/web/package.json
- apps/web/src/components/features/dictionary/add/AdditionalDetailsFormSection.tsx
- apps/web/src/components/features/dictionary/add/IsmMorphologyCardSection.tsx
- apps/web/src/components/features/flashcards/FlashcardDrawer/FlashcardDrawer.tsx
- apps/web/src/hooks/db/index.ts
- apps/web/src/lib/api.ts
- apps/web/src/lib/db/index.ts
- apps/web/src/lib/db/operations/dictionary-entries.ts
- apps/web/src/lib/db/operations/flashcards.ts
- apps/web/src/routes/_authorized-layout/_app-layout/dictionary/add/route.lazy.tsx
- apps/web/src/routes/_authorized-layout/_app-layout/dictionary/edit/$wordId.tsx
- apps/web/tsconfig.json
- design/bahar.pen
- packages/db-operations/package.json
- packages/design-system/globals.css
- packages/drizzle-user-db-schemas/package.json
- packages/drizzle-user-db-schemas/src/dictionary.ts
- packages/drizzle-user-db-schemas/src/index.ts
- packages/drizzle-user-db-schemas/src/types.ts
- packages/i18n/locales/ar.po
- packages/i18n/locales/ar.ts
- packages/i18n/locales/en.po
- packages/i18n/locales/en.ts
- packages/search/src/database.ts
✅ Files skipped from review due to trivial changes (11)
- apps/mobile/package.json
- apps/web/src/lib/db/index.ts
- packages/db-operations/package.json
- apps/web/tsconfig.json
- apps/web/README.md
- packages/design-system/globals.css
- apps/api/drizzle/0014_omniscient_rumiko_fujikawa.sql
- apps/api/drizzle/meta/_journal.json
- packages/drizzle-user-db-schemas/package.json
- apps/api/drizzle/meta/0014_snapshot.json
- apps/api/src/db/schema/revlogs.ts
🚧 Files skipped from review as they are similar to previous changes (5)
- apps/api/.example.env
- apps/api/package.json
- apps/web/package.json
- apps/api/src/auth.ts
- packages/i18n/locales/ar.ts
if (revlogEntries.length > 0) {
  api.stats.revlogs.batch
    .post({
      entries: revlogEntries.map(({ log, direction }) => ({
        ...log,
        due: log.due.toISOString(),
        review: log.review.toISOString(),
        direction,
        source: "clear_backlog" as const,
      })),
    })
    .catch((err) => {
      Sentry.captureException(err, {
        tags: { operation: "clearBacklog.revlogs" },
      });
    });
Handle batch revlog failures from the response, not only rejected promises.
This only captures transport-level rejects. If api.stats.revlogs.batch.post(...) resolves with an application error payload, the batch silently disappears and clear_backlog stats drift without any Sentry signal.
Suggested fix
- api.stats.revlogs.batch
- .post({
- entries: revlogEntries.map(({ log, direction }) => ({
- ...log,
- due: log.due.toISOString(),
- review: log.review.toISOString(),
- direction,
- source: "clear_backlog" as const,
- })),
- })
- .catch((err) => {
- Sentry.captureException(err, {
- tags: { operation: "clearBacklog.revlogs" },
- });
- });
+ void api.stats.revlogs.batch
+ .post({
+ entries: revlogEntries.map(({ log, direction }) => ({
+ ...log,
+ due: log.due.toISOString(),
+ review: log.review.toISOString(),
+ direction,
+ source: "clear_backlog" as const,
+ })),
+ })
+ .then(({ error }) => {
+ if (error) {
+ Sentry.captureException(error, {
+ tags: { operation: "clearBacklog.revlogs" },
+ });
+ }
+ })
+ .catch((err) => {
+ Sentry.captureException(err, {
+ tags: { operation: "clearBacklog.revlogs" },
+ });
+ });🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/web/src/lib/db/operations/flashcards.ts` around lines 657 - 672, The
current call to api.stats.revlogs.batch.post inside the clearBacklog flow only
captures network rejections but ignores application-level failures returned in
the resolved response; update the handling around api.stats.revlogs.batch.post
(the call that posts revlogEntries.map(...) with due/review ISO strings and
source:"clear_backlog") to await the promise, inspect the resolved payload for
an error or failure flag/structure (e.g., response.success, response.errors or
similar), and when the API returns an application error record the failure with
Sentry.captureException including the response body and tags { operation:
"clearBacklog.revlogs" } so batch errors are surfaced even when the promise does
not reject. Ensure you still keep the existing .catch for transport errors.
mutationFn: async () => {
  const type = form.getValues("type");
  if (!type) throw new Error("Type is required");

  const { data, error } = await api.ai.autocomplete.post({
    word: form.getValues("word"),
    translation: form.getValues("translation"),
    type,
  });
  if (error) throw error;
  return data;
},
onError: (error) => {
Use the request’s type when applying autofill results.
mutationFn posts the current type, but onSuccess reads form.getValues("type") again. If the user changes the category while the request is in flight, the returned morphology gets applied to the wrong branch or dropped. Capture the submitted type alongside the response and reuse that here. The same edge case exists in apps/web/src/routes/_authorized-layout/_app-layout/dictionary/edit/$wordId.tsx.
Suggested fix
const autofill = useMutation({
mutationFn: async () => {
const type = form.getValues("type");
if (!type) throw new Error("Type is required");
const { data, error } = await api.ai.autocomplete.post({
word: form.getValues("word"),
translation: form.getValues("translation"),
type,
});
if (error) throw error;
- return data;
+ return { data, requestedType: type };
},
@@
- onSuccess: (data) => {
- if (!data) return;
+ onSuccess: (result) => {
+ if (!result?.data) return;
+ const { data, requestedType } = result;
if (data.definition) {
form.setValue("definition", data.definition);
}
@@
if ("morphology" in data && data.morphology) {
- const type = form.getValues("type");
+ const type = requestedType;
      const m = data.morphology as Record<string, unknown>;
Also applies to: 217-285
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In
`@apps/web/src/routes/_authorized-layout/_app-layout/dictionary/add/route.lazy.tsx`
around lines 193 - 205, The mutation currently posts the form's type inside
mutationFn but re-reads form.getValues("type") in onSuccess, which can misapply
autofill if the user changes type mid-request; update the mutation to return an
object containing both the API response and the submitted type (capture const
submittedType = form.getValues("type") before the api.ai.autocomplete.post call
inside mutationFn and include it in the returned value), then in onSuccess use
that returned submittedType instead of calling form.getValues("type"); apply the
same change to the analogous mutation in edit/$wordId.tsx (look for mutationFn
and onSuccess in both files).
- Use config.WEB_CLIENT_DOMAIN for checkout successUrl instead of allowedDomains[0] to avoid cookie issues across apex/www - Use getElementById for hash scrolling to prevent SyntaxError on invalid CSS selectors - Fix "flaschards" typo in settings export dialog and translations Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 3
🧹 Nitpick comments (2)
packages/i18n/locales/en.ts (1)
1-1: Avoid hardcoding plan prices directly in locale text (Line 1).
Strings like `Annual — $7/mo`, `Monthly — $10/mo`, and `Billed $84/year` are brittle for future pricing/currency changes. Prefer placeholders populated from billing config.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@packages/i18n/locales/en.ts` at line 1, The locale file hardcodes plan prices inside the exported messages JSON (see messages and keys "Fl8owW", "O+YlVk", "t/5uBV"); update those entries to use placeholders (e.g. "Annual — {price}/mo", "Monthly — {price}/mo", "Billed {price}/year") and then change the UI code that renders these messages to pass the billing-config values into the i18n interpolation (use the existing messages/messages API where you render plan labels to supply the price variable from your billing config/service).
apps/api/src/auth.ts (1)
9-10: Avoid the `@ts-ignore`d deep SDK import in this shared module.
`apps/web` imports types from `apps/api/src/auth.ts`, so the `dist/commonjs/...` path makes that shared surface depend on Polar's internal package layout. Polar's own adapter utilities import the same payload type from `@polar-sh/sdk/models/components/...`, so prefer that package path, or a narrow local interface, over suppressing the resolver error here. (app.unpkg.com)
As per coding guidelines "Use TypeScript with strict typing across entire codebase" and "Do not use any type unless absolutely necessary".
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/auth.ts` around lines 9 - 10, Replace the `@ts-ignore` deep import in apps/api/src/auth.ts that imports WebhookSubscriptionUpdatedPayload from "@polar-sh/sdk/dist/commonjs/..." by either importing the payload type from the public package path used elsewhere (e.g., "@polar-sh/sdk/models/components/webhooksubscriptionupdatedpayload") or by declaring a narrow local interface matching the fields your code actually uses; update the import for the symbol WebhookSubscriptionUpdatedPayload (or swap usages to the local interface) and remove the `@ts-ignore` so the module no longer depends on the SDK's internal layout and satisfies the bundler/type-resolver.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/auth.ts`:
- Around line 388-399: The update against users using customer.externalId may
affect 0 rows if externalId is stale; modify the block around
db.update(users).set(...) where(eq(users.id, customer.externalId)) to check the
update result (rowsAffected or returned rows) and if no rows were updated, do
not proceed to the sessions lookup or acknowledge the webhook—instead surface an
error/return a non-2xx response so the webhook can be retried; ensure the logic
that queries sessions (db.select({ token: sessions.token
}).from(sessions).where(eq(sessions.userId, customer.externalId))) only runs
after a successful update and keep the check tied to the same
customer.externalId variable.
In `@apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx`:
- Around line 215-235: The labels reference IDs (settings-theme-menu,
settings-color-theme, settings-language-menu) that are currently on
non-focusable SelectContent elements; move each ID to the corresponding
focusable SelectTrigger so the label htmlFor correctly associates with the
control. Edit the ThemeMenu, ColorThemeMenu, and LanguageMenu components: remove
the id from SelectContent and add the same id prop to the SelectTrigger element
(ensure the SelectTrigger is the button/control rendered by those components).
Keep all other behavior/props the same so keyboard focus and accessibility are
preserved.
In `@packages/i18n/locales/ar.po`:
- Around line 948-950: Update the Arabic translation for the entry with msgid
"Go to dictionary": replace the current msgstr "الذهاب إلى القاموس" with the
imperative and catalog-consistent phrasing "اذهب إلى المعجم" (or "انتقل إلى
المعجم" if you prefer that tone) so it uses the existing term "معجم" used
elsewhere for “Dictionary.”
---
Nitpick comments:
In `@apps/api/src/auth.ts`:
- Around line 9-10: Replace the `@ts-ignore` deep import in apps/api/src/auth.ts
that imports WebhookSubscriptionUpdatedPayload from
"@polar-sh/sdk/dist/commonjs/..." by either importing the payload type from the
public package path used elsewhere (e.g.,
"@polar-sh/sdk/models/components/webhooksubscriptionupdatedpayload") or by
declaring a narrow local interface matching the fields your code actually uses;
update the import for the symbol WebhookSubscriptionUpdatedPayload (or swap
usages to the local interface) and remove the `@ts-ignore` so the module no longer
depends on the SDK's internal layout and satisfies the bundler/type-resolver.
In `@packages/i18n/locales/en.ts`:
- Line 1: The locale file hardcodes plan prices inside the exported messages
JSON (see messages and keys "Fl8owW", "O+YlVk", "t/5uBV"); update those entries
to use placeholders (e.g. "Annual — {price}/mo", "Monthly — {price}/mo", "Billed
{price}/year") and then change the UI code that renders these messages to pass
the billing-config values into the i18n interpolation (use the existing
messages/messages API where you render plan labels to supply the price variable
from your billing config/service).
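The placeholder approach the review suggests amounts to something like the sketch below. The `interpolate` helper and message keys here are illustrative stand-ins, not the app's actual i18n API; the real code would feed billing-config values through the existing interpolation.

```typescript
// Illustrative sketch of the suggested `{price}` placeholders.
// `interpolate` is a hypothetical helper, not the app's i18n API.
const messages = {
  annual: "Annual — {price}/mo",
  billed: "Billed {price}/year",
};

function interpolate(template: string, vars: Record<string, string>): string {
  // Replace each {name} token with the matching value from billing config.
  return template.replace(/\{(\w+)\}/g, (_m, key: string) => vars[key] ?? `{${key}}`);
}
```

With this shape, a pricing change touches only the billing config, not every locale file.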
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: eb5a8b78-0db0-40ff-8d9a-0b57c77d7ff9
📒 Files selected for processing (6)
- apps/api/src/auth.ts
- apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx
- packages/i18n/locales/ar.po
- packages/i18n/locales/ar.ts
- packages/i18n/locales/en.po
- packages/i18n/locales/en.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- packages/i18n/locales/ar.ts
```ts
await db
  .update(users)
  .set({
    plan,
    subscriptionStatus: status === "active" ? "active" : "canceled",
  })
  .where(eq(users.id, customer.externalId));

const userSessions = await db
  .select({ token: sessions.token })
  .from(sessions)
  .where(eq(sessions.userId, customer.externalId));
```
Don't acknowledge the webhook when no user row is updated.
If customer.externalId is present but stale, this update touches 0 rows, the session lookup returns nothing, and the handler still exits successfully. That silently drops a paid subscription change instead of surfacing a data-sync problem.
🛑 Suggested guard
```diff
+ const [existingUser] = await db
+   .select({ id: users.id })
+   .from(users)
+   .where(eq(users.id, customer.externalId));
+
+ if (!existingUser) {
+   childLogger.error(
+     { event: "webhook_subscription_updated.user_not_found" },
+     "No local user matches the Polar external ID."
+   );
+
+   throw new Error("No local user matches the Polar external ID.");
+ }
+
  await db
    .update(users)
```
📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
```ts
const [existingUser] = await db
  .select({ id: users.id })
  .from(users)
  .where(eq(users.id, customer.externalId));

if (!existingUser) {
  childLogger.error(
    { event: "webhook_subscription_updated.user_not_found" },
    "No local user matches the Polar external ID."
  );

  throw new Error("No local user matches the Polar external ID.");
}

await db
  .update(users)
  .set({
    plan,
    subscriptionStatus: status === "active" ? "active" : "canceled",
  })
  .where(eq(users.id, customer.externalId));

const userSessions = await db
  .select({ token: sessions.token })
  .from(sessions)
  .where(eq(sessions.userId, customer.externalId));
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/api/src/auth.ts` around lines 388 - 399, The update against users using
customer.externalId may affect 0 rows if externalId is stale; modify the block
around db.update(users).set(...) where(eq(users.id, customer.externalId)) to
check the update result (rowsAffected or returned rows) and if no rows were
updated, do not proceed to the sessions lookup or acknowledge the
webhook—instead surface an error/return a non-2xx response so the webhook can be
retried; ensure the logic that queries sessions (db.select({ token:
sessions.token }).from(sessions).where(eq(sessions.userId,
customer.externalId))) only runs after a successful update and keep the check
tied to the same customer.externalId variable.
```tsx
<CardContent className="flex flex-col gap-4">
  <div className="flex flex-col gap-y-2 sm:flex-row sm:items-center sm:justify-between">
    <label htmlFor="settings-theme-menu">
      <Trans>Theme</Trans>
    </label>
    <ThemeMenu />
  </div>

  <div className="flex flex-col gap-y-2 sm:flex-row sm:items-center sm:justify-between">
    <label htmlFor="settings-color-theme">
      <Trans>Color theme</Trans>
    </label>
    <ColorThemeMenu />
  </div>

  <div className="flex flex-col gap-y-2 sm:flex-row sm:items-center sm:justify-between">
    <label htmlFor="settings-language-menu">
      <Trans>Language</Trans>
    </label>
    <LanguageMenu />
  </div>
```
🧩 Analysis chain
🏁 Script executed:
fd -t f 'ThemeMenu\|ColorThemeMenu\|LanguageMenu' --glob '*.tsx' --glob '*.ts' 2>/dev/null | head -20
Repository: Shunseii/bahar
Length of output: 40
🏁 Script executed:
rg -l "export.*ThemeMenu|export.*ColorThemeMenu|export.*LanguageMenu" --type ts --type tsx
Repository: Shunseii/bahar
Length of output: 85
🏁 Script executed:
head -n 240 "apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx" | tail -n 40
Repository: Shunseii/bahar
Length of output: 1359
🏁 Script executed:
head -n 50 "apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx" | grep -E "import|from"
Repository: Shunseii/bahar
Length of output: 1500
🏁 Script executed:
rg "ThemeMenu" --type ts -B2 -A5 --max-count 5
Repository: Shunseii/bahar
Length of output: 4929
🏁 Script executed:
find apps/web/src -name "*menu*" -o -name "*theme*" | head -20
Repository: Shunseii/bahar
Length of output: 85
🏁 Script executed:
cat -n "apps/web/src/components/ThemeMenu.tsx"
Repository: Shunseii/bahar
Length of output: 4045
🏁 Script executed:
cat -n "apps/web/src/components/LanguageMenu.tsx"
Repository: Shunseii/bahar
Length of output: 1675
Move IDs from SelectContent to SelectTrigger.
The IDs (settings-theme-menu, settings-color-theme, settings-language-menu) are currently placed on SelectContent elements, but the labels use htmlFor pointing to these IDs. For proper label-control association, these IDs must be on the focusable SelectTrigger button elements instead. Update ThemeMenu, ColorThemeMenu, and LanguageMenu to place the IDs on their respective SelectTrigger components.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/web/src/routes/_authorized-layout/_app-layout/settings/route.lazy.tsx`
around lines 215 - 235, The labels reference IDs (settings-theme-menu,
settings-color-theme, settings-language-menu) that are currently on
non-focusable SelectContent elements; move each ID to the corresponding
focusable SelectTrigger so the label htmlFor correctly associates with the
control. Edit the ThemeMenu, ColorThemeMenu, and LanguageMenu components: remove
the id from SelectContent and add the same id prop to the SelectTrigger element
(ensure the SelectTrigger is the button/control rendered by those components).
Keep all other behavior/props the same so keyboard focus and accessibility are
preserved.
- Use finally block in clearBacklog generator to guarantee transaction rollback if consumer abandons the iterator mid-iteration
- Validate due/review as ISO datetime strings in revlog body schema to prevent NaN insertion into timestamp columns
- Handle application-level errors from batch revlog POST responses, not just transport-level rejections

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
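The ISO-datetime validation in this commit guards against a quiet failure mode: `new Date()` on a malformed string yields an Invalid Date, and `.getTime()` then returns NaN, which would otherwise land in the timestamp columns. A minimal sketch of the idea (the function name is illustrative; the real check lives in the body schema):

```typescript
// Hypothetical helper mirroring the ISO-datetime guard described above.
// A bad input string must fail loudly instead of producing NaN.
function toTimestampMs(iso: string): number {
  const ms = new Date(iso).getTime();
  if (Number.isNaN(ms)) {
    throw new Error(`Invalid ISO datetime: ${iso}`);
  }
  return ms;
}
```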
Actionable comments posted: 1
🧹 Nitpick comments (2)
apps/api/src/routers/stats.ts (2)
26-27: Prefer camelCase for local variables. Local constants due_timestamp_ms and review_timestamp_ms are snake_case. Consider dueTimestampMs/reviewTimestampMs locally, while keeping DB column keys snake_case in the insert object. As per coding guidelines, "Component naming: PascalCase for components, camelCase for functions/variables".
Also applies to: 53-54
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/routers/stats.ts` around lines 26 - 27, Rename the local snake_case timestamp variables to camelCase (e.g., change due_timestamp_ms -> dueTimestampMs and review_timestamp_ms -> reviewTimestampMs) in the stats router where they are created and used (including the second occurrence around lines 53-54), and update any subsequent references to those variables; keep the DB insert object keys in snake_case (e.g., due_timestamp_ms/review_timestamp_ms) when building the payload to preserve the column names in the insert/insertOne call. Ensure you update all usages within the function in routers/stats.ts so there are no remaining references to the old snake_case identifiers.
63-65: Constrain batch size to protect the write path. entries currently has no bounds. Add at least .min(1) and a reasonable .max(...) to prevent empty inserts and oversized payload spikes on this endpoint.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/routers/stats.ts` around lines 63 - 65, The entries array in the request body schema (body -> entries using RevlogBodySchema) is unbounded; update the Zod schema for entries to require at least one item and cap the batch size (e.g., add .min(1) and a reasonable .max(1000) or a project-appropriate limit) to prevent empty inserts and oversized payloads when handling the RevlogBodySchema entries in the stats router.
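The bound the review asks for reduces to the check below. This is a plain-TS sketch of what `.min(1).max(...)` on the Zod array would enforce; the 1000 cap is an assumed value, and in the real schema the constraint belongs on the array itself.

```typescript
// Plain-TS sketch of the `.min(1).max(...)` bounds the review suggests.
// MAX_BATCH is an assumed cap, not a value from the project.
const MAX_BATCH = 1000;

function checkEntriesBounds(length: number): string | null {
  if (length < 1) return "entries must contain at least 1 item";
  if (length > MAX_BATCH) return `entries must contain at most ${MAX_BATCH} items`;
  return null; // within bounds
}
```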
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/routers/stats.ts`:
- Around line 23-38: Wrap the database insert logic in the async route handlers
that call db.insert(revlogs).values(...) (the handler that creates id, userId,
due_timestamp_ms, review_timestamp_ms and the similar handler around lines
46-60) in a try/catch; on catch, log a structured error with your Pino logger
(e.g., processLogger.error) including the error and context (user.id, id, body),
send the error to Sentry via Sentry.captureException(err), and rethrow or throw
a structured DisplayError with a clear message so route-level telemetry can
handle it. Ensure you reference the same identifiers (revlogs, db.insert,
DisplayError, Sentry.captureException, processLogger) and keep the original
successful return ({ success: true }) only in the try path.
---
Nitpick comments:
In `@apps/api/src/routers/stats.ts`:
- Around line 26-27: Rename the local snake_case timestamp variables to
camelCase (e.g., change due_timestamp_ms -> dueTimestampMs and
review_timestamp_ms -> reviewTimestampMs) in the stats router where they are
created and used (including the second occurrence around lines 53-54), and
update any subsequent references to those variables; keep the DB insert object
keys in snake_case (e.g., due_timestamp_ms/review_timestamp_ms) when building
the payload to preserve the column names in the insert/insertOne call. Ensure
you update all usages within the function in routers/stats.ts so there are no
remaining references to the old snake_case identifiers.
- Around line 63-65: The entries array in the request body schema (body ->
entries using RevlogBodySchema) is unbounded; update the Zod schema for entries
to require at least one item and cap the batch size (e.g., add .min(1) and a
reasonable .max(1000) or a project-appropriate limit) to prevent empty inserts
and oversized payloads when handling the RevlogBodySchema entries in the stats
router.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 7d1421a9-c822-42ae-a749-b8cff74b7306
📒 Files selected for processing (2)
- apps/api/src/routers/stats.ts
- apps/web/src/lib/db/operations/flashcards.ts
✅ Files skipped from review due to trivial changes (1)
- apps/web/src/lib/db/operations/flashcards.ts
```ts
async ({ user, body }) => {
  const id = nanoid();
  const userId = user.id;
  const due_timestamp_ms = new Date(body.due).getTime();
  const review_timestamp_ms = new Date(body.review).getTime();

  await db.insert(revlogs).values({
    ...body,
    id,
    user_id: userId,
    due_timestamp_ms,
    review_timestamp_ms,
  });

  return { success: true } as const;
},
```
Add structured error handling around DB writes
Both handlers perform database writes without local try/catch, so failures bubble as generic server errors and skip route-level structured handling/telemetry. Wrap inserts with structured errors (e.g., DisplayError) and log/capture failures consistently.
As per coding guidelines, "Error handling with try/catch blocks and structured error types" and "Use Pino for structured logging and Sentry for error tracking".
Also applies to: 46-60
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/api/src/routers/stats.ts` around lines 23 - 38, Wrap the database insert
logic in the async route handlers that call db.insert(revlogs).values(...) (the
handler that creates id, userId, due_timestamp_ms, review_timestamp_ms and the
similar handler around lines 46-60) in a try/catch; on catch, log a structured
error with your Pino logger (e.g., processLogger.error) including the error and
context (user.id, id, body), send the error to Sentry via
Sentry.captureException(err), and rethrow or throw a structured DisplayError
with a clear message so route-level telemetry can handle it. Ensure you
reference the same identifiers (revlogs, db.insert, DisplayError,
Sentry.captureException, processLogger) and keep the original successful return
({ success: true }) only in the try path.
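The pattern this prompt describes can be sketched as follows. `DisplayError` and the `log` callback are stand-ins for the app's real error type, Pino logger, and Sentry call; only the shape of the try/catch is the point.

```typescript
// Sketch of the structured-error pattern the review asks for. The error
// class and logging callback are illustrative stand-ins.
class DisplayError extends Error {
  constructor(message: string, public cause?: unknown) {
    super(message);
    this.name = "DisplayError";
  }
}

function insertRevlogGuarded(
  insert: () => void,
  log: (msg: string) => void,
): { success: true } {
  try {
    insert();
    return { success: true } as const; // success only in the try path
  } catch (err) {
    log(`revlog insert failed: ${String(err)}`); // processLogger.error in the app
    // Sentry.captureException(err) would also go here.
    throw new DisplayError("Failed to record review", err);
  }
}
```

The key property: a DB failure is logged with context, reported, and surfaced as a typed error the route layer can render, instead of bubbling up as a generic 500.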
- Verify user exists before updating plan in Polar webhook handler to fail loudly on stale externalId instead of silently updating 0 rows
- Use consistent Arabic term (معجم) for dictionary in "Go to dictionary" CTA
- Move IDs from SelectContent to SelectTrigger for proper label-control association in settings menus

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Actionable comments posted: 2
🧹 Nitpick comments (1)
apps/api/src/auth.ts (1)
9-10: Use the public SDK webhook validation API instead of importing from a private path. The @polar-sh/sdk/dist/commonjs/models/components/webhooksubscriptionupdatedpayload import is not part of the SDK's stable public API; it's an internal implementation detail of the generated SDK. The @ts-expect-error suppression masks a type resolution error and violates the strict typing guideline. Instead, use validateEvent from @polar-sh/sdk/webhooks, which provides properly typed payloads for each event type (e.g., subscription.updated), or define a narrow local type for the fields this handler actually reads.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@apps/api/src/auth.ts` around lines 9 - 10, The code imports an internal SDK type WebhookSubscriptionUpdatedPayload via a private path and suppresses the TS error; instead, replace that usage with the public webhook validation API: import validateEvent from `@polar-sh/sdk/webhooks` (or the SDK's exported validateEvent) and use it to parse/validate incoming events (e.g., "subscription.updated") to obtain a properly typed payload, or alternatively declare a minimal local interface for only the fields your handler reads; update the handler that references WebhookSubscriptionUpdatedPayload to use the validated payload from validateEvent (or the local type) and remove the `@ts-expect-error` import.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@apps/api/src/auth.ts`:
- Around line 414-421: The Redis deletion of sessions doesn't update the 300s
cookie cache used by plan-gated auth, so either disable the cookie cache or
bypass it by reading the live subscription status from the users table in the
auth guard; specifically either set session.cookieCache to 0/false in
apps/api/src/auth.ts (remove/disable the 300s cache) or change the authorization
logic in apps/api/src/middleware.ts (the guard that reads
user.subscriptionStatus from the cached cookie) to ignore the cookie and perform
a fresh DB lookup on the users table for the user's subscriptionStatus (use the
existing users model/query functions) before enforcing plan changes.
- Around line 354-358: The webhook handler currently writes subscription updates
(using payload.timestamp, plan, productId, status) without deduplication; fix by
implementing idempotency: read the incoming webhook delivery id from the
"webhook-id" header, persist that id in the same record you write/associate with
the user's subscription, and check for an existing processed webhook id before
applying updates in the subscription update logic (the block that
creates/updates the user's subscription around payload.timestamp, plan,
productId, status). If the webhook-id already exists, skip processing; otherwise
write the webhook-id atomically with the subscription change to prevent replayed
or manually redelivered events from flipping state (address both places noted in
the diff).
---
Nitpick comments:
In `@apps/api/src/auth.ts`:
- Around line 9-10: The code imports an internal SDK type
WebhookSubscriptionUpdatedPayload via a private path and suppresses the TS
error; instead, replace that usage with the public webhook validation API:
import validateEvent from `@polar-sh/sdk/webhooks` (or the SDK's exported
validateEvent) and use it to parse/validate incoming events (e.g.,
"subscription.updated") to obtain a properly typed payload, or alternatively
declare a minimal local interface for only the fields your handler reads; update
the handler that references WebhookSubscriptionUpdatedPayload to use the
validated payload from validateEvent (or the local type) and remove the
`@ts-expect-error` import.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 2bca3659-52a3-49ed-9278-144ec6411c10
📒 Files selected for processing (5)
- apps/api/src/auth.ts
- apps/web/src/components/LanguageMenu.tsx
- apps/web/src/components/ThemeMenu.tsx
- packages/i18n/locales/ar.po
- packages/i18n/locales/ar.ts
✅ Files skipped from review due to trivial changes (1)
- apps/web/src/components/LanguageMenu.tsx
🚧 Files skipped from review as they are similar to previous changes (1)
- apps/web/src/components/ThemeMenu.tsx
```ts
  timestamp: payload.timestamp,
  plan,
  productId: product.id,
  status,
});
```
🧩 Analysis chain
🌐 Web query:
Does Polar guarantee ordering for subscription update webhooks? If not, which event identifier or timestamp should consumers persist to reject stale or out-of-order deliveries?
💡 Result:
Yes, Polar guarantees ordering for webhook deliveries, including subscription update webhooks. Events are delivered in the order they were created for each endpoint. A newer event will not be delivered until earlier events succeed. However, this ordering can be bypassed in two cases: 1) if an event exceeds a configurable age limit while earlier events are pending, or 2) during manual redelivery of failed events. Consumers should still implement idempotency to handle at-least-once delivery (duplicates possible due to retries). Persist the unique webhook-id header (unique per delivery attempt) to detect and reject duplicates. The payload also includes a unique event id (e.g., "evt_123") and timestamp for additional deduplication if needed. Official best practices recommend logging webhook IDs and checking a database of processed webhook_ids before processing.
Citations:
- 1: https://mintlify.com/polarsource/polar/integrate/webhooks/delivery
- 2: https://www.mintlify.com/polarsource/polar/integrate/webhooks/delivery
- 3: https://www.mintlify.com/polarsource/polar/api-reference/webhooks
- 4: https://hookdeck.com/webhooks/platforms/guide-to-polar-webhooks-features-and-best-practices
🏁 Script executed:
head -n 420 apps/api/src/auth.ts | tail -n 100
Repository: Shunseii/bahar
Length of output: 3199
🏁 Script executed:
sed -n '320,420p' apps/api/src/auth.ts
Repository: Shunseii/bahar
Length of output: 3213
🏁 Script executed:
wc -l apps/api/src/auth.ts && sed -n '330,450p' apps/api/src/auth.ts
Repository: Shunseii/bahar
Length of output: 3564
🏁 Script executed:
rg "WebhookSubscriptionUpdatedPayload" -A 10 --max-count=5
Repository: Shunseii/bahar
Length of output: 1331
🏁 Script executed:
rg "webhook" -i -A 3 -B 3 apps/api/src/ | head -100
Repository: Shunseii/bahar
Length of output: 3629
🏁 Script executed:
find apps -name "*.ts" -type f | xargs grep -l "schema\|migration" | head -10
Repository: Shunseii/bahar
Length of output: 361
🏁 Script executed:
ls -la apps/api/src/db/schema/
Repository: Shunseii/bahar
Length of output: 411
🏁 Script executed:
cat apps/api/src/db/schema/auth.ts
Repository: Shunseii/bahar
Length of output: 4004
Implement webhook idempotency to prevent reprocessing of duplicate or manually redelivered subscription events.
The webhook handler applies subscription status changes without deduplication. While Polar guarantees ordering for normal deliveries, manual redelivery of failed events bypasses this guarantee. If an older canceled event is manually redelivered after a newer active event, the user's subscription state will flip incorrectly.
Persist the unique webhook-id header (unique per delivery attempt) from the webhook payload to detect and reject duplicate deliveries. Check for processed webhook IDs before updating the user's subscription status in the database.
Lines 354-358, 406-412
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/api/src/auth.ts` around lines 354 - 358, The webhook handler currently
writes subscription updates (using payload.timestamp, plan, productId, status)
without deduplication; fix by implementing idempotency: read the incoming
webhook delivery id from the "webhook-id" header, persist that id in the same
record you write/associate with the user's subscription, and check for an
existing processed webhook id before applying updates in the subscription update
logic (the block that creates/updates the user's subscription around
payload.timestamp, plan, productId, status). If the webhook-id already exists,
skip processing; otherwise write the webhook-id atomically with the subscription
change to prevent replayed or manually redelivered events from flipping state
(address both places noted in the diff).
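The dedup check this prompt calls for reduces to the in-memory sketch below. A real implementation would persist the "webhook-id" header in a DB table, written atomically with the subscription update; the `Set` here only illustrates the control flow.

```typescript
// In-memory sketch of webhook idempotency. In production the processed IDs
// would live in a DB table written in the same transaction as the
// subscription change, not in process memory.
const processedWebhookIds = new Set<string>();

function handleDelivery(webhookId: string, apply: () => void): "applied" | "skipped" {
  if (processedWebhookIds.has(webhookId)) {
    return "skipped"; // duplicate or manually redelivered event
  }
  apply();
  processedWebhookIds.add(webhookId); // atomic with the write in real code
  return "applied";
}
```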
```ts
const userSessions = await db
  .select({ token: sessions.token })
  .from(sessions)
  .where(eq(sessions.userId, customer.externalId));

await Promise.all(
  userSessions.map((s) => redisClient.del(s.token))
);
```
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
rg -n -C2 'cookieCache|SESSION_COOKIE_CACHE_EXPIRY_SECS|redisClient\.del|planGuard|subscriptionStatus' apps/api/src/auth.ts apps/api/src/middleware.ts
Repository: Shunseii/bahar
Length of output: 2270
Redis eviction won't make plan changes effective immediately.
Line 420 clears Better Auth's Redis-backed sessions, but session.cookieCache is enabled for 300 seconds in this file, and apps/api/src/middleware.ts:69 authorizes from cached user.subscriptionStatus. Upgrades and cancellations remain stale until the cookie cache expires. Either disable cookie caching for plan-gated auth or query the latest plan/status directly from the users table inside the guard.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@apps/api/src/auth.ts` around lines 414 - 421, The Redis deletion of sessions
doesn't update the 300s cookie cache used by plan-gated auth, so either disable
the cookie cache or bypass it by reading the live subscription status from the
users table in the auth guard; specifically either set session.cookieCache to
0/false in apps/api/src/auth.ts (remove/disable the 300s cache) or change the
authorization logic in apps/api/src/middleware.ts (the guard that reads
user.subscriptionStatus from the cached cookie) to ignore the cookie and perform
a fresh DB lookup on the users table for the user's subscriptionStatus (use the
existing users model/query functions) before enforcing plan changes.
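The second option (a fresh DB lookup in the guard) can be sketched as below. `lookupStatus` is a stand-in for a Drizzle select on the users table; the names are illustrative, not the app's actual middleware API.

```typescript
// Sketch of plan gating that ignores the (up to 300s stale) cookie cache
// and reads live subscription state instead. `lookupStatus` stands in for
// a Drizzle query against the users table.
type SubscriptionStatus = "active" | "canceled";

function requireActivePlan(
  userId: string,
  lookupStatus: (id: string) => SubscriptionStatus,
): boolean {
  const status = lookupStatus(userId); // fresh read, not the cached cookie
  return status === "active";
}
```

The trade-off is one extra DB read per plan-gated request in exchange for plan changes taking effect immediately after the webhook fires.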
Summary by CodeRabbit