No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID:
📒 Files selected for processing (1)
🚧 Files skipped from review as they are similar to previous changes (1)
📝 Walkthrough

Client-side enrichment for ratelimit logs was added: base ClickHouse log queries now return core fields only; a new enrichment query and TRPC endpoint fetch supplementary request data. The client hook batches, caches, and merges enrichment into logs; server-side error handling for logs vs. count queries was separated.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client as Client Component
    participant Hook as use-logs-query Hook
    participant Cache as Enrichment Cache
    participant TRPC as TRPC Server
    participant CH as ClickHouse
    Client->>Hook: request logs
    Hook->>CH: query base logs (request_id, time, identifier, status)
    CH-->>Hook: base RatelimitLog[]
    Hook->>Cache: check for missing enrichment
    Hook->>Hook: batch missing request_ids (chunks of 100)
    loop per batch
        Hook->>TRPC: ratelimit.logs.enrichment(requestIds, startTime, endTime)
        TRPC->>CH: getRatelimitLogEnrichment(requestIds, timeRange)
        CH-->>TRPC: enrichment[]
        TRPC-->>Hook: { enrichment }
        Hook->>Cache: store enrichment (enrichmentMapRef)
    end
    Hook->>Hook: merge enrichment into logs => EnrichedRatelimitLog[]
    Hook-->>Client: return historicalLogs & realtimeLogs (enriched)
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes

Possibly related PRs
🚥 Pre-merge checks | ✅ 1 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)
✅ Passed checks (1 passed)

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing Touches
📝 Generate docstrings
🧪 Generate unit tests (beta)
Actionable comments posted: 2
🧹 Nitpick comments (2)
web/apps/dashboard/lib/trpc/routers/ratelimit/query-overview-logs/index.ts (1)
58-77: Use the `fault` library for error handling as per coding guidelines.

The error branches at lines 59–77 use direct `console.*` + `TRPCError`. Per the coding guidelines for `**/*.{ts,tsx}`, apply the `fault` library for error handling to align with project standards.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@web/apps/dashboard/lib/trpc/routers/ratelimit/query-overview-logs/index.ts` around lines 58-77: Replace the direct console logging and manual TRPCError usage with the project's fault-based error handling: import and use the fault library in this module, and for the logsResult branch create and throw a fault error that wraps the original logsResult.err and includes context (namespaceId, descriptive message) instead of throwing TRPCError; for the countResult branch, record the non-fatal failure via fault (e.g., capture/log a warning with countResult.err and namespaceId) and continue returning logs without total. Ensure you reference logsResult and countResult when building the fault context so the original error and namespaceId are attached.

web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts (1)
152-169: Batch `requestIds` before calling the enrichment route.

The new endpoint caps `requestIds` at 100, but this helper sends every missing ID in one call. If a caller ever raises `limit` above 100, the whole enrichment fetch will fail validation and backfill nothing.

📦 Suggested batching

```diff
   try {
-    const result = await queryClient.ratelimit.logs.enrichment.fetch({
-      requestIds: unenrichedIds,
-      startTime: minTime,
-      endTime: maxTime,
-    });
-
-    if (result.enrichment.length > 0) {
-      for (const item of result.enrichment) {
-        enrichmentMapRef.current.set(item.request_id, item);
-      }
-      // Trigger re-render to merge enrichment into displayed logs
-      setEnrichmentVersion((v) => v + 1);
+    let updated = false;
+
+    for (let i = 0; i < unenrichedIds.length; i += 100) {
+      const requestIds = unenrichedIds.slice(i, i + 100);
+      const result = await queryClient.ratelimit.logs.enrichment.fetch({
+        requestIds,
+        startTime: minTime,
+        endTime: maxTime,
+      });
+
+      for (const item of result.enrichment) {
+        enrichmentMapRef.current.set(item.request_id, item);
+        updated = true;
+      }
+    }
+
+    if (updated) {
+      setEnrichmentVersion((v) => v + 1);
     }
   } catch (error) {
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts` around lines 152-169: The enrichment call currently sends all missing IDs at once, which will fail when requestIds > 100; change the logic in use-logs-query.ts to batch unenrichedIds into chunks of at most 100 and call queryClient.ratelimit.logs.enrichment.fetch for each chunk (either sequentially or via Promise.all), then merge each chunk's results into enrichmentMapRef.current; ensure you still compute minTime/maxTime once (from times) and pass them to every fetch call, and handle/aggregate errors per-chunk so one oversized batch doesn't prevent partial backfill.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@web/apps/dashboard/components/logs/details/log-details/index.tsx`:
- Around line 90-92: Remove the unsafe type assertion on line 91 and use the
existing isStandardLog type guard to narrow log before calling
extractResponseField; specifically, ensure you check isStandardLog(log) (or
otherwise branch where isStandardLog is true) and then call
extractResponseField(log, "meta") so TypeScript knows log is Log |
EnrichedRatelimitLog without using `as`; update the conditional that currently
checks "request_body"|"response_body" to incorporate the isStandardLog guard
before invoking extractResponseField.
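The guard-based narrowing described in this comment can be sketched as follows. The type shapes here are simplified stand-ins, since the real `Log` and `EnrichedRatelimitLog` definitions live in the dashboard codebase and are not shown in this review:

```typescript
// Simplified stand-ins for the real types (assumptions for illustration).
type Log = { request_id: string; request_body: string; response_body: string };
type EnrichedRatelimitLog = { request_id: string; identifier: string };

// Structural type guard: checking for fields unique to Log lets
// TypeScript narrow the union without an `as` assertion.
function isStandardLog(log: Log | EnrichedRatelimitLog): log is Log {
  return "request_body" in log && "response_body" in log;
}

function describe(log: Log | EnrichedRatelimitLog): string {
  if (isStandardLog(log)) {
    // `log` is narrowed to Log here; response fields are safe to read.
    return `standard:${log.response_body}`;
  }
  // Otherwise the compiler knows it is an EnrichedRatelimitLog.
  return `ratelimit:${log.identifier}`;
}
```

The point of the pattern is that field access after the guard is checked by the compiler, whereas an `as` cast silences it.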
In `@web/internal/clickhouse/src/ratelimits.ts`:
- Around line 374-389: The query currently pages only by time which skips rows
with identical time; change pagination to use a stable (time, request_id)
cursor: add a new parameter cursorRequestId and replace the time-only cursor
check with a tuple comparison such as ({cursorTime} IS NULL OR (time <
{cursorTime} OR (time = {cursorTime} AND request_id < {cursorRequestId}))),
update ORDER BY to "ORDER BY time DESC, request_id DESC", and ensure the calling
logic (e.g., getRatelimitOverviewLogs and its router/client cursor payload)
includes and forwards cursorRequestId alongside cursorTime so the next-page
cursor contains both values.
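The tuple-cursor predicate described in this comment can be modeled outside SQL to make the ordering concrete. This is a sketch with hypothetical field names following the review's suggestion, not code from the actual `getRatelimitOverviewLogs` implementation:

```typescript
interface Cursor {
  time: number;
  requestId: string;
}

// Mirrors the suggested ClickHouse predicate:
//   cursor IS NULL
//   OR time < cursorTime
//   OR (time = cursorTime AND request_id < cursorRequestId)
// With ORDER BY (time DESC, request_id DESC), rows satisfying this
// belong to the next page. Ties on time no longer skip rows, because
// request_id breaks the tie deterministically.
function belongsToNextPage(
  row: { time: number; requestId: string },
  cursor: Cursor | null,
): boolean {
  if (cursor === null) return true;
  return (
    row.time < cursor.time ||
    (row.time === cursor.time && row.requestId < cursor.requestId)
  );
}
```

A time-only cursor (`time < cursorTime`) would drop every row sharing the cursor's timestamp; the tuple comparison keeps them.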
---
Nitpick comments:
In `@web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts`:
- Around line 152-169: The enrichment call currently sends all missing IDs at
once which will fail when requestIds > 100; change the logic in
use-logs-query.ts to batch unenrichedIds into chunks of at most 100 and call
queryClient.ratelimit.logs.enrichment.fetch for each chunk (either sequentially
or via Promise.all), then merge each chunk's results into
enrichmentMapRef.current; ensure you still compute minTime/maxTime once (from
times) and pass them to every fetch call and handle/aggregate errors per-chunk
so one oversized batch doesn't prevent partial backfill.
In `@web/apps/dashboard/lib/trpc/routers/ratelimit/query-overview-logs/index.ts`:
- Around line 58-77: Replace the direct console logging and manual TRPCError
usage with the project's fault-based error handling: import and use the fault
library in this module and for the logsResult branch create and throw a fault
error that wraps the original logsResult.err and includes context (namespaceId,
descriptive message) instead of throwing TRPCError; for the countResult branch
record the non-fatal failure via fault (e.g., capture/log a warning with
countResult.err and namespaceId) and continue returning logs without total.
Ensure you reference logsResult and countResult when building the fault context
so the original error and namespaceId are attached.
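Since the `fault` library's actual API is not shown anywhere in this review, the intent of the prompt can only be sketched with a generic context-carrying error. Every name below is hypothetical and stands in for whatever the real library provides:

```typescript
// Hypothetical context-carrying error, standing in for the project's
// `fault` library (whose real API is not shown here). It preserves the
// original error as `cause` and attaches structured context such as
// namespaceId, so the failure is debuggable without console logging.
class ContextError extends Error {
  constructor(
    message: string,
    public readonly context: Record<string, unknown>,
    public readonly cause?: unknown,
  ) {
    super(message);
    this.name = "ContextError";
  }
}

// Fatal path (the logs query itself failed): wrap and rethrow with context,
// matching the prompt's instruction for the logsResult branch.
function failLogsQuery(err: unknown, namespaceId: string): never {
  throw new ContextError("failed to fetch ratelimit logs", { namespaceId }, err);
}
```

The non-fatal countResult branch would instead record a warning with the same context and return logs without a total, rather than throwing.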
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 5e52e174-8b9a-437e-a144-017f84453377
📒 Files selected for processing (13)
- web/apps/dashboard/app/(app)/[workspaceSlug]/logs/utils.ts
- web/apps/dashboard/app/(app)/[workspaceSlug]/projects/[projectId]/(overview)/requests/utils.ts
- web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/_overview/components/table/logs-table.tsx
- web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts
- web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/logs-table.tsx
- web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/context/logs.tsx
- web/apps/dashboard/components/logs/details/log-details/index.tsx
- web/apps/dashboard/lib/trpc/routers/index.ts
- web/apps/dashboard/lib/trpc/routers/ratelimit/query-logs/enrichment.ts
- web/apps/dashboard/lib/trpc/routers/ratelimit/query-logs/index.ts
- web/apps/dashboard/lib/trpc/routers/ratelimit/query-overview-logs/index.ts
- web/internal/clickhouse/src/index.ts
- web/internal/clickhouse/src/ratelimits.ts
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts`:
- Around line 224-229: The realtime row insertion currently creates enriched
rows with only ENRICHMENT_DEFAULTS and never applies later-fetched metadata from
enrichmentMapRef, so update the logic in use-logs-query (the code that builds
`enriched` and sets `newMap`) to re-merge enrichment after enrichmentMapRef is
updated: when enrichmentMapRef.current has data for a request_id, merge that
object into the `enriched` object (same way as for historical rows) and update
`newMap.set(log.request_id, enriched)`; also ensure the code path that rebuilds
`historicalLogsMap` (the block around `historicalLogsMap`) also rebuilds the
realtime/newMap entries or triggers the same merge routine so live-polled rows
pick up fetched metadata without a full reload.
- Around line 56-57: The enrichment cache (enrichmentMapRef) is never evicted or
cleared, causing unbounded memory growth; modify the use-logs-query hook to
either cap entries (implement simple LRU or max size) or clear the map on query
resets/changes and bump enrichmentVersion to force UI updates; specifically
update logic around enrichmentMapRef and setEnrichmentVersion in the
useLogsQuery hook (and the related code referenced at the other occurrence) to
call enrichmentMapRef.current.clear() (or trim entries) whenever the
query/filters/page/long-polling session resets and/or when you programmatically
evict oldest entries once a configurable maxEntries threshold is reached.
- Around line 214-230: The updater for setRealtimeLogsMap mutates newLogs inside
the updater (using newLogs.push) which is unsafe because React may call updaters
async or multiple times; instead, compute the list of logs-to-add outside the
updater and then call setRealtimeLogsMap to merge them in. Concretely: iterate
over result.ratelimitLogs to build a local array (e.g., logsToAdd) skipping
entries present in realtime/historical maps or enrichmentMapRef, then call
setRealtimeLogsMap(prev => { const newMap = new Map(prev); for (const log of
logsToAdd) newMap.set(log.request_id, { ...ENRICHMENT_DEFAULTS, ...log,
...(enrichmentMapRef.current.get(log.request_id) ?? {}) }); return newMap; });
finally call fetchEnrichment(logsToAdd) (or the existing enrichment fetch) with
that local array — remove any newLogs.push usage from inside the updater.
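For the cache-eviction comment above (lines 56-57), a minimal size cap can exploit the fact that JavaScript Maps iterate in insertion order. This is a sketch; `maxEntries` and the evict-oldest-insertion policy are assumptions, not the hook's actual behavior:

```typescript
// Insert into a Map while keeping at most `maxEntries` entries,
// evicting the oldest-inserted key first. JS Maps iterate in insertion
// order, so the first key returned by keys() is the oldest. Note this
// is insertion-order eviction, not true LRU: re-setting an existing
// key does not refresh its position.
function setWithCap<K, V>(
  map: Map<K, V>,
  key: K,
  value: V,
  maxEntries: number,
): void {
  if (!map.has(key) && map.size >= maxEntries) {
    const oldest = map.keys().next().value as K;
    map.delete(oldest);
  }
  map.set(key, value);
}
```

In the hook, calls like `enrichmentMapRef.current.set(...)` would go through such a helper, alongside clearing the map on query resets as the comment suggests.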
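For the updater-purity comment above (lines 214-230), the safe pattern is to precompute the additions outside the updater and keep the updater itself a pure merge. The shapes below are simplified hypothetical stand-ins for the hook's real types:

```typescript
// Simplified stand-in for the real log type (assumption for illustration).
type RatelimitLog = { request_id: string; identifier: string };

// Pure merge: returns a fresh Map without mutating its inputs or any
// outside state, so React can invoke it once, twice, or lazily with the
// same result. Side effects (choosing which logs to add, fetching
// enrichment) happen before or after the setState call, never inside it.
function mergeIntoMap(
  prev: Map<string, RatelimitLog>,
  logsToAdd: RatelimitLog[],
): Map<string, RatelimitLog> {
  const next = new Map(prev);
  for (const log of logsToAdd) {
    next.set(log.request_id, log);
  }
  return next;
}
```

In the hook this would be used as `setRealtimeLogsMap((prev) => mergeIntoMap(prev, logsToAdd))`, with `logsToAdd` computed from the poll result beforehand and then handed to the enrichment fetch.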
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 9111d417-0e93-43f5-a7a8-03489c780557
📒 Files selected for processing (2)
- web/apps/dashboard/app/(app)/[workspaceSlug]/ratelimits/[namespaceId]/logs/components/table/hooks/use-logs-query.ts
- web/apps/dashboard/components/logs/details/log-details/index.tsx
🚧 Files skipped from review as they are similar to previous changes (1)
- web/apps/dashboard/components/logs/details/log-details/index.tsx
What does this PR do?
The ratelimit logs query was LEFT JOINing against api_requests_raw_v2 over the full time window, causing ClickHouse timeouts for workspaces with heavy verifyKey usage (e.g., 2B+ requests/day), even though their ratelimit data was small.
Fixes UNKEY-DASHBOARD-4W
Changes
Type of change
How should this be tested?
Test plan
Checklist
Required
pnpm build
pnpm fmt
make fmt on /go directory
console.logs
git pull origin main

Appreciated