
chore: Sprint 8B.2 NX Monorepo & Provider Cleanup [V2]#153

Merged
frigini merged 49 commits into master from feature/sprint-8b2-monorepo-cleanup
Mar 13, 2026
Conversation

@frigini (Owner) commented Mar 9, 2026

Part 2 of Sprint 8B.2. Contains the structural changes: NX Monorepo root cleanup and removal of the MeAjudaAi.Web.Provider project. Re-opened to ensure the .coderabbit.yaml filter correctly excludes the deleted files.

Summary by CodeRabbit

Release Notes

  • New Features

    • Frontend infrastructure modernization with an optimized workspace configuration.
    • Improved API client generation and integration.
  • Bug Fixes

    • Improved handling of authentication errors.
    • Safer null-value handling in API responses.
  • Chores

    • Messaging infrastructure modernization.
    • Project configuration cleanup and optimization.

@frigini frigini force-pushed the feature/sprint-8b2-monorepo-cleanup branch from 11c5c68 to 69c8b6e on March 9, 2026 at 19:16
coderabbitai bot (Contributor) commented Mar 9, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.
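The note names the setting path but not its shape on disk; a hypothetical `.coderabbit.yaml` fragment (the threshold value shown is an assumption, only the key path comes from the note above):

```yaml
# Hypothetical fragment — only the setting path is taken from CodeRabbit's note;
# the numeric threshold here is an illustrative assumption.
reviews:
  auto_review:
    auto_pause_after_reviewed_commits: 40
```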

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

Systematic removal of Azure Service Bus support; messaging standardized on RabbitMQ/NoOp; heavy pruning of dependencies/lockfiles; addition of an Nx/Next.js workspace; introduction of EnvironmentHelpers and connection fallbacks; removal of docs/configs and adjustments to CI/workflow.

Changes

Cohort / File(s) — Summary
Messaging core & DLQ
src/Shared/Messaging/..., src/Shared/Messaging/DeadLetter/*, src/Shared/Messaging/Options/*
ServiceBus options/paths eliminated; DLQ flow consolidated onto RabbitMQ/NoOp; removals and renames in the public DLQ extensions.
Service Bus impls & managers
src/Shared/Messaging/ServiceBus/*, src/Shared/Messaging/Factories/*
Removed ServiceBusMessageBus, ServiceBusTopicManager, ServiceBusDeadLetterService, and their associated interfaces and factories.
Message bus selection & wiring
src/Shared/Messaging/Factories/MessageBusFactory.cs, src/Shared/Messaging/MessagingExtensions.cs, src/Shared/Extensions/ServiceCollectionExtensions.cs
Factory/extensions now prioritize RabbitMQ with a NoOp fallback; Service Bus registrations, clients, and provisioning removed.
RabbitMQ infra changes
src/Shared/Messaging/RabbitMq/...
Simplification: a single shared exchange is created with bindings to queues; ITopicStrategySelector and per-event logic removed.
Dead-letter runtime
src/Shared/Messaging/DeadLetter/*, src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs
RabbitMqDeadLetterService now implements IDisposable with lifecycle control and cancellation; logging/validation reorganized; ServiceBus-specific helpers removed.
Tests & mocks
tests/..., tests/.../Mocks/*
Service Bus and DLQ test suites removed; mocks migrated to RabbitMQ semantics (Published→Recorded); test helpers adapted.
Dependencies & lockfiles
Directory.Packages.props, src/Shared/*.csproj, **/packages.lock.json
References to Azure.Messaging.ServiceBus / Rebus.AzureServiceBus / Aspire.Hosting.Azure.ServiceBus removed; massive pruning of transitive entries in the lockfiles.
Infrastructure as Code
infrastructure/main.bicep, infrastructure/servicebus.bicep
Azure Service Bus modules/resources/outputs removed.
AppHost startup wiring
src/Aspire/MeAjudaAi.AppHost/Program.cs, src/Aspire/.../packages.lock.json
Startup repointed to RabbitMQ (builder.AddRabbitMQ); web app path validation; Service Bus references removed.
Frontend monorepo & Next.js
src/Web/*, src/Web/libs/*, src/Web/nx.json, src/Web/package.json
Added the Nx/Next.js workspace, libs (ui, auth, api-client), ESLint/Prettier/Jest/tsconfig configs, and scripts.
Frontend auth & middleware
src/Web/MeAjudaAi.Web.Customer/*
NextAuth restructuring: authOptions exposed, auth(...) wrapper added, middleware migrated to withAuth with an authorized callback; the auth route re-exports a single handler.
Config/docs & repo config
config/*, api/README.md, automation/README.md, .coderabbit.yaml, .gitignore, .gitleaks.toml, .lycheeignore
Several configuration/documentation files removed; two reviews.path_filters entries added and .gitignore expanded; .gitleaks.toml and .lycheeignore removed.
CI pipeline
.github/workflows/pr-validation.yml
Workflow expanded: Node/Nx steps, OpenAPI/client generation, and a frontend build added; coverage migrated to Cobertura; steps reorganized.
Env helpers & DB fallbacks
src/Shared/Utilities/EnvironmentHelpers.cs, src/Shared/Database/DatabaseConstants.cs, multiple .../Extensions.cs
New IsSecurityBypassEnvironment helper; security-bypass checks centralized; DefaultTestConnectionString added and S2068 pragmas applied where needed.
API null-safety & result types
src/Contracts/Functional/Result.cs, assorted endpoints
Strengthened invariants and null-safety in Result/Result<T>; small null-operator usage fixes in endpoints.
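The "Frontend auth & middleware" row centers on an authorized callback. A self-contained sketch of that pattern (token shape and route names are hypothetical; the real code would use next-auth's withAuth rather than a standalone function):

```typescript
// Sketch of the `authorized` callback pattern behind withAuth-style middleware.
// The token shape and public-route list are illustrative assumptions, not
// taken from the PR.
type Token = { sub: string } | null;

// Routes that must stay reachable without a session.
const PUBLIC_PREFIXES = ["/login", "/api/auth"];

// Conceptually mirrors next-auth's `callbacks.authorized`: return true to let
// the request through, false to redirect the user to sign-in.
function authorized(token: Token, pathname: string): boolean {
  if (PUBLIC_PREFIXES.some((prefix) => pathname.startsWith(prefix))) {
    return true;
  }
  return token !== null;
}

console.log(authorized(null, "/login")); // public route, allowed
console.log(authorized(null, "/dashboard")); // anonymous, blocked
console.log(authorized({ sub: "user-42" }, "/dashboard")); // authenticated, allowed
```

In the real middleware this function body would live inside `withAuth(..., { callbacks: { authorized } })`, with next-auth supplying the decoded token.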

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant App as Application
    participant Factory as MessageBusFactory
    participant Rabbit as RabbitMqMessageBus
    participant Broker as RabbitMQ
    participant DLQ as DeadLetterService

    App->>Factory: Resolve IMessageBus
    Factory->>Rabbit: Initialize/return RabbitMqMessageBus
    App->>Rabbit: Send/Publish(message)
    Rabbit->>Broker: Publish to shared exchange
    Broker-->>Rabbit: Ack / Nack
    alt Processing failure
        Rabbit->>DLQ: Enqueue / forward to dead letter
        DLQ-->>Rabbit: Log / Retry / Discard
    end
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~75 minutes

Possibly related PRs

Poem

🐰 Off the Service Bus I hopped with a nimble leap,

Traded topics for exchanges in an embrace to keep.
Queues croak in the Rabbit, the DLQ standing guard,
Mocks record their stories, validating isn't hard.
Monorepo and pipelines — carrots to celebrate!


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 9

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs (1)

1-1: ⚠️ Potential issue | 🟡 Minor

Remove unused import.

The MeAjudaAi.Shared.Messaging.Strategy namespace import is unused in this file.

Remove stale import
```diff
-using MeAjudaAi.Shared.Messaging.Strategy;
 using MeAjudaAi.Shared.Messaging.Options;
 using Microsoft.Extensions.Logging;
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs` at line 1,
Remove the unused using directive for the MeAjudaAi.Shared.Messaging.Strategy
namespace from the RabbitMqInfrastructureManager.cs file; locate the using line
"using MeAjudaAi.Shared.Messaging.Strategy;" at the top of the
RabbitMqInfrastructureManager class and delete it so only necessary imports
remain.
🧹 Nitpick comments (10)
src/Shared/Messaging/MessagingExtensions.cs (1)

68-68: Remove commented-out code or link to tracking issue.

Commented code without context degrades readability. The inline comment explains the removal reason, but consider removing this line entirely since the TopicStrategySelector removal is already tracked in the PR.

🧹 Remove commented code
```diff
         services.AddSingleton<IEventTypeRegistry, EventTypeRegistry>();
-        // services.AddSingleton<ITopicStrategySelector, TopicStrategySelector>(); // Removed ASB Topic Strategy
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/MessagingExtensions.cs` at line 68, Remove the
commented-out registration line for TopicStrategySelector—specifically the
commented "services.AddSingleton<ITopicStrategySelector,
TopicStrategySelector>(); // Removed ASB Topic Strategy"—to clean up
MessagingExtensions.cs; if you need to retain traceability, replace the comment
with a brief reference to the PR or issue ID (not the commented code) instead.
src/Shared/Messaging/Factories/MessageBusFactory.cs (1)

62-66: Consider adding error handling for production RabbitMQ initialization.

The development/testing path (lines 45-54) includes try-catch with NoOp fallback, but production throws unhandled exceptions if RabbitMQ fails. While failing fast in production may be intentional, consider wrapping this in a try-catch to provide a clearer error message or at least log the failure before propagating.

🛡️ Optional: Add explicit error handling for production
```diff
         else
         {
-            _logger.LogInformation("Creating RabbitMQ MessageBus as default for environment: {Environment}", _environment.EnvironmentName);
-            return _serviceProvider.GetRequiredService<RabbitMqMessageBus>();
+            try
+            {
+                _logger.LogInformation("Creating RabbitMQ MessageBus as default for environment: {Environment}", _environment.EnvironmentName);
+                return _serviceProvider.GetRequiredService<RabbitMqMessageBus>();
+            }
+            catch (Exception ex)
+            {
+                _logger.LogCritical(ex, "Failed to create RabbitMQ MessageBus in production environment: {Environment}", _environment.EnvironmentName);
+                throw;
+            }
         }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/Factories/MessageBusFactory.cs` around lines 62 - 66,
Wrap the production branch in MessageBusFactory where you return the
RabbitMqMessageBus (the else that calls
_serviceProvider.GetRequiredService<RabbitMqMessageBus> and logs "Creating
RabbitMQ MessageBus...") in a try-catch; on exception catch (Exception ex) use
_logger.LogError(ex, "Failed to initialize RabbitMQ MessageBus for environment
{Environment}", _environment.EnvironmentName) and then either rethrow the
original exception or throw a new InvalidOperationException with contextual text
so callers see a clear message rather than an unlogged failure.
src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs (1)

115-170: Consider consolidating duplicate infrastructure logging logic.

EnsureDeadLetterInfrastructureAsync (lines 115-140) and LogDeadLetterInfrastructureInfo (lines 145-170) contain nearly identical logic—both check IsDevelopment() || IsProduction() and log the same RabbitMQ infrastructure message. This duplication can lead to maintenance drift.

Consider extracting the common logic into a private helper method.

♻️ Optional: Extract common logic
```diff
+    private static void LogRabbitMqInfrastructureStatus(IServiceScope scope)
+    {
+        var environment = scope.ServiceProvider.GetRequiredService<IHostEnvironment>();
+        var logger = scope.ServiceProvider.GetRequiredService<ILogger<IDeadLetterService>>();
+
+        if (environment.IsDevelopment() || environment.IsProduction())
+        {
+            logger.LogInformation("Dead Letter infrastructure for RabbitMQ will be created dynamically");
+        }
+    }
+
     public static Task EnsureDeadLetterInfrastructureAsync(this IHost host)
     {
         using var scope = host.Services.CreateScope();
 
         try
         {
-            var environment = scope.ServiceProvider.GetRequiredService<IHostEnvironment>();
-            var logger = scope.ServiceProvider.GetRequiredService<ILogger<IHostEnvironment>>();
-
-            if (environment.IsDevelopment() || environment.IsProduction())
-            {
-                logger.LogInformation("Dead Letter infrastructure for RabbitMQ will be created dynamically");
-            }
-
+            LogRabbitMqInfrastructureStatus(scope);
             return Task.CompletedTask;
         }
         // ... catch block unchanged
     }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 115 -
170, Both EnsureDeadLetterInfrastructureAsync and
LogDeadLetterInfrastructureInfo duplicate the same environment check and log
message; extract that into a single private helper to avoid duplication. Add a
private static method like LogRabbitMqInfrastructure<TLogger>(IServiceProvider
services) that resolves IHostEnvironment and ILogger<TLogger>, performs the
environment.IsDevelopment() || environment.IsProduction() check, and logs the
RabbitMQ message; then replace the duplicated blocks in
EnsureDeadLetterInfrastructureAsync and LogDeadLetterInfrastructureInfo with
calls to this helper (passing the scope.ServiceProvider and the appropriate
logger generic type).
src/Aspire/MeAjudaAi.AppHost/Program.cs (1)

183-188: Fail fast on the moved customer app path.

This repo path is now part of the monorepo contract, so it’s worth normalizing and validating it here instead of letting AddJavaScriptApp fail later with a less actionable error.

Proposed hardening
```diff
-        var customerWebPath = Path.Combine(builder.AppHostDirectory, "..", "..", "..", "src", "Web", "MeAjudaAi.Web.Customer");
+        var customerWebPath = Path.GetFullPath(Path.Combine(
+            builder.AppHostDirectory, "..", "..", "..", "src", "Web", "MeAjudaAi.Web.Customer"));
+
+        if (!Directory.Exists(customerWebPath))
+        {
+            throw new DirectoryNotFoundException($"Customer web app path not found: {customerWebPath}");
+        }
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Aspire/MeAjudaAi.AppHost/Program.cs` around lines 183 - 188, The code
constructs customerWebPath and calls builder.AddJavaScriptApp("customer-web",
customerWebPath) but does not validate the moved/normalized path; add a
normalization and existence check before calling AddJavaScriptApp: compute the
full path (use Path.GetFullPath on the combined path), verify
Directory.Exists(fullPath) (or File.Exists if applicable), and if the path is
missing or invalid log a clear error via the same logger/process mechanism and
fail fast (throw or Environment.Exit(1)); update references around
customerWebPath and the AddJavaScriptApp call in Program.cs to use the validated
fullPath.
src/Web/.eslintrc.json (1)

14-19: Consider defining stricter dependency constraints as the workspace matures.

The current depConstraints configuration allows any project to depend on any other project ("*"["*"]). While acceptable for initial setup, defining explicit tags (e.g., scope:shared, scope:feature, type:ui) will help enforce architectural boundaries as the monorepo grows.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/.eslintrc.json` around lines 14 - 19, The current depConstraints
block (depConstraints, sourceTag, onlyDependOnLibsWithTags) uses wildcards and
should be tightened: define meaningful tags for projects (e.g., scope:shared,
scope:feature, type:ui) in your project configuration and replace the wildcard
rule with explicit entries mapping each sourceTag to the allowed
onlyDependOnLibsWithTags for that tag; update the depConstraints array to list
one rule per sourceTag (e.g., "sourceTag":"scope:feature" ->
"onlyDependOnLibsWithTags":["scope:shared","type:ui"]) so the linter enforces
the intended architectural boundaries.
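An explicit constraint set along the lines suggested above might look like the following (the tag names are the prompt's illustrations, not tags that exist in the repo):

```json
{
  "depConstraints": [
    { "sourceTag": "scope:feature", "onlyDependOnLibsWithTags": ["scope:shared", "type:ui"] },
    { "sourceTag": "type:ui", "onlyDependOnLibsWithTags": ["scope:shared"] },
    { "sourceTag": "scope:shared", "onlyDependOnLibsWithTags": ["scope:shared"] }
  ]
}
```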
src/Web/.nx/workspace-data/eslint-2654242865465226088.hash (1)

1-93: Add .nx/workspace-data/ to .gitignore.

Multiple Nx generated cache files are currently tracked in source control (eslint hashes, jest hashes, project-graph.json, file-map.json, and others). These are local workspace state that regenerate on each run and should be excluded from version control to prevent machine-specific diffs and unnecessary merge churn.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/.nx/workspace-data/eslint-2654242865465226088.hash` around lines 1 -
93, The repo is currently tracking Nx workspace cache files; update the
.gitignore to exclude the Nx workspace-data artifacts by adding an entry for
.nx/workspace-data/ and patterns for generated files such as eslint-*.hash,
jest-*.hash, project-graph.json, file-map.json (and any other files under
.nx/workspace-data/) so these local cache/state files are not committed; remove
any already-tracked files from git index after updating .gitignore (git rm
--cached ...) to stop future commits from including them.
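The suggested ignore rule could be expressed as the following `.gitignore` fragment (the unanchored `**/` prefix is an assumption so the pattern matches `.nx/workspace-data` nested under `src/Web/` as well as at the repo root):

```
# Nx local workspace cache — regenerated on every run, never commit
**/.nx/workspace-data/
```

After adding the rule, already-tracked files still need `git rm -r --cached` (as the comment notes) before they disappear from future commits.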
src/Web/libs/ui/tsconfig.json (1)

11-15: Consider adding test config reference if UI library has tests.

The references array only includes tsconfig.lib.json. If the UI library will have unit tests, consider adding a reference to tsconfig.spec.json for consistency with other libraries like api-client.

📝 Suggested addition if tests are planned
```diff
   "references": [
     {
       "path": "./tsconfig.lib.json"
-    }
+    },
+    {
+      "path": "./tsconfig.spec.json"
+    }
   ],
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/libs/ui/tsconfig.json` around lines 11 - 15, The tsconfig.json
references array currently only points to tsconfig.lib.json; if this UI library
will contain unit tests, add a reference entry for tsconfig.spec.json so the
project references include test config as well—update the "references" array in
src/Web/libs/ui/tsconfig.json to include both the existing "./tsconfig.lib.json"
and a "./tsconfig.spec.json" reference (matching the pattern used in other libs
like api-client).
src/Web/libs/api-client/tsconfig.spec.json (1)

5-6: Consider using "node" instead of "node10" for moduleResolution.

"node10" is the legacy alias for "node" module resolution. While functionally equivalent, "node" is the canonical name and slightly clearer.

📝 Optional naming improvement
```diff
     "module": "commonjs",
-    "moduleResolution": "node10",
+    "moduleResolution": "node",
     "types": ["jest", "node"]
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/libs/api-client/tsconfig.spec.json` around lines 5 - 6, The tsconfig
uses the legacy moduleResolution value "node10"; update the "moduleResolution"
setting in tsconfig.spec.json from "node10" to the canonical "node" (i.e.,
change the value for the "moduleResolution" property) so the compiler uses the
modern, clearer Node resolution mode.
src/Web/tsconfig.base.json (1)

20-30: Consider adding a path alias for Shared.Contracts if backend integration is planned.

The path aliases cover the new frontend libraries. If this Web workspace will consume shared enums or constants from a backend Shared.Contracts module (as mentioned in project conventions), consider adding a path alias for it to maintain alignment between Web and backend code.

Based on learnings: "prefer reuse of Shared.Contracts for enums/constants to keep Web aligned with backend/shared code."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/tsconfig.base.json` around lines 20 - 30, The tsconfig "paths"
currently defines aliases "ui", "auth", and "api-client" but lacks an alias for
the backend-aligned Shared.Contracts module; add a new path entry
"Shared.Contracts" (e.g., pointing to "libs/shared/contracts/src/index.ts")
alongside the existing aliases in the "paths" object so frontend code can import
shared enums/constants using that alias, and update any import statements to use
"Shared.Contracts" where applicable (e.g., imports in components or services
that should use backend shared types).
src/Web/nx.json (1)

31-42: Exclude jest.config.cts from production inputs.

libs/api-client uses jest.config.cts, but the current exclusion only matches .js/.ts. Pure test-config edits will therefore invalidate production build inputs for that library.

♻️ Proposed fix
```diff
       "!{projectRoot}/eslint.config.mjs",
       "!{projectRoot}/jest.config.[jt]s",
+      "!{projectRoot}/jest.config.[cm][jt]s",
       "!{projectRoot}/test-setup.[jt]s"
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/nx.json` around lines 31 - 42, The production inputs list in nx.json
omits jest.config.cts, so changes to test config files like
libs/api-client/jest.config.cts can incorrectly affect the "production" inputs;
update the "production" inputs (the array containing "default" and the exclusion
patterns) to also exclude jest.config.cts by adding an exclusion for
"!{projectRoot}/jest.config.cts" (or broaden the existing pattern that matches
"jest.config.[jt]s" to include .cts) so test-config-only edits no longer
invalidate production builds.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.coderabbit.yaml:
- Line 20: The current negation pattern "!src/Web/libs/**" in .coderabbit.yaml
hides all changes under src/Web/libs; replace that broad pattern with a narrow
exclusion for the specific noisy subtree(s) only (e.g.,
"!src/Web/libs/MeAjudaAi.Web.Provider/**") or remove the negation entirely so
active Nx sources are reviewed; update the entry that contains
"!src/Web/libs/**" to reference the exact folder(s) you want ignored instead of
the whole libs root.

In `@src/Shared/Messaging/MessagingExtensions.cs`:
- Around line 7-8: The file contains a duplicate using directive for
MeAjudaAi.Shared.Messaging.RabbitMq in MessagingExtensions.cs which triggers
CS0105; remove the redundant using so the namespace is only imported once (leave
a single using MeAjudaAi.Shared.Messaging.RabbitMq and delete the duplicate
line) to eliminate the compiler warning.

In `@src/Web/.nx/workspace-data/d/server-process.json`:
- Around line 1-5: The committed runtime artifact server-process.json in
.nx/workspace-data contains machine-specific transient data and should be
removed from version control; add the .nx/ (or specifically .nx/workspace-data/)
pattern to .gitignore, then remove the file from the repo index (git rm --cached
src/Web/.nx/workspace-data/d/server-process.json) and commit the .gitignore
change so the daemon-generated server-process.json is no longer tracked.

In `@src/Web/.nx/workspace-data/lockfile-nodes.hash`:
- Line 1: The repo is tracking machine-generated Nx artifacts (notably
.nx/workspace-data/lockfile-nodes.hash) which should be untracked; remove the
tracked files from Git index (use git rm --cached for files under
.nx/workspace-data), commit that removal, and ensure the existing .gitignore
entries cover .nx/workspace-data so these files remain untracked going forward;
verify by running git status to confirm the artifacts are no longer tracked and
that only desired files are committed.

In `@src/Web/libs/api-client/package.json`:
- Around line 5-10: Update the package.json entrypoints so they point to the
build output under dist instead of src: change the "main" field from
"./src/index.js" to "./dist/libs/api-client/index.js" and the "types" field from
"./src/index.d.ts" to "./dist/libs/api-client/index.d.ts" so the
published/consumed package matches the Nx build output defined in project.json.

In `@src/Web/libs/auth/src/lib/auth.module.css`:
- Around line 1-7: auth.module.css is missing the .container class that auth.tsx
expects via styles['container']; add a .container CSS rule to the module (even
empty or with minimal layout styles) so the CSS module exports the container key
used by the component, e.g., create a .container { /* styles */ } selector in
auth.module.css to match the reference in auth.tsx.

In `@src/Web/package.json`:
- Around line 10-12: Remove the placeholder root test script from package.json
by deleting the "test" entry under "scripts" so the repository no longer defines
a workspace-level target that always exits 1; locate the "scripts" object in
src/Web/package.json and remove the "test": "echo \"Error: no test specified\"
&& exit 1" key/value pair (leaving other scripts or the scripts object as
appropriate) so the Nx graph no longer exposes a bogus '.' → 'test' target.
- Around line 2-3: Add "private": true to the workspace root package.json to
prevent accidental npm publication of the monorepo; update the top-level
package.json (the file shown in the diff where "name": "meajudaai" and
"version": "1.0.0" are declared) by inserting the "private": true property
alongside the existing fields.

In
`@tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`:
- Around line 9-12: The MockRabbitMqMessageBus is mixing Send with Publish via
EMessageType and the PublishedMessages collection, making the API misleading;
either (A) keep PublishedMessages publish-only and introduce a separate
SentMessages (or SentRecordedMessages) collection plus methods that append to
SentMessages when EMessageType.Send is used, or (B) rename PublishedMessages to
a neutral RecordedMessages and keep EMessageType to mark entries as Send vs
Publish; also move or reuse the EMessageType enum from Shared.Contracts (or
reference that shared enum) so the test contract aligns with production/shared
code and update all references in MockRabbitMqMessageBus, tests, and docs to the
chosen collection name and enum.

---

Outside diff comments:
In `@src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs`:
- Line 1: Remove the unused using directive for the
MeAjudaAi.Shared.Messaging.Strategy namespace from the
RabbitMqInfrastructureManager.cs file; locate the using line "using
MeAjudaAi.Shared.Messaging.Strategy;" at the top of the
RabbitMqInfrastructureManager class and delete it so only necessary imports
remain.

---

Nitpick comments:
In `@src/Aspire/MeAjudaAi.AppHost/Program.cs`:
- Around line 183-188: The code constructs customerWebPath and calls
builder.AddJavaScriptApp("customer-web", customerWebPath) but does not validate
the moved/normalized path; add a normalization and existence check before
calling AddJavaScriptApp: compute the full path (use Path.GetFullPath on the
combined path), verify Directory.Exists(fullPath) (or File.Exists if
applicable), and if the path is missing or invalid log a clear error via the
same logger/process mechanism and fail fast (throw or Environment.Exit(1));
update references around customerWebPath and the AddJavaScriptApp call in
Program.cs to use the validated fullPath.

In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs`:
- Around line 115-170: Both EnsureDeadLetterInfrastructureAsync and
LogDeadLetterInfrastructureInfo duplicate the same environment check and log
message; extract that into a single private helper to avoid duplication. Add a
private static method like LogRabbitMqInfrastructure<TLogger>(IServiceProvider
services) that resolves IHostEnvironment and ILogger<TLogger>, performs the
environment.IsDevelopment() || environment.IsProduction() check, and logs the
RabbitMQ message; then replace the duplicated blocks in
EnsureDeadLetterInfrastructureAsync and LogDeadLetterInfrastructureInfo with
calls to this helper (passing the scope.ServiceProvider and the appropriate
logger generic type).

In `@src/Shared/Messaging/Factories/MessageBusFactory.cs`:
- Around line 62-66: Wrap the production branch in MessageBusFactory where you
return the RabbitMqMessageBus (the else that calls
_serviceProvider.GetRequiredService<RabbitMqMessageBus> and logs "Creating
RabbitMQ MessageBus...") in a try-catch; on exception catch (Exception ex) use
_logger.LogError(ex, "Failed to initialize RabbitMQ MessageBus for environment
{Environment}", _environment.EnvironmentName) and then either rethrow the
original exception or throw a new InvalidOperationException with contextual text
so callers see a clear message rather than an unlogged failure.

In `@src/Shared/Messaging/MessagingExtensions.cs`:
- Line 68: Remove the commented-out registration line for
TopicStrategySelector—specifically the commented
"services.AddSingleton<ITopicStrategySelector, TopicStrategySelector>(); //
Removed ASB Topic Strategy"—to clean up MessagingExtensions.cs; if you need to
retain traceability, replace the comment with a brief reference to the PR or
issue ID (not the commented code) instead.

In `@src/Web/.eslintrc.json`:
- Around line 14-19: The current depConstraints block (depConstraints,
sourceTag, onlyDependOnLibsWithTags) uses wildcards and should be tightened:
define meaningful tags for projects (e.g., scope:shared, scope:feature, type:ui)
in your project configuration and replace the wildcard rule with explicit
entries mapping each sourceTag to the allowed onlyDependOnLibsWithTags for that
tag; update the depConstraints array to list one rule per sourceTag (e.g.,
"sourceTag":"scope:feature" ->
"onlyDependOnLibsWithTags":["scope:shared","type:ui"]) so the linter enforces
the intended architectural boundaries.

In `@src/Web/.nx/workspace-data/eslint-2654242865465226088.hash`:
- Around line 1-93: The repo is currently tracking Nx workspace cache files;
update the .gitignore to exclude the Nx workspace-data artifacts by adding an
entry for .nx/workspace-data/ and patterns for generated files such as
eslint-*.hash, jest-*.hash, project-graph.json, file-map.json (and any other
files under .nx/workspace-data/) so these local cache/state files are not
committed; remove any already-tracked files from git index after updating
.gitignore (git rm --cached ...) to stop future commits from including them.

In `@src/Web/libs/api-client/tsconfig.spec.json`:
- Around line 5-6: The tsconfig uses the legacy moduleResolution value "node10";
update the "moduleResolution" setting in tsconfig.spec.json from "node10" to the
canonical "node" (i.e., change the value for the "moduleResolution" property) so
the compiler uses the modern, clearer Node resolution mode.

In `@src/Web/libs/ui/tsconfig.json`:
- Around line 11-15: The tsconfig.json references array currently only points to
tsconfig.lib.json; if this UI library will contain unit tests, add a reference
entry for tsconfig.spec.json so the project references include test config as
well—update the "references" array in src/Web/libs/ui/tsconfig.json to include
both the existing "./tsconfig.lib.json" and a "./tsconfig.spec.json" reference
(matching the pattern used in other libs like api-client).

In `@src/Web/nx.json`:
- Around line 31-42: The production inputs list in nx.json omits
jest.config.cts, so changes to test config files like
libs/api-client/jest.config.cts can incorrectly affect the "production" inputs;
update the "production" inputs (the array containing "default" and the exclusion
patterns) to also exclude jest.config.cts by adding an exclusion for
"!{projectRoot}/jest.config.cts" (or broaden the existing pattern that matches
"jest.config.[jt]s" to include .cts) so test-config-only edits no longer
invalidate production builds.

In `@src/Web/tsconfig.base.json`:
- Around line 20-30: The tsconfig "paths" currently defines aliases "ui",
"auth", and "api-client" but lacks an alias for the backend-aligned
Shared.Contracts module; add a new path entry "Shared.Contracts" (e.g., pointing
to "libs/shared/contracts/src/index.ts") alongside the existing aliases in the
"paths" object so frontend code can import shared enums/constants using that
alias, and update any import statements to use "Shared.Contracts" where
applicable (e.g., imports in components or services that should use backend
shared types).
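A sketch of the `paths` block with the new alias added (the targets for the existing aliases are assumptions based on the libs layout in this PR):

```json
{
  "compilerOptions": {
    "paths": {
      "ui": ["libs/ui/src/index.ts"],
      "auth": ["libs/auth/src/index.ts"],
      "api-client": ["libs/api-client/src/index.ts"],
      "Shared.Contracts": ["libs/shared/contracts/src/index.ts"]
    }
  }
}
```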

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 39e14f05-5db9-4506-9d94-29fe9d9bd53c

📥 Commits

Reviewing files that changed from the base of the PR and between 728370e and 69c8b6e.

⛔ Files ignored due to path filters (2)
  • src/Web/MeAjudaAi.Web.Customer/package-lock.json is excluded by !**/package-lock.json
  • src/Web/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (137)
  • .coderabbit.yaml
  • .gitignore
  • Directory.Packages.props
  • api/README.md
  • automation/README.md
  • config/.editorconfig
  • config/.gitleaks.toml
  • config/.lycheeignore
  • config/.yamllint.yml
  • config/README.md
  • config/coverage.runsettings
  • config/coverlet.json
  • config/lychee.toml
  • docs/messaging.md
  • docs/technical-debt.md
  • infrastructure/main.bicep
  • infrastructure/servicebus.bicep
  • prompts/design-react-project.md
  • src/Aspire/MeAjudaAi.AppHost/MeAjudaAi.AppHost.csproj
  • src/Aspire/MeAjudaAi.AppHost/Program.cs
  • src/Aspire/MeAjudaAi.AppHost/packages.lock.json
  • src/Aspire/MeAjudaAi.ServiceDefaults/packages.lock.json
  • src/Bootstrapper/MeAjudaAi.ApiService/packages.lock.json
  • src/Modules/Documents/API/packages.lock.json
  • src/Modules/Documents/Application/packages.lock.json
  • src/Modules/Documents/Domain/packages.lock.json
  • src/Modules/Documents/Infrastructure/packages.lock.json
  • src/Modules/Documents/Tests/packages.lock.json
  • src/Modules/Locations/API/packages.lock.json
  • src/Modules/Locations/Application/packages.lock.json
  • src/Modules/Locations/Domain/packages.lock.json
  • src/Modules/Locations/Infrastructure/packages.lock.json
  • src/Modules/Locations/Tests/packages.lock.json
  • src/Modules/Providers/API/packages.lock.json
  • src/Modules/Providers/Application/packages.lock.json
  • src/Modules/Providers/Domain/packages.lock.json
  • src/Modules/Providers/Infrastructure/packages.lock.json
  • src/Modules/Providers/Tests/packages.lock.json
  • src/Modules/SearchProviders/API/packages.lock.json
  • src/Modules/SearchProviders/Application/packages.lock.json
  • src/Modules/SearchProviders/Domain/packages.lock.json
  • src/Modules/SearchProviders/Infrastructure/packages.lock.json
  • src/Modules/SearchProviders/Tests/packages.lock.json
  • src/Modules/ServiceCatalogs/API/packages.lock.json
  • src/Modules/ServiceCatalogs/Application/packages.lock.json
  • src/Modules/ServiceCatalogs/Domain/packages.lock.json
  • src/Modules/ServiceCatalogs/Infrastructure/packages.lock.json
  • src/Modules/ServiceCatalogs/Tests/packages.lock.json
  • src/Modules/Users/API/packages.lock.json
  • src/Modules/Users/Application/packages.lock.json
  • src/Modules/Users/Domain/packages.lock.json
  • src/Modules/Users/Infrastructure/packages.lock.json
  • src/Modules/Users/Tests/packages.lock.json
  • src/Shared/Extensions/ServiceCollectionExtensions.cs
  • src/Shared/MeAjudaAi.Shared.csproj
  • src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs
  • src/Shared/Messaging/DeadLetter/ServiceBusDeadLetterService.cs
  • src/Shared/Messaging/Factories/DeadLetterServiceFactory.cs
  • src/Shared/Messaging/Factories/IDeadLetterServiceFactory.cs
  • src/Shared/Messaging/Factories/MessageBusFactory.cs
  • src/Shared/Messaging/MessagingExtensions.cs
  • src/Shared/Messaging/NoOp/NoOpServiceBusTopicManager.cs
  • src/Shared/Messaging/Options/DeadLetterOptions.cs
  • src/Shared/Messaging/Options/ServiceBusOptions.cs
  • src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs
  • src/Shared/Messaging/ServiceBus/IServiceBusTopicManager.cs
  • src/Shared/Messaging/ServiceBus/ServiceBusMessageBus.cs
  • src/Shared/Messaging/ServiceBus/ServiceBusTopicManager.cs
  • src/Shared/Messaging/Services/ServiceBusInitializationService.cs
  • src/Shared/Messaging/Strategy/ITopicStrategySelector.cs
  • src/Shared/Messaging/Strategy/TopicStrategySelector.cs
  • src/Shared/packages.lock.json
  • src/Web/.eslintignore
  • src/Web/.eslintrc.json
  • src/Web/.nx/workspace-data/d/server-process.json
  • src/Web/.nx/workspace-data/eslint-2654242865465226088.hash
  • src/Web/.nx/workspace-data/file-map.json
  • src/Web/.nx/workspace-data/jest-7930610538513362720.hash
  • src/Web/.nx/workspace-data/lockfile-dependencies.hash
  • src/Web/.nx/workspace-data/lockfile-nodes.hash
  • src/Web/.nx/workspace-data/nx_files.nxt
  • src/Web/.nx/workspace-data/parsed-lock-file.dependencies.json
  • src/Web/.nx/workspace-data/parsed-lock-file.nodes.json
  • src/Web/.nx/workspace-data/playwright-5186013267177752385.hash
  • src/Web/.nx/workspace-data/project-graph.json
  • src/Web/.nx/workspace-data/source-maps.json
  • src/Web/.prettierignore
  • src/Web/.prettierrc
  • src/Web/MeAjudaAi.Web.Customer/project.json
  • src/Web/MeAjudaAi.Web.Customer/tsconfig.json
  • src/Web/config/tailwind.extension.json
  • src/Web/jest.config.ts
  • src/Web/jest.preset.js
  • src/Web/libs/api-client/.eslintrc.json
  • src/Web/libs/api-client/README.md
  • src/Web/libs/api-client/jest.config.cts
  • src/Web/libs/api-client/package.json
  • src/Web/libs/api-client/project.json
  • src/Web/libs/api-client/src/index.ts
  • src/Web/libs/api-client/src/lib/api-client.spec.ts
  • src/Web/libs/api-client/src/lib/api-client.ts
  • src/Web/libs/api-client/tsconfig.json
  • src/Web/libs/api-client/tsconfig.lib.json
  • src/Web/libs/api-client/tsconfig.spec.json
  • src/Web/libs/auth/.babelrc
  • src/Web/libs/auth/README.md
  • src/Web/libs/auth/project.json
  • src/Web/libs/auth/src/index.ts
  • src/Web/libs/auth/src/lib/auth.module.css
  • src/Web/libs/auth/src/lib/auth.tsx
  • src/Web/libs/auth/tsconfig.json
  • src/Web/libs/auth/tsconfig.lib.json
  • src/Web/libs/ui/.babelrc
  • src/Web/libs/ui/README.md
  • src/Web/libs/ui/project.json
  • src/Web/libs/ui/src/index.ts
  • src/Web/libs/ui/src/lib/ui.module.css
  • src/Web/libs/ui/src/lib/ui.tsx
  • src/Web/libs/ui/tsconfig.json
  • src/Web/libs/ui/tsconfig.lib.json
  • src/Web/nx.json
  • src/Web/package.json
  • src/Web/tsconfig.base.json
  • tests/MeAjudaAi.ApiService.Tests/packages.lock.json
  • tests/MeAjudaAi.Architecture.Tests/packages.lock.json
  • tests/MeAjudaAi.E2E.Tests/packages.lock.json
  • tests/MeAjudaAi.Integration.Tests/Messaging/DeadLetter/DeadLetterIntegrationTests.cs
  • tests/MeAjudaAi.Integration.Tests/Messaging/MessageBusSelectionTests.cs
  • tests/MeAjudaAi.Integration.Tests/packages.lock.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/MessagingMockExtensions.cs
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockServiceBusMessageBus.cs
  • tests/MeAjudaAi.Shared.Tests/Unit/Messaging/MessagingExtensionsTests.cs
  • tests/MeAjudaAi.Shared.Tests/Unit/Messaging/ServiceBus/ServiceBusMessageBusTests.cs
  • tests/MeAjudaAi.Shared.Tests/Unit/Messaging/Strategy/TopicStrategySelectorTests.cs
  • tests/MeAjudaAi.Shared.Tests/packages.lock.json
  • tests/MeAjudaAi.Web.Admin.Tests/packages.lock.json
💤 Files with no reviewable changes (72)
  • automation/README.md
  • src/Shared/Messaging/DeadLetter/ServiceBusDeadLetterService.cs
  • config/.gitleaks.toml
  • src/Modules/SearchProviders/Infrastructure/packages.lock.json
  • infrastructure/main.bicep
  • src/Shared/Messaging/Strategy/ITopicStrategySelector.cs
  • src/Modules/Providers/Infrastructure/packages.lock.json
  • src/Shared/Messaging/Factories/IDeadLetterServiceFactory.cs
  • config/lychee.toml
  • src/Modules/Locations/API/packages.lock.json
  • src/Modules/Documents/Application/packages.lock.json
  • src/Modules/Documents/Tests/packages.lock.json
  • config/coverlet.json
  • tests/MeAjudaAi.Architecture.Tests/packages.lock.json
  • config/README.md
  • src/Modules/SearchProviders/Tests/packages.lock.json
  • src/Modules/Providers/Tests/packages.lock.json
  • tests/MeAjudaAi.Shared.Tests/Unit/Messaging/ServiceBus/ServiceBusMessageBusTests.cs
  • docs/technical-debt.md
  • src/Modules/Users/Application/packages.lock.json
  • src/Bootstrapper/MeAjudaAi.ApiService/packages.lock.json
  • src/Shared/Messaging/Options/DeadLetterOptions.cs
  • src/Modules/ServiceCatalogs/Infrastructure/packages.lock.json
  • src/Shared/Messaging/Strategy/TopicStrategySelector.cs
  • config/.editorconfig
  • tests/MeAjudaAi.Integration.Tests/Messaging/MessageBusSelectionTests.cs
  • api/README.md
  • src/Modules/ServiceCatalogs/Tests/packages.lock.json
  • src/Shared/Messaging/ServiceBus/ServiceBusMessageBus.cs
  • src/Modules/ServiceCatalogs/API/packages.lock.json
  • src/Modules/Users/Domain/packages.lock.json
  • Directory.Packages.props
  • src/Shared/Messaging/Services/ServiceBusInitializationService.cs
  • src/Modules/Locations/Application/packages.lock.json
  • tests/MeAjudaAi.Integration.Tests/packages.lock.json
  • tests/MeAjudaAi.Shared.Tests/Unit/Messaging/Strategy/TopicStrategySelectorTests.cs
  • src/Modules/Documents/API/packages.lock.json
  • config/.yamllint.yml
  • src/Shared/Messaging/Factories/DeadLetterServiceFactory.cs
  • src/Modules/SearchProviders/Application/packages.lock.json
  • tests/MeAjudaAi.E2E.Tests/packages.lock.json
  • src/Shared/Messaging/ServiceBus/IServiceBusTopicManager.cs
  • src/Aspire/MeAjudaAi.ServiceDefaults/packages.lock.json
  • src/Modules/Documents/Domain/packages.lock.json
  • src/Modules/ServiceCatalogs/Domain/packages.lock.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockServiceBusMessageBus.cs
  • src/Shared/packages.lock.json
  • src/Modules/Providers/Domain/packages.lock.json
  • config/coverage.runsettings
  • config/.lycheeignore
  • tests/MeAjudaAi.ApiService.Tests/packages.lock.json
  • src/Shared/Messaging/Options/ServiceBusOptions.cs
  • src/Modules/SearchProviders/API/packages.lock.json
  • infrastructure/servicebus.bicep
  • src/Modules/Locations/Tests/packages.lock.json
  • src/Modules/SearchProviders/Domain/packages.lock.json
  • src/Shared/MeAjudaAi.Shared.csproj
  • src/Modules/Locations/Domain/packages.lock.json
  • src/Modules/Documents/Infrastructure/packages.lock.json
  • src/Modules/Providers/API/packages.lock.json
  • src/Modules/Users/API/packages.lock.json
  • src/Shared/Messaging/NoOp/NoOpServiceBusTopicManager.cs
  • src/Modules/ServiceCatalogs/Application/packages.lock.json
  • src/Shared/Messaging/ServiceBus/ServiceBusTopicManager.cs
  • src/Modules/Users/Infrastructure/packages.lock.json
  • src/Modules/Providers/Application/packages.lock.json
  • src/Aspire/MeAjudaAi.AppHost/packages.lock.json
  • src/Aspire/MeAjudaAi.AppHost/MeAjudaAi.AppHost.csproj
  • tests/MeAjudaAi.Shared.Tests/packages.lock.json
  • src/Modules/Locations/Infrastructure/packages.lock.json
  • tests/MeAjudaAi.Integration.Tests/Messaging/DeadLetter/DeadLetterIntegrationTests.cs
  • src/Modules/Users/Tests/packages.lock.json

Filipe Frigini added 2 commits March 9, 2026 21:31
…ions, shared messaging infrastructure, Aspire host, and related test utilities.
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
docs/roadmap-current.md (1)

422-437: ⚠️ Potential issue | 🟡 Minor

Avoid marking this block as complete while its deliverables are still pending.

Since the checklist just below still leaves apps/provider-web, apps/admin-portal, and the libs/ extraction open, this ✅ CONCLUÍDA is contradictory. Either close out those deliverables, or change the status to something like "partially complete" and describe what was deferred to the next sprint.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/roadmap-current.md` around lines 422 - 437, The document's "Status: ✅
CONCLUÍDA" conflicts with remaining checklist items; update the Status field or
complete the listed deliverables: either change "Status" to "Parcialmente
concluída" (or similar) and add a short "Remaining" subsection listing pending
items (apps/provider-web, apps/admin-portal, libs/ui, libs/auth,
libs/api-client), or mark each of those scaffolding/extraction items as done and
remove them from the checklist so "Status" can legitimately be "CONCLUÍDA";
ensure you update the specific symbols "Status", the app placeholders
"apps/provider-web" and "apps/admin-portal", and the library entries "libs/ui",
"libs/auth", "libs/api-client" accordingly.
🧹 Nitpick comments (7)
tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/TestInfrastructureExtensions.cs (1)

120-130: Consider adding null validation or documenting the expected behavior.

The null-forgiving operators (message! and @event!) suppress compiler warnings, but they can mask problems if null messages are accidentally passed during tests.

♻️ Optional suggestion for explicit handling
 public Task SendAsync<TMessage>(TMessage message, string? queueName = null, CancellationToken cancellationToken = default)
 {
-    _recordedMessages.Add(message!);
+    ArgumentNullException.ThrowIfNull(message);
+    _recordedMessages.Add(message);
     return Task.CompletedTask;
 }

 public Task PublishAsync<TMessage>(TMessage @event, string? topicName = null, CancellationToken cancellationToken = default)
 {
-    _recordedMessages.Add(@event!);
+    ArgumentNullException.ThrowIfNull(@event);
+    _recordedMessages.Add(@event);
     return Task.CompletedTask;
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/TestInfrastructureExtensions.cs`
around lines 120 - 130, The SendAsync and PublishAsync methods currently use
null-forgiving operators (message! / `@event`!) which hide nulls; update
SendAsync<TMessage> and PublishAsync<TMessage> to explicitly validate input
(e.g., if (message == null) throw new ArgumentNullException(nameof(message));
and similarly for `@event`) and then add the non-null value to _recordedMessages
(removing the null-forgiving operator), or alternatively add XML/doc comments
stating that null is allowed; prefer the explicit ArgumentNullException approach
to fail fast in tests.
tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs (1)

33-35: XML documentation outdated after the rename.

The XML comments still say "mensagens publicadas", but the RecordedMessages property and the ClearRecordedMessages method now cover both sent (Send) and published (Publish) messages.

📝 Suggested documentation update
     /// <summary>
-    /// Lista de mensagens publicadas durante os testes
+    /// Lista de mensagens registradas durante os testes (Send e Publish)
     /// </summary>
     public IReadOnlyList<(object message, string? destination, EMessageType type)> RecordedMessages
         => _recordedMessages.AsReadOnly();

     /// <summary>
-    /// Limpa a lista de mensagens publicadas
+    /// Limpa a lista de mensagens registradas
     /// </summary>
     public void ClearRecordedMessages()

Also applies to: 39-41

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`
around lines 33 - 35, The XML documentation of the MockRabbitMqMessageBus class is
outdated: update the summaries of the RecordedMessages property and the
ClearRecordedMessages method to reflect that they now cover both sent (Send) and
published (Publish) messages; locate the XML tags above RecordedMessages and
ClearRecordedMessages in the MockRabbitMqMessageBus class and replace the text
that says "mensagens publicadas" with a clearer phrase such as "mensagens
enviadas e publicadas" (or equivalent) that covers both flows.
src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs (2)

35-40: Redundant registration of RabbitMqDeadLetterService.

Line 36 registers RabbitMqDeadLetterService as a concrete service, and line 40 registers the same implementation again for the IDeadLetterService interface. The registration on line 36 is only needed if the concrete type is injected directly somewhere; otherwise it is redundant.

♻️ Consider removing the redundant registrations
-        // Registrar implementações específicas
-        services.AddScoped<RabbitMqDeadLetterService>();
-        services.AddScoped<NoOpDeadLetterService>();
-
         // Registrar serviço principal baseado no ambiente (RabbitMQ por padrão)
         services.AddScoped<IDeadLetterService, RabbitMqDeadLetterService>();

If NoOpDeadLetterService is needed for some specific scenario, keep only that registration.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 35 -
40, The RabbitMqDeadLetterService is being registered twice: once as a concrete
type (services.AddScoped<RabbitMqDeadLetterService>()) and again as the
implementation for IDeadLetterService (services.AddScoped<IDeadLetterService,
RabbitMqDeadLetterService>()); remove the redundant concrete registration of
RabbitMqDeadLetterService (the AddScoped<RabbitMqDeadLetterService>() line)
unless some code directly injects the concrete type—if direct injection is
required keep the concrete registration, otherwise keep only the interface
registration and leave NoOpDeadLetterService registered as needed.

161-165: The environment condition may not cover Staging.

The IsDevelopment() || IsProduction() condition does not include environments such as Staging. If the intent is to log in every non-Testing environment, consider inverting the logic or adding Staging explicitly.

♻️ Suggestion to cover more environments
-        if (environment.IsDevelopment() || environment.IsProduction())
+        if (!environment.IsEnvironment("Testing"))
         {
             // Para RabbitMQ, a infraestrutura é criada dinamicamente quando necessário
             logger.LogInformation("Dead Letter infrastructure for RabbitMQ will be created dynamically");
         }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 161 -
165, The environment check using environment.IsDevelopment() ||
environment.IsProduction() in DeadLetterExtensions.cs misses other non-testing
environments (e.g., Staging); update the condition to cover all non-testing
environments by either using !environment.IsEnvironment("Testing") or adding
environment.IsStaging() (or environment.IsEnvironment("Staging")) to the OR
list, and keep the logger.LogInformation("Dead Letter infrastructure for
RabbitMQ will be created dynamically") call intact so it executes in the broader
set of environments.
src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs (1)

42-47: Redundant infrastructure creation inside the loop.

With the change to use _options.DefaultQueueName for all event types, the loop calls CreateExchangeAsync and CreateQueueAsync with the same values on every iteration. Only BindQueueToExchangeAsync receives a different routing key (eventType.Name).

Consider optimizing by hoisting the exchange/queue creation out of the loop:

♻️ Suggested refactoring
             var eventTypes = await _eventRegistry.GetAllEventTypesAsync();
+            var queueName = _options.DefaultQueueName;
+            var exchangeName = $"{queueName}.exchange";
+
+            // Create the exchange and queue once
+            await CreateExchangeAsync(exchangeName, ExchangeType.Topic);
+            await CreateQueueAsync(queueName);
+
+            // Create bindings for each event type
             foreach (var eventType in eventTypes)
             {
-                var queueName = _options.DefaultQueueName;
-                var exchangeName = $"{queueName}.exchange";
-
-                await CreateExchangeAsync(exchangeName, ExchangeType.Topic);
-                await CreateQueueAsync(queueName);
                 await BindQueueToExchangeAsync(queueName, exchangeName, eventType.Name);

                 _logger.LogDebug("Infrastructure created for event type {EventType}: exchange={Exchange}, queue={Queue}",
                     eventType.Name, exchangeName, queueName);
             }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs` around lines
42 - 47, The loop currently recreates the same exchange and queue for every
event type because it uses _options.DefaultQueueName; move the one-time calls to
CreateExchangeAsync(exchangeName, ExchangeType.Topic) and
CreateQueueAsync(queueName) outside the loop and leave only
BindQueueToExchangeAsync(queueName, exchangeName, eventType.Name) inside the
loop; update the surrounding method in RabbitMqInfrastructureManager so
exchangeName = $"{queueName}.exchange" and queueName = _options.DefaultQueueName
are computed once, then iterate event types to call BindQueueToExchangeAsync for
each eventType.Name.
src/Web/package.json (2)

5-5: The main field is unnecessary for a workspace root.

The "main": "index.js" field is vestigial for a monorepo root, since this package.json will never be imported as a module. It can be removed to keep the configuration clean.

♻️ Suggested fix
   "version": "1.0.0",
   "description": "Uma plataforma abrangente de serviços construída com .NET Aspire, projetada para conectar prestadores de serviços com clientes usando arquitetura monólito modular.",
-  "main": "index.js",
   "private": true,
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/package.json` at line 5, Remove the unnecessary "main": "index.js"
field from the workspace-root package.json (the package.json that contains the
"main" field), since this package.json is not consumed as a module; open the
file and delete the entire "main" property, keeping the rest of the JSON intact.

69-91: CI/CD should build and test the Nx workspace.

As noted in src/Aspire/MeAjudaAi.AppHost/Program.cs (lines 197-220), the AppHost references src/Web/MeAjudaAi.Web.Customer via AddJavaScriptApp(). If the CI/CD workflows never run npm install and nx build in this workspace, the application can fail at runtime due to missing dependencies or undetected build errors.

Consider adding a CI workflow job that:

  1. Execute npm ci em src/Web/
  2. Execute npx nx affected --target=lint
  3. Execute npx nx affected --target=test
  4. Execute npx nx affected --target=build
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/package.json` around lines 69 - 91, CI workflow is missing steps to
install and build the Web Nx workspace referenced by the AppHost; ensure the CI
job runs npm ci in the src/Web directory and runs Nx lint/test/build for
affected projects so the AddJavaScriptApp() reference to MeAjudaAi.Web.Customer
in Program.cs does not fail at runtime—add a CI job that: (1) cd into src/Web
and run npm ci, (2) run npx nx affected --target=lint, (3) run npx nx affected
--target=test, and (4) run npx nx affected --target=build (adjust working
directory and caching as needed).
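A hedged sketch of such a job as a GitHub Actions fragment (job name, Node version, and cache settings are assumptions; adapt to the repository's existing workflow conventions):

```yaml
web-workspace:
  runs-on: ubuntu-latest
  defaults:
    run:
      working-directory: src/Web
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-node@v4
      with:
        node-version: 20
        cache: npm
        cache-dependency-path: src/Web/package-lock.json
    - run: npm ci
    - run: npx nx affected --target=lint
    - run: npx nx affected --target=test
    - run: npx nx affected --target=build
```

Note that `nx affected` also needs base/head refs in CI (commonly set via the `nrwl/nx-set-shas` action); without them, `npx nx run-many --all` is a simpler fallback.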
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/roadmap-current.md`:
- Around line 420-423: The roadmap entry under the heading "### ✅ Sprint 8B.2 -
NX Scaffolding & Initial Migration" references the wrong branch name; replace
the branch string `feature/sprint-8b2-technical-excellence` with the actual PR
branch `feature/sprint-8b2-monorepo-cleanup` so the documentation matches the
real PR source and preserves traceability.

In `@src/Web/package.json`:
- Around line 82-90: Update the dependencies in package.json to avoid shipping a
beta and to bring zod up-to-date: replace "next-auth": "^5.0.0-beta.25" with a
stable release (e.g., "next-auth": "4.24.13" or the desired stable version) and
update "zod": "^3.25.76" to the target v4 release (e.g., "zod": "4.3.6"); after
changing package.json run your package manager install and run tests/build to
catch any breaking changes, and if zod v4 introduces API changes, update usages
of zod in the codebase accordingly (search for imports/usages of zod to apply
necessary migration edits).
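As a package.json fragment, using the example versions from the prompt above (not verified as the latest releases; check the registry before pinning):

```json
{
  "dependencies": {
    "next-auth": "4.24.13",
    "zod": "4.3.6"
  }
}
```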

---

Outside diff comments:
In `@docs/roadmap-current.md`:
- Around line 422-437: The document's "Status: ✅ CONCLUÍDA" conflicts with
remaining checklist items; update the Status field or complete the listed
deliverables: either change "Status" to "Parcialmente concluída" (or similar)
and add a short "Remaining" subsection listing pending items (apps/provider-web,
apps/admin-portal, libs/ui, libs/auth, libs/api-client), or mark each of those
scaffolding/extraction items as done and remove them from the checklist so
"Status" can legitimately be "CONCLUÍDA"; ensure you update the specific symbols
"Status", the app placeholders "apps/provider-web" and "apps/admin-portal", and
the library entries "libs/ui", "libs/auth", "libs/api-client" accordingly.

---

Nitpick comments:
In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs`:
- Around line 35-40: The RabbitMqDeadLetterService is being registered twice:
once as a concrete type (services.AddScoped<RabbitMqDeadLetterService>()) and
again as the implementation for IDeadLetterService
(services.AddScoped<IDeadLetterService, RabbitMqDeadLetterService>()); remove
the redundant concrete registration of RabbitMqDeadLetterService (the
AddScoped<RabbitMqDeadLetterService>() line) unless some code directly injects
the concrete type—if direct injection is required keep the concrete
registration, otherwise keep only the interface registration and leave
NoOpDeadLetterService registered as needed.
- Around line 161-165: The environment check using environment.IsDevelopment()
|| environment.IsProduction() in DeadLetterExtensions.cs misses other
non-testing environments (e.g., Staging); update the condition to cover all
non-testing environments by either using !environment.IsEnvironment("Testing")
or adding environment.IsStaging() (or environment.IsEnvironment("Staging")) to
the OR list, and keep the logger.LogInformation("Dead Letter infrastructure for
RabbitMQ will be created dynamically") call intact so it executes in the broader
set of environments.

In `@src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs`:
- Around line 42-47: The loop currently recreates the same exchange and queue
for every event type because it uses _options.DefaultQueueName; move the
one-time calls to CreateExchangeAsync(exchangeName, ExchangeType.Topic) and
CreateQueueAsync(queueName) outside the loop and leave only
BindQueueToExchangeAsync(queueName, exchangeName, eventType.Name) inside the
loop; update the surrounding method in RabbitMqInfrastructureManager so
exchangeName = $"{queueName}.exchange" and queueName = _options.DefaultQueueName
are computed once, then iterate event types to call BindQueueToExchangeAsync for
each eventType.Name.

In `@src/Web/package.json`:
- Line 5: Remove the unnecessary "main": "index.js" field from the workspace-root
package.json (the package.json that contains the "main" field), since it is not
consumed as a module; open the file and delete the entire "main" property,
keeping the rest of the JSON intact.
- Around line 69-91: CI workflow is missing steps to install and build the Web
Nx workspace referenced by the AppHost; ensure the CI job runs npm ci in the
src/Web directory and runs Nx lint/test/build for affected projects so the
AddJavaScriptApp() reference to MeAjudaAi.Web.Customer in Program.cs does not
fail at runtime—add a CI job that: (1) cd into src/Web and run npm ci, (2) run
npx nx affected --target=lint, (3) run npx nx affected --target=test, and (4)
run npx nx affected --target=build (adjust working directory and caching as
needed).

In
`@tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/TestInfrastructureExtensions.cs`:
- Around line 120-130: The SendAsync and PublishAsync methods currently use
null-forgiving operators (message! / `@event`!) which hide nulls; update
SendAsync<TMessage> and PublishAsync<TMessage> to explicitly validate input
(e.g., if (message == null) throw new ArgumentNullException(nameof(message));
and similarly for `@event`) and then add the non-null value to _recordedMessages
(removing the null-forgiving operator), or alternatively add XML/doc comments
stating that null is allowed; prefer the explicit ArgumentNullException approach
to fail fast in tests.

In
`@tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`:
- Around line 33-35: The XML documentation of the MockRabbitMqMessageBus class is
outdated: update the summaries of the RecordedMessages property and the
ClearRecordedMessages method to reflect that they now cover both sent (Send) and
published (Publish) messages; locate the XML tags above RecordedMessages and
ClearRecordedMessages in the MockRabbitMqMessageBus class and replace the text
that says "mensagens publicadas" with a clearer phrase such as "mensagens
enviadas e publicadas" (or equivalent) that covers both flows.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 2f9cc5bb-000f-45f7-8296-9c2fccc1bc76

📥 Commits

Reviewing files that changed from the base of the PR and between 69c8b6e and 79c31bf.

📒 Files selected for processing (18)
  • .coderabbit.yaml
  • .gitignore
  • docs/roadmap-current.md
  • src/Aspire/MeAjudaAi.AppHost/Program.cs
  • src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs
  • src/Shared/Messaging/Factories/MessageBusFactory.cs
  • src/Shared/Messaging/MessagingExtensions.cs
  • src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs
  • src/Web/.eslintrc.json
  • src/Web/libs/api-client/package.json
  • src/Web/libs/api-client/tsconfig.spec.json
  • src/Web/libs/auth/src/lib/auth.module.css
  • src/Web/libs/ui/tsconfig.json
  • src/Web/nx.json
  • src/Web/package.json
  • src/Web/tsconfig.base.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/TestInfrastructureExtensions.cs
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs
🚧 Files skipped from review as they are similar to previous changes (8)
  • src/Web/libs/api-client/tsconfig.spec.json
  • src/Web/libs/api-client/package.json
  • src/Web/nx.json
  • src/Web/libs/ui/tsconfig.json
  • src/Web/.eslintrc.json
  • src/Aspire/MeAjudaAi.AppHost/Program.cs
  • src/Web/libs/auth/src/lib/auth.module.css
  • src/Web/tsconfig.base.json

…frastructure management, testing utilities, and a PR validation workflow.
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 4

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs (1)

47-64: ⚠️ Potential issue | 🟠 Major

Don't record the operation before the mock call has succeeded.

Today, at Line 52 and Line 62, the message/event is added to RecordedMessages before the call to _mockMessageBus. That breaks the SimulateSendFailure/SimulatePublishFailure scenarios: the test sees the operation as recorded even when it fails. Record only after the await, or keep failed attempts in a separate collection.

💡 Suggested fix
-public Task SendAsync<TMessage>(TMessage message, string? queueName = null, CancellationToken cancellationToken = default)
+public async Task SendAsync<TMessage>(TMessage message, string? queueName = null, CancellationToken cancellationToken = default)
 {
     _logger.LogInformation("Mock RabbitMQ: Sending message of type {MessageType} to queue {QueueName}",
         typeof(TMessage).Name, queueName);
 
-    _recordedMessages.Add((message!, queueName, EMessageType.Send));
-
-    return _mockMessageBus.Object.SendAsync(message, queueName, cancellationToken);
+    await _mockMessageBus.Object.SendAsync(message, queueName, cancellationToken);
+    _recordedMessages.Add((message!, queueName, EMessageType.Send));
 }
 
-public Task PublishAsync<TMessage>(TMessage @event, string? topicName = null, CancellationToken cancellationToken = default)
+public async Task PublishAsync<TMessage>(TMessage @event, string? topicName = null, CancellationToken cancellationToken = default)
 {
     _logger.LogInformation("Mock RabbitMQ: Publishing event of type {EventType} to topic {TopicName}",
         typeof(TMessage).Name, topicName);
 
-    _recordedMessages.Add((@event!, topicName, EMessageType.Publish));
-
-    return _mockMessageBus.Object.PublishAsync(@event, topicName, cancellationToken);
+    await _mockMessageBus.Object.PublishAsync(@event, topicName, cancellationToken);
+    _recordedMessages.Add((@event!, topicName, EMessageType.Publish));
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`
around lines 47 - 64, Move the recording of messages so it only happens after
the mocked call completes successfully: in SendAsync and PublishAsync, await the
call to _mockMessageBus.Object.SendAsync/PublishAsync first (or try/catch and
only add to _recordedMessages on success), so failures simulated by
SimulateSendFailure/SimulatePublishFailure do not leave entries in
_recordedMessages; reference the SendAsync and PublishAsync methods, the
_mockMessageBus.Object.SendAsync/PublishAsync invocations, and the
_recordedMessages collection when making the change.
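The record-after-success ordering can be illustrated outside C#. Below is a minimal TypeScript analogue with hypothetical names (the real mock awaits the bus call; this sketch throws synchronously for brevity), showing why a simulated failure must not leave a recorded entry:

```typescript
// Hypothetical stand-in for MockRabbitMqMessageBus: the transport call runs
// first, and the message is recorded only if that call did not throw.
type Recorded = { message: unknown; destination?: string };

class MockMessageBus {
  readonly recorded: Recorded[] = [];
  failNextSend = false; // analogue of SimulateSendFailure

  send(message: unknown, destination?: string): void {
    // Invoke the underlying (mockable) transport first...
    if (this.failNextSend) {
      this.failNextSend = false;
      throw new Error("simulated send failure");
    }
    // ...and record only after it succeeded.
    this.recorded.push({ message, destination });
  }
}

const bus = new MockMessageBus();
bus.send("ok", "queue-a");

bus.failNextSend = true;
let failed = false;
try {
  bus.send("boom", "queue-a");
} catch {
  failed = true;
}
// The failed attempt is not present in `recorded`.
```

Recording before the transport call would leave "boom" in `recorded` even though the send failed, which is exactly what breaks the SimulateSendFailure assertions.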
♻️ Duplicate comments (1)
docs/roadmap-current.md (1)

420-422: ⚠️ Potential issue | 🟡 Minor

Branch reference fixed, but the "CONCLUÍDA" status may be premature.

The branch update from feature/sprint-8b2-technical-excellence to feature/sprint-8b2-monorepo-cleanup is correct and aligned with the current PR.

However, marking the sprint as "✅ CONCLUÍDA" on line 422 when:

  • the PR is still open
  • the current date (10 Mar 2026) falls inside the sprint window (5-18 Mar 2026)

may be premature. Consider setting the status to "✅ CONCLUÍDA" only after the PR is merged, or once all the work is confirmed to be finished.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `docs/roadmap-current.md` around lines 420 - 422, update the status of the
section "### ✅ Sprint 8B.2 - NX Scaffolding & Initial Migration (5 - 18 Mar
2026)" by removing the "✅ CONCLUÍDA" marker and replacing it with a suitable
interim status (e.g. "Em progresso" or "Em revisão") while the PR for the
feature/sprint-8b2-monorepo-cleanup branch is still open; keep the corrected
branch reference (feature/sprint-8b2-monorepo-cleanup) and note that the final
update to "✅ CONCLUÍDA" should happen only after the PR is merged or the work
is explicitly confirmed as finished.
🧹 Nitpick comments (4)
docs/roadmap-current.md (1)

418-422: Clarify the relationship between "Parte 2: Nx Monorepo Setup" and "Sprint 8B.2".

Line 418 marks "🛠️ Parte 2: Nx Monorepo Setup" as "🔄 EM PROGRESSO", while line 422 marks "Sprint 8B.2 - NX Scaffolding & Initial Migration" as "✅ CONCLUÍDA".

Both sections appear to cover Nx Monorepo work, which can create confusion about:

  • What is the difference in scope between them?
  • Why is one in progress while the other is complete?
  • Is "Parte 2" a container for Sprint 8B.2, or are they parallel initiatives?

Consider adding a note, or restructuring the document, to make the hierarchy and scope of each section clear.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `docs/roadmap-current.md` around lines 418 - 422, the section "🛠️ Parte 2:
Nx Monorepo Setup" appears as "🔄 EM PROGRESSO" while "### ✅ Sprint 8B.2 - NX
Scaffolding & Initial Migration" is marked "✅ CONCLUÍDA", which is ambiguous;
update the document to clarify the relationship and scope of these two blocks
(for example, state whether "Parte 2: Nx Monorepo Setup" is a container spanning
multiple sprints or a separate initiative) by adding an explanatory sentence
right below the "🛠️ Parte 2: Nx Monorepo Setup" heading that defines its
purpose/scope and explicitly states that "Sprint 8B.2 - NX Scaffolding & Initial
Migration" is a completed sub-activity of the same effort (or move/rename a
section to reflect the hierarchy), referencing those exact headings to locate
the edit.
src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs (1)

40-44: Duplicate creation of the default queue.

The DefaultQueueName queue is created twice: first on line 30 (await CreateQueueAsync(_options.DefaultQueueName)) and again on line 44 (await CreateQueueAsync(queueName)), because queueName is assigned the same value as _options.DefaultQueueName on line 40.

Consider removing the redundant creation on line 44, since the default queue was already created earlier in the method.

♻️ Proposed fix to remove the duplication
            // Cria exchanges e bindings para tipos de eventos
            var eventTypes = await _eventRegistry.GetAllEventTypesAsync();
            var queueName = _options.DefaultQueueName;
            var exchangeName = $"{queueName}.exchange";

            await CreateExchangeAsync(exchangeName, ExchangeType.Topic);
-           await CreateQueueAsync(queueName);

            foreach (var eventType in eventTypes)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs` around lines
40 - 44, Remove the redundant creation of the default queue: the call to
CreateQueueAsync(queueName) is duplicating the earlier
CreateQueueAsync(_options.DefaultQueueName). In the method where queueName is
set from _options.DefaultQueueName and CreateExchangeAsync(exchangeName,
ExchangeType.Topic) is called, delete or skip the second
CreateQueueAsync(queueName) invocation (leave the initial
CreateQueueAsync(_options.DefaultQueueName) intact) so CreateQueueAsync is only
called once for the default queue.
src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs (2)

54-70: AddDeadLetterQueue and AddRabbitMqDeadLetterQueue are functionally equivalent.

After the removal of the factory pattern, both methods register IDeadLetterService as RabbitMqDeadLetterService. The only difference is that AddDeadLetterQueue also registers NoOpDeadLetterService (line 36).

Consider consolidating these methods, or clearly documenting when to use each one, to avoid confusion in the public API.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 54 -
70, The two extension methods AddDeadLetterQueue and AddRabbitMqDeadLetterQueue
in DeadLetterExtensions both register IDeadLetterService as
RabbitMqDeadLetterService (while AddDeadLetterQueue also registers
NoOpDeadLetterService), causing API duplication and confusion; consolidate by
making one method the single implementation (e.g., keep AddDeadLetterQueue as
the public entry) and have the other delegate to it (or remove the duplicate),
ensure only the intended registrations remain (register NoOpDeadLetterService
only where required), and update the XML summary/comments for the retained
method to document the intended behavior; reference AddDeadLetterQueue,
AddRabbitMqDeadLetterQueue, IDeadLetterService, RabbitMqDeadLetterService,
NoOpDeadLetterService, and DeadLetterExtensions when making the change.

114-153: Remove or consolidate the unused EnsureDeadLetterInfrastructureAsync method.

The EnsureDeadLetterInfrastructureAsync method (line 114) is not invoked anywhere in the codebase, while LogDeadLetterInfrastructureInfo is called during startup in MessagingExtensions.cs. Both methods share an identical try-catch structure with the same error messages and differ only in the logger's generic type (IHostEnvironment vs IDeadLetterService). Remove the unused method, or consolidate both into a single reusable implementation.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 114 -
153, Remove the unused EnsureDeadLetterInfrastructureAsync method and
consolidate its logic into the existing LogDeadLetterInfrastructureInfo
implementation: delete EnsureDeadLetterInfrastructureAsync and ensure any future
callers use LogDeadLetterInfrastructureInfo (which already calls
LogRabbitMqInfrastructure<IDeadLetterService> and logs via
ILogger<IDeadLetterService>); alternatively, if you prefer a single reusable
helper, extract the common try/catch+scope pattern into a private method (e.g.,
a helper wrapping LogRabbitMqInfrastructure<T> and accepting a generic T for the
logger) and have both LogDeadLetterInfrastructureInfo (and any removed Ensure...
callers) invoke that helper so you don’t duplicate the try/catch and error
messages.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In .github/workflows/pr-validation.yml:
- Around line 103-105: Before the "📦 Install Frontend Dependencies" step
(working-directory: ./src/Web, run: npm ci), add a step that uses
actions/setup-node with a pinned version compatible with the workspace (for
example node-version: '16', or the project's version) and configure npm caching
pointed at the frontend lockfile (cache: npm plus the lockfile path under
src/Web, e.g. package-lock.json or yarn.lock inside ./src/Web); place this new
step immediately before the step that runs npm ci to guarantee a deterministic
environment and lockfile-based cache usage.
- Around line 107-117: The workflow's nx affected commands (the steps named "🔍
Lint Frontend Workspace", "🧪 Test Frontend Workspace", "🏗️ Build Frontend
Workspace" that run "npx nx affected --target=...") rely on defaultBase from
src/Web/nx.json and can miscompute affected projects for PRs not targeting
master; fix by either inserting the action "nrwl/nx-set-shas@4" before these
steps to set correct SHAs, or by appending explicit flags "--base=${{
github.base_ref }} --head=${{ github.head_ref }}" to each "npx nx affected"
invocation so the affected calculation uses the PR's base/head refs.
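Both workflow suggestions above can be combined into a single sketch; the Node version, action versions, and paths below are illustrative assumptions, not taken from the repository:

```yaml
- name: 🟢 Setup Node
  uses: actions/setup-node@v4
  with:
    node-version: '20'           # pin to the workspace's actual version
    cache: npm
    cache-dependency-path: src/Web/package-lock.json

- name: 📦 Install Frontend Dependencies
  working-directory: ./src/Web
  run: npm ci

- name: 🔍 Lint Frontend Workspace
  working-directory: ./src/Web
  run: npx nx affected --target=lint --base=${{ github.base_ref }} --head=${{ github.head_ref }}
```

With nrwl/nx-set-shas@4 inserted before these steps instead, the explicit --base/--head flags can be dropped, since that action exports the computed SHAs for nx affected.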

In `src/Web/package.json`:
- Around line 69-89: Two package manifests declare differing versions of shared
frontend deps (e.g., next-auth, zod) causing inconsistent install trees; pick a
single source of truth and make the other match: identify mismatched packages
(next-auth, zod, etc.), update the secondary package.json to use the exact same
versions as the primary Web manifest (or move the shared deps into a single
central manifest / workspace), regenerate the lockfile (npm/yarn/pnpm) and
commit it so CI and local installs use the same dependency tree, and verify by
running an install and a build.
- Around line 53-61: The workspace package.json declares "jest" at major 30
while "ts-jest" is at major 29; this mismatch is incompatible with the
transformer configured in src/Web/libs/api-client/jest.config.cts and can break
bootstrap/tests. Align the majors (for example, raise "ts-jest" to ^30.x to
match "jest", or lower "jest" to ^29.x), then verify/adjust the transformer in
src/Web/libs/api-client/jest.config.cts to ensure compatibility with the chosen
version; alternatively, remove/replace ts-jest with another transformer
consistent with the jest version in use.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 08e23b21-5ed8-4587-8e63-88dabf31e67d

📥 Commits

Reviewing files that changed from the base of the PR and between 79c31bf and f9977bc.

📒 Files selected for processing (7)
  • .github/workflows/pr-validation.yml
  • docs/roadmap-current.md
  • src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs
  • src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs
  • src/Web/package.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Extensions/TestInfrastructureExtensions.cs
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs

Filipe Frigini added 3 commits March 9, 2026 22:09
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 6

🧹 Nitpick comments (4)
tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs (2)

147-152: The GetPublishedMessages name could be clearer.

The GetPublishedMessages method returns both sent (Send) and published (Publish) messages, but its name suggests only "Published". Although the XML docs clarify this ("send + publish"), consider renaming it to GetAllRecordedMessages or GetRecordedMessages<T> to stay consistent with the new naming.

♻️ Suggested rename
 /// <summary>
-/// Obtém todas as mensagens de um tipo específico (send + publish)
+/// Obtém todas as mensagens gravadas de um tipo específico (send + publish)
 /// </summary>
-public IEnumerable<T> GetPublishedMessages<T>() where T : class
+public IEnumerable<T> GetAllRecordedMessages<T>() where T : class
 {
     return _recordedMessages
         .Where(x => x.message is T)
         .Select(x => (T)x.message);
 }

The same applies to WasMessagePublished → WasMessageRecorded.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`
around lines 147 - 152, Rename the misleading methods to reflect that they
return both sent and published messages: change GetPublishedMessages<T>() to
GetRecordedMessages<T>() (or GetAllRecordedMessages<T>) and change
WasMessagePublished(...) to WasMessageRecorded(...); update the XML doc comments
for these methods to match the new names and intent, update all call sites and
unit tests that reference GetPublishedMessages or WasMessagePublished, and
ensure generic constraint and return semantics (filtering _recordedMessages by
x.message is T and casting) remain unchanged in the renamed method.

22-22: Consider thread-safety if parallel tests share this instance.

The _recordedMessages field uses List<T>, which is not thread-safe. If multiple parallel tests share the same mock instance, race conditions can occur in the Add operations (lines 53, 62) and in Clear (line 44).

🔧 Suggestion: use ConcurrentBag or synchronization

If there are parallel test scenarios:

-private readonly List<(object message, string? destination, EMessageType type)> _recordedMessages;
+private readonly System.Collections.Concurrent.ConcurrentBag<(object message, string? destination, EMessageType type)> _recordedMessages;

Or add a lock around the write operations. If tests always create a new instance per fixture, ignore this suggestion.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs`
at line 22, The _recordedMessages field is a non-thread-safe List<(object
message, string? destination, EMessageType type)> and can race when tests run in
parallel; replace it with a thread-safe collection (e.g., ConcurrentBag<(object
message, string? destination, EMessageType type)>) or add a private lock object
(e.g., _recordedMessagesLock) and wrap all writes/reads (the locations using
_recordedMessages.Add and _recordedMessages.Clear and any enumerations) in
lock(_recordedMessagesLock) to ensure thread-safety while preserving existing
method signatures and behavior in MockRabbitMqMessageBus.
docs/roadmap-current.md (2)

444-446: Same remark about premature completion marking.

The deliverables are marked [x] complete but, following the same reasoning as the note on line 424, it would be more consistent to wait for the PR merge before marking these items as finished.

Note: the content of the deliverables matches the PR scope as described in the AI summary, so the documentation is accurate content-wise.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `docs/roadmap-current.md` around lines 444 - 446, the checklist items for
`apps/customer-web`, `apps/provider-web`, `apps/admin-portal` and the extracted
libraries `libs/ui`, `libs/auth`, `libs/api-client` are prematurely marked
`[x]`; change those checked boxes back to unchecked `[ ]` (or add a “pending
merge” note) so they remain uncompleted until the PR is merged, and optionally
include the PR link/ID next to each item for traceability.

435-439: Consider waiting for the PR merge before marking items as complete.

The "Technical Excellence Pack" items are marked [x] complete, but the PR is still under review. Although the note on line 424 acknowledges that finalization should wait for the merge, checking these boxes before the merge can create a small inconsistency in the documentation.

Suggestion: consider keeping these items as [ ] until the PR merge is confirmed, to align with the finalization policy described on line 424.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `docs/roadmap-current.md` around lines 435 - 439, update the "Technical
Excellence Pack" list in the section containing the items "- [x] [**TD**]
**Keycloak Automation**: `setup-keycloak-clients.ps1`...", "- [x] [**TD**]
**Analyzer Cleanup**: Fix MudBlazor/SonarLint warnings...", "- [x] [**TD**]
**Refactor Extensions**: Extract `BusinessMetricsMiddlewareExtensions`", "- [x]
[**TD**] **Polly Logging**: Migrate resilience logging to ILogger (Issue #113)",
and "- [x] [**TD**] **Standardization**: Record syntax alignment in `Contracts`"
so those checkboxes are unchecked ([ ]) until the referenced PR is merged; keep
the existing note that instructs waiting for the merge, and add a brief TODO
indicating that the checkboxes will be switched to [x] once the merge is
confirmed.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs`:
- Around line 84-100: The LogDeadLetterInfrastructureInfo method was renamed so
that it only logs information, but the error messages still say "ensure Dead
Letter Queue infrastructure"; fix the messages in the catch block: update the
logger.LogError call and the InvalidOperationException to say "Failed to log
Dead Letter Queue infrastructure info" (or equivalent wording that mentions
"log" instead of "ensure"), keeping the same types and context; locate and
update the occurrences in the LogDeadLetterInfrastructureInfo method and any
text produced by LogRabbitMqInfrastructure<IDeadLetterService> to avoid
confusion during the bootstrap invoked by MessagingExtensions.
- Around line 36-42: Remove the redundant NoOpDeadLetterService registration and
the confusing "Enabled" semantics: instead of always registering
IDeadLetterService as RabbitMqDeadLetterService, make the registration
conditional (register RabbitMqDeadLetterService only when the feature is
enabled) and drop the registration/use of NoOpDeadLetterService and any
DeadLetterOptions.Enabled property/flag that is no longer part of the contract;
also adjust the startup validation that resolves IDeadLetterService (referenced
in RabbitMqDeadLetterService/validation) so it does not force resolution when
the feature is disabled.

In `src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs`:
- Around line 40-47: queueName is pinned to _options.DefaultQueueName before the
binds are performed, so BindQueueToExchangeAsync(exchangeName, ...,
eventType.Name) always targets the default queue and leaves the queues created
from DomainQueues orphaned; fix it by iterating the actual domain queues
(DomainQueues) instead of always using _options.DefaultQueueName when calling
BindQueueToExchangeAsync, ensuring each domain queue is bound to the exchange
created by CreateExchangeAsync with the correct routing key (for example
eventType.Name); alternatively, if the intent is a single queue, also remove the
DomainQueues provisioning from this flow to keep the logic consistent.
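The intended loop can be sketched in TypeScript with stand-in names (the real code uses BindQueueToExchangeAsync and the options' DomainQueues): every domain queue, not just the default one, is bound to the exchange with the event type name as routing key.

```typescript
// Stand-in data: in the real manager these come from RabbitMqOptions and
// the event registry; the names here are illustrative only.
const domainQueues = ["users.queue", "orders.queue"];
const eventTypes = ["UserCreated", "OrderPlaced"];
const exchangeName = "meajudaai.exchange";

type Binding = { exchange: string; queue: string; routingKey: string };
const bindings: Binding[] = [];

// Stand-in for BindQueueToExchangeAsync: records the binding instead of
// declaring it against a broker.
function bindQueueToExchange(exchange: string, queue: string, routingKey: string): void {
  bindings.push({ exchange, queue, routingKey });
}

// Bind every domain queue for every event type, instead of pinning the
// queue name to the default queue.
for (const queue of domainQueues) {
  for (const eventType of eventTypes) {
    bindQueueToExchange(exchangeName, queue, eventType);
  }
}
```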

In `src/Web/MeAjudaAi.Web.Customer/package.json`:
- Line 33: Update Zod usages to be compatible with Zod v4: replace any
z.nativeEnum(...) calls in lib/schemas/auth.ts with z.enum(...) (use the same
enum values to construct the new z.enum), replace instance method validators
like .email() and .uuid() with the top-level validators z.string().email() ->
z.email() and z.string().uuid() -> z.uuid() (or use z.email()/z.uuid() where
appropriate), and review any uses of .default(...) across schemas to ensure the
new short-circuiting behavior for undefined is acceptable and adjust defaults or
explicit preprocessing if needed; keep the package.json pinned to 4.3.6 as
currently specified until migration is verified.
- Line 25: next-auth was downgraded to v4, but the code still uses v5 features:
replace the auth() usage in the middleware (export default auth((req) => { … }))
with a v4-compatible API, for example withAuth(...) or exporting the default
middleware as described in the v4 docs; also fix the auth.ts file where
{ handlers, signIn, signOut, auth } is destructured from NextAuth() (line 53)
and adapt it to the v4 pattern: call NextAuth(...) without that destructuring
and remove/rewrite any v5-specific helper usage in the route/credentials
handlers.

In `src/Web/package.json`:
- Around line 52-53: The "@typescript-eslint/eslint-plugin" and
"@typescript-eslint/parser" dependencies in package.json are pinned to ^7.16.0,
which is incompatible with TypeScript 5.9.2; update both entries to a v8+
release that supports TS 5.x (e.g., set "@typescript-eslint/eslint-plugin" and
"@typescript-eslint/parser" to "^9.0.0" or any current v8/v9 compatible
version), then run install and lint to verify; make sure both package names are
updated together so the parser and plugin versions stay in sync.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: f773c24d-e561-475e-900a-1aeba7cb386d

📥 Commits

Reviewing files that changed from the base of the PR and between f9977bc and e15b344.

⛔ Files ignored due to path filters (1)
  • src/Web/package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (7)
  • .github/workflows/pr-validation.yml
  • docs/roadmap-current.md
  • src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs
  • src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs
  • src/Web/MeAjudaAi.Web.Customer/package.json
  • src/Web/package.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs
🚧 Files skipped from review as they are similar to previous changes (1)
  • .github/workflows/pr-validation.yml

Filipe Frigini added 2 commits March 9, 2026 22:32
… and Credentials, alongside RabbitMQ messaging infrastructure with dead-letter support.
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

♻️ Duplicate comments (1)
src/Web/libs/api-client/package.json (1)

5-6: ⚠️ Potential issue | 🟡 Minor

The main and types paths in package.json do not match the actual build output.

project.json sets outputPath to dist/libs/api-client (relative to the workspace root in src/Web/), but main and types point to ./dist/index.js and ./dist/index.d.ts (relative to the package.json location).

This results in:

  • Build output at: src/Web/dist/libs/api-client/index.js
  • Current paths point to: src/Web/libs/api-client/dist/index.js (does not exist)

Since the library is private: true and the workspace uses symlinks (seen in package-lock.json), during development TypeScript resolves the package via path alias ("api-client": ["libs/api-client/src/index.ts"]) rather than the main/types fields. However, if the build output is consumed directly, the paths will be wrong.

♻️ Suggested fix
-  "main": "./dist/index.js",
-  "types": "./dist/index.d.ts",
+  "main": "../../dist/libs/api-client/index.js",
+  "types": "../../dist/libs/api-client/index.d.ts",
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/libs/api-client/package.json` around lines 5 - 6, the "main" and
"types" fields in package.json point to ./dist/..., which does not match the
output configured in project.json (outputPath -> dist/libs/api-client); update
the "main" and "types" values to the correct relative path of the built
artifact (from src/Web/libs/api-client/package.json):
../../dist/libs/api-client/index.js and ../../dist/libs/api-client/index.d.ts,
ensuring the "main" and "types" fields reflect the outputPath defined in
project.json.
🧹 Nitpick comments (2)
src/Web/MeAjudaAi.Web.Customer/auth.ts (1)

193-197: Imports should be at the top of the file.

The imports of GetServerSidePropsContext, NextApiRequest and NextApiResponse are placed after the authOptions definition. By convention and for readability, all imports should be grouped at the beginning of the file.

♻️ Move the imports to the top of the file
 import { type NextAuthOptions } from "next-auth"
 import Keycloak from "next-auth/providers/keycloak"
 import Credentials from "next-auth/providers/credentials"
 import { JWT } from "next-auth/jwt"
 import { decodeJwt } from "jose"
 import { getServerSession } from "next-auth/next"
+import {
+    GetServerSidePropsContext,
+    NextApiRequest,
+    NextApiResponse,
+} from "next"

And remove lines 193-197.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts` around lines 193 - 197, move the
import declarations to the top of the file: remove the lines where
GetServerSidePropsContext, NextApiRequest and NextApiResponse are imported
after the authOptions definition and place those imports with the rest of the
imports at the beginning of the file; keep the exact names
(GetServerSidePropsContext, NextApiRequest, NextApiResponse) and preserve the
usage of authOptions wherever it is referenced after the import block.
src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs (1)

116-124: Log the effective infrastructure values, not just success.

Today the helper only emits a generic message; at bootstrap that does little to help diagnose an incorrect exchange/queue/TTL. It is worth logging at least the effective DLQ configuration.

💡 Suggested tweak
+using Microsoft.Extensions.Options;
+
 private static void LogRabbitMqInfrastructure<TLogger>(IServiceProvider services)
 {
     var environment = services.GetRequiredService<IHostEnvironment>();
     var logger = services.GetRequiredService<ILogger<TLogger>>();
+    var options = services.GetRequiredService<IOptions<DeadLetterOptions>>().Value;

     if (!environment.IsEnvironment("Testing"))
     {
-        // Para RabbitMQ, a infraestrutura é criada dinamicamente quando necessário
-        logger.LogInformation("Dead Letter infrastructure info for RabbitMQ logged successfully");
+        // Para RabbitMQ, a infraestrutura é criada dinamicamente quando necessário
+        logger.LogInformation(
+            "Dead Letter Queue RabbitMQ configuration. Exchange: {Exchange}, TtlHours: {TtlHours}, DetailedLogging: {DetailedLogging}",
+            options.RabbitMq.DeadLetterExchange,
+            options.DeadLetterTtlHours,
+            options.EnableDetailedLogging);
     }
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs` around lines 116 -
124, In LogRabbitMqInfrastructure<TLogger>, replace the generic success message
with a detailed log of the effective RabbitMQ DLQ settings: resolve the RabbitMQ
options/config (e.g., via services.GetRequiredService<IOptions<RabbitMqOptions>>
or the concrete settings/service used to configure the broker), extract and log
the actual exchange names, queue names, routing keys/bindings,
dead-letter-exchange/queue names, TTLs, max-retries/backoff and any
durability/auto-delete flags, and emit them via logger.LogInformation with a
clear structured message and properties so operators can see the running
configuration; keep the environment guard (IHostEnvironment) and use the
existing ILogger<TLogger> instance.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@docs/roadmap-current.md`:
- Around line 435-440: Update the post-merge TODO note that currently references
"feature/sprint-8b2-core-tech" to the correct branch
"feature/sprint-8b2-monorepo-cleanup" in the roadmap entry containing the
checklist (the paragraph that starts "(TODO: Marcar os checkboxes acima...)" so
the checkbox-tracking text matches this PR's branch; ensure the branch string is
exactly replaced and keep the surrounding Portuguese TODO text intact.

In `@src/Shared/Messaging/Options/DeadLetterOptions.cs`:
- Around line 10-13: Remove the orphaned XML documentation summary that no
longer applies to any member in the DeadLetterOptions class: delete the lines
containing "/// <summary> Habilita o sistema de Dead Letter Queue </summary>" so
the class (DeadLetterOptions) no longer contains an unattached XML comment after
the Enabled property was removed; ensure no other XML doc tags remain orphaned
in DeadLetterOptions.cs.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts`:
- Around line 199-206: The exported auth(...) function currently calls
getServerSession(...) (NextAuth v4 signature) but is being used as a v5-style
middleware wrapper (auth((req)=>...)) in middleware.ts; fix by making the caller
and callee agree: either change middleware.ts to call auth() without a callback
and handle the NextRequest/NextResponse flow (use auth() to get session, then
NextResponse.redirect/next) or reimplement auth to detect a function argument
and act as a middleware wrapper (accepting a callback (req) => ...) that returns
a middleware-compatible handler; locate auth (the exported auth function) and
middleware.ts (where auth((req)=>...) is used) and apply one of these two fixes
so the signatures match.

In `@src/Web/MeAjudaAi.Web.Customer/lib/schemas/auth.ts`:
- Around line 83-89: The schema currently uses z.enum([...]) with numeric
EProviderType values (and casts to any), which is incorrect for numbers; replace
that call in the auth schema's "type" field with z.union of z.literal(...)
entries for each EProviderType value (e.g., z.literal(EProviderType.None),
z.literal(EProviderType.Individual), ...), remove the "as any" casts, and ensure
the resulting union matches the intended numeric literal types from the
EProviderType constant.
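The numeric-literal union the reviewer describes can be sketched without zod; the following stand-alone validator mirrors what z.union([z.literal(...)]) enforces for numeric values. The EProviderType members below are illustrative assumptions, not the project's actual enum:

```typescript
// Hypothetical numeric enum standing in for EProviderType (actual values may differ).
const EProviderType = { None: 0, Individual: 1, Company: 2 } as const;
type ProviderType = (typeof EProviderType)[keyof typeof EProviderType]; // 0 | 1 | 2

// Minimal stand-in for z.union([z.literal(EProviderType.None), ...]):
// accept only the exact numeric literals, with no `as any` casts.
function parseProviderType(input: unknown): ProviderType {
  const allowed = Object.values(EProviderType) as readonly number[];
  if (typeof input === "number" && allowed.includes(input)) {
    return input as ProviderType;
  }
  throw new Error(`Invalid provider type: ${String(input)}`);
}
```

With zod itself, the equivalent is z.union([z.literal(EProviderType.None), z.literal(EProviderType.Individual), ...]), which yields the same numeric literal type without casting; if EProviderType is a real TypeScript enum, z.nativeEnum may also fit.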

---

Duplicate comments:
In `@src/Web/libs/api-client/package.json`:
- Around line 5-6: The "main" and "types" fields in package.json point to
./dist/..., which does not match the output configured in project.json
(outputPath -> dist/libs/api-client); update the "main" and "types" values to
the correct relative path of the built artifact (from
src/Web/libs/api-client/package.json): ../../dist/libs/api-client/index.js and
../../dist/libs/api-client/index.d.ts, ensuring the "main" and "types" fields
reflect the outputPath defined in project.json.

---

Nitpick comments:
In `@src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs`:
- Around line 116-124: In LogRabbitMqInfrastructure<TLogger>, replace the
generic success message with a detailed log of the effective RabbitMQ DLQ
settings: resolve the RabbitMQ options/config (e.g., via
services.GetRequiredService<IOptions<RabbitMqOptions>> or the concrete
settings/service used to configure the broker), extract and log the actual
exchange names, queue names, routing keys/bindings, dead-letter-exchange/queue
names, TTLs, max-retries/backoff and any durability/auto-delete flags, and emit
them via logger.LogInformation with a clear structured message and properties so
operators can see the running configuration; keep the environment guard
(IHostEnvironment) and use the existing ILogger<TLogger> instance.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts`:
- Around line 193-197: Move the import declarations to the top of the file:
remove the lines where GetServerSidePropsContext, NextApiRequest and
NextApiResponse are imported after the authOptions definition and place those
imports with the rest of the imports at the beginning of the file; keep the
exact names (GetServerSidePropsContext, NextApiRequest, NextApiResponse) and
preserve the usage of authOptions wherever it is referenced after the import
block.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 5ac07e21-45e1-4889-96ef-fcd3fcec0ab3

📥 Commits

Reviewing files that changed from the base of the PR and between e15b344 and f3b2c75.

⛔ Files ignored due to path filters (15)
  • src/Web/.nx/cache/run.json is excluded by !**/.nx/**
  • src/Web/.nx/cache/terminalOutputs/5750941785946575987 is excluded by !**/.nx/**
  • src/Web/.nx/cache/terminalOutputs/6595496034224995433 is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/eslint-2654242865465226088.hash is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/file-map.json is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/jest-7930610538513362720.hash is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/lockfile-dependencies.hash is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/lockfile-nodes.hash is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/nx_files.nxt is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/parsed-lock-file.dependencies.json is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/parsed-lock-file.nodes.json is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/project-graph.json is excluded by !**/.nx/**
  • src/Web/.nx/workspace-data/project-graph.lock is excluded by !**/*.lock, !**/*.lock, !**/.nx/**
  • src/Web/.nx/workspace-data/source-maps.json is excluded by !**/.nx/**
  • src/Web/package-lock.json is excluded by !**/package-lock.json, !**/package-lock.json
📒 Files selected for processing (13)
  • docs/roadmap-current.md
  • src/Shared/Messaging/DeadLetter/DeadLetterExtensions.cs
  • src/Shared/Messaging/Options/DeadLetterOptions.cs
  • src/Shared/Messaging/RabbitMq/RabbitMqInfrastructureManager.cs
  • src/Web/MeAjudaAi.Web.Customer/app/api/auth/[...nextauth]/route.ts
  • src/Web/MeAjudaAi.Web.Customer/auth.ts
  • src/Web/MeAjudaAi.Web.Customer/lib/schemas/auth.ts
  • src/Web/MeAjudaAi.Web.Customer/project.json
  • src/Web/libs/api-client/jest.config.ts
  • src/Web/libs/api-client/package.json
  • src/Web/libs/api-client/tsconfig.spec.json
  • src/Web/package.json
  • tests/MeAjudaAi.Shared.Tests/TestInfrastructure/Mocks/Messaging/MockRabbitMqMessageBus.cs
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/Web/MeAjudaAi.Web.Customer/project.json
  • src/Web/libs/api-client/tsconfig.spec.json

@frigini
Owner Author

frigini commented Mar 10, 2026

@coderabbitai review

@coderabbitai
Contributor

coderabbitai bot commented Mar 10, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

…ity, frontend, and integration tests, alongside initial infrastructure and utility files.
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
.github/workflows/pr-validation.yml (2)

491-533: ⚠️ Potential issue | 🟡 Minor

Cobertura migration is correct, but inconsistent with webadmin.

The migration of the coverage files to the Cobertura format is correct. However, line 523 still includes webadmin in the debug loop, while line 763 (the pre-flight check) excludes it.

Since the webadmin bUnit tests run without coverage (as documented in the workflow), the module should be removed here as well to avoid false "NOT FOUND" warnings.

💡 Suggested tweak
          echo "🔍 DEBUG: Verifying coverage files for CodeCoverageSummary..."
-          for module in users providers documents servicecatalogs locations searchproviders shared apiservice webadmin; do
+          for module in users providers documents servicecatalogs locations searchproviders shared apiservice; do
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 491 - 533, The debug loop
currently enumerates webadmin ("for module in users providers documents
servicecatalogs locations searchproviders shared apiservice webadmin; do") but
webadmin is excluded in the pre-flight check and has no coverage; remove
webadmin from that module list (or add an explicit skip for module ==
"webadmin") so the existence check for
"coverage/${module}/${module}.cobertura.xml" no longer emits a false "NOT FOUND"
warning.

1001-1008: ⚠️ Potential issue | 🔴 Critical

Shell syntax error: orphaned fi and done.

Lines 1003-1004 contain duplicated fi and done statements that do not match any control structure. This will cause a syntax error and fail the step.

The correct structure ends at:

  • Line 1001: fi (closes the if from line 983)
  • Line 1002: done (closes the for from line 981)

Lines 1003-1004 are orphaned and must be removed.

🐛 Proposed fix
               echo ""
             fi
           done
-            fi
-          done

           echo "💡 For detailed coverage report, check the 'Code Coverage Summary' step above"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 1001 - 1008, Remove the two
orphaned shell closing tokens so the control structures balance: keep a single
`fi` and single `done` that close the earlier `if` and `for` and delete the
duplicated `fi` and `done` immediately before the echo lines (the orphaned `fi`
and `done` tokens shown in the diff); ensure the file ends with only the
required `fi`/`done` pair followed by the echo statements so the workflow step
no longer has syntax errors.
♻️ Duplicate comments (1)
.github/workflows/pr-validation.yml (1)

143-146: ⚠️ Potential issue | 🔴 Critical

Incorrect path in OPENAPI_SPEC_URL will still cause a failure.

The api-spec.json file is created at ../api/api-spec.json (line 139), which resolves to ./src/api/api-spec.json from working-directory: ./src/Web.

However, on line 145, OPENAPI_SPEC_URL is set to "../../api/api-spec.json", which resolves to ./api/api-spec.json (at the repository root), a path that does not exist.

The correct value would be "../api/api-spec.json".

🐛 Proposed fix
           # 3. Generate API client for Customer Web App
           # The generator reads OPENAPI_SPEC_URL from env
-          export OPENAPI_SPEC_URL="../../api/api-spec.json"
+          export OPENAPI_SPEC_URL="../api/api-spec.json"
           npm run generate:api --workspace=meajudaai.web.customer
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 143 - 146, the
OPENAPI_SPEC_URL value used before running npm run generate:api for the
meajudaai.web.customer workspace points to "../../api/api-spec.json"
(incorrect path); update the OPENAPI_SPEC_URL variable to
"../api/api-spec.json" in the block where it is exported so that it points to
the generated ../api/api-spec.json file (resolved from working-directory
./src/Web) before running the generate:api command.
🧹 Nitpick comments (2)
src/Web/MeAjudaAi.Web.Customer/auth.ts (1)

226-234: Redundant call to validateCriticalEnvOnStartup() on every auth() invocation

The auth() function validates the critical environment variables on every request, even though route.ts already performs this validation when the module loads. Since the validation is idempotent but adds unnecessary overhead, consider a flag to avoid re-execution.

💡 Refactoring suggestion
+let envValidated = false;
+
 export function auth(
     ...args:
         | [GetServerSidePropsContext["req"], GetServerSidePropsContext["res"]]
         | [NextApiRequest, NextApiResponse]
         | []
 ) {
-    validateCriticalEnvOnStartup();
+    if (!envValidated) {
+        validateCriticalEnvOnStartup();
+        envValidated = true;
+    }
     return getServerSession(...args, authOptions);
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts` around lines 226 - 234, The auth()
function calls validateCriticalEnvOnStartup() on every invocation causing
redundant work; add a module-level boolean flag (e.g., criticalEnvValidated) and
guard the call so validateCriticalEnvOnStartup() runs only once (or perform
validation at module initialization), then have auth(...) call
getServerSession(...args, authOptions) without re-running validation; update
references in this file to use the flag and ensure thread-safety/atomic set so
validateCriticalEnvOnStartup() is not invoked repeatedly.
.github/workflows/pr-validation.yml (1)

866-882: Optional cleanup: debug step still references OpenCover.

After the migration to Cobertura, the search for *.opencover.xml on line 873 will always come back empty. Consider updating it to search for *.cobertura.xml to stay consistent with the rest of the workflow.

♻️ Suggested tweak
           echo ""
-          echo "OpenCover XML files:"
-          find ./coverage -name "*.opencover.xml" -type f 2>/dev/null || echo "No .opencover.xml files found"
+          echo "Cobertura XML files:"
+          find ./coverage -name "*.cobertura.xml" -type f 2>/dev/null || echo "No .cobertura.xml files found"
           echo ""
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 866 - 882, The debug step
"List Coverage Files (Debug)" still searches for OpenCover files using the
pattern "*.opencover.xml"; update that find invocation to look for the Cobertura
file pattern (e.g., "*.cobertura.xml") or include both patterns if you want
backwards compatibility so the step reflects the migrated coverage format
consistently.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 20-31: The public Result<T>(bool isSuccess, T value, Error
error) constructor has a check that uses value != null, which fails for value
types (e.g. Result<int>.Failure) and contradicts the factories and the
MemberNotNullWhen attributes; fix the invariants by replacing the null
comparisons with EqualityComparer<T>.Default.Equals(value, default!) (or
Object.Equals(value, default(T))) to detect default values of T; add matching
validation in the private factories used by Success(T) and Failure(Error) and
in the implicit operator (line ~46) to prevent materializing invalid states
(e.g. Success(null!) or (Result<T>)(Error)null!), ensuring Failure only
accepts value == default(T) and Success requires value != default(T),
consistent with the attributes.

In `@src/Shared/Utilities/EnvironmentHelpers.cs`:
- Around line 16-39: The helper IsSecurityBypassEnvironment currently returns
true based solely on INTEGRATION_TESTS or envName and should be tightened:
change IsSecurityBypassEnvironment(IHostEnvironment? environment = null) so it
only permits bypass when the hosting environment reports development
(environment?.IsDevelopment() == true) and AND either INTEGRATION_TESTS=="true"
or envName equals one of the allowed values
("Testing","Development","Integration"); if environment is null, return false.
Update the logic in IsSecurityBypassEnvironment (and any early returns) to
require environment?.IsDevelopment() before honoring the integration flag or
envName to avoid accidental bypass in production.
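The tightened rule reads as a single predicate. This TypeScript sketch restates the C# logic described above; the interface and names are illustrative stand-ins, not the actual EnvironmentHelpers code:

```typescript
// Illustrative stand-in for IHostEnvironment; the real helper is C#.
interface HostEnvironment {
  isDevelopment(): boolean;
}

const ALLOWED_ENV_NAMES = ["Testing", "Development", "Integration"] as const;

// Bypass requires BOTH a development host environment AND an explicit
// test/integration signal; a null environment never allows bypass.
function isSecurityBypassEnvironment(
  environment: HostEnvironment | null,
  integrationTests: string | undefined, // e.g. process.env.INTEGRATION_TESTS
  envName: string | undefined,
): boolean {
  if (!environment || !environment.isDevelopment()) {
    return false; // never bypass outside development
  }
  return (
    integrationTests === "true" ||
    (envName !== undefined &&
      (ALLOWED_ENV_NAMES as readonly string[]).includes(envName))
  );
}
```

The key design point is the AND: the integration flag or environment name alone is not enough, so a mis-set variable in production cannot open the bypass.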

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts`:
- Around line 85-86: The current authOptions sets secret using
process.env.AUTH_SECRET || requireEnv("NEXTAUTH_SECRET"), but requireEnv can
return an empty string so this can yield a blank secret; change authOptions to
avoid falling back to a value that may be empty — use only
process.env.AUTH_SECRET if present, otherwise use process.env.NEXTAUTH_SECRET
(or undefined) and rely on validateCriticalEnvOnStartup to enforce presence;
replace the fallback to requireEnv("NEXTAUTH_SECRET") in the secret assignment
so authOptions.secret is undefined instead of "" when no secret is set
(references: authOptions, requireEnv, process.env.AUTH_SECRET, NEXTAUTH_SECRET,
validateCriticalEnvOnStartup).

---

Outside diff comments:
In @.github/workflows/pr-validation.yml:
- Around line 491-533: The debug loop currently enumerates webadmin ("for module
in users providers documents servicecatalogs locations searchproviders shared
apiservice webadmin; do") but webadmin is excluded in the pre-flight check and
has no coverage; remove webadmin from that module list (or add an explicit skip
for module == "webadmin") so the existence check for
"coverage/${module}/${module}.cobertura.xml" no longer emits a false "NOT FOUND"
warning.
- Around line 1001-1008: Remove the two orphaned shell closing tokens so the
control structures balance: keep a single `fi` and single `done` that close the
earlier `if` and `for` and delete the duplicated `fi` and `done` immediately
before the echo lines (the orphaned `fi` and `done` tokens shown in the diff);
ensure the file ends with only the required `fi`/`done` pair followed by the
echo statements so the workflow step no longer has syntax errors.

---

Duplicate comments:
In @.github/workflows/pr-validation.yml:
- Around line 143-146: The OPENAPI_SPEC_URL value used before running npm run
generate:api for the meajudaai.web.customer workspace points to
"../../api/api-spec.json" (incorrect path); update the OPENAPI_SPEC_URL
variable to "../api/api-spec.json" in the block where it is exported so that
it points to the generated ../api/api-spec.json file (resolved from
working-directory ./src/Web) before running the generate:api command.

---

Nitpick comments:
In @.github/workflows/pr-validation.yml:
- Around line 866-882: The debug step "List Coverage Files (Debug)" still
searches for OpenCover files using the pattern "*.opencover.xml"; update that
find invocation to look for the Cobertura file pattern (e.g., "*.cobertura.xml")
or include both patterns if you want backwards compatibility so the step
reflects the migrated coverage format consistently.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts`:
- Around line 226-234: The auth() function calls validateCriticalEnvOnStartup()
on every invocation causing redundant work; add a module-level boolean flag
(e.g., criticalEnvValidated) and guard the call so
validateCriticalEnvOnStartup() runs only once (or perform validation at module
initialization), then have auth(...) call getServerSession(...args, authOptions)
without re-running validation; update references in this file to use the flag
and ensure thread-safety/atomic set so validateCriticalEnvOnStartup() is not
invoked repeatedly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: de645788-e8df-42d8-ba67-01a19675d377

📥 Commits

Reviewing files that changed from the base of the PR and between c1c86dd and 8586ebb.

📒 Files selected for processing (10)
  • .github/workflows/pr-validation.yml
  • src/Contracts/Functional/Result.cs
  • src/Modules/Locations/Infrastructure/Extensions.cs
  • src/Modules/Providers/Infrastructure/Extensions.cs
  • src/Modules/SearchProviders/Infrastructure/Extensions.cs
  • src/Modules/ServiceCatalogs/Infrastructure/Extensions.cs
  • src/Modules/Users/Infrastructure/Extensions.cs
  • src/Shared/Database/DatabaseConstants.cs
  • src/Shared/Utilities/EnvironmentHelpers.cs
  • src/Web/MeAjudaAi.Web.Customer/auth.ts
🚧 Files skipped from review as they are similar to previous changes (4)
  • src/Modules/ServiceCatalogs/Infrastructure/Extensions.cs
  • src/Modules/Locations/Infrastructure/Extensions.cs
  • src/Modules/Users/Infrastructure/Extensions.cs
  • src/Modules/Providers/Infrastructure/Extensions.cs

…ional Result type, PR validation, and unit tests for the SearchProviders module.
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 3

♻️ Duplicate comments (2)
src/Shared/Utilities/EnvironmentHelpers.cs (1)

18-25: ⚠️ Potential issue | 🟡 Minor

Standardize the inline comments to Portuguese.

The comments on lines 18 and 24 are still in English, clashing with the rest of the file and the repository convention. Based on learnings: In all C# files, keep logging messages in English, but ensure comments and XML documentation (/// <summary>, /// <param>, etc.) are written in Portuguese across the codebase.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Utilities/EnvironmentHelpers.cs` around lines 18 - 25,
standardize the inline comments in this section to Portuguese: change the
comments above the environment?.IsDevelopment() and
Environment.GetEnvironmentVariable("INTEGRATION_TESTS") checks to Portuguese
phrases that keep the original meaning; also ensure that log messages remain
in English and that any XML documentation (/// <summary>, /// <param>, etc.)
in the same file is in Portuguese per the repository convention; locate the
spot inside the EnvironmentHelpers class/method where these two checks occur
and update only the comments (without changing the logic).
.github/workflows/pr-validation.yml (1)

133-146: ⚠️ Potential issue | 🔴 Critical

OPENAPI_SPEC_URL still does not resolve to the generated file.

With npm run ... --workspace=meajudaai.web.customer, the script runs in the package directory. In that context, ../api/api-spec.json points to src/Web/api/api-spec.json, but the swagger tofile call above writes to src/api/api-spec.json. This reopens the client-generation failure. Prefer an absolute path shared by both commands, or at least ../../api/api-spec.json.

💡 Suggested tweak
-          mkdir -p ../api
+          API_SPEC_PATH="$GITHUB_WORKSPACE/src/api/api-spec.json"
+          mkdir -p "$(dirname "$API_SPEC_PATH")"
@@
-          swagger tofile --output ../api/api-spec.json \
+          swagger tofile --output "$API_SPEC_PATH" \
             ../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/net10.0/MeAjudaAi.ApiService.dll \
             v1
@@
-          export OPENAPI_SPEC_URL="../api/api-spec.json"
+          export OPENAPI_SPEC_URL="$API_SPEC_PATH"
           npm run generate:api --workspace=meajudaai.web.customer
#!/bin/bash
set -euo pipefail

echo "== Script do workspace customer =="
jq '{name, scripts: {"generate:api": .scripts["generate:api"]}}' \
  src/Web/MeAjudaAi.Web.Customer/package.json

echo
echo "== Consumidores de OPENAPI_SPEC_URL =="
rg -n -C2 'OPENAPI_SPEC_URL|api-spec\.json|generate:api|openapi' \
  src/Web/MeAjudaAi.Web.Customer src/Web/libs/api-client

echo
echo "== Resolução dos caminhos =="
python - <<'PY'
from pathlib import Path
repo = Path.cwd()
step_cwd = repo / "src/Web"
workspace_cwd = repo / "src/Web/MeAjudaAi.Web.Customer"

print("swagger output       :", (step_cwd / "../api/api-spec.json").resolve())
print("env path atual       :", (workspace_cwd / "../api/api-spec.json").resolve())
print("env path sugerido    :", (workspace_cwd / "../../api/api-spec.json").resolve())
PY
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 133 - 146, The
OPENAPI_SPEC_URL used by the customer workspace is pointing to
"../api/api-spec.json" which resolves differently when npm runs in the
meajudaai.web.customer workspace; update the export of OPENAPI_SPEC_URL (the env
var consumed by the generate:api script) to a path that both commands resolve
the same (e.g., use an absolute path like
"$GITHUB_WORKSPACE/src/api/api-spec.json" or at minimum
"../../api/api-spec.json") so that the swagger tofile output location and the
generate:api invocation both reference the same file; modify the export
OPENAPI_SPEC_URL line accordingly in the workflow and keep the existing swagger
tofile and npm run generate:api calls unchanged.
🧹 Nitpick comments (3)
src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs (1)

402-419: Consider covering or removing the Success(null) path in the production code.

Changing the mock from Success(null) to Failure(Error.NotFound(...)) is valid and tests the path where the API returns an explicit error (lines 145-150 of SearchProvidersModuleApi.cs). However, the null-check still exists in production (lines 153-157):

var providerData = providerDataResult.Value;
if (providerData == null)
{
    logger.LogWarning("Provider {ProviderId} not found in Providers module, cannot index", providerId);
    return Result.Failure($"Provider {providerId} not found");
}

Since no test covers this path in the SearchProviders module, you have two options:

  1. If the IProvidersModuleApi.GetProviderForIndexingAsync contract was updated to always return Failure(NotFound) instead of Success(null), remove the null-check from the production code to eliminate dead code.
  2. Otherwise, add a test covering the Success(null) scenario.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs`
around lines 402 - 419, The production code in
SearchProvidersModuleApi.IndexProviderAsync still checks for providerData ==
null after calling IProvidersModuleApi.GetProviderForIndexingAsync
(providerDataResult.Value and the logger.LogWarning/return Result.Failure path)
but no test covers the Success(null) case; either remove that null-check if the
contract of GetProviderForIndexingAsync was changed to always return
Failure(NotFound) (delete the providerData == null branch and its
LogWarning/return) or add a unit test to SearchProvidersModuleApiTests that sets
the _providersApiMock to return Result.Success<ModuleProviderIndexingDto?>(null)
and asserts the method logs the warning and returns the expected failure and
does not call _repositoryMock.AddAsync, thereby covering the Success(null) path.
src/Contracts/Functional/Result.cs (1)

8-14: The MemberNotNullWhen attributes are semantically imprecise for Result<T?> with nullable T.

The attributes declare that Value is non-null when IsSuccess is true. However, for scenarios such as Result<ModuleUserDto?>, a null value is semantically valid on success (e.g., "lookup succeeded, but the entity was not found"). This can produce analyzer warnings or false assumptions at call sites that use nullable types as the generic parameter.

Consider documenting this limitation or creating an overload/separate type for scenarios where null is a valid success value (e.g., MaybeResult<T>).

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 8 - 14, The use of
[MemberNotNullWhen] on Result<T> (members IsSuccess, IsFailure, Value, Error) is
incorrect for scenarios where T is nullable (e.g., Result<ModuleUserDto?>)
because these attributes force the analysis to assume Value is non-null on
success; fix this by removing the attributes or constraining them to a
non-nullable generic (for example, applying them only when T : notnull), or
introduce a separate type/overload (for example MaybeResult<T> or
ResultNullable<T>) that represents success with a possibly-null Value and does
not carry these attributes, and add clear documentation to the Result<T> class
explaining the limitation for nullable types; change the attributes on the
IsSuccess/IsFailure properties and the generic Result<T> definition according to
the chosen option.
.github/workflows/pr-validation.yml (1)

814-815: Restrict the integration/E2E globs to *.cobertura.xml.

The modules have already been standardized on Cobertura, and tests/parallel.runsettings also pins that format. Leaving coverage/integration/**/*.xml and coverage/e2e/**/*.xml broader than necessary increases the risk of ReportGenerator trying to read XML that is not coverage output if other artifacts appear in those directories.

💡 Suggested fix
-          reports: "coverage/**/*.cobertura.xml;coverage/integration/**/*.xml;coverage/e2e/**/*.xml"
+          reports: "coverage/**/*.cobertura.xml;coverage/integration/**/*.cobertura.xml;coverage/e2e/**/*.cobertura.xml"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 814 - 815, Update the value
of the reports key to restrict the integration/E2E globs to Cobertura files:
replace "coverage/integration/**/*.xml" with
"coverage/integration/**/*.cobertura.xml" and "coverage/e2e/**/*.xml" with
"coverage/e2e/**/*.cobertura.xml" (the line containing reports:
"coverage/**/*.cobertura.xml;coverage/integration/**/*.xml;coverage/e2e/**/*.xml");
that way ReportGenerator will only process *.cobertura.xml files and avoid
reading unrelated XML.
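To see what the tighter pattern buys, here is a rough suffix-based approximation of the glob semantics (a hypothetical helper, not ReportGenerator's actual matcher): stray XML under the coverage directories stops qualifying once the `.cobertura.xml` suffix is required:

```typescript
// Hypothetical approximation: treat the glob as "under coverage/ with a suffix".
function matchesReportGlob(path: string, coberturaOnly: boolean): boolean {
  if (!path.startsWith("coverage/")) return false;
  // The loose form accepts any .xml; the tight form requires .cobertura.xml.
  return path.endsWith(coberturaOnly ? ".cobertura.xml" : ".xml");
}
```

With `coberturaOnly` set, a file like `coverage/e2e/run-metadata.xml` is filtered out while real Cobertura reports still match.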
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 66-67: The implicit operator public static implicit operator
Result<T>(T value) => value is null ? ... should use the same validation as the
constructor/Success (EqualityComparer<T>.Default.Equals(value, default!)) to
handle value types correctly; change the "value is null" check to
EqualityComparer<T>.Default.Equals(value, default!) and keep the same
exception/message used by the constructor (reference: the implicit operator
Result<T>(T value) and the Success/constructor logic that uses
EqualityComparer<T>.Default.Equals).
- Around line 22-29: The current validation in the Result<T> constructor/factory
that calls EqualityComparer<T>.Default.Equals(value, default!) and throws
ArgumentNullException prevents legitimate Result<T?>.Success(null) uses; remove
that check so success results are allowed to carry default/null values (i.e.,
delete the EqualityComparer<T>.Default.Equals(...) block and its
ArgumentNullException) while keeping the existing error-null check (the throw in
the isSuccess branch that validates error). This restores compatibility for
Result<T?>.Success(null) without changing external APIs; alternatively, if you
prefer a safer API, add a new factory SuccessOrEmpty(T? value) for nullable
semantics, but the minimal fix is to remove the equality/default-value
validation in the Result<T> success-path.

In `@src/Shared/Utilities/EnvironmentHelpers.cs`:
- Around line 14-43: The IsSecurityBypassEnvironment(IHostEnvironment?
environment = null) helper currently returns early because of the
environment?.IsDevelopment() guard, making calls without an environment always
false and leaving the DOTNET_ENVIRONMENT/ASPNETCORE_ENVIRONMENT and
Testing/Integration branches dead; fix by choosing one policy: either (A) make
IHostEnvironment required (remove the optional signature and delete the
DOTNET_ENVIRONMENT/ASPNETCORE_ENVIRONMENT fallback and Testing/Integration
branches) so the helper strictly relies on environment.IsDevelopment(), or (B)
keep the optional parameter and remove the initial IsDevelopment() early-return
so the method falls back to environment.EnvironmentName / DOTNET_ENVIRONMENT /
ASPNETCORE_ENVIRONMENT and correctly returns true for
"Testing"/"Integration"/"Development"; update callers (e.g.,
Modules/Documents/Infrastructure/Extensions) accordingly to match the chosen
contract.

---

Duplicate comments:
In @.github/workflows/pr-validation.yml:
- Around line 133-146: The OPENAPI_SPEC_URL used by the customer workspace is
pointing to "../api/api-spec.json" which resolves differently when npm runs in
the meajudaai.web.customer workspace; update the export of OPENAPI_SPEC_URL (the
env var consumed by the generate:api script) to a path that both commands
resolve the same (e.g., use an absolute path like
"$GITHUB_WORKSPACE/src/api/api-spec.json" or at minimum
"../../api/api-spec.json") so that the swagger tofile output location and the
generate:api invocation both reference the same file; modify the export
OPENAPI_SPEC_URL line accordingly in the workflow and keep the existing swagger
tofile and npm run generate:api calls unchanged.

In `@src/Shared/Utilities/EnvironmentHelpers.cs`:
- Around line 18-25: Standardize the inline comments in this section to
Portuguese: replace the comments above the environment?.IsDevelopment() and
Environment.GetEnvironmentVariable("INTEGRATION_TESTS") checks with Portuguese
phrasing that preserves the original meaning; also ensure that log messages
remain in English and that any XML documentation (/// <summary>, /// <param>,
etc.) in the same file is in Portuguese per the repository convention; locate
the spot inside the EnvironmentHelpers class/method where these two checks
occur and update only the comments (without changing the logic).

---

Nitpick comments:
In @.github/workflows/pr-validation.yml:
- Around line 814-815: Update the value of the reports key to restrict the
integration/E2E globs to Cobertura files: replace
"coverage/integration/**/*.xml" with "coverage/integration/**/*.cobertura.xml"
and "coverage/e2e/**/*.xml" with "coverage/e2e/**/*.cobertura.xml" (the line
containing reports:
"coverage/**/*.cobertura.xml;coverage/integration/**/*.xml;coverage/e2e/**/*.xml");
that way ReportGenerator will only process *.cobertura.xml files and avoid
reading unrelated XML.

In `@src/Contracts/Functional/Result.cs`:
- Around line 8-14: The use of [MemberNotNullWhen] on Result<T> (members
IsSuccess, IsFailure, Value, Error) is incorrect for scenarios where T is
nullable (e.g., Result<ModuleUserDto?>) because these attributes force the
analysis to assume Value is non-null on success; fix this by removing the
attributes or constraining them to a non-nullable generic (for example,
applying them only when T : notnull), or introduce a separate type/overload
(for example MaybeResult<T> or ResultNullable<T>) that represents success with
a possibly-null Value and does not carry these attributes, and add clear
documentation to the Result<T> class explaining the limitation for nullable
types; change the attributes on the IsSuccess/IsFailure properties and the
generic Result<T> definition according to the chosen option.

In
`@src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs`:
- Around line 402-419: The production code in
SearchProvidersModuleApi.IndexProviderAsync still checks for providerData ==
null after calling IProvidersModuleApi.GetProviderForIndexingAsync
(providerDataResult.Value and the logger.LogWarning/return Result.Failure path)
but no test covers the Success(null) case; either remove that null-check if the
contract of GetProviderForIndexingAsync was changed to always return
Failure(NotFound) (delete the providerData == null branch and its
LogWarning/return) or add a unit test to SearchProvidersModuleApiTests that sets
the _providersApiMock to return Result.Success<ModuleProviderIndexingDto?>(null)
and asserts the method logs the warning and returns the expected failure and
does not call _repositoryMock.AddAsync, thereby covering the Success(null) path.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: d658046a-49d2-4623-8421-bd2252407e77

📥 Commits

Reviewing files that changed from the base of the PR and between 8586ebb and 57f4c0a.

📒 Files selected for processing (6)
  • .github/workflows/pr-validation.yml
  • src/Contracts/Functional/Result.cs
  • src/Modules/SearchProviders/Tests/Unit/API/ModuleExtensionsTests.cs
  • src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs
  • src/Shared/Utilities/EnvironmentHelpers.cs
  • src/Web/MeAjudaAi.Web.Customer/auth.ts

Filipe Frigini added 2 commits March 11, 2026 18:39
…ests for the SearchProviders module, and configure a PR validation workflow.
…intelligence with conditional registration.
Copy link
Contributor

@coderabbitai coderabbitai bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Actionable comments posted: 3

♻️ Duplicate comments (2)
src/Contracts/Functional/Result.cs (1)

58-63: ⚠️ Potential issue | 🟠 Major

The implicit operator became more restrictive than Success(...).

Today Success(value) accepts valid payloads such as 0, false, and null for Result<T?>, but the implicit conversion throws for the same payload because it treats default(T) as invalid. This creates two different semantics for the same type and introduces runtime-only exceptions. Ideally the operator should delegate to Success(value), or else the same rule should be applied in every factory, but the two paths must not diverge.

💡 Suggested fix
-    public static implicit operator Result<T>(T value) => EqualityComparer<T>.Default.Equals(value, default!) ? throw new ArgumentNullException(nameof(value)) : Success(value);
+    public static implicit operator Result<T>(T value) => Success(value);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 58 - 63, The implicit
operator Result<T>(T value) is more restrictive than the Success(T) factory
because it rejects default(T) and throws, creating divergent semantics; change
the implicit operator to delegate directly to Result<T>.Success(value) (just as
the implicit Error operator should delegate to Result<T>.Failure(error)),
removing the EqualityComparer<T>.Default.Equals(value, default!) check and the
throw to guarantee consistent behavior across the creation paths.
src/Shared/Utilities/EnvironmentHelpers.cs (1)

16-21: ⚠️ Potential issue | 🟠 Major

Do not let INTEGRATION_TESTS=true enable the bypass outside an explicitly non-production host.

As written, merely defining that env var makes the helper return true, without validating the actual environment. Since this helper already centralizes bypass/configuration decisions for other modules, an accidental value in production becomes a global security/configuration bypass. Restrict this decision to an explicit IHostEnvironment and treat the flag only as reinforcement within Development/Testing/Integration, not as a standalone shortcut.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Shared/Utilities/EnvironmentHelpers.cs` around lines 16 - 21, The
IsSecurityBypassEnvironment function allows the bypass based solely on the
INTEGRATION_TESTS variable, which could enable the bypass in production; update
IsSecurityBypassEnvironment to require an explicit (non-null) IHostEnvironment
and only treat INTEGRATION_TESTS as reinforcement when the host is clearly
non-production (for example environment.IsDevelopment() ||
environment.EnvironmentName == "Testing" || environment.EnvironmentName ==
"Integration"); if environment is null or indicates production, ignore the
environment variable and return false; locate the call to
Environment.GetEnvironmentVariable("INTEGRATION_TESTS") inside
IsSecurityBypassEnvironment to apply this combined check.
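A compact sketch of the requested policy, written in TypeScript with hypothetical names (the real helper is C# and takes an IHostEnvironment): the host environment name is the only gate, and the INTEGRATION_TESTS flag can never flip the result on its own:

```typescript
// Environments in which a security bypass may be considered at all.
const NON_PRODUCTION = new Set(["Development", "Testing", "Integration"]);

// Hypothetical sketch: the INTEGRATION_TESTS flag is deliberately not an input
// that can enable the bypass by itself; only an explicit, clearly
// non-production environment name does. A null environment means "unknown",
// which must fail closed.
function isSecurityBypassEnvironment(environmentName: string | null): boolean {
  return environmentName !== null && NON_PRODUCTION.has(environmentName);
}
```

The key design choice is failing closed: an unknown or production environment returns false regardless of any env var, so a stray INTEGRATION_TESTS=true in production has no effect.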
🧹 Nitpick comments (3)
src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs (2)

416-418: Also lock in the absence of UpdateAsync and SaveChangesAsync in these failure scenarios.

Today the tests only guarantee that AddAsync was not called. If someone introduces a write before the error return, the bug still passes. Since SearchProvidersModuleApi.IndexProviderAsync short-circuits before touching the repository in src/Modules/SearchProviders/Application/ModuleApi/SearchProvidersModuleApi.cs:153-157, it is worth making that explicit here as well.

Suggested diff
         result.IsFailure.Should().BeTrue();
         result.Error.Message.Should().Contain("not found");
         _repositoryMock.Verify(x => x.AddAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()), Times.Never);
+        _repositoryMock.Verify(x => x.UpdateAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()), Times.Never);
+        _repositoryMock.Verify(x => x.SaveChangesAsync(It.IsAny<CancellationToken>()), Times.Never);
@@
         result.IsFailure.Should().BeTrue();
         result.Error.Message.Should().Contain(providerId.ToString());
         _repositoryMock.Verify(x => x.AddAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()), Times.Never);
+        _repositoryMock.Verify(x => x.UpdateAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()), Times.Never);
+        _repositoryMock.Verify(x => x.SaveChangesAsync(It.IsAny<CancellationToken>()), Times.Never);

Also applies to: 436-438

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs`
around lines 416 - 418, The test currently asserts failure and that AddAsync was
not called, but must also assert that UpdateAsync and SaveChangesAsync were
never invoked to prevent regressions; update the assertions in
SearchProvidersModuleApiTests (near the IsFailure checks for IndexProviderAsync
scenarios) to call _repositoryMock.Verify(x =>
x.UpdateAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()),
Times.Never) and _repositoryMock.Verify(x =>
x.SaveChangesAsync(It.IsAny<CancellationToken>()), Times.Never) for both failing
cases (the blocks around the existing AddAsync verify at the locations
mentioned) so the test enforces no repository writes occur on short-circuit
paths in SearchProvidersModuleApi.IndexProviderAsync.

424-425: Translate this comment to Portuguese.

Outside the AAA headers, C# test comments follow the repository's Portuguese convention, and this passage was left in English.

Suggested diff
-        // Arrange — simulates the contract where the API returns Success but with a null payload
-        // (e.g., provider exists in the index but has no indexable data yet)
+        // Arrange — simula o contrato em que a API retorna Success com payload nulo
+        // (por exemplo, quando o provider existe no índice, mas ainda não tem dados indexáveis)

Based on learnings: in unit test code, keep AAA headers in English, but comments and XML documentation should be in Portuguese.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs`
around lines 424 - 425, Replace the English comment "// Arrange — simulates
the contract where the API returns Success but with a null payload (e.g.,
provider exists in the index but has no indexable data yet)" with a Portuguese
version, keeping the AAA header "Arrange" in English; for example: "// Arrange —
simula o contrato em que a API retorna Success mas com payload nulo (p.ex., o
provedor existe no índice mas ainda não tem dados indexáveis)".
src/Web/MeAjudaAi.Web.Customer/auth.ts (1)

180-187: The type assertion is functional.

The manual assertion works, but consider using NextAuth type augmentation in next-auth.d.ts for safer typing of the custom fields.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts` around lines 180 - 187, The snippet
performs a manual assertion on user to access custom fields (accessToken,
refreshToken, expiresAt, id); instead, add a type augmentation for NextAuth
(next-auth.d.ts) defining these fields on User/Session/JWT and adjust the
return of the callback that uses the token/user variables to rely on the
strongly typed shapes; locate the callback that builds the object (referenced
by the user and token variables and the expression assigning
accessToken/refreshToken/expiresAt/id) and remove the manual assertion,
updating the NextAuth signatures to reflect the custom fields and guarantee
safe typing across the whole chain.
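As a sketch of what the augmentation buys, the shapes below are hypothetical stand-ins for NextAuth's `User` and `JWT` (in the real app they would live in `next-auth.d.ts` as a module augmentation); once the fields are declared, the callback copies them without any `as` assertion:

```typescript
// Hypothetical stand-ins for the augmented NextAuth types.
interface AppUser {
  id: string;
  accessToken: string;
  refreshToken: string;
  expiresAt: number;
}

interface AppToken {
  sub?: string;
  accessToken?: string;
  refreshToken?: string;
  expiresAt?: number;
}

// jwt-callback sketch: user is only present on the initial sign-in, so the
// custom fields are copied onto the token exactly once, fully typed.
function jwtCallback(token: AppToken, user?: AppUser): AppToken {
  if (user) {
    token.sub = user.id;
    token.accessToken = user.accessToken;
    token.refreshToken = user.refreshToken;
    token.expiresAt = user.expiresAt;
  }
  return token;
}
```

On subsequent calls `user` is undefined and the token passes through unchanged, which mirrors the NextAuth callback contract.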
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/pr-validation.yml:
- Around line 494-501: The script currently copies found Cobertura reports
(COVERAGE_FILE) into MODULE_COVERAGE_DIR using cp, causing duplicate reports
(original GUID path plus ./coverage/<module>/<module>.cobertura.xml) and
double-counting during aggregation; replace the copy with a move so only the
normalized path remains—i.e., change the cp "$COVERAGE_FILE"
"$MODULE_COVERAGE_DIR/${module_name,,}.cobertura.xml" step to move (mv) the file
or otherwise delete the original after copying, ensuring COVERAGE_FILE is
relocated to the normalized filename and no duplicate remains for aggregation.
- Line 817: The `reports` configuration contains redundant globs:
`coverage/**/*.cobertura.xml` already covers
`coverage/integration/**/*.cobertura.xml` and `coverage/e2e/**/*.cobertura.xml`,
causing duplicate processing; edit the value associated with `reports` (the
reports key in the workflow) by removing the redundant entries and leaving only
`coverage/**/*.cobertura.xml`, or, if you want to keep the categories separate,
remove the generic glob and keep only
`coverage/integration/**/*.cobertura.xml;coverage/e2e/**/*.cobertura.xml` to
avoid overlap.

In `@src/Modules/Documents/Infrastructure/Extensions.cs`:
- Around line 122-127: The fallback stubs are registered unconditionally; change
Extensions.cs to register NullBlobStorageService and
NullDocumentIntelligenceService via TryAddScoped only when running in a bypass
environment (e.g., env.IsDevelopment() or a bypass configuration flag), and in
all other environments skip the stub registration and instead validate the
Azure/services configuration and throw/log a fail-fast error (for example,
throw InvalidOperationException or use
IHostApplicationLifetime.StopApplication) indicating missing configuration for
IBlobStorageService/IDocumentIntelligenceService; look up the symbols
TryAddScoped, IBlobStorageService, NullBlobStorageService,
IDocumentIntelligenceService, and NullDocumentIntelligenceService to locate the
block to make conditional, and add the configuration validation before
accepting startup in production.

---

Duplicate comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 58-63: The implicit operator Result<T>(T value) is more
restrictive than the Success(T) factory because it rejects default(T) and
throws, creating divergent semantics; change the implicit operator to delegate
directly to Result<T>.Success(value) (just as the implicit Error operator
should delegate to Result<T>.Failure(error)), removing the
EqualityComparer<T>.Default.Equals(value, default!) check and the throw to
guarantee consistent behavior across the creation paths.

In `@src/Shared/Utilities/EnvironmentHelpers.cs`:
- Around line 16-21: The IsSecurityBypassEnvironment function allows the bypass
based solely on the INTEGRATION_TESTS variable, which could enable the bypass
in production; update IsSecurityBypassEnvironment to require an explicit
(non-null) IHostEnvironment and only treat INTEGRATION_TESTS as reinforcement
when the host is clearly non-production (for example
environment.IsDevelopment() || environment.EnvironmentName == "Testing" ||
environment.EnvironmentName == "Integration"); if environment is null or
indicates production, ignore the environment variable and return false; locate
the call to Environment.GetEnvironmentVariable("INTEGRATION_TESTS") inside
IsSecurityBypassEnvironment to apply this combined check.

---

Nitpick comments:
In
`@src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs`:
- Around line 416-418: The test currently asserts failure and that AddAsync was
not called, but must also assert that UpdateAsync and SaveChangesAsync were
never invoked to prevent regressions; update the assertions in
SearchProvidersModuleApiTests (near the IsFailure checks for IndexProviderAsync
scenarios) to call _repositoryMock.Verify(x =>
x.UpdateAsync(It.IsAny<SearchableProvider>(), It.IsAny<CancellationToken>()),
Times.Never) and _repositoryMock.Verify(x =>
x.SaveChangesAsync(It.IsAny<CancellationToken>()), Times.Never) for both failing
cases (the blocks around the existing AddAsync verify at the locations
mentioned) so the test enforces no repository writes occur on short-circuit
paths in SearchProvidersModuleApi.IndexProviderAsync.
- Around line 424-425: Replace the English comment "// Arrange — simulates
the contract where the API returns Success but with a null payload (e.g.,
provider exists in the index but has no indexable data yet)" with a Portuguese
version, keeping the AAA header "Arrange" in English; for example: "// Arrange —
simula o contrato em que a API retorna Success mas com payload nulo (p.ex., o
provedor existe no índice mas ainda não tem dados indexáveis)".

In `@src/Web/MeAjudaAi.Web.Customer/auth.ts`:
- Around line 180-187: The snippet performs a manual assertion on user to
access custom fields (accessToken, refreshToken, expiresAt, id); instead, add a
type augmentation for NextAuth (next-auth.d.ts) defining these fields on
User/Session/JWT and adjust the return of the callback that uses the token/user
variables to rely on the strongly typed shapes; locate the callback that builds
the object (referenced by the user and token variables and the expression
assigning accessToken/refreshToken/expiresAt/id) and remove the manual
assertion, updating the NextAuth signatures to reflect the custom fields and
guarantee safe typing across the whole chain.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 1d84b1f3-beea-48b4-a14f-3bf63c5af89c

📥 Commits

Reviewing files that changed from the base of the PR and between 8586ebb and 739d96b.

📒 Files selected for processing (9)
  • .github/workflows/pr-validation.yml
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Modules/Documents/Infrastructure/Services/NullBlobStorageService.cs
  • src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs
  • src/Modules/SearchProviders/Tests/Unit/API/ModuleExtensionsTests.cs
  • src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs
  • src/Shared/Utilities/EnvironmentHelpers.cs
  • src/Web/MeAjudaAi.Web.Customer/auth.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • src/Modules/SearchProviders/Tests/Unit/API/ModuleExtensionsTests.cs

Filipe Frigini added 2 commits March 11, 2026 21:29
…egrate NextAuth.js for Keycloak authentication, and add foundational infrastructure components.
Copy link
Contributor

@coderabbitai coderabbitai bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Actionable comments posted: 3

🧹 Nitpick comments (2)
src/Modules/Documents/Infrastructure/Extensions.cs (1)

47-48: Consider using DatabaseConstants.DefaultTestConnectionString for consistency.

Other modules (Providers, ServiceCatalogs, Locations) use MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString as the fallback in test environments. Using the centralized constant avoids duplication and eases maintenance.

♻️ Suggested change
             if (isTestEnvironment)
             {
                 // Use placeholder for integration tests - will be replaced by test infrastructure
-                connectionString = "Host=localhost;Database=test;Username=test;Password=test";
+                connectionString = MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString;
             }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Infrastructure/Extensions.cs` around lines 47 - 48,
Replace the hard-coded test connection string assignment in Extensions.cs (the
connectionString variable) with the centralized constant
MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString so other
modules use the same fallback; locate the assignment to connectionString in the
Extensions class and swap the literal
"Host=localhost;Database=test;Username=test;Password=test" for
DatabaseConstants.DefaultTestConnectionString (import or fully qualify
MeAjudaAi.Shared.Database.DatabaseConstants if needed).
src/Contracts/Functional/Result.cs (1)

109-112: The ! operator in Error! is redundant but harmless.

The [MemberNotNullWhen(true, nameof(Error))] attribute on IsFailure (line 77) should guarantee that Error is non-null in the else branch. However, the compiler may not infer this through the computed property IsFailure => !IsSuccess. The ! is a pragmatic workaround and does not indicate a real problem.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 109 - 112, Remove the
redundant null-forgiving operator on Error in the onFailure call and make the
non-nullability explicit to the compiler by adding the nullable analysis
attribute to IsSuccess (or adjust the null-state annotation): add
[MemberNotNullWhen(false, nameof(Error))] to the IsSuccess property (or
alternatively capture Error into a local with Error! before calling onFailure),
then change the call in the block using IsSuccess/IsFailure to onFailure(Error)
without the trailing !; this ensures the compiler understands Error is non-null
in the else branch while eliminating the unnecessary `!` operator.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/pr-validation.yml:
- Around line 116-122: The OpenAPI client generation step (generate:api) must
run before the "🔍 Lint Frontend Workspace" and "🧪 Test Frontend Workspace"
steps because MeAjudaAi.Web.Customer depends on generated artifacts; move the
step that runs generate:api to before the commands that use npx nx affected
--target=lint and --target=test (also check and adjust the corresponding blocks
around lines 146-155), or alternatively remove MeAjudaAi.Web.Customer from the
affected set for lint/test and add a step that explicitly runs lint/test for
that app only after generate:api completes.

In `@src/Modules/Documents/Infrastructure/Extensions.cs`:
- Around line 125-127: The code calls BuildServiceProvider() and
GetService<IHostEnvironment>() inside AddServices (or AddInfrastructure) which
creates a temporary service provider and triggers CA2255 and resolution issues;
instead accept IHostEnvironment as an explicit parameter on the extension method
(or rely on IConfiguration/env vars inside
MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment) and
remove the BuildServiceProvider/GetService calls so the method uses the provided
IHostEnvironment (or the parameterless helper) when invoking
IsSecurityBypassEnvironment.

In `@src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs`:
- Around line 475-479: The Dispose method currently blocks on async code using
DisposeAsync().AsTask().GetAwaiter().GetResult(), which is a sync-over-async
anti-pattern; either extract the cleanup into a synchronous path and call that
from Dispose, or avoid context capture by offloading the async call to the
threadpool. Concretely: create a shared cleanup implementation (e.g., a private
void Dispose(bool disposing) or a private void DisposeSync()) that performs the
necessary synchronous resource cleanup and have Dispose() call it, and keep
DisposeAsync() for async cleanup; or if you must call the async implementation,
replace the blocking call with Task.Run(() =>
DisposeAsync().AsTask()).GetAwaiter().GetResult() to avoid capturing a
SynchronizationContext. Ensure _disposed is set and idempotency preserved in
both Dispose and DisposeAsync.

---

Nitpick comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 109-112: Remove the redundant null-forgiving operator on Error in
the onFailure call and make the non-nullability explicit to the compiler by
adding the nullable analysis attribute to IsSuccess (or adjust the null-state
annotation): add [MemberNotNullWhen(false, nameof(Error))] to the IsSuccess
property (or alternatively capture Error into a local with Error! before calling
onFailure), then change the call in the block using IsSuccess/IsFailure to
onFailure(Error) without the trailing !; this ensures the compiler understands
Error is non-null in the else branch while eliminating the unnecessary `!`
operator.

In `@src/Modules/Documents/Infrastructure/Extensions.cs`:
- Around line 47-48: Replace the hard-coded test connection string assignment in
Extensions.cs (the connectionString variable) with the centralized constant
MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString so other
modules use the same fallback; locate the assignment to connectionString in the
Extensions class and swap the literal
"Host=localhost;Database=test;Username=test;Password=test" for
DatabaseConstants.DefaultTestConnectionString (import or fully qualify
MeAjudaAi.Shared.Database.DatabaseConstants if needed).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 121936ff-e2ce-4a95-9a66-22e0a7c5b84d

📥 Commits

Reviewing files that changed from the base of the PR and between 739d96b and 69577c0.

📒 Files selected for processing (8)
  • .github/workflows/pr-validation.yml
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Modules/SearchProviders/Tests/Unit/Application/ModuleApi/SearchProvidersModuleApiTests.cs
  • src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs
  • src/Shared/Utilities/EnvironmentHelpers.cs
  • src/Web/MeAjudaAi.Web.Customer/auth.ts
  • src/Web/MeAjudaAi.Web.Customer/types/next-auth.d.ts

Filipe Frigini added 2 commits March 11, 2026 21:47
…Result type, RabbitMQ dead-lettering, and PR validation workflow.
…onfiguration and validation, including CORS, authentication, and rate limiting.
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

♻️ Duplicate comments (1)
src/Contracts/Functional/Result.cs (1)

18-19: 🛠️ Refactor suggestion | 🟠 Major

Missing [MemberNotNullWhen] attributes on Result<T> for parity with Result.

The non-generic Result class (lines 74 and 77) carries the [MemberNotNullWhen] attributes, but Result<T> does not. This makes the compiler's nullability flow analysis inconsistent between the two.

So that the compiler knows Error is non-null when IsFailure is true:

 public class Result<T>
 {
+    [MemberNotNullWhen(false, nameof(Error))]
     public bool IsSuccess { get; }
+    [MemberNotNullWhen(true, nameof(Error))]
     public bool IsFailure => !IsSuccess;
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 18 - 19, Add the same
nullable flow attributes to the generic Result<T> as used in the non-generic
Result: annotate the IsFailure and IsSuccess properties in the Result<T> class
with [MemberNotNullWhen(true, nameof(Error))] (and [MemberNotNullWhen(false,
nameof(Error))] as appropriate for IsSuccess) so the compiler knows Error is
non-null when IsFailure is true; update the Result<T> property declarations
(IsSuccess/IsFailure) accordingly to mirror the non-generic Result attributes.
🧹 Nitpick comments (4)
src/Modules/Documents/Infrastructure/Extensions.cs (2)

122-144: The fail-fast implementation addresses the earlier concerns.

The code now:

  1. Registers no-op stubs only in bypass environments
  2. Avoids BuildServiceProvider() by using the parameterless overload
  3. Fails fast in production

One minor note: the exception messages are in Portuguese (lines 137-138, 142-143). Given the repository's English-logs policy, configuration error messages (aimed at developers/ops) could follow the same convention for consistency. This is optional and low priority.

💬 Optional: exception messages in English
             if (!registered)
                 throw new InvalidOperationException(
-                    "IBlobStorageService não está configurado. Defina 'Azure:Storage:ConnectionString' para habilitar uploads de arquivo.");
+                    "IBlobStorageService is not configured. Set 'Azure:Storage:ConnectionString' to enable file uploads.");

             var intelligenceRegistered = services.Any(sd => sd.ServiceType == typeof(IDocumentIntelligenceService));
             if (!intelligenceRegistered)
                 throw new InvalidOperationException(
-                    "IDocumentIntelligenceService não está configurado. Defina 'Azure:DocumentIntelligence:Endpoint' e 'Azure:DocumentIntelligence:ApiKey' para habilitar OCR.");
+                    "IDocumentIntelligenceService is not configured. Set 'Azure:DocumentIntelligence:Endpoint' and 'Azure:DocumentIntelligence:ApiKey' to enable OCR.");

Based on learnings: "Enforce English for all runtime logs and messages in C# code. Comments may remain in Portuguese."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Infrastructure/Extensions.cs` around lines 122 - 144,
Update the two configuration error messages thrown by InvalidOperationException
to English for consistency with the repo logging policy: locate the fail-fast
checks in Extensions.cs that run when
EnvironmentHelpers.IsSecurityBypassEnvironment() is false (the blocks that test
for services.Any(sd => sd.ServiceType == typeof(IBlobStorageService)) and
services.Any(sd => sd.ServiceType == typeof(IDocumentIntelligenceService))) and
replace the Portuguese messages with clear English equivalents that state the
missing service and the required configuration keys (mention
Azure:Storage:ConnectionString and
Azure:DocumentIntelligence:Endpoint/Azure:DocumentIntelligence:ApiKey) while
leaving the surrounding logic (TryAddScoped registrations, exception types, and
environment check) unchanged.

1-1: Consider narrowing the scope of the pragma disable.

The #pragma warning disable S2068 pragma is scoped to the entire file, but the code that triggers the warning is only on line 48 (inside the isTestEnvironment block). Scoping the pragma more tightly avoids masking future hard-coded credential issues elsewhere in the file.

♻️ Suggestion for a narrower scope
-#pragma warning disable S2068 // "password" detected here, make sure this is not a hard-coded credential
 using EFCore.NamingConventions;

And around line 48:

`#pragma` warning disable S2068 // Credential placeholder for test environment only
                connectionString = MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString;
`#pragma` warning restore S2068
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Infrastructure/Extensions.cs` at line 1, The pragma
that disables warning S2068 is scoped to the entire file; restrict it to just
the snippet that causes the false positive: inside the isTestEnvironment block,
around the connectionString assignment that uses
DatabaseConstants.DefaultTestConnectionString. Remove the global pragma and wrap
only that assignment with `#pragma` warning disable S2068 before and `#pragma`
warning restore S2068 after (or equivalent), so the rest of the file keeps
flagging hard-coded credentials.
.github/workflows/pr-validation.yml (1)

137-139: Hardcoded TFM in the DLL path may break silently.

The path ../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/net10.0/MeAjudaAi.ApiService.dll assumes the TargetFramework is net10.0. If the csproj is updated to another version (e.g. net11.0), this step will fail without a clear message.

💡 Suggestion: detect the TFM dynamically
+          # Detect TFM from csproj
+          API_SERVICE_CSPROJ="../../src/Bootstrapper/MeAjudaAi.ApiService/MeAjudaAi.ApiService.csproj"
+          TFM=$(grep -oP '(?<=<TargetFramework>)[^<]+' "$API_SERVICE_CSPROJ" || echo "net10.0")
+          DLL_PATH="../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/${TFM}/MeAjudaAi.ApiService.dll"
+
+          if [ ! -f "$DLL_PATH" ]; then
+            echo "❌ DLL not found at $DLL_PATH"
+            echo "Available Release builds:"
+            ls -la ../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/ || true
+            exit 1
+          fi
+
           swagger tofile --output ../api/api-spec.json \
-            ../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/net10.0/MeAjudaAi.ApiService.dll \
+            "$DLL_PATH" \
             v1
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/workflows/pr-validation.yml around lines 137 - 139, The workflow
step uses a hardcoded TFM in the DLL path
("../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/net10.0/MeAjudaAi.ApiService.dll")
which will break if TargetFramework changes; update the step that runs "swagger
tofile" to compute the DLL path dynamically instead of embedding "net10.0" —
e.g., detect TargetFramework by reading the csproj
(TargetFramework/TargetFrameworks) or glob for the built DLL under
bin/Release/*/MeAjudaAi.ApiService.dll (or use dotnet publish output) and pass
that resolved path into the "swagger tofile" command so the workflow works for
other TFMs.
src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs (1)

46-62: Possible false positive in validation when sections exist but values are not configured.

GetValue<int> returns 0 when the key does not exist. If the Anonymous or Authenticated section exists but has no values configured, validation will fail with "must be positive values".

Consider using GetValue<int?> to distinguish "value not configured" from "explicit zero":

♻️ Suggestion to avoid false positives
 if (anonymousLimits.Exists())
 {
-    var anonMinute = anonymousLimits.GetValue<int>("RequestsPerMinute");
-    var anonHour = anonymousLimits.GetValue<int>("RequestsPerHour");
-
-    if (anonMinute <= 0 || anonHour <= 0)
-        errors.Add("Anonymous request limits must be positive values");
+    var anonMinute = anonymousLimits.GetValue<int?>("RequestsPerMinute");
+    var anonHour = anonymousLimits.GetValue<int?>("RequestsPerHour");
+
+    if (anonMinute.HasValue && anonMinute.Value <= 0)
+        errors.Add("Anonymous RequestsPerMinute must be positive");
+    if (anonHour.HasValue && anonHour.Value <= 0)
+        errors.Add("Anonymous RequestsPerHour must be positive");
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`
around lines 46 - 62, The validation falsely flags missing keys as zero because
GetValue<int> returns 0 for missing keys; change the retrieval to nullable ints
(use GetValue<int?>) for anonymousLimits and authenticatedLimits (e.g., replace
anonMinute/anonHour and authMinute/authHour retrievals with int? types), then
only perform the <=0 check when the nullable HasValue is true (or value != null)
and treat null as "not configured" so you don't add the "must be positive
values" error for missing keys; update the conditional logic around
anonymousLimits.Exists()/authenticatedLimits.Exists() accordingly and keep the
existing error messages in errors.Add.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`:
- Around line 304-309: The else-branch currently calls
services.AddAuthentication() with no default scheme; change it to register the
same default authentication scheme used elsewhere (e.g., the JwtBearer scheme
used by AddKeycloakAuthentication/ServiceCollectionExtensions) so the app has a
consistent default—replace the parameterless AddAuthentication() call with
AddAuthentication(<defaultScheme>) using the same symbol (e.g.,
JwtBearerDefaults.AuthenticationScheme or your project’s
DefaultAuthenticationScheme constant) and add the necessary using/import if
missing.

In `@src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs`:
- Line 23: The problem is a race between EnsureConnectionAsync() (which calls
_connectionSemaphore.WaitAsync() and then Release() in the finally) and
Dispose()/DisposeAsync(), which can dispose the semaphore while waiters are
still queued; in addition, Dispose() does not properly close/dispose
_channel/_connection. To fix this, change EnsureConnectionAsync() to accept a
CancellationToken tied to a disposal flag, have Dispose()/DisposeAsync() signal
cancellation and wait for pending operations to complete before disposing
_connectionSemaphore (e.g. use a CancellationTokenSource, check _disposed, and
cancel/await every task that entered WaitAsync), and only after that wait close
and dispose the RabbitMQ connection explicitly by calling Close()/Dispose() on
_channel and _connection (or, alternatively, throw NotSupportedException in
Dispose() and force the use of DisposeAsync()), ensuring that _disposed is
checked atomically before releasing or disposing the semaphore and resources.
- Line 17: The synchronous Dispose of RabbitMqDeadLetterService drops resources
without releasing them; update the Dispose() method of the
RabbitMqDeadLetterService class to call _channel?.Dispose() and
_connection?.Dispose() before clearing the references, making sure any existing
Dispose-pattern logic (the IDisposable and IAsyncDisposable interfaces) stays
idempotent, and set _channel = null and _connection = null after Dispose();
preserve the behavior of DisposeAsync() (if it exists) so cleanup is not
duplicated, and mark objects as released to avoid repeated calls.
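A minimal sketch of the idempotent dual-dispose shape those two prompts describe, using hypothetical stand-in fields (the real service holds RabbitMQ channel/connection objects and a semaphore):

```csharp
using System;
using System.Threading.Tasks;

// Sketch only: shows the idempotency guard and explicit resource disposal
// requested above, not the project's actual RabbitMqDeadLetterService.
public sealed class DeadLetterServiceSketch : IDisposable, IAsyncDisposable
{
    private IDisposable? _channel;     // stand-in for the RabbitMQ channel
    private IDisposable? _connection;  // stand-in for the RabbitMQ connection
    private bool _disposed;

    public void Dispose()
    {
        if (_disposed) return;  // idempotent: safe to call repeatedly
        _disposed = true;
        _channel?.Dispose();
        _connection?.Dispose();
        _channel = null;
        _connection = null;
    }

    public ValueTask DisposeAsync()
    {
        // This sketch has no async-only resources, so it defers to Dispose();
        // the real service would await closing the channel/connection here
        // after signalling cancellation to pending EnsureConnectionAsync calls.
        Dispose();
        return ValueTask.CompletedTask;
    }
}
```

Both paths share one `_disposed` flag, so mixed calls to Dispose() and DisposeAsync() release the underlying resources exactly once.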

---

Duplicate comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 18-19: Add the same nullable flow attributes to the generic
Result<T> as used in the non-generic Result: annotate the IsFailure and
IsSuccess properties in the Result<T> class with [MemberNotNullWhen(true,
nameof(Error))] (and [MemberNotNullWhen(false, nameof(Error))] as appropriate
for IsSuccess) so the compiler knows Error is non-null when IsFailure is true;
update the Result<T> property declarations (IsSuccess/IsFailure) accordingly to
mirror the non-generic Result attributes.

---

Nitpick comments:
In @.github/workflows/pr-validation.yml:
- Around line 137-139: The workflow step uses a hardcoded TFM in the DLL path
("../../src/Bootstrapper/MeAjudaAi.ApiService/bin/Release/net10.0/MeAjudaAi.ApiService.dll")
which will break if TargetFramework changes; update the step that runs "swagger
tofile" to compute the DLL path dynamically instead of embedding "net10.0" —
e.g., detect TargetFramework by reading the csproj
(TargetFramework/TargetFrameworks) or glob for the built DLL under
bin/Release/*/MeAjudaAi.ApiService.dll (or use dotnet publish output) and pass
that resolved path into the "swagger tofile" command so the workflow works for
other TFMs.

In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`:
- Around line 46-62: The validation falsely flags missing keys as zero because
GetValue<int> returns 0 for missing keys; change the retrieval to nullable ints
(use GetValue<int?>) for anonymousLimits and authenticatedLimits (e.g., replace
anonMinute/anonHour and authMinute/authHour retrievals with int? types), then
only perform the <=0 check when the nullable HasValue is true (or value != null)
and treat null as "not configured" so you don't add the "must be positive
values" error for missing keys; update the conditional logic around
anonymousLimits.Exists()/authenticatedLimits.Exists() accordingly and keep the
existing error messages in errors.Add.

In `@src/Modules/Documents/Infrastructure/Extensions.cs`:
- Around line 122-144: Update the two configuration error messages thrown by
InvalidOperationException to English for consistency with the repo logging
policy: locate the fail-fast checks in Extensions.cs that run when
EnvironmentHelpers.IsSecurityBypassEnvironment() is false (the blocks that test
for services.Any(sd => sd.ServiceType == typeof(IBlobStorageService)) and
services.Any(sd => sd.ServiceType == typeof(IDocumentIntelligenceService))) and
replace the Portuguese messages with clear English equivalents that state the
missing service and the required configuration keys (mention
Azure:Storage:ConnectionString and
Azure:DocumentIntelligence:Endpoint/Azure:DocumentIntelligence:ApiKey) while
leaving the surrounding logic (TryAddScoped registrations, exception types, and
environment check) unchanged.
- Line 1: The pragma that disables warning S2068 is scoped to the entire file;
restrict it to just the snippet that causes the false positive: inside the
isTestEnvironment block, around the connectionString assignment that uses
DatabaseConstants.DefaultTestConnectionString. Remove the global pragma and wrap
only that assignment with `#pragma` warning disable S2068 before and `#pragma`
warning restore S2068 after (or equivalent), so the rest of the file keeps
flagging hard-coded credentials.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 9f563b66-8b4c-4059-a9dd-97c9ee4bcb33

📥 Commits

Reviewing files that changed from the base of the PR and between bf6f7d7 and 0b90a4e.

📒 Files selected for processing (6)
  • .github/workflows/pr-validation.yml
  • src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs
  • src/Contracts/Functional/.editorconfig
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs
✅ Files skipped from review due to trivial changes (1)
  • src/Contracts/Functional/.editorconfig

Filipe Frigini added 2 commits March 12, 2026 10:51
…lidation workflow, and introduce core functional Result type and RabbitMQ dead-letter service.
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs (1)

227-246: ⚠️ Potential issue | 🟠 Major

The test bypass still fails with AllowedOrigins = "*".

Here startup still accepts the wildcard only in Development. So even with Validate() skipped in Testing/Integration, this block still throws InvalidOperationException if the E2E/CI configuration uses "*". It is worth reusing the same bypass predicate applied above.

🛠️ Suggested fix
-                    if (environment.IsDevelopment())
+                    if (isTesting || environment.IsDevelopment())
                     {
                         // AllowAnyOrigin() é incompatível com AllowCredentials()
                         if (corsOptions.AllowCredentials)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`
around lines 227 - 246, The block that handles AllowedOrigins.Contains("*")
only allows the wildcard under environment.IsDevelopment(), which means E2E/CI
tests that rely on the bypass still throw InvalidOperationException; change the
check to reuse the same bypass predicate used earlier (instead of just
environment.IsDevelopment()), i.e. allow the wildcard when the test/bypass
predicate used above returns true, keeping the AllowCredentials logic (use
policy.SetIsOriginAllowed(_ => true) when corsOptions.AllowCredentials;
otherwise policy.AllowAnyOrigin()), and only throw InvalidOperationException
when neither development nor the bypass predicate indicates a test environment.
References: corsOptions.AllowedOrigins, environment, policy,
SetIsOriginAllowed, AllowCredentials, AllowedOrigins.Contains("*").
♻️ Duplicate comments (1)
src/Modules/Documents/Infrastructure/Extensions.cs (1)

40-40: ⚠️ Potential issue | 🟠 Major

Pass IHostEnvironment explicitly to the security bypass helper.

On lines 40 and 128, IsSecurityBypassEnvironment() is called without IHostEnvironment, relying only on process environment variables (DOTNET_ENVIRONMENT / ASPNETCORE_ENVIRONMENT). If tests or the application use UseEnvironment("Testing") configured programmatically on the host builder without setting those variables, the module will skip the bypass and throw fail-fast exceptions even in test environments.

The correct pattern already exists in the repository: src/Modules/SearchProviders/API/Extensions.cs already takes IHostEnvironment as a parameter and passes it to the helper. Apply the same pattern here.

💡 Suggested fix
-public static IServiceCollection AddInfrastructure(this IServiceCollection services, IConfiguration configuration)
+public static IServiceCollection AddInfrastructure(
+    this IServiceCollection services,
+    IConfiguration configuration,
+    IHostEnvironment environment)
 {
-    services.AddPersistence(configuration);
-    services.AddServices(configuration);
+    services.AddPersistence(configuration, environment);
+    services.AddServices(configuration, environment);
     services.AddEventHandlers();
     services.AddJobs();

     return services;
 }

-private static IServiceCollection AddPersistence(this IServiceCollection services, IConfiguration configuration)
+private static IServiceCollection AddPersistence(
+    this IServiceCollection services,
+    IConfiguration configuration,
+    IHostEnvironment environment)
 {
     var connectionString = configuration.GetConnectionString("DefaultConnection")
                           ?? configuration.GetConnectionString("Documents")
                           ?? configuration.GetConnectionString("meajudaai-db");

-    var isTestEnvironment = MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment();
+    var isBypassEnvironment =
+        MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment(environment);

     if (string.IsNullOrEmpty(connectionString))
     {
-        if (isTestEnvironment)
+        if (isBypassEnvironment)
         {
 `#pragma` warning disable S2068
             connectionString = MeAjudaAi.Shared.Database.DatabaseConstants.DefaultTestConnectionString;
 `#pragma` warning restore S2068
         }
         else
         {
             throw new InvalidOperationException(
                 "Database connection string is not configured. "
                 + "Please set one of the following configuration keys: "
                 + "'ConnectionStrings:DefaultConnection', 'ConnectionStrings:Documents', or 'ConnectionStrings:meajudaai-db'");
         }
     }
     ...
 }

-private static IServiceCollection AddServices(this IServiceCollection services, IConfiguration configuration)
+private static IServiceCollection AddServices(
+    this IServiceCollection services,
+    IConfiguration configuration,
+    IHostEnvironment environment)
 {
     ...
-    if (MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment())
+    if (MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment(environment))
     {
         services.TryAddScoped<IBlobStorageService, NullBlobStorageService>();
         services.TryAddScoped<IDocumentIntelligenceService, NullDocumentIntelligenceService>();
     }
     else

Also update the call in src/Modules/Documents/API/Extensions.cs:24:

 public static IServiceCollection AddDocumentsModule(
     this IServiceCollection services,
     IConfiguration configuration)
 {
     services.AddApplication(configuration);
-    services.AddInfrastructure(configuration);
+    services.AddInfrastructure(configuration, services.BuildServiceProvider().GetRequiredService<IHostEnvironment>());

Better yet, take IHostEnvironment as a parameter in AddDocumentsModule (following the AddSearchProvidersModule pattern) and pass it along.

Also applies to: 128-128

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Infrastructure/Extensions.cs` at line 40, Change the
calls to
MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment() to
accept and explicitly use an IHostEnvironment: update the public
AddDocumentsModule method (in Extensions.cs) to take IHostEnvironment as a
parameter and forward that instance to
IsSecurityBypassEnvironment(hostEnvironment) where it is currently called
without arguments (roughly the lines with IsSecurityBypassEnvironment and the
checks at 40 and 128), and also update the AddDocumentsModule call in
src/Modules/Documents/API/Extensions.cs to supply the IHostEnvironment received
by the host; follow the same pattern used in
src/Modules/SearchProviders/API/Extensions.cs.
🧹 Nitpick comments (4)
src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs (1)

50-51: Consider translating the ErrorMessage to Portuguese if it is exposed to the user.

According to the repository's language policy, user-facing messages must be in Portuguese. If the document rejection reason (ErrorMessage) is shown to the end user in the UI or API, consider translating:

-            ErrorMessage: "Azure AI Document Intelligence is not configured. "
-                + "Set 'Azure:DocumentIntelligence:Endpoint' and 'Azure:DocumentIntelligence:ApiKey' to enable OCR."));
+            ErrorMessage: "Serviço de análise de documentos não configurado. "
+                + "Entre em contato com o suporte técnico."));

If this message is only for logs/operators (not exposed to the frontend), the current English text is fine. Based on learnings: "User-facing validation messages and API responses shown to the frontend must be in Portuguese."

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs`
around lines 50 - 51, The ErrorMessage string in NullDocumentIntelligenceService
("Azure AI Document Intelligence is not configured. Set
'Azure:DocumentIntelligence:Endpoint' and 'Azure:DocumentIntelligence:ApiKey' to
enable OCR.") should be translated to Portuguese if it is shown to end users;
update the ErrorMessage value in the NullDocumentIntelligenceService class to a
Portuguese sentence conveying the same information (e.g. mention que as chaves
Endpoint e ApiKey devem ser configuradas para habilitar OCR). If the message is
strictly for logs/operators and not user-facing, add a comment near the
ErrorMessage field in NullDocumentIntelligenceService clarifying it remains in
English for operational use.
.github/scripts/generate-runsettings.sh (1)

30-63: Consider adding error handling for robustness (optional).

The script could benefit from set -euo pipefail at the top to fail explicitly on errors, plus a check that the file was created successfully. Since this script is sourced by workflows, that can help surface problems earlier.

♻️ Optional suggestion to add error handling
 #!/bin/bash
+set -euo pipefail
 #
 # Shared utility functions for generating Coverlet runsettings files.

And at the end of the generate_runsettings function:

   } > "$file"
+
+  if [ ! -f "$file" ]; then
+    echo "Error: Failed to create runsettings file: $file" >&2
+    return 1
+  fi
 }
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/scripts/generate-runsettings.sh around lines 30 - 63, Add strict
error handling to the script by enabling "set -euo pipefail" at the top of the
file, and make generate_runsettings more robust: validate required parameters
(e.g., "file") before use, ensure "escape_xml" is present/used safely for
"exclude_filter"/"exclude_by_file"/"exclude_by_attr"/"include_filter", and after
the here-doc redirection that writes to "$file" verify the write succeeded (file
exists and is non-empty or check the redirection exit status) and return a
non-zero status from generate_runsettings on failure so callers/workflows fail
fast.
src/Contracts/Functional/Result.cs (2)

68-72: Inconsistent argument validation in the Match method.

Result<T>'s Match does not null-check onSuccess/onFailure, while the Match methods on Result (lines 109-110, 128-129) use ArgumentNullException.ThrowIfNull. If passed null, the generic version will throw NullReferenceException instead of ArgumentNullException.

♻️ Alignment suggestion
 public TResult Match<TResult>(
     Func<T, TResult> onSuccess,
     Func<Error, TResult> onFailure)
-    => IsSuccess ? onSuccess(Value) : onFailure(Error);
+{
+    ArgumentNullException.ThrowIfNull(onSuccess);
+    ArgumentNullException.ThrowIfNull(onFailure);
+
+    return IsSuccess ? onSuccess(Value) : onFailure(Error);
+}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 68 - 72, The generic
Match<TResult> method on Result<T> does not validate the onSuccess and onFailure
parameters, causing NullReferenceException; align it with the Match overloads of
the non-generic Result by adding calls to
ArgumentNullException.ThrowIfNull(onSuccess) and
ArgumentNullException.ThrowIfNull(onFailure) at the start of Match<TResult>
(before evaluating IsSuccess) to guarantee a consistent ArgumentNullException.

99-99: Consistency: implicit operator without an explicit null check.

Unlike the generic operator (line 66), which explicitly validates error is null, this operator delegates validation to the constructor. It works correctly, but consistency in the defensive style would make debugging and maintenance easier.

♻️ Suggestion to align with the generic version
-    public static implicit operator Result(Error error) => Failure(error);
+    public static implicit operator Result(Error error) => error is null ? throw new ArgumentNullException(nameof(error)) : Failure(error);
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` at line 99, The implicit operator "public
static implicit operator Result(Error error) => Failure(error);" lacks the
explicit null-check present in the generic counterpart (the implicit operator
for Result<T>), so update this operator to mirror that defensive pattern: check
"error is null" and throw or return the same behavior used in the generic
version before calling Failure(error), referencing the implicit operator for
Result<T>, the Error parameter, and the Failure(Error) factory to locate and
apply the same null-validation logic.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`:
- Around line 46-62: The current validation in SecurityExtensions.cs uses
anonymousLimits.GetValue<int?>(...) and authenticatedLimits.GetValue<int?>(...)
then compares anonMinute/anonHour and authMinute/authHour with <= 0, which
misses missing keys because GetValue returns null; update the checks to also
detect null (e.g., if (anonMinute is null || anonHour is null || anonMinute <= 0
|| anonHour <= 0) and similarly for authMinute/authHour) and add distinct error
messages when a required key is missing or non-positive so
anonymousLimits/authenticatedLimits blocks fail when RequestsPerMinute or
RequestsPerHour are absent or invalid.
- Around line 304-309: The bypass branch currently calls
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme) without
registering any handler, which causes UseAuthentication() to throw "no
authentication handler is registered for scheme 'Bearer'"; modify the bypass
branch in SecurityExtensions.cs to register a handler fallback (either add a
no-op JwtBearer handler or register a minimal handler via
AddAuthentication().AddScheme/ AddPolicyScheme) so the "Bearer" scheme resolves
in dev/test/CI; reference the existing call to
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme) and the
pipeline UseAuthentication() to ensure the chosen approach (no-op handler or
AddPolicyScheme with fallback) is wired where the branch currently sets the
scheme.
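The no-op handler option mentioned above could look roughly like this sketch against the standard ASP.NET Core authentication APIs (NoOpAuthenticationHandler is a hypothetical name, not the project's actual code):

```csharp
using System.Text.Encodings.Web;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authentication;
using Microsoft.Extensions.Logging;
using Microsoft.Extensions.Options;

// Hypothetical no-op handler: it authenticates nothing (NoResult), but gives
// the "Bearer" scheme a registered handler so UseAuthentication() does not
// throw "no authentication handler is registered" in test/CI environments.
public sealed class NoOpAuthenticationHandler(
    IOptionsMonitor<AuthenticationSchemeOptions> options,
    ILoggerFactory logger,
    UrlEncoder encoder)
    : AuthenticationHandler<AuthenticationSchemeOptions>(options, logger, encoder)
{
    protected override Task<AuthenticateResult> HandleAuthenticateAsync()
        => Task.FromResult(AuthenticateResult.NoResult());
}

// Registration in the bypass branch (sketch):
// services.AddAuthentication("Bearer")
//     .AddScheme<AuthenticationSchemeOptions, NoOpAuthenticationHandler>("Bearer", _ => { });
```

Since the handler returns NoResult rather than a ticket, unauthenticated requests still fail authorization; the scheme merely resolves instead of crashing the pipeline.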

---

Outside diff comments:
In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`:
- Around line 227-246: The block that handles AllowedOrigins.Contains("*") only
allows the wildcard under environment.IsDevelopment(), which means E2E/CI tests
that rely on the bypass still throw InvalidOperationException; change the check
to reuse the same bypass predicate used earlier (instead of just
environment.IsDevelopment()), i.e. allow the wildcard when the test/bypass
predicate used above returns true, keeping the AllowCredentials logic (use
policy.SetIsOriginAllowed(_ => true) when corsOptions.AllowCredentials;
otherwise policy.AllowAnyOrigin()), and only throw InvalidOperationException
when neither development nor the bypass predicate indicates a test environment.
References: corsOptions.AllowedOrigins, environment, policy,
SetIsOriginAllowed, AllowCredentials, AllowedOrigins.Contains("*").

---

Duplicate comments:
In `@src/Modules/Documents/Infrastructure/Extensions.cs`:
- Line 40: Change the calls to
MeAjudaAi.Shared.Utilities.EnvironmentHelpers.IsSecurityBypassEnvironment() to
accept and explicitly use an IHostEnvironment: update the public
AddDocumentsModule method (in Extensions.cs) to take IHostEnvironment as a
parameter and pass that instance to IsSecurityBypassEnvironment(hostEnvironment)
where it is currently called without a parameter (the lines with
IsSecurityBypassEnvironment and the check around lines 40 and 128), and also
update the AddDocumentsModule call in src/Modules/Documents/API/Extensions.cs to
supply the IHostEnvironment received by the host; follow the same pattern used
in src/Modules/SearchProviders/API/Extensions.cs.

---

Nitpick comments:
In @.github/scripts/generate-runsettings.sh:
- Around line 30-63: Add strict error handling to the script by enabling "set
-euo pipefail" at the top of the file, and make generate_runsettings more
robust: validate required parameters (e.g., "file") before use, ensure
"escape_xml" is present/used safely for
"exclude_filter"/"exclude_by_file"/"exclude_by_attr"/"include_filter", and after
the here-doc redirection that writes to "$file" verify the write succeeded (file
exists and is non-empty or check the redirection exit status) and return a
non-zero status from generate_runsettings on failure so callers/workflows fail
fast.
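The hardening steps described above (strict mode, parameter validation, escaped filters, checked writes) can be sketched as follows. This is a standalone illustration, not the repository's actual script, and the `escape_xml` body shown here is an assumed implementation:

```shell
#!/bin/bash
# Strict mode: exit on error, treat unset vars as errors, fail pipelines early.
set -euo pipefail

# Hypothetical XML escaper (the real escape_xml may differ).
# & must be escaped first so later replacements are not double-escaped.
escape_xml() {
    local s="$1"
    s=${s//&/&amp;}; s=${s//</&lt;}; s=${s//>/&gt;}
    s=${s//\"/&quot;}; s=${s//\'/&apos;}
    printf '%s' "$s"
}

generate_runsettings() {
    local file="${1:-}"
    # Fail fast when the required output path is missing.
    if [ -z "$file" ]; then
        echo "ERROR: output file path is required" >&2
        return 1
    fi
    local filter
    filter=$(escape_xml "${2:-}")
    cat > "$file" <<EOF
<RunSettings><Filter>$filter</Filter></RunSettings>
EOF
    # Verify the write actually produced a non-empty file.
    [ -s "$file" ] || { echo "ERROR: failed to write $file" >&2; return 1; }
}

generate_runsettings /tmp/demo.runsettings 'A&B<C>'
grep -q 'A&amp;B&lt;C&gt;' /tmp/demo.runsettings && echo "escaped ok"
```

With `set -euo pipefail` at the top, a failing `return 1` from `generate_runsettings` propagates and the calling workflow fails fast instead of continuing with a missing or empty runsettings file.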

In `@src/Contracts/Functional/Result.cs`:
- Around line 68-72: The generic Match<TResult> method on Result<T> does not
validate the onSuccess and onFailure parameters, causing a
NullReferenceException; align it with the Match overloads on the non-generic
Result by adding ArgumentNullException.ThrowIfNull(onSuccess) and
ArgumentNullException.ThrowIfNull(onFailure) calls at the start of
Match<TResult> (before evaluating IsSuccess) to guarantee a consistent
ArgumentNullException.
- Line 99: The implicit operator "public static implicit operator Result(Error
error) => Failure(error);" lacks the explicit null-check present in the generic
counterpart (the implicit operator for Result<T>), so update this operator to
mirror that defensive pattern: check "error is null" and throw or return the
same behavior used in the generic version before calling Failure(error),
referencing the implicit operator for Result<T>, the Error parameter, and the
Failure(Error) factory to locate and apply the same null-validation logic.

In
`@src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs`:
- Around line 50-51: The ErrorMessage string in NullDocumentIntelligenceService
("Azure AI Document Intelligence is not configured. Set
'Azure:DocumentIntelligence:Endpoint' and 'Azure:DocumentIntelligence:ApiKey' to
enable OCR.") should be translated to Portuguese if it is shown to end users;
update the ErrorMessage value in the NullDocumentIntelligenceService class to a
Portuguese sentence conveying the same information (e.g. mention que as chaves
Endpoint e ApiKey devem ser configuradas para habilitar OCR). If the message is
strictly for logs/operators and not user-facing, add a comment near the
ErrorMessage field in NullDocumentIntelligenceService clarifying it remains in
English for operational use.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 1c6f9c60-3cbb-4deb-ba25-1cf7ed6cb1c0

📥 Commits

Reviewing files that changed from the base of the PR and between 0b90a4e and 0096b3b.

📒 Files selected for processing (9)
  • .github/scripts/generate-runsettings.sh
  • .github/workflows/pr-validation.yml
  • src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs
  • src/Modules/Documents/Tests/Mocks/MockDocumentIntelligenceService.cs
  • src/Shared/Messaging/DeadLetter/RabbitMqDeadLetterService.cs
  • tests/MeAjudaAi.E2E.Tests/Base/TestContainerFixture.cs

…cluding unit and API tests, functional Result type, security configurations, and a runsettings generation script.
Copy link
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 4

🧹 Nitpick comments (2)
.github/scripts/generate-runsettings.sh (1)

30-86: Solid implementation with an atomic-write pattern.

Using a temporary file from mktemp followed by mv is good practice to avoid partial writes.

Optional suggestion: consider adding a trap to clean up the temporary file if an unexpected error occurs between mktemp and the final mv/rm. That would avoid accumulating orphaned temporary files in failure scenarios.

♻️ Suggested improvement with trap
 generate_runsettings() {
     local file="${1:-}"
     local exclude_filter="${2:-}"
     local exclude_by_file="${3:-}"
     local exclude_by_attr="${4:-}"
     local include_filter="${5:-}"

     # Validate required parameters
     if [ -z "$file" ]; then
         echo "❌ ERROR: Output file path is required as first parameter." >&2
         return 1
     fi

     echo "📝 Generating runsettings: $file" >&2

     # Escape XML special characters
     exclude_filter=$(escape_xml "$exclude_filter")
     exclude_by_file=$(escape_xml "$exclude_by_file")
     exclude_by_attr=$(escape_xml "$exclude_by_attr")

     # Use a temporary file to avoid partial writes if something fails
     local temp_file
     temp_file=$(mktemp)
+    trap 'rm -f "$temp_file"' EXIT

Note: since the script runs in ephemeral CI environments and the system cleans /tmp periodically, this is only a nice-to-have improvement.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/scripts/generate-runsettings.sh around lines 30 - 86, The
generate_runsettings function creates a temp file with mktemp then mv's it into
place but lacks a cleanup trap for errors; add a trap that removes the temp_file
on EXIT (or on ERR/INT) so orphaned temp files are deleted if the function exits
early, ensure the trap is installed immediately after temp_file=$(mktemp) and
cleared (or left to noop) after the successful mv, and reference the temp_file,
mktemp, mv, and generate_runsettings identifiers when making the change.
src/Modules/Documents/Tests/API/ModuleExtensionsTests.cs (1)

32-80: The new fail-fast branch is not covered.

The AddDocumentsModule tests here only go through Development, which falls into the bypass. The most sensitive new behavior is in the non-bypass path of src/Modules/Documents/Infrastructure/Extensions.cs:125-141; without a test for Production with no Azure credentials, a regression in that validation goes unnoticed.

💡 Example test
+[Fact]
+public void AddDocumentsModule_InProductionWithoutAzureConfiguration_ShouldThrow()
+{
+    var services = new ServiceCollection();
+    var mockEnv = new Mock<IHostEnvironment>();
+    mockEnv.Setup(e => e.EnvironmentName).Returns(Environments.Production);
+
+    var act = () => services.AddDocumentsModule(_configuration, mockEnv.Object);
+
+    act.Should().Throw<InvalidOperationException>()
+        .WithMessage("*IBlobStorageService is not configured*");
+}
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Tests/API/ModuleExtensionsTests.cs` around lines 32 -
80, Add a test that covers the new fail-fast branch in the AddDocumentsModule
extension: call AddDocumentsModule (the extension implemented in
src/Modules/Documents/Infrastructure/Extensions.cs around the fail-fast logic at
lines ~125-141) with a mocked IHostEnvironment where EnvironmentName is
"Production" and with no Azure credentials configured, and assert that the call
fails fast (i.e., throws an exception) instead of succeeding; implement this as
a new Fact in ModuleExtensionsTests.cs that sets mockEnv.Setup(e =>
e.EnvironmentName).Returns("Production") and verifies AddDocumentsModule throws
(use Assert.ThrowsAny<Exception> or an equivalent FluentAssertions assertion).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 18-25: The properties do not reflect actual nullability: change
Error to Error? in both Result and Result<T>, mark Value in Result<T> with
[MaybeNull], and update the MemberNotNullWhen attributes on IsSuccess/IsFailure
to match actual semantics (e.g., when IsFailure is true Error is non-null, when
IsSuccess is true Value is non-null). Update Result<T>.Success(...) and
Result<T>.Failure(...) constructors to assign proper nullable-safe values
instead of using null!/default! and adjust the [JsonConstructor] parameter types
and their validation checks to accept nullable Error? / [MaybeNull] Value and
validate accordingly so flow analysis and runtime behavior align (update
symbols: IsSuccess, IsFailure, Value, Error, Result<T>.Success,
Result<T>.Failure, and the [JsonConstructor] overloads).

In
`@src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs`:
- Around line 45-52: The value returned in OcrResult.ErrorMessage in
NullDocumentIntelligenceService is in English and contains technical
configuration instructions that may reach the end user via
DocumentVerificationJob; replace that text with a Portuguese message without
configuration instructions (for example a short, friendly sentence such as "Não
foi possível processar o documento no momento, tente novamente mais tarde."),
keeping any technical detail only in the logs/warnings that already exist in
NullDocumentIntelligenceService or at the call site of CreateOcrResult; change
only the content of ErrorMessage (keeping Success: false and the other fields)
to avoid leaking operational instructions to the user.
- Around line 36-38: In the NullDocumentIntelligenceService class, do not
include the raw blobUrl value in the ArgumentException; it can leak SAS/query
params. Change the code that throws the exception to use a generic message or a
sanitized value (for example only Uri.Host or Uri.GetLeftPart(UriPartial.Path))
instead of interpolating blobUrl directly, keeping the Uri.TryCreate validation
on the blobUrl parameter.

In `@src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs`:
- Around line 138-140: Replace the mocked environment name "Test" with "Testing"
so EnvironmentHelpers.IsSecurityBypassEnvironment recognizes it and prevents
fail-fast credential checks; update the Mock<IHostEnvironment> setup
(mockEnv.Setup(e => e.EnvironmentName).Returns("Testing")) used before calling
AddDocumentsModule(_testConfiguration, mockEnv.Object) to ensure the test runs
without throwing InvalidOperationException due to missing Azure credentials.

---

Nitpick comments:
In @.github/scripts/generate-runsettings.sh:
- Around line 30-86: The generate_runsettings function creates a temp file with
mktemp then mv's it into place but lacks a cleanup trap for errors; add a trap
that removes the temp_file on EXIT (or on ERR/INT) so orphaned temp files are
deleted if the function exits early, ensure the trap is installed immediately
after temp_file=$(mktemp) and cleared (or left to noop) after the successful mv,
and reference the temp_file, mktemp, mv, and generate_runsettings identifiers
when making the change.

In `@src/Modules/Documents/Tests/API/ModuleExtensionsTests.cs`:
- Around line 32-80: Add a test that covers the new fail-fast branch in the
AddDocumentsModule extension: call AddDocumentsModule (the extension implemented
in src/Modules/Documents/Infrastructure/Extensions.cs around the fail-fast logic
at lines ~125-141) with a mocked IHostEnvironment where EnvironmentName is
"Production" and with no Azure credentials configured, and assert that the call
fails fast (i.e., throws an exception) instead of succeeding; implement this as
a new Fact in ModuleExtensionsTests.cs that sets mockEnv.Setup(e =>
e.EnvironmentName).Returns("Production") and verifies AddDocumentsModule throws
(use Assert.ThrowsAny<Exception> or an equivalent FluentAssertions assertion).

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: afe095b7-4e7d-4a57-9f0d-c4cff470b097

📥 Commits

Reviewing files that changed from the base of the PR and between 0096b3b and 7831eee.

📒 Files selected for processing (9)
  • .github/scripts/generate-runsettings.sh
  • src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs
  • src/Bootstrapper/MeAjudaAi.ApiService/Program.cs
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/API/Extensions.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs
  • src/Modules/Documents/Tests/API/ModuleExtensionsTests.cs
  • src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs

Filipe Frigini added 3 commits March 12, 2026 19:04
…ullDocumentIntelligenceService, and a Coverlet runsettings generation script.
…ation` covering CORS, Keycloak, and HTTPS redirection rules.
…elligenceService, add unit tests for module extensions and the null service, and configure code coverage.
@frigini
Copy link
Owner Author

frigini commented Mar 13, 2026

@coderabbitai review

@coderabbitai
Copy link
Contributor

coderabbitai bot commented Mar 13, 2026

@frigini Sure, triggering a review of PR #153 now!


🧠 Learnings used
Learnt from: frigini
Repo: frigini/MeAjudaAi PR: 0
File: :0-0
Timestamp: 2026-01-16T21:50:23.593Z
Learning: For frigini/MeAjudaAi Web reviews, prefer commit-by-commit (newest-first) with concise verdicts and concrete follow-ups; prioritize reuse of Shared.Contracts for enums/constants to keep Web aligned with backend/shared code.
✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

@coderabbitai
Copy link
Contributor

coderabbitai bot commented Mar 13, 2026

✅ Actions performed

Review triggered.

Note: CodeRabbit is an incremental review system and does not re-review already reviewed commits. This command is applicable only when automatic reviews are paused.

Copy link
Contributor

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

♻️ Duplicate comments (1)
src/Contracts/Functional/Result.cs (1)

18-24: ⚠️ Potential issue | 🟠 Major

The nullability contract of Value is inconsistent with the allowed behavior.

On Lines 18-24, IsSuccess/IsFailure tell the compiler that Value is non-null on the success path. However, Lines 31-32 and 49-53 accept/materialize success with value = null, and Lines 65-70 also allow it via Success(value)/implicit conversion. This can create false flow-analysis confidence and NREs at call sites that rely solely on IsSuccess.

💡 Minimal suggested adjustment (aligns the contract without breaking `Result.Success(null)`)
-    [MemberNotNullWhen(true, nameof(Value))]
     public bool IsSuccess { get; }

     [MemberNotNullWhen(true, nameof(Error))]
-    [MemberNotNullWhen(false, nameof(Value))]
     public bool IsFailure => !IsSuccess;
#!/bin/bash
set -euo pipefail

echo "== 1) Explicit Success(null) calls =="
rg -n --type cs 'Result<[^>]+>\.Success\s*\(\s*null!?'

echo
echo "== 2) Potentially null implicit conversions to Result<T> =="
rg -n --type cs '=\s*null!?\s*;' src
rg -n --type cs 'Result<[^>]+>\s+\w+\s*=\s*null!?' src

echo
echo "== 3) IsSuccess guards followed by Value usage (potential reliance on the attribute) =="
rg -nPU --type cs '(?s)\bif\s*\(\s*[^)]*\.IsSuccess\s*\)\s*\{[^}]{0,300}\.Value\b' src

Also applies to: 31-32, 49-53, 65-70

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Contracts/Functional/Result.cs` around lines 18 - 24, Nullability
attributes on IsSuccess/IsFailure claim that Value is non-null when
IsSuccess==true, but the implementation and usages allow Success(null); remove
(or stop applying) the MemberNotNullWhen(..., nameof(Value)) attributes on
IsSuccess and IsFailure so they do not falsely guarantee that Value is non-null,
keeping only the attributes that refer to Error; locate the IsSuccess and
IsFailure property declarations (and their MemberNotNullWhen attributes) and
remove the references to nameof(Value) to align the contract without forbidding
Result<T?>.Success(null).
🧹 Nitpick comments (3)
.github/scripts/generate-runsettings.sh (1)

50-53: Use single quotes in the trap to avoid premature expansion.

The $temp_file variable is expanded when the trap is defined, not when it runs. Although it happens to work in this specific case, the single-quote pattern is more robust and conventional.

♻️ Proposed fix
     local temp_file
     temp_file=$(mktemp)
-    trap "rm -f '$temp_file'" EXIT
+    trap 'rm -f "$temp_file"' EXIT
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.github/scripts/generate-runsettings.sh around lines 50 - 53, The trap line
defines its command with double quotes, which causes $temp_file to be expanded
immediately; change the trap definition (references: temp_file, mktemp, trap) to
use single quotes around the trap command so that $temp_file is only expanded
when the trap executes (that is, invert the quoting strategy to prevent
premature expansion of the temporary file name).
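The deferred expansion that single quotes buy can be demonstrated standalone (an assumed demo, unrelated to the repository script):

```shell
#!/bin/bash
# The function body runs in a subshell (note the parentheses), so its
# EXIT trap fires as soon as the function returns.
make_temp_and_clean() (
    local temp_file
    temp_file=$(mktemp)
    # Single quotes: "$temp_file" is expanded when the trap fires, not
    # when it is declared, so cleanup sees the real path. With double
    # quotes the trap would freeze whatever value existed at declaration.
    trap 'rm -f "$temp_file"' EXIT
    printf '%s\n' "$temp_file"   # report the path so the caller can check
)

path=$(make_temp_and_clean)
[ ! -e "$path" ] && echo "temp file was cleaned up"
```

The final check passes because the subshell's EXIT trap removed the file before the caller inspected the reported path.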
src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs (1)

128-178: Duplicated tests: consider consolidating or differentiating them.

The methods UseDocumentsModule_InTestEnvironment_ShouldSkipMigrations (lines 128-152) and UseDocumentsModule_InTestingEnvironment_ShouldSkipMigrations (lines 154-178) are functionally identical; both set EnvironmentName = "Testing" and validate the same behavior.

If the intent was to test different environment values (for example, "Test" vs "Testing"), the first test should use a different value. Otherwise, one of the tests is redundant and can be removed.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs` around lines 128 -
178, Two unit tests, UseDocumentsModule_InTestEnvironment_ShouldSkipMigrations
and UseDocumentsModule_InTestingEnvironment_ShouldSkipMigrations, are identical;
either remove the redundant test or make them assert different scenarios. Fix by
either deleting one of the test methods (keep the clearer-named one) or change
one test to use a different environment value (e.g., set mockEnv.Setup(e =>
e.EnvironmentName).Returns("Test") in
UseDocumentsModule_InTestEnvironment_ShouldSkipMigrations) and adjust the
assertion if needed so each test covers a unique case; ensure you update or
remove the duplicate setup of IWebHostEnvironment/IHostEnvironment accordingly.
src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs (1)

45-50: Align the SAS token validation between the null service and the real service.

NullDocumentIntelligenceService rejects URLs containing SAS tokens (sig= or SharedAccessSignature), but AzureDocumentIntelligenceService accepts those URLs without validation. This creates a behavioral inconsistency: URLs with SAS work in production but fail in dev/test.

Recommendations:

  • Remove the SAS validation from NullDocumentIntelligenceService to keep behavior consistent, or
  • Add the same validation to AzureDocumentIntelligenceService if the rejection is intentional, or
  • Explicitly document this difference in the null class's XML summary
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In
`@src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs`
around lines 45 - 50, Remove the SAS-token rejection in
NullDocumentIntelligenceService by deleting the if block that checks
uri.Query.Contains("sig=") || uri.Query.Contains("SharedAccessSignature") and
the subsequent ArgumentException throw so its behavior matches
AzureDocumentIntelligenceService; if the intention is to reject SAS URLs
instead, add the same query check and throw to AzureDocumentIntelligenceService
(matching the sanitizedUrl message and argument name), or alternatively add an
XML summary to NullDocumentIntelligenceService explicitly documenting this
behavioral difference.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs`:
- Line 330: In the block inside the SecurityExtensions class containing the
English comment "No-op - bypass real authentication logic", replace the inline
comment with Portuguese to keep the file standardized; for example, change it to
"Sem operação — ignora a lógica real de autenticação" in the same spot (keeping
the style and accentuation consistent with the file's other comments).

In
`@src/Modules/Documents/Tests/Unit/Infrastructure/Services/NullDocumentIntelligenceServiceTests.cs`:
- Around line 51-61: The test
AnalyzeDocumentAsync_WhenUrlIsInvalidFormat_ShouldThrowArgumentException is
asserting the raw URL appears in the exception message; update its assertion to
avoid checking for the raw blob URL (since the service will stop embedding
user-supplied URLs). Modify the FluentAssertions message check (the .WithMessage
call) to assert a generic message like "Invalid blob URL format*" or otherwise
match only the fixed prefix (e.g., .WithMessage("Invalid blob URL format*") or
.WithMessageStartingWith("Invalid blob URL format")) so the test no longer
expects the raw "invalid-url" string when calling _service.AnalyzeDocumentAsync.

---

Duplicate comments:
In `@src/Contracts/Functional/Result.cs`:
- Around line 18-24: Nullability attributes on IsSuccess/IsFailure claim that
Value is non-null when IsSuccess==true, but the implementation and usages allow
Success(null); remove (or stop applying) the MemberNotNullWhen(...,
nameof(Value)) attributes on IsSuccess and IsFailure so they do not falsely
guarantee that Value is non-null, keeping only the attributes that refer to
Error; locate the IsSuccess and IsFailure property declarations (and their
MemberNotNullWhen attributes) and remove the references to nameof(Value) to
align the contract without forbidding Result<T?>.Success(null).

---

Nitpick comments:
In @.github/scripts/generate-runsettings.sh:
- Around line 50-53: The trap line defines its command with double quotes, which
causes $temp_file to be expanded immediately; change the trap definition
(references: temp_file, mktemp, trap) to use single quotes around the trap
command so that $temp_file is only expanded when the trap executes (that is,
invert the quoting strategy to prevent premature expansion of the temporary file
name).

In
`@src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs`:
- Around line 45-50: Remove the SAS-token rejection in
NullDocumentIntelligenceService by deleting the if block that checks
uri.Query.Contains("sig=") || uri.Query.Contains("SharedAccessSignature") and
the subsequent ArgumentException throw so its behavior matches
AzureDocumentIntelligenceService; if the intention is to reject SAS URLs
instead, add the same query check and throw to AzureDocumentIntelligenceService
(matching the sanitizedUrl message and argument name), or alternatively add an
XML summary to NullDocumentIntelligenceService explicitly documenting this
behavioral difference.

In `@src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs`:
- Around line 128-178: Two unit tests,
UseDocumentsModule_InTestEnvironment_ShouldSkipMigrations and
UseDocumentsModule_InTestingEnvironment_ShouldSkipMigrations, are identical;
either remove the redundant test or make them assert different scenarios. Fix by
either deleting one of the test methods (keep the clearer-named one) or change
one test to use a different environment value (e.g., set mockEnv.Setup(e =>
e.EnvironmentName).Returns("Test") in
UseDocumentsModule_InTestEnvironment_ShouldSkipMigrations) and adjust the
assertion if needed so each test covers a unique case; ensure you update or
remove the duplicate setup of IWebHostEnvironment/IHostEnvironment accordingly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: abe22cf7-65c4-46d8-9a5d-07ee7f5136d2

📥 Commits

Reviewing files that changed from the base of the PR and between 0096b3b and 286867d.

📒 Files selected for processing (14)
  • .github/scripts/generate-runsettings.sh
  • coverlet.runsettings
  • src/Bootstrapper/MeAjudaAi.ApiService/Extensions/SecurityExtensions.cs
  • src/Bootstrapper/MeAjudaAi.ApiService/Program.cs
  • src/Contracts/Functional/Result.cs
  • src/Modules/Documents/API/Extensions.cs
  • src/Modules/Documents/Infrastructure/Extensions.cs
  • src/Modules/Documents/Infrastructure/MeAjudaAi.Modules.Documents.Infrastructure.csproj
  • src/Modules/Documents/Infrastructure/Services/NullDocumentIntelligenceService.cs
  • src/Modules/Documents/Tests/API/ModuleExtensionsTests.cs
  • src/Modules/Documents/Tests/Unit/API/ExtensionsTests.cs
  • src/Modules/Documents/Tests/Unit/Infrastructure/Services/NullDocumentIntelligenceServiceTests.cs
  • tests/MeAjudaAi.ApiService.Tests/Unit/Extensions/SecurityExtensionsTests.cs
  • tests/MeAjudaAi.Shared.Tests/Unit/Functional/ResultTests.cs
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/Modules/Documents/API/Extensions.cs
  • src/Bootstrapper/MeAjudaAi.ApiService/Program.cs

Filipe Frigini added 2 commits March 13, 2026 09:06
…ntication, authorization, CORS, and rate limiting, alongside a functional result type, a null document intelligence service, and related tests and a runsettings script.
…caping and configurable exclusion filters for code coverage.
@github-actions
Copy link

📊 Code Coverage Report

Coverage: 90.31% (extracted from Cobertura (Aggregated Direct))

📈 Coverage Details

  • Coverage badges: Coverage
  • Minimum threshold: 80% (warning) / 90% (good)
  • Report format: Auto-detected from OpenCover/Cobertura XML files
  • Coverage source: Cobertura (Aggregated Direct)

📋 Coverage Analysis

  • Line Coverage: Shows percentage of code lines executed during tests
  • Branch Coverage: Shows percentage of code branches/conditions tested
  • Complexity: Code complexity metrics for maintainability

🎯 Quality Gates

  • Pass: Coverage ≥ 90%
  • ⚠️ Warning: Coverage 80-89%
  • Fail: Coverage < 80%

📁 Artifacts

  • Coverage reports: Available in workflow artifacts
  • Test results: TRX files with detailed test execution data

This comment is updated automatically on each push to track coverage trends.

@frigini frigini merged commit 9ae9e55 into master Mar 13, 2026
11 checks passed
@frigini frigini deleted the feature/sprint-8b2-monorepo-cleanup branch March 13, 2026 13:10

1 participant