diff --git a/cli/.gitignore b/cli/.gitignore
index 3c93cf2269..c802965c78 100644
--- a/cli/.gitignore
+++ b/cli/.gitignore
@@ -4,4 +4,5 @@ out
 node_modules
 .env
 .eslintcache
-e2e/config.json
\ No newline at end of file
+e2e/config.json
+coverage
diff --git a/cli/README.md b/cli/README.md
index 29ff0cb8fe..ab9ce176cf 100644
--- a/cli/README.md
+++ b/cli/README.md
@@ -12,11 +12,11 @@ With `wgc`, you can:
-* Create and manage **federated GraphQL APIs** and **subgraphs**
-* Perform **schema checks** and **composition validations**
-* Generate and deploy **router configurations**
-* Integrate with **CI/CD pipelines** for automated workflows
-* Manage **namespaces**, **API keys**, and more
+- Create and manage **federated GraphQL APIs** and **subgraphs**
+- Perform **schema checks** and **composition validations**
+- Generate and deploy **router configurations**
+- Integrate with **CI/CD pipelines** for automated workflows
+- Manage **namespaces**, **API keys**, and more
 
 Whether you're building monolithic or federated GraphQL architectures, `wgc` provides the tools to manage your development and deployment processes.
@@ -24,12 +24,12 @@ Whether you're building monolithic or federated GraphQL architectures, `wgc` pro
 
 ## 🧰 Cosmo Features
 
-* **Federation Support**: Compatible with GraphQL Federation v1 and v2
-* **Schema Registry**: Centralized management of your GraphQL schemas with versioning and change tracking
-* **Composition Checks**: Automated validation to ensure subgraphs compose correctly without breaking changes
-* **Router Configuration**: Generate and manage router configurations for efficient query planning and execution
-* **Observability**: Integrated with OpenTelemetry and Prometheus for metrics, tracing, and monitoring
-* **Access Control**: Fine-grained access controls with support for OIDC, RBAC, and SCIM
+- **Federation Support**: Compatible with GraphQL Federation v1 and v2
+- **Schema Registry**: Centralized management of your GraphQL schemas with versioning and change tracking
+- **Composition Checks**: Automated validation to ensure subgraphs compose correctly without breaking changes
+- **Router Configuration**: Generate and manage router configurations for efficient query planning and execution
+- **Observability**: Integrated with OpenTelemetry and Prometheus for metrics, tracing, and monitoring
+- **Access Control**: Fine-grained access controls with support for OIDC, RBAC, and SCIM
 
 ---
 
@@ -37,7 +37,7 @@ Whether you're building monolithic or federated GraphQL architectures, `wgc` pro
 
 ### Prerequisites
 
-* [Node.js](https://nodejs.org/) v20 LTS or higher
+- [Node.js](https://nodejs.org/) v20 LTS or higher
 
 ### Install via npm
 
@@ -83,8 +83,8 @@ chmod +x start-subgraphs.sh
 
 Verify the subgraphs are running:
 
-* [Posts Subgraph](http://localhost:4001/graphql)
-* [Users Subgraph](http://localhost:4002/graphql)
+- [Posts Subgraph](http://localhost:4001/graphql)
+- [Users Subgraph](http://localhost:4002/graphql)
 
 ### 4. Generate Router Configuration
@@ -137,9 +137,9 @@ query {
 
 ## 📚 Documentation
 
-* **CLI Reference**: [https://cosmo-docs.wundergraph.com/cli](https://cosmo-docs.wundergraph.com/cli)
-* **Zero to Federation Tutorial**: [https://cosmo-docs.wundergraph.com/tutorial/from-zero-to-federation-in-5-steps-using-cosmo](https://cosmo-docs.wundergraph.com/tutorial/from-zero-to-federation-in-5-steps-using-cosmo)
-* **Full Documentation**: [https://cosmo-docs.wundergraph.com/](https://cosmo-docs.wundergraph.com/)
+- **CLI Reference**: [https://cosmo-docs.wundergraph.com/cli](https://cosmo-docs.wundergraph.com/cli)
+- **Zero to Federation Tutorial**: [https://cosmo-docs.wundergraph.com/tutorial/from-zero-to-federation-in-5-steps-using-cosmo](https://cosmo-docs.wundergraph.com/tutorial/from-zero-to-federation-in-5-steps-using-cosmo)
+- **Full Documentation**: [https://cosmo-docs.wundergraph.com/](https://cosmo-docs.wundergraph.com/)
 
 ---
 
@@ -147,11 +147,11 @@ query {
 
 WunderGraph Cosmo is a comprehensive, open-source platform for managing GraphQL APIs at scale. It offers:
 
-* **Schema Registry**: Centralized schema management with versioning and validation
-* **Cosmo Studio**: A web interface for exploring schemas, monitoring performance, and managing access
-* **Cosmo Router**: A high-performance, Go-based router supporting federation, subscriptions, and more
-* **Observability**: Built-in support for OpenTelemetry and Prometheus
-* **Security**: Fine-grained access controls with OIDC, RBAC, and SCIM support
+- **Schema Registry**: Centralized schema management with versioning and validation
+- **Cosmo Studio**: A web interface for exploring schemas, monitoring performance, and managing access
+- **Cosmo Router**: A high-performance, Go-based router supporting federation, subscriptions, and more
+- **Observability**: Built-in support for OpenTelemetry and Prometheus
+- **Security**: Fine-grained access controls with OIDC, RBAC, and SCIM support
 
 Cosmo can be deployed on-premises, in the cloud, or used as a managed service.
 
@@ -159,31 +159,31 @@ Cosmo can be deployed on-premises, in the cloud, or used as a managed service.
 ## 🧪 Example Commands
 
-* **Create Namespace**:
+- **Create Namespace**:
 
 ```bash
 npx wgc namespace create production
 ```
 
-* **Create Federated Graph**:
+- **Create Federated Graph**:
 
 ```bash
 npx wgc federated-graph create main -r http://router.example.com/graphql -n production
 ```
 
-* **Create Subgraph**:
+- **Create Subgraph**:
 
 ```bash
 npx wgc subgraph create products --routing-url http://localhost:4001/graphql
 ```
 
-* **Check Subgraph Schema Changes**:
+- **Check Subgraph Schema Changes**:
 
 ```bash
 npx wgc subgraph check products -n production --schema ./schemas/products.graphql
 ```
 
-* **Generate Router Configuration locally**:
+- **Generate Router Configuration locally**:
 
 Composition Configuration (graph.yaml):
 
@@ -202,7 +202,7 @@ Generate CMD:
 npx wgc router compose -i graph.yaml -o config.json
 ```
 
-* **Run Router**:
+- **Run Router**:
 
 ```bash
 docker run \
@@ -223,9 +223,9 @@ docker run \
 
 ## 🔗 Related Projects
 
-* **Cosmo Demo**: [https://github.com/wundergraph/cosmo-demo](https://github.com/wundergraph/cosmo-demo)
-* **Cosmo GitHub Repository**: [https://github.com/wundergraph/cosmo](https://github.com/wundergraph/cosmo)
-* **WunderGraph Website**: [https://wundergraph.com](https://wundergraph.com)
+- **Cosmo Demo**: [https://github.com/wundergraph/cosmo-demo](https://github.com/wundergraph/cosmo-demo)
+- **Cosmo GitHub Repository**: [https://github.com/wundergraph/cosmo](https://github.com/wundergraph/cosmo)
+- **WunderGraph Website**: [https://wundergraph.com](https://wundergraph.com)
 
 ---
 
@@ -271,8 +271,8 @@ This project is licensed under the [Apache 2.0 License](https://github.com/wunde
 
 ## 📬 Support & Community
 
-* **Discord**: Join our [Discord community](https://wundergraph.com/discord) for support and discussions
-* **GitHub Issues**: Report issues or request features on our [GitHub repository](https://github.com/wundergraph/cosmo/issues)
+- **Discord**: Join our [Discord community](https://wundergraph.com/discord) for support and discussions
+- **GitHub Issues**: Report issues or request features on our [GitHub repository](https://github.com/wundergraph/cosmo/issues)
 
 ---
 
diff --git a/cli/package.json b/cli/package.json
index 6629a77194..048973b7de 100644
--- a/cli/package.json
+++ b/cli/package.json
@@ -29,7 +29,7 @@
     "coverage": "vitest run test/ --coverage",
     "lint": "eslint --cache --ext .ts,.mjs,.cjs . && prettier -c src",
     "lint:fix": "eslint --cache --fix --ext .ts,.mjs,.cjs . && pnpm format",
-    "format": "prettier -w -c .",
+    "format": "prettier -w .",
     "e2e": "bun test e2e/"
   },
   "keywords": [
@@ -104,7 +104,6 @@
     "eslint": "8.57.1",
     "eslint-config-unjs": "0.2.1",
     "eslint-plugin-require-extensions": "0.1.3",
-    "prettier": "3.5.2",
     "tsx": "4.19.4",
     "typescript": "5.5.2",
     "vitest": "3.2.4"
diff --git a/cli/src/commands/graph/monograph/commands/check.ts b/cli/src/commands/graph/monograph/commands/check.ts
index 4586c24807..98e173a38f 100644
--- a/cli/src/commands/graph/monograph/commands/check.ts
+++ b/cli/src/commands/graph/monograph/commands/check.ts
@@ -21,8 +21,11 @@ export default (opts: BaseCommandOptions) => {
     'This will skip checking for client traffic and any breaking change will fail the run.',
   );
   command.option('-l, --limit [number]', 'The amount of entries shown in the schema checks output.', '50');
+  command.option('-j, --json', 'Prints to the console in json format instead of table');
+  command.option('-o, --out [string]', 'Destination file for the json output.');
 
   command.action(async (name, options) => {
+    let outFile;
     const schemaFile = resolve(options.schema);
 
     if (!existsSync(schemaFile)) {
@@ -34,6 +37,10 @@ export default (opts: BaseCommandOptions) => {
       return;
     }
 
+    if (options.out) {
+      outFile = resolve(options.out);
+    }
+
     const limit = Number(options.limit);
     if (Number.isNaN(limit) || limit <= 0 || limit > limitMaxValue) {
       program.error(
@@ -79,7 +86,12 @@ export default (opts: BaseCommandOptions) => {
       },
     );
 
-    const success = handleCheckResult(resp, limit);
+    const success = await handleCheckResult({
+      response: resp,
+      rowLimit: limit,
+      shouldOutputJson: options.json || !!outFile,
+      outFile,
+    });
 
     if (!success && !ignoreErrorsDueToGitHubIntegration) {
       process.exitCode = 1;
diff --git a/cli/src/commands/subgraph/commands/check.ts b/cli/src/commands/subgraph/commands/check.ts
index 1866df7e38..3f796c23de 100644
--- a/cli/src/commands/subgraph/commands/check.ts
+++ b/cli/src/commands/subgraph/commands/check.ts
@@ -33,9 +33,12 @@ export default (opts: BaseCommandOptions) => {
     'This flag will disable the validation for whether all nodes of the federated graph are resolvable. Do NOT use unless troubleshooting.',
   );
   command.option('-l, --limit [number]', 'The amount of entries shown in the schema checks output.', '50');
+  command.option('-j, --json', 'Prints to the console in json format instead of table');
+  command.option('-o, --out [string]', 'Destination file for the json output.');
 
   command.action(async (name, options) => {
     let schemaFile;
+    let outFile;
 
     if (!options.schema && !options.delete) {
       program.error("required option '--schema ' or '--delete' not specified.");
@@ -52,6 +55,10 @@ export default (opts: BaseCommandOptions) => {
       }
     }
 
+    if (options.out) {
+      outFile = resolve(options.out);
+    }
+
     const limit = Number(options.limit);
     if (Number.isNaN(limit) || limit <= 0 || limit > limitMaxValue) {
       program.error(
@@ -91,7 +98,12 @@ export default (opts: BaseCommandOptions) => {
       },
     );
 
-    const success = handleCheckResult(resp, limit);
+    const success = await handleCheckResult({
+      response: resp,
+      rowLimit: limit,
+      shouldOutputJson: options.json || !!outFile,
+      outFile,
+    });
 
     if (!success && !ignoreErrorsDueToGitHubIntegration) {
       process.exitCode = 1;
diff --git a/cli/src/handle-check-result.ts b/cli/src/handle-check-result.ts
index 165305af92..edb42aa17b 100644
--- a/cli/src/handle-check-result.ts
+++ b/cli/src/handle-check-result.ts
@@ -1,16 +1,141 @@
 import { EnumStatusCode } from '@wundergraph/cosmo-connect/dist/common/common_pb';
-import { CheckSubgraphSchemaResponse } from '@wundergraph/cosmo-connect/dist/platform/v1/platform_pb';
+import type {
+  CheckSubgraphSchemaResponse,
+  CheckOperationUsageStats,
+} from '@wundergraph/cosmo-connect/dist/platform/v1/platform_pb';
 import Table from 'cli-table3';
 import { program } from 'commander';
 import logSymbols from 'log-symbols';
 import pc from 'picocolors';
 import { config } from './core/config.js';
+import { JsonCheckSchemaOutputBuilder } from './json-check-schema-output-builder.js';
+
+// operationUsageStats is required - caller must guard with `if (response.operationUsageStats)` before calling
+const handleTrafficCheck = (
+  response: CheckSubgraphSchemaResponse,
+  operationUsageStats: CheckOperationUsageStats,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): { success: boolean; finalStatement: string } => {
+  const {
+    clientTrafficCheckSkipped,
+    compositionErrors,
+    lintErrors,
+    graphPruneErrors,
+    breakingChanges,
+    composedSchemaBreakingChanges,
+  } = response;
+  const { totalOperations, safeOperations, firstSeenAt, lastSeenAt } = operationUsageStats;
+
+  if (totalOperations === 0 && !clientTrafficCheckSkipped) {
+    // Composition errors are still considered failures, otherwise we can consider this a success
+    // because no operations were affected by the change
+    const success = compositionErrors.length === 0 && lintErrors.length === 0 && graphPruneErrors.length === 0;
+    const message = 'No operations were affected by this schema change.';
+    jsonBuilder.setTraffic(message);
+    if (!shouldOutputJson) {
+      console.log(message);
+    }
+    return { success, finalStatement: `This schema change didn't affect any operations from existing client traffic.` };
+  }
+
+  if (totalOperations === safeOperations && !clientTrafficCheckSkipped) {
+    // This is also a success because changes to these operations were marked as safe
+    const success = compositionErrors.length === 0 && lintErrors.length === 0 && graphPruneErrors.length === 0;
+    const message = `${totalOperations} operations were considered safe due to overrides.`;
+    jsonBuilder.setTraffic(message);
+    if (!shouldOutputJson) {
+      console.log(message);
+    }
+    return { success, finalStatement: `This schema change affected operations with safe overrides.` };
+  }
 
-export const handleCheckResult = (resp: CheckSubgraphSchemaResponse, rowLimit: number) => {
+  // Composition and breaking errors are considered failures because operations were affected
+  const success =
+    breakingChanges.length === 0 &&
+    composedSchemaBreakingChanges.length === 0 &&
+    compositionErrors.length === 0 &&
+    lintErrors.length === 0 &&
+    graphPruneErrors.length === 0;
+  let finalStatement = '';
+
+  const totalBreakingChanges = breakingChanges.length + composedSchemaBreakingChanges.length;
+
+  if (breakingChanges.length > 0 || composedSchemaBreakingChanges.length > 0) {
+    jsonBuilder.addBreakingChanges(breakingChanges);
+
+    const warningMessage = [logSymbols.warning, ` Found ${pc.bold(totalBreakingChanges)} breaking changes.`];
+    const jsonMessage = [`Found ${totalBreakingChanges} breaking changes.`];
+    if (totalOperations > 0) {
+      warningMessage.push(`${pc.bold(totalOperations - safeOperations)} operations impacted.`);
+      jsonMessage.push(`${totalOperations - safeOperations} operations impacted.`);
+    }
+    if (safeOperations > 0) {
+      warningMessage.push(`In addition, ${safeOperations} operations marked safe due to overrides.`);
+      jsonMessage.push(`In addition, ${safeOperations} operations marked safe due to overrides.`);
+    }
+    if (!clientTrafficCheckSkipped) {
+      warningMessage.push(
+        `\nFound client activity between ${pc.underline(new Date(firstSeenAt).toLocaleString())} and ${pc.underline(new Date(lastSeenAt).toLocaleString())}.`,
+      );
+      jsonMessage.push(
+        `Found client activity between ${new Date(firstSeenAt).toLocaleString()} and ${new Date(lastSeenAt).toLocaleString()}.`,
+      );
+      jsonBuilder.setTraffic(jsonMessage.join(' '));
+      jsonBuilder.setOperationUsageStats(operationUsageStats);
+    }
+    if (!shouldOutputJson) {
+      console.log(warningMessage.join(''));
+    }
+
+    finalStatement = `This check has encountered ${pc.bold(`${totalBreakingChanges}`)} breaking changes${
+      clientTrafficCheckSkipped ? `.` : ` that would break operations from existing client traffic.`
+    }`;
+  }
+
+  return { success, finalStatement };
+};
+
+const handleSchemaChanges = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  if (response.breakingChanges.length > 0) {
+    jsonBuilder.addBreakingChanges(response.breakingChanges);
+  }
+  if (response.nonBreakingChanges.length > 0) {
+    jsonBuilder.addNonBreakingChanges(response.nonBreakingChanges);
+  }
+
+  if (shouldOutputJson) {
+    return;
+  }
+
+  console.log('\nDetected the following subgraph schema changes:');
   const changesTable = new Table({
     head: [pc.bold(pc.white('CHANGE')), pc.bold(pc.white('TYPE')), pc.bold(pc.white('DESCRIPTION'))],
     wordWrap: true,
   });
+  for (const change of response.breakingChanges) {
+    changesTable.push([`${logSymbols.error} ${pc.red('BREAKING')}`, change.changeType, change.message]);
+  }
+  for (const change of response.nonBreakingChanges) {
+    changesTable.push([`${logSymbols.success} NON-BREAKING`, change.changeType, change.message]);
+  }
+  console.log(changesTable.toString());
+};
+
+const handleComposedSchemaBreakingChanges = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  jsonBuilder.addComposedSchemaBreakingChanges(response.composedSchemaBreakingChanges);
+
+  if (shouldOutputJson) {
+    return;
+  }
 
   const composedSchemaChangesTable = new Table({
     head: [
@@ -22,344 +147,418 @@ export const handleCheckResult = (resp: CheckSubgraphSchemaResponse, rowLimit: n
     wordWrap: true,
   });
 
-  const compositionErrorsTable = new Table({
-    head: [pc.bold(pc.white('GRAPH_NAME')), pc.bold(pc.white('NAMESPACE')), pc.bold(pc.white('ERROR_MESSAGE'))],
-    colWidths: [30, 30, 120],
-    wordWrap: true,
-  });
+  for (const change of response.composedSchemaBreakingChanges) {
+    composedSchemaChangesTable.push([
+      `${logSymbols.error} ${pc.red('BREAKING')}`,
+      change.changeType,
+      change.federatedGraphName,
+      change.message,
+    ]);
+  }
 
-  const compositionWarningsTable = new Table({
-    head: [pc.bold(pc.white('GRAPH_NAME')), pc.bold(pc.white('NAMESPACE')), pc.bold(pc.white('WARNING_MESSAGE'))],
-    colWidths: [30, 30, 120],
-    wordWrap: true,
-  });
+  console.log(pc.red('\nDetected the following federated graph schema breaking changes:'));
+  console.log(
+    pc.dim(
+      'These breaking changes were detected in the composed federated graph schema after composition. They are not reported above because they only become visible when all subgraphs are composed together (e.g., field type or nullability conflicts between subgraphs).',
+    ),
+  );
+  console.log(composedSchemaChangesTable.toString());
+};
 
-  const lintIssuesTable = new Table({
-    head: [pc.bold(pc.white('LINT_RULE')), pc.bold(pc.white('ERROR_MESSAGE')), pc.bold(pc.white('LINE NUMBER'))],
-    colAligns: ['left', 'left', 'center'],
-    wordWrap: true,
-  });
+const handleCompositionErrors = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  jsonBuilder.addCompositionErrors(response.compositionErrors);
+
+  if (!shouldOutputJson) {
+    const compositionErrorsTable = new Table({
+      head: [pc.bold(pc.white('GRAPH_NAME')), pc.bold(pc.white('NAMESPACE')), pc.bold(pc.white('ERROR_MESSAGE'))],
+      colWidths: [30, 30, 120],
+      wordWrap: true,
+    });
+    for (const error of response.compositionErrors) {
+      compositionErrorsTable.push([error.federatedGraphName, error.namespace, error.message]);
+    }
+    console.log(pc.red('\nDetected composition errors:'));
+    console.log(compositionErrorsTable.toString());
+  }
+};
 
-  const graphPruningIssuesTable = new Table({
-    head: [
-      pc.bold(pc.white('RULE')),
-      pc.bold(pc.white('FEDERATED_GRAPH_NAME')),
-      pc.bold(pc.white('FIELD_PATH')),
-      pc.bold(pc.white('MESSAGE')),
-      pc.bold(pc.white('LINE NUMBER')),
-    ],
-    colAligns: ['left', 'left', 'left', 'left', 'center'],
-    wordWrap: true,
-  });
+const handleCompositionWarnings = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  jsonBuilder.addCompositionWarnings(response.compositionWarnings);
+
+  if (!shouldOutputJson) {
+    const compositionWarningsTable = new Table({
+      head: [pc.bold(pc.white('GRAPH_NAME')), pc.bold(pc.white('NAMESPACE')), pc.bold(pc.white('WARNING_MESSAGE'))],
+      colWidths: [30, 30, 120],
+      wordWrap: true,
+    });
+    for (const warning of response.compositionWarnings) {
+      compositionWarningsTable.push([warning.federatedGraphName, warning.namespace, warning.message]);
+    }
+    console.log(pc.yellow(`\nDetected composition warnings:`));
+    console.log(compositionWarningsTable.toString());
+  }
+};
+
+const handleLintIssues = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  jsonBuilder.addLintErrors(response.lintErrors);
+  jsonBuilder.addLintWarnings(response.lintWarnings);
+
+  if (!shouldOutputJson) {
+    const lintIssuesTable = new Table({
+      head: [pc.bold(pc.white('LINT_RULE')), pc.bold(pc.white('ERROR_MESSAGE')), pc.bold(pc.white('LINE NUMBER'))],
+      colAligns: ['left', 'left', 'center'],
+      wordWrap: true,
+    });
+    for (const error of response.lintErrors) {
+      lintIssuesTable.push([
+        `${logSymbols.error} ${pc.red(error.lintRuleType)}`,
+        error.message,
+        error.issueLocation?.line,
+      ]);
+    }
+    for (const warning of response.lintWarnings) {
+      lintIssuesTable.push([
+        `${logSymbols.warning} ${pc.yellow(warning.lintRuleType)}`,
+        warning.message,
+        warning.issueLocation?.line,
+      ]);
+    }
+    console.log('\nDetected lint issues:');
+    console.log(lintIssuesTable.toString());
+  }
+};
+
+const handleGraphPruneIssues = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  shouldOutputJson: boolean,
+): void => {
+  jsonBuilder.addGraphPruneErrors(response.graphPruneErrors);
+  jsonBuilder.addGraphPruneWarnings(response.graphPruneWarnings);
+
+  if (!shouldOutputJson) {
+    const graphPruningIssuesTable = new Table({
+      head: [
+        pc.bold(pc.white('RULE')),
+        pc.bold(pc.white('FEDERATED_GRAPH_NAME')),
+        pc.bold(pc.white('FIELD_PATH')),
+        pc.bold(pc.white('MESSAGE')),
+        pc.bold(pc.white('LINE NUMBER')),
+      ],
+      colAligns: ['left', 'left', 'left', 'left', 'center'],
+      wordWrap: true,
+    });
+    for (const error of response.graphPruneErrors) {
+      graphPruningIssuesTable.push([
+        `${logSymbols.error} ${pc.red(error.graphPruningRuleType)}`,
+        error.federatedGraphName,
+        error.fieldPath,
+        error.message,
+        error.issueLocation?.line || '-',
+      ]);
+    }
+    for (const warning of response.graphPruneWarnings) {
+      graphPruningIssuesTable.push([
+        `${logSymbols.warning} ${pc.yellow(warning.graphPruningRuleType)}`,
+        warning.federatedGraphName,
+        warning.fieldPath,
+        warning.message,
+        warning.issueLocation?.line || '-',
+      ]);
+    }
+    console.log('\nDetected graph pruning issues:');
+    console.log(graphPruningIssuesTable.toString());
+  }
+};
+
+// currentSuccess determines which sentence variant to use in the returned statement
+const handleLinkedCheckFailures = (
+  response: CheckSubgraphSchemaResponse,
+  jsonBuilder: JsonCheckSchemaOutputBuilder,
+  currentSuccess: boolean,
+): string => {
+  let additionalStatement = currentSuccess
+    ? `\n\n But this schema change has been linked to a target subgraph and the target subgraph check has failed.`
+    : `\n\n This schema change has been linked to a target subgraph and the target subgraph check has failed.`;
+
+  if (response.isLinkedTrafficCheckFailed) {
+    const message = 'The target subgraph check has failed because of client traffic issues.';
+    additionalStatement += `\n\n ${message}`;
+    jsonBuilder.markTrafficLinkedFailed(message);
+  }
+
+  if (response.isLinkedPruningCheckFailed) {
+    jsonBuilder.markGraphPruneLinkedFailed();
+    additionalStatement += `\n\n The target subgraph check has failed because of graph pruning issues.`;
+  }
+
+  return additionalStatement;
+};
+
+const handleOkResult = ({
+  response,
+  jsonBuilder,
+  rowLimit,
+  shouldOutputJson,
+  studioCheckDestination,
+}: {
+  response: CheckSubgraphSchemaResponse;
+  jsonBuilder: JsonCheckSchemaOutputBuilder;
+  rowLimit: number;
+  shouldOutputJson?: boolean;
+  studioCheckDestination: string;
+}): { success: boolean } => {
   let success = false;
   let finalStatement = '';
 
-  let studioCheckDestination = '';
-  if (resp.checkId && resp.checkedFederatedGraphs.length > 0) {
-    studioCheckDestination = `${pc.bold('Open in studio')}: ${config.webURL}/${
-      resp.checkedFederatedGraphs[0].organizationSlug
-    }/${resp.checkedFederatedGraphs[0].namespace}/graph/${resp.checkedFederatedGraphs[0].name}/checks/${resp.checkId}`;
+  // Proposal match warning - always build json, conditionally print
+  if (response.proposalMatchMessage) {
+    jsonBuilder.setProposals(response.proposalMatchMessage);
+    if (!shouldOutputJson) {
+      console.log(pc.yellow(`Warning: Proposal match failed`));
+      console.log(pc.yellow(response.proposalMatchMessage));
+    }
   }
 
-  switch (resp.response?.code) {
-    case EnumStatusCode.OK: {
-      if (resp.proposalMatchMessage) {
-        console.log(pc.yellow(`Warning: Proposal match failed`));
-        console.log(pc.yellow(resp.proposalMatchMessage));
-      }
-
-      if (
-        resp.nonBreakingChanges.length === 0 &&
-        resp.breakingChanges.length === 0 &&
-        resp.composedSchemaBreakingChanges.length === 0 &&
-        resp.compositionErrors.length === 0 &&
-        resp.lintErrors.length === 0 &&
-        resp.lintWarnings.length === 0 &&
-        resp.graphPruneErrors.length === 0 &&
-        resp.graphPruneWarnings.length === 0 &&
-        (resp.isCheckExtensionSkipped ?? true)
-      ) {
-        console.log(
-          `\nDetected no changes.\nDetected no lint issues.\nDetected no graph pruning issues.\n\n${studioCheckDestination}\n`,
-        );
+  // Early exit: nothing to report
+  const hasNoIssues =
+    response.nonBreakingChanges.length === 0 &&
+    response.breakingChanges.length === 0 &&
+    response.composedSchemaBreakingChanges.length === 0 &&
+    response.compositionErrors.length === 0 &&
+    response.lintErrors.length === 0 &&
+    response.lintWarnings.length === 0 &&
+    response.graphPruneErrors.length === 0 &&
+    response.graphPruneWarnings.length === 0 &&
+    (response.isCheckExtensionSkipped ?? true);
+
+  if (hasNoIssues) {
+    jsonBuilder.initProposals('Detected no changes. Detected no lint issues. Detected no graph pruning issues.');
+    jsonBuilder.setStatus(true);
+    if (!shouldOutputJson) {
+      console.log(
+        `\nDetected no changes.\nDetected no lint issues.\nDetected no graph pruning issues.\n\n${studioCheckDestination}\n`,
+      );
+    }
+    return { success: true };
+  }
 
-        success = true;
+  if (!shouldOutputJson) {
+    console.log(`\nChecking the proposed schema`);
+  }
 
-        break;
-      }
+  // No operations usage stats mean the check was not performed against any live traffic
+  if (response.operationUsageStats) {
+    ({ success, finalStatement } = handleTrafficCheck(
+      response,
+      response.operationUsageStats,
+      jsonBuilder,
+      shouldOutputJson ?? false,
+    ));
+  }
 
-      console.log(`\nChecking the proposed schema`);
-
-      // No operations usage stats mean the check was not performed against any live traffic
-      if (resp.operationUsageStats) {
-        if (resp.operationUsageStats.totalOperations === 0 && !resp.clientTrafficCheckSkipped) {
-          // Composition errors are still considered failures, otherwise we can consider this a success
-          // because no operations were affected by the change
-          success =
-            resp.compositionErrors.length === 0 && resp.lintErrors.length === 0 && resp.graphPruneErrors.length === 0;
-          console.log(`No operations were affected by this schema change.`);
-          finalStatement = `This schema change didn't affect any operations from existing client traffic.`;
-        } else if (
-          resp.operationUsageStats.totalOperations === resp.operationUsageStats.safeOperations &&
-          !resp.clientTrafficCheckSkipped
-        ) {
-          // This is also a success because changes to these operations were marked as safe
-          success =
-            resp.compositionErrors.length === 0 && resp.lintErrors.length === 0 && resp.graphPruneErrors.length === 0;
-          console.log(`${resp.operationUsageStats.totalOperations} operations were considered safe due to overrides.`);
-          finalStatement = `This schema change affected operations with safe overrides.`;
-        } else {
-          // Composition and breaking errors are considered failures because operations were affected by the change
-          success =
-            resp.breakingChanges.length === 0 &&
-            resp.compositionErrors.length === 0 &&
-            resp.lintErrors.length === 0 &&
-            resp.graphPruneErrors.length === 0 &&
-            resp.composedSchemaBreakingChanges.length === 0;
-
-          const { breakingChanges, operationUsageStats, clientTrafficCheckSkipped, composedSchemaBreakingChanges } =
-            resp;
-          const { totalOperations, safeOperations, firstSeenAt, lastSeenAt } = operationUsageStats;
-
-          if (breakingChanges.length > 0 || composedSchemaBreakingChanges.length > 0) {
-            const warningMessage = [
-              logSymbols.warning,
-              ` Found ${pc.bold(breakingChanges.length + composedSchemaBreakingChanges.length)} breaking changes.`,
-            ];
-
-            if (totalOperations > 0) {
-              warningMessage.push(`${pc.bold(totalOperations - safeOperations)} operations impacted.`);
-            }
-
-            if (safeOperations > 0) {
-              warningMessage.push(`In addition, ${safeOperations} operations marked safe due to overrides.`);
-            }
-
-            if (!clientTrafficCheckSkipped) {
-              warningMessage.push(
-                `\nFound client activity between ${pc.underline(
-                  new Date(firstSeenAt).toLocaleString(),
-                )} and ${pc.underline(new Date(lastSeenAt).toLocaleString())}.`,
-              );
-            }
-
-            console.log(warningMessage.join(''));
-
-            finalStatement = `This check has encountered ${pc.bold(`${breakingChanges.length + composedSchemaBreakingChanges.length}`)} breaking changes${
-              clientTrafficCheckSkipped ? `.` : ` that would break operations from existing client traffic.`
-            }`;
-          }
-        }
-      }
+  // Schema changes - build json always, build + print table only for text output
+  if (response.nonBreakingChanges.length > 0 || response.breakingChanges.length > 0) {
+    handleSchemaChanges(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-      if (resp.nonBreakingChanges.length > 0 || resp.breakingChanges.length > 0) {
-        console.log('\nDetected the following subgraph schema changes:');
-
-        if (resp.breakingChanges.length > 0) {
-          for (const breakingChange of resp.breakingChanges) {
-            changesTable.push([
-              `${logSymbols.error} ${pc.red('BREAKING')}`,
-              breakingChange.changeType,
-              breakingChange.message,
-            ]);
-          }
-        }
+  // Composed federated graph schema breaking changes
+  if (response.composedSchemaBreakingChanges.length > 0) {
+    handleComposedSchemaBreakingChanges(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-        if (resp.nonBreakingChanges.length > 0) {
-          for (const nonBreakingChange of resp.nonBreakingChanges) {
-            changesTable.push([
-              `${logSymbols.success} NON-BREAKING`,
-              nonBreakingChange.changeType,
-              nonBreakingChange.message,
-            ]);
-          }
-        }
+  // Composition errors
+  if (response.compositionErrors.length > 0) {
+    handleCompositionErrors(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-        console.log(changesTable.toString());
-      }
+  // Composition warnings
+  if (response.compositionWarnings.length > 0) {
+    handleCompositionWarnings(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-      if (resp.composedSchemaBreakingChanges.length > 0) {
-        console.log(pc.red('\nDetected the following federated graph schema breaking changes:'));
-        console.log(
-          pc.dim(
-            'These breaking changes were detected in the composed federated graph schema after composition. They are not reported above because they only become visible when all subgraphs are composed together (e.g., field type or nullability conflicts between subgraphs).',
-          ),
-        );
+  // Lint issues
+  if (response.lintErrors.length > 0 || response.lintWarnings.length > 0) {
+    handleLintIssues(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-        for (const change of resp.composedSchemaBreakingChanges) {
-          composedSchemaChangesTable.push([
-            `${logSymbols.error} ${pc.red('BREAKING')}`,
-            change.changeType,
-            change.federatedGraphName,
-            change.message,
-          ]);
-        }
+  // Graph pruning issues
+  if (response.graphPruneErrors.length > 0 || response.graphPruneWarnings.length > 0) {
+    handleGraphPruneIssues(response, jsonBuilder, shouldOutputJson ?? false);
+  }
 
-        console.log(composedSchemaChangesTable.toString());
-      }
+  // Linked subgraph check failures
+  if (response.isLinkedTrafficCheckFailed || response.isLinkedPruningCheckFailed) {
+    finalStatement += handleLinkedCheckFailures(response, jsonBuilder, success);
+    success = false;
+  }
 
-      if (resp.compositionErrors.length > 0) {
-        console.log(pc.red('\nDetected composition errors:'));
-        for (const compositionError of resp.compositionErrors) {
-          compositionErrorsTable.push([
-            compositionError.federatedGraphName,
-            compositionError.namespace,
-            compositionError.message,
-          ]);
-        }
-        console.log(compositionErrorsTable.toString());
-      }
+  // Extension error
+  if (response.checkExtensionErrorMessage) {
+    const message = `Subgraph extension check failed with message: ${response.checkExtensionErrorMessage}`;
+    jsonBuilder.setExtensionError(message);
+    success = false;
+    finalStatement += `\n${logSymbols.error} ${message}`;
+  }
 
-      if (resp.compositionWarnings.length > 0) {
-        console.log(pc.yellow(`\nDetected composition warnings:`));
-        for (const compositionWarning of resp.compositionWarnings) {
-          compositionWarningsTable.push([
-            compositionWarning.federatedGraphName,
-            compositionWarning.namespace,
-            compositionWarning.message,
-          ]);
-        }
-        console.log(compositionWarningsTable.toString());
+  // Row limit exceeded message
+  let moreEntriesAvailableMessage = '';
+  if (response.counts) {
+    const hasExceeded =
+      response.counts.lintWarnings + response.counts.lintErrors > rowLimit ||
+      response.counts.breakingChanges + response.counts.nonBreakingChanges > rowLimit ||
+      response.counts.graphPruneErrors + response.counts.graphPruneWarnings > rowLimit ||
+      response.counts.compositionErrors > rowLimit ||
+      response.counts.compositionWarnings > rowLimit ||
+      response.counts.composedSchemaBreakingChanges > rowLimit;
+
+    jsonBuilder.setExceededRowLimit(hasExceeded);
+
+    if (hasExceeded) {
+      if (studioCheckDestination !== '') {
+        moreEntriesAvailableMessage += `\n\n`;
       }
-
-      if (resp.lintErrors.length > 0 || resp.lintWarnings.length > 0) {
-        console.log('\nDetected lint issues:');
-        for (const error of resp.lintErrors) {
-          lintIssuesTable.push([
-            `${logSymbols.error} ${pc.red(error.lintRuleType)}`,
-            error.message,
-            error.issueLocation?.line,
-          ]);
-        }
-        for (const warning of resp.lintWarnings) {
-          lintIssuesTable.push([
-            `${logSymbols.warning} ${pc.yellow(warning.lintRuleType)}`,
-            warning.message,
-            warning.issueLocation?.line,
-          ]);
-        }
-        console.log(lintIssuesTable.toString());
+      moreEntriesAvailableMessage += pc.red(
+        `Some results were truncated due to exceeding the limit of ${rowLimit} rows.`,
+      );
+      if (studioCheckDestination !== '') {
+        moreEntriesAvailableMessage += ` They can be viewed in the studio dashboard.`;
       }
+    }
+  }
 
-      if (resp.graphPruneErrors.length > 0 || resp.graphPruneWarnings.length > 0) {
-        console.log('\nDetected graph pruning issues:');
-        for (const error of resp.graphPruneErrors) {
-          graphPruningIssuesTable.push([
-            `${logSymbols.error} ${pc.red(error.graphPruningRuleType)}`,
-            error.federatedGraphName,
-            error.fieldPath,
-            error.message,
-            error.issueLocation?.line || '-',
-          ]);
-        }
-        for (const warning of resp.graphPruneWarnings) {
-          graphPruningIssuesTable.push([
-            `${logSymbols.warning} ${pc.yellow(warning.graphPruningRuleType)}`,
-            warning.federatedGraphName,
-            warning.fieldPath,
-            warning.message,
-            warning.issueLocation?.line || '-',
-          ]);
-        }
-        console.log(graphPruningIssuesTable.toString());
-      }
+  // Finalise json state, then print text output if not in JSON mode
+  jsonBuilder.setStatus(success).setMessage(finalStatement);
 
-      if (resp.isLinkedTrafficCheckFailed || resp.isLinkedPruningCheckFailed) {
-        finalStatement += success
-          ? `\n\n But this schema change has been linked to a target subgraph and the target subgraph check has failed.`
-          : `\n\n This schema change has been linked to a target subgraph and the target subgraph check has failed.`;
+  if (!shouldOutputJson) {
+    if (success) {
+      console.log(
+        '\n' +
+          logSymbols.success +
+          pc.green(` Schema check passed. ${finalStatement}`) +
+          '\n\n' +
+          studioCheckDestination +
+          moreEntriesAvailableMessage +
+          '\n',
+      );
+    } else {
+      program.error(
+        '\n' +
          logSymbols.error +
          pc.red(
            ` Schema check failed. ${finalStatement}\nSee https://cosmo-docs.wundergraph.com/studio/schema-checks for more information on resolving operation check errors.\n${studioCheckDestination}${moreEntriesAvailableMessage}\n`,
          ) +
          '\n',
      );
    }
  }
 
-        if (resp.isLinkedTrafficCheckFailed) {
-          finalStatement += `\n\n The target subgraph check has failed because of client traffic issues.`;
-        }
+  return { success };
+};
 
-        if (resp.isLinkedPruningCheckFailed) {
-          finalStatement += `\n\n The target subgraph check has failed because of graph pruning issues.`;
-        }
-        success = false;
-      }
+export const handleCheckResult = async ({
+  response,
+  rowLimit,
+  shouldOutputJson,
+  outFile,
+}: {
+  response: CheckSubgraphSchemaResponse;
+  rowLimit: number;
+  shouldOutputJson?: boolean;
+  outFile?: string;
+}): Promise => {
+  const jsonBuilder = new JsonCheckSchemaOutputBuilder(EnumStatusCode.ERR, rowLimit, outFile);
 
-      if (resp.checkExtensionErrorMessage) {
-        success = false;
-        finalStatement += `\n${logSymbols.error} Subgraph extension check failed with message: ${resp.checkExtensionErrorMessage}`;
-      }
+  let studioCheckDestination = '';
+  if (response.checkId && response.checkedFederatedGraphs.length > 0) {
+    const url = `${config.webURL}/${
+      response.checkedFederatedGraphs[0].organizationSlug
+    }/${response.checkedFederatedGraphs[0].namespace}/graph/${response.checkedFederatedGraphs[0].name}/checks/${response.checkId}`;
+    jsonBuilder.setUrl(url);
+    studioCheckDestination
= `${pc.bold('Open in studio')}: ${url}`; + } - let moreEntriesAvailableMessage = ''; - if (resp.counts) { - const hasExceeded = - resp.counts.lintWarnings + resp.counts.lintErrors > rowLimit || - resp.counts.breakingChanges + resp.counts.nonBreakingChanges > rowLimit || - resp.counts.graphPruneErrors + resp.counts.graphPruneWarnings > rowLimit || - resp.counts.compositionErrors > rowLimit || - resp.counts.compositionWarnings > rowLimit || - resp.counts.composedSchemaBreakingChanges > rowLimit; - - if (hasExceeded) { - if (studioCheckDestination !== '') { - moreEntriesAvailableMessage += `\n\n`; - } - moreEntriesAvailableMessage += pc.red( - `Some results were truncated due to exceeding the limit of ${rowLimit} rows.`, - ); - if (studioCheckDestination !== '') { - moreEntriesAvailableMessage += ` They can be viewed in the studio dashboard.`; - } - } + switch (response.response?.code) { + case EnumStatusCode.OK: { + const { success } = handleOkResult({ response, jsonBuilder, rowLimit, shouldOutputJson, studioCheckDestination }); + if (shouldOutputJson) { + await jsonBuilder.write(); } - - if (success) { - console.log( - '\n' + - logSymbols.success + - pc.green(` Schema check passed. ${finalStatement}`) + - '\n\n' + - studioCheckDestination + - moreEntriesAvailableMessage + - '\n', - ); + return success; + } + case EnumStatusCode.ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL: { + const message = 'Error: Proposal match failed'; + if (shouldOutputJson) { + await jsonBuilder + .setCode(EnumStatusCode.ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL) + .setDetails(response.proposalMatchMessage) + .setMessage(message) + .setStatus(false) + .write(); } else { - program.error( - '\n' + - logSymbols.error + + console.log(pc.red(message)); + console.log(pc.red(response.proposalMatchMessage)); + console.log( + logSymbols.error + pc.red( - ` Schema check failed. 
${finalStatement}\nSee https://cosmo-docs.wundergraph.com/studio/schema-checks for more information on resolving operation check errors.\n${studioCheckDestination}${moreEntriesAvailableMessage}\n`, - ) + - '\n', + `Schema check failed.\nSee https://cosmo-docs.wundergraph.com/studio/schema-checks for more information on resolving operation check errors.\n${studioCheckDestination}\n`, + ), ); } - break; - } - case EnumStatusCode.ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL: { - console.log(pc.red(`Error: Proposal match failed`)); - console.log(pc.red(resp.proposalMatchMessage)); - console.log( - logSymbols.error + - pc.red( - `Schema check failed.\nSee https://cosmo-docs.wundergraph.com/studio/schema-checks for more information on resolving operation check errors.\n${studioCheckDestination}\n`, - ), - ); - success = false; - break; + return false; } case EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA: { - console.log( - '\nCheck has failed early because the schema could not be built. Please ensure that the schema is valid GraphQL and try again.', - ); - if (resp.response?.details) { - console.log(pc.red(pc.bold(resp.response?.details))); + const message = + 'Check has failed early because the schema could not be built. 
Please ensure that the schema is valid GraphQL and try again.'; + if (shouldOutputJson) { + await jsonBuilder + .setCode(EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA) + .setDetails(response.response?.details) + .setMessage(message) + .setStatus(false) + .write(); + return false; + } else { + console.log('\n' + message); + if (response.response?.details) { + console.log(pc.red(pc.bold(response.response?.details))); + } } program.error(logSymbols.error + pc.red(' Schema check failed.')); break; } default: { - console.log('\nFailed to perform the check operation.'); - if (resp.response?.details) { - console.log(pc.red(pc.bold(resp.response?.details))); + const message = 'Failed to perform the check operation.'; + if (shouldOutputJson) { + await jsonBuilder + .setCode(EnumStatusCode.ERR) + .setMessage(message) + .setDetails(response.response?.details) + .setStatus(false) + .write(); + console.log(JSON.stringify(jsonBuilder.build())); + return false; + } else { + console.log('\nFailed to perform the check operation.'); } + + if (response.response?.details && !shouldOutputJson) { + console.log(pc.red(pc.bold(response.response?.details))); + } + program.error(logSymbols.error + pc.red(' Schema check failed.')); } } - - return success; }; diff --git a/cli/src/json-check-schema-output-builder.ts b/cli/src/json-check-schema-output-builder.ts new file mode 100644 index 0000000000..4385f8a721 --- /dev/null +++ b/cli/src/json-check-schema-output-builder.ts @@ -0,0 +1,211 @@ +import { writeFile } from 'node:fs/promises'; +import { EnumStatusCode } from '@wundergraph/cosmo-connect/dist/common/common_pb'; +import type { + CheckOperationUsageStats, + CompositionError, + FederatedGraphSchemaChange, + GraphPruningIssue, + LintIssue, + SchemaChange, +} from '@wundergraph/cosmo-connect/dist/platform/v1/platform_pb'; + +export type JsonCheckSchemaOutputDescriptor = { + status: 'error' | 'success'; + code: EnumStatusCode; + details?: string; + message?: string; + url?: string; + 
proposals?: { + message: string; + }; + traffic?: { + message: string; + }; + changes?: { + breaking: SchemaChange[]; + nonBreaking: SchemaChange[]; + }; + composition?: { + errors: CompositionError[]; + warnings: CompositionError[]; + }; + lint?: { + errors: LintIssue[]; + warnings: LintIssue[]; + }; + graphPrune?: { + errors: GraphPruningIssue[]; + warnings: GraphPruningIssue[]; + }; + composedSchemaBreakingChanges?: FederatedGraphSchemaChange[]; + extensions?: { + message: string; + }; + exceededRowLimit?: boolean; + rowLimit: number; + operationUsageStats?: CheckOperationUsageStats; +}; + +export class JsonCheckSchemaOutputBuilder { + private readonly data: JsonCheckSchemaOutputDescriptor; + private readonly outFile?: string; + + constructor(code: EnumStatusCode, rowLimit: number, outFile?: string) { + this.data = { status: 'error', code, rowLimit }; + this.outFile = outFile; + } + + setUrl(url: string): this { + this.data.url = url; + return this; + } + + setCode(code: EnumStatusCode): this { + this.data.code = code; + return this; + } + + setStatus(success: boolean): this { + this.data.status = success ? 'success' : 'error'; + return this; + } + + setMessage(message: string): this { + this.data.message = message; + return this; + } + + setDetails(details: string | undefined): this { + this.data.details = details; + return this; + } + + setProposals(message: string): this { + this.data.proposals = { message }; + return this; + } + + initProposals(message: string): this { + this.data.proposals ??= { message }; + return this; + } + + setTraffic(message: string): this { + this.data.traffic = { message }; + return this; + } + + markTrafficLinkedFailed(fallbackMessage: string): this { + this.data.traffic = { + message: this.data.traffic?.message ?? fallbackMessage, + }; + return this; + } + + addBreakingChanges(changes: SchemaChange[]): this { + this.data.changes = { + ...this.data.changes, + breaking: [...(this.data.changes?.breaking ?? 
[]), ...changes], + nonBreaking: [...(this.data.changes?.nonBreaking ?? [])], + }; + return this; + } + + addNonBreakingChanges(changes: SchemaChange[]): this { + this.data.changes = { + breaking: [...(this.data.changes?.breaking ?? [])], + nonBreaking: [...(this.data.changes?.nonBreaking ?? []), ...changes], + }; + return this; + } + + setOperationUsageStats(stats: CheckOperationUsageStats): this { + this.data.operationUsageStats ??= stats; + return this; + } + + addCompositionErrors(errors: CompositionError[]): this { + this.data.composition = { + errors: [...(this.data.composition?.errors ?? []), ...errors], + warnings: [...(this.data.composition?.warnings ?? [])], + }; + return this; + } + + addCompositionWarnings(warnings: CompositionError[]): this { + this.data.composition = { + errors: [...(this.data.composition?.errors ?? [])], + warnings: [...(this.data.composition?.warnings ?? []), ...warnings], + }; + return this; + } + + addLintErrors(errors: LintIssue[]): this { + this.data.lint = { + errors: [...(this.data.lint?.errors ?? []), ...errors], + warnings: [...(this.data.lint?.warnings ?? [])], + }; + return this; + } + + addLintWarnings(warnings: LintIssue[]): this { + this.data.lint = { + errors: [...(this.data.lint?.errors ?? [])], + warnings: [...(this.data.lint?.warnings ?? []), ...warnings], + }; + return this; + } + + addGraphPruneErrors(errors: GraphPruningIssue[]): this { + this.data.graphPrune = { + errors: [...(this.data.graphPrune?.errors ?? []), ...errors], + warnings: [...(this.data.graphPrune?.warnings ?? [])], + }; + return this; + } + + addGraphPruneWarnings(warnings: GraphPruningIssue[]): this { + this.data.graphPrune = { + errors: [...(this.data.graphPrune?.errors ?? [])], + warnings: [...(this.data.graphPrune?.warnings ?? []), ...warnings], + }; + return this; + } + + markGraphPruneLinkedFailed(): this { + this.data.graphPrune = { + errors: [...(this.data.graphPrune?.errors ?? [])], + warnings: [...(this.data.graphPrune?.warnings ?? 
[])], + }; + return this; + } + + addComposedSchemaBreakingChanges(changes: FederatedGraphSchemaChange[]): this { + this.data.composedSchemaBreakingChanges = [...(this.data.composedSchemaBreakingChanges ?? []), ...changes]; + return this; + } + + setExtensionError(message: string): this { + this.data.extensions = { message }; + return this; + } + + setExceededRowLimit(exceeded: boolean): this { + this.data.exceededRowLimit = exceeded; + return this; + } + + build(): JsonCheckSchemaOutputDescriptor { + return this.data; + } + + async write(): Promise { + const output = this.build(); + + if (this.outFile) { + await writeFile(this.outFile, JSON.stringify(output, null, 2)); + } else { + console.log(output); + } + } +} diff --git a/cli/test/check-schema.test.ts b/cli/test/check-schema.test.ts index 4ac9a43f97..84af6aec7c 100644 --- a/cli/test/check-schema.test.ts +++ b/cli/test/check-schema.test.ts @@ -1,47 +1,1206 @@ -import { describe, test } from 'vitest'; +import { readFileSync, rmSync } from 'node:fs'; +import { tmpdir } from 'node:os'; +import { join } from 'node:path'; import { Command } from 'commander'; +import { beforeEach, afterEach, describe, expect, onTestFinished, test, vi, type MockInstance } from 'vitest'; +import { type PartialMessage } from '@bufbuild/protobuf'; import { createPromiseClient, createRouterTransport } from '@connectrpc/connect'; import { PlatformService } from '@wundergraph/cosmo-connect/dist/platform/v1/platform_connect'; +import { CheckSubgraphSchemaResponse } from '@wundergraph/cosmo-connect/dist/platform/v1/platform_pb'; import { EnumStatusCode } from '@wundergraph/cosmo-connect/dist/common/common_pb'; import { Client } from '../src/core/client/client.js'; +import { config } from '../src/core/config.js'; import CheckSchema from '../src/commands/subgraph/commands/check.js'; +import type { JsonCheckSchemaOutputDescriptor } from '../src/json-check-schema-output-builder.js'; -export const mockPlatformTransport = () => - 
createRouterTransport(({ service }) => { +vi.mock('../src/core/config.js', async (importOriginal) => { + const mod = await importOriginal(); + return { ...mod, config: { ...mod.config } }; +}); + +function createMockTransport(response: PartialMessage) { + return createRouterTransport(({ service }) => { service(PlatformService, { - checkSubgraphSchema: (ctx) => { - return { - response: { - code: EnumStatusCode.OK, + checkSubgraphSchema: () => response, + isGitHubAppInstalled: () => ({ + response: { code: EnumStatusCode.OK }, + isInstalled: false, + }), + }); + }); +} + +function setVcsConfig({ author = '', commitSha = '', branch = '' } = {}) { + vi.mocked(config).checkAuthor = author; + vi.mocked(config).checkCommitSha = commitSha; + vi.mocked(config).checkBranch = branch; +} + +function resetVcsConfig() { + vi.mocked(config).checkAuthor = ''; + vi.mocked(config).checkCommitSha = ''; + vi.mocked(config).checkBranch = ''; +} + +async function runCheck( + response: PartialMessage, + opts: { + limit?: number | string; + schema?: string | null; + delete?: boolean; + json?: boolean; + outFile?: string; + skipTrafficCheck?: boolean; + } = {}, +): Promise { + const schema = 'schema' in opts ? opts.schema : 'test/fixtures/schema.graphql'; + const args = ['check', 'wg.orders']; + if (schema !== null) { + args.push('--schema', schema ?? 
'test/fixtures/schema.graphql'); + } + if (opts.delete) { + args.push('--delete'); + } + if (opts.limit !== undefined) { + args.push('--limit', String(opts.limit)); + } + if (opts.json) { + args.push('--json'); + } + if (opts.outFile !== undefined) { + args.push('--out', opts.outFile); + } + if (opts.skipTrafficCheck) { + args.push('--skip-traffic-check'); + } + + const client: Client = { + platform: createPromiseClient(PlatformService, createMockTransport(response)), + }; + const program = new Command(); + program.addCommand(CheckSchema({ client })); + await program.parseAsync(args, { from: 'user' }); +} + +function getJsonOutput(logSpy: MockInstance): JsonCheckSchemaOutputDescriptor { + const call = logSpy.mock.calls.find(([arg]) => { + if (arg !== null && typeof arg === 'object') { + return true; + } + try { + JSON.parse(String(arg)); + return true; + } catch { + return false; + } + }); + if (!call) { + throw new Error('No JSON output found in console.log calls'); + } + const arg = call[0]; + return typeof arg === 'string' ? 
JSON.parse(arg) : (arg as JsonCheckSchemaOutputDescriptor); +} + +describe('stdout', () => { + let logSpy: MockInstance; + let stderrSpy: MockInstance; + let exitSpy: MockInstance; + + beforeEach(() => { + logSpy = vi.spyOn(console, 'log').mockImplementation(() => {}); + stderrSpy = vi.spyOn(process.stderr, 'write').mockImplementation(() => true); + exitSpy = vi.spyOn(process, 'exit').mockImplementation(() => { + throw new Error('process.exit'); + }); + }); + + afterEach(() => { + logSpy.mockRestore(); + stderrSpy.mockRestore(); + exitSpy.mockRestore(); + process.exitCode = undefined; + resetVcsConfig(); + }); + + test('no changes logs no changes, no lint issues, and no graph pruning issues', async () => { + await runCheck({ response: { code: EnumStatusCode.OK } }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no changes.')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no lint issues.')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no graph pruning issues.')); + }); + + test('proposal match warning is logged before no-changes message', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + proposalMatchMessage: 'Schema does not match approved proposal', + }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Warning: Proposal match failed')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Schema does not match approved proposal')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no changes.')); + }); + + test('no operations affected succeeds and logs message when no errors', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }); + + 
expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('No operations were affected by this schema change.')); + expect(process.exitCode).not.toBe(1); + }); + + test('no operations affected fails when composition errors are present', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + compositionErrors: [ + { message: 'Composition failed', federatedGraphName: 'my-graph', namespace: 'default', featureFlag: '' }, + ], + }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('No operations were affected by this schema change.')); + }); + + test('all operations safe succeeds and logs safe operations count', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 5, safeOperations: 5, firstSeenAt: '', lastSeenAt: '' }, + clientTrafficCheckSkipped: false, + }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('5 operations were considered safe due to overrides.')); + expect(process.exitCode).not.toBe(1); + }); + + test('breaking changes logs count and fails', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/Found .*1.* breaking changes\./); + }); + + test('breaking changes reports impacted and safe operation counts', async () => { + await expect( + runCheck({ + response: { 
code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + operationUsageStats: { + totalOperations: 3, + safeOperations: 1, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/2.*operations impacted\./); + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/1.*operations marked safe due to overrides\./); + }); + + test('breaking changes logs client activity timestamps when traffic check not skipped', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + clientTrafficCheckSkipped: false, + }), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/Found client activity between/); + }); + + test('breaking changes does not log client activity timestamps when skip-traffic-check is used', async () => { + await expect( + runCheck( + { + response: { code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', }, - }; - }, - isGitHubAppInstalled: (ctx) => { - return { - response: { - code: EnumStatusCode.OK, + clientTrafficCheckSkipped: true, + }, + { skipTrafficCheck: true }, + ), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).not.toMatch(/Found client activity between/); + }); + + test('non-breaking changes succeeds and logs detected changes table', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 
'FIELD_ADDED', message: 'New field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected the following subgraph schema changes:')); + expect(process.exitCode).not.toBe(1); + }); + + test('composition errors logs table and fails', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + compositionErrors: [ + { message: 'Type mismatch error', federatedGraphName: 'my-graph', namespace: 'default', featureFlag: '' }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected composition errors:')); + }); + + test('composition warnings logs table and succeeds', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + compositionWarnings: [ + { message: 'Deprecation warning', federatedGraphName: 'my-graph', namespace: 'default', featureFlag: '' }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected composition warnings:')); + expect(process.exitCode).not.toBe(1); + }); + + test('lint errors logs table and fails', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + lintErrors: [ + { + lintRuleType: 'FIELD_NAMES_SHOULD_BE_CAMEL_CASE', + message: 'Field name should be camelCase', + issueLocation: { line: 10, column: 1, endLine: 10, endColumn: 20 }, }, - isInstalled: false, - }; - }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }), + ).rejects.toThrow(); + + 
expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected lint issues:')); + }); + + test('lint warnings logs table and succeeds', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + lintWarnings: [ + { + lintRuleType: 'FIELD_NAMES_SHOULD_BE_CAMEL_CASE', + message: 'Consider using camelCase', + issueLocation: { line: 5, column: 1, endLine: 5, endColumn: 10 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected lint issues:')); + expect(process.exitCode).not.toBe(1); + }); + + test('graph pruning errors logs table and fails', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + graphPruneErrors: [ + { + graphPruningRuleType: 'UNUSED_FIELDS', + federatedGraphName: 'my-graph', + fieldPath: 'Query.deprecatedField', + message: 'Field is unused', + issueLocation: { line: 15, column: 1, endLine: 15, endColumn: 30 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected graph pruning issues:')); + }); + + test('graph pruning warnings logs table and succeeds', async () => { + await runCheck({ + response: { code: EnumStatusCode.OK }, + graphPruneWarnings: [ + { + graphPruningRuleType: 'UNUSED_FIELDS', + federatedGraphName: 'my-graph', + fieldPath: 'Query.deprecatedField', + message: 'Field might be unused', + issueLocation: { line: 20, column: 1, endLine: 20, endColumn: 25 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected graph pruning issues:')); + expect(process.exitCode).not.toBe(1); + }); + + test('linked traffic check failure fails the check', async () => { + await expect( + 
runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + isLinkedTrafficCheckFailed: true, + }), + ).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('target subgraph check has failed')); + }); + + test('linked pruning check failure fails the check', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + isLinkedPruningCheckFailed: true, + }), + ).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('target subgraph check has failed')); + }); + + test('extension check error fails and includes error message in output', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + checkExtensionErrorMessage: 'Extension validation failed', + }), + ).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith( + expect.stringContaining('Subgraph extension check failed with message: Extension validation failed'), + ); + }); + + test('row limit exceeded logs truncation warning', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + counts: { + breakingChanges: 0, + nonBreakingChanges: 60, + compositionErrors: 0, + compositionWarnings: 0, + lintErrors: 
0, + lintWarnings: 0, + graphPruneErrors: 0, + graphPruneWarnings: 0, + }, + }, + { limit: 50 }, + ); + + expect(logSpy).toHaveBeenCalledWith( + expect.stringContaining('Some results were truncated due to exceeding the limit of 50 rows.'), + ); }); -describe('Schema Command', () => { - test('Check subgraph schema', () => { - const client: Client = { - platform: createPromiseClient(PlatformService, mockPlatformTransport()), - }; + test('composedSchemaBreakingChanges alone fails the check', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_REMOVED', + message: 'Field removed at federated level', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + }), + ).rejects.toThrow(); - const program = new Command(); + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('Schema check failed')); + }); - program.addCommand( - CheckSchema({ - client, + test('composedSchemaBreakingChanges logs federated graph breaking changes section', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_REMOVED', + message: 'Field removed at federated level', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith( + expect.stringContaining('Detected the following federated graph schema breaking changes:'), ); - const command = program.parse(['check', 'wg.orders', '--schema', 'test/fixtures/schema.graphql'], { - from: 'user', + }); + + test('composedSchemaBreakingChanges suppresses no-changes message', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_REMOVED', + message: 'Field removed at federated level', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + }), + ).rejects.toThrow(); + + 
expect(logSpy).not.toHaveBeenCalledWith(expect.stringContaining('Detected no changes.')); + }); + + test('combined breaking changes count in traffic warning includes composedSchemaBreakingChanges', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + composedSchemaBreakingChanges: [ + { changeType: 'TYPE_CHANGED', message: 'Type changed', federatedGraphName: 'my-graph', isBreaking: true }, + { + changeType: 'FIELD_TYPE_CHANGED', + message: 'Field type changed', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/Found .*3.* breaking changes\./); + }); + + test('composedSchemaBreakingChanges without subgraph breaking changes counts in traffic warning', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { changeType: 'TYPE_CHANGED', message: 'Type changed', federatedGraphName: 'my-graph', isBreaking: true }, + { + changeType: 'FIELD_TYPE_CHANGED', + message: 'Field type changed', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }), + ).rejects.toThrow(); + + expect(String(logSpy.mock.calls[1]?.[0])).toMatch(/Found .*2.* breaking changes\./); + }); + + test('row limit exceeded triggered by composedSchemaBreakingChanges count', async () => { + await expect( + runCheck( + { + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { changeType: 'FIELD_REMOVED', message: 'Field removed', federatedGraphName: 'my-graph', isBreaking: true }, + ], + 
counts: { + breakingChanges: 0, + nonBreakingChanges: 0, + compositionErrors: 0, + compositionWarnings: 0, + lintErrors: 0, + lintWarnings: 0, + graphPruneErrors: 0, + graphPruneWarnings: 0, + composedSchemaBreakingChanges: 60, + }, + }, + { limit: 50 }, + ), + ).rejects.toThrow(); + + // Truncation message is appended to program.error() when check fails (success=false) + expect(stderrSpy).toHaveBeenCalledWith( + expect.stringContaining('Some results were truncated due to exceeding the limit of 50 rows.'), + ); + }); + + test('vcs context is constructed when vcs config fields are set', async () => { + setVcsConfig({ author: 'test-author', commitSha: 'abc123', branch: 'main' }); + + await runCheck({ response: { code: EnumStatusCode.OK } }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no changes.')); + }); + + test('missing schema and delete flag logs error and exits', async () => { + await expect(runCheck({ response: { code: EnumStatusCode.OK } }, { schema: null })).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith( + expect.stringContaining("'--schema ' or '--delete' not specified"), + ); + }); + + test('non-existent schema file logs error and exits', async () => { + await expect( + runCheck({ response: { code: EnumStatusCode.OK } }, { schema: 'test/fixtures/nonexistent.graphql' }), + ).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('does not exist')); + }); + + test('delete flag sends check without schema', async () => { + await runCheck({ response: { code: EnumStatusCode.OK } }, { schema: null, delete: true }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Detected no changes.')); + }); + + test('invalid limit logs error and exits', async () => { + await expect(runCheck({ response: { code: EnumStatusCode.OK } }, { limit: 'abc' })).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('limit must be a valid number')); + }); + + 
test('limit of zero logs error and exits', async () => { + await expect(runCheck({ response: { code: EnumStatusCode.OK } }, { limit: 0 })).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('limit must be a valid number')); + }); + + test('limit exceeding max value logs error and exits', async () => { + await expect(runCheck({ response: { code: EnumStatusCode.OK } }, { limit: 10_001 })).rejects.toThrow(); + + expect(stderrSpy).toHaveBeenCalledWith(expect.stringContaining('limit must be a valid number')); + }); + + test('ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL logs proposal match failed and sets exit code', async () => { + await runCheck({ + response: { code: EnumStatusCode.ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL }, + proposalMatchMessage: 'Schema does not match approved proposal', }); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Error: Proposal match failed')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Schema does not match approved proposal')); + expect(process.exitCode).toBe(1); + }); + + test('ERR_INVALID_SUBGRAPH_SCHEMA logs early failure message with details and exits', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA, details: 'Syntax error in schema' }, + }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith( + expect.stringContaining('Check has failed early because the schema could not be built.'), + ); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Syntax error in schema')); + }); + + test('default error status logs failed to perform check with details and exits', async () => { + await expect( + runCheck({ + response: { code: EnumStatusCode.ERR, details: 'Internal server error' }, + }), + ).rejects.toThrow(); + + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Failed to perform the check operation.')); + expect(logSpy).toHaveBeenCalledWith(expect.stringContaining('Internal 
server error')); + }); +}); + +describe('json output', () => { + let logSpy: MockInstance; + let stderrSpy: MockInstance; + let exitSpy: MockInstance; + + beforeEach(() => { + logSpy = vi.spyOn(console, 'log').mockImplementation(() => {}); + stderrSpy = vi.spyOn(process.stderr, 'write').mockImplementation(() => true); + exitSpy = vi.spyOn(process, 'exit').mockImplementation(() => { + throw new Error('process.exit'); + }); + }); + + afterEach(() => { + logSpy.mockRestore(); + stderrSpy.mockRestore(); + exitSpy.mockRestore(); + process.exitCode = undefined; + resetVcsConfig(); + }); + + test('no changes outputs JSON with success status and proposals', async () => { + await runCheck({ response: { code: EnumStatusCode.OK } }, { json: true }); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.proposals?.message).toContain('no changes'); + }); + + test('proposal match warning outputs JSON with proposals message', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + proposalMatchMessage: 'Schema does not match approved proposal', + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.proposals?.message).toBe('Schema does not match approved proposal'); + }); + + test('no operations affected outputs JSON with success status and traffic info', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.traffic?.message).toBe('No operations were affected by this schema change.'); + }); + + test('all operations safe outputs JSON with success status', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + 
nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 5, safeOperations: 5, firstSeenAt: '', lastSeenAt: '' }, + clientTrafficCheckSkipped: false, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.traffic?.message).toBe('5 operations were considered safe due to overrides.'); + }); + + test('breaking changes outputs JSON with error status and breaking changes array', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.changes).toBeDefined(); + expect(Array.isArray(output.changes?.breaking)).toBe(true); + expect(Array.isArray(output.changes?.nonBreaking)).toBe(true); + expect(output.operationUsageStats).toBeDefined(); + }); + + test('non-breaking changes outputs JSON with success status and nonBreaking changes', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'New field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.changes).toBeDefined(); + expect(Array.isArray(output.changes?.nonBreaking)).toBe(true); + }); + + test('composition errors outputs JSON with error status and composition.errors populated', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + compositionErrors: [ + { message: 'Type 
mismatch error', federatedGraphName: 'my-graph', namespace: 'default', featureFlag: '' }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.composition).toBeDefined(); + expect(Array.isArray(output.composition?.errors)).toBe(true); + expect(Array.isArray(output.composition?.warnings)).toBe(true); + }); + + test('composition warnings outputs JSON with success status and composition.warnings populated', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + compositionWarnings: [ + { message: 'Deprecation warning', federatedGraphName: 'my-graph', namespace: 'default', featureFlag: '' }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.composition).toBeDefined(); + expect(Array.isArray(output.composition?.errors)).toBe(true); + expect(Array.isArray(output.composition?.warnings)).toBe(true); + }); + + test('lint errors outputs JSON with error status and lint.errors populated', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + lintErrors: [ + { + lintRuleType: 'FIELD_NAMES_SHOULD_BE_CAMEL_CASE', + message: 'Field name should be camelCase', + issueLocation: { line: 10, column: 1, endLine: 10, endColumn: 20 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.lint).toBeDefined(); + expect(Array.isArray(output.lint?.errors)).toBe(true); + 
expect(Array.isArray(output.lint?.warnings)).toBe(true); + }); + + test('lint warnings outputs JSON with success status and lint.warnings populated', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + lintWarnings: [ + { + lintRuleType: 'FIELD_NAMES_SHOULD_BE_CAMEL_CASE', + message: 'Consider using camelCase', + issueLocation: { line: 5, column: 1, endLine: 5, endColumn: 10 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.lint).toBeDefined(); + expect(Array.isArray(output.lint?.errors)).toBe(true); + expect(Array.isArray(output.lint?.warnings)).toBe(true); + }); + + test('graph pruning errors outputs JSON with error status and graphPrune errors', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + graphPruneErrors: [ + { + graphPruningRuleType: 'UNUSED_FIELDS', + federatedGraphName: 'my-graph', + fieldPath: 'Query.deprecatedField', + message: 'Field is unused', + issueLocation: { line: 15, column: 1, endLine: 15, endColumn: 30 }, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(Array.isArray(output.graphPrune?.errors)).toBe(true); + expect(Array.isArray(output.graphPrune?.warnings)).toBe(true); + }); + + test('graph pruning warnings outputs JSON with success status', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + graphPruneWarnings: [ + { + graphPruningRuleType: 'UNUSED_FIELDS', + federatedGraphName: 'my-graph', + fieldPath: 'Query.deprecatedField', + message: 'Field might be unused', + issueLocation: { line: 20, column: 1, endLine: 20, endColumn: 25 }, + }, + ], + operationUsageStats: { totalOperations: 0, 
safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.graphPrune).toBeDefined(); + expect(Array.isArray(output.graphPrune?.errors)).toBe(true); + expect(Array.isArray(output.graphPrune?.warnings)).toBe(true); + }); + + test('linked traffic check failure outputs JSON with error status and traffic message', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + isLinkedTrafficCheckFailed: true, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.traffic?.message).toBe('No operations were affected by this schema change.'); + }); + + test('linked pruning check failure outputs JSON with error status and graphPrune defined', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + isLinkedPruningCheckFailed: true, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.graphPrune).toBeDefined(); + }); + + test('extension check error outputs JSON with extensions message', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + checkExtensionErrorMessage: 'Extension validation failed', + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + 
expect(output.status).toBe('error'); + expect(output.extensions?.message).toContain('Extension validation failed'); + }); + + test('row limit exceeded outputs JSON with exceededRowLimit true', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: '', lastSeenAt: '' }, + counts: { + breakingChanges: 0, + nonBreakingChanges: 60, + compositionErrors: 0, + compositionWarnings: 0, + lintErrors: 0, + lintWarnings: 0, + graphPruneErrors: 0, + graphPruneWarnings: 0, + }, + }, + { json: true, limit: 50 }, + ); + + const output = getJsonOutput(logSpy); + expect(output.exceededRowLimit).toBe(true); + expect(output.rowLimit).toBe(50); + }); + + test('skip-traffic-check with no operationUsageStats omits traffic from JSON output', async () => { + // When no operationUsageStats is returned (server omits it when traffic check is skipped), + // the traffic field should be absent from JSON output entirely + await runCheck( + { + response: { code: EnumStatusCode.OK }, + // No changes, no operationUsageStats โ€” clean check with traffic check skipped + }, + { json: true, skipTrafficCheck: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.traffic).toBeUndefined(); + }); + + test('skip-traffic-check with operationUsageStats omits traffic from JSON when no breaking changes', async () => { + // When clientTrafficCheckSkipped: true and no breaking changes, traffic should be absent โ€” + // consistent with stdout which also prints nothing in this case + await runCheck( + { + response: { code: EnumStatusCode.OK }, + nonBreakingChanges: [{ changeType: 'FIELD_ADDED', message: 'Field added', isBreaking: false }], + operationUsageStats: { + totalOperations: 5, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: 
'2024-01-02T00:00:00Z', + }, + clientTrafficCheckSkipped: true, + }, + { json: true, skipTrafficCheck: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('success'); + expect(output.traffic).toBeUndefined(); + }); + + test('ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL outputs JSON with error status and details', async () => { + await runCheck( + { + response: { code: EnumStatusCode.ERR_SCHEMA_MISMATCH_WITH_APPROVED_PROPOSAL }, + proposalMatchMessage: 'Schema does not match approved proposal', + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.message).toContain('Proposal match failed'); + expect(output.details).toBe('Schema does not match approved proposal'); + }); + + test('ERR_INVALID_SUBGRAPH_SCHEMA outputs JSON with error status and details', async () => { + await runCheck( + { + response: { code: EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA, details: 'Syntax error in schema' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.message).toContain('schema could not be built'); + expect(output.details).toBe('Syntax error in schema'); + }); + + test('default error status outputs JSON with error status and details', async () => { + await runCheck( + { + response: { code: EnumStatusCode.ERR, details: 'Internal server error' }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.message).toContain('Failed to perform the check operation'); + expect(output.details).toBe('Internal server error'); + }); + + test('--out writes JSON to file instead of console', async () => { + const tmpFile = join(tmpdir(), `cosmo-check-test-${Date.now()}.json`); + onTestFinished(() => rmSync(tmpFile, { force: true })); + + await runCheck({ response: { code: EnumStatusCode.OK } }, { json: true, outFile: tmpFile }); + + const written = 
JSON.parse(readFileSync(tmpFile, 'utf8')) as JsonCheckSchemaOutputDescriptor; + expect(written.status).toBe('success'); + expect(logSpy).not.toHaveBeenCalledWith(expect.objectContaining({ status: expect.any(String) })); + }); + + test('--out writes pretty-printed JSON to file', async () => { + const tmpFile = join(tmpdir(), `cosmo-check-test-${Date.now()}.json`); + onTestFinished(() => rmSync(tmpFile, { force: true })); + + await runCheck({ response: { code: EnumStatusCode.OK } }, { json: true, outFile: tmpFile }); + + const raw = readFileSync(tmpFile, 'utf8'); + expect(raw).toContain('\n'); + }); + + test('--out with error status writes error JSON to file', async () => { + const tmpFile = join(tmpdir(), `cosmo-check-test-${Date.now()}.json`); + onTestFinished(() => rmSync(tmpFile, { force: true })); + + await runCheck( + { response: { code: EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA, details: 'Syntax error' } }, + { json: true, outFile: tmpFile }, + ); + + const written = JSON.parse(readFileSync(tmpFile, 'utf8')) as JsonCheckSchemaOutputDescriptor; + expect(written.status).toBe('error'); + expect(written.details).toBe('Syntax error'); + }); + + test('composedSchemaBreakingChanges outputs JSON with error status and composedSchemaBreakingChanges array', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_REMOVED', + message: 'Field removed at federated level', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(Array.isArray(output.composedSchemaBreakingChanges)).toBe(true); + expect(output.composedSchemaBreakingChanges).toHaveLength(1); + }); + + test('composedSchemaBreakingChanges combined with subgraph breaking changes outputs combined count in traffic message', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + 
breakingChanges: [{ changeType: 'FIELD_REMOVED', message: 'Field removed', isBreaking: true }], + composedSchemaBreakingChanges: [ + { changeType: 'TYPE_CHANGED', message: 'Type changed', federatedGraphName: 'my-graph', isBreaking: true }, + { + changeType: 'FIELD_TYPE_CHANGED', + message: 'Field type changed', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + ], + operationUsageStats: { + totalOperations: 3, + safeOperations: 0, + firstSeenAt: '2024-01-01T00:00:00Z', + lastSeenAt: '2024-01-02T00:00:00Z', + }, + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.traffic?.message).toContain('Found 3 breaking changes.'); + expect(Array.isArray(output.composedSchemaBreakingChanges)).toBe(true); + expect(output.composedSchemaBreakingChanges).toHaveLength(2); + }); + + test('composedSchemaBreakingChanges without traffic outputs JSON with error status', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_REMOVED', + message: 'Field removed at federated level', + federatedGraphName: 'my-graph', + isBreaking: true, + }, + { + changeType: 'TYPE_REMOVED', + message: 'Type removed at federated level', + federatedGraphName: 'other-graph', + isBreaking: true, + }, + ], + }, + { json: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.composedSchemaBreakingChanges).toHaveLength(2); + expect(output.traffic).toBeUndefined(); + }); + + test('skip-traffic-check with composedSchemaBreakingChanges omits traffic and operationUsageStats from JSON', async () => { + await runCheck( + { + response: { code: EnumStatusCode.OK }, + composedSchemaBreakingChanges: [ + { + changeType: 'FIELD_TYPE_CHANGED', + message: 'Type changed', + federatedGraphName: 'demo-fed', + isBreaking: true, + }, + ], + operationUsageStats: { totalOperations: 0, safeOperations: 0, firstSeenAt: 
'', lastSeenAt: '' }, + clientTrafficCheckSkipped: true, + }, + { json: true, skipTrafficCheck: true }, + ); + + const output = getJsonOutput(logSpy); + expect(output.status).toBe('error'); + expect(output.traffic).toBeUndefined(); + expect(output.operationUsageStats).toBeUndefined(); + }); + + test('--out without --json implicitly enables JSON output', async () => { + const tmpFile = join(tmpdir(), `cosmo-check-test-${Date.now()}.json`); + onTestFinished(() => rmSync(tmpFile, { force: true })); + + await runCheck({ response: { code: EnumStatusCode.OK } }, { outFile: tmpFile }); + + const written = JSON.parse(readFileSync(tmpFile, 'utf8')) as JsonCheckSchemaOutputDescriptor; + expect(written.status).toBe('success'); }); }); diff --git a/cli/test/fetch-schema.test.ts b/cli/test/fetch-schema.test.ts index e5ea1b00fc..eadf69c43e 100644 --- a/cli/test/fetch-schema.test.ts +++ b/cli/test/fetch-schema.test.ts @@ -45,12 +45,9 @@ describe('Fetch schema', () => { const tmp = join(tmpdir(), `router-schema-${Date.now()}.graphql`); try { - const command = await program.parseAsync( - ['fetch-schema', 'mygraph', '-o', tmp], - { - from: 'user', - } - ); + const command = await program.parseAsync(['fetch-schema', 'mygraph', '-o', tmp], { + from: 'user', + }); const content = readFileSync(tmp, 'utf8'); expect(content).toBe(routerSdl); @@ -73,12 +70,9 @@ describe('Fetch schema', () => { const tmp = join(tmpdir(), `client-schema-${Date.now()}.graphql`); try { - const command = await program.parseAsync( - ['fetch-schema', 'mygraph', '-o', tmp, '--client-schema'], - { - from: 'user', - } - ); + const command = await program.parseAsync(['fetch-schema', 'mygraph', '-o', tmp, '--client-schema'], { + from: 'user', + }); const content = readFileSync(tmp, 'utf8'); expect(content).toBe(clientSdl); @@ -86,4 +80,4 @@ describe('Fetch schema', () => { rmSync(tmp); } }); -}); \ No newline at end of file +}); diff --git a/cli/test/fixtures/full-schema.graphql b/cli/test/fixtures/full-schema.graphql 
index d9d7477d64..7be11cc71e 100644 --- a/cli/test/fixtures/full-schema.graphql +++ b/cli/test/fixtures/full-schema.graphql @@ -1,5 +1,22 @@ extend schema -@link(url: "https://specs.apollo.dev/federation/v2.5", import: ["@authenticated", "@composeDirective", "@external", "@extends", "@inaccessible", "@interfaceObject", "@override", "@provides", "@key", "@requires", "@requiresScopes", "@shareable", "@tag"]) + @link( + url: "https://specs.apollo.dev/federation/v2.5" + import: [ + "@authenticated" + "@composeDirective" + "@external" + "@extends" + "@inaccessible" + "@interfaceObject" + "@override" + "@provides" + "@key" + "@requires" + "@requiresScopes" + "@shareable" + "@tag" + ] + ) schema { query: Query @@ -24,8 +41,8 @@ type Mutation { input ProjectInput { name: String! description: String - startDate: String # ISO date - endDate: String # ISO date + startDate: String # ISO date + endDate: String # ISO date status: ProjectStatus! } @@ -33,14 +50,14 @@ type Project @key(fields: "id") { id: ID! name: String! description: String - startDate: String # ISO date - endDate: String # ISO date + startDate: String # ISO date + endDate: String # ISO date status: ProjectStatus! # Federated references: - teamMembers: [Employee!]! - relatedProducts: [Product!]! # from products subgraph + teamMembers: [Employee!]! + relatedProducts: [Product!]! # from products subgraph # Project milestones or checkpoints - milestoneIds: [String!] # Array of milestone identifiers + milestoneIds: [String!] # Array of milestone identifiers } enum ProjectStatus { @@ -51,14 +68,13 @@ enum ProjectStatus { } type Employee @key(fields: "id") { - id: Int! + id: Int! # New field resolved by this subgraph: projects: [Project!] } type Product @key(fields: "upc") { - upc: String! + upc: String! # Projects contributing to this product: projects: [Project!] 
} - diff --git a/cli/test/fixtures/schema-with-nullable-list-items.graphql b/cli/test/fixtures/schema-with-nullable-list-items.graphql index d4cb02e2ec..de59907d1a 100644 --- a/cli/test/fixtures/schema-with-nullable-list-items.graphql +++ b/cli/test/fixtures/schema-with-nullable-list-items.graphql @@ -6,8 +6,8 @@ type Project { id: ID! name: String! description: String - startDate: String # ISO date - endDate: String # ISO date + startDate: String # ISO date + endDate: String # ISO date status: ProjectStatus! tags: [String]! } @@ -15,4 +15,4 @@ type Project { enum ProjectStatus { ACTIVE INACTIVE -} \ No newline at end of file +} diff --git a/cli/test/fixtures/schema-with-validation-errors.graphql b/cli/test/fixtures/schema-with-validation-errors.graphql index 0cd215429d..8ca21de99f 100644 --- a/cli/test/fixtures/schema-with-validation-errors.graphql +++ b/cli/test/fixtures/schema-with-validation-errors.graphql @@ -10,4 +10,4 @@ type Nested { type User @key(fields: "id nested { name }") { id: ID! nested: Nested! -} \ No newline at end of file +} diff --git a/cli/test/fixtures/schema-with-warnings-and-errors.graphql b/cli/test/fixtures/schema-with-warnings-and-errors.graphql index c15faa5cf6..5bfc3a178f 100644 --- a/cli/test/fixtures/schema-with-warnings-and-errors.graphql +++ b/cli/test/fixtures/schema-with-warnings-and-errors.graphql @@ -13,4 +13,4 @@ type Nested { type User @key(fields: "id nested { name }") { id: ID! nested: Nested! 
-} \ No newline at end of file +} diff --git a/cli/test/grpc-service.test.ts b/cli/test/grpc-service.test.ts index 35ae2a41d4..a25e1266e7 100644 --- a/cli/test/grpc-service.test.ts +++ b/cli/test/grpc-service.test.ts @@ -36,19 +36,9 @@ describe('gRPC Generate Command', () => { const schemaPath = resolve(__dirname, 'fixtures', 'full-schema.graphql'); - await program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - schemaPath, - '-o', - tmpDir, - ], - { - from: 'user', - } - ); + await program.parseAsync(['generate', 'testservice', '-i', schemaPath, '-o', tmpDir], { + from: 'user', + }); // Verify the output files exist expect(existsSync(join(tmpDir, 'mapping.json'))).toBe(true); @@ -73,19 +63,9 @@ describe('gRPC Generate Command', () => { const schemaPath = resolve(__dirname, 'fixtures', 'full-schema.graphql'); - await program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - schemaPath, - '-o', - nonExistentDir, - ], - { - from: 'user', - } - ); + await program.parseAsync(['generate', 'testservice', '-i', schemaPath, '-o', nonExistentDir], { + from: 'user', + }); // Verify the output directory and files exist expect(existsSync(nonExistentDir)).toBe(true); @@ -112,23 +92,12 @@ describe('gRPC Generate Command', () => { rmdirSync(tmpDir, { recursive: true }); }); - const nonExistentFile = join(tmpdir(), 'non-existent-schema.graphql'); await expect( - program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - nonExistentFile, - '-o', - tmpDir, - ], - { - from: 'user', - } - ) + program.parseAsync(['generate', 'testservice', '-i', nonExistentFile, '-o', tmpDir], { + from: 'user', + }), ).rejects.toThrow(); }); @@ -151,24 +120,15 @@ describe('gRPC Generate Command', () => { const outputFile = join(tmpDir, 'output.txt'); writeFileSync(outputFile, 'test'); - program.exitOverride(err => { + program.exitOverride((err) => { expect(err.message).toContain(`Output directory ${outputFile} is not a directory`); }); await expect( - program.parseAsync( - [ - 
'generate', - 'testservice', - '-i', - 'test/fixtures/full-schema.graphql', - '-o', - outputFile, - ], - { - from: 'user', - } - )).rejects.toThrow('process.exit unexpectedly called with "1"'); + program.parseAsync(['generate', 'testservice', '-i', 'test/fixtures/full-schema.graphql', '-o', outputFile], { + from: 'user', + }), + ).rejects.toThrow('process.exit unexpectedly called with "1"'); }); test('should generate all files with warnings', async (testContext) => { @@ -189,19 +149,9 @@ describe('gRPC Generate Command', () => { const schemaPath = resolve(__dirname, 'fixtures', 'schema-with-nullable-list-items.graphql'); // Should complete successfully despite warnings - await program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - schemaPath, - '-o', - tmpDir, - ], - { - from: 'user', - } - ); + await program.parseAsync(['generate', 'testservice', '-i', schemaPath, '-o', tmpDir], { + from: 'user', + }); // Verify the output files exist (generation should continue with warnings) expect(existsSync(join(tmpDir, 'mapping.json'))).toBe(true); @@ -228,19 +178,9 @@ describe('gRPC Generate Command', () => { // Should fail due to validation errors await expect( - program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - schemaPath, - '-o', - tmpDir, - ], - { - from: 'user', - } - ) + program.parseAsync(['generate', 'testservice', '-i', schemaPath, '-o', tmpDir], { + from: 'user', + }), ).rejects.toThrow('Schema validation failed'); // Verify no output files were created (generation should stop on errors) @@ -268,19 +208,9 @@ describe('gRPC Generate Command', () => { // Should fail due to validation errors (despite having warnings) await expect( - program.parseAsync( - [ - 'generate', - 'testservice', - '-i', - schemaPath, - '-o', - tmpDir, - ], - { - from: 'user', - } - ) + program.parseAsync(['generate', 'testservice', '-i', schemaPath, '-o', tmpDir], { + from: 'user', + }), ).rejects.toThrow('Schema validation failed'); // Verify no output files were 
created (generation should stop on errors) diff --git a/cli/test/json-check-schema-output-builder.test.ts b/cli/test/json-check-schema-output-builder.test.ts new file mode 100644 index 0000000000..6d83ab433b --- /dev/null +++ b/cli/test/json-check-schema-output-builder.test.ts @@ -0,0 +1,331 @@ +import { readFileSync, rmSync } from 'node:fs'; +import { tmpdir } from 'node:os'; +import { join } from 'node:path'; +import { describe, it, expect, vi, afterEach } from 'vitest'; +import { EnumStatusCode } from '@wundergraph/cosmo-connect/dist/common/common_pb'; +import { + CheckOperationUsageStats, + CompositionError, + FederatedGraphSchemaChange, + GraphPruningIssue, + LintIssue, + SchemaChange, +} from '@wundergraph/cosmo-connect/dist/platform/v1/platform_pb'; +import { + JsonCheckSchemaOutputBuilder, + type JsonCheckSchemaOutputDescriptor, +} from '../src/json-check-schema-output-builder.js'; + +describe('JsonCheckSchemaOutputBuilder', () => { + describe('constructor / build', () => { + it('initialises with error status and given code and rowLimit', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.ERR, 50); + expect(b.build()).toMatchObject<Partial<JsonCheckSchemaOutputDescriptor>>({ + status: 'error', + code: EnumStatusCode.ERR, + rowLimit: 50, + }); + }); + }); + + describe('setStatus', () => { + it('sets status to success when true', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setStatus(true); + expect(b.build().status).toBe('success'); + }); + + it('sets status to error when false', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setStatus(false); + expect(b.build().status).toBe('error'); + }); + }); + + describe('setCode', () => { + it('updates code', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.ERR, 10); + b.setCode(EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA); + expect(b.build().code).toBe(EnumStatusCode.ERR_INVALID_SUBGRAPH_SCHEMA); + }); + }); + + describe('setUrl / setMessage / 
setDetails', () => { + it('sets url, message, and details', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setUrl('https://example.com').setMessage('hello').setDetails('some detail'); + const result = b.build(); + expect(result.url).toBe('https://example.com'); + expect(result.message).toBe('hello'); + expect(result.details).toBe('some detail'); + }); + + it('setDetails with undefined clears details', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setDetails(undefined); + expect(b.build().details).toBeUndefined(); + }); + }); + + describe('proposals', () => { + it('setProposals overwrites existing proposals', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setProposals('first').setProposals('second'); + expect(b.build().proposals).toEqual({ message: 'second' }); + }); + + it('initProposals does not overwrite if already set', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setProposals('original').initProposals('ignored'); + expect(b.build().proposals).toEqual({ message: 'original' }); + }); + + it('initProposals sets value when not set', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.initProposals('new'); + expect(b.build().proposals).toEqual({ message: 'new' }); + }); + }); + + describe('traffic', () => { + it('setTraffic replaces traffic', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setTraffic('ok'); + expect(b.build().traffic).toEqual({ message: 'ok' }); + }); + + it('markTrafficLinkedFailed uses fallback when no prior message', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.markTrafficLinkedFailed('fallback'); + expect(b.build().traffic).toMatchObject({ + message: 'fallback', + }); + }); + + it('markTrafficLinkedFailed preserves prior message', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + 
b.setTraffic('prior').markTrafficLinkedFailed('fallback'); + expect(b.build().traffic?.message).toBe('prior'); + }); + }); + + describe('schema changes', () => { + const change = new SchemaChange({ + changeType: 'FIELD_REMOVED', + message: 'field removed', + path: 'Query.foo', + isBreaking: true, + }); + const nonChange = new SchemaChange({ + changeType: 'FIELD_ADDED', + message: 'field added', + path: 'Query.bar', + isBreaking: false, + }); + + it('addBreakingChanges accumulates changes', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addBreakingChanges([change]).addBreakingChanges([change]); + expect(b.build().changes?.breaking).toHaveLength(2); + expect(b.build().changes?.nonBreaking).toHaveLength(0); + }); + + it('addNonBreakingChanges accumulates changes', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addNonBreakingChanges([nonChange]).addNonBreakingChanges([nonChange]); + expect(b.build().changes?.nonBreaking).toHaveLength(2); + expect(b.build().changes?.breaking).toHaveLength(0); + }); + + it('mixing breaking and non-breaking preserves both', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addBreakingChanges([change]).addNonBreakingChanges([nonChange]); + expect(b.build().changes?.breaking).toHaveLength(1); + expect(b.build().changes?.nonBreaking).toHaveLength(1); + }); + }); + + describe('composition', () => { + const err = new CompositionError({ + message: 'compose error', + federatedGraphName: 'g', + namespace: 'ns', + featureFlag: '', + }); + const warn = new CompositionError({ + message: 'compose warning', + federatedGraphName: 'g', + namespace: 'ns', + featureFlag: '', + }); + + it('addCompositionErrors accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addCompositionErrors([err]).addCompositionErrors([err]); + const comp = b.build().composition!; + expect(comp.errors).toHaveLength(2); + 
expect(comp.warnings).toHaveLength(0); + }); + + it('addCompositionWarnings accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addCompositionWarnings([warn]).addCompositionWarnings([warn]); + const comp = b.build().composition!; + expect(comp.warnings).toHaveLength(2); + expect(comp.errors).toHaveLength(0); + }); + }); + + describe('lint', () => { + const lintErr = new LintIssue({ message: 'lint error', lintRuleType: 'RULE_A' }); + const lintWarn = new LintIssue({ message: 'lint warn', lintRuleType: 'RULE_B' }); + + it('addLintErrors accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addLintErrors([lintErr]).addLintErrors([lintErr]); + const lint = b.build().lint!; + expect(lint.errors).toHaveLength(2); + }); + + it('addLintWarnings accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addLintWarnings([lintWarn]); + const lint = b.build().lint!; + expect(lint.warnings).toHaveLength(1); + }); + }); + + describe('graphPrune', () => { + const pruneErr = new GraphPruningIssue({ + message: 'prune error', + graphPruningRuleType: 'RULE', + federatedGraphName: 'g', + fieldPath: 'f', + }); + const pruneWarn = new GraphPruningIssue({ + message: 'prune warn', + graphPruningRuleType: 'RULE', + federatedGraphName: 'g', + fieldPath: 'f', + }); + + it('addGraphPruneErrors accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addGraphPruneErrors([pruneErr]).addGraphPruneErrors([pruneErr]); + const gp = b.build().graphPrune!; + expect(gp.errors).toHaveLength(2); + expect(gp.warnings).toHaveLength(0); + }); + + it('addGraphPruneWarnings accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addGraphPruneWarnings([pruneWarn]); + const gp = b.build().graphPrune!; + expect(gp.warnings).toHaveLength(1); + }); + + it('markGraphPruneLinkedFailed initializes graphPrune', () => { + 
const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.markGraphPruneLinkedFailed(); + const gp = b.build().graphPrune!; + expect(gp).toBeDefined(); + expect(gp.errors).toHaveLength(0); + expect(gp.warnings).toHaveLength(0); + }); + }); + + describe('composedSchemaBreakingChanges', () => { + const composedChange = new FederatedGraphSchemaChange({ + changeType: 'FIELD_TYPE_CHANGED', + message: "Field 'User.username' changed type from 'String!' to 'String'", + path: 'User.username', + isBreaking: true, + federatedGraphName: 'demo-fed', + }); + + it('addComposedSchemaBreakingChanges initializes and accumulates', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addComposedSchemaBreakingChanges([composedChange]).addComposedSchemaBreakingChanges([composedChange]); + expect(b.build().composedSchemaBreakingChanges).toHaveLength(2); + }); + + it('addComposedSchemaBreakingChanges preserves existing entries when called multiple times', () => { + const second = new FederatedGraphSchemaChange({ ...composedChange, federatedGraphName: 'other-fed' }); + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.addComposedSchemaBreakingChanges([composedChange]).addComposedSchemaBreakingChanges([second]); + const result = b.build().composedSchemaBreakingChanges!; + expect(result).toHaveLength(2); + expect(result[0].federatedGraphName).toBe('demo-fed'); + expect(result[1].federatedGraphName).toBe('other-fed'); + }); + + it('composedSchemaBreakingChanges is absent when never set', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + expect(b.build().composedSchemaBreakingChanges).toBeUndefined(); + }); + }); + + describe('extensions / exceededRowLimit / operationUsageStats', () => { + it('setExtensionError sets message', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setExtensionError('bad extension'); + expect(b.build().extensions).toEqual({ message: 'bad 
extension' }); + }); + + it('setExceededRowLimit stores the flag', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setExceededRowLimit(true); + expect(b.build().exceededRowLimit).toBe(true); + }); + + it('setOperationUsageStats does not overwrite if already set', () => { + const stats1 = new CheckOperationUsageStats({ totalOperations: 5 }); + const stats2 = new CheckOperationUsageStats({ totalOperations: 99 }); + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setOperationUsageStats(stats1).setOperationUsageStats(stats2); + expect(b.build().operationUsageStats?.totalOperations).toBe(5); + }); + }); + + describe('write', () => { + afterEach(() => { + vi.restoreAllMocks(); + }); + + it('logs to console when no outFile', async () => { + const spy = vi.spyOn(console, 'log').mockImplementation(() => {}); + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + b.setStatus(true); + await b.write(); + expect(spy).toHaveBeenCalledWith(b.build()); + }); + + it('writes JSON to file when outFile provided', async () => { + const outFile = join(tmpdir(), `json-output-builder-test-${Date.now()}.json`); + try { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10, outFile); + b.setStatus(true).setMessage('done'); + await b.write(); + const written = JSON.parse(readFileSync(outFile, 'utf8')); + expect(written).toMatchObject({ status: 'success', message: 'done' }); + } finally { + rmSync(outFile, { force: true }); + } + }); + }); + + describe('method chaining', () => { + it('all setters return this for fluent chaining', () => { + const b = new JsonCheckSchemaOutputBuilder(EnumStatusCode.OK, 10); + const result = b + .setUrl('https://x.com') + .setCode(EnumStatusCode.OK) + .setStatus(true) + .setMessage('msg') + .setDetails('det') + .setExceededRowLimit(false) + .addComposedSchemaBreakingChanges([]); + expect(result).toBe(b); + }); + }); +}); diff --git a/cli/test/parse-operations.test.ts 
b/cli/test/parse-operations.test.ts index 5124fe6271..05d5ae43a7 100644 --- a/cli/test/parse-operations.test.ts +++ b/cli/test/parse-operations.test.ts @@ -5,44 +5,49 @@ import { describe, test, expect } from 'vitest'; import { parseOperations } from '../src/commands/operations/commands/push.js'; - describe('parse operations from different formats', () => { - test('parse operations from graphql', () => { - const operation = `query { + test('parse operations from graphql', () => { + const operation = `query { hello }`; - const id = crypto.createHash('sha256').update(operation).digest('hex'); - const operations = parseOperations(operation); - expect(operations).toEqual([{ id, contents: operation }]); - }); - test('returns consistent hash', () => { - const operation = `query Employees {\n employees {\n id\n }\n}`; - const operations = parseOperations(operation); - expect(operations).toEqual([{ id: "33651da3d80e420709520fb900c7ab8ec4151555da56062feeee428cf7f3a5dd", contents: operation }]); - }); - test('parse operations from Apollo', async() => { - const persistedQueries = await fs.readFile(path.join('test', 'testdata', 'persisted-query-manifest.json'), 'utf8'); - const operations = parseOperations(persistedQueries); - expect(operations).toEqual([ - { id: "33651da3d80e420709520fb900c7ab8ec4151555da56062feeee428cf7f3a5dd", contents: "query Employees {\n employees {\n id\n }\n}" }, - ]); - }); - test('parse query map', async() => { - const queryMap = await fs.readFile(path.join('test', 'testdata', 'query-map.json'), 'utf8'); - const operations = parseOperations(queryMap); - expect(operations).toEqual([ - { id: "1", contents: "subscription {\n currentTime {\n unixTime \n }\n}" }, - { id: "2", contents: "query { employee(id:1) { id } }" }, - ]); - }); - test('parse relay persisted', async() => { - const persisted = await fs.readFile(path.join('test', 'testdata', 'relay-persisted.json'), 'utf8'); - const operations = parseOperations(persisted); - const op1 = "query 
DragonsListDragonsQuery {\n spacex_dragons {\n ...Dragons_display_details\n id\n }\n}\n\nfragment Dragons_display_details on spacex_Dragon {\n name\n active\n}\n"; - const op2 = "query DragonsListDragonsQuery {\n spacex_dragons {\n name\n active\n id\n }\n}\n"; - expect(operations).toEqual([ - { id: "c11158afcc8e55409b96972f20e26fa1", contents: op1 }, - { id: "ce2342daed4e1960717c581d645e335d", contents: op2 }, - ]); - }); -}) + const id = crypto.createHash('sha256').update(operation).digest('hex'); + const operations = parseOperations(operation); + expect(operations).toEqual([{ id, contents: operation }]); + }); + test('returns consistent hash', () => { + const operation = `query Employees {\n employees {\n id\n }\n}`; + const operations = parseOperations(operation); + expect(operations).toEqual([ + { id: '33651da3d80e420709520fb900c7ab8ec4151555da56062feeee428cf7f3a5dd', contents: operation }, + ]); + }); + test('parse operations from Apollo', async () => { + const persistedQueries = await fs.readFile(path.join('test', 'testdata', 'persisted-query-manifest.json'), 'utf8'); + const operations = parseOperations(persistedQueries); + expect(operations).toEqual([ + { + id: '33651da3d80e420709520fb900c7ab8ec4151555da56062feeee428cf7f3a5dd', + contents: 'query Employees {\n employees {\n id\n }\n}', + }, + ]); + }); + test('parse query map', async () => { + const queryMap = await fs.readFile(path.join('test', 'testdata', 'query-map.json'), 'utf8'); + const operations = parseOperations(queryMap); + expect(operations).toEqual([ + { id: '1', contents: 'subscription {\n currentTime {\n unixTime \n }\n}' }, + { id: '2', contents: 'query { employee(id:1) { id } }' }, + ]); + }); + test('parse relay persisted', async () => { + const persisted = await fs.readFile(path.join('test', 'testdata', 'relay-persisted.json'), 'utf8'); + const operations = parseOperations(persisted); + const op1 = + 'query DragonsListDragonsQuery {\n spacex_dragons {\n ...Dragons_display_details\n id\n 
}\n}\n\nfragment Dragons_display_details on spacex_Dragon {\n name\n active\n}\n'; + const op2 = 'query DragonsListDragonsQuery {\n spacex_dragons {\n name\n active\n id\n }\n}\n'; + expect(operations).toEqual([ + { id: 'c11158afcc8e55409b96972f20e26fa1', contents: op1 }, + { id: 'ce2342daed4e1960717c581d645e335d', contents: op2 }, + ]); + }); +}); diff --git a/cli/test/testdata/query-map.json b/cli/test/testdata/query-map.json index c605177118..98e1d6df98 100644 --- a/cli/test/testdata/query-map.json +++ b/cli/test/testdata/query-map.json @@ -1,4 +1,4 @@ [ - ["1","subscription {\n currentTime {\n unixTime \n }\n}"], - ["2","query { employee(id:1) { id } }"] + ["1", "subscription {\n currentTime {\n unixTime \n }\n}"], + ["2", "query { employee(id:1) { id } }"] ] diff --git a/cli/tsconfig.json b/cli/tsconfig.json index c34cd0a571..c219558c5e 100644 --- a/cli/tsconfig.json +++ b/cli/tsconfig.json @@ -9,6 +9,6 @@ "typeRoots": ["./node_modules/@types", "./src/typedefinitions"], "rootDir": "." 
}, - "include": ["src/**/*"], - "exclude": ["node_modules", "dist"] + "include": ["src/**/*", "test/**/*", "e2e/**/*"], + "exclude": ["node_modules", "dist", "test/fixtures/**", "test/testdata/**", "coverage/**/*"] } diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 8a2684337b..1f43f30e4f 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -367,9 +367,6 @@ importers: eslint-plugin-require-extensions: specifier: 0.1.3 version: 0.1.3(eslint@8.57.1) - prettier: - specifier: 3.5.2 - version: 3.5.2 tsx: specifier: 4.19.4 version: 4.19.4 @@ -12595,11 +12592,6 @@ packages: engines: {node: '>=14'} hasBin: true - prettier@3.5.2: - resolution: {integrity: sha512-lc6npv5PH7hVqozBR7lkBNOGXV9vMwROAPlumdBkX0wTbbzPu/U1hk5yL8p2pt4Xoc+2mkT8t/sow2YrV/M5qg==} - engines: {node: '>=14'} - hasBin: true - prettier@3.5.3: resolution: {integrity: sha512-QQtaxnoDJeAkDvDKWCLiwIXkTgRhwYDEQCghU9Z6q03iyek/rxRh/2lC3HB7P8sWT2xC/y5JDctPLBIGzHKbhw==} engines: {node: '>=14'} @@ -24693,10 +24685,10 @@ snapshots: dependencies: eslint: 8.57.1 - eslint-config-standard@17.1.0(eslint-plugin-import@2.27.5)(eslint-plugin-n@16.6.2(eslint@8.57.1))(eslint-plugin-promise@6.6.0(eslint@8.57.1))(eslint@8.57.1): + eslint-config-standard@17.1.0(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1))(eslint-plugin-n@16.6.2(eslint@8.57.1))(eslint-plugin-promise@6.6.0(eslint@8.57.1))(eslint@8.57.1): dependencies: eslint: 8.57.1 - eslint-plugin-import: 2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1) + eslint-plugin-import: 
2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) eslint-plugin-n: 16.6.2(eslint@8.57.1) eslint-plugin-promise: 6.6.0(eslint@8.57.1) @@ -24706,9 +24698,9 @@ snapshots: '@typescript-eslint/parser': 5.62.0(eslint@8.57.1)(typescript@5.5.2) eslint: 8.57.1 eslint-config-prettier: 8.10.0(eslint@8.57.1) - eslint-config-standard: 17.1.0(eslint-plugin-import@2.27.5)(eslint-plugin-n@16.6.2(eslint@8.57.1))(eslint-plugin-promise@6.6.0(eslint@8.57.1))(eslint@8.57.1) - eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5)(eslint@8.57.1) - eslint-plugin-import: 2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1) + eslint-config-standard: 17.1.0(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1))(eslint-plugin-n@16.6.2(eslint@8.57.1))(eslint-plugin-promise@6.6.0(eslint@8.57.1))(eslint@8.57.1) + eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1) + eslint-plugin-import: 
2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) eslint-plugin-n: 16.6.2(eslint@8.57.1) eslint-plugin-node: 11.1.0(eslint@8.57.1) eslint-plugin-promise: 6.6.0(eslint@8.57.1) @@ -24745,13 +24737,13 @@ snapshots: - eslint-import-resolver-webpack - supports-color - eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5)(eslint@8.57.1): + eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1): dependencies: debug: 4.3.7 enhanced-resolve: 5.15.0 eslint: 8.57.1 - eslint-module-utils: 2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1) - eslint-plugin-import: 2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1) + eslint-module-utils: 2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) + eslint-plugin-import: 2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) get-tsconfig: 4.7.2 globby: 13.2.2 is-core-module: 
2.12.1 @@ -24763,6 +24755,17 @@ snapshots: - eslint-import-resolver-webpack - supports-color + eslint-module-utils@2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-node@0.3.7)(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1): + dependencies: + debug: 4.3.7 + optionalDependencies: + '@typescript-eslint/parser': 5.62.0(eslint@8.57.1)(typescript@5.5.2) + eslint: 8.57.1 + eslint-import-resolver-node: 0.3.7 + eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1) + transitivePeerDependencies: + - supports-color + eslint-module-utils@2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-node@0.3.7)(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1): dependencies: debug: 4.3.7 @@ -24770,17 +24773,17 @@ snapshots: '@typescript-eslint/parser': 5.62.0(eslint@8.57.1)(typescript@5.5.2) eslint: 8.57.1 eslint-import-resolver-node: 0.3.7 - eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5)(eslint@8.57.1) + eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-node@0.3.7)(eslint-plugin-import@2.27.5(eslint@8.57.1))(eslint@8.57.1) transitivePeerDependencies: - supports-color - eslint-module-utils@2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1): + 
eslint-module-utils@2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1): dependencies: debug: 4.3.7 optionalDependencies: '@typescript-eslint/parser': 5.62.0(eslint@8.57.1)(typescript@5.5.2) eslint: 8.57.1 - eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5)(eslint@8.57.1) + eslint-import-resolver-typescript: 3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1) transitivePeerDependencies: - supports-color @@ -24797,7 +24800,7 @@ snapshots: eslint-utils: 2.1.0 regexpp: 3.2.0 - eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1): + eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1): dependencies: array-includes: 3.1.6 array.prototype.flat: 1.3.1 @@ -24806,7 +24809,7 @@ snapshots: doctrine: 2.1.0 eslint: 8.57.1 eslint-import-resolver-node: 0.3.7 - eslint-module-utils: 2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-node@0.3.7)(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1) + eslint-module-utils: 
2.8.0(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-node@0.3.7)(eslint-import-resolver-typescript@3.5.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1))(eslint@8.57.1))(eslint@8.57.1) has: 1.0.3 is-core-module: 2.12.1 is-glob: 4.0.3 @@ -24822,7 +24825,7 @@ snapshots: - eslint-import-resolver-webpack - supports-color - eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint@8.57.1): + eslint-plugin-import@2.27.5(@typescript-eslint/parser@5.62.0(eslint@8.57.1)(typescript@5.5.2))(eslint-import-resolver-typescript@3.5.5)(eslint@8.57.1): dependencies: array-includes: 3.1.6 array.prototype.flat: 1.3.1 @@ -28202,8 +28205,6 @@ snapshots: prettier@3.2.5: {} - prettier@3.5.2: {} - prettier@3.5.3: {} prettier@3.6.2: {}