From aae4ecade92ce6fb4d9850acf9bca76010c38b8a Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 00:24:47 +0000
Subject: [PATCH 1/9] Initial plan
From 83a0c906d4db0465ec78748070fc0a5db2155e08 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 00:35:28 +0000
Subject: [PATCH 2/9] Add --lint feature: Linter.cs, Context.cs, Program.cs,
Validation.cs, tests, and requirements
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/0fc31532-46f1-4661-a8c4-59d2c67d10b6
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
docs/reqstream/unit-linter.yaml | 131 ++++
docs/reqstream/unit-program.yaml | 13 +
requirements.yaml | 1 +
src/DemaConsulting.ReqStream/Context.cs | 11 +
src/DemaConsulting.ReqStream/Linter.cs | 584 ++++++++++++++++++
src/DemaConsulting.ReqStream/Program.cs | 10 +-
src/DemaConsulting.ReqStream/Validation.cs | 93 +++
.../LinterTests.cs | 568 +++++++++++++++++
.../ProgramTests.cs | 5 +-
9 files changed, 1413 insertions(+), 3 deletions(-)
create mode 100644 docs/reqstream/unit-linter.yaml
create mode 100644 src/DemaConsulting.ReqStream/Linter.cs
create mode 100644 test/DemaConsulting.ReqStream.Tests/LinterTests.cs
diff --git a/docs/reqstream/unit-linter.yaml b/docs/reqstream/unit-linter.yaml
new file mode 100644
index 0000000..b5e8967
--- /dev/null
+++ b/docs/reqstream/unit-linter.yaml
@@ -0,0 +1,131 @@
+---
+sections:
+  - title: ReqStream Requirements
+    sections:
+      - title: Linting
+        requirements:
+          - id: ReqStream-Lint-MalformedYaml
+            title: The linter shall report an error when a requirements file contains malformed YAML.
+            justification: |
+              Malformed YAML files cannot be parsed and prevent requirements from being loaded.
+              Reporting the error with a file location helps users identify and fix the syntax problem.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithMalformedYaml_ReportsError
+
+          - id: ReqStream-Lint-UnknownDocumentField
+            title: The linter shall report an error when the requirements document contains an unknown field.
+            justification: |
+              Unknown fields may indicate typos or outdated configuration that would otherwise be silently
+              ignored, leading to unexpected behaviour.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithUnknownDocumentField_ReportsError
+
+          - id: ReqStream-Lint-UnknownSectionField
+            title: The linter shall report an error when a section contains an unknown field.
+            justification: |
+              Unknown fields in sections may indicate typos or unsupported configuration that would otherwise
+              be silently ignored.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithUnknownSectionField_ReportsError
+
+          - id: ReqStream-Lint-MissingSectionTitle
+            title: The linter shall report an error when a section is missing the required title field.
+            justification: |
+              The title field is mandatory for all sections. Missing it prevents the section from being
+              correctly identified in reports and trace matrices.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithSectionMissingTitle_ReportsError
+              - Linter_Lint_WithBlankSectionTitle_ReportsError
+
+          - id: ReqStream-Lint-UnknownRequirementField
+            title: The linter shall report an error when a requirement contains an unknown field.
+            justification: |
+              Unknown fields in requirements may indicate typos or unsupported attributes that would otherwise
+              be silently ignored.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithUnknownRequirementField_ReportsError
+              - Linter_Lint_WithNestedSectionIssues_ReportsError
+
+          - id: ReqStream-Lint-MissingRequirementFields
+            title: The linter shall report an error when a requirement is missing the required id or title field.
+            justification: |
+              The id and title fields are mandatory for all requirements. Missing them causes downstream
+              processing failures and produces incomplete reports.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithRequirementMissingId_ReportsError
+              - Linter_Lint_WithRequirementMissingTitle_ReportsError
+
+          - id: ReqStream-Lint-DuplicateIds
+            title: The linter shall report an error when duplicate requirement IDs are found.
+            justification: |
+              Duplicate requirement IDs cause ambiguity in trace matrices and reports, and can lead to
+              incorrect coverage calculations.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithDuplicateIds_ReportsError
+              - Linter_Lint_WithDuplicateIdsAcrossFiles_ReportsError
+
+          - id: ReqStream-Lint-MultipleIssues
+            title: The linter shall report all issues found rather than stopping at the first issue.
+            justification: |
+              Reporting all issues at once allows users to fix all problems in a single pass, which is more
+              efficient than fixing one issue at a time and re-running.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithMultipleIssues_ReportsAllIssues
+
+          - id: ReqStream-Lint-FollowsIncludes
+            title: The linter shall follow include directives and lint all included files.
+            justification: |
+              Requirements files can include other files. Linting must follow these includes to ensure all
+              referenced files are also checked for issues.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithIncludes_LintesIncludedFiles
+
+          - id: ReqStream-Lint-NoIssuesMessage
+            title: The linter shall print a no-issues message when no problems are found.
+            justification: |
+              Printing a success message when no issues are found provides positive confirmation to the user
+              that the requirements files passed all lint checks.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithValidFile_ReportsNoIssues
+              - Linter_Lint_WithEmptyFile_ReportsNoIssues
+
+          - id: ReqStream-Lint-ErrorFormat
+            title: "The linter shall format errors as \"[location]: [severity]: [description]\"."
+            justification: |
+              Standard lint output format with file path and location information allows editors and CI tools
+              to parse the output and navigate directly to the reported issue.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_ErrorFormat_IncludesFileAndLocation
+
+          - id: ReqStream-Lint-UnknownMappingField
+            title: The linter shall report an error when a test mapping contains an unknown field.
+            justification: |
+              Unknown fields in test mappings may indicate typos or unsupported configuration that would
+              otherwise be silently ignored.
+            tags:
+              - lint
+            tests:
+              - Linter_Lint_WithUnknownMappingField_ReportsError
+              - Linter_Lint_WithMappingMissingId_ReportsError
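The ReqStream-Lint-ErrorFormat requirement above fixes the output shape to `[location]: [severity]: [description]`, with locations written as `path(line,column)` elsewhere in this patch. As a sketch of why that shape is machine-friendly, here is a hypothetical consumer-side parser; the regex, the sample line, and the group names are illustrative assumptions, not part of the patch:

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical parser for lint output lines of the form:
//   path(line,col): severity: description
// The pattern below is an assumption based on the requirement text above.
var pattern = new Regex(@"^(?<file>[^(]+)\((?<line>\d+),(?<col>\d+)\): (?<sev>\w+): (?<desc>.+)$");

// Illustrative sample line in the documented format
var sample = "docs/reqstream/unit-linter.yaml(12,5): error: Unknown field 'titel' in requirement";
var match = pattern.Match(sample);

if (match.Success)
{
    // A CI tool could navigate straight to file and line from these groups
    Console.WriteLine(
        $"{match.Groups["file"].Value} line {match.Groups["line"].Value}: {match.Groups["desc"].Value}");
}
```

An editor or CI integration would apply the same pattern to each line of the linter's error stream.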
diff --git a/docs/reqstream/unit-program.yaml b/docs/reqstream/unit-program.yaml
index 1fbd101..9a225a0 100644
--- a/docs/reqstream/unit-program.yaml
+++ b/docs/reqstream/unit-program.yaml
@@ -60,3 +60,16 @@ sections:
- Program_Run_WithEnforcementAndUnsatisfiedRequirements_Fails
- Program_Run_WithEnforcementAndNoTests_Fails
- Program_Run_WithEnforcementAndFailedTests_Fails
+
+ - id: ReqStream-Prog-Lint
+ title: The tool shall lint requirements files and report all issues when the lint flag is provided.
+ justification: |
+ Linting requirements files allows users to identify structural problems such as malformed YAML,
+ missing mandatory fields, unknown fields, and duplicate requirement IDs before processing,
+ improving the quality and reliability of requirement definitions.
+ tags:
+ - cli
+ - requirements
+ tests:
+ - Linter_ProgramRun_WithLintFlag_RunsLinter
+ - ReqStream_Lint
diff --git a/requirements.yaml b/requirements.yaml
index 4200c05..5141d24 100644
--- a/requirements.yaml
+++ b/requirements.yaml
@@ -5,6 +5,7 @@ includes:
- docs/reqstream/unit-validation.yaml
- docs/reqstream/unit-requirements.yaml
- docs/reqstream/unit-trace-matrix.yaml
+ - docs/reqstream/unit-linter.yaml
- docs/reqstream/platform-requirements.yaml
- docs/reqstream/ots-mstest.yaml
- docs/reqstream/ots-reqstream.yaml
diff --git a/src/DemaConsulting.ReqStream/Context.cs b/src/DemaConsulting.ReqStream/Context.cs
index a3551c2..0d17ae2 100644
--- a/src/DemaConsulting.ReqStream/Context.cs
+++ b/src/DemaConsulting.ReqStream/Context.cs
@@ -58,6 +58,11 @@ public sealed class Context : IDisposable
/// </summary>
public bool Validate { get; private init; }
+ /// <summary>
+ /// Gets a value indicating whether the lint flag was specified.
+ /// </summary>
+ public bool Lint { get; private init; }
+
/// <summary>
/// Gets the validation results output file path.
/// </summary>
@@ -142,6 +147,7 @@ public static Context Create(string[] args)
var help = false;
var silent = false;
var validate = false;
+ var lint = false;
var enforce = false;
// Initialize collection variables
@@ -187,6 +193,10 @@ public static Context Create(string[] args)
validate = true;
break;
+ case "--lint":
+ lint = true;
+ break;
+
case "--results":
// Ensure argument has a value
if (i >= args.Length)
@@ -335,6 +345,7 @@ public static Context Create(string[] args)
Help = help,
Silent = silent,
Validate = validate,
+ Lint = lint,
ResultsFile = resultsFile,
Enforce = enforce,
FilterTags = filterTags,
diff --git a/src/DemaConsulting.ReqStream/Linter.cs b/src/DemaConsulting.ReqStream/Linter.cs
new file mode 100644
index 0000000..e9fc723
--- /dev/null
+++ b/src/DemaConsulting.ReqStream/Linter.cs
@@ -0,0 +1,584 @@
+// Copyright (c) 2026 DEMA Consulting
+//
+// Permission is hereby granted, free of charge, to any person obtaining a copy
+// of this software and associated documentation files (the "Software"), to deal
+// in the Software without restriction, including without limitation the rights
+// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+// copies of the Software, and to permit persons to whom the Software is
+// furnished to do so, subject to the following conditions:
+//
+// The above copyright notice and this permission notice shall be included in all
+// copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+// SOFTWARE.
+
+using YamlDotNet.Core;
+using YamlDotNet.RepresentationModel;
+
+namespace DemaConsulting.ReqStream;
+
+/// <summary>
+/// Provides linting functionality for ReqStream requirement YAML files.
+/// </summary>
+public static class Linter
+{
+ /// <summary>
+ /// Known fields at the document root level.
+ /// </summary>
+ private static readonly HashSet<string> KnownDocumentFields =
+ new(StringComparer.Ordinal) { "sections", "mappings", "includes" };
+
+ /// <summary>
+ /// Known fields within a section.
+ /// </summary>
+ private static readonly HashSet<string> KnownSectionFields =
+ new(StringComparer.Ordinal) { "title", "requirements", "sections" };
+
+ /// <summary>
+ /// Known fields within a requirement.
+ /// </summary>
+ private static readonly HashSet<string> KnownRequirementFields =
+ new(StringComparer.Ordinal) { "id", "title", "justification", "tests", "children", "tags" };
+
+ /// <summary>
+ /// Known fields within a mapping.
+ /// </summary>
+ private static readonly HashSet<string> KnownMappingFields =
+ new(StringComparer.Ordinal) { "id", "tests" };
+
+ /// <summary>
+ /// Lints a list of requirement files and reports all issues found.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="files">The list of requirement files to lint.</param>
+ public static void Lint(Context context, IReadOnlyList<string> files)
+ {
+ // Validate input
+ ArgumentNullException.ThrowIfNull(context);
+ ArgumentNullException.ThrowIfNull(files);
+
+ // No files to lint
+ if (files.Count == 0)
+ {
+ context.WriteLine("No requirements files specified.");
+ return;
+ }
+
+ // Track duplicate requirement IDs across all linted files: ID -> source file path
+ var seenIds = new Dictionary<string, string>(StringComparer.Ordinal);
+
+ // Track all visited files to avoid linting the same file twice (following includes)
+ var visitedFiles = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
+
+ // Count total issues
+ var issueCount = 0;
+
+ // Lint each file, following includes
+ foreach (var file in files)
+ {
+ issueCount += LintFile(context, file, seenIds, visitedFiles);
+ }
+
+ // If no issues found, print success message using first file as root
+ if (issueCount == 0)
+ {
+ context.WriteLine($"{files[0]}: No issues found");
+ }
+ }
+
+ /// <summary>
+ /// Lints a single requirements file, following includes.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The path to the file to lint.</param>
+ /// <param name="seenIds">Dictionary of requirement IDs already seen and the file they came from.</param>
+ /// <param name="visitedFiles">Set of files already visited to avoid re-linting.</param>
+ /// <returns>The number of issues found in this file and its includes.</returns>
+ private static int LintFile(
+ Context context,
+ string path,
+ Dictionary<string, string> seenIds,
+ HashSet<string> visitedFiles)
+ {
+ // Resolve to full path to detect duplicate includes
+ string fullPath;
+ try
+ {
+ fullPath = Path.GetFullPath(path);
+ }
+ catch (Exception ex)
+ {
+ context.WriteError($"{path}: error: Invalid file path: {ex.Message}");
+ return 1;
+ }
+
+ // Skip already-visited files
+ if (!visitedFiles.Add(fullPath))
+ {
+ return 0;
+ }
+
+ var issueCount = 0;
+
+ // Verify the file exists
+ if (!File.Exists(fullPath))
+ {
+ context.WriteError($"{path}: error: File not found");
+ return 1;
+ }
+
+ // Read the file content
+ string yaml;
+ try
+ {
+ yaml = File.ReadAllText(fullPath);
+ }
+ catch (Exception ex)
+ {
+ context.WriteError($"{path}: error: Failed to read file: {ex.Message}");
+ return 1;
+ }
+
+ // Parse the YAML into a node tree
+ YamlMappingNode? root;
+ try
+ {
+ root = ParseYaml(yaml);
+ }
+ catch (Exception ex)
+ {
+ // YamlDotNet may throw YamlException or InvalidOperationException for malformed YAML
+ var location = ex is YamlException yamlEx
+ ? $"{path}({yamlEx.Start.Line},{yamlEx.Start.Column})"
+ : path;
+ context.WriteError($"{location}: error: Malformed YAML: {ex.Message}");
+ return 1;
+ }
+
+ // Empty documents are valid
+ if (root == null)
+ {
+ return 0;
+ }
+
+ // Lint document root fields
+ issueCount += LintDocumentRoot(context, path, root, seenIds);
+
+ // Follow includes
+ var includes = GetStringList(root, "includes");
+ if (includes != null)
+ {
+ var baseDirectory = Path.GetDirectoryName(fullPath) ?? string.Empty;
+ foreach (var include in includes)
+ {
+ if (string.IsNullOrWhiteSpace(include))
+ {
+ continue;
+ }
+
+ var includePath = Path.Combine(baseDirectory, include);
+ issueCount += LintFile(context, includePath, seenIds, visitedFiles);
+ }
+ }
+
+ return issueCount;
+ }
+
+ /// <summary>
+ /// Parses YAML text into a mapping node, or returns null for empty documents.
+ /// </summary>
+ /// <param name="yaml">The YAML text to parse.</param>
+ /// <returns>The root mapping node, or null if the document is empty.</returns>
+ /// <exception cref="YamlException">Thrown when the YAML is malformed.</exception>
+ private static YamlMappingNode? ParseYaml(string yaml)
+ {
+ var stream = new YamlStream();
+ using var reader = new StringReader(yaml);
+ stream.Load(reader);
+
+ if (stream.Documents.Count == 0)
+ {
+ return null;
+ }
+
+ var rootNode = stream.Documents[0].RootNode;
+ return rootNode as YamlMappingNode;
+ }
+
+ /// <summary>
+ /// Lints the document root mapping node.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The file path for error messages.</param>
+ /// <param name="root">The root mapping node.</param>
+ /// <param name="seenIds">Dictionary of requirement IDs already seen.</param>
+ /// <returns>The number of issues found.</returns>
+ private static int LintDocumentRoot(
+ Context context,
+ string path,
+ YamlMappingNode root,
+ Dictionary<string, string> seenIds)
+ {
+ var issueCount = 0;
+
+ // Check for unknown fields at document root
+ foreach (var key in root.Children.Keys.OfType<YamlScalarNode>())
+ {
+ var keyValue = key.Value ?? string.Empty;
+ if (!KnownDocumentFields.Contains(keyValue))
+ {
+ context.WriteError(
+ $"{path}({key.Start.Line},{key.Start.Column}): error: Unknown field '{keyValue}'");
+ issueCount++;
+ }
+ }
+
+ // Lint sections
+ var sections = GetSequence(root, "sections");
+ if (sections != null)
+ {
+ foreach (var sectionNode in sections.Children)
+ {
+ if (sectionNode is YamlMappingNode sectionMapping)
+ {
+ issueCount += LintSection(context, path, sectionMapping, seenIds);
+ }
+ else
+ {
+ context.WriteError(
+ $"{path}({sectionNode.Start.Line},{sectionNode.Start.Column}): error: Section must be a mapping");
+ issueCount++;
+ }
+ }
+ }
+
+ // Lint mappings
+ var mappings = GetSequence(root, "mappings");
+ if (mappings != null)
+ {
+ foreach (var mappingNode in mappings.Children)
+ {
+ if (mappingNode is YamlMappingNode mappingMapping)
+ {
+ issueCount += LintMapping(context, path, mappingMapping);
+ }
+ else
+ {
+ context.WriteError(
+ $"{path}({mappingNode.Start.Line},{mappingNode.Start.Column}): error: Mapping must be a mapping node");
+ issueCount++;
+ }
+ }
+ }
+
+ return issueCount;
+ }
+
+ /// <summary>
+ /// Lints a section mapping node.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The file path for error messages.</param>
+ /// <param name="section">The section mapping node.</param>
+ /// <param name="seenIds">Dictionary of requirement IDs already seen.</param>
+ /// <returns>The number of issues found.</returns>
+ private static int LintSection(
+ Context context,
+ string path,
+ YamlMappingNode section,
+ Dictionary<string, string> seenIds)
+ {
+ var issueCount = 0;
+
+ // Check for unknown fields in section
+ foreach (var key in section.Children.Keys.OfType<YamlScalarNode>())
+ {
+ var keyValue = key.Value ?? string.Empty;
+ if (!KnownSectionFields.Contains(keyValue))
+ {
+ context.WriteError(
+ $"{path}({key.Start.Line},{key.Start.Column}): error: Unknown field '{keyValue}' in section");
+ issueCount++;
+ }
+ }
+
+ // Check required 'title' field
+ var titleNode = GetScalar(section, "title");
+ if (titleNode == null)
+ {
+ context.WriteError(
+ $"{path}({section.Start.Line},{section.Start.Column}): error: Section missing required field 'title'");
+ issueCount++;
+ }
+ else if (string.IsNullOrWhiteSpace(titleNode.Value))
+ {
+ context.WriteError(
+ $"{path}({titleNode.Start.Line},{titleNode.Start.Column}): error: Section 'title' cannot be blank");
+ issueCount++;
+ }
+
+ // Lint requirements
+ var requirements = GetSequence(section, "requirements");
+ if (requirements != null)
+ {
+ foreach (var reqNode in requirements.Children)
+ {
+ if (reqNode is YamlMappingNode reqMapping)
+ {
+ issueCount += LintRequirement(context, path, reqMapping, seenIds);
+ }
+ else
+ {
+ context.WriteError(
+ $"{path}({reqNode.Start.Line},{reqNode.Start.Column}): error: Requirement must be a mapping");
+ issueCount++;
+ }
+ }
+ }
+
+ // Lint child sections
+ var sections = GetSequence(section, "sections");
+ if (sections != null)
+ {
+ foreach (var childNode in sections.Children)
+ {
+ if (childNode is YamlMappingNode childMapping)
+ {
+ issueCount += LintSection(context, path, childMapping, seenIds);
+ }
+ else
+ {
+ context.WriteError(
+ $"{path}({childNode.Start.Line},{childNode.Start.Column}): error: Section must be a mapping");
+ issueCount++;
+ }
+ }
+ }
+
+ return issueCount;
+ }
+
+ /// <summary>
+ /// Lints a requirement mapping node.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The file path for error messages.</param>
+ /// <param name="requirement">The requirement mapping node.</param>
+ /// <param name="seenIds">Dictionary of requirement IDs already seen.</param>
+ /// <returns>The number of issues found.</returns>
+ private static int LintRequirement(
+ Context context,
+ string path,
+ YamlMappingNode requirement,
+ Dictionary<string, string> seenIds)
+ {
+ var issueCount = 0;
+
+ // Check for unknown fields in requirement
+ foreach (var key in requirement.Children.Keys.OfType<YamlScalarNode>())
+ {
+ var keyValue = key.Value ?? string.Empty;
+ if (!KnownRequirementFields.Contains(keyValue))
+ {
+ context.WriteError(
+ $"{path}({key.Start.Line},{key.Start.Column}): error: Unknown field '{keyValue}' in requirement");
+ issueCount++;
+ }
+ }
+
+ // Check required 'id' field
+ var idNode = GetScalar(requirement, "id");
+ string? reqId = null;
+ if (idNode == null)
+ {
+ context.WriteError(
+ $"{path}({requirement.Start.Line},{requirement.Start.Column}): error: Requirement missing required field 'id'");
+ issueCount++;
+ }
+ else if (string.IsNullOrWhiteSpace(idNode.Value))
+ {
+ context.WriteError(
+ $"{path}({idNode.Start.Line},{idNode.Start.Column}): error: Requirement 'id' cannot be blank");
+ issueCount++;
+ }
+ else
+ {
+ reqId = idNode.Value;
+
+ // Check for duplicate IDs
+ if (seenIds.TryGetValue(reqId, out var firstFile))
+ {
+ context.WriteError(
+ $"{path}({idNode.Start.Line},{idNode.Start.Column}): error: Duplicate requirement ID '{reqId}' (first seen in {firstFile})");
+ issueCount++;
+ }
+ else
+ {
+ seenIds[reqId] = path;
+ }
+ }
+
+ // Check required 'title' field
+ var titleNode = GetScalar(requirement, "title");
+ if (titleNode == null)
+ {
+ var location = reqId != null ? $"requirement '{reqId}'" : "requirement";
+ context.WriteError(
+ $"{path}({requirement.Start.Line},{requirement.Start.Column}): error: {location} missing required field 'title'");
+ issueCount++;
+ }
+ else if (string.IsNullOrWhiteSpace(titleNode.Value))
+ {
+ context.WriteError(
+ $"{path}({titleNode.Start.Line},{titleNode.Start.Column}): error: Requirement 'title' cannot be blank");
+ issueCount++;
+ }
+
+ // Check tests list for blank entries
+ var tests = GetSequence(requirement, "tests");
+ if (tests != null)
+ {
+ foreach (var testNode in tests.Children)
+ {
+ if (testNode is YamlScalarNode testScalar && string.IsNullOrWhiteSpace(testScalar.Value))
+ {
+ context.WriteError(
+ $"{path}({testNode.Start.Line},{testNode.Start.Column}): error: Test name cannot be blank");
+ issueCount++;
+ }
+ }
+ }
+
+ // Check tags list for blank entries
+ var tags = GetSequence(requirement, "tags");
+ if (tags != null)
+ {
+ foreach (var tagNode in tags.Children)
+ {
+ if (tagNode is YamlScalarNode tagScalar && string.IsNullOrWhiteSpace(tagScalar.Value))
+ {
+ context.WriteError(
+ $"{path}({tagNode.Start.Line},{tagNode.Start.Column}): error: Tag name cannot be blank");
+ issueCount++;
+ }
+ }
+ }
+
+ return issueCount;
+ }
+
+ /// <summary>
+ /// Lints a test mapping node.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The file path for error messages.</param>
+ /// <param name="mapping">The mapping node to lint.</param>
+ /// <returns>The number of issues found.</returns>
+ private static int LintMapping(
+ Context context,
+ string path,
+ YamlMappingNode mapping)
+ {
+ var issueCount = 0;
+
+ // Check for unknown fields in mapping
+ foreach (var key in mapping.Children.Keys.OfType<YamlScalarNode>())
+ {
+ var keyValue = key.Value ?? string.Empty;
+ if (!KnownMappingFields.Contains(keyValue))
+ {
+ context.WriteError(
+ $"{path}({key.Start.Line},{key.Start.Column}): error: Unknown field '{keyValue}' in mapping");
+ issueCount++;
+ }
+ }
+
+ // Check required 'id' field
+ var idNode = GetScalar(mapping, "id");
+ if (idNode == null)
+ {
+ context.WriteError(
+ $"{path}({mapping.Start.Line},{mapping.Start.Column}): error: Mapping missing required field 'id'");
+ issueCount++;
+ }
+ else if (string.IsNullOrWhiteSpace(idNode.Value))
+ {
+ context.WriteError(
+ $"{path}({idNode.Start.Line},{idNode.Start.Column}): error: Mapping 'id' cannot be blank");
+ issueCount++;
+ }
+
+ // Check tests list for blank entries
+ var tests = GetSequence(mapping, "tests");
+ if (tests != null)
+ {
+ foreach (var testNode in tests.Children)
+ {
+ if (testNode is YamlScalarNode testScalar && string.IsNullOrWhiteSpace(testScalar.Value))
+ {
+ context.WriteError(
+ $"{path}({testNode.Start.Line},{testNode.Start.Column}): error: Test name cannot be blank in mapping");
+ issueCount++;
+ }
+ }
+ }
+
+ return issueCount;
+ }
+
+ /// <summary>
+ /// Gets a scalar node value from a mapping node by key.
+ /// </summary>
+ /// <param name="mapping">The mapping node to search.</param>
+ /// <param name="key">The key to look up.</param>
+ /// <returns>The scalar node, or null if not found or not a scalar.</returns>
+ private static YamlScalarNode? GetScalar(YamlMappingNode mapping, string key)
+ {
+ var keyNode = new YamlScalarNode(key);
+ return mapping.Children.TryGetValue(keyNode, out var value) ? value as YamlScalarNode : null;
+ }
+
+ /// <summary>
+ /// Gets a sequence node from a mapping node by key.
+ /// </summary>
+ /// <param name="mapping">The mapping node to search.</param>
+ /// <param name="key">The key to look up.</param>
+ /// <returns>The sequence node, or null if not found or not a sequence.</returns>
+ private static YamlSequenceNode? GetSequence(YamlMappingNode mapping, string key)
+ {
+ var keyNode = new YamlScalarNode(key);
+ return mapping.Children.TryGetValue(keyNode, out var value) ? value as YamlSequenceNode : null;
+ }
+
+ /// <summary>
+ /// Gets a list of string values from a sequence node within a mapping.
+ /// </summary>
+ /// <param name="mapping">The mapping node to search.</param>
+ /// <param name="key">The key to look up.</param>
+ /// <returns>A list of string values, or null if the key is not found.</returns>
+ private static List<string>? GetStringList(YamlMappingNode mapping, string key)
+ {
+ var sequence = GetSequence(mapping, key);
+ if (sequence == null)
+ {
+ return null;
+ }
+
+ var result = new List<string>();
+ foreach (var node in sequence.Children)
+ {
+ if (node is YamlScalarNode scalar && scalar.Value != null)
+ {
+ result.Add(scalar.Value);
+ }
+ }
+
+ return result;
+ }
+}
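The cross-file duplicate-ID check in LintRequirement above reduces to a first-writer-wins dictionary keyed by requirement ID. A standalone sketch of just that logic, with illustrative file and ID names:

```csharp
using System;
using System.Collections.Generic;

// First-writer-wins duplicate detection, mirroring the seenIds dictionary
// in LintRequirement: each ID maps to the file where it was first seen.
var seenIds = new Dictionary<string, string>(StringComparer.Ordinal);
var issues = new List<string>();

void Record(string id, string file)
{
    if (seenIds.TryGetValue(id, out var firstFile))
    {
        // Report the duplicate, naming the file where the ID first appeared
        issues.Add($"{file}: error: Duplicate requirement ID '{id}' (first seen in {firstFile})");
    }
    else
    {
        seenIds[id] = file;
    }
}

Record("LNT-001", "a.yaml");
Record("LNT-002", "a.yaml");
Record("LNT-001", "b.yaml"); // duplicate across files

Console.WriteLine(issues.Count); // prints 1
```

Note the ordinal comparer: requirement IDs are case-sensitive, matching the `StringComparer.Ordinal` choice in the linter itself.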
diff --git a/src/DemaConsulting.ReqStream/Program.cs b/src/DemaConsulting.ReqStream/Program.cs
index 4e1067e..1f84a99 100644
--- a/src/DemaConsulting.ReqStream/Program.cs
+++ b/src/DemaConsulting.ReqStream/Program.cs
@@ -109,7 +109,14 @@ public static void Run(Context context)
return;
}
- // Priority 4: Requirements processing
+ // Priority 4: Lint requirements files
+ if (context.Lint)
+ {
+ Linter.Lint(context, context.RequirementsFiles);
+ return;
+ }
+
+ // Priority 5: Requirements processing
ProcessRequirements(context);
}
@@ -138,6 +145,7 @@ private static void PrintHelp(Context context)
context.WriteLine(" --silent Suppress console output");
context.WriteLine(" --validate Run self-validation");
context.WriteLine(" --results Write validation results to file (TRX or JUnit format)");
+ context.WriteLine(" --lint Lint requirements files for issues");
context.WriteLine(" --log Write output to log file");
context.WriteLine(" --requirements Requirements files glob pattern");
context.WriteLine(" --report Export requirements to markdown file");
diff --git a/src/DemaConsulting.ReqStream/Validation.cs b/src/DemaConsulting.ReqStream/Validation.cs
index b2c6b7b..96346db 100644
--- a/src/DemaConsulting.ReqStream/Validation.cs
+++ b/src/DemaConsulting.ReqStream/Validation.cs
@@ -51,6 +51,7 @@ public static void Run(Context context)
RunReportExportTest(context, testResults);
RunTagsFilteringTest(context, testResults);
RunEnforcementModeTest(context, testResults);
+ RunLintTest(context, testResults);
// Calculate totals
var totalTests = testResults.Results.Count;
@@ -497,6 +498,98 @@ private static void RunEnforcementModeTest(Context context, DemaConsulting.TestR
FinalizeTestResult(test, startTime, testResults);
}
+ /// <summary>
+ /// Runs a test for lint functionality.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="testResults">The test results collection.</param>
+ private static void RunLintTest(Context context, DemaConsulting.TestResults.TestResults testResults)
+ {
+ var startTime = DateTime.UtcNow;
+ var test = CreateTestResult("ReqStream_Lint");
+
+ try
+ {
+ using var tempDir = new TemporaryDirectory();
+
+ // Create a valid requirements file
+ var reqFile = Path.Combine(tempDir.DirectoryPath, "lint-requirements.yaml");
+ var reqYaml = @"sections:
+  - title: Lint Test
+    requirements:
+      - id: LNT-001
+        title: Lint requirement
+";
+ File.WriteAllText(reqFile, reqYaml);
+
+ // Create a requirements file with a known issue (duplicate ID)
+ var badReqFile = Path.Combine(tempDir.DirectoryPath, "bad-requirements.yaml");
+ var badReqYaml = @"sections:
+  - title: Bad Lint Test
+    requirements:
+      - id: LNT-001
+        title: Duplicate requirement ID
+";
+ File.WriteAllText(badReqFile, badReqYaml);
+
+ using (new DirectorySwitch(tempDir.DirectoryPath))
+ {
+ // Test 1: Lint a valid file - should succeed with no issues
+ int exitCode;
+ string logContent;
+ var logFile = Path.Combine(tempDir.DirectoryPath, "lint-test.log");
+
+ using (var testContext = Context.Create(["--silent", "--log", "lint-test.log", "--lint",
+ "--requirements", "lint-requirements.yaml"]))
+ {
+ Program.Run(testContext);
+ exitCode = testContext.ExitCode;
+ }
+
+ logContent = File.ReadAllText(logFile);
+
+ if (exitCode != 0 || !logContent.Contains("No issues found"))
+ {
+ test.Outcome = DemaConsulting.TestResults.TestOutcome.Failed;
+ test.ErrorMessage = "Lint of valid file should succeed with 'No issues found'";
+ context.WriteError($"✗ ReqStream_Lint - Failed: {test.ErrorMessage}");
+ FinalizeTestResult(test, startTime, testResults);
+ return;
+ }
+
+ // Test 2: Lint a file with a duplicate ID - should fail
+ var logFile2 = Path.Combine(tempDir.DirectoryPath, "lint-test2.log");
+ using (var testContext = Context.Create(["--silent", "--log", "lint-test2.log", "--lint",
+ "--requirements", "lint-requirements.yaml",
+ "--requirements", "bad-requirements.yaml"]))
+ {
+ Program.Run(testContext);
+ exitCode = testContext.ExitCode;
+ }
+
+ var logContent2 = File.ReadAllText(logFile2);
+
+ if (exitCode == 0 || !logContent2.Contains("Duplicate requirement ID 'LNT-001'"))
+ {
+ test.Outcome = DemaConsulting.TestResults.TestOutcome.Failed;
+ test.ErrorMessage = "Lint of file with duplicate ID should fail";
+ context.WriteError($"✗ ReqStream_Lint - Failed: {test.ErrorMessage}");
+ }
+ else
+ {
+ test.Outcome = DemaConsulting.TestResults.TestOutcome.Passed;
+ context.WriteLine("✓ ReqStream_Lint - Passed");
+ }
+ }
+ }
+ catch (Exception ex)
+ {
+ HandleTestException(test, context, "ReqStream_Lint", ex);
+ }
+
+ FinalizeTestResult(test, startTime, testResults);
+ }
+
/// <summary>
/// Writes test results to a file in TRX or JUnit format.
/// </summary>
diff --git a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
new file mode 100644
index 0000000..db7a3bd
--- /dev/null
+++ b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
@@ -0,0 +1,568 @@
+// Copyright (c) 2026 DEMA Consulting
+//
+// Permission is hereby granted, free of charge, to any person obtaining a copy
+// of this software and associated documentation files (the "Software"), to deal
+// in the Software without restriction, including without limitation the rights
+// to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+// copies of the Software, and to permit persons to whom the Software is
+// furnished to do so, subject to the following conditions:
+//
+// The above copyright notice and this permission notice shall be included in all
+// copies or substantial portions of the Software.
+//
+// THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+// IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+// FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+// AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+// LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+// OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+// SOFTWARE.
+
+namespace DemaConsulting.ReqStream.Tests;
+
+/// <summary>
+/// Unit tests for the Linter class.
+/// </summary>
+[TestClass]
+public class LinterTests
+{
+ private string _testDirectory = string.Empty;
+
+ /// <summary>
+ /// Initialize test by creating a temporary test directory.
+ /// </summary>
+ [TestInitialize]
+ public void TestInitialize()
+ {
+ _testDirectory = Path.Combine(Path.GetTempPath(), $"reqstream_lint_test_{Guid.NewGuid()}");
+ Directory.CreateDirectory(_testDirectory);
+ }
+
+ /// <summary>
+ /// Clean up test by deleting the temporary test directory.
+ /// </summary>
+ [TestCleanup]
+ public void TestCleanup()
+ {
+ if (Directory.Exists(_testDirectory))
+ {
+ Directory.Delete(_testDirectory, recursive: true);
+ }
+ }
+
+ /// <summary>
+ /// Helper to run linter and capture error output.
+ /// </summary>
+ private static (int exitCode, string errors) RunLint(params string[] files)
+ {
+ var originalError = Console.Error;
+ using var errorOutput = new StringWriter();
+ Console.SetError(errorOutput);
+
+ try
+ {
+ using var context = Context.Create([]);
+ Linter.Lint(context, files.ToList());
+ return (context.ExitCode, errorOutput.ToString());
+ }
+ finally
+ {
+ Console.SetError(originalError);
+ }
+ }
+
+ /// <summary>
+ /// Helper to run linter and capture all output.
+ /// </summary>
+ private static (int exitCode, string output, string errors) RunLintWithOutput(params string[] files)
+ {
+ var originalOut = Console.Out;
+ var originalError = Console.Error;
+ using var stdOutput = new StringWriter();
+ using var errorOutput = new StringWriter();
+ Console.SetOut(stdOutput);
+ Console.SetError(errorOutput);
+
+ try
+ {
+ using var context = Context.Create([]);
+ Linter.Lint(context, files.ToList());
+ return (context.ExitCode, stdOutput.ToString(), errorOutput.ToString());
+ }
+ finally
+ {
+ Console.SetOut(originalOut);
+ Console.SetError(originalError);
+ }
+ }
+
+ /// <summary>
+ /// Test that linting with no files prints an appropriate message.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithNoFiles_PrintsMessage()
+ {
+ var originalOut = Console.Out;
+ using var output = new StringWriter();
+ Console.SetOut(output);
+
+ try
+ {
+ using var context = Context.Create([]);
+ Linter.Lint(context, []);
+
+ Assert.AreEqual(0, context.ExitCode);
+ Assert.Contains("No requirements files specified", output.ToString());
+ }
+ finally
+ {
+ Console.SetOut(originalOut);
+ }
+ }
+
+ /// <summary>
+ /// Test that a valid requirements file produces no issues.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithValidFile_ReportsNoIssues()
+ {
+ var reqFile = Path.Combine(_testDirectory, "valid.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+ tests:
+ - SomeTest
+ tags:
+ - tag1
+");
+
+ var (exitCode, output, errors) = RunLintWithOutput(reqFile);
+
+ Assert.AreEqual(0, exitCode);
+ Assert.Contains("No issues found", output);
+ Assert.AreEqual(string.Empty, errors);
+ }
+
+ /// <summary>
+ /// Test that a file that doesn't exist reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithMissingFile_ReportsError()
+ {
+ var (exitCode, errors) = RunLint("/nonexistent/path/missing.yaml");
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("error", errors);
+ Assert.Contains("File not found", errors);
+ }
+
+ /// <summary>
+ /// Test that malformed YAML reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithMalformedYaml_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "malformed.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Bad
+ requirements: [
+ invalid yaml here
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("error", errors);
+ Assert.Contains("Malformed YAML", errors);
+ }
+
+ /// <summary>
+ /// Test that an empty YAML file produces no issues.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithEmptyFile_ReportsNoIssues()
+ {
+ var reqFile = Path.Combine(_testDirectory, "empty.yaml");
+ File.WriteAllText(reqFile, string.Empty);
+
+ var (exitCode, output, errors) = RunLintWithOutput(reqFile);
+
+ Assert.AreEqual(0, exitCode);
+ Assert.Contains("No issues found", output);
+ Assert.AreEqual(string.Empty, errors);
+ }
+
+ /// <summary>
+ /// Test that an unknown field at the document root reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithUnknownDocumentField_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "unknown-field.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test
+unknown_field: value
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_field'", errors);
+ }
+
+ /// <summary>
+ /// Test that a section missing the title field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithSectionMissingTitle_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "missing-title.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - requirements:
+ - id: REQ-001
+ title: A requirement
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Section missing required field 'title'", errors);
+ }
+
+ /// <summary>
+ /// Test that a section with a blank title reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankSectionTitle_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-title.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: ''
+ requirements:
+ - id: REQ-001
+ title: A requirement
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Section 'title' cannot be blank", errors);
+ }
+
+ /// <summary>
+ /// Test that a section with an unknown field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithUnknownSectionField_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "unknown-section-field.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test
+ unknown_field: value
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_field' in section", errors);
+ }
+
+ /// <summary>
+ /// Test that a requirement missing the id field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithRequirementMissingId_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "missing-id.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - title: Requirement without ID
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Requirement missing required field 'id'", errors);
+ }
+
+ /// <summary>
+ /// Test that a requirement missing the title field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithRequirementMissingTitle_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "missing-req-title.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("missing required field 'title'", errors);
+ }
+
+ /// <summary>
+ /// Test that a requirement with an unknown field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithUnknownRequirementField_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "unknown-req-field.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+ unknown_field: value
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_field' in requirement", errors);
+ }
+
+ /// <summary>
+ /// Test that duplicate requirement IDs report an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithDuplicateIds_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "duplicates.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: First requirement
+ - id: REQ-001
+ title: Duplicate requirement
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Duplicate requirement ID 'REQ-001'", errors);
+ }
+
+ /// <summary>
+ /// Test that duplicate IDs across multiple files report an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithDuplicateIdsAcrossFiles_ReportsError()
+ {
+ var reqFile1 = Path.Combine(_testDirectory, "file1.yaml");
+ File.WriteAllText(reqFile1, @"sections:
+ - title: Section 1
+ requirements:
+ - id: REQ-001
+ title: First requirement
+");
+
+ var reqFile2 = Path.Combine(_testDirectory, "file2.yaml");
+ File.WriteAllText(reqFile2, @"sections:
+ - title: Section 2
+ requirements:
+ - id: REQ-001
+ title: Duplicate across files
+");
+
+ var (exitCode, errors) = RunLint(reqFile1, reqFile2);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Duplicate requirement ID 'REQ-001'", errors);
+ }
+
+ /// <summary>
+ /// Test that multiple issues are all reported.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithMultipleIssues_ReportsAllIssues()
+ {
+ var reqFile = Path.Combine(_testDirectory, "multiple-issues.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ unknown_section_field: bad
+ requirements:
+ - id: REQ-001
+ title: Good requirement
+ - title: Missing id requirement
+ - id: REQ-001
+ title: Duplicate id
+unknown_root_field: bad
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_section_field' in section", errors);
+ Assert.Contains("Requirement missing required field 'id'", errors);
+ Assert.Contains("Duplicate requirement ID 'REQ-001'", errors);
+ Assert.Contains("Unknown field 'unknown_root_field'", errors);
+ }
+
+ /// <summary>
+ /// Test that linting follows includes.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithIncludes_LintesIncludedFiles()
+ {
+ var includedFile = Path.Combine(_testDirectory, "included.yaml");
+ File.WriteAllText(includedFile, @"sections:
+ - title: Included Section
+ requirements:
+ - id: INC-001
+ title: Included requirement
+ unknown_field: bad
+");
+
+ var rootFile = Path.Combine(_testDirectory, "root.yaml");
+ File.WriteAllText(rootFile, @"includes:
+ - included.yaml
+sections:
+ - title: Root Section
+ requirements:
+ - id: ROOT-001
+ title: Root requirement
+");
+
+ var (exitCode, errors) = RunLint(rootFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_field' in requirement", errors);
+ }
+
+ /// <summary>
+ /// Test that --lint via Program.Run works correctly.
+ /// </summary>
+ [TestMethod]
+ public void Linter_ProgramRun_WithLintFlag_RunsLinter()
+ {
+ var reqFile = Path.Combine(_testDirectory, "valid.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+");
+
+ var originalOut = Console.Out;
+ using var output = new StringWriter();
+ Console.SetOut(output);
+
+ var originalDir = Directory.GetCurrentDirectory();
+ try
+ {
+ Directory.SetCurrentDirectory(_testDirectory);
+
+ using var context = Context.Create(["--lint", "--requirements", "*.yaml"]);
+ Program.Run(context);
+
+ Assert.AreEqual(0, context.ExitCode);
+ Assert.Contains("No issues found", output.ToString());
+ }
+ finally
+ {
+ Directory.SetCurrentDirectory(originalDir);
+ Console.SetOut(originalOut);
+ }
+ }
+
+ /// <summary>
+ /// Test that a mapping with an unknown field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithUnknownMappingField_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "unknown-mapping-field.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+mappings:
+ - id: REQ-001
+ tests:
+ - SomeTest
+ unknown_field: bad
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_field' in mapping", errors);
+ }
+
+ /// <summary>
+ /// Test that a mapping missing the id field reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithMappingMissingId_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "mapping-missing-id.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+mappings:
+ - tests:
+ - SomeTest
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Mapping missing required field 'id'", errors);
+ }
+
+ /// <summary>
+ /// Test that a nested section with issues is linted.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithNestedSectionIssues_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "nested.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Parent Section
+ sections:
+ - title: Child Section
+ requirements:
+ - id: REQ-001
+ title: Valid requirement
+ - id: REQ-002
+ unknown_req_field: bad
+ title: Bad requirement
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Unknown field 'unknown_req_field' in requirement", errors);
+ }
+
+ /// <summary>
+ /// Test that the error format includes the file path and line/column info.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_ErrorFormat_IncludesFileAndLocation()
+ {
+ var reqFile = Path.Combine(_testDirectory, "format-test.yaml");
+ File.WriteAllText(reqFile, @"unknown_field: value
+");
+
+ var (_, errors) = RunLint(reqFile);
+
+ // Error should include the file path
+ Assert.Contains(reqFile, errors);
+ // Error should include line and column in (line,col) format
+ Assert.Contains("(1,", errors);
+ // Error should include 'error:' severity
+ Assert.Contains("error:", errors);
+ }
+}
diff --git a/test/DemaConsulting.ReqStream.Tests/ProgramTests.cs b/test/DemaConsulting.ReqStream.Tests/ProgramTests.cs
index 5e34bf8..c7126c0 100644
--- a/test/DemaConsulting.ReqStream.Tests/ProgramTests.cs
+++ b/test/DemaConsulting.ReqStream.Tests/ProgramTests.cs
@@ -131,8 +131,9 @@ public void Program_Run_WithValidateFlag_RunsValidation()
Assert.Contains("ReqStream_ReportExport - Passed", logContent);
Assert.Contains("ReqStream_TagsFiltering - Passed", logContent);
Assert.Contains("ReqStream_EnforcementMode - Passed", logContent);
- Assert.Contains("Total Tests: 5", logContent);
- Assert.Contains("Passed: 5", logContent);
+ Assert.Contains("ReqStream_Lint - Passed", logContent);
+ Assert.Contains("Total Tests: 6", logContent);
+ Assert.Contains("Passed: 6", logContent);
Assert.Contains("Failed: 0", logContent);
}
From 090efcb53164d13ec296c1dd551a411537bf4945 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 00:38:16 +0000
Subject: [PATCH 3/9] Fix spelling: LintesIncludedFiles -> LintsIncludedFiles;
fix behaviour -> behavior
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/0fc31532-46f1-4661-a8c4-59d2c67d10b6
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
docs/reqstream/unit-linter.yaml | 4 ++--
test/DemaConsulting.ReqStream.Tests/LinterTests.cs | 2 +-
2 files changed, 3 insertions(+), 3 deletions(-)
diff --git a/docs/reqstream/unit-linter.yaml b/docs/reqstream/unit-linter.yaml
index b5e8967..7345782 100644
--- a/docs/reqstream/unit-linter.yaml
+++ b/docs/reqstream/unit-linter.yaml
@@ -18,7 +18,7 @@ sections:
title: The linter shall report an error when the requirements document contains an unknown field.
justification: |
Unknown fields may indicate typos or outdated configuration that would otherwise be silently
- ignored, leading to unexpected behaviour.
+ ignored, leading to unexpected behavior.
tags:
- lint
tests:
@@ -96,7 +96,7 @@ sections:
tags:
- lint
tests:
- - Linter_Lint_WithIncludes_LintesIncludedFiles
+ - Linter_Lint_WithIncludes_LintsIncludedFiles
- id: ReqStream-Lint-NoIssuesMessage
title: The linter shall print a no-issues message when no problems are found.
diff --git a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
index db7a3bd..b97edda 100644
--- a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
+++ b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
@@ -410,7 +410,7 @@ public void Linter_Lint_WithMultipleIssues_ReportsAllIssues()
/// Test that linting follows includes.
/// </summary>
[TestMethod]
- public void Linter_Lint_WithIncludes_LintesIncludedFiles()
+ public void Linter_Lint_WithIncludes_LintsIncludedFiles()
{
var includedFile = Path.Combine(_testDirectory, "included.yaml");
File.WriteAllText(includedFile, @"sections:
From a727d9538dc38466ca3d2cc75693cbdba3bd6773 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 00:54:35 +0000
Subject: [PATCH 4/9] Add Linter review-set and update documentation
(ARCHITECTURE.md and user guide)
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/b9784df3-8c7e-4868-a56f-350cc15d4a2c
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
.reviewmark.yaml | 7 +++++
ARCHITECTURE.md | 31 ++++++++++++++++++++--
docs/guide/guide.md | 63 +++++++++++++++++++++++++++++++++++++++++++--
3 files changed, 97 insertions(+), 4 deletions(-)
diff --git a/.reviewmark.yaml b/.reviewmark.yaml
index dfa43f2..46a2ee1 100644
--- a/.reviewmark.yaml
+++ b/.reviewmark.yaml
@@ -64,6 +64,13 @@ reviews:
- "test/**/TraceMatrixReadTests.cs"
- "test/**/TraceMatrixExportTests.cs"
+ - id: ReqStream-Linter
+ title: Review of ReqStream Linter Unit
+ paths:
+ - "docs/reqstream/unit-linter.yaml"
+ - "src/**/Linter.cs"
+ - "test/**/LinterTests.cs"
+
# Platform and OTS dependency reviews
- id: Platform-Support
title: Review of Platform and Runtime Support Requirements
diff --git a/ARCHITECTURE.md b/ARCHITECTURE.md
index 1300cab..bd94217 100644
--- a/ARCHITECTURE.md
+++ b/ARCHITECTURE.md
@@ -26,6 +26,7 @@ concerns with distinct classes for each major responsibility.
| `Context` | `Context.cs` | Parses CLI arguments; owns all options and output |
| `Requirements` | `Requirements.cs` | Reads, merges, and validates YAML requirement files |
| `TraceMatrix` | `TraceMatrix.cs` | Maps test results to requirements; calculates coverage |
+| `Linter` | `Linter.cs` | Lints requirement YAML files and reports all structural issues |
Two supporting value types live alongside `TraceMatrix`:
@@ -42,16 +43,20 @@ flowchart TD
ctx[Context<br>options & output]
req[Requirements<br>parsed tree]
tm[TraceMatrix<br>coverage analysis]
+ lint[Linter<br>YAML structural checks]
reports[Markdown Reports<br>requirements · justifications · trace matrix]
exit[Exit Code<br>0 = pass · 1 = fail]
yaml --> req
+ yaml --> lint
tests --> tm
args --> ctx
ctx --> req
+ ctx --> lint
req --> tm
tm --> reports
tm --> exit
+ lint --> exit
```
### Execution Flow at a Glance
@@ -60,7 +65,8 @@ flowchart TD
2. Banner → printed for all remaining steps (`--help`, `--validate`, normal run)
3. `--help` → print usage and exit
4. `--validate` → run self-validation tests and exit
-5. Normal run → read requirements → generate reports → enforce coverage
+5. `--lint` → lint requirements files and exit
+6. Normal run → read requirements → generate reports → enforce coverage
Each step is described in detail in the [Program Execution Flow](#program-execution-flow) section.
@@ -151,6 +157,24 @@ Handles CLI argument parsing and owns all program-wide options and output.
- Manage console and log file output through `WriteLine` / `WriteError`
- Track error state and surface the appropriate process exit code
+### Linter
+
+**Location**: `Linter.cs`
+
+Lints requirement YAML files and reports all structural issues found.
+
+**Key Responsibilities**:
+
+- Parse YAML files using YamlDotNet's representation model to retain position information
+- Report unknown fields at document root, section, requirement, and mapping level
+- Report missing required fields (`title` for sections; `id` and `title` for requirements; `id` for mappings)
+- Report blank field values for `title`, `id`, and list entries (tests, tags)
+- Detect and report duplicate requirement IDs across all files (including includes)
+- Follow `includes:` directives recursively, deduplicating visited files
+- Report **all** issues found rather than stopping at the first error
+- Format errors as `{file}({line},{col}): error: {description}`
+- Print a no-issues message when the files pass all checks
+
## Requirements Processing Flow
### 1. YAML File Parsing
@@ -311,7 +335,10 @@ enforcement results — this allows users to review the trace matrix even on a f
4. Self-Validation (--validate)
└─> Run self-validation tests and exit
-5. Requirements Processing
+5. Lint (--lint)
+ └─> Lint requirements files and exit
+
+6. Requirements Processing
├─> Read and merge requirements files
├─> Export requirements report (if --report specified)
├─> Export justifications report (if --justifications specified)
diff --git a/docs/guide/guide.md b/docs/guide/guide.md
index 91519fc..b178778 100644
--- a/docs/guide/guide.md
+++ b/docs/guide/guide.md
@@ -36,6 +36,7 @@ to be treated as code, stored in source control, and integrated into CI/CD pipel
- **Test Mapping** - Link requirements to test cases for traceability and verification
- **Justifications** - Document the rationale behind each requirement for better understanding
- **File Includes** - Modularize requirements across multiple YAML files for better maintainability
+- **Linting** - Inspect requirements files for structural issues before processing
- **Validation** - Built-in validation ensures requirement structure and references are correct
- **Tag Filtering** - Categorize and filter requirements using tags for focused reporting and enforcement
- **Export Capabilities** - Generate markdown reports for requirements, justifications, and test trace matrices
@@ -526,6 +527,7 @@ ReqStream supports the following command-line options:
| `--silent` | Suppress console output (useful in CI/CD) |
| `--validate` | Run self-validation and display test results |
| `--results <file>` | Write validation test results to a file (TRX or JUnit format, use .trx or .xml extension) |
+| `--lint` | Lint requirements files for structural issues |
| `--log <file>` | Write output to specified log file |
| `--requirements <pattern>` | Glob pattern for requirements YAML files |
| `--report <file>` | Export requirements to markdown file |
@@ -583,9 +585,10 @@ Example validation report:
✓ ReqStream_ReportExport - Passed
✓ ReqStream_TagsFiltering - Passed
✓ ReqStream_EnforcementMode - Passed
+✓ ReqStream_Lint - Passed
-Total Tests: 5
-Passed: 5
+Total Tests: 6
+Passed: 6
Failed: 0
```
@@ -598,6 +601,62 @@ Each test proves specific functionality works correctly:
- **`ReqStream_ReportExport`** - requirements report is correctly exported to a markdown file.
- **`ReqStream_TagsFiltering`** - requirements are correctly filtered by tags.
- **`ReqStream_EnforcementMode`** - enforcement mode correctly validates requirement test coverage.
+- **`ReqStream_Lint`** - lint mode correctly identifies and reports issues in requirements files.
+
+### Linting Requirements Files
+
+Use the `--lint` flag to inspect requirements files for structural problems before processing them.
+Unlike normal processing, linting reports **all** issues found rather than stopping at the first error.
+
+**Lint a single requirements file (including all its includes):**
+
+```bash
+reqstream --requirements requirements.yaml --lint
+```
+
+**Lint multiple requirements files:**
+
+```bash
+reqstream --requirements "docs/**/*.yaml" --lint
+```
+
+**Example output when issues are found:**
+
+```text
+docs/reqs/unit.yaml(42,5): error: Unknown field 'tittle' in requirement
+docs/reqs/unit.yaml(57,13): error: Duplicate requirement ID 'REQ-001' (first seen in docs/reqs/base.yaml)
+docs/reqs/other.yaml(10,1): error: Section missing required field 'title'
+```
+
+**Example output when no issues are found:**
+
+```text
+requirements.yaml: No issues found
+```
+
+The exit code is `0` when no issues are found, and `1` when any issues are reported — making `--lint`
+suitable for use in CI/CD quality gates.
+
+**Issues detected by the linter:**
+
+| Issue | Description |
+| ----- | ----------- |
+| Malformed YAML | File cannot be parsed as valid YAML |
+| Unknown document field | Top-level key other than `sections`, `mappings`, or `includes` |
+| Unknown section field | Section key other than `title`, `requirements`, or `sections` |
+| Unknown requirement field | Requirement key other than `id`, `title`, `justification`, `tests`, `children`, or `tags` |
+| Unknown mapping field | Mapping key other than `id` or `tests` |
+| Missing section title | Section does not have a `title` field |
+| Blank section title | Section `title` is empty or whitespace |
+| Missing requirement id | Requirement does not have an `id` field |
+| Blank requirement id | Requirement `id` is empty or whitespace |
+| Missing requirement title | Requirement does not have a `title` field |
+| Blank requirement title | Requirement `title` is empty or whitespace |
+| Missing mapping id | Mapping does not have an `id` field |
+| Blank mapping id | Mapping `id` is empty or whitespace |
+| Blank test name | A test name in a `tests` list is empty or whitespace |
+| Blank tag name | A tag name in a `tags` list is empty or whitespace |
+| Duplicate requirement ID | Two requirements share the same `id` (within or across files) |
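
For example, a fragment like the following (hypothetical, for illustration only) would trigger several of the checks above:

```yaml
sections:
  - title: ''                 # blank section title
    requirements:
      - id: REQ-001
        tittle: Typo here     # unknown field 'tittle'; required field 'title' is missing
      - id: REQ-001           # duplicate requirement ID
        title: Second requirement
```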
### Requirements Processing
From 8d2c56cd163687a06a2fa93ab639f439cd1f8676 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 01:22:42 +0000
Subject: [PATCH 5/9] Add docs/design/ pandoc folder with detailed design
documents for all software units
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/5ed4bb83-dde9-4099-af6f-47856f47a4db
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
.reviewmark.yaml | 7 ++
docs/design/context.md | 127 +++++++++++++++++++++++++++
docs/design/definition.yaml | 20 +++++
docs/design/introduction.md | 49 +++++++++++
docs/design/linter.md | 150 ++++++++++++++++++++++++++++++++
docs/design/program.md | 126 +++++++++++++++++++++++++++
docs/design/requirements.md | 169 ++++++++++++++++++++++++++++++++++++
docs/design/title.txt | 15 ++++
docs/design/tracematrix.md | 157 +++++++++++++++++++++++++++++++++
docs/design/validation.md | 101 +++++++++++++++++++++
10 files changed, 921 insertions(+)
create mode 100644 docs/design/context.md
create mode 100644 docs/design/definition.yaml
create mode 100644 docs/design/introduction.md
create mode 100644 docs/design/linter.md
create mode 100644 docs/design/program.md
create mode 100644 docs/design/requirements.md
create mode 100644 docs/design/title.txt
create mode 100644 docs/design/tracematrix.md
create mode 100644 docs/design/validation.md
diff --git a/.reviewmark.yaml b/.reviewmark.yaml
index 46a2ee1..c026427 100644
--- a/.reviewmark.yaml
+++ b/.reviewmark.yaml
@@ -8,6 +8,7 @@
needs-review:
- "**/*.cs" # All C# source and test files
- "docs/reqstream/**/*.yaml" # Requirements files
+ - "docs/design/**/*.md" # Design documents
- "!**/obj/**" # Exclude build output
- "!**/bin/**" # Exclude build output
@@ -28,6 +29,7 @@ reviews:
title: Review of ReqStream Context Unit
paths:
- "docs/reqstream/unit-context.yaml"
+ - "docs/design/context.md"
- "src/**/Context.cs"
- "test/**/ContextTests.cs"
@@ -35,6 +37,7 @@ reviews:
title: Review of ReqStream Program Unit
paths:
- "docs/reqstream/unit-program.yaml"
+ - "docs/design/program.md"
- "src/**/Program.cs"
- "test/**/ProgramTests.cs"
@@ -42,6 +45,7 @@ reviews:
title: Review of ReqStream Validation Unit
paths:
- "docs/reqstream/unit-validation.yaml"
+ - "docs/design/validation.md"
- "src/**/Validation.cs"
- "test/**/ValidationTests.cs"
@@ -49,6 +53,7 @@ reviews:
title: Review of ReqStream Requirements Unit
paths:
- "docs/reqstream/unit-requirements.yaml"
+ - "docs/design/requirements.md"
- "src/**/Requirement.cs"
- "src/**/Requirements.cs"
- "src/**/Section.cs"
@@ -59,6 +64,7 @@ reviews:
title: Review of ReqStream TraceMatrix Unit
paths:
- "docs/reqstream/unit-trace-matrix.yaml"
+ - "docs/design/tracematrix.md"
- "src/**/TraceMatrix.cs"
- "test/**/TraceMatrixTests.cs"
- "test/**/TraceMatrixReadTests.cs"
@@ -68,6 +74,7 @@ reviews:
title: Review of ReqStream Linter Unit
paths:
- "docs/reqstream/unit-linter.yaml"
+ - "docs/design/linter.md"
- "src/**/Linter.cs"
- "test/**/LinterTests.cs"
diff --git a/docs/design/context.md b/docs/design/context.md
new file mode 100644
index 0000000..89e3cb8
--- /dev/null
+++ b/docs/design/context.md
@@ -0,0 +1,127 @@
+# Context Unit Design
+
+## Overview
+
+`Context` is the command-line argument parser and I/O owner for ReqStream. It is the single
+authoritative source for all runtime options and is the only unit permitted to write to the console
+or the log file. `Context` never touches YAML content, test result data, or domain objects; its
+sole concerns are parsing arguments and surfacing results to the caller.
+
+`Context` implements `IDisposable` so that the log-file `StreamWriter` is closed deterministically
+when the enclosing `using` block in `Program.Main` exits.
+
+## Private State
+
+| Field | Type | Purpose |
+| ----- | ---- | ------- |
+| `_logWriter` | `StreamWriter?` | Open writer for the optional log file; `null` when no log file was requested |
+| `_hasErrors` | `bool` | Accumulates error state; initially `false`; set to `true` by `WriteError` |
+
+## Properties
+
+| Property | Type | CLI flag | Notes |
+| -------- | ---- | -------- | ----- |
+| `Version` | `bool` | `--version` | Print version and exit |
+| `Help` | `bool` | `--help` | Print usage and exit |
+| `Silent` | `bool` | `--silent` | Suppress console output |
+| `Validate` | `bool` | `--validate` | Run self-validation tests |
+| `Lint` | `bool` | `--lint` | Lint requirements files |
+| `ResultsFile` | `string?` | `--results` | Path for validation test-results output file |
+| `Enforce` | `bool` | `--enforce` | Fail if requirements are not fully covered |
+| `FilterTags` | `HashSet<string>?` | `--filter` | Comma-separated tag filter; `null` when not specified |
+| `RequirementsFiles` | `List<string>` | `--requirements` | Expanded list of requirement file paths |
+| `TestFiles` | `List<string>` | `--tests` | Expanded list of test-result file paths |
+| `RequirementsReport` | `string?` | `--report` | Destination path for requirements report |
+| `ReportDepth` | `int` | `--report-depth` | Heading depth for requirements report |
+| `Matrix` | `string?` | `--matrix` | Destination path for trace matrix report |
+| `MatrixDepth` | `int` | `--matrix-depth` | Heading depth for trace matrix report |
+| `JustificationsFile` | `string?` | `--justifications` | Destination path for justifications report |
+| `JustificationsDepth` | `int` | `--justifications-depth` | Heading depth for justifications report |
+| `ExitCode` | `int` | — | Computed: `_hasErrors ? 1 : 0` |
+
+## Methods
+
+### `Create(args)`
+
+`Create` is the static factory method that constructs and returns a fully initialized `Context`.
+It implements a sequential switch-based parser over the `args` array.
+
+**Parse loop**:
+
+1. Iterate `args` with an index variable `i`.
+2. Match `args[i]` against known flags using a `switch` statement.
+3. For flags that consume the next element (e.g., `--requirements`), check `i + 1 >= args.Length`
+ before advancing; if the check fails an `ArgumentException` is thrown.
+4. An unrecognized argument causes an `ArgumentException` listing the unknown argument.
+
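The loop above can be sketched as follows. This is a minimal illustration of the described behavior, not the actual implementation; the flag names match the options table, but the return shape is invented for the example:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical, simplified sketch of the switch-based parse loop described above.
static class ParseSketch
{
    public static (bool Lint, List<string> Requirements) Parse(string[] args)
    {
        var lint = false;
        var requirements = new List<string>();
        for (var i = 0; i < args.Length; i++)
        {
            switch (args[i])
            {
                case "--lint":
                    lint = true;
                    break;
                case "--requirements":
                    // Flags that consume a value must bounds-check before advancing.
                    if (i + 1 >= args.Length)
                        throw new ArgumentException("Missing value for --requirements");
                    requirements.Add(args[++i]); // the real code expands globs here
                    break;
                default:
                    throw new ArgumentException($"Unknown argument '{args[i]}'");
            }
        }
        return (lint, requirements);
    }
}
```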
+**`--filter` handling**:
+
+The value following `--filter` is split on `','`. Each non-empty token is added to `FilterTags`.
+If `FilterTags` is `null` at the point the first `--filter` is encountered, the `HashSet` is
+created before adding tokens. Multiple `--filter` arguments are accumulated into the same set.
+
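The lazy-creation and accumulation behavior can be sketched as below (an illustrative helper, not the actual implementation):

```csharp
using System.Collections.Generic;

// Hypothetical sketch of the --filter accumulation described above.
static class FilterSketch
{
    public static HashSet<string>? Accumulate(HashSet<string>? filterTags, string value)
    {
        foreach (var token in value.Split(','))
        {
            if (token.Length == 0)
                continue;                         // only non-empty tokens are added
            filterTags ??= new HashSet<string>(); // created lazily on first --filter
            filterTags.Add(token);
        }
        return filterTags;
    }
}
```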
+**`--requirements` and `--tests` handling**:
+
+Each value is passed to `ExpandGlobPattern`; the resulting paths are appended to
+`RequirementsFiles` or `TestFiles` respectively.
+
+**Log file**:
+
+If `--log` was specified, `Create` opens the named file for writing and assigns the resulting
+`StreamWriter` to `_logWriter` before returning.
+
+### `ExpandGlobPattern(pattern)`
+
+`ExpandGlobPattern` resolves a single pattern (which may contain `*` or `**` wildcards) to a list
+of absolute file paths.
+
+**Implementation**:
+
+1. Construct a `Microsoft.Extensions.FileSystemGlobbing.Matcher`.
+2. Add `pattern` as an include pattern.
+3. Execute the matcher against `Directory.GetCurrentDirectory()`.
+4. Return the matched absolute paths.
+
+**Known limitation**: the `Matcher` library silently ignores patterns that are themselves absolute
+paths. Callers that pass absolute paths directly will receive an empty result set. This is an
+accepted limitation of the underlying library; users should use relative paths or glob wildcards.
+
+### `WriteLine(message)`
+
+`WriteLine` writes a message to the output channel.
+
+1. If `Silent` is `false`, write to `Console.WriteLine`.
+2. If `_logWriter` is not `null`, write to `_logWriter`.
+
+### `WriteError(message)`
+
+`WriteError` records an error and writes it to the error channel.
+
+1. Set `_hasErrors = true`.
+2. If `Silent` is `false`, set `Console.ForegroundColor` to red, write to `Console.Error`, then
+ restore the original foreground color.
+3. If `_logWriter` is not `null`, write to `_logWriter`.
+
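The error-channel behavior can be sketched as follows. This is a simplified stand-in class built only from the steps above; member names mirror the tables in this document but the class itself is hypothetical:

```csharp
using System;
using System.IO;

// Hypothetical, simplified sketch of the WriteError behavior described above.
class ContextSketch
{
    private readonly StreamWriter? _logWriter;
    private bool _hasErrors;

    public bool Silent { get; init; }
    public int ExitCode => _hasErrors ? 1 : 0;

    public ContextSketch(StreamWriter? logWriter = null) => _logWriter = logWriter;

    public void WriteError(string message)
    {
        _hasErrors = true; // record error state before any output
        if (!Silent)
        {
            var original = Console.ForegroundColor;
            Console.ForegroundColor = ConsoleColor.Red;
            Console.Error.WriteLine(message);
            Console.ForegroundColor = original; // restore the caller's color
        }
        _logWriter?.WriteLine(message); // mirror to the log file when one is open
    }
}
```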
+### `Dispose()`
+
+`Dispose` flushes and closes `_logWriter` if it is not `null`, then sets it to `null`. This
+ensures the log file is not truncated and file handles are not leaked even when the process exits
+via an early return path.
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Program` | Creates `Context` via `Create`; calls `WriteLine` and `WriteError`; reads `ExitCode` |
+| `Validation` | Calls `context.WriteLine`, `context.WriteError`, reads `ResultsFile`, `Silent` |
+| `Linter` | Calls `context.WriteError` to report linting issues |
+| `Requirements` | Receives `RequirementsFiles`; does not hold a reference to `Context` |
+| `TraceMatrix` | Receives `TestFiles`; does not hold a reference to `Context` |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/definition.yaml b/docs/design/definition.yaml
new file mode 100644
index 0000000..c6a8c53
--- /dev/null
+++ b/docs/design/definition.yaml
@@ -0,0 +1,20 @@
+---
+resource-path:
+ - docs/design
+ - docs/template
+
+input-files:
+ - docs/design/title.txt
+ - docs/design/introduction.md
+ - docs/design/program.md
+ - docs/design/context.md
+ - docs/design/validation.md
+ - docs/design/requirements.md
+ - docs/design/tracematrix.md
+ - docs/design/linter.md
+
+template: template.html
+
+table-of-contents: true
+
+number-sections: true
diff --git a/docs/design/introduction.md b/docs/design/introduction.md
new file mode 100644
index 0000000..a4c7900
--- /dev/null
+++ b/docs/design/introduction.md
@@ -0,0 +1,49 @@
+# Introduction
+
+This document provides the detailed design for the ReqStream tool, a .NET command-line application
+for managing software requirements in YAML format.
+
+## Purpose
+
+The purpose of this document is to describe the internal design of each software unit that comprises
+ReqStream. It captures data models, algorithms, key methods, and inter-unit interactions at a level
+of detail sufficient for formal code review, compliance verification, and future maintenance. The
+document does not restate requirements; it explains how they are realized.
+
+## Scope
+
+This document covers the detailed design of the following software units:
+
+- **Program** — entry point and execution orchestrator (`Program.cs`)
+- **Context** — command-line argument parser and I/O owner (`Context.cs`)
+- **Validation** — self-validation test runner (`Validation.cs`)
+- **Requirements, Section, and Requirement** — YAML parsing, section merging, validation, and export
+ (`Requirements.cs`, `Section.cs`, `Requirement.cs`)
+- **TraceMatrix** — test result loader and requirement-coverage analyzer (`TraceMatrix.cs`)
+- **Linter** — structural linter for requirement YAML files (`Linter.cs`)
+
+The following topics are out of scope:
+
+- External library internals (YamlDotNet, DemaConsulting.TestResults)
+- Build pipeline configuration
+- Deployment and packaging
+
+## Document Conventions
+
+Throughout this document:
+
+- Class names, method names, property names, and file names appear in `monospace` font.
+- The word **shall** denotes a design constraint that the implementation must satisfy.
+- Section headings within each unit chapter follow a consistent structure: overview, data model,
+ methods/algorithms, and interactions with other units.
+- Text tables are used in preference to diagrams, which may not render in all PDF viewers.
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream User Guide][guide]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[guide]: ../../README.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/linter.md b/docs/design/linter.md
new file mode 100644
index 0000000..a854c8b
--- /dev/null
+++ b/docs/design/linter.md
@@ -0,0 +1,150 @@
+# Linter Unit Design
+
+## Overview
+
+`Linter` performs structural validation of requirement YAML files and reports all issues found
+without stopping at the first error. It is intended as a developer-aid tool that catches
+authoring mistakes — unknown fields, missing required fields, blank values, and duplicate
+requirement IDs — before the files are processed by the requirements loader.
+
+Unlike the requirements loader, which uses `YamlDotNet`'s deserializer, `Linter` uses
+`YamlDotNet`'s **representation model** (`YamlStream`, `YamlMappingNode`, `YamlSequenceNode`,
+`YamlScalarNode`). This preserves `Mark.Line` and `Mark.Column` position information so that
+every error message includes an accurate file location.
+
+## Known Field Sets
+
+The linter maintains four constant sets of known field names, compared using ordinal (case-sensitive)
+equality:
+
+| Set | Members |
+| --- | ------- |
+| `KnownDocumentFields` | `sections`, `mappings`, `includes` |
+| `KnownSectionFields` | `title`, `requirements`, `sections` |
+| `KnownRequirementFields` | `id`, `title`, `justification`, `tests`, `children`, `tags` |
+| `KnownMappingFields` | `id`, `tests` |
+
+Any key found in a YAML node that is not a member of the corresponding set is reported as an unknown
+field error.
+
+## Error Format
+
+All errors emitted by the linter use the following format:
+
+```text
+{path}({line},{col}): error: {description}
+```
+
+`line` and `col` are taken from `YamlNode.Start.Line` and `YamlNode.Start.Column` respectively,
+providing the exact position of the offending node in the source file.
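
The formatting can be expressed directly (a sketch; `format_error` is a hypothetical helper, not
a method of `Linter`):

```python
def format_error(path, line, col, description):
    """Render a lint issue in the documented {path}({line},{col}) format."""
    return f"{path}({line},{col}): error: {description}"

print(format_error("reqs.yaml", 12, 3, "unknown field 'titel'"))
# → reqs.yaml(12,3): error: unknown field 'titel'
```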
+
+Errors are written via `context.WriteError`, which sets `Context._hasErrors = true` and eventually
+causes the process to exit with code `1`.
+
+## Methods
+
+### `Lint(context, files)`
+
+`Lint` is the single public entry point.
+
+1. Initialize a `Dictionary<string, string>` named `seenIds` (mapping requirement ID to source
+   file path) and a `HashSet<string>` named `visitedFiles`, both shared across all files in `files`.
+2. For each path in `files`, call `LintFile(context, path, seenIds, visitedFiles)`.
+3. Count the total number of issues written (tracked via `context`'s error state or a local
+ counter).
+4. If the total issue count is zero, print `"{files[0]}: No issues found"` via `context.WriteLine`.
+
+### `LintFile(context, path, seenIds, visitedFiles)`
+
+`LintFile` lints a single YAML file.
+
+1. Resolve `path` to its full absolute path.
+2. If `path` is already in `visitedFiles`, return immediately.
+3. Add `path` to `visitedFiles`.
+4. Read the file text.
+5. Attempt to parse the text with `YamlStream.Load()`. If a `YamlException` is thrown (malformed
+ YAML), call `context.WriteError` with the exception message and return; do not attempt to lint
+ further.
+6. For each `YamlDocument` in the stream, call `LintDocument(context, path, doc, seenIds)`.
+7. After all documents are linted, locate the `includes:` sequence in the root mapping (if present)
+ and for each scalar entry call `LintFile` recursively, resolving the include path relative to
+ the directory of the current file.
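
The re-entrancy guard and include recursion can be sketched as below (Python for illustration;
`lint_document` and `read_includes` are injected stand-ins for the real per-document linting and
`includes:` extraction):

```python
import os

def lint_file(path, visited, lint_document, read_includes):
    """Sketch of LintFile's visited-file guard and include recursion."""
    path = os.path.abspath(path)
    if path in visited:          # each file is linted at most once
        return
    visited.add(path)
    lint_document(path)
    for include in read_includes(path):
        # includes resolve relative to the directory of the current file
        child = os.path.join(os.path.dirname(path), include)
        lint_file(child, visited, lint_document, read_includes)

visited = set()
linted = []
graph = {os.path.abspath("a.yaml"): ["b.yaml"], os.path.abspath("b.yaml"): ["a.yaml"]}
lint_file("a.yaml", visited, linted.append, lambda p: graph.get(p, []))
print(len(linted))  # each file is linted once even though the includes form a cycle
```

Because `visitedFiles` is shared across the whole lint run, mutually-including files terminate
rather than recursing forever.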
+
+### `LintDocument(context, path, doc, seenIds)`
+
+`LintDocument` validates the top-level structure of a single YAML document.
+
+1. Assert that the document root is a `YamlMappingNode`; if not, emit an error and return.
+2. For each key in the root mapping, check that it is a member of `KnownDocumentFields`; if not,
+ emit an unknown-field error at the key's position.
+3. Locate the `sections:` node and delegate to `LintSections`.
+4. Locate the `mappings:` node and delegate to `LintMappings`.
+
+### `LintSections(context, path, sectionsNode, seenIds)`
+
+`LintSections` iterates a `YamlSequenceNode` of section entries and calls `LintSection` for each.
+
+### `LintSection(context, path, sectionNode, seenIds)`
+
+`LintSection` validates one section mapping node.
+
+1. Assert `sectionNode` is a `YamlMappingNode`; emit an error and return if not.
+2. For each key, check against `KnownSectionFields`; emit an unknown-field error for any unknown key.
+3. Check that `title` is present and non-blank; emit an error if missing or blank.
+4. If `sections:` is present, call `LintSections` recursively.
+5. If `requirements:` is present, call `LintRequirements`.
+
+### `LintRequirements(context, path, requirementsNode, seenIds)`
+
+`LintRequirements` iterates a `YamlSequenceNode` of requirement entries and calls
+`LintRequirement` for each.
+
+### `LintRequirement(context, path, requirementNode, seenIds)`
+
+`LintRequirement` validates one requirement mapping node.
+
+1. Assert `requirementNode` is a `YamlMappingNode`; emit an error and return if not.
+2. For each key, check against `KnownRequirementFields`; emit an unknown-field error for any
+ unknown key.
+3. Check that `id` is present and non-blank; emit an error if missing or blank.
+4. If `id` is valid, check `seenIds` for a duplicate; if found, emit a duplicate-ID error
+ referencing both the current file position and the previously seen file. Add the ID to
+ `seenIds` if not already present.
+5. Check that `title` is present and non-blank; emit an error if missing or blank.
+6. If `tests:` is present, iterate each entry and emit an error for any blank scalar.
+7. If `tags:` is present, iterate each entry and emit an error for any blank scalar.
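
Steps 3 and 4 (blank-ID and duplicate-ID checking) can be sketched as follows (illustrative
Python, not the C# implementation):

```python
def check_id(req_id, current_file, seen_ids, errors):
    """Sketch of the blank/duplicate requirement-ID checks in LintRequirement."""
    if not req_id or not req_id.strip():
        errors.append("requirement 'id' is missing or blank")
        return
    if req_id in seen_ids:
        # report both the current file and the file that first defined the ID
        errors.append(f"duplicate requirement ID '{req_id}' (first seen in {seen_ids[req_id]})")
    else:
        seen_ids[req_id] = current_file  # remember where the ID was first defined

seen, errs = {}, []
check_id("REQ-1", "a.yaml", seen, errs)
check_id("REQ-1", "b.yaml", seen, errs)
print(errs[0])
```

Because `seenIds` retains the first file that defined each ID, the duplicate message can point
the user at both locations.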
+
+### `LintMappings(context, path, mappingsNode, seenIds)`
+
+`LintMappings` iterates a `YamlSequenceNode` of mapping entries and calls `LintMapping` for each.
+
+### `LintMapping(context, path, mappingNode, seenIds)`
+
+`LintMapping` validates one mapping entry.
+
+1. Assert `mappingNode` is a `YamlMappingNode`; emit an error and return if not.
+2. For each key, check against `KnownMappingFields`; emit an unknown-field error for any unknown key.
+3. Check that `id` is present and non-blank; emit an error if missing or blank.
+4. If `tests:` is present, iterate each entry and emit an error for any blank scalar.
+
+## Issue Accumulation and No-Issues Message
+
+The linter accumulates all issues across all files before deciding whether to print the no-issues
+message. This ensures that a clean run produces exactly one affirmative line of output and that a
+run with issues lists every problem without any misleading success message.
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Program` | Calls `Linter.Lint(context, context.RequirementsFiles)` when `--lint` is present |
+| `Context` | Provides `WriteError` for issue reporting and `WriteLine` for the no-issues message |
+| `Validation` | `RunLintTest` exercises `Linter.Lint` with fixture YAML files |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/program.md b/docs/design/program.md
new file mode 100644
index 0000000..0968fea
--- /dev/null
+++ b/docs/design/program.md
@@ -0,0 +1,126 @@
+# Program Unit Design
+
+## Overview
+
+`Program` is the entry point of the ReqStream executable. It owns the top-level execution flow,
+dispatches to the appropriate subsystem based on the parsed command-line options, and establishes the
+error-handling boundary for the entire process. All meaningful work is delegated to `Context`,
+`Validation`, `Linter`, `Requirements`, and `TraceMatrix`; `Program` itself contains no domain logic.
+
+## Properties
+
+### `Version`
+
+`Version` is a static read-only string property that resolves the tool's version at runtime.
+
+Resolution order:
+
+| Priority | Source | API |
+| -------- | ------ | --- |
+| 1 | `AssemblyInformationalVersionAttribute` | `Assembly.GetExecutingAssembly()` |
+| 2 | `AssemblyName.Version` | `Assembly.GetExecutingAssembly().GetName().Version` |
+| 3 | Fallback literal | `"Unknown"` |
+
+The informational version (set by the build system) is preferred because it carries pre-release
+labels and build metadata. If the attribute is absent or empty the numeric `AssemblyName.Version`
+string is used. If neither is available the string `"Unknown"` is returned so that the property
+never throws and never returns `null`.
+
+## Methods
+
+### `Main(args)`
+
+`Main` is the process entry point. Its responsibilities are:
+
+1. Create a `Context` instance via `Context.Create(args)`.
+2. Invoke `Run(context)` inside a `using` block so that `Context.Dispose()` is called on exit.
+3. Return `context.ExitCode` as the process exit code.
+
+**Error-handling contract**:
+
+| Exception type | Handling |
+| -------------- | -------- |
+| `ArgumentException` | Message written to `Console.Error`; returns exit code `1` |
+| `InvalidOperationException` | Message written to `Console.Error`; returns exit code `1` |
+| Any other exception | Message written to `Console.Error`; exception re-thrown for event-log capture |
+
+`ArgumentException` originates in `Context.Create` during argument parsing.
+`InvalidOperationException` originates in the execution layer (e.g., YAML validation failures,
+test result parse errors). Unexpected exceptions are intentionally re-thrown so that the operating
+system or process supervisor can capture the full stack trace.
+
+### `Run(context)`
+
+`Run` implements the priority-ordered dispatch shown in the table below. Conditions are evaluated
+in order; when a step's action ends with "return", execution stops there and later steps are never
+reached. Step 2 always runs but does not return.
+
+| Priority | Condition | Action |
+| -------- | --------- | ------ |
+| 1 | `context.Version` is `true` | Call `PrintBanner`; print version only; return |
+| 2 | (always) | Call `PrintBanner` |
+| 3 | `context.Help` is `true` | Call `PrintHelp`; return |
+| 4 | `context.Validate` is `true` | Call `Validation.Run(context)`; return |
+| 5 | `context.Lint` is `true` | Call `Linter.Lint(context, context.RequirementsFiles)`; return |
+| 6 | (default) | Call `ProcessRequirements(context)` |
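
The dispatch shape can be sketched as follows (a Python sketch of the control flow; the handlers
are injected stand-ins for the real C# methods):

```python
def run(ctx, print_banner, print_help, validate, lint, process):
    """Sketch of the priority-ordered dispatch in Program.Run."""
    if ctx.get("version"):
        print_banner()          # version query prints the banner and stops
        return
    print_banner()              # every other invocation identifies itself
    if ctx.get("help"):
        print_help()
        return
    if ctx.get("validate"):
        validate()
        return
    if ctx.get("lint"):
        lint()
        return
    process()                   # default path

calls = []
run({"lint": True},
    lambda: calls.append("banner"), lambda: calls.append("help"),
    lambda: calls.append("validate"), lambda: calls.append("lint"),
    lambda: calls.append("process"))
print(calls)  # → ['banner', 'lint']
```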
+
+### `PrintBanner`
+
+`PrintBanner` writes a single line to `context` containing the tool name, version string, and
+copyright notice. It is called at priority step 2 (and also at step 1 for the version query) so
+that every non-trivial invocation identifies the running version.
+
+### `PrintHelp`
+
+`PrintHelp` writes the full option listing to `context`. It documents every supported flag and
+argument, grouped logically. It is only called when `--help` is present.
+
+### `ProcessRequirements`
+
+`ProcessRequirements` orchestrates the normal (non-version, non-help, non-validate, non-lint) run.
+Its internal sequence is:
+
+1. Call `Requirements.Read(context.RequirementsFiles)` to build the parsed requirement tree.
+2. If `context.RequirementsReport` is set, export the requirements report at
+ `context.ReportDepth`.
+3. If `context.JustificationsFile` is set, export the justifications report at
+ `context.JustificationsDepth`.
+4. If `context.TestFiles` is non-empty, construct a `TraceMatrix` from the requirements tree and
+ the test result files.
+5. If `context.Matrix` is set and a trace matrix was constructed, export the matrix report at
+ `context.MatrixDepth`.
+6. If `context.Enforce` is `true`, call `EnforceRequirementsCoverage(context, requirements,
+ traceMatrix)`.
+
+All export methods respect `context.FilterTags` for tag-filtered output.
+
+### `EnforceRequirementsCoverage`
+
+`EnforceRequirementsCoverage` evaluates whether all requirements are covered by passing tests.
+
+1. If no `TraceMatrix` was built (i.e., no `--tests` argument was provided), call
+ `context.WriteError` with a message indicating that enforcement requires test results; return.
+2. Call `traceMatrix.CalculateSatisfiedRequirements(context.FilterTags)` to obtain satisfied and
+ total counts.
+3. If `satisfied < total`, iterate all requirements that are unsatisfied and call
+ `context.WriteError` for each unsatisfied requirement ID.
+
+This method never throws; all failure signalling goes through `context.WriteError`, which sets the
+internal error flag and eventually produces a non-zero exit code.
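
The error-only failure signalling can be sketched as below (illustrative Python; the unsatisfied
set is precomputed here, whereas the C# code derives it from the trace matrix):

```python
def enforce_coverage(trace_matrix, unsatisfied_ids, write_error):
    """Sketch of EnforceRequirementsCoverage's failure signalling."""
    if trace_matrix is None:
        # no --tests argument was provided, so there is nothing to enforce against
        write_error("enforcement requires test results (--tests)")
        return
    for req_id in unsatisfied_ids:
        # one error per unsatisfied requirement; no exception is thrown
        write_error(f"requirement not satisfied: {req_id}")

errors = []
enforce_coverage(object(), ["REQ-2", "REQ-5"], errors.append)
print(errors)
```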
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Context` | Created in `Main`; passed to all subsystems; owns output and exit code |
+| `Validation` | Called by `Run` when `--validate` is present |
+| `Linter` | Called by `Run` when `--lint` is present |
+| `Requirements` | Constructed in `ProcessRequirements`; provides the requirement tree |
+| `TraceMatrix` | Constructed in `ProcessRequirements` when test files are present |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/requirements.md b/docs/design/requirements.md
new file mode 100644
index 0000000..be74cc4
--- /dev/null
+++ b/docs/design/requirements.md
@@ -0,0 +1,169 @@
+# Requirements, Section, and Requirement Unit Design
+
+## Overview
+
+The three classes `Requirements`, `Section`, and `Requirement` together form the domain model for
+requirement data in ReqStream. They are responsible for reading YAML files, merging hierarchical
+section trees, validating data integrity, preventing infinite include loops and circular child
+references, applying test mappings, and exporting content to Markdown reports.
+
+## Data Model
+
+### `Requirement`
+
+`Requirement` represents a single requirement node.
+
+| Property | Type | YAML key | Notes |
+| -------- | ---- | -------- | ----- |
+| `Id` | `string` | `id` | Unique across all files; must not be blank |
+| `Title` | `string` | `title` | Human-readable name; must not be blank |
+| `Justification` | `string?` | `justification` | Optional rationale text |
+| `Tests` | `List<string>` | `tests` | Test identifiers linked to this requirement |
+| `Children` | `List<string>` | `children` | IDs of child requirements |
+| `Tags` | `List<string>` | `tags` | Optional labels for filtering |
+
+### `Section`
+
+`Section` is a container node in the requirement hierarchy.
+
+| Property | Type | YAML key | Notes |
+| -------- | ---- | -------- | ----- |
+| `Title` | `string` | `title` | Used to match and merge sections across files |
+| `Requirements` | `List<Requirement>` | `requirements` | Requirements directly in this section |
+| `Sections` | `List<Section>` | `sections` | Child sections |
+
+### `Requirements`
+
+`Requirements` extends `Section` and acts as the root of the tree. In addition to the properties
+inherited from `Section`, it maintains two private fields that span the lifetime of a load
+operation:
+
+| Field | Type | Purpose |
+| ----- | ---- | ------- |
+| `_includedFiles` | `HashSet<string>` | Absolute paths of files already processed; prevents infinite include loops |
+| `_allRequirements` | `Dictionary<string, Requirement>` | Maps requirement ID to its `Requirement` object; detects duplicate IDs and resolves mapping and child references |
+
+## YAML Intermediate Types
+
+YAML is deserialized into a set of intermediate types using `YamlDotNet` with the
+`HyphenatedNamingConvention`:
+
+| Intermediate type | Maps to | Notes |
+| ----------------- | ------- | ----- |
+| `YamlDocument` | Top-level document | Contains `sections`, `mappings`, `includes` |
+| `YamlSection` | `sections[]` entries | Contains `title`, `requirements`, `sections` |
+| `YamlRequirement` | `requirements[]` entries | Contains `id`, `title`, `justification`, `tests`, `children`, `tags` |
+| `YamlMapping` | `mappings[]` entries | Contains `id`, `tests` |
+
+These intermediate types are discarded after `ReadFile` completes; the resulting `Requirement`,
+`Section`, and `Requirements` objects are the only long-lived representations.
+
+## Methods
+
+### `Requirements.Read(paths)`
+
+`Read` is the static factory method that constructs and returns a fully loaded `Requirements`
+instance.
+
+1. Create a new `Requirements` with empty collections.
+2. For each path in `paths`, call `ReadFile(path)`.
+3. Call `ValidateCycles()` to detect circular child-requirement references.
+4. Return the populated `Requirements` instance.
+
+### `ReadFile(path)`
+
+`ReadFile` loads a single YAML file and merges its content into the `Requirements` tree.
+
+1. Normalize `path` to an absolute path.
+2. If the path is already in `_includedFiles`, return immediately (loop prevention).
+3. Add the path to `_includedFiles`.
+4. Read the file text and deserialize it into a `YamlDocument` using `YamlDotNet`. If the document
+ is empty or `null`, return silently.
+5. Validate each section title (must not be blank) and each requirement ID and title (must not be
+ blank). Duplicate IDs are detected against `_allRequirements`; a duplicate causes an
+ `InvalidOperationException` with the file path and conflicting ID.
+6. Call `MergeSection` for each top-level section in the document.
+7. Apply each entry in the document's `mappings` block: find the matching `Requirement` by ID in
+ `_allRequirements` (skipping unknown IDs silently) and append the mapping's tests to
+ `Requirement.Tests`.
+8. For each path in the document's `includes` block, resolve it relative to the current file's
+ directory and call `ReadFile` recursively.
+
+### `MergeSection(parent, yamlSection)`
+
+`MergeSection` integrates a newly parsed section into an existing section tree.
+
+1. Search `parent.Sections` for an existing `Section` whose `Title` equals `yamlSection.Title`.
+2. If a match is found:
+ - Append all requirements from `yamlSection` to the existing section's `Requirements` list.
+ - Recursively call `MergeSection` for each child section in `yamlSection`.
+3. If no match is found:
+ - Create a new `Section` from `yamlSection` and append it to `parent.Sections`.
+
+This algorithm ensures that sections with the same title at the same hierarchy level are merged
+across multiple files, enabling modular requirements management.
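
The merge algorithm can be sketched as below (a Python sketch using plain dicts in place of the
C# `Section` objects):

```python
def merge_section(parent, new_section):
    """Sketch of title-based section merging in MergeSection."""
    for existing in parent["sections"]:
        if existing["title"] == new_section["title"]:
            # same title at the same level: merge rather than duplicate
            existing["requirements"].extend(new_section["requirements"])
            for child in new_section["sections"]:
                merge_section(existing, child)   # recurse into matching subtree
            return
    parent["sections"].append(new_section)       # no match: attach as-is

root = {"title": "", "requirements": [], "sections": []}
merge_section(root, {"title": "Core", "requirements": ["R1"], "sections": []})
merge_section(root, {"title": "Core", "requirements": ["R2"], "sections": []})
print(len(root["sections"]), root["sections"][0]["requirements"])
# → 1 ['R1', 'R2']
```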
+
+### `ValidateCycles()`
+
+`ValidateCycles` performs a depth-first search (DFS) over all requirements to detect circular child
+references. It is called once after all files are loaded.
+
+**Tracking structures**:
+
+| Structure | Type | Purpose |
+| --------- | ---- | ------- |
+| `visiting` | `HashSet<string>` | IDs on the current DFS stack; a hit here indicates a cycle |
+| `path` | `List<string>` | Ordered IDs on the current stack; used to build the error message |
+| `visited` | `HashSet<string>` | IDs whose entire sub-tree is confirmed cycle-free; skipped on future encounters |
+
+**Algorithm** (per requirement):
+
+1. If the ID is in `visited`, return immediately.
+2. If the ID is in `visiting`, a cycle is detected; throw `InvalidOperationException` with the
+ cycle path formatted as `REQ-A -> REQ-B -> ... -> REQ-A`.
+3. Add the ID to `visiting` and `path`.
+4. Recurse into each child ID present in `_allRequirements`.
+5. Remove the ID from `visiting` and `path`; add it to `visited`.
+
+Because `ValidateCycles` runs before any downstream analysis, `TraceMatrix.CollectAllTests` can
+recurse through child requirements without its own cycle guard.
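
The three-set DFS can be sketched as below (illustrative Python; the child graph is passed in as
a dict, whereas the C# code walks `_allRequirements`):

```python
def validate_cycles(children_by_id):
    """Sketch of the three-set DFS used by ValidateCycles."""
    visiting, visited = set(), set()
    path = []

    def visit(req_id):
        if req_id in visited:
            return                          # sub-tree already proven acyclic
        if req_id in visiting:
            cycle = " -> ".join(path + [req_id])
            raise ValueError(f"circular requirement reference: {cycle}")
        visiting.add(req_id)
        path.append(req_id)
        for child in children_by_id.get(req_id, []):
            if child in children_by_id:     # only recurse into known IDs
                visit(child)
        visiting.discard(req_id)
        path.pop()
        visited.add(req_id)

    for req_id in children_by_id:
        visit(req_id)

validate_cycles({"A": ["B"], "B": []})      # acyclic: completes without raising
```

A cyclic graph such as `{"A": ["B"], "B": ["A"]}` raises with the path `A -> B -> A`.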
+
+### Export Methods
+
+| Method | Output | Notes |
+| ------ | ------ | ----- |
+| `Export(filePath, depth, filterTags)` | Requirements Markdown report | Recursive; applies `filterTags` |
+| `ExportJustifications(filePath, depth, filterTags)` | Justifications Markdown report | Recursive with tag filtering |
+
+When `filterTags` is non-`null`, only requirements whose `Tags` list contains at least one
+matching tag are included in the output.
+
+## Validation Error Table
+
+| Check | Condition | Error text |
+| ----- | --------- | ---------- |
+| Section title | Blank | `Section title cannot be blank` |
+| Requirement ID | Blank | `Requirement ID cannot be blank` |
+| Requirement ID | Duplicate | `Duplicate requirement ID found: '{id}'` |
+| Requirement title | Blank | `Requirement title cannot be blank` |
+| Test name | Blank entry in `tests` list | `Test name cannot be blank` |
+| Mapping ID | Blank | `Mapping requirement ID cannot be blank` |
+
+All validation errors throw `InvalidOperationException` and include the source file path for
+actionable debugging.
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Program` | Calls `Requirements.Read`; passes file paths from `Context.RequirementsFiles` |
+| `TraceMatrix` | Receives the populated `Requirements` root and iterates the tree |
+| `Validation` | Exercises `Requirements.Read` with fixture YAML files in tests |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/title.txt b/docs/design/title.txt
new file mode 100644
index 0000000..f86be17
--- /dev/null
+++ b/docs/design/title.txt
@@ -0,0 +1,15 @@
+---
+title: ReqStream Design
+subtitle: Detailed Design for the ReqStream Tool
+author: DEMA Consulting
+description: Detailed design for the ReqStream Tool for managing requirements
+lang: en-US
+keywords:
+ - ReqStream
+ - Design
+ - Detailed Design
+ - Requirements
+ - C#
+ - .NET
+ - Documentation
+---
diff --git a/docs/design/tracematrix.md b/docs/design/tracematrix.md
new file mode 100644
index 0000000..f844733
--- /dev/null
+++ b/docs/design/tracematrix.md
@@ -0,0 +1,157 @@
+# TraceMatrix Unit Design
+
+## Overview
+
+`TraceMatrix` maps test execution results to requirements and calculates requirement-coverage
+metrics. It consumes an already-validated `Requirements` tree and a list of test-result file paths,
+then provides lookup and satisfaction-analysis methods used by `Program` to generate reports and
+enforce coverage.
+
+## Supporting Value Types
+
+### `TestMetrics`
+
+`TestMetrics` is an immutable record that aggregates pass/fail counts for a single named test
+across all loaded result files.
+
+| Property | Type | Formula | Notes |
+| -------- | ---- | ------- | ----- |
+| `Passes` | `int` | — | Total passing executions |
+| `Fails` | `int` | — | Total failing executions |
+| `Executed` | `int` | `Passes + Fails` | Total executions recorded |
+| `AllPassed` | `bool` | `Fails == 0 && Executed > 0` | True only when executed at least once with no failures |
+
+`GetTestResult` returns `TestMetrics(0, 0)` when the test name has no recorded executions, so
+callers always receive a valid object.
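
The derived properties can be sketched as follows (a Python dataclass standing in for the C#
record):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TestMetrics:
    """Sketch of the TestMetrics record's derived properties."""
    passes: int
    fails: int

    @property
    def executed(self):
        return self.passes + self.fails

    @property
    def all_passed(self):
        # executed at least once, with no failures
        return self.fails == 0 and self.executed > 0

print(TestMetrics(0, 0).all_passed, TestMetrics(3, 0).all_passed)
# → False True
```

Note that `AllPassed` is `false` for a never-executed test, which is what makes an unexecuted
test count against requirement satisfaction.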
+
+### `TestExecution`
+
+`TestExecution` is an immutable record that holds the results for one test name from one
+result file.
+
+| Property | Type | Notes |
+| -------- | ---- | ----- |
+| `FileBaseName` | `string` | Base name (no extension) of the result file; used for source-specific matching |
+| `Name` | `string` | Test name as it appears in the result file |
+| `Metrics` | `TestMetrics` | Aggregated pass/fail counts for this test in this file |
+
+## Private State
+
+| Field | Type | Purpose |
+| ----- | ---- | ------- |
+| `_testExecutions` | `Dictionary<string, List<TestExecution>>` | Maps test names to lists of `TestExecution` entries |
+| `_requirements` | `Requirements` | The validated requirement tree; held for iteration in analysis methods |
+
+## Construction
+
+### `TraceMatrix(requirements, testResultFiles)`
+
+The constructor builds the internal test-execution index:
+
+1. Store `requirements` for later iteration.
+2. For each path in `testResultFiles`, call `LoadTestResultFile(path)`.
+3. After all files are loaded, `_testExecutions` contains every unique test name seen, each mapped
+ to a list of `TestExecution` records (one per file that contained that test name).
+
+### `LoadTestResultFile(path)`
+
+`LoadTestResultFile` reads and parses one test-result file.
+
+1. Read the file text.
+2. Call `DemaConsulting.TestResults.IO.Serializer.Deserialize(content)` to auto-detect the format
+ (TRX or JUnit) and parse the results.
+3. If parsing fails, wrap the underlying exception in an `InvalidOperationException` that includes
+ `path` so the caller can identify the offending file.
+4. For each test case in the deserialized result set, create a `TestExecution` with:
+ - `FileBaseName` = `Path.GetFileNameWithoutExtension(path)`
+ - `Name` = test case name
+ - `Metrics` = `TestMetrics(passes, fails)` derived from the test case outcome
+5. Append the `TestExecution` to `_testExecutions[name]`, creating the list entry if absent.
+
+## Methods
+
+### `GetTestResult(testName, sourceFilter)`
+
+`GetTestResult` returns aggregated `TestMetrics` for a named test, with optional source filtering.
+
+**Source-specific format** (`testName` contains `'@'`):
+
+1. Split `testName` on the first `'@'` to obtain `sourcePart` and `namePart`.
+2. Look up `_testExecutions[namePart]`.
+3. Filter the list to entries where `FileBaseName.Contains(sourcePart, OrdinalIgnoreCase)`.
+4. Sum the `Metrics.Passes` and `Metrics.Fails` of the filtered entries.
+5. Return `TestMetrics(totalPasses, totalFails)`.
+
+**Plain format** (`testName` does not contain `'@'`):
+
+1. Look up `_testExecutions[testName]`.
+2. Sum all `Metrics.Passes` and `Metrics.Fails` without source filtering.
+3. Return `TestMetrics(totalPasses, totalFails)`.
+
+If the test name is not found in `_testExecutions`, return `TestMetrics(0, 0)`.
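
Both lookup paths can be sketched together (illustrative Python; executions are modeled as
`(file_base_name, passes, fails)` tuples rather than `TestExecution` records):

```python
def get_test_result(executions, test_name):
    """Sketch of GetTestResult's source-specific vs plain lookup."""
    if "@" in test_name:
        source, _, name = test_name.partition("@")  # split on the first '@'
        entries = [e for e in executions.get(name, [])
                   if source.lower() in e[0].lower()]   # case-insensitive contains
    else:
        entries = executions.get(test_name, [])
    passes = sum(e[1] for e in entries)
    fails = sum(e[2] for e in entries)
    return passes, fails                                # (0, 0) when unknown

runs = {"T1": [("ubuntu-latest", 1, 0), ("windows-latest", 0, 1)]}
print(get_test_result(runs, "T1"), get_test_result(runs, "ubuntu@T1"))
# → (1, 1) (1, 0)
```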
+
+### `CalculateSatisfiedRequirements(filterTags)`
+
+`CalculateSatisfiedRequirements` iterates every requirement in the tree and returns a two-element
+tuple `(satisfied, total)`.
+
+For each requirement (subject to `filterTags` filtering):
+
+1. Increment `total`.
+2. Call `IsRequirementSatisfied(requirement)`.
+3. If satisfied, increment `satisfied`.
+
+Returns `(satisfied, total)`.
+
+### `CollectAllTests(requirement)`
+
+`CollectAllTests` recursively collects every test name associated with a requirement and its
+descendants.
+
+1. Add all entries from `requirement.Tests` to the result set.
+2. For each ID in `requirement.Children`:
+ - Look up the child `Requirement` by ID.
+ - If found, recurse into `CollectAllTests(child)` and union the results.
+3. Return the union set.
+
+Because `Requirements.ValidateCycles()` has already confirmed the child graph is acyclic, this
+method recurses without a cycle guard.
+
+### `IsRequirementSatisfied(requirement)`
+
+`IsRequirementSatisfied` returns `true` if and only if the requirement has passing test coverage.
+
+1. Call `CollectAllTests(requirement)` to obtain the complete set of test names.
+2. If the set is empty, return `false` (no tests mapped — requirement is unsatisfied).
+3. For each test name, call `GetTestResult(testName)`.
+4. If any result has `AllPassed == false`, return `false`.
+5. Return `true`.
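
The collection and satisfaction checks can be sketched together (illustrative Python; the tests
and children are passed as dicts, and `all_passed` stands in for `GetTestResult(...).AllPassed`):

```python
def collect_all_tests(req_id, tests_by_id, children_by_id):
    """Sketch of CollectAllTests: union of own tests and descendants' tests."""
    result = set(tests_by_id.get(req_id, []))
    for child in children_by_id.get(req_id, []):
        if child in tests_by_id or child in children_by_id:
            result |= collect_all_tests(child, tests_by_id, children_by_id)
    return result

def is_satisfied(req_id, tests_by_id, children_by_id, all_passed):
    """Sketch of IsRequirementSatisfied: every collected test must pass."""
    tests = collect_all_tests(req_id, tests_by_id, children_by_id)
    if not tests:
        return False            # no mapped tests means unsatisfied
    return all(all_passed(t) for t in tests)

tests = {"A": ["t1"], "B": ["t2"]}
kids = {"A": ["B"]}
print(is_satisfied("A", tests, kids, lambda t: t == "t1"))
# → False, because the child's test t2 did not pass
```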
+
+### `Export(filePath, depth, filterTags)`
+
+`Export` writes the trace matrix to a Markdown file at `filePath`. The output lists each
+requirement (respecting `filterTags`), its associated tests, and the pass/fail status of each
+test. The heading depth for requirement IDs is controlled by `depth`.
+
+## Test Name Format Summary
+
+| Format | Example | Matching rule |
+| ------ | ------- | ------------- |
+| Plain | `TestFeature_Valid_Passes` | Aggregates across all result files |
+| Source-specific | `ubuntu@TestFeature_Valid_Passes` | Restricted to files whose base name contains `ubuntu` |
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Program` | Constructs `TraceMatrix`; calls `CalculateSatisfiedRequirements` and `Export` |
+| `Requirements` | Provides the requirement tree; iterated during analysis |
+| `Validation` | Exercises `TraceMatrix` with fixture test-result files in validation tests |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
diff --git a/docs/design/validation.md b/docs/design/validation.md
new file mode 100644
index 0000000..03b237f
--- /dev/null
+++ b/docs/design/validation.md
@@ -0,0 +1,101 @@
+# Validation Unit Design
+
+## Overview
+
+`Validation` is the self-validation test runner for ReqStream. Its purpose is to execute a suite
+of end-to-end tests that verify the tool's own behavior and to produce structured test-result
+evidence in TRX or JUnit format. This evidence can then be fed back into ReqStream to validate the
+tool's own requirements — enabling a self-hosting compliance workflow.
+
+All tests run in temporary directories to avoid side effects and are isolated from one another.
+
+## Methods
+
+### `Run(context)`
+
+`Run` is the single public entry point. Its sequence is:
+
+1. Print a header block to `context` containing the tool version, machine name, operating system,
+ .NET runtime version, and current UTC timestamp.
+2. Execute the six validation tests in order, collecting a `TestResult` for each.
+3. Print a summary line showing the number of passed and failed tests.
+4. If `context.ResultsFile` is set, call `WriteResultsFile(context, testResults)`.
+
+The six validation tests are listed in the order they are executed:
+
+| # | Method | What it verifies |
+| - | ------ | ---------------- |
+| 1 | `RunRequirementsProcessingTest` | Requirements YAML files are read, merged, and exported |
+| 2 | `RunTraceMatrixTest` | Test results are loaded and mapped to requirements |
+| 3 | `RunReportExportTest` | Requirements and justifications reports are written correctly |
+| 4 | `RunTagsFilteringTest` | Tag-based filtering restricts output and coverage calculation |
+| 5 | `RunEnforcementModeTest` | `--enforce` produces a non-zero exit code when coverage fails |
+| 6 | `RunLintTest` | The linter detects and reports structural issues in YAML files |
+
+Each test method:
+
+1. Creates a `DirectorySwitch` (see below) to operate in a fresh temporary directory.
+2. Writes one or more YAML or test-result fixture files to the temporary directory.
+3. Invokes a `Program` method or builds a `Context` and executes the relevant workflow.
+4. Asserts the expected outcomes (file content, exit code, error messages).
+5. Returns a `TestResult` with outcome `Passed` or `Failed`.
+
+### `WriteResultsFile(context, testResults)`
+
+`WriteResultsFile` serializes the collected `TestResult` list to a structured file.
+
+**Format dispatch**:
+
+| File extension | Serializer |
+| -------------- | ---------- |
+| `.trx` | TRX serializer (`DemaConsulting.TestResults.IO`) |
+| `.xml` | JUnit serializer (`DemaConsulting.TestResults.IO`) |
+| Any other | Throws `ArgumentException` |
+
+The serializer is invoked with the assembled `TestResults` object and the resolved output path.
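+
+A minimal sketch of the dispatch (Python, with string tags standing in for the real serializer
+calls; the actual code throws `ArgumentException` rather than Python's `ValueError`):
+
```python
import os

def pick_serializer(results_path):
    """Select a serializer tag by file extension, per the dispatch table above."""
    ext = os.path.splitext(results_path)[1].lower()
    if ext == ".trx":
        return "trx"
    if ext == ".xml":
        return "junit"
    raise ValueError(f"Unsupported results-file extension: {ext!r}")
```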
+
+## Supporting Types
+
+### `DirectorySwitch` (nested helper class)
+
+`DirectorySwitch` is an `IDisposable` helper that manages temporary working-directory lifetime for
+test isolation.
+
+**Construction**:
+
+1. Capture `Directory.GetCurrentDirectory()` as the original directory.
+2. Create a new temporary directory (e.g., via `Path.GetTempPath()` + a unique name).
+3. Call `Directory.SetCurrentDirectory` to make the temporary directory the working directory.
+
+**Disposal**:
+
+1. Call `Directory.SetCurrentDirectory` to restore the original directory.
+2. Delete the temporary directory and all its contents recursively.
+
+This pattern guarantees that each test starts with a clean file system state and that no test
+artifacts persist after the test completes, regardless of whether the test passes or fails.
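+
+The same lifecycle maps naturally onto a context manager; an illustrative Python equivalent of
+the construction/disposal steps above:
+
```python
import os
import shutil
import tempfile
from contextlib import contextmanager

@contextmanager
def directory_switch():
    """Run the body inside a fresh temporary working directory, then clean up."""
    original = os.getcwd()       # capture the original directory
    temp = tempfile.mkdtemp()    # create a unique temporary directory
    os.chdir(temp)               # make it the working directory
    try:
        yield temp
    finally:
        os.chdir(original)                       # restore, even if the body raises
        shutil.rmtree(temp, ignore_errors=True)  # delete contents recursively
```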
+
+## Dependencies
+
+| Library / Type | Role |
+| -------------- | ---- |
+| `DemaConsulting.TestResults` | `TestResults`, `TestResult`, `TestOutcome` model types |
+| `DemaConsulting.TestResults.IO.Serializer` | TRX and JUnit file serialization |
+
+## Interactions with Other Units
+
+| Unit | Nature of interaction |
+| ---- | --------------------- |
+| `Context` | Reads `ResultsFile`, `Version`, `Silent`; calls `WriteLine` for headers and summary |
+| `Program` | `Run` internally exercises `Program.Run` or individual workflow methods |
+| `Requirements` | Tests exercise `Requirements.Read` with fixture YAML files |
+| `TraceMatrix` | Tests exercise `TraceMatrix` construction with fixture test-result files |
+| `Linter` | `RunLintTest` exercises `Linter.Lint` with fixture YAML files |
+
+## References
+
+- [ReqStream Architecture][arch]
+- [ReqStream Repository][repo]
+
+[arch]: ../../ARCHITECTURE.md
+[repo]: https://github.com/demaconsulting/ReqStream
From 91047bc6a5d07baf6585ea67a032efc173e825b7 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 12:12:41 +0000
Subject: [PATCH 6/9] fix: address code-review findings across all review-sets
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit
- platform-requirements.yaml: add ReqStream_Lint to all 6 platform requirements
- program.md: fix version priority action (no PrintBanner), PrintBanner output (3 lines not 1), EnforceRequirementsCoverage parameter list
- validation.md: add TemporaryDirectory section, correct DirectorySwitch description, fix test method step 1 description
- requirements.md: correct _allRequirements type from Dictionary<string, string> to Dictionary<string, Requirement>
- tracematrix.md: rename LoadTestResultFile→ProcessTestResultFile, fix GetTestResult signature to remove sourceFilter param
- ValidationTests.cs: add missing assertion for ReqStream_Lint - Passed
- LinterTests.cs: add 5 new tests for blank values (req id, req title, mapping id, test name, tag name)
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/9c0775a2-8fea-461e-abb7-0b8db2817031
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
docs/design/program.md | 11 +-
docs/design/requirements.md | 2 +-
docs/design/tracematrix.md | 19 +--
docs/design/validation.md | 32 ++++--
docs/reqstream/platform-requirements.yaml | 6 +
.../LinterTests.cs | 108 ++++++++++++++++++
.../ValidationTests.cs | 1 +
7 files changed, 155 insertions(+), 24 deletions(-)
diff --git a/docs/design/program.md b/docs/design/program.md
index 0968fea..f077877 100644
--- a/docs/design/program.md
+++ b/docs/design/program.md
@@ -56,7 +56,7 @@ order; if a step applies, execution returns immediately without reaching later s
| Priority | Condition | Action |
| -------- | --------- | ------ |
-| 1 | `context.Version` is `true` | Call `PrintBanner`; print version only; return |
+| 1 | `context.Version` is `true` | Print version string only; return |
| 2 | (always) | Call `PrintBanner` |
| 3 | `context.Help` is `true` | Call `PrintHelp`; return |
| 4 | `context.Validate` is `true` | Call `Validation.Run(context)`; return |
@@ -65,9 +65,9 @@ order; if a step applies, execution returns immediately without reaching later s
### `PrintBanner`
-`PrintBanner` writes a single line to `context` containing the tool name, version string, and
-copyright notice. It is called at priority step 2 (and also at step 1 for the version query) so
-that every non-trivial invocation identifies the running version.
+`PrintBanner` writes three lines to `context`: the tool name with version string, the copyright
+notice, and a blank line. It is called at priority step 2 for all invocations except version
+queries, so that every non-trivial invocation identifies the running version.
### `PrintHelp`
@@ -88,8 +88,7 @@ Its internal sequence is:
the test result files.
5. If `context.Matrix` is set and a trace matrix was constructed, export the matrix report at
`context.MatrixDepth`.
-6. If `context.Enforce` is `true`, call `EnforceRequirementsCoverage(context, requirements,
- traceMatrix)`.
+6. If `context.Enforce` is `true`, call `EnforceRequirementsCoverage(context, traceMatrix)`.
All export methods respect `context.FilterTags` for tag-filtered output.
diff --git a/docs/design/requirements.md b/docs/design/requirements.md
index be74cc4..cd58d64 100644
--- a/docs/design/requirements.md
+++ b/docs/design/requirements.md
@@ -41,7 +41,7 @@ operation:
| Field | Type | Purpose |
| ----- | ---- | ------- |
| `_includedFiles` | `HashSet<string>` | Absolute paths of files already processed; prevents infinite include loops |
-| `_allRequirements` | `Dictionary<string, string>` | Maps requirement ID to source file path; detects duplicate IDs |
+| `_allRequirements` | `Dictionary<string, Requirement>` | Maps requirement ID to the owning `Requirement` object; detects duplicate IDs and enables child-requirement lookup |
## YAML Intermediate Types
diff --git a/docs/design/tracematrix.md b/docs/design/tracematrix.md
index f844733..80c1357 100644
--- a/docs/design/tracematrix.md
+++ b/docs/design/tracematrix.md
@@ -49,32 +49,33 @@ result file.
The constructor builds the internal test-execution index:
1. Store `requirements` for later iteration.
-2. For each path in `testResultFiles`, call `LoadTestResultFile(path)`.
+2. For each path in `testResultFiles`, call `ProcessTestResultFile(path)`.
3. After all files are loaded, `_testExecutions` contains every unique test name seen, each mapped
to a list of `TestExecution` records (one per file that contained that test name).
-### `LoadTestResultFile(path)`
+### `ProcessTestResultFile(filePath)`
-`LoadTestResultFile` reads and parses one test-result file.
+`ProcessTestResultFile` reads and parses one test-result file.
1. Read the file text.
2. Call `DemaConsulting.TestResults.IO.Serializer.Deserialize(content)` to auto-detect the format
(TRX or JUnit) and parse the results.
3. If parsing fails, wrap the underlying exception in an `InvalidOperationException` that includes
- `path` so the caller can identify the offending file.
+ `filePath` so the caller can identify the offending file.
4. For each test case in the deserialized result set, create a `TestExecution` with:
- - `FileBaseName` = `Path.GetFileNameWithoutExtension(path)`
+ - `FileBaseName` = `Path.GetFileNameWithoutExtension(filePath)`
- `Name` = test case name
- `Metrics` = `TestMetrics(passes, fails)` derived from the test case outcome
5. Append the `TestExecution` to `_testExecutions[name]`, creating the list entry if absent.
## Methods
-### `GetTestResult(testName, sourceFilter)`
+### `GetTestResult(testName)`
-`GetTestResult` returns aggregated `TestMetrics` for a named test, with optional source filtering.
+`GetTestResult` returns aggregated `TestMetrics` for a named test, with optional source filtering
+encoded in the `testName` parameter itself.
-**Source-specific format** (`testName` contains `'@'`):
+**Source-specific format** (`testName` contains an `'@'` that is neither the first nor the last character):
1. Split `testName` on the first `'@'` to obtain `sourcePart` and `namePart`.
2. Look up `_testExecutions[namePart]`.
@@ -82,7 +83,7 @@ The constructor builds the internal test-execution index:
4. Sum the `Metrics.Passes` and `Metrics.Fails` of the filtered entries.
5. Return `TestMetrics(totalPasses, totalFails)`.
-**Plain format** (`testName` does not contain `'@'`):
+**Plain format** (`testName` does not contain a valid `'@'` separator):
1. Look up `_testExecutions[testName]`.
2. Sum all `Metrics.Passes` and `Metrics.Fails` without source filtering.
diff --git a/docs/design/validation.md b/docs/design/validation.md
index 03b237f..e66826c 100644
--- a/docs/design/validation.md
+++ b/docs/design/validation.md
@@ -34,7 +34,8 @@ The six validation tests are listed in the order they are executed:
Each test method:
-1. Creates a `DirectorySwitch` (see below) to operate in a fresh temporary directory.
+1. Creates a `TemporaryDirectory` for isolation and uses `DirectorySwitch` (see below) to operate
+ within it.
2. Writes one or more YAML or test-result fixture files to the temporary directory.
3. Invokes a `Program` method or builds a `Context` and executes the relevant workflow.
4. Asserts the expected outcomes (file content, exit code, error messages).
@@ -56,24 +57,39 @@ The serializer is invoked with the assembled `TestResults` object and the resolv
## Supporting Types
+### `TemporaryDirectory` (nested helper class)
+
+`TemporaryDirectory` is an `IDisposable` helper that creates and manages the lifetime of a
+temporary directory.
+
+**Construction**:
+
+1. Compose a unique path under `Path.GetTempPath()` using a GUID suffix.
+2. Call `Directory.CreateDirectory` to create the directory.
+3. Expose the path via the `DirectoryPath` property.
+
+**Disposal**:
+
+1. If the directory still exists, call `Directory.Delete` recursively to remove it and all
+ contents.
+
### `DirectorySwitch` (nested helper class)
-`DirectorySwitch` is an `IDisposable` helper that manages temporary working-directory lifetime for
-test isolation.
+`DirectorySwitch` is an `IDisposable` helper that temporarily changes the process working directory.
**Construction**:
1. Capture `Directory.GetCurrentDirectory()` as the original directory.
-2. Create a new temporary directory (e.g., via `Path.GetTempPath()` + a unique name).
-3. Call `Directory.SetCurrentDirectory` to make the temporary directory the working directory.
+2. Call `Directory.SetCurrentDirectory` to switch to the supplied `newDirectory`.
**Disposal**:
1. Call `Directory.SetCurrentDirectory` to restore the original directory.
-2. Delete the temporary directory and all its contents recursively.
-This pattern guarantees that each test starts with a clean file system state and that no test
-artifacts persist after the test completes, regardless of whether the test passes or fails.
+Each test uses both classes together: `TemporaryDirectory` owns the directory lifetime and
+`DirectorySwitch` makes it the working directory for the duration of the test. This pattern
+guarantees that each test starts with a clean file system state and that no test artifacts persist
+after the test completes, regardless of whether the test passes or fails.
## Dependencies
diff --git a/docs/reqstream/platform-requirements.yaml b/docs/reqstream/platform-requirements.yaml
index 2f68a8b..dae8866 100644
--- a/docs/reqstream/platform-requirements.yaml
+++ b/docs/reqstream/platform-requirements.yaml
@@ -39,6 +39,7 @@ sections:
- "windows@ReqStream_ReportExport"
- "windows@ReqStream_TagsFiltering"
- "windows@ReqStream_EnforcementMode"
+ - "windows@ReqStream_Lint"
- id: ReqStream-Plt-Linux
title: The tool shall run on Linux operating systems.
@@ -53,6 +54,7 @@ sections:
- "ubuntu@ReqStream_ReportExport"
- "ubuntu@ReqStream_TagsFiltering"
- "ubuntu@ReqStream_EnforcementMode"
+ - "ubuntu@ReqStream_Lint"
- id: ReqStream-Plt-MacOS
title: The tool shall run on macOS operating systems.
@@ -67,6 +69,7 @@ sections:
- "macos@ReqStream_ReportExport"
- "macos@ReqStream_TagsFiltering"
- "macos@ReqStream_EnforcementMode"
+ - "macos@ReqStream_Lint"
- id: ReqStream-Plt-Net8
title: The tool shall support .NET 8.0 runtime.
@@ -80,6 +83,7 @@ sections:
- "dotnet8.x@ReqStream_ReportExport"
- "dotnet8.x@ReqStream_TagsFiltering"
- "dotnet8.x@ReqStream_EnforcementMode"
+ - "dotnet8.x@ReqStream_Lint"
- id: ReqStream-Plt-Net9
title: The tool shall support .NET 9.0 runtime.
@@ -94,6 +98,7 @@ sections:
- "dotnet9.x@ReqStream_ReportExport"
- "dotnet9.x@ReqStream_TagsFiltering"
- "dotnet9.x@ReqStream_EnforcementMode"
+ - "dotnet9.x@ReqStream_Lint"
- id: ReqStream-Plt-Net10
title: The tool shall support .NET 10.0 runtime.
@@ -108,3 +113,4 @@ sections:
- "dotnet10.x@ReqStream_ReportExport"
- "dotnet10.x@ReqStream_TagsFiltering"
- "dotnet10.x@ReqStream_EnforcementMode"
+ - "dotnet10.x@ReqStream_Lint"
diff --git a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
index b97edda..fdaff95 100644
--- a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
+++ b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
@@ -565,4 +565,112 @@ public void Linter_Lint_ErrorFormat_IncludesFileAndLocation()
// Error should include 'error:' severity
Assert.Contains("error:", errors);
}
+
+ /// <summary>
+ /// Test that a requirement with a blank id reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankRequirementId_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-req-id.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: ''
+ title: Test requirement
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Requirement 'id' cannot be blank", errors);
+ }
+
+ /// <summary>
+ /// Test that a requirement with a blank title reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankRequirementTitle_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-req-title.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: ''
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Requirement 'title' cannot be blank", errors);
+ }
+
+ /// <summary>
+ /// Test that a mapping with a blank id reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankMappingId_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-mapping-id.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+mappings:
+ - id: ''
+ tests:
+ - SomeTest
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Mapping 'id' cannot be blank", errors);
+ }
+
+ /// <summary>
+ /// Test that a blank test name in a requirement reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankTestName_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-test-name.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+ tests:
+ - ''
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Test name cannot be blank", errors);
+ }
+
+ /// <summary>
+ /// Test that a blank tag name in a requirement reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankTagName_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-tag-name.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+ tags:
+ - ''
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Tag name cannot be blank", errors);
+ }
}
diff --git a/test/DemaConsulting.ReqStream.Tests/ValidationTests.cs b/test/DemaConsulting.ReqStream.Tests/ValidationTests.cs
index 4c04f9f..757f90c 100644
--- a/test/DemaConsulting.ReqStream.Tests/ValidationTests.cs
+++ b/test/DemaConsulting.ReqStream.Tests/ValidationTests.cs
@@ -90,6 +90,7 @@ public void Validation_Run_WithSilentContext_CompletesSuccessfully()
Assert.Contains("ReqStream_ReportExport - Passed", logContent);
Assert.Contains("ReqStream_TagsFiltering - Passed", logContent);
Assert.Contains("ReqStream_EnforcementMode - Passed", logContent);
+ Assert.Contains("ReqStream_Lint - Passed", logContent);
Assert.Contains("Failed: 0", logContent);
}
From cf76022bfff91fc08ddbf3168e5ed042f6c92cc2 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 12:26:36 +0000
Subject: [PATCH 7/9] fix: resolve lint issues (cspell, markdownlint)
- docs/guide/guide.md: replace 'reqs' path abbreviation with 'requirements' in example output block (cspell unknown word)
- docs/guide/guide.md: shorten 'Unknown requirement field' table row by 1 char to stay within 120-char line limit (MD013)
- docs/design/requirements.md: shorten _allRequirements description from 175 chars to fit within 120-char line limit (MD013)
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/f32e90db-541b-4613-a757-4c714b31cf8b
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
docs/design/requirements.md | 2 +-
docs/guide/guide.md | 8 ++++----
2 files changed, 5 insertions(+), 5 deletions(-)
diff --git a/docs/design/requirements.md b/docs/design/requirements.md
index cd58d64..8a2b65e 100644
--- a/docs/design/requirements.md
+++ b/docs/design/requirements.md
@@ -41,7 +41,7 @@ operation:
| Field | Type | Purpose |
| ----- | ---- | ------- |
| `_includedFiles` | `HashSet<string>` | Absolute paths of files already processed; prevents infinite include loops |
-| `_allRequirements` | `Dictionary<string, Requirement>` | Maps requirement ID to the owning `Requirement` object; detects duplicate IDs and enables child-requirement lookup |
+| `_allRequirements` | `Dictionary<string, Requirement>` | Maps requirement ID to `Requirement`; detects duplicates |
## YAML Intermediate Types
diff --git a/docs/guide/guide.md b/docs/guide/guide.md
index b178778..670fc4c 100644
--- a/docs/guide/guide.md
+++ b/docs/guide/guide.md
@@ -623,9 +623,9 @@ reqstream --requirements "docs/**/*.yaml" --lint
**Example output when issues are found:**
```text
-docs/reqs/unit.yaml(42,5): error: Unknown field 'tittle' in requirement
-docs/reqs/unit.yaml(57,13): error: Duplicate requirement ID 'REQ-001' (first seen in docs/reqs/base.yaml)
-docs/reqs/other.yaml(10,1): error: Section missing required field 'title'
+docs/requirements/unit.yaml(42,5): error: Unknown field 'tittle' in requirement
+docs/requirements/unit.yaml(57,13): error: Duplicate requirement ID 'REQ-001' (first seen in docs/requirements/base.yaml)
+docs/requirements/other.yaml(10,1): error: Section missing required field 'title'
```
**Example output when no issues are found:**
@@ -644,7 +644,7 @@ suitable for use in CI/CD quality gates.
| Malformed YAML | File cannot be parsed as valid YAML |
| Unknown document field | Top-level key other than `sections`, `mappings`, or `includes` |
| Unknown section field | Section key other than `title`, `requirements`, or `sections` |
-| Unknown requirement field | Requirement key other than `id`, `title`, `justification`, `tests`, `children`, or `tags` |
+| Unknown requirement field | Requirement key other than `id`, `title`, `justification`, `tests`, `children`, `tags` |
| Unknown mapping field | Mapping key other than `id` or `tests` |
| Missing section title | Section does not have a `title` field |
| Blank section title | Section `title` is empty or whitespace |
From bb9fa058f02b8f8ccadd3998737c776cc2cb7dcf Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 12:37:55 +0000
Subject: [PATCH 8/9] fix: apply code review feedback on Linter
- Linter.cs: change visitedFiles to StringComparer.Ordinal (was OrdinalIgnoreCase)
- Linter.cs: ParseYaml now returns YamlNode? so non-mapping roots emit an error
- Linter.cs: add GetSequenceChecked helper; detect key-exists-but-wrong-type for
sections/mappings/requirements fields and report an error
- LinterTests.cs: add Linter_Lint_WithBlankMappingTestName_ReportsError
- LinterTests.cs: add Linter_Lint_WithNonMappingRoot_ReportsError
- unit-linter.yaml: split ReqStream-Lint-UnknownMappingField from new
ReqStream-Lint-MissingMappingId; add ReqStream-Lint-BlankTestName and
ReqStream-Lint-BlankTagName; extend ReqStream-Lint-MissingRequirementFields
to cover blank id/title
- linter.md: update design doc to reflect non-mapping root check and
sequence-type-mismatch detection
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/b010a2d9-07e7-492f-a5fb-4ebeb03ed061
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
docs/design/linter.md | 26 ++++---
docs/reqstream/unit-linter.yaml | 39 ++++++++++-
src/DemaConsulting.ReqStream/Linter.cs | 67 +++++++++++++++----
.../LinterTests.cs | 41 ++++++++++++
4 files changed, 147 insertions(+), 26 deletions(-)
diff --git a/docs/design/linter.md b/docs/design/linter.md
index a854c8b..4d3eca0 100644
--- a/docs/design/linter.md
+++ b/docs/design/linter.md
@@ -65,20 +65,24 @@ causes the process to exit with code `1`.
5. Attempt to parse the text with `YamlStream.Load()`. If a `YamlException` is thrown (malformed
YAML), call `context.WriteError` with the exception message and return; do not attempt to lint
further.
-6. For each `YamlDocument` in the stream, call `LintDocument(context, path, doc, seenIds)`.
-7. After all documents are linted, locate the `includes:` sequence in the root mapping (if present)
+6. If the stream has no documents (empty file), return immediately — empty files are valid.
+7. If the root node is present but is not a `YamlMappingNode` (e.g. a top-level sequence or scalar),
+ emit an error at the node's position and return.
+8. Call `LintDocumentRoot(context, path, root, seenIds)` with the mapping root.
+9. After all documents are linted, locate the `includes:` sequence in the root mapping (if present)
and for each scalar entry call `LintFile` recursively, resolving the include path relative to
the directory of the current file.
-### `LintDocument(context, path, doc, seenIds)`
+### `LintDocumentRoot(context, path, root, seenIds)`
-`LintDocument` validates the top-level structure of a single YAML document.
+`LintDocumentRoot` validates the top-level structure of a single YAML document.
-1. Assert that the document root is a `YamlMappingNode`; if not, emit an error and return.
-2. For each key in the root mapping, check that it is a member of `KnownDocumentFields`; if not,
+1. For each key in the root mapping, check that it is a member of `KnownDocumentFields`; if not,
emit an unknown-field error at the key's position.
-3. Locate the `sections:` node and delegate to `LintSections`.
-4. Locate the `mappings:` node and delegate to `LintMappings`.
+2. Locate the `sections:` node. If the key exists but its value is not a `YamlSequenceNode`, emit a
+ type-mismatch error. Otherwise delegate to `LintSections`.
+3. Locate the `mappings:` node. If the key exists but its value is not a `YamlSequenceNode`, emit a
+ type-mismatch error. Otherwise delegate to `LintMappings`.
### `LintSections(context, path, sectionsNode, seenIds)`
@@ -91,8 +95,10 @@ causes the process to exit with code `1`.
1. Assert `sectionNode` is a `YamlMappingNode`; emit an error and return if not.
2. For each key, check against `KnownSectionFields`; emit an unknown-field error for any unknown key.
3. Check that `title` is present and non-blank; emit an error if missing or blank.
-4. If `sections:` is present, call `LintSections` recursively.
-5. If `requirements:` is present, call `LintRequirements`.
+4. If `sections:` key is present but its value is not a sequence, emit a type-mismatch error;
+ otherwise call `LintSections` recursively.
+5. If `requirements:` key is present but its value is not a sequence, emit a type-mismatch error;
+ otherwise call `LintRequirements`.
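+
+As an illustration, a hypothetical fragment that would trip this check, producing a
+`Field 'requirements' must be a sequence` error:
+
```yaml
sections:
  - title: Example Section
    # A scalar where a sequence is required triggers the type-mismatch error
    requirements: oops
```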
### `LintRequirements(context, path, requirementsNode, seenIds)`
diff --git a/docs/reqstream/unit-linter.yaml b/docs/reqstream/unit-linter.yaml
index 7345782..03c7297 100644
--- a/docs/reqstream/unit-linter.yaml
+++ b/docs/reqstream/unit-linter.yaml
@@ -57,15 +57,17 @@ sections:
- Linter_Lint_WithNestedSectionIssues_ReportsError
- id: ReqStream-Lint-MissingRequirementFields
- title: The linter shall report an error when a requirement is missing the required id or title field.
+ title: The linter shall report an error when a requirement is missing or has a blank id or title field.
justification: |
- The id and title fields are mandatory for all requirements. Missing them causes downstream
- processing failures and produces incomplete reports.
+ The id and title fields are mandatory for all requirements. Missing or blank values cause downstream
+ processing failures and produce incomplete reports.
tags:
- lint
tests:
- Linter_Lint_WithRequirementMissingId_ReportsError
- Linter_Lint_WithRequirementMissingTitle_ReportsError
+ - Linter_Lint_WithBlankRequirementId_ReportsError
+ - Linter_Lint_WithBlankRequirementTitle_ReportsError
- id: ReqStream-Lint-DuplicateIds
title: The linter shall report an error when duplicate requirement IDs are found.
@@ -128,4 +130,35 @@ sections:
- lint
tests:
- Linter_Lint_WithUnknownMappingField_ReportsError
+
+ - id: ReqStream-Lint-MissingMappingId
+ title: The linter shall report an error when a test mapping is missing or has a blank id field.
+ justification: |
+ The id field is mandatory for all test mappings. Missing or blank values prevent the mapping
+ from being associated with a requirement and cause downstream processing failures.
+ tags:
+ - lint
+ tests:
- Linter_Lint_WithMappingMissingId_ReportsError
+ - Linter_Lint_WithBlankMappingId_ReportsError
+
+ - id: ReqStream-Lint-BlankTestName
+ title: The linter shall report an error when a test name in a requirement or mapping is blank.
+ justification: |
+ Blank test names cannot be matched against test result files and would silently break
+ coverage calculations.
+ tags:
+ - lint
+ tests:
+ - Linter_Lint_WithBlankTestName_ReportsError
+ - Linter_Lint_WithBlankMappingTestName_ReportsError
+
+ - id: ReqStream-Lint-BlankTagName
+ title: The linter shall report an error when a tag name in a requirement is blank.
+ justification: |
+ Blank tag names cannot be used for filtering and would silently produce incorrect filtered
+ exports.
+ tags:
+ - lint
+ tests:
+ - Linter_Lint_WithBlankTagName_ReportsError
diff --git a/src/DemaConsulting.ReqStream/Linter.cs b/src/DemaConsulting.ReqStream/Linter.cs
index e9fc723..73f6870 100644
--- a/src/DemaConsulting.ReqStream/Linter.cs
+++ b/src/DemaConsulting.ReqStream/Linter.cs
@@ -74,7 +74,7 @@ public static void Lint(Context context, IReadOnlyList<string> files)
var seenIds = new Dictionary<string, string>(StringComparer.Ordinal);
// Track all visited files to avoid linting the same file twice (following includes)
- var visitedFiles = new HashSet<string>(StringComparer.OrdinalIgnoreCase);
+ var visitedFiles = new HashSet<string>(StringComparer.Ordinal);
// Count total issues
var issueCount = 0;
@@ -146,10 +146,10 @@ private static int LintFile(
}
// Parse the YAML into a node tree
- YamlMappingNode? root;
+ YamlNode? rawRoot;
try
{
- root = ParseYaml(yaml);
+ rawRoot = ParseYaml(yaml);
}
catch (Exception ex)
{
@@ -162,11 +162,19 @@ private static int LintFile(
}
// Empty documents are valid
- if (root == null)
+ if (rawRoot == null)
{
return 0;
}
+ // Document root must be a mapping node
+ if (rawRoot is not YamlMappingNode root)
+ {
+ context.WriteError(
+ $"{path}({rawRoot.Start.Line},{rawRoot.Start.Column}): error: Document root must be a mapping");
+ return 1;
+ }
+
// Lint document root fields
issueCount += LintDocumentRoot(context, path, root, seenIds);
@@ -191,12 +199,12 @@ private static int LintFile(
}
/// <summary>
- /// Parses YAML text into a mapping node, or returns null for empty documents.
+ /// Parses YAML text and returns the root node, or returns null for empty documents.
/// </summary>
/// <param name="yaml">The YAML text to parse.</param>
- /// <returns>The root mapping node, or null if the document is empty.</returns>
+ /// <returns>The root node, or null if the document is empty.</returns>
/// <exception cref="YamlException">Thrown when the YAML is malformed.</exception>
- private static YamlMappingNode? ParseYaml(string yaml)
+ private static YamlNode? ParseYaml(string yaml)
{
var stream = new YamlStream();
using var reader = new StringReader(yaml);
@@ -207,8 +215,7 @@ private static int LintFile(
return null;
}
- var rootNode = stream.Documents[0].RootNode;
- return rootNode as YamlMappingNode;
+ return stream.Documents[0].RootNode;
}
/// <summary>
@@ -240,7 +247,7 @@ private static int LintDocumentRoot(
}
// Lint sections
- var sections = GetSequence(root, "sections");
+ var sections = GetSequenceChecked(context, path, root, "sections", ref issueCount);
if (sections != null)
{
foreach (var sectionNode in sections.Children)
@@ -259,7 +266,7 @@ private static int LintDocumentRoot(
}
// Lint mappings
- var mappings = GetSequence(root, "mappings");
+ var mappings = GetSequenceChecked(context, path, root, "mappings", ref issueCount);
if (mappings != null)
{
foreach (var mappingNode in mappings.Children)
@@ -324,7 +331,7 @@ private static int LintSection(
}
// Lint requirements
- var requirements = GetSequence(section, "requirements");
+ var requirements = GetSequenceChecked(context, path, section, "requirements", ref issueCount);
if (requirements != null)
{
foreach (var reqNode in requirements.Children)
@@ -343,7 +350,7 @@ private static int LintSection(
}
// Lint child sections
- var sections = GetSequence(section, "sections");
+ var sections = GetSequenceChecked(context, path, section, "sections", ref issueCount);
if (sections != null)
{
foreach (var childNode in sections.Children)
@@ -532,6 +539,40 @@ private static int LintMapping(
return issueCount;
}
+ /// <summary>
+ /// Gets a sequence node from a mapping node by key, reporting a type mismatch error if the
+ /// key exists but the value is not a sequence.
+ /// </summary>
+ /// <param name="context">The context for output.</param>
+ /// <param name="path">The file path for error messages.</param>
+ /// <param name="mapping">The mapping node to search.</param>
+ /// <param name="key">The key to look up.</param>
+ /// <param name="issues">Incremented by one when a type mismatch error is reported.</param>
+ /// <returns>The sequence node, or null if not found or a type error was reported.</returns>
+ private static YamlSequenceNode? GetSequenceChecked(
+ Context context,
+ string path,
+ YamlMappingNode mapping,
+ string key,
+ ref int issues)
+ {
+ var keyNode = new YamlScalarNode(key);
+ if (!mapping.Children.TryGetValue(keyNode, out var value))
+ {
+ return null;
+ }
+
+ if (value is YamlSequenceNode seq)
+ {
+ return seq;
+ }
+
+ context.WriteError(
+ $"{path}({value.Start.Line},{value.Start.Column}): error: Field '{key}' must be a sequence");
+ issues++;
+ return null;
+ }
+
/// <summary>
/// Gets a scalar node value from a mapping node by key.
/// </summary>
diff --git a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
index fdaff95..36174dd 100644
--- a/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
+++ b/test/DemaConsulting.ReqStream.Tests/LinterTests.cs
@@ -673,4 +673,45 @@ public void Linter_Lint_WithBlankTagName_ReportsError()
Assert.AreEqual(1, exitCode);
Assert.Contains("Tag name cannot be blank", errors);
}
+
+ /// <summary>
+ /// Test that a mapping with a blank test name reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithBlankMappingTestName_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "blank-mapping-test-name.yaml");
+ File.WriteAllText(reqFile, @"sections:
+ - title: Test Section
+ requirements:
+ - id: REQ-001
+ title: Test requirement
+mappings:
+ - id: REQ-001
+ tests:
+ - ''
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Test name cannot be blank in mapping", errors);
+ }
+
+ /// <summary>
+ /// Test that a requirements file with a non-mapping root (e.g. a top-level sequence) reports an error.
+ /// </summary>
+ [TestMethod]
+ public void Linter_Lint_WithNonMappingRoot_ReportsError()
+ {
+ var reqFile = Path.Combine(_testDirectory, "non-mapping-root.yaml");
+ File.WriteAllText(reqFile, @"- item1
+- item2
+");
+
+ var (exitCode, errors) = RunLint(reqFile);
+
+ Assert.AreEqual(1, exitCode);
+ Assert.Contains("Document root must be a mapping", errors);
+ }
}
From 0ebedf5cb7b5f192f603a6a6b21e54a1e6240787 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Fri, 27 Mar 2026 12:50:17 +0000
Subject: [PATCH 9/9] docs: update README.md for --lint feature
- Add Linting to Features list
- Add --lint to help output block
- Update self-validation output to show 6 tests including ReqStream_Lint
- Add ReqStream_Lint description to self-validation section
Agent-Logs-Url: https://github.com/demaconsulting/ReqStream/sessions/8a2ef406-6eec-4487-b1b1-0dd6e402fa20
Co-authored-by: Malcolmnixon <1863707+Malcolmnixon@users.noreply.github.com>
---
README.md | 8 ++++++--
1 file changed, 6 insertions(+), 2 deletions(-)
diff --git a/README.md b/README.md
index f9dc9ec..f8e281c 100644
--- a/README.md
+++ b/README.md
@@ -25,6 +25,7 @@ create, validate, and manage requirement documents in a structured and maintaina
- 🧪 **Test Mapping** - Link requirements to test cases for traceability
- 📦 **File Includes** - Modularize requirements across multiple YAML files
- ✅ **Validation** - Built-in validation for requirement structure and references
+- 🔍 **Linting** - Validate requirements YAML structure and report all issues in one pass
- 🏷️ **Tag Filtering** - Categorize and filter requirements using tags
- 📋 **Justifications** - Document the rationale behind each requirement
- 🔒 **Continuous Compliance** - Compliance evidence generated automatically on every CI run, following
@@ -97,6 +98,7 @@ Options:
--validate Run self-validation
--results Write validation results to file (TRX or JUnit format)
--log Write output to log file
+ --lint Lint requirements files for structural issues
--requirements Requirements files glob pattern
--report Export requirements to markdown file
--report-depth Markdown header depth for requirements report (default: 1)
@@ -129,9 +131,10 @@ Running self-validation produces a report containing the following information:
✓ ReqStream_ReportExport - Passed
✓ ReqStream_TagsFiltering - Passed
✓ ReqStream_EnforcementMode - Passed
+✓ ReqStream_Lint - Passed
-Total Tests: 5
-Passed: 5
+Total Tests: 6
+Passed: 6
Failed: 0
```
@@ -142,6 +145,7 @@ Each test in the report proves:
- **`ReqStream_ReportExport`** - requirements report is correctly exported to a markdown file.
- **`ReqStream_TagsFiltering`** - requirements are correctly filtered by tags.
- **`ReqStream_EnforcementMode`** - enforcement mode correctly validates requirement test coverage.
+- **`ReqStream_Lint`** - linter correctly validates requirements YAML file structure and reports all issues.
See the [User Guide][link-guide] for more details on the self-validation tests.