Byte slice references#8
Merged
jensneuse merged 7 commits into add-start-end-position-to-nodes from …ByteSliceReferences on Jan 17, 2019

Conversation
jensneuse added a commit that referenced this pull request on Mar 11, 2020

Refactoring of subscriptions
devsergiy pushed a commit that referenced this pull request on Aug 22, 2023

* re-enable normalization with inline fragments
* chore: temporarily comment out broken tests
* fix field selection merging
* fix validation tests

Co-authored-by: David Stutt <david@wundergraph.com>
buger added a commit to buger/graphql-go-tools that referenced this pull request on Apr 30, 2026

…DC tests
- Remove tests/resolve-engine/ FLIP fixtures (708 auto-generated JSON files
plus mcdc-SYS-REQ-001-s1.json). They are not load-bearing in this audit:
no component declares fixture-backed evidence in proof.yaml, so the
flip_fixtures_exist and fixture_staleness_clean audit checks already pass
with "no components declare fixture-backed evidence -- not applicable".
Added tests/resolve-engine/ to .gitignore so locally regenerated fixtures
do not slip back into the tree.
- Remove pkg/doc.go workaround. It existed to satisfy reqproof's hardcoded
./pkg/... test scope (BUG-008); that bug is now fixed in reqproof itself
(per-target scope from proof.yaml is honored).
- Replace stub MCDC witnesses with real exercising tests where feasible:
* SYS-REQ-001 -> annotation moved onto TestCVE_BUG004_SkipFieldDepthPanic
* SYS-REQ-002 -> annotation moved onto TestCVE_BUG005_SkipFieldEmptyTypeNamesPanic
+ new TestMCDCReal_SYSREQ002_SkipFieldOnTypeNamesTrueRow
* SYS-REQ-003 -> new TestMCDCReal_SYSREQ003_FieldInfoMergeKeepsSlicesParallel
* SYS-REQ-004 -> new TestMCDCReal_SYSREQ004_FieldCopyPreservesParentOnTypeNames
* SYS-REQ-005 -> annotation moved onto TestCVE_BUG008_NilAsyncErrorWriterPanic
* SYS-REQ-010 -> new TestMCDCReal_SYSREQ010_ResolvableResetClearsState
* SYS-REQ-011 -> new TestMCDCReal_SYSREQ011_LoaderFreeNilsReferences
- Rename mcdc_witnesses_full_test.go to mcdc_witnesses_pending_test.go and
add an honest header explaining that the remaining 173 stubs satisfy the
spec-side row-coverage tracker but not code-level execution evidence; each
pending entry calls out the test infrastructure (federation simulator,
subgraph HTTP fixtures, subscription harnesses, planner-driven walker
setup) it still needs.
Diff stats: from 958 files +55,070/-2 to 246 files +332/-31,463.
Audit shape unchanged: 1 error from intentional CVE reproducer fails
(disclosure evidence for BUG wundergraph#4/wundergraph#5/wundergraph#8) and 1 warning from suspect_clean.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
This PR adds start and end positions to tokens. This enables the parser to know the exact structure of a GraphQL document.
Along the way, the token lexer was refactored to simplify the code and reduce the number of moving parts. The lexer no longer emits tokens carrying byte slices; instead, it returns a reference to the start and end positions within the input byte slice. This not only simplifies the lexer codebase but also improves performance by ~10%, which isn't too bad.
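The idea can be sketched as a pair of offsets into the lexer's input instead of a copied `[]byte`. This is a minimal illustration only; the type and field names (`ByteSliceReference`, `Token`, `Keyword`) are assumptions and may not match the library's actual API.

```go
package main

import "fmt"

// ByteSliceReference points into the lexer's input instead of
// carrying its own copy of the bytes (hypothetical sketch).
type ByteSliceReference struct {
	Start uint32 // inclusive start offset in the input
	End   uint32 // exclusive end offset in the input
}

// Token carries a reference plus position info rather than a []byte,
// so emitting a token allocates nothing.
type Token struct {
	Keyword ByteSliceReference
	Line    int
	Column  int
}

func main() {
	input := []byte("query HeroQuery { hero { name } }")
	// A lexer following this scheme would emit a reference like
	// this for the "query" keyword at the start of the input:
	tok := Token{Keyword: ByteSliceReference{Start: 0, End: 5}, Line: 1, Column: 1}
	fmt.Printf("%s\n", input[tok.Keyword.Start:tok.Keyword.End])
}
```

Because a reference is just two integers, tokens stay small and copyable, and the original input remains the single owner of the bytes.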
For usability reasons, a ByteSlice lookup method has been added to both the parser and the lexer, which lets the user easily transform a ByteSliceReference into a ByteSlice when the latter is needed.
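Such a lookup method might look like the following. The method name `ByteSlice` is taken from the description above, but the receiver type and exact signature here are assumptions for illustration.

```go
package main

import "fmt"

// ByteSlice is a named byte-slice type, as used in the description.
type ByteSlice []byte

// ByteSliceReference holds offsets into the lexer's input.
type ByteSliceReference struct {
	Start uint32
	End   uint32
}

// Lexer owns the input, so it is the natural place to resolve
// references back into bytes (hypothetical sketch).
type Lexer struct {
	input []byte
}

// ByteSlice transforms a ByteSliceReference into the ByteSlice it
// refers to by re-slicing the underlying input; no bytes are copied.
func (l *Lexer) ByteSlice(ref ByteSliceReference) ByteSlice {
	return ByteSlice(l.input[ref.Start:ref.End])
}

func main() {
	l := &Lexer{input: []byte("type Query { hero: Hero }")}
	ref := ByteSliceReference{Start: 5, End: 10} // refers to "Query"
	fmt.Println(string(l.ByteSlice(ref)))
}
```

Since the returned slice aliases the input buffer, callers only pay for a copy if they explicitly need one, which keeps the common lookup path allocation-free.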