Commit 9cce972

Merge pull request #745 from radofuchs/lcore_758_streaming_query_additional_e2e
LCORE-758: add additional e2e test for streaming query endpoint and enable color output
2 parents: ea85a90 + 09fbe04

4 files changed: +39 −9 lines

.github/workflows/e2e_tests.yaml (3 additions, 0 deletions)

@@ -193,6 +193,9 @@ jobs:
           }

       - name: Run e2e tests
+        env:
+          TERM: xterm-256color
+          FORCE_COLOR: 1
         run: |
           echo "Installing test dependencies..."
           pip install uv
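Setting `TERM` and `FORCE_COLOR` nudges tools that follow the informal `FORCE_COLOR`/`NO_COLOR` convention into emitting ANSI color even though CI captures output through a pipe. A minimal sketch of that convention (the `use_color` helper is ours, not part of this repo):

```python
import os
import sys

def use_color() -> bool:
    """Informal convention followed by many CLIs: NO_COLOR wins,
    then FORCE_COLOR, otherwise color only on a real TTY."""
    if os.environ.get("NO_COLOR"):
        return False
    if os.environ.get("FORCE_COLOR"):
        return True
    return sys.stdout.isatty()

os.environ["FORCE_COLOR"] = "1"  # as set in the workflow step above
print(use_color())  # True even when stdout is a pipe
```

Not every tool honors these variables, which is why the Makefile change below also wraps the test run in a pseudo-terminal.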

Makefile (1 addition, 1 deletion)

@@ -21,7 +21,7 @@ test-integration: ## Run integration tests tests
 	COVERAGE_FILE="${ARTIFACT_DIR}/.coverage.integration" uv run python -m pytest tests/integration --cov=src --cov-report term-missing --cov-report "json:${ARTIFACT_DIR}/coverage_integration.json" --junit-xml="${ARTIFACT_DIR}/junit_integration.xml" --cov-fail-under=10

 test-e2e: ## Run end to end tests for the service
-	PYTHONDONTWRITEBYTECODE=1 uv run behave --tags=-skip -D dump_errors=true @tests/e2e/test_list.txt \
+	script -q -e -c "uv run behave --color --format pretty --tags=-skip -D dump_errors=true @tests/e2e/test_list.txt"

 check-types: ## Checks type hints in sources
 	uv run mypy --explicit-package-bases --disallow-untyped-calls --disallow-untyped-defs --disallow-incomplete-defs --ignore-missing-imports --disable-error-code attr-defined src/
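The util-linux `script` utility runs the quoted command on a pseudo-terminal (`-q` suppresses script's own messages, `-e` propagates the child's exit status), so behave sees a TTY and its `--color` output survives CI log capture. A quick sketch of why the wrapper is needed: a child whose output is captured through a plain pipe sees no TTY at all.

```python
import subprocess
import sys

# When output is captured (a pipe), the child process sees no TTY,
# which is why behave run directly under CI log capture tends to drop
# its color codes; `script` gives the command a pseudo-terminal instead.
probe = "import sys; print(sys.stdout.isatty())"
piped = subprocess.run(
    [sys.executable, "-c", probe], capture_output=True, text=True
)
print(piped.stdout.strip())  # "False": stdout is a pipe, not a TTY
```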

tests/e2e/features/steps/llm_query_response.py (9 additions, 2 deletions)

@@ -155,8 +155,10 @@ def compare_streamed_responses(context: Context) -> None:
     response = context.response_data["response"]
     complete_response = context.response_data["response_complete"]

-    assert response != ""
-    assert response == complete_response
+    assert response != "", "response is empty"
+    assert (
+        response == complete_response
+    ), f"{response} and {complete_response} do not match"


 def _parse_streaming_response(response_text: str) -> dict:
@@ -166,6 +168,7 @@ def _parse_streaming_response(response_text: str) -> dict:
     full_response = ""
     full_response_split = []
     finished = False
+    first_token = True

     for line in lines:
         if line.startswith("data: "):
@@ -176,6 +179,10 @@ def _parse_streaming_response(response_text: str) -> dict:
             if event == "start":
                 conversation_id = data["data"]["conversation_id"]
             elif event == "token":
+                # Skip the first token (shield status message)
+                if first_token:
+                    first_token = False
+                    continue
                 full_response_split.append(data["data"]["token"])
             elif event == "turn_complete":
                 full_response = data["data"]["token"]
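The first-token skip can be seen end to end in this self-contained sketch of the parser. It is a simplification of `_parse_streaming_response` using only the event names and field paths visible in the diff; the direct concatenation of tokens and the sample payloads are our assumptions.

```python
import json

def parse_streaming_response(response_text: str) -> dict:
    """Collect "token" events, skipping the first one (the shield
    status message), and take the full text from "turn_complete"."""
    full_response = ""
    tokens = []
    first_token = True
    for line in response_text.splitlines():
        if not line.startswith("data: "):
            continue
        data = json.loads(line[len("data: "):])
        event = data["event"]
        if event == "token":
            if first_token:
                first_token = False  # shield status, not content
                continue
            tokens.append(data["data"]["token"])
        elif event == "turn_complete":
            full_response = data["data"]["token"]
    return {"response": "".join(tokens), "response_complete": full_response}

sample = "\n".join([
    'data: {"event": "start", "data": {"conversation_id": "abc"}}',
    'data: {"event": "token", "data": {"token": "shield-ok"}}',
    'data: {"event": "token", "data": {"token": "Hello"}}',
    'data: {"event": "token", "data": {"token": " world"}}',
    'data: {"event": "turn_complete", "data": {"token": "Hello world"}}',
])
result = parse_streaming_response(sample)
print(result["response"])  # Hello world
```

Without the skip, the shield message would be prepended to the reassembled token stream and the comparison against `response_complete` would fail.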

tests/e2e/features/streaming_query.feature (26 additions, 6 deletions)

@@ -1,3 +1,4 @@
+@Authorized
 Feature: streaming_query endpoint API tests

   Background:
@@ -7,7 +8,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if streaming_query response in tokens matches the full response
     Given The system is in default state
-    And I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    And I use "streaming_query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow."}
      """
@@ -17,7 +19,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM responds properly to restrictive system prompt to sent question with different system prompt
     Given The system is in default state
-    And I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    And I use "streaming_query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything but openshift questions"}
      """
@@ -29,7 +32,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM responds properly to non-restrictive system prompt to sent question with different system prompt
     Given The system is in default state
-    And I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    And I use "streaming_query" to ask question with authorization header
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "you are linguistic assistant"}
      """
@@ -41,6 +45,7 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM ignores new system prompt in same conversation
     Given The system is in default state
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
     And I use "streaming_query" to ask question
      """
      {"query": "Generate sample yaml file for simple GitHub Actions workflow.", "system_prompt": "refuse to answer anything"}
@@ -60,7 +65,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM responds for streaming_query request with error for missing query
     Given The system is in default state
-    When I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    When I use "streaming_query" to ask question with authorization header
      """
      {"provider": "{PROVIDER}"}
      """
@@ -72,7 +78,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM responds for streaming_query request with error for missing model
     Given The system is in default state
-    When I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    When I use "streaming_query" to ask question with authorization header
      """
      {"query": "Say hello", "provider": "{PROVIDER}"}
      """
@@ -81,7 +88,8 @@ Feature: streaming_query endpoint API tests

   Scenario: Check if LLM responds for streaming_query request with error for missing provider
     Given The system is in default state
-    When I use "streaming_query" to ask question
+    And I set the Authorization header to Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6Ikpva
+    When I use "streaming_query" to ask question with authorization header
      """
      {"query": "Say hello", "model": "{MODEL}"}
      """
@@ -113,3 +121,15 @@ Feature: streaming_query endpoint API tests
      }
      """
     Then The status code of the response is 200
+
+  Scenario: Check if LLM responds to sent question with error when not authenticated
+    Given The system is in default state
+    When I use "streaming_query" to ask question
+      """
+      {"query": "Say hello"}
+      """
+    Then The status code of the response is 400
+    And The body of the response is the following
+      """
+      {"detail": "No Authorization header found"}
+      """
