
Catch token count issue while streaming with customized models #6322

Triggered via: pull request, September 25, 2024 14:23
Status: Failure
Total duration: 30d 0h 0m 4s
Artifacts: none listed

Workflow: openai.yml
Trigger: pull_request
Matrix: test
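
The matrix entries can be inferred from the error annotations below: the test job runs on ubuntu-latest across Python 3.9 through 3.12. A minimal sketch of what the test matrix in openai.yml might look like follows; the workflow name and the job steps are placeholders, not the actual workflow contents.

```yaml
# Hypothetical sketch of the `test` matrix in openai.yml, inferred from the
# error annotations below; the real workflow's steps differ.
name: OpenAI

on: pull_request

jobs:
  test:
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu-latest]
        python-version: ["3.9", "3.10", "3.11", "3.12"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: python -m pip install -e .   # placeholder install step
      - run: python -m pytest             # placeholder test step
```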

Annotations: 4 errors
test (ubuntu-latest, 3.9): The deployment was rejected or didn't satisfy other protection rules.
test (ubuntu-latest, 3.12): The deployment was rejected or didn't satisfy other protection rules.
test (ubuntu-latest, 3.10): The deployment was rejected or didn't satisfy other protection rules.
test (ubuntu-latest, 3.11): The deployment was rejected or didn't satisfy other protection rules.
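
All four failures report the same GitHub Actions deployment protection error rather than a test failure. This error is raised when a job targets a protected environment (commonly used to gate API secrets for pull requests from forks) and the run does not satisfy the environment's protection rules, such as required reviewer approval. A minimal sketch of a job shape that produces this message is shown below; the environment name is an assumption, not taken from openai.yml.

```yaml
# Minimal sketch (assumption): a job bound to a protected environment.
# If the environment's protection rules (e.g. required reviewers) reject the
# deployment for this pull_request run, the job fails with
# "The deployment was rejected or didn't satisfy other protection rules."
jobs:
  test:
    runs-on: ubuntu-latest
    environment: openai   # hypothetical protected environment name
    steps:
      - run: echo "Runs only after the environment deployment is approved."
```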