The example provided by the official documentation crashes #1075
When I try llama3-8B-instruct, I get:

[rank0]: Traceback (most recent call last):
[rank0]: File "/data/ruanjh/best_training_method/11.py", line 31, in <module>
[rank0]: character = generator(
[rank0]: ^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 511, in __call__
[rank0]: return format(completions)
[rank0]: ^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/api.py", line 497, in format
[rank0]: return self.format_sequence(sequences)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/site-packages/outlines/generate/json.py", line 60, in <lambda>
[rank0]: generator.format_sequence = lambda x: pyjson.loads(x)
[rank0]: ^^^^^^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/__init__.py", line 346, in loads
[rank0]: return _default_decoder.decode(s)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 337, in decode
[rank0]: obj, end = self.raw_decode(s, idx=_w(s, 0).end())
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]: File "/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/json/decoder.py", line 353, in raw_decode
[rank0]: obj, end = self.scan_once(s, idx)
[rank0]: ^^^^^^^^^^^^^^^^^^^^^^
[rank0]: json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 32 (char 31)
ERROR 07-30 19:03:51 multiproc_worker_utils.py:120] Worker VllmWorkerProcess pid 122794 died, exit code: -15
INFO 07-30 19:03:51 multiproc_worker_utils.py:123] Killing local vLLM worker processes
/data/ruanjh/miniconda3/envs/mamba/lib/python3.11/multiprocessing/resource_tracker.py:254: UserWarning: resource_tracker: There appear to be 2 leaked shared_memory objects to clean up at shutdown
  warnings.warn('resource_tracker: There appear to be %d '
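The `JSONDecodeError: Unterminated string` in the traceback is what the standard-library parser raises when the model's output is cut off before the closing quote or brace (for example, when the token limit is reached mid-string). A minimal sketch of that failure mode, using a hypothetical truncated output string:

```python
import json

# Hypothetical example of model output truncated mid-string,
# as happens when generation stops before the JSON is complete.
truncated = '{"name": "unterminated strin'

try:
    json.loads(truncated)
except json.JSONDecodeError as exc:
    # Same error class as in the traceback above.
    print(f"JSONDecodeError: {exc.msg}")
```

Catching the error this way at least surfaces the raw text for inspection instead of crashing the worker.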
Likely related to #985. I'm working on a fix for a few JSON schema issues that have appeared. Thank you for your patience.
I attempted to run a simple test of outlines with gemma2b; unfortunately, I encountered the following error.