Implement `use` argument for LLM for decorators #1234
Conversation
A few small nits, but otherwise looks great!
sdk/aqueduct/__init__.py (outdated)
```python
@@ -7,6 +7,8 @@
from aqueduct.flow import Flow
from aqueduct.schedule import DayOfMonth, DayOfWeek, Hour, Minute, daily, hourly, monthly, weekly

llm = "llm"
```
Can we declare this as a constant?
Removed this, as it's not needed anymore.
```go
@@ -47,6 +47,7 @@ type ResourceConfig struct {
	MemoryMB        *int               `json:"memory_mb,omitempty"`
	GPUResourceName *string            `json:"gpu_resource_name,omitempty"`
	CudaVersion     *CudaVersionNumber `json:"cuda_version,omitempty"`
	UseLlm          *bool              `json:"use_llm,omitempty"`
```
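For context, a rough sketch of how a populated resource config would serialize under these JSON tags (all field values below are invented for illustration):

```python
import json

# Illustrative only: a resource config payload with the new flag set.
# Field names come from the struct's JSON tags; values are made up.
resource_config = {
    "memory_mb": 4096,
    "gpu_resource_name": "nvidia.com/gpu",
    "cuda_version": "11.8",
    "use_llm": True,
}
print(json.dumps(resource_config, indent=2))
```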
Change to `UseLLM`, just to be consistent with our naming scheme.
@hsubbaraj-spiral updated the implementation to add the …
Yep, the new API makes sense to me, just a quick question about the prompt.
```python
if not isinstance(prompt, str):
    raise ValueError("The 'prompt' parameter must be a string.")

if "{text}" not in prompt:
```
Can you explain the significance of "{text}"? Is this a keyword in LLM prompts?
"If the prompt contains {text}, we will replace {text} with the input string(s) before sending to the LLM. If the prompt does not contain {text}, we will prepend the prompt to the input string(s) before sending to the LLM."
Describe your changes and why you are making these changes
This PR implements the `use` argument, which allows the user to specify the intent to use our LLM library. Under the hood, we include the `use_llm` flag in the resource struct and use it to select Docker images that have the LLM Python library dependencies and CUDA drivers pre-installed.
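To illustrate, a hypothetical sketch of what this could look like from the SDK side (the `op` decorator and the shape of the `resources` parameter are assumptions for illustration, not confirmed by this diff):

```python
from aqueduct import op  # assumed import; matches sdk/aqueduct above

# Hypothetical usage: declare the intent to use the LLM library for this
# operator. Under the hood this would set `use_llm` in the resource
# config, selecting a Docker image with LLM dependencies and CUDA
# drivers pre-installed.
@op(resources={"use": "llm"})
def summarize_reviews(reviews):
    ...
```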
Related issue number (if any)
Loom demo (if any)
Checklist before requesting a review
- Lint (see `python3 scripts/run_linters.py -h` for usage).
- `run_integration_test`: Runs integration tests.
- `skip_integration_test`: Skips integration tests (should be used when changes are ONLY documentation/UI).