[feature request] set default cache policy #1292

Open
kaifronsdal opened this issue Feb 11, 2025 · 1 comment
@kaifronsdal (Contributor)

I love being able to cache model responses in Inspect and the fine-grained control I have over the cache policy. However, I often find myself writing a new solver and forgetting to add a cache policy when I call generate, or wanting to change the cache policy in a bunch of places all at once. Is there some way to set a default cache policy for an eval?
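
For context, what I'm doing today looks roughly like this (a minimal sketch of a custom solver; the solver name is illustrative):

```python
from inspect_ai.model import CachePolicy
from inspect_ai.solver import Generate, TaskState, solver

@solver
def my_solver():
    async def solve(state: TaskState, generate: Generate) -> TaskState:
        # the cache argument has to be remembered on every generate call
        return await generate(state, cache=CachePolicy(expiry="1M"))

    return solve
```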

If not, here are some potential options for an interface:

1. In the Task:

       Task(
           dataset=dataset,
           solver=solver,
           scorer=[...],
           config=GenerateConfig(cache=CachePolicy("1M")), # in the generate config
           cache=CachePolicy("1M") # or a separate parameter
       )

2. As an environment variable:

       export INSPECT_DEFAULT_CACHE_POLICY="expiry=1M, per_epoch=True"

3. As an argument to inspect eval:

       inspect eval task.py --model openai/gpt-4o --cache "expiry=1M, per_epoch=True"
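
To make option 1 concrete, a complete task might look like this (sketch only: cache is not an existing GenerateConfig field but the proposed addition, and the dataset/scorer here are just placeholders):

```python
from inspect_ai import Task, task
from inspect_ai.dataset import example_dataset
from inspect_ai.model import CachePolicy, GenerateConfig
from inspect_ai.scorer import model_graded_fact
from inspect_ai.solver import generate

@task
def my_task():
    return Task(
        dataset=example_dataset("theory_of_mind"),
        solver=generate(),
        scorer=model_graded_fact(),
        # proposed: a default cache policy applied to every generate in the eval
        config=GenerateConfig(cache=CachePolicy(expiry="1M")),
    )
```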
@jjallaire (Collaborator)

If we added it to GenerateConfig, then all of the above would actually work! (Syntax for the CLI and environment variables would be JSON-encoded rather than what you show here.) We'll plan on doing that.
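
For illustration, the JSON-encoded forms might look something like this (exact field names and encoding are not confirmed; the variable and flag names are taken from the proposals above):

```bash
# illustrative only; exact shape TBD
export INSPECT_DEFAULT_CACHE_POLICY='{"expiry": "1M", "per_epoch": true}'
inspect eval task.py --model openai/gpt-4o --cache '{"expiry": "1M", "per_epoch": true}'
```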
