
Is it desirable to have LLM interactions be deterministic? #127

Open
cproctor opened this issue Jun 23, 2024 · 0 comments

If so, we may want to consider providing a SEED directive. Like INCLUDE, SEED X would only be allowed at the top of a story.
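
For concreteness, here is a rough sketch of how a top-level `SEED 42` might be forwarded to the LLM request. This is not a proposal for the actual implementation; it assumes an OpenAI-style chat completions backend, whose `seed` parameter is only best-effort deterministic, so full determinism would still lean on caching.

```python
# Hypothetical sketch: forwarding a story's top-level `SEED` value to the LLM call.
# Assumes an OpenAI-style chat completions API; model name is a placeholder.
from openai import OpenAI

client = OpenAI()

def llm_request(prompt: str, story_seed: int | None) -> str:
    kwargs = {}
    if story_seed is not None:
        kwargs["seed"] = story_seed   # value from the story's SEED directive
        kwargs["temperature"] = 0     # remove sampling randomness as far as possible
    response = client.chat.completions.create(
        model="gpt-4o-mini",          # placeholder model name
        messages=[{"role": "user", "content": prompt}],
        **kwargs,
    )
    return response.choices[0].message.content
```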

Reasons to consider deterministic LLM interactions:

  • It would make it much easier to debug, test, and resolve abusive use of LLMs.
  • We could cache requests to the LLM, drastically reducing the backend cost of supporting LLMs (see the sketch after this list).
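
As a rough illustration of the caching point (the names and in-memory store are placeholders, not part of any existing backend), requests could be memoized on a hash of the seed together with the model and the full prompt:

```python
# Hypothetical sketch of the caching idea: identical (seed, model, prompt)
# triples return the cached completion instead of hitting the LLM again.
import hashlib
import json

_cache: dict[str, str] = {}  # in practice this would be persistent storage

def cached_llm_request(prompt: str, story_seed: int, model: str = "gpt-4o-mini") -> str:
    key = hashlib.sha256(
        json.dumps({"seed": story_seed, "model": model, "prompt": prompt}).encode()
    ).hexdigest()
    if key not in _cache:
        _cache[key] = llm_request(prompt, story_seed)  # llm_request as sketched above
    return _cache[key]
```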

Notes/Issues

  • How do we handle INCLUDE statements which import stories with different seeds? The simplest approach would be to say that only the top-level story's SEED is active.
  • SEED will not affect the story's own random functions (e.g. random, random_integer, random_gaussian). Perhaps we should consider a different name?
  • A malicious user could evade the cache by interpolating random numbers into the prompt. This would be pretty easy to spot.
  • Is there a use case for exposing the seed to users? Alternatively, we could always make requests deterministic and cached, and hide this.
cproctor self-assigned this Jun 23, 2024