Improve JSON prompt response. #52

Merged: raulraja merged 1 commit into main from improve-json-prompt on May 11, 2023

Conversation

raulraja
Contributor

This PR includes the following changes:

  • Updated the chatCompletionResponse function to use an LLMModel instead of a string for the model parameter.
  • Updated the response instructions in the Prompt to provide detailed guidelines for formatting and returning the response (see the sketch after this list).
  • Removed the response example from the code comments.
  • Updated the chatCompletionResponse function call with the new LLMModel parameter.
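
The intent of the first two changes can be illustrated with a minimal Kotlin sketch. Everything below (the `LLMModel` variants, the `callChatApi` helper, and the exact instruction wording) is an illustrative assumption, not the project's actual code:

```kotlin
// Hypothetical sketch only; names and signatures are illustrative, not the real API.

// Before: the model was passed as a raw string, so typos or unsupported model
// names could only fail at runtime.
// suspend fun chatCompletionResponse(prompt: String, model: String): String

// After: a typed LLMModel value restricts callers to known models.
enum class LLMModel(val modelName: String) {
    GPT_3_5_TURBO("gpt-3.5-turbo"),
    GPT_4("gpt-4")
}

suspend fun chatCompletionResponse(prompt: String, model: LLMModel): String {
    // The prompt now carries explicit response instructions so the model
    // returns machine-readable JSON instead of free-form text.
    val jsonInstructions = """
        |Return the response as a single valid JSON object.
        |Do not include any explanation, markdown fences, or trailing text.
        |Use double quotes for all keys and string values.
    """.trimMargin()
    val fullPrompt = "$prompt\n\n$jsonInstructions"
    return callChatApi(model.modelName, fullPrompt)
}

// Stand-in for the underlying chat completions client call.
private suspend fun callChatApi(modelName: String, prompt: String): String =
    TODO("delegate to the chat completions client")
```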

raulraja requested a review from franciscodr on May 11, 2023 at 10:55
raulraja merged commit 403a8c4 into main on May 11, 2023
raulraja deleted the improve-json-prompt branch on May 11, 2023 at 20:24