This repository has been archived by the owner on Jun 8, 2024. It is now read-only.

feat: make model customizable #29

Closed

Conversation

jonasiwnl

just a note -- this change requires that model is not a static property of the EmbedChain class, but rather an instance field. As a result, getOpenAiAnswer and getAnswerFromLLM are no longer static functions, but instance methods.

constructor(appConfig: AppConfig | null = null) {
  // If a model is defined in the config, set it as the model
  if (appConfig?.model) {
    this.model = appConfig.model;
Contributor
With this, there's no default model, right? We do need a default model so this change isn't breaking.

Author

@jonasiwnl Jul 24, 2023

The default is 'gpt-3.5-turbo', set on line 40; this is the model that was used before this PR. If a model is defined in the config, the model changes to that; otherwise it stays at the default value.
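
For reference, a minimal sketch of how the default and the config override interact after this change, assuming AppConfig is a plain type with an optional model field (names here are illustrative, not the exact source):

// Assumed shape of AppConfig for this sketch; the real type may have more fields
type AppConfig = { model?: string };

class EmbedChain {
  // Default model, the same one used before this PR
  model = 'gpt-3.5-turbo';

  constructor(appConfig: AppConfig | null = null) {
    // If a model is defined in the config, override the default
    if (appConfig?.model) {
      this.model = appConfig.model;
    }
  }
}

// No config keeps the default; a config with a model overrides it
const defaultApp = new EmbedChain();                   // model is 'gpt-3.5-turbo'
const customApp = new EmbedChain({ model: 'gpt-4' });  // model is 'gpt-4'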

@cachho
Contributor

cachho commented Jul 23, 2023

And AppConfig is just a type, not a class. We have to see how we copy that from Python; there it's a class, but TS gives us different options. But that doesn't have to be done in this PR.

@jonasiwnl
Author

AppConfig doesn't require state, so personally I don't think a class would be useful.
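
As a quick illustration of the two options being weighed here (field names are assumptions for the sketch, not the actual code):

// A plain structural type: just a shape, no state or behavior of its own
type AppConfig = { model?: string };

// The Python-style alternative: a class, which only pays off if the config
// needs defaults, validation, or methods attached to it
class AppConfigClass {
  constructor(public model: string = 'gpt-3.5-turbo') {}
}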

@taranjeet
Member

Closing this pull request. We will address this issue in a different way.

taranjeet closed this Sep 5, 2023
3 participants