
fix(ollama): resolve model list loading issue and add Pytest for component testing #3575

Merged: 8 commits merged into main from bug/ollama-component on Aug 27, 2024

Conversation

@edwinjosechittilappilly (Collaborator) commented on Aug 27, 2024:

Fixes #2885

@jordanrfrazier, requesting a review.

The issue was that the URL of the models endpoint, api/tags, was not parsed correctly: it ended up containing a double slash (//), so urlencode was used to build it properly.

The correct approach works only if the base_url is valid, i.e., a real Ollama URL; for DS LF this must be a public Ollama server URL.

Changed get_model to take the base URL, so the function itself builds the expected URL for the model names. This makes the function clearer than passing the model URL in as a parameter.
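As a rough illustration of that change (a minimal sketch only, not the PR's exact code: the function here is standalone rather than a component method, and the error handling and timeout are assumptions):

```python
# Sketch: get_model builds the /api/tags endpoint from the base URL itself,
# so callers can no longer produce a malformed URL with a double slash.
from urllib.parse import urljoin

import requests


def get_model(base_url_value: str) -> list[str]:
    # urljoin normalizes the slashes: "http://localhost:11434" and
    # "http://localhost:11434/" both yield "http://localhost:11434/api/tags".
    url = urljoin(base_url_value, "/api/tags")
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    # The Ollama /api/tags response lists installed models under "models".
    return [model["name"] for model in response.json().get("models", [])]
```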

Added Pytest: 7 tests, with 1 test excluded for future implementation (test_build_model_failure).
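For a sense of what one of those tests could look like, here is a hypothetical sketch against the standalone get_model shown above (the real tests exercise the Langflow component itself; the mocked response shape follows Ollama's documented /api/tags format):

```python
# Hypothetical test sketch: requests.get is patched so no Ollama server
# is needed; get_model is the standalone sketch shown earlier.
from unittest.mock import MagicMock, patch


@patch("requests.get")
def test_get_model_success(mock_get):
    mock_response = MagicMock()
    mock_response.json.return_value = {
        "models": [{"name": "llama3.1"}, {"name": "mistral"}]
    }
    mock_get.return_value = mock_response

    assert get_model("http://localhost:11434") == ["llama3.1", "mistral"]
    # The endpoint must have been joined without a double slash.
    mock_get.assert_called_once_with("http://localhost:11434/api/tags", timeout=10)
```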

make lint and make format touched multiple files.
@dosubot added the size:L (This PR changes 100-499 lines, ignoring generated files) and bug (Something isn't working) labels on Aug 27, 2024
Pull Request Validation Report

This comment is automatically generated by Conventional PR

Whitelist Report

Whitelist criteria checked:

  • Pull request is a draft and should be ignored
  • Pull request is made by a whitelisted user and should be ignored
  • Pull request is submitted by a bot and should be ignored
  • Pull request is submitted by administrators and should be ignored

Result

Pull request does not satisfy any enabled whitelist criteria. Pull request will be validated.

Validation Report

Validation criteria checked:

  • All commits in this pull request have valid messages
  • Pull request does not introduce too many changes
  • Pull request has mentioned issues
  • Pull request has a valid branch name
  • Pull request has a non-empty body
  • Pull request has a valid title

Result

Pull request is invalid.

Reason

  • Pull request title does not follow the desired pattern

Last Modified at 27 Aug 24 15:11 UTC

This pull request is automatically being deployed by Amplify Hosting.

Access this pull request here: https://pr-3575.dmtpw4p5recq1.amplifyapp.com

@ogabrielluiz (Contributor) commented:

Hey, @edwinjosechittilappilly

Thanks for this fix!
Looking good.

Could you provide a more descriptive PR title that follows conventional-commits?

@edwinjosechittilappilly changed the title from "BugFix/ollama component with Pytest added for the component" to "fix(ollama): resolve model list loading issue and add Pytest for component testing" on Aug 27, 2024
@edwinjosechittilappilly (Collaborator, Author) commented:

Title changed to: fix(ollama): resolve model list loading issue and add Pytest for component testing
@ogabrielluiz

@github-actions added and then removed the bug (Something isn't working) label on Aug 27, 2024
Review thread on src/backend/base/langflow/components/models/OllamaModel.py (outdated, resolved):
@@ -55,8 +55,9 @@ def update_build_config(self, build_config: dict, field_value: Any, field_name:

         return build_config

-    def get_model(self, url: str) -> list[str]:
+    def get_model(self, base_url_value: str) -> list[str]:
Collaborator:

nit: could just call this base_url

Collaborator (Author):

I changed it so that the endpoint URL for the models is formed inside the function.

Earlier, the input to the function was the URL to the tags/models endpoint, which is OK, but this way it is clearer that the models are loaded from the base URL and that the endpoint URL is built without error inside get_model. Should I revert to the previous method and build the model URL endpoint before calling the function? (See the calling-pattern sketch below.)
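Roughly, the difference in calling pattern (a hypothetical sketch; self.base_url is an assumed field name, not necessarily the component's real one):

```python
# Before (hypothetical): the caller assembled the endpoint, which is where
# the stray "//" could creep in.
models = self.get_model(self.base_url + "/api/tags")

# After: the caller passes only the base URL; get_model builds the endpoint.
models = self.get_model(self.base_url)
```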

Collaborator:

Nah, that's fine

Removed unwanted print statements.

make format also reformatted a lot of .tsx files
@dosubot added the lgtm (This PR has been approved by a maintainer) label on Aug 27, 2024
@edwinjosechittilappilly merged commit 46a9789 into main on Aug 27, 2024
28 checks passed
@edwinjosechittilappilly deleted the bug/ollama-component branch on August 27, 2024 at 23:21
Labels

bug (Something isn't working), lgtm (This PR has been approved by a maintainer), size:L (This PR changes 100-499 lines, ignoring generated files)
Projects: None yet
Development

Successfully merging this pull request may close these issues.

Ollama component cannot get models from Ollama server
3 participants