
fix self deployed llm lost #2510

Merged · 1 commit · Sep 20, 2024

Conversation

KevinHuSh (Collaborator) commented on Sep 20, 2024

### What problem does this PR solve?

#2506

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)

@KevinHuSh KevinHuSh merged commit a44f1f7 into infiniflow:main Sep 20, 2024
0000sir (Contributor) commented on Sep 20, 2024

mysql> select * from llm where fid='Xinference';
Empty set (0.00 sec)

There's no fid 'Xinference', so this won't work.

0000sir (Contributor) commented on Sep 20, 2024

The locally defined LLMs are located in tenant_llm, so I changed line 317 to:

if o.llm_name in llm_set and o.llm_factory not in self_deploied: continue

Then the chat model shows up.
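
For context, here is a minimal sketch of how that condition might sit in the model-listing loop. Only the quoted `if ... continue` line comes from the comment above; the function name, the row shape, and the contents of `self_deploied` are illustrative assumptions, not ragflow's actual code.

```python
from dataclasses import dataclass

# Assumed list of self-hosted factories; the real list lives in ragflow's code.
self_deploied = ["Ollama", "Xinference", "LocalAI", "LMStudio"]


@dataclass
class TenantLLMRow:
    """Stand-in for a row of the tenant_llm table."""
    llm_name: str
    llm_factory: str
    model_type: str


def merge_tenant_llms(llms, tenant_rows):
    """Append tenant-defined models to the list built from the llm table."""
    llm_set = {m["llm_name"] for m in llms}
    for o in tenant_rows:
        # Skip names already known from the llm table unless the factory is
        # self-deployed; the extra clause is the change quoted above.
        if o.llm_name in llm_set and o.llm_factory not in self_deploied:
            continue
        llms.append({"llm_name": o.llm_name, "fid": o.llm_factory,
                     "model_type": o.model_type, "available": True})
    return llms


if __name__ == "__main__":
    base = [{"llm_name": "gpt-4o", "fid": "OpenAI",
             "model_type": "chat", "available": True}]
    tenant = [TenantLLMRow("qwen2:7b", "Xinference", "chat")]
    print(merge_tenant_llms(base, tenant))
```

With the extra clause, a tenant_llm entry from a self-deployed factory is appended even when a model of the same name already exists in the llm table, which is why the chat model reappears.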

Halfknow pushed a commit to Halfknow/ragflow that referenced this pull request on Nov 11, 2024
### What problem does this PR solve?

infiniflow#2509 

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)