
[Bug] Using gemini-1.5-flash-lastest as Chat model throws an error during knowledge graph creation #2720

Open · 1 task done
marcfon opened this issue Oct 5, 2024 · 2 comments
Labels: bug (Something isn't working)

Comments


marcfon commented Oct 5, 2024

Is there an existing issue for the same bug?

- [x] I have checked the existing issues.

Branch name

main

Commit ID

--

Other environment information

No response

Actual behavior

```
Traceback (most recent call last):
  File "/ragflow/graphrag/graph_extractor.py", line 128, in __call__
    result, token_count = self._process_document(text, prompt_variables)
  File "/ragflow/graphrag/graph_extractor.py", line 177, in _process_document
    if response.find("**ERROR**") >=0: raise Exception(response)
Exception: **ERROR**: 400 Please use a valid role: user, model.
**ERROR**: contents must not be empty
```

Expected behavior

No response

Steps to reproduce

1. Set `Model Providers > System Model Settings > Chat model` to `gemini-1.5-flash-lastest`
2. Create a knowledge graph 
3. Process a file

Additional information

Processing the file works fine when the Chat model is set to gpt-4o-mini.
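
My guess (not verified against RAGFlow's internals) is that the Gemini API only accepts the roles `user` and `model` in its `contents`, so OpenAI-style `system`/`assistant` messages that gpt-4o-mini accepts are rejected with exactly this 400, and an empty history triggers the second message. A minimal standalone sketch with the `google-generativeai` SDK that shows the constraint (the API key and prompt text are placeholders of mine):

```python
# Standalone sketch (my assumption, not RAGFlow code): Gemini only accepts
# the roles "user" and "model", and contents must not be empty.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder
model = genai.GenerativeModel("gemini-1.5-flash-latest")

# Gemini-style contents: accepted.
ok = model.generate_content(
    [{"role": "user", "parts": ["Extract the entities from: RAGFlow is built by InfiniFlow."]}]
)
print(ok.text)

# OpenAI-style "system" role: rejected by the API with
# "400 Please use a valid role: user, model." -- the first error in the traceback.
try:
    model.generate_content(
        [
            {"role": "system", "parts": ["You are an entity extractor."]},
            {"role": "user", "parts": ["Extract the entities from: RAGFlow is built by InfiniFlow."]},
        ]
    )
except Exception as e:
    print(e)

# An empty history triggers the second error: "contents must not be empty".
try:
    model.generate_content([])
except Exception as e:
    print(e)
```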

KevinHuSh (Collaborator) commented:

Which LLM factory/supplier did you select? Gemini?

marcfon (Author) commented Oct 6, 2024

> Which LLM factory/supplier did you select? Gemini?

Yes, Gemini.

KevinHuSh pushed a commit that referenced this issue Oct 8, 2024
### What problem does this PR solve?
#2720

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
Halfknow pushed a commit to Halfknow/ragflow that referenced this issue Nov 11, 2024
### What problem does this PR solve?
infiniflow#2720

### Type of change

- [x] Bug Fix (non-breaking change which fixes an issue)
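
Both commits above reference this issue, so the fix presumably changes how the chat history is mapped before it reaches Gemini. For anyone stuck on an older build, here is a hypothetical sketch of that kind of mapping (the helper name `to_gemini_contents` is mine, not RAGFlow's):

```python
# Hypothetical helper (not the actual RAGFlow patch): map OpenAI-style chat
# messages onto the two roles Gemini accepts and drop empty turns, so the
# request never carries an invalid role or an empty contents list.
def to_gemini_contents(messages):
    """Convert [{"role": ..., "content": ...}] into Gemini-style contents."""
    system_parts = [m["content"] for m in messages
                    if m["role"] == "system" and m.get("content")]
    contents = []
    for m in messages:
        if m["role"] == "system" or not m.get("content"):
            continue  # system prompts are merged below; empty turns are dropped
        role = "model" if m["role"] == "assistant" else "user"
        contents.append({"role": role, "parts": [m["content"]]})
    # Gemini has no "system" role, so prepend the system prompt to the first user turn.
    if system_parts and contents and contents[0]["role"] == "user":
        contents[0]["parts"].insert(0, "\n".join(system_parts))
    return contents
```

With a mapping like this, `model.generate_content(to_gemini_contents(messages))` avoids both errors, as long as `messages` contains at least one non-empty user turn.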