[Feature Request]: update new llm from Groq #2076
Open

lionkingc opened this issue Aug 23, 2024 · 0 comments

@lionkingc
Is there an existing issue for the same feature request?

  • I have checked the existing issues.

Is your feature request related to a problem?

No response

Describe the feature you'd like

Hello there,
Groq has released new LLMs: llama-3.1-70b-versatile and llama-3.1-8b-instant. After entering the API key, RAGFlow still only shows the old Llama 3.0 models.

I tried updating conf/llm_factories.json and rebuilding with Docker Compose, but it didn't work.

Please add the new LLM models, thank you.
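For reference, a minimal sketch of the kind of Groq entry that the report above is describing in conf/llm_factories.json. The field names (llm_name, tags, max_tokens, model_type) are assumptions modeled on the existing entries in that file, and the context sizes are illustrative, so please verify them against the current schema:

```json
{
  "name": "Groq",
  "tags": "LLM",
  "status": "1",
  "llm": [
    {
      "llm_name": "llama-3.1-70b-versatile",
      "tags": "LLM,CHAT,128k",
      "max_tokens": 131072,
      "model_type": "chat"
    },
    {
      "llm_name": "llama-3.1-8b-instant",
      "tags": "LLM,CHAT,128k",
      "max_tokens": 131072,
      "model_type": "chat"
    }
  ]
}
```

Note that if the config file is baked into the Docker image, editing it on the host and restarting may not be enough; the image would likely need to be rebuilt (for example with `docker compose up -d --build`) for the change to take effect.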

Describe implementation you've considered

No response

Documentation, adoption, use case

No response

Additional information

No response

hangters mentioned this issue Aug 27, 2024
KevinHuSh pushed a commit that referenced this issue Aug 27, 2024
### What problem does this PR solve?

#2076   update groq llm.

### Type of change

- [x] New Feature (non-breaking change which adds functionality)

Co-authored-by: Zhedong Cen <[email protected]>
Halfknow pushed a commit to Halfknow/ragflow that referenced this issue Nov 11, 2024