From 4c2906d6fde9591399462bfb78f97bf818306e98 Mon Sep 17 00:00:00 2001
From: writinwaters <93570324+writinwaters@users.noreply.github.com>
Date: Tue, 6 Aug 2024 19:06:36 +0800
Subject: [PATCH] Fixed a broken link (#1831)

### What problem does this PR solve?

Fixed a broken link (`.md` -> `.mdx`) in `docs/guides/llm_api_key_setup.md` that caused a display issue.

### Type of change

- [x] Documentation Update
---
 docs/guides/llm_api_key_setup.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/guides/llm_api_key_setup.md b/docs/guides/llm_api_key_setup.md
index 46dfcc8689d..32908571cc1 100644
--- a/docs/guides/llm_api_key_setup.md
+++ b/docs/guides/llm_api_key_setup.md
@@ -29,7 +29,7 @@ For now, RAGFlow supports the following online LLMs. Click the corresponding lin
 - [StepFun](https://platform.stepfun.com/)
 
 :::note
-If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinferenc, or LocalAI](./deploy_local_llm.md).
+If you find your online LLM is not on the list, don't feel disheartened. The list is expanding, and you can [file a feature request](https://github.com/infiniflow/ragflow/issues/new?assignees=&labels=feature+request&projects=&template=feature_request.yml&title=%5BFeature+Request%5D%3A+) with us! Alternatively, if you have customized or locally-deployed models, you can [bind them to RAGFlow using Ollama, Xinference, or LocalAI](./deploy_local_llm.mdx).
 :::
 
 ## Configure your API key