Standardize LLM Docs #24803
Comments
- **Description:** Standardize OpenAI docs. **Issue:** #24803. Co-authored-by: Chester Curme <[email protected]>
- **Description:** Standardize the Tongyi LLM, including: docs (issue langchain-ai#24803) and model init arg names (issue langchain-ai#20085)
- **Description:** Standardize the QianfanLLMEndpoint LLM, including: docs (issue langchain-ai#24803) and model init arg names (issue langchain-ai#20085)
- **Description:** Standardize OpenAI docs. **Issue:** the issue langchain-ai#24803. Co-authored-by: Chester Curme <[email protected]>
- **Description:** Standardize SparkLLM, including: docs (issue langchain-ai#24803), streaming support, an updated API URL, and model init arg names (issue langchain-ai#20085)
Hi, @efriis. I'm helping the LangChain team manage their backlog and am marking this issue as stale. You've raised an important point about the need to standardize and enhance documentation for LLM integrations in LangChain. Your proposal includes updating the LLM class docstrings and integration documentation, along with specific guidelines and templates to improve the overall structure and examples. However, there haven't been any additional comments or developments since your initial post. Could you please let us know if this issue is still relevant to the latest version of the LangChain repository? If it is, feel free to comment here to keep it open. Otherwise, you can close the issue yourself, or it will be automatically closed in 7 days. Thank you!
Privileged issue
Issue Content
Issue
To make our LLM integrations as easy to use as possible, we need to make sure the docs for them are thorough and standardized. There are two parts to this: updating the LLM docstrings and updating the actual integration docs.
This needs to be done for each LLM integration, ideally with one PR per LLM.
Related to broader issues #21983 and #22005.
Docstrings
Each LLM class docstring should have the sections shown in the Appendix below. The sections should have input and output code blocks when relevant.
To build a preview of the API docs for the package you're working on, run (from the root of the repo):

```bash
make api_docs_clean; make api_docs_quick_preview API_PKG=openai
```
where `API_PKG=` should be the parent directory that houses the edited package (e.g. community, openai, anthropic, huggingface, together, mistralai, groq, fireworks, etc.). This should be quite fast for all the partner packages.

Doc pages
Each LLM docs page should follow this template.
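The linked template is authoritative; purely as a rough sketch of the kind of content such a page walks through (setup, instantiation, invocation, streaming), here is a minimal example using `langchain-openai` as a stand-in. Treat the ordering and details as assumptions rather than the template itself.

```python
# Minimal sketch of the content an LLM integration doc page typically covers,
# using langchain-openai as a stand-in example (pip install -U langchain-openai).
import os

from langchain_openai import OpenAI

# Setup: credentials come from the environment.
os.environ.setdefault("OPENAI_API_KEY", "...")

# Instantiation
llm = OpenAI(model="gpt-3.5-turbo-instruct", temperature=0, max_tokens=256)

# Invocation: string-completion LLMs return a plain string.
print(llm.invoke("Say hello in one short sentence."))

# Streaming
for chunk in llm.stream("Count from 1 to 5."):
    print(chunk, end="", flush=True)
```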
You can use the `langchain-cli` to quickly get started with a new LLM integration docs page (run from root of repo):

```bash
poetry run pip install -e libs/cli
poetry run langchain-cli integration create-doc --name "foo-bar" --name-class FooBar --component-type LLM --destination-dir ./docs/docs/integrations/llms/
```
where `--name` is the integration package name without the "langchain-" prefix and `--name-class` is the class name without the "LLM" prefix. This will create a template doc with some autopopulated fields at docs/docs/integrations/llms/foo_bar.ipynb.

To build a preview of the docs you can run (from root):
```bash
make docs_clean
make docs_build
cd docs/build/output-new
yarn
yarn start
```
Appendix
Expected sections for the LLM class docstring.
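As an illustrative sketch only (the section names below are assumptions modeled on other LangChain integration docstrings, not the authoritative list for this issue), a standardized docstring for a hypothetical `FooBarLLM` might look like:

```python
from langchain_core.language_models import LLM


class FooBarLLM(LLM):  # hypothetical integration class, for illustration only
    """FooBar completion model integration.

    Setup:
        .. code-block:: bash

            pip install -U langchain-foo-bar
            export FOO_BAR_API_KEY="your-api-key"

    Key init args - completion params:
        model: str
            Name of the FooBar model to use.
        temperature: float
            Sampling temperature.
        max_tokens: Optional[int]
            Max number of tokens to generate.

    Key init args - client params:
        api_key: Optional[str]
            FooBar API key. If not passed in, read from env var FOO_BAR_API_KEY.
        timeout: Optional[float]
            Request timeout in seconds.

    Instantiate:
        .. code-block:: python

            from langchain_foo_bar import FooBarLLM

            llm = FooBarLLM(model="foo-bar-large", temperature=0)

    Invoke:
        .. code-block:: python

            llm.invoke("The meaning of life is")

    Stream:
        .. code-block:: python

            for chunk in llm.stream("The meaning of life is"):
                print(chunk, end="")

    Async:
        .. code-block:: python

            await llm.ainvoke("The meaning of life is")
    """

    # A real integration would also implement _call / _llm_type and the init
    # args documented above; they are omitted because only the docstring
    # structure is being illustrated here.
```

Each section pairs an input code block with its output where relevant (e.g. showing the string returned by invoke), so users can see expected behavior at a glance in the API reference.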