feat: add create extension with predefined type docs (#285)
Showing 4 changed files with 68 additions and 18 deletions.

35 changes: 35 additions & 0 deletions
docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md

# Create an LLM Extension

## Creating an AsyncLLMBaseExtension with tman

Run the following command:

```bash
tman install extension default_async_llm_extension_python --template-mode --template-data package_name=llm_extension --template-data class_name_prefix=LLMExtension
```

## Abstract APIs to implement

### `on_data_chat_completion(self, ten_env: TenEnv, **kargs: LLMDataCompletionArgs) -> None`

This method is called when the LLM Extension receives a data completion request. It is used when data is passed in via the data protocol in streaming mode.

### `on_call_chat_completion(self, ten_env: TenEnv, **kargs: LLMCallCompletionArgs) -> any`

This method is called when the LLM Extension receives a call completion request. It is used when data is passed in via the call protocol in non-streaming mode.

### `on_tools_update(self, ten_env: TenEnv, tool: LLMToolMetadata) -> None`

This method is called when the LLM Extension receives a tool update request.
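
To make the shape of a concrete extension clearer, here is a minimal sketch of a subclass that overrides the three hooks above. It is not the generated template itself: the import paths, the coroutine signatures, and the `ten_env.log_info` calls are assumptions, so align them with the code that `tman` generates for your package.

```python
# Minimal sketch, not the generated template. Import paths and helper calls
# (e.g. ten_env.log_info) are assumptions; check the generated code.
from ten import TenEnv  # assumed runtime import path
from ten_ai_base.llm import AsyncLLMBaseExtension  # assumed package layout
from ten_ai_base.types import (
    LLMCallCompletionArgs,
    LLMDataCompletionArgs,
    LLMToolMetadata,
)


class LLMExtension(AsyncLLMBaseExtension):
    async def on_data_chat_completion(
        self, ten_env: TenEnv, **kargs: LLMDataCompletionArgs
    ) -> None:
        # Streaming path: input arrived via the data protocol; stream partial
        # results back out as they are produced (output sending omitted here).
        ten_env.log_info(f"data chat completion requested: {kargs}")

    async def on_call_chat_completion(
        self, ten_env: TenEnv, **kargs: LLMCallCompletionArgs
    ) -> any:
        # Non-streaming path: input arrived via the call protocol; return the
        # whole completion at once. The result shape here is a placeholder.
        ten_env.log_info(f"call chat completion requested: {kargs}")
        return {"text": "stub completion"}

    async def on_tools_update(
        self, ten_env: TenEnv, tool: LLMToolMetadata
    ) -> None:
        # A connected tool extension registered or updated a tool; the base
        # class keeps the accumulated list in self.available_tools.
        ten_env.log_info(f"tool updated: {tool}")
```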

## APIs

### cmd_in: `tool_register`

This API is used to consume the tool registration request. An array of `LLMToolMetadata` will be received as input. The tools will be appended to `self.available_tools` for future use.

### cmd_out: `tool_call`

This API is used to send the tool call request. You can connect this API to any LLMTool extension destination to get the tool call result.
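
From the LLM side, the round trip looks roughly like the sketch below: when the model decides to use a tool that was announced via `tool_register`, the extension emits a `tool_call` cmd and awaits the connected tool extension's result. The base class already ships this plumbing; the `Cmd` helpers and the property names used here are assumptions, shown only to illustrate the flow.

```python
# Illustration only: the real tool_call plumbing is provided by the base class.
# The Cmd helper names and the property layout below are assumptions.
import json

from ten import AsyncTenEnv, Cmd  # import paths are assumptions


async def request_tool_call(
    ten_env: AsyncTenEnv, tool_name: str, arguments: dict
) -> None:
    # Hypothetical helper: build and send the cmd_out declared above.
    cmd = Cmd.create("tool_call")
    cmd.set_property_string("name", tool_name)  # assumed payload shape
    cmd.set_property_from_json("arguments", json.dumps(arguments))
    result = await ten_env.send_cmd(cmd)
    ten_env.log_info(f"tool_call result: {result}")
```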

29 changes: 29 additions & 0 deletions
docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md

# Create an LLM Tool Extension

## Creating an AsyncLLMToolBaseExtension with tman

Run the following command:

```bash
tman install extension default_async_llm_tool_extension_python --template-mode --template-data package_name=llm_tool_extension --template-data class_name_prefix=LLMToolExtension
```

## Abstract APIs to implement

### `get_tool_metadata(self, ten_env: TenEnv) -> list[LLMToolMetadata]`

This method is called when the LLM Tool Extension is about to register itself with any connected LLM. It should return a list of `LLMToolMetadata`.

### `run_tool(self, ten_env: AsyncTenEnv, name: str, args: dict) -> LLMToolResult`

This method is called when the LLM Tool Extension receives a tool call request. It should execute the requested tool function and return the result.
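
Put together, a tool extension is mostly these two methods. The sketch below registers a single hypothetical `get_weather` tool; the import paths and the exact `LLMToolMetadata` / `LLMToolResult` shapes are assumptions, so follow the code that `tman` generates for your package.

```python
# Minimal sketch, not the generated template. Import paths and the exact
# LLMToolMetadata / LLMToolResult shapes are assumptions.
from ten import AsyncTenEnv, TenEnv  # import paths are assumptions
from ten_ai_base.llm_tool import AsyncLLMToolBaseExtension
from ten_ai_base.types import LLMToolMetadata, LLMToolResult


class LLMToolExtension(AsyncLLMToolBaseExtension):
    def get_tool_metadata(self, ten_env: TenEnv) -> list[LLMToolMetadata]:
        # Announced to the connected LLM via the `tool_register` cmd.
        return [
            LLMToolMetadata(
                name="get_weather",  # hypothetical tool; field names assumed
                description="Look up the current weather for a city.",
                parameters=[],  # parameter schema omitted in this sketch
            )
        ]

    async def run_tool(
        self, ten_env: AsyncTenEnv, name: str, args: dict
    ) -> LLMToolResult:
        # Dispatched by the base class when a `tool_call` cmd arrives.
        if name == "get_weather":
            city = args.get("city", "somewhere")
            # The result shape below is an assumption for illustration only.
            return {"content": f"It is sunny in {city}."}
        raise ValueError(f"unknown tool: {name}")
```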

## APIs

### cmd_out: `tool_register`

This API is used to send the tool registration request. An array of `LLMToolMetadata` returned from `get_tool_metadata` will be sent as output.

### cmd_in: `tool_call`

This API is used to consume the tool call request. `run_tool` is executed when this cmd is received.
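
As a quick sanity check of this contract, you can exercise the tool logic directly with the same `name` and `args` that a `tool_call` cmd would carry, without any cmd plumbing. The snippet below reuses the hypothetical `LLMToolExtension` sketch from the previous section; the constructor argument is an assumption, and `None` is passed for `ten_env` only because the sketch never touches it.

```python
# Illustrative sanity check: call run_tool directly with the same name/args an
# incoming `tool_call` cmd would carry. Reuses the LLMToolExtension sketch
# above; the constructor argument is an assumption, and ten_env is unused.
import asyncio


async def main() -> None:
    ext = LLMToolExtension("llm_tool_extension")
    result = await ext.run_tool(None, "get_weather", {"city": "Berlin"})
    print(result)


asyncio.run(main())
```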