From a8479968f14befb93ab26119fc93fb0009678058 Mon Sep 17 00:00:00 2001
From: zhangqianze
Date: Fri, 15 Nov 2024 22:20:28 +0800
Subject: [PATCH] feat: add create extension with predefined type docs

---
 docs/SUMMARY.md                |  3 ++
 .../overview.md                | 26 ++++++++++++++
 .../type_llm_extension.md      | 35 +++++++++++++++++++
 .../type_llm_tool_extension.md | 29 +++++++++++++++
 4 files changed, 93 insertions(+)
 create mode 100644 docs/ten_agent/create_an_extension_with_predefined_type/overview.md
 create mode 100644 docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md
 create mode 100644 docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md

diff --git a/docs/SUMMARY.md b/docs/SUMMARY.md
index 5eb2c4cef..a54b97db2 100644
--- a/docs/SUMMARY.md
+++ b/docs/SUMMARY.md
@@ -13,6 +13,9 @@
 * [Create a Hello World Extension](ten_agent/create_a_hello_world_extension.md)
 * [Setup VSCode Inside Container](ten_agent/setting_up_vscode_for_development_inside_container.md)
 * [How does interrupt work in TEN-Agent](ten_agent/how_does_interrupt_work.md)
+* [Create an Extension with Predefined Type](ten_agent/create_an_extension_with_predefined_type/overview.md)
+  * [LLM Extension](ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md)
+  * [LLM Tool Extension](ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md)
 
 ## TEN framework
 
diff --git a/docs/ten_agent/create_an_extension_with_predefined_type/overview.md b/docs/ten_agent/create_an_extension_with_predefined_type/overview.md
new file mode 100644
index 000000000..e06422339
--- /dev/null
+++ b/docs/ten_agent/create_an_extension_with_predefined_type/overview.md
@@ -0,0 +1,26 @@
+# Overview
+
+## Extension types
+
+When developing Extensions, we often find that Extensions of the same category share similar implementations. For example, the Gemini and OpenAI Extensions follow much the same implementation logic but differ in certain details.
+To improve development efficiency, these similar implementations can be abstracted into a generic Extension type. During development, you then only need to inherit from this type and implement a few specific methods.
+
+Currently, TEN Agent supports the following Extension types:
+
+- `AsyncLLMBaseExtension`: Designed for implementing large language model Extensions, such as those similar to OpenAI's.
+- `AsyncLLMToolBaseExtension`: Used to implement tool Extensions for large language models, i.e. Extensions that provide tool capabilities through the Function Call mechanism.
+
+This abstraction helps standardize development while reducing repetitive work.
+
+Run the following command in a TEN project to install the abstract base class library:
+
+```bash
+tman install system ten_ai_base@0.1.0
+```
+
+## Extension behavior
+
+### LLM Extension and LLM Tool Extension
+
+Any LLM Tool Extension can be connected to an LLM Extension. When the LLM Tool Extension starts, it automatically connects to the LLM Extension.
+
+When the LLM Extension detects a Function Call, it passes the call to the LLM Tool Extension for processing. Once the LLM Tool Extension finishes processing, it returns the result to the LLM Extension.
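The register → call → result round trip described above can be sketched with plain-Python stand-ins. None of the class or method names below come from the `ten_ai_base` API; they only illustrate the message flow that the `tool_register` and `tool_call` commands carry at runtime.

```python
# Plain-Python stand-ins illustrating the register -> call -> result flow.
# In TEN Agent this exchange happens over the tool_register / tool_call
# commands between an LLM Extension and an LLM Tool Extension.
class LLMExtensionStub:
    def __init__(self) -> None:
        self.available_tools = {}  # tool name -> callable

    def register_tool(self, name, fn) -> None:
        # Corresponds to a tool Extension sending tool_register at startup.
        self.available_tools[name] = fn

    def on_function_call(self, name, args):
        # Corresponds to the LLM Extension forwarding a Function Call
        # via tool_call and receiving the tool's result back.
        return self.available_tools[name](**args)


llm = LLMExtensionStub()
llm.register_tool("add", lambda a, b: a + b)  # tool registers itself
print(llm.on_function_call("add", {"a": 2, "b": 3}))  # prints 5
```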
diff --git a/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md b/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md
new file mode 100644
index 000000000..4f4929d64
--- /dev/null
+++ b/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md
@@ -0,0 +1,35 @@
+# Create an LLM Extension
+
+## Creating an AsyncLLMBaseExtension with tman
+
+Run the following command:
+
+```bash
+tman install extension default_async_llm_extension_python --template-mode --template-data package_name=llm_extension --template-data class_name_prefix=LLMExtension
+```
+
+## Abstract APIs to implement
+
+### `on_data_chat_completion(self, ten_env: TenEnv, **kargs: LLMDataCompletionArgs) -> None`
+
+This method is called when the LLM Extension receives a data completion request, i.e. when data is passed in via the data protocol in streaming mode.
+
+### `on_call_chat_completion(self, ten_env: TenEnv, **kargs: LLMCallCompletionArgs) -> any`
+
+This method is called when the LLM Extension receives a call completion request, i.e. when data is passed in via the call protocol in non-streaming mode.
+
+### `on_tools_update(self, ten_env: TenEnv, tool: LLMToolMetadata) -> None`
+
+This method is called when the LLM Extension receives a tool update request.
+
+## APIs
+
+### cmd_in: `tool_register`
+
+This API consumes tool registration requests. An array of `LLMToolMetadata` is received as input, and the tools are appended to `self.available_tools` for later use.
+
+### cmd_out: `tool_call`
+
+This API sends tool call requests. You can connect it to any LLM Tool Extension destination to get the tool call result.
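A minimal subclass implementing the three abstract methods can be sketched as follows. The `TenEnv` and `AsyncLLMBaseExtension` stubs below are simplified stand-ins for the real TEN runtime and `ten_ai_base` classes, whose actual signatures differ; only the overridden method names and `self.available_tools` come from this document, and the echo logic is purely illustrative.

```python
import asyncio


# Stand-in stubs: the real classes come from the TEN runtime and ten_ai_base.
class TenEnv:
    def send_text(self, text: str) -> None:
        print(text)


class AsyncLLMBaseExtension:  # stand-in for ten_ai_base.AsyncLLMBaseExtension
    def __init__(self) -> None:
        self.available_tools: list = []


class MyLLMExtension(AsyncLLMBaseExtension):
    async def on_data_chat_completion(self, ten_env: TenEnv, **kargs) -> None:
        # Streaming path: emit each chunk as it is produced (echo here).
        for chunk in kargs.get("messages", []):
            ten_env.send_text(chunk)

    async def on_call_chat_completion(self, ten_env: TenEnv, **kargs):
        # Non-streaming path: return one aggregated result.
        return {"text": " ".join(kargs.get("messages", []))}

    async def on_tools_update(self, ten_env: TenEnv, tool) -> None:
        # Track tools registered by connected LLM Tool Extensions.
        self.available_tools.append(tool)


ext = MyLLMExtension()
result = asyncio.run(ext.on_call_chat_completion(TenEnv(), messages=["hello", "world"]))
print(result["text"])  # hello world
```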
diff --git a/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md b/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md
new file mode 100644
index 000000000..aab7274bc
--- /dev/null
+++ b/docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md
@@ -0,0 +1,29 @@
+# Create an LLM Tool Extension
+
+## Creating an AsyncLLMToolBaseExtension with tman
+
+Run the following command:
+
+```bash
+tman install extension default_async_llm_tool_extension_python --template-mode --template-data package_name=llm_tool_extension --template-data class_name_prefix=LLMToolExtension
+```
+
+## Abstract APIs to implement
+
+### `get_tool_metadata(self, ten_env: TenEnv) -> list[LLMToolMetadata]`
+
+This method is called when the LLM Tool Extension is about to register itself with any connected LLM Extension. It should return a list of `LLMToolMetadata`.
+
+### `run_tool(self, ten_env: AsyncTenEnv, name: str, args: dict) -> LLMToolResult`
+
+This method is called when the LLM Tool Extension receives a tool call request. It should execute the tool function and return the result.
+
+## APIs
+
+### cmd_out: `tool_register`
+
+This API sends the tool registration request. The array of `LLMToolMetadata` returned from `get_tool_metadata` is sent as output.
+
+### cmd_in: `tool_call`
+
+This API consumes tool call requests. `run_tool` is executed when this command is received.
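A tool subclass implementing both abstract methods can be sketched as below. The `AsyncLLMToolBaseExtension` and `LLMToolMetadata` stubs are simplified stand-ins for the real `ten_ai_base` classes, a plain dict stands in for `LLMToolResult`, and the weather tool itself is a hypothetical example; only the two method names come from this document.

```python
import asyncio


# Stand-ins: the real AsyncLLMToolBaseExtension, LLMToolMetadata and
# LLMToolResult come from ten_ai_base and differ in detail.
class AsyncLLMToolBaseExtension:
    pass


class LLMToolMetadata:
    def __init__(self, name: str, description: str) -> None:
        self.name = name
        self.description = description


class WeatherToolExtension(AsyncLLMToolBaseExtension):
    async def get_tool_metadata(self, ten_env) -> list:
        # Advertised to the connected LLM Extension via tool_register.
        return [LLMToolMetadata("get_weather", "Look up the weather for a city")]

    async def run_tool(self, ten_env, name: str, args: dict):
        # Invoked when the LLM Extension forwards a Function Call via tool_call.
        if name == "get_weather":
            # Plain dict in place of the real LLMToolResult type.
            return {"content": f"It is sunny in {args['city']}"}
        raise ValueError(f"unknown tool: {name}")


tool = WeatherToolExtension()
print(asyncio.run(tool.run_tool(None, "get_weather", {"city": "Paris"}))["content"])
```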