feat: add create extension with predefined type docs #285

Merged 2 commits on Nov 17, 2024
docs/SUMMARY.md (2 additions, 2 deletions)

* [Setup VSCode Inside Container](ten_agent/setting_up_vscode_for_development_inside_container.md)
* [How does interrupt work in TEN-Agent](ten_agent/how_does_interrupt_work.md)
* [Create an Extension with Predefined Type](ten_agent/create_an_extension_with_predefined_type/overview.md)
* [LLM Extension](ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md) (renamed from llm_extension.md)
* [LLM Tool Extension](ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md) (renamed from llm_tool_extension.md)

## TEN framework
docs/ten_agent/create_an_extension_with_predefined_type/overview.md

This PR removes the GitBook page frontmatter (layout visibility settings) from the top of the file.

# Overview

## Extension types

When developing Extensions, we often notice that implementations for Extensions of the same type tend to follow similar patterns.

Currently, TEN Agent supports the following Extension types:

- `AsyncLLMBaseExtension`: Designed for implementing large language model Extensions, such as OpenAI-style chat completion services.
- `AsyncLLMToolBaseExtension`: Used to implement tool Extensions for large language models, i.e. Extensions that provide tool capabilities through the Function Call mechanism.

This abstraction helps standardize development while reducing repetitive work.
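As a hypothetical illustration of this abstraction (only the two base-class names come from these docs; the stub bodies and subclass names are invented here), choosing a predefined type is simply choosing which base class a new Extension derives from:

```python
# Illustrative stubs: the real base classes live in the TEN Agent
# Python packages; only their names are taken from these docs.
class AsyncLLMBaseExtension:
    """Base for large language model Extensions (OpenAI-style)."""

class AsyncLLMToolBaseExtension:
    """Base for Function Call tool Extensions used by an LLM."""

# Hypothetical concrete Extensions: each picks the predefined type
# that matches its role in the graph.
class MyChatModelExtension(AsyncLLMBaseExtension):
    pass

class MyWeatherToolExtension(AsyncLLMToolBaseExtension):
    pass
```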
docs/ten_agent/create_an_extension_with_predefined_type/type_llm_extension.md (new file)
# Create an LLM Extension

## Creating an AsyncLLMBaseExtension using tman

Run the following command:

```bash
tman install extension default_async_llm_extension_python \
  --template-mode \
  --template-data package_name=llm_extension \
  --template-data class_name_prefix=LLMExtension
```

## Abstract APIs to implement

### `on_data_chat_completion(self, ten_env: TenEnv, **kargs: LLMDataCompletionArgs) -> None`

This method is called when the LLM Extension receives a data completion request. It is used when data is passed in via the data protocol in streaming mode.

### `on_call_chat_completion(self, ten_env: TenEnv, **kargs: LLMCallCompletionArgs) -> any`

This method is called when the LLM Extension receives a call completion request. It is used when data is passed in via the call protocol in non-streaming mode.

### `on_tools_update(self, ten_env: TenEnv, tool: LLMToolMetadata) -> None`

This method is called when the LLM Extension receives a tool update request.
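A concrete subclass might implement the three hooks as sketched below. This is a hypothetical, self-contained example: `TenEnv`, the base class, and the return values are minimal stand-ins, not the real TEN Agent APIs.

```python
from typing import Any

class TenEnv:
    """Stand-in for the real TenEnv runtime handle."""

class AsyncLLMBaseExtension:
    """Stand-in for the real base class."""
    def __init__(self) -> None:
        self.available_tools: list[Any] = []

class MyLLMExtension(AsyncLLMBaseExtension):
    async def on_data_chat_completion(self, ten_env: TenEnv, **kargs: Any) -> None:
        # Streaming path: read the incoming messages and push partial
        # results back through ten_env as they are produced.
        messages = kargs.get("messages", [])
        del messages  # placeholder: a real Extension streams tokens here

    async def on_call_chat_completion(self, ten_env: TenEnv, **kargs: Any) -> Any:
        # Non-streaming path: return one complete result.
        return {"text": "hello"}

    async def on_tools_update(self, ten_env: TenEnv, tool: Any) -> None:
        # Remember tools announced by connected tool Extensions.
        self.available_tools.append(tool)
```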

## APIs

### cmd_in: `tool_register`

This API is used to consume the tool registration request. An array of `LLMToolMetadata` will be received as input. The tools will be appended to `self.available_tools` for future use.

### cmd_out: `tool_call`

This API is used to send the tool call request. You can connect this API to any LLM tool Extension destination to get the tool call result.
docs/ten_agent/create_an_extension_with_predefined_type/type_llm_tool_extension.md (new file)
# Create an LLM Tool Extension

## Creating an AsyncLLMToolBaseExtension using tman

Run the following command:

```bash
tman install extension default_async_llm_tool_extension_python \
  --template-mode \
  --template-data package_name=llm_tool_extension \
  --template-data class_name_prefix=LLMToolExtension
```

## Abstract APIs to implement

### `get_tool_metadata(self, ten_env: TenEnv) -> list[LLMToolMetadata]`

This method is called when the LLM Tool Extension is about to register itself with any connected LLM. It should return a list of `LLMToolMetadata`.

### `run_tool(self, ten_env: AsyncTenEnv, name: str, args: dict) -> LLMToolResult`

This method is called when the LLM Tool Extension receives a tool call request. It should execute the requested tool function and return the result.
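A tool Extension implementing both hooks might look like the sketch below. It is hypothetical and self-contained: the base class and `LLMToolMetadata` are minimal stand-ins for the real TEN Agent types, and `get_weather` is an invented example tool.

```python
from dataclasses import dataclass, field

@dataclass
class LLMToolMetadata:
    """Stand-in for the real metadata type."""
    name: str
    description: str
    parameters: dict = field(default_factory=dict)

class AsyncLLMToolBaseExtension:
    """Stand-in for the real base class."""

class WeatherToolExtension(AsyncLLMToolBaseExtension):
    def get_tool_metadata(self, ten_env) -> list[LLMToolMetadata]:
        # Advertised to each connected LLM when this Extension registers.
        return [LLMToolMetadata(
            name="get_weather",
            description="Look up the current weather for a city",
            parameters={"city": {"type": "string"}},
        )]

    async def run_tool(self, ten_env, name: str, args: dict):
        # Dispatch on the tool name carried by the LLM's function call.
        if name == "get_weather":
            return {"city": args["city"], "temp_c": 21}
        raise ValueError(f"unknown tool: {name}")
```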

## APIs

### cmd_out: `tool_register`

This API is used to send the tool registration request. An array of `LLMToolMetadata` returned from `get_tool_metadata` will be sent as output.

### cmd_in: `tool_call`

This API is used to consume the tool call request. `run_tool` is executed when the command is received.
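The register/call round trip between the two Extension types can be sketched as follows. This is a hypothetical illustration: the command routing is modeled as plain function calls, whereas in TEN Agent it travels across the graph as `tool_register` and `tool_call` commands; class and tool names here are invented.

```python
import asyncio

class EchoToolExtension:
    """Stand-in tool Extension (cmd_out: tool_register, cmd_in: tool_call)."""
    def get_tool_metadata(self, ten_env):
        return [{"name": "echo"}]

    async def run_tool(self, ten_env, name, args):
        return {"echoed": args["text"]}

class LLMExtensionSide:
    """Stand-in LLM Extension (cmd_in: tool_register, cmd_out: tool_call)."""
    def __init__(self):
        self.available_tools = []

    def on_tool_register(self, metadata):
        # Models the cmd_in `tool_register` handler on the LLM side.
        self.available_tools.extend(metadata)

tool = EchoToolExtension()
llm = LLMExtensionSide()

# Registration: the tool's metadata travels tool -> LLM (`tool_register`).
llm.on_tool_register(tool.get_tool_metadata(None))

# Invocation: the LLM issues `tool_call`; the tool executes `run_tool`
# and the result flows back to the LLM.
result = asyncio.run(tool.run_tool(None, "echo", {"text": "hi"}))
```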