server : add /detokenize endpoint (ggerganov#2802)
* Add a /detokenize endpoint to the example server

* remove trailing white-space
BruceMacD authored Aug 26, 2023
1 parent 730d9c6 commit c1ac54b
Showing 2 changed files with 27 additions and 0 deletions.
6 changes: 6 additions & 0 deletions examples/server/README.md
@@ -164,6 +164,12 @@ node index.js

Note that the special `BOS` token is not added in front of the text, and a space character is not inserted automatically as it is for `/completion`.

- **POST** `/detokenize`: Convert tokens to text.

*Options:*

`tokens`: Set the tokens to detokenize.

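For illustration, a minimal sketch of the request and response bodies for this endpoint (the token ids and the returned text here are made-up placeholders; a running server would return the actual detokenized string):

```python
import json

# Hypothetical request body for POST /detokenize; the "tokens" field name
# comes from the endpoint description above, but these token ids are invented.
request = json.dumps({"tokens": [1, 15043, 3186]})

# The server replies with a JSON object carrying the recovered text under a
# "content" key; the value below is purely illustrative.
reply = json.loads('{"content": " Hello World"}')
print(reply["content"])
```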
- **POST** `/embedding`: Generate embedding of a given text just as [the embedding example](../embedding) does.

*Options:*
21 changes: 21 additions & 0 deletions examples/server/server.cpp
@@ -1104,6 +1104,12 @@ static json format_tokenizer_response(const std::vector<llama_token> &tokens)
{"tokens", tokens}};
}

static json format_detokenized_response(std::string content)
{
    return json{
        {"content", content}};
}

template <typename T>
static T json_value(const json &body, const std::string &key, const T &default_value)
{
@@ -1501,6 +1507,21 @@ int main(int argc, char **argv)
const json data = format_tokenizer_response(tokens);
return res.set_content(data.dump(), "application/json"); });

    svr.Post("/detokenize", [&llama](const Request &req, Response &res)
            {
                auto lock = llama.lock();

                const json body = json::parse(req.body);
                std::string content;
                if (body.count("tokens") != 0)
                {
                    const std::vector<llama_token> tokens = body["tokens"];
                    content = tokens_to_str(llama.ctx, tokens.cbegin(), tokens.cend());
                }

                const json data = format_detokenized_response(content);
                return res.set_content(data.dump(), "application/json"); });

svr.Post("/embedding", [&llama](const Request &req, Response &res)
{
auto lock = llama.lock();
Expand Down
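The control flow of the `/detokenize` handler can be mirrored in a short Python sketch. This is a mock only: `fake_vocab` and the token ids are invented stand-ins for the real `tokens_to_str` call on the llama context, but the JSON handling (look for a `tokens` field, fall back to an empty `content` string when it is absent) matches the diff.

```python
import json

# Invented stand-in vocabulary; the real server maps ids to text via
# tokens_to_str() on the loaded model's context.
fake_vocab = {1: "<s>", 15043: " Hello", 3186: " World"}

def detokenize_handler(request_body: str) -> str:
    """Mirror of the handler: parse the JSON body, detokenize if a
    "tokens" field is present, otherwise return empty content."""
    body = json.loads(request_body)
    content = ""
    if "tokens" in body:
        content = "".join(fake_vocab.get(t, "") for t in body["tokens"])
    return json.dumps({"content": content})

print(detokenize_handler('{"tokens": [15043, 3186]}'))  # {"content": " Hello World"}
print(detokenize_handler('{}'))                         # {"content": ""}
```

Note that, like the C++ handler, the mock does not error on a missing `tokens` field; it simply returns an empty `content` string.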
