# LocalLm

An API to query local language models using different backends.

| Version | Name | Description | Doc |
| --- | --- | --- | --- |
| pub package | @locallm/types | The shared data types | Api doc - Readme |
| pub package | @locallm/api | Run local language models using different backends | Api doc - Readme |
| pub package | @locallm/browser | Run quantized language models inside the browser | Api doc - Readme |

## Supported backends

## Quickstart

### Api

```bash
npm install @locallm/api
# or
yarn add @locallm/api
```

Example with the Koboldcpp provider:

```ts
import { Lm } from "@locallm/api";

// Connect to a running Koboldcpp server and stream tokens to stdout
const lm = new Lm({
  providerType: "koboldcpp",
  serverUrl: "http://localhost:5001",
  onToken: (t) => process.stdout.write(t),
});
// Format the prompt with the model's instruct template
const template = "<s>[INST] {prompt} [/INST]";
const _prompt = template.replace("{prompt}", "list the planets in the solar system");
// run the inference query
const res = await lm.infer(_prompt, {
  temperature: 0,
  top_p: 0.35,
  n_predict: 200,
});
console.log(res);
```
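
The same pattern applies to other backends by changing the provider settings. Below is a minimal sketch assuming an `ollama` provider type string and the default Ollama port (11434); both are assumptions here, so check the `@locallm/api` documentation for the exact provider names and options your version supports.

```ts
import { Lm } from "@locallm/api";

// Hypothetical settings: the "ollama" provider type and port 11434 are
// assumptions; verify them against the @locallm/api documentation.
const lm = new Lm({
  providerType: "ollama",
  serverUrl: "http://localhost:11434",
  onToken: (t) => process.stdout.write(t),
});

// Same inference call as above, without an instruct template
const res = await lm.infer("list the planets in the solar system", {
  temperature: 0,
  n_predict: 200,
});
console.log(res);
```

Depending on the backend, you may also need to select or load a model before calling `infer`; see the per-package readmes linked in the table above.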

## Examples

Check the examples directory for more examples.