
llm is a backend for the 'aia' program

A CLI utility and Python library for interacting with Large Language Models, both via remote APIs and models that can be installed and run on your own machine.

The current integration with llm is lightweight. It has some capabilities that exceed those of the default backend processor mods, which would make it a possible replacement as the default - if it were not implemented in Python, with all the usual confusion between python2 and python3 environments. That little problem impacted my workstation: after installing llm, my Python environment changed in such a way that I could no longer access the sgpt backend processor, which is also implemented in Python.

Some of the features that make llm interesting in its own right are its integration with local models and LocalAI as well as the remote APIs. It has an integrated chat capability, and it supports templated prompts as well as embeddings.
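For a sense of what the tool offers beyond the CLI, here is a minimal sketch of llm's Python library usage for prompting and embeddings. The model aliases ("gpt-3.5-turbo" and "3-small") are assumptions; substitute whatever remote or locally installed models you have configured.

```python
import llm

# Prompt a model by name (assumed alias; any configured model works).
model = llm.get_model("gpt-3.5-turbo")
response = model.prompt("Summarize what the aia program does in one sentence.")
print(response.text())

# Generate an embedding vector for a piece of text
# (assumed embedding model alias; depends on installed plugins/keys).
embedding_model = llm.get_embedding_model("3-small")
vector = embedding_model.embed("example text to embed")
print(len(vector))
```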

Website: https://llm.datasette.io/
