
FuncyLlama

This is an implementation of Ollama over LangChain, utilizing NexusRaven-V2 as a function-calling LLM to invoke custom functions as well as other LLMs (here, Codellama and Mistral).
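The flow can be sketched roughly as follows, assuming NexusRaven-V2's documented prompt format (Python-style `Function:` stubs followed by `User Query: ...<human_end>`, answered with a `Call: ...` line). The function names and signatures below are illustrative, not taken from this repo:

```python
def build_raven_prompt(functions, query):
    """Format Python-style function stubs the way NexusRaven-V2 expects."""
    defs = "".join(
        f'Function:\ndef {sig}:\n    """{doc}"""\n\n' for sig, doc in functions
    )
    return f"{defs}User Query: {query}<human_end>"


def parse_raven_call(output):
    """Extract the 'Call: func(args)' line NexusRaven-V2 emits, if any."""
    for line in output.splitlines():
        if line.startswith("Call:"):
            return line[len("Call:"):].strip()
    return None


if __name__ == "__main__":
    prompt = build_raven_prompt(
        [("get_weather(city: str)", "Return the current weather for a city.")],
        "What's the weather in Paris?",
    )
    # With the models pulled, LangChain's Ollama wrapper can run the prompt:
    #   from langchain_community.llms import Ollama
    #   raven = Ollama(model="nexusraven", stop=["<bot_end>"])
    #   call = parse_raven_call(raven.invoke(prompt))
    # The parsed call can then be dispatched to a custom function, or routed
    # to another model such as Ollama(model="mistral").
    print(parse_raven_call("Thought: pick a tool\nCall: get_weather(city='Paris')"))
```

The key design point is that NexusRaven-V2 only decides *which* call to make; executing the call (or forwarding to Codellama/Mistral) stays in your Python code.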

Installation

Be sure to install Ollama first. Detailed instructions can be found here: https://github.com/jmorganca/ollama

Once installed, you will need to download NexusRaven-V2, Codellama, and Mistral:

```shell
ollama pull nexusraven && ollama pull codellama && ollama pull mistral
```

After Ollama is installed and you have downloaded the appropriate models, you can run the following commands to clone the repo and run the script:

Install

```shell
git clone https://github.com/TaitTechMedia/FuncyLlama
cd FuncyLlama
python3 -m venv venv
source venv/bin/activate
pip install langchain requests
```

Run

```shell
source venv/bin/activate
python3 app.py
```
