A web app built with React as a frontend and Express as a backend. It uses Node LlamaCPP to run an LLM locally and generate financial predictions based on data from different APIs. Data store: MongoDB.
You need to set up the LLM you want to use in the backend. The LLM should be saved as a .gguf file in the backend/models folder.
- Validate the LLM:

```sh
npx --no node-llama-cpp chat --model PATH-TO-MODEL-DIR
```

For example:

```sh
npx --no node-llama-cpp chat --model c:/projects/finai/backend/models/codellama-13b.Q3_K_M.gguf
```

- Or do a test request:
```sh
curl --location 'localhost:9000/api/llm' \
--header 'Content-Type: application/json' \
--data '{ "messages": "Hello there" }'
```
- Or call the Express API endpoint `localhost:9000/api/llm` to see the result.
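Before forwarding the request body to the LLM, the endpoint needs to check its shape. A minimal sketch of such a check, based on the `{ "messages": "Hello there" }` payload shown above (`parseLlmRequest` is a hypothetical helper; the project's real route may differ):

```javascript
// Sketch: validate the body of a POST to /api/llm.
// Accepts a plain string or an array of strings under "messages".
function parseLlmRequest(body) {
  if (typeof body.messages === "string") {
    return [body.messages];
  }
  if (
    Array.isArray(body.messages) &&
    body.messages.every((m) => typeof m === "string")
  ) {
    return body.messages;
  }
  throw new Error("'messages' must be a string or an array of strings");
}

module.exports = { parseLlmRequest };
```

Normalizing to an array keeps the downstream prompt-building code simple regardless of how the client sent the messages.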
Some example models and formats can be found in `LLMTypeDefinitions.json`.
Then you can run the commands below from the FinAI (main) directory to start the project:

```sh
npm i
npm start
```

This installs and starts both the FE and BE using the npm tool `concurrently`. Alternatively, you can run the commands separately in the frontend / backend folders to have them running in separate instances/terminals.
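For reference, a root `package.json` wiring this up with `concurrently` could look roughly like the following. This is a sketch under the assumption that the frontend and backend folders each have their own `start` script; the project's actual scripts may differ:

```json
{
  "scripts": {
    "start": "concurrently \"npm start --prefix frontend\" \"npm start --prefix backend\""
  },
  "devDependencies": {
    "concurrently": "^8.0.0"
  }
}
```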
Create a `.env.local` file in the root directory:

```
ANTHROPIC_API_KEY=your_api_key_here
```

Run the development server:

```sh
npm i
npm start
```

Tech stack:

- [✔] React (Vite, TypeScript)
- [✔] Express API
- [✔] Node LlamaCPP
- [✔] TradingView API / Widgets
- [✔] Axios
- [✔] MongoDB
- Free Chat / Text Inputs
- Analysis Prompts - specific prompts for analyzing financial data
- Price / Ticker Inputs - a structured way of serving financial data to the LLM
- Data Visualisation for Stocks / Crypto - Charts / Diagrams / Tickers with live price updates from the TradingView API
- Auth0 - Auth and User Management
- LLM Fine-Tuning - and general Model-related options
https://github.com/withcatai/node-llama-cpp