GeniSysAI LLMCore

GeniSysAI LLMCore is the successor to the GeniSysAI NLU Engine, the retrieval-based natural language understanding engine that was the core of the GeniSysAI Network. LLMCore takes advantage of modern AI technologies, replacing traditional natural language understanding with powerful generative AI models and modern libraries.

GeniSysAI LLMCore currently supports the following hardware:

GeniSysAI LLMCore For Intel® AI PC

Project Status: In Development

The first version of LLMCore is built to run on Intel® AI PCs, supporting Intel® Core™ Ultra CPUs and NPUs, and Intel® Arc™ GPUs. You can find out more about Intel® AI PCs in our article here.

Intel® OpenVINO™

OpenVINO™ is an open-source toolkit by Intel, designed to optimize and deploy deep learning models across a range of tasks including computer vision, automatic speech recognition, generative AI, and natural language processing. It supports models built with frameworks such as PyTorch, TensorFlow, ONNX, and Keras.
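
As a rough illustration of how a framework model can be brought into the OpenVINO™ ecosystem, the sketch below uses the optimum-intel integration to export a Hugging Face Transformers model to OpenVINO IR. The model ID and output directory are placeholders, not the exact values used by LLMCore; see the installation guide for the project's actual setup.

```python
# Minimal sketch: export a Hugging Face causal LM to OpenVINO IR
# using optimum-intel (pip install optimum[openvino]).
# The model ID and output directory are illustrative placeholders.
from optimum.intel import OVModelForCausalLM
from transformers import AutoTokenizer

model_id = "meta-llama/Llama-3.2-1B-Instruct"  # placeholder model ID

# export=True converts the framework weights to OpenVINO IR on the fly
model = OVModelForCausalLM.from_pretrained(model_id, export=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Save the converted model and tokenizer so they can be reused without re-exporting
model.save_pretrained("llama-3.2-1b-instruct-ov")
tokenizer.save_pretrained("llama-3.2-1b-instruct-ov")
```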

Intel® OpenVINO™ GenAI

OpenVINO™ GenAI is designed to simplify the process of running generative AI models, giving you access to leading generative AI models with optimized pipelines, efficient execution methods, and sample implementations. It abstracts away the complexity of the generation pipeline, letting you focus on providing the model and input context while OpenVINO handles tokenization, executes the generation loop on your device, and returns the results.
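
To illustrate that abstraction, here is a minimal sketch using the openvino_genai Python package, assuming a model already exported to OpenVINO IR in a local directory. The model path and device name are placeholders; "GPU" or "NPU" can be passed instead of "CPU" on supported hardware.

```python
# Minimal sketch: run text generation with OpenVINO GenAI.
# Assumes an OpenVINO IR model directory such as the one produced above;
# the path and device name are illustrative placeholders.
import openvino_genai as ov_genai

# LLMPipeline loads the model, tokenizer and detokenizer from the directory
pipe = ov_genai.LLMPipeline("llama-3.2-1b-instruct-ov", "CPU")

# The pipeline handles tokenization, the generation loop and decoding
print(pipe.generate("What is OpenVINO?", max_new_tokens=100))
```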

Getting started with LLMCore For Intel® AI PC

The project currently lets you set up the basics of LLMCore for Intel® AI PC. The current functionality and documentation allow you to use Llama 3.2, modify the system prompt, and communicate with the LLM locally.
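
As a rough sketch of that workflow, the example below runs a local chat session with a custom system prompt via OpenVINO GenAI. The model directory and system prompt are placeholders, and passing the system message to start_chat assumes a recent openvino_genai release that supports it; the installation guide covers the exact setup used by LLMCore.

```python
# Rough sketch of a local chat loop with a custom system prompt.
# The model directory is a placeholder; passing a system message to
# start_chat() assumes a recent openvino_genai release that supports it.
import openvino_genai as ov_genai

pipe = ov_genai.LLMPipeline("llama-3.2-1b-instruct-ov", "CPU")

system_prompt = "You are a helpful local AI assistant."  # placeholder prompt
pipe.start_chat(system_prompt)  # keeps conversation history on the pipeline

try:
    while True:
        user_input = input("You: ")
        if not user_input:
            break
        reply = pipe.generate(user_input, max_new_tokens=256)
        print(f"Assistant: {reply}")
finally:
    pipe.finish_chat()  # clears the chat history
```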

To get started with LLMCore for Intel® AI PC, follow the installation guide.

Author

Adam Milton-Barker
