> [!IMPORTANT]
> This learning sample is for educational purposes only and should not be used in any production use case. It is intended to highlight concepts of Semantic Kernel, not architectural or security design practices.
- You will need an OpenAI key or an Azure OpenAI Service key for this sample.
- Ensure the service API is already running at `http://localhost:7071`. If not, learn how to start it here.
- You will also need to copy `.env.example` into a new file named `.env`.
  Note: Samples are configured to use chat completion AI models (e.g., gpt-3.5-turbo, gpt-4, etc.). See https://platform.openai.com/docs/models/model-endpoint-compatibility for chat completion model options; a minimal example of calling such a model is sketched after this list.
- Run `yarn install` (if you have never run the sample before) and/or `yarn start` from the command line.
- A browser will open, or you can navigate to `http://localhost:3000` to use the sample.
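For context, chat completion models are called through OpenAI's chat completions endpoint rather than the older text completion endpoint. The snippet below is only an illustration of that request shape (the sample's backend service makes these calls for you); it assumes Node 18+ for the global `fetch` and an `OPENAI_API_KEY` environment variable, neither of which is part of this sample's setup.

```typescript
// Minimal sketch of a chat completion request to the OpenAI REST API.
// Assumes an OPENAI_API_KEY environment variable; the sample itself reads its
// key from the .env file described above, not from this snippet.
async function askChatModel(question: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo", // any chat completion model from the compatibility list
      messages: [{ role: "user", content: question }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}

askChatModel("What is Semantic Kernel?").then(console.log);
```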
Working with Secrets: KernelHttpServer's README has a note on safely working with keys and other secrets.
The GitHub Repo Q&A Bot sample lets you pull data from a public GitHub repo into a local memory store so you can ask questions about the project and get answers about it. The sample highlights how memory and embeddings work, along with the TextChunker, when the size of the data is larger than the allowed token limit. Each SK function will call OpenAI to perform the tasks you ask about.
In order to reduce costs and improve overall performance, this sample app indexes only content extracted from markdown files.
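To illustrate the idea, the sketch below shows conceptually what the memory pipeline does: split markdown that exceeds the model's token limit into chunks, embed each chunk, and answer questions by similarity search over those embeddings. This is not the Semantic Kernel TextChunker or memory API that the sample actually uses; the `chunk`, `embed`, and `indexAndSearch` helpers are hypothetical names introduced for illustration, and the code assumes Node 18+ and an `OPENAI_API_KEY` environment variable.

```typescript
// Conceptual sketch of the sample's memory pipeline (not the Semantic Kernel API):
// chunk markdown, embed each chunk, and retrieve the most relevant chunks for a question.

interface MemoryRecord {
  text: string;
  embedding: number[];
}

// Embed a piece of text with OpenAI's embeddings endpoint.
async function embed(text: string): Promise<number[]> {
  const response = await fetch("https://api.openai.com/v1/embeddings", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({ model: "text-embedding-ada-002", input: text }),
  });
  const data = await response.json();
  return data.data[0].embedding;
}

// Naive stand-in for the TextChunker: split on blank lines, then cap chunk size.
function chunk(markdown: string, maxChars = 1000): string[] {
  const paragraphs = markdown.split(/\n\s*\n/);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    if ((current + p).length > maxChars && current) {
      chunks.push(current.trim());
      current = "";
    }
    current += p + "\n\n";
  }
  if (current.trim()) chunks.push(current.trim());
  return chunks;
}

function cosineSimilarity(a: number[], b: number[]): number {
  const dot = a.reduce((sum, v, i) => sum + v * b[i], 0);
  const norm = (v: number[]) => Math.sqrt(v.reduce((s, x) => s + x * x, 0));
  return dot / (norm(a) * norm(b));
}

// Index markdown content, then return the chunks most relevant to a question.
async function indexAndSearch(markdown: string, question: string): Promise<string[]> {
  const memory: MemoryRecord[] = [];
  for (const text of chunk(markdown)) {
    memory.push({ text, embedding: await embed(text) });
  }
  const queryEmbedding = await embed(question);
  return memory
    .sort(
      (a, b) =>
        cosineSimilarity(b.embedding, queryEmbedding) -
        cosineSimilarity(a.embedding, queryEmbedding)
    )
    .slice(0, 3)
    .map((r) => r.text);
}
```

In the sample itself, this chunking, embedding, and retrieval is handled by the service API and Semantic Kernel; the sketch only conveys why large repos must be split into chunks before they fit within a model's token limit.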
Caution
Each function will call OpenAI, which will consume tokens that you will be billed for.
Create Skills and SK functions: Check out the documentation for how to create Skills.
Join the community: Join our Discord community to share ideas and get help.
Contribute: We need your help to make this the best it can be. Learn how you can contribute to this project.