This is an extremely simple library that helps me play around with Ollama.

- It only supports generating a completion.
- The only available parameter is `format`.
- The response is not streamed.
- Download and install Ollama.
- Run the server:

```shell
ollama serve
```
Follow this guide to set up your project using published packages.

The package for this library is: https://maven.pkg.github.com/le0nidas/ollama-kotlin-playground

Add the dependency to your `build.gradle`:

```kotlin
implementation("gr.le0nidas:ollama-kotlin-playground:0.0.2")
```
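Since the package is hosted on GitHub Packages, Gradle will also need a `repositories` entry with credentials; a minimal sketch for `build.gradle.kts`, assuming your GitHub username and a token with the `read:packages` scope are available as the `gpr.user`/`gpr.key` Gradle properties or the `GITHUB_ACTOR`/`GITHUB_TOKEN` environment variables (these names are conventions, not requirements of this library):

```kotlin
repositories {
    mavenCentral()
    maven {
        url = uri("https://maven.pkg.github.com/le0nidas/ollama-kotlin-playground")
        credentials {
            // assumption: credentials come from Gradle properties or env vars
            username = (findProperty("gpr.user") ?: System.getenv("GITHUB_ACTOR")) as String?
            password = (findProperty("gpr.key") ?: System.getenv("GITHUB_TOKEN")) as String?
        }
    }
}
```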
```kotlin
fun main() {
    val ollamaClient = OllamaClient()
    val request = GenerateRequest.Builder(Model("llama3.2"))
        .build(prompt = Prompt("Generate three numbers from 0 to 100"))

    val response = ollamaClient.generate(request)
    response
        .onSuccess { println(it.value) }
        .onFailure { println(it.message) }
}

/*
Here are three random numbers between 0 and 100:

1. 43
2. 91
3. 18
*/
```
Or, if you need to set a JSON schema for formatting the response:
```kotlin
import kotlinx.serialization.json.buildJsonObject
import kotlinx.serialization.json.put

fun main() {
    val ollamaClient = OllamaClient()
    val request = GenerateRequest.Builder(Model("llama3.2"))
        .withFormat(
            JsonSchemaFormat(
                buildJsonObject {
                    put("type", "object")
                    put("properties", buildJsonObject {
                        put("numbers", buildJsonObject {
                            put("type", "array")
                        })
                    })
                }
            )
        )
        .build(prompt = Prompt("Generate three numbers from 0 to 100"))

    val response = ollamaClient.generate(request)
    response
        .onSuccess { println(it.value) }
        .onFailure { println(it.message) }
}

/*
{
  "numbers": [
    43,
    91,
    13
  ]
}
*/
```
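Because the schema-constrained output is plain JSON, the success value can be decoded with kotlinx.serialization (the same library used for `buildJsonObject` above). A small sketch, where `NumbersResponse` is a hypothetical name of my own, not part of this library, and `raw` stands in for the string you would receive in `onSuccess`:

```kotlin
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json

@Serializable
data class NumbersResponse(val numbers: List<Int>)

fun main() {
    // stand-in for the string you would get from onSuccess { it.value }
    val raw = """{"numbers": [43, 91, 13]}"""
    val parsed = Json.decodeFromString<NumbersResponse>(raw)
    println(parsed.numbers.sum()) // 147
}
```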
Q: Will it ever support anything else from the API?
A: Maybe. It depends on the needs of the experiments I do that involve Ollama.
Q: Why did you write it then?
A: I didn't want to copy code from project to project.