Refactor LLama extension + cookbook to map chat messages --> Prompt 1-1 #630
Labels: effort: high, pri: mid, status: backlog, type: bug, type: enhancement
See the comments in #605.
Right now we only store the last message from the response instead of the full response (when multiple texts are returned).
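A minimal sketch of the intended behavior, assuming hypothetical helper names (`map_messages_to_prompts`, `collect_output_texts`) and an OpenAI-style `choices` list in the response; the extension's actual types and field names may differ:

```python
from typing import Any, Dict, List


def map_messages_to_prompts(messages: List[Dict[str, str]]) -> List[Dict[str, Any]]:
    """Map each chat message to its own prompt entry (1-1),
    instead of collapsing the conversation into a single prompt."""
    prompts = []
    for i, message in enumerate(messages):
        prompts.append(
            {
                "name": f"prompt_{i}",          # hypothetical naming scheme
                "input": message["content"],
                "metadata": {"role": message["role"]},
            }
        )
    return prompts


def collect_output_texts(response: Dict[str, Any]) -> List[str]:
    """Keep every returned text, not just the last one,
    when the response contains multiple choices."""
    return [choice["text"] for choice in response.get("choices", [])]
```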