❓ General Questions
I'm trying out `from mlc_chat import ChatModule`. I can use it to continue a conversation like this:

```python
from mlc_chat import ChatModule
from mlc_chat.callback import StreamToStdout

cm = ChatModule(model="Llama-2-7b-chat-hf-q4f16_1")
print(cm.generate(prompt="Three terrific names for a pet skunk"))
# Prints that out
print(cm.generate(prompt="Two more"))
# Prints out two more
```

So it's clearly persisting the previous messages in that conversation somewhere.
But... I have my own code for persisting messages (in https://llm.datasette.io/ ) - and I'd like to be able to instantiate a ChatModule that restores a previous conversation so I can continue it.
I can't see a way to feed ChatModule a full previous saved conversation in order to continue it with an additional prompt. Does that mechanism exist yet?
I want something maybe like this:

```python
cm = ChatModule(
    model="Llama-2-7b-chat-hf-q4f16_1",
    messages=[
        ["system", "You are a helpful assistant"],
        ["user", "Three terrific names for a pet skunk"],
        ["assistant", "Stinky, Pepe and Odie"],
    ],
)
print(cm.generate(prompt="Two more"))
```
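For what it's worth, the `[role, content]` pairs in that sketch round-trip cleanly through JSON, so a persistence layer like the one described could save and reload them in exactly this shape. This is only a sketch of the persistence side; the `messages=` parameter itself is the hypothetical API being asked for here:

```python
import json

# Hypothetical persisted shape: a list of [role, content] pairs,
# matching the messages= sketch above.
conversation = [
    ["system", "You are a helpful assistant"],
    ["user", "Three terrific names for a pet skunk"],
    ["assistant", "Stinky, Pepe and Odie"],
]

# A persistence layer could save and restore the conversation via JSON.
saved = json.dumps(conversation)
restored = json.loads(saved)
assert restored == conversation
```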