Missing necessary newline in st.chat_message
#7978
Comments
@ShaneTian Thanks for reporting this issue. I was able to reproduce it here. The problem comes from the markdown rendering.
Btw. a workaround is to add spaces before all newline symbols, e.g. via a string replacement on the message, as in the sketch below.
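A minimal sketch of that workaround, assuming the model reply is held in a plain string (the variable names and message text here are illustrative):

```python
import streamlit as st

# Illustrative model reply; single newlines are soft breaks in Markdown,
# so the renderer collapses them into spaces.
reply = "First line of the answer\nSecond line of the answer\nThird line of the answer"

# Workaround: put two trailing spaces before every newline so Markdown
# renders a hard line break instead of collapsing it.
patched = reply.replace("\n", "  \n")

with st.chat_message("assistant"):
    st.markdown(patched)
```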
Yes, it is a solution, but it is not a permanent one.
There are 62 likes on a linked issue and multiple posts about this - when you ask for likes, do likes from linked discussions thread through to your analytics? You may be underestimating the importance if not. I think anyone dealing with generated output is going to find this to be a gigantic headache, not a trivial problem.
Checklist
Summary
When I use st.chat_message in an LLM demo, I found the output message was missing some of the necessary line breaks. The assistant message below is from GPT-4, so it is a common phenomenon.
Reproducible Code Example
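A minimal sketch of the kind of code that triggers this behavior; the message text and variable names are illustrative, not the reporter's original:

```python
import streamlit as st

# Illustrative assistant reply in the style the issue describes:
# sentences separated by single newlines.
assistant_reply = (
    "Sure, here is a short summary:\n"
    "The first point covers installation.\n"
    "The second point covers configuration."
)

with st.chat_message("assistant"):
    # The single newlines are treated as soft breaks by the Markdown
    # renderer, so the three lines run together into one paragraph.
    st.write(assistant_reply)
```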
Steps To Reproduce
No response
Expected Behavior
Current Behavior
Is this a regression?
Debug info
Additional Information
No response