UPSTREAM PR #16618: webui: add OAI-Compat Harmony tool-call streaming visualization and persistence in chat UI #200
base: main
Conversation
Access the complete analysis in the LOCI Dashboard

Performance Analysis Summary

Overview: Analysis of version
Key Findings:
- Performance Metrics:
- Core Function Impact:
- Power Consumption Analysis:
- Assembly and Control Flow Analysis:
- GitHub Code Review Insights:
- Conclusion:
ef7ca13 to c65ae84 Compare
…and persistence in chat UI

- Purely visual and diagnostic change; no effect on model context, prompt construction, or inference behavior
- Captured assistant tool-call payloads during streaming and non-streaming completions, and persisted them in chat state and storage for downstream use
- Exposed parsed tool-call labels beneath the assistant's model info line, with a graceful fallback when parsing fails
- Added tool-call badges beneath assistant responses that expose JSON tooltips and copy their payloads when clicked, matching the existing model badge styling
- Added a user-facing setting to toggle tool-call visibility in the Developer settings section, directly under the model selector option
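The streamed tool-call capture described above can be sketched roughly as follows. The delta field names (`index`, `id`, `function.name`, `function.arguments`) follow the OpenAI-compatible chat-completions streaming format, where argument JSON arrives as concatenated string fragments; the type and function names here are hypothetical, not the PR's actual identifiers:

```typescript
// Hypothetical sketch of accumulating OpenAI-compatible streamed tool-call
// deltas into complete tool calls. Names like ToolCallDelta and
// accumulateToolCalls are illustrative, not llama.cpp webui identifiers.

interface ToolCallDelta {
  index: number;                                   // slot of the tool call in the message
  id?: string;                                     // present on the first delta for a call
  function?: { name?: string; arguments?: string } // name/argument fragments
}

interface ToolCall {
  id: string;
  name: string;
  arguments: string; // JSON string, concatenated from streamed fragments
}

function accumulateToolCalls(chunks: ToolCallDelta[][]): ToolCall[] {
  const calls: ToolCall[] = [];
  for (const chunk of chunks) {
    for (const d of chunk) {
      // Create the slot on first sight, then append fragments to it.
      const call = (calls[d.index] ??= { id: '', name: '', arguments: '' });
      if (d.id) call.id = d.id;
      if (d.function?.name) call.name += d.function.name;
      if (d.function?.arguments) call.arguments += d.function.arguments;
    }
  }
  return calls;
}
```

Persisting the accumulated `ToolCall[]` alongside the assistant message would then let the UI re-render badges after a reload without re-parsing the stream.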
…atMessageAssistant.svelte Co-authored-by: Aleksander Grygier <[email protected]>
…atMessageAssistant.svelte Co-authored-by: Aleksander Grygier <[email protected]>
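The "graceful fallback when parsing fails" behavior for the tool-call labels might look roughly like this hypothetical helper; the function name and label format are illustrative, not taken from the PR:

```typescript
// Hypothetical helper: build a human-readable label for a tool-call badge,
// falling back to the bare tool name when the arguments are not valid JSON.
function toolCallLabel(name: string, args: string): string {
  try {
    const parsed = JSON.parse(args);
    const keys = Object.keys(parsed).join(', '); // show argument names only
    return `${name}(${keys})`;
  } catch {
    return name; // graceful fallback when parsing fails
  }
}
```

Keeping the raw `args` string around regardless of parse success is what allows the badge's copy-on-click behavior to hand back the exact payload.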
0ba18eb to 73e4023 Compare
Access the complete analysis in the LOCI Dashboard

Performance Analysis Summary

Overview: Analysis of version
Performance Metrics:
- Highest Response Time Change:
- Highest Throughput Change:
- Power Consumption Analysis:
Core Function Impact Assessment:
- No Core Function Changes Detected:
- Inference Performance Impact:
Technical Analysis:
- Flame Graph Analysis:
- CFG Comparison:
- Code Review Findings:
Conclusion: The performance analysis indicates stable system behavior, with variations well within measurement tolerance. No actionable optimizations are required, as the detected changes represent measurement variance rather than functional regressions.
Access the complete analysis in the LOCI Dashboard

Performance Analysis Summary

Overview: Analysis of version
Key Findings:
- Highest Performance Changes:
- Core Function Impact Assessment:
- Power Consumption Analysis:
- Flame Graph and CFG Analysis:
- GitHub Code Review Insights:
- Conclusion:
0f3e62f to a483926 Compare
2baff0f to 92ef8cd Compare
Mirrored from ggml-org/llama.cpp#16618
Closes ggml-org/llama.cpp#16597