bug: Model using CPU instead of GPU #4053
Labels
- category: observability (Logs, telemetry, crash reports)
- P1: important (Important feature / fix)
- type: bug (Something isn't working)
Milestone
Jan version: 0.5.8
Describe the Bug
https://discord.com/channels/1107178041848909847/1306684719181729802
The app detects the NVIDIA RTX 3060 GPU but fails to use it, defaulting to CPU for model inference. When attempting to run models:
Steps to Reproduce
Screenshots / Logs
Device: Dell XPS 9710 laptop
CPU: Intel i7-11800H
GPU: NVIDIA RTX 3060
OS: Windows
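As a quick way to confirm whether inference is actually landing on the GPU, one can poll nvidia-smi for active compute processes while a model is loaded. A minimal sketch (the helper name is ours, not part of Jan; it assumes the NVIDIA driver's nvidia-smi tool is on PATH):

```python
import shutil
import subprocess

def gpu_compute_processes():
    """Return the names of processes currently running CUDA compute
    work on the GPU, or None if nvidia-smi is not available."""
    if shutil.which("nvidia-smi") is None:
        return None  # driver tools not installed or not on PATH
    result = subprocess.run(
        ["nvidia-smi",
         "--query-compute-apps=process_name",
         "--format=csv,noheader"],
        capture_output=True, text=True)
    # One process name per line; an empty list means nothing is
    # using the GPU for compute, i.e. inference is on the CPU.
    return [line.strip() for line in result.stdout.splitlines()
            if line.strip()]
```

If Jan's inference backend (e.g. a llama.cpp server process) does not appear in this list while a model is generating tokens, the workload is running on the CPU despite the GPU being detected.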