[5464088][fix] Enhance LoRA support in PyTorch model configuration
- Added logging for the dtype cast in LoraLayer so that casts made for FP16/BF16 compatibility are visible in the run log.
- Updated the model configuration to derive the number of LoRA adapters from the model label, improving flexibility in adapter management (see the sketch after this list).
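A minimal sketch of the two changes, not the actual diff: the `LoraLayer` constructor, the `lora_N`-style label suffix, and the helper `num_lora_adapters_from_label` are all assumptions made for illustration.

```python
import logging

import torch
import torch.nn as nn

logger = logging.getLogger(__name__)


class LoraLayer(nn.Module):
    """Illustrative LoRA layer showing the dtype-cast logging."""

    def __init__(self, in_features: int, out_features: int, rank: int = 8):
        super().__init__()
        self.lora_a = nn.Linear(in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, out_features, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weight_dtype = self.lora_a.weight.dtype
        if x.dtype != weight_dtype:
            # Log the cast so FP16/BF16 dtype mismatches are visible in the log.
            logger.info(
                "LoraLayer: casting input from %s to %s", x.dtype, weight_dtype
            )
            x = x.to(weight_dtype)
        return self.lora_b(self.lora_a(x))


def num_lora_adapters_from_label(model_label: str) -> int:
    """Derive the adapter count from a label such as 'llama-7b-lora_4' (assumed format)."""
    for part in model_label.split("-"):
        if part.startswith("lora_"):
            return int(part[len("lora_"):])
    return 0  # no LoRA suffix in the label -> no adapters
```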
Signed-off-by: Venky Ganesh <[email protected]>