When I set the LoRA to the LCM 1.5 LoRA, I get this error:
LCM is not supported in this project.
It looks like you are running into a dimension mismatch while using a deep learning model, specifically during weight conversion. Based on your description, the problem likely occurs in the downsampling and upsampling operations, where the weight dimensions do not match what is expected.

Regarding the specific issues you mentioned and the parts that may need changing:

Weight dimensions: you mentioned weight shapes of [64, 320, 3, 3] and [320, 64, 1, 1]. In the latter, the last two dimensions are 1, which may be the source of the dimension-mismatch error. Normally, a convolution's weight shape is [out_channels, in_channels, kernel_height, kernel_width]; make sure kernel_height and kernel_width are not 1 unless the design specifically calls for it.

Suggested fixes: depending on your situation, you may need to adjust the model design or the way the weights are initialized:

- Check the model definition: confirm that your model definition matches what you expect, especially in the design of the downsampling and upsampling modules.
- Weight initialization: if you use custom downsampling or upsampling modules, make sure the weights are initialized correctly and meet the basic requirements of the convolution operation.
- LCM LoRA implementation: regarding the LCM LoRA implementation and the project-related questions you mentioned, wrappers usually exist to simplify usage or provide a higher level of abstraction, but the concrete implementation varies by project. If the wrapping is so opaque that you cannot find the actual implementation, consider reading the documentation or the source code to make sure you are using the right functionality and APIs.

If you can share more background about the project or the code, I can help you more concretely.
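For context on the reported shapes: in kohya-style conv LoRA checkpoints, a `lora_down` weight of shape [rank, in_ch, kh, kw] is typically paired with a 1x1 `lora_up` weight of shape [out_ch, rank, 1, 1], and the two are contracted over the rank dimension to produce a full-size delta weight. A minimal numpy sketch of that merge (this is an illustration under that assumption, not this project's code; the values are random placeholders):

```python
import numpy as np

rank, in_ch, out_ch, kh, kw = 64, 320, 320, 3, 3

# Shapes reported in the issue (values are random placeholders)
lora_down = np.random.randn(rank, in_ch, kh, kw).astype(np.float32)  # [64, 320, 3, 3]
lora_up = np.random.randn(out_ch, rank, 1, 1).astype(np.float32)     # [320, 64, 1, 1]
alpha = 32.0  # scalar alpha, stored as a 0-d tensor in the checkpoint

# Merge: delta_w[o, i, h, w] = scale * sum_r up[o, r] * down[r, i, h, w]
scale = alpha / rank
delta_w = scale * np.einsum("or,rihw->oihw",
                            lora_up.reshape(out_ch, rank),
                            lora_down)

print(delta_w.shape)  # (320, 320, 3, 3) — same shape as the base conv's weight
```

Under this layout, the pair of shapes in the error message is internally consistent (rank 64), so a mismatch would more likely come from how the loading code interprets the 1x1 up weight than from the checkpoint itself.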
When I set the LoRA to the LCM 1.5 LoRA, I get this error:
![image](https://private-user-images.githubusercontent.com/48466610/325562276-466e52e1-1ab6-4c3c-8629-4e9d061d559d.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MjAxNTE1MTQsIm5iZiI6MTcyMDE1MTIxNCwicGF0aCI6Ii80ODQ2NjYxMC8zMjU1NjIyNzYtNDY2ZTUyZTEtMWFiNi00YzNjLTg2MjktNGU5ZDA2MWQ1NTlkLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDA3MDUlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQwNzA1VDAzNDY1NFomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWNmNjM3MWFlMTU2ZmFiZjc2M2ZmMWZhNzI4ZjQ0MzhhNGJiMWE2ZTMzODQ4YzY0MDA5NDRmNWEzOGI2ZjNmYjImWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0JmFjdG9yX2lkPTAma2V5X2lkPTAmcmVwb19pZD0wIn0.1VKJyErRf3EXlfqNWZiohGiIPOs2GkYTg69XqlXM5PQ)
```
lora_unet_down_blocks_0_downsamplers_0_conv.alpha: torch.Size([])
lora_unet_down_blocks_0_downsamplers_0_conv.lora_down.weight: torch.Size([64, 320, 3, 3])
lora_unet_down_blocks_0_downsamplers_0_conv.lora_up.weight: torch.Size([320, 64, 1, 1])
```
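When debugging mismatches like this, it can help to walk the LoRA state dict and confirm that every `lora_down`/`lora_up` pair agrees on the rank before blaming the checkpoint. A minimal sketch (the key-naming convention is taken from the dump above; the helper itself is hypothetical, not part of this repo):

```python
import numpy as np

def check_lora_pairs(state_dict):
    """Report lora_down/lora_up pairs whose rank dimensions disagree."""
    problems = []
    for key, down in state_dict.items():
        if not key.endswith(".lora_down.weight"):
            continue
        up_key = key.replace(".lora_down.weight", ".lora_up.weight")
        up = state_dict.get(up_key)
        if up is None:
            problems.append(f"missing pair for {key}")
            continue
        # down: [rank, in_ch, kh, kw]; up: [out_ch, rank, 1, 1] for conv layers
        if down.shape[0] != up.shape[1]:
            problems.append(f"rank mismatch: {key} {down.shape} vs {up_key} {up.shape}")
    return problems

# Dummy weights with the shapes from the issue
sd = {
    "lora_unet_down_blocks_0_downsamplers_0_conv.lora_down.weight":
        np.zeros((64, 320, 3, 3)),
    "lora_unet_down_blocks_0_downsamplers_0_conv.lora_up.weight":
        np.zeros((320, 64, 1, 1)),
}
print(check_lora_pairs(sd))  # [] — the reported pair is internally consistent
```

An empty result here would suggest the checkpoint's shapes are self-consistent and that the failure occurs in how this project maps those keys onto its own modules.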