ImageNet Training Dataset Preprocessing #156
Comments
Hi Stephen, you can refer to the README here to extract latent data: efficientvit/applications/dc_ae/README.md, line 190 (commit 5dd097d).
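For a rough picture of what that extraction amounts to, here is a minimal sketch that encodes a folder of images with a pretrained DC-AE and stacks the latents into a single .npy file. It is only an illustration, not the repo's actual extraction script (follow the README pointer above for that); the `DCAE_HF` entry point, the checkpoint name, the folder layout, and the output path should all be treated as assumptions.

```python
# Minimal sketch (not the repo's extraction script): encode a folder of images
# with a pretrained DC-AE and stack the latents into one .npy file.
# Checkpoint name, folder layout, and output path are assumptions.
import glob

import numpy as np
import torch
from PIL import Image
from torchvision import transforms

from efficientvit.ae_model_zoo import DCAE_HF

device = "cuda" if torch.cuda.is_available() else "cpu"
dc_ae = DCAE_HF.from_pretrained("mit-han-lab/dc-ae-f32c32-in-1.0").to(device).eval()

# Map images to [-1, 1], the range the autoencoder expects.
transform = transforms.Compose([
    transforms.Resize(512),
    transforms.CenterCrop(512),
    transforms.ToTensor(),
    transforms.Normalize(0.5, 0.5),
])

latents = []
for path in sorted(glob.glob("imagenet_images/*.JPEG")):
    x = transform(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
    with torch.no_grad():
        z = dc_ae.encode(x)  # e.g. (1, 32, 16, 16) for an f32c32 model at 512x512
    latents.append(z.squeeze(0).cpu().numpy())

np.save("latent_imagenet_512.npy", np.stack(latents))  # shape: (N, 32, 16, 16)
```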
Hi @chenjy2003, thank you so much for the response. Would there be an easy way to fine-tune the autoencoder as well? I am thinking of training a DC-AE on the RUGD dataset, and I'm not sure if the pretrained autoencoder would work out of the box. Do you by any chance have any insights? Any pointers would be greatly appreciated! Thank you.
Hi Stephen, we tried some images from the RUGD dataset and observed that our autoencoders worked well. Here are some examples: the left part is the original image and the right part is the reconstructed image. You can also use this script to test other images.
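A minimal sketch of such a reconstruction check is below. The `DCAE_HF` entry point and the "mit-han-lab/dc-ae-f32c32-in-1.0" checkpoint name follow the DC-AE README but should be treated as assumptions; swap in your own image path.

```python
# Minimal sketch of a reconstruction sanity check on a custom image
# (e.g. a RUGD frame). Checkpoint name and image path are placeholders.
import torch
from PIL import Image
from torchvision import transforms
from torchvision.utils import save_image

from efficientvit.ae_model_zoo import DCAE_HF

device = "cuda" if torch.cuda.is_available() else "cpu"
dc_ae = DCAE_HF.from_pretrained("mit-han-lab/dc-ae-f32c32-in-1.0").to(device).eval()

transform = transforms.Compose([
    transforms.Resize(512),
    transforms.CenterCrop(512),
    transforms.ToTensor(),
    transforms.Normalize(0.5, 0.5),  # map to [-1, 1]
])
x = transform(Image.open("rugd_frame.png").convert("RGB")).unsqueeze(0).to(device)

with torch.no_grad():
    recon = dc_ae.decode(dc_ae.encode(x))

# Save original and reconstruction side by side for visual inspection.
save_image(torch.cat([x, recon], dim=-1) * 0.5 + 0.5, "reconstruction_check.png")
```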
Thank you so much for getting back to me, @chenjy2003! May I also ask what the command would be for finetuning a DiT with the pretrained autoencoder, starting from the ImageNet-pretrained DiT presented in the paper? I think the README has the command only for the UViT, not the DiT. Thank you!
That's a good point. @chenjy2003, we should add the command to train DiT-XL on ImageNet 512x512. |
@StephenYangjz Thanks for your suggestion. The training command for DiT-XL on ImageNet 512x512 is added here and here. If you want to finetune from the ImageNet-pretrained checkpoint, you can add
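As a general illustration of what finetuning from a pretrained checkpoint involves (separate from the exact option in the training command above), the usual PyTorch pattern is to load the pretrained weights into the model before training starts. In the sketch below, a toy module stands in for DiT-XL and the checkpoint path is a placeholder; it is not the repo's training entry point.

```python
# Generic warm-start pattern, not the repo's training script: load the
# ImageNet-pretrained weights into the model before launching training.
# A toy module stands in for DiT-XL; the checkpoint path is a placeholder.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(8, 8), nn.GELU(), nn.Linear(8, 8))  # stand-in for DiT-XL

ckpt = torch.load("dit_xl_imagenet512_pretrained.pt", map_location="cpu")
state_dict = ckpt.get("model", ckpt)  # weights are often nested under a "model" key
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
# ...then continue with the usual training loop / command from this initialization.
```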
Hi, I am trying to train the DC-AE using the default setup (ImageNet). I saw that ImageNet is being loaded here as an .npy file.
I am wondering how this file is prepared, and would it be possible to share a minimal working example of the file? Thank you!