
Cannot reproduce results on ImageNet1K #27

Open
techmn opened this issue Oct 22, 2023 · 0 comments

Hi,
Thanks for sharing this great work.
I am trying to reproduce the ImageNet1K results using the hyperparameter settings provided in the repo, but I cannot match the reported Acc@1 of 67.9 for SeaFormer_T; instead I get an Acc@1 of 66.23.
Here are the parameter values from the log file

aa: rand-m9-mstd0.5
amp: true
apex_amp: false
aug_repeats: 0
aug_splits: 0
batch_size: 128
bce_loss: false
bce_target_thresh: null
bn_eps: null
bn_momentum: null
channels_last: false
checkpoint_hist: 10
class_map: ''
clip_grad: null
clip_mode: norm
color_jitter: 0.4
cooldown_epochs: 10
crop_pct: null
cutmix: 0.0
cutmix_minmax: null
data_dir: ./imagenet_1k
dataset: ''
dataset_download: false
decay_epochs: 2.4
decay_rate: 0.973
dist_bn: reduce
drop: 0.2
drop_block: null
drop_connect: 0.2
drop_path: null
epoch_repeats: 0.0
epochs: 600
eval_metric: top1
experiment: SeaFormer_T
fuser: ''
gp: null
grad_checkpointing: false
hflip: 0.5
img_size: 224
initial_checkpoint: ''
input_size:
- 3
- 224
- 224
interpolation: ''
jsd_loss: false
layer_decay: null
local_rank: 0
log_interval: 50
log_wandb: false
lr: 0.064
lr_cycle_decay: 0.5
lr_cycle_limit: 1
lr_cycle_mul: 1.0
lr_k_decay: 1.0
lr_noise:
- 0.42
lr_noise_pct: 0.67
lr_noise_std: 1.0
mean: null
min_lr: 1.0e-06
mixup: 0.0
mixup_mode: batch
mixup_off_epoch: 0
mixup_prob: 1.0
mixup_switch_prob: 0.5
model: SeaFormer_T
model_ema: true
model_ema_decay: 0.9999
model_ema_force_cpu: false
momentum: 0.9
native_amp: false
no_aug: false
no_ddp_bb: false
no_prefetcher: false
no_resume_opt: false
num_classes: 1000
opt: adamw
opt_betas: null
opt_eps: 0.001
output: ./output_dir
patience_epochs: 10
pin_mem: false
pretrained: false
ratio:
- 0.75
- 1.3333333333333333
recount: 1
recovery_interval: 0
remode: pixel
reprob: 0.2
resplit: false
resume: ''
save_images: false
scale:
- 0.08
- 1.0
sched: cosine
seed: 42
smoothing: 0.1
split_bn: false
start_epoch: null
std: null
sync_bn: false
torchscript: false
train_interpolation: random
train_split: train
tta: 0
use_multi_epochs_loader: false
val_split: validation
validation_batch_size: null
vflip: 0.0
warmup_epochs: 10
warmup_lr: 1.0e-06
weight_decay: 2.0e-05
worker_seeding: all
workers: 7

Could you please tell me whether these parameter values are correct, and what I am doing wrong here? As recommended in the repo, I used 8 GPUs, but I am still unable to reproduce the results. The same happens with SeaFormer_S.
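One thing worth double-checking when comparing against the log above: with 8 GPUs and `batch_size: 128` per GPU, the effective global batch size is 1024, and a learning rate like `lr: 0.064` is typically tuned for that global batch. The sketch below is only a sanity check of that arithmetic (the linear LR-scaling rule used here is a common heuristic, not necessarily what the SeaFormer authors used):

```python
# Sanity check on the effective setup implied by the log above.
# Assumptions: 8 GPUs (as the repo recommends), and that --batch-size
# in the log is per-GPU, following the usual timm convention.

def effective_batch_size(per_gpu_batch: int, world_size: int) -> int:
    """Global batch size seen by the optimizer each step."""
    return per_gpu_batch * world_size

def linearly_scaled_lr(base_lr: float, base_batch: int, global_batch: int) -> float:
    """Linear LR-scaling heuristic: lr grows in proportion to global batch."""
    return base_lr * global_batch / base_batch

global_batch = effective_batch_size(per_gpu_batch=128, world_size=8)
print(global_batch)  # 1024

# If lr=0.064 was tuned for a 1024 global batch, training with a smaller
# world size (or per-GPU batch) without rescaling the lr would change the
# final accuracy noticeably.
print(linearly_scaled_lr(base_lr=0.064, base_batch=1024, global_batch=512))  # 0.032
```

If your launch differs in GPU count or per-GPU batch size from the assumed 128 × 8, that alone could account for an accuracy gap of this size.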
