About the comparison between AID and default data augmentation in mmpose #1064

Answered by jin-s13
WeianMao asked this question in General
It seems that in AID's paper, the models are first pre-trained on COCO for 210 epochs with the default settings, and then fine-tuned with AID for another 210 epochs. The authors report that this training strategy brings some gains, but we have not fully verified this.
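For concreteness, the two-stage recipe described above could be sketched as follows. This is a hypothetical illustration of the schedule only, not mmpose's actual config API; the names `Stage` and `build_schedule` are invented for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Stage:
    augmentation: str  # "default" or "aid" (AID = information dropping)
    epochs: int

def build_schedule(pretrain_epochs: int = 210, finetune_epochs: int = 210):
    """Return the two training stages described in the AID paper:
    COCO pre-training with default data augmentation, then
    fine-tuning with AID for the same number of epochs."""
    return [
        Stage("default", pretrain_epochs),  # stage 1: default settings
        Stage("aid", finetune_epochs),      # stage 2: AID fine-tuning
    ]

schedule = build_schedule()
total_epochs = sum(s.epochs for s in schedule)  # 210 + 210 = 420
```

In practice this would correspond to running a standard training config first, then resuming from the resulting checkpoint with the AID augmentation pipeline enabled.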

Replies: 3 comments

Answer selected by jin-s13
Labels
kind/discussion community discussion
3 participants
This discussion was converted from issue #1004 on December 07, 2021 04:33.