Hi, I recently used this code to train on my own dataset and noticed something strange on TensorBoard: my validation loss is consistently lower than my training loss, even though training accuracy is higher than validation accuracy.

The main thing I don't understand is why the validation loss stays below the training loss. The gap between the two curves remains roughly constant, as in the image below, and I can't get them to converge.

I know model.train() and model.eval() behave differently (e.g. dropout and batch norm), but I expected the curves to converge as the number of epochs increases.

Has anyone run into this before? Could it happen because the training dataset is too small?
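For context on the model.train()/model.eval() point: this gap can appear even with identical weights and identical data, simply because dropout is active in train mode. Below is a minimal sketch in plain NumPy (a hypothetical one-layer linear model with inverted dropout, not the repository's actual code) showing that the loss measured with dropout on sits above the loss measured with dropout off:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a linear "model" scored with MSE.
X = rng.normal(size=(1000, 32))
w = rng.normal(size=32)
y = X @ w          # targets the model fits exactly, so eval loss ~ 0

p = 0.5            # dropout rate

def mse_loss(X, y, w, train):
    h = X.copy()
    if train:
        # Inverted dropout: randomly zero inputs, rescale survivors by 1/(1-p).
        mask = (rng.random(h.shape) > p) / (1 - p)
        h = h * mask
    pred = h @ w
    return np.mean((pred - y) ** 2)

train_loss = mse_loss(X, y, w, train=True)   # dropout noise inflates the loss
val_loss = mse_loss(X, y, w, train=False)    # deterministic forward pass

print(train_loss > val_loss)  # same weights, same data, yet train loss is higher
```

Under this sketch's assumptions the gap never closes with more epochs, because it comes from the stochastic forward pass itself rather than from underfitting; regularization terms (e.g. weight decay) added only to the training loss have a similar effect.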