Hi,
In the "dropout from scratch" chapter, there is no significative difference between adding or not adding dropout. See metrics below:
with drop_prob=0.5:
Epoch 0. Loss: 0.7281168998689048, Train_acc 0.84805, Test_acc 0.8542
Epoch 1. Loss: 0.3915857726824017, Train_acc 0.92135, Test_acc 0.9232
Epoch 2. Loss: 0.28589152397083284, Train_acc 0.94455, Test_acc 0.9453
Epoch 3. Loss: 0.24532910775307348, Train_acc 0.95731664, Test_acc 0.9544
Epoch 4. Loss: 0.2158778892831953, Train_acc 0.96416664, Test_acc 0.9577
Epoch 5. Loss: 0.18741829898958554, Train_acc 0.9690667, Test_acc 0.9629
Epoch 6. Loss: 0.17892182123475542, Train_acc 0.97436666, Test_acc 0.9669
Epoch 7. Loss: 0.16173455391374822, Train_acc 0.97606665, Test_acc 0.9697
Epoch 8. Loss: 0.14435731281497297, Train_acc 0.97805, Test_acc 0.9715
Epoch 9. Loss: 0.13987530898537387, Train_acc 0.9802333, Test_acc 0.9728
with drop_prob=0:
Epoch 0. Loss: 0.5414872900664346, Train_acc 0.8639, Test_acc 0.8664
Epoch 1. Loss: 0.2935891257485201, Train_acc 0.91835, Test_acc 0.9187
Epoch 2. Loss: 0.1943884894438662, Train_acc 0.94588333, Test_acc 0.9443
Epoch 3. Loss: 0.16208293540151902, Train_acc 0.9608, Test_acc 0.9575
Epoch 4. Loss: 0.12486255719026844, Train_acc 0.96991664, Test_acc 0.965
Epoch 5. Loss: 0.1082405275668826, Train_acc 0.97546667, Test_acc 0.9675
Epoch 6. Loss: 0.08227119813514386, Train_acc 0.97891665, Test_acc 0.9709
Epoch 7. Loss: 0.07985300555672155, Train_acc 0.98141664, Test_acc 0.9731
Epoch 8. Loss: 0.06968508473642689, Train_acc 0.98436666, Test_acc 0.9736
Epoch 9. Loss: 0.05624080957929232, Train_acc 0.98793334, Test_acc 0.9774
It would be nice to see an improvement when using dropout compared with not using it. Or perhaps I'm missing something?
Best,
Patricia
This is just a simple example: even without dropout the model barely overfits here (at epoch 9 the train accuracy is 0.988 against 0.977 on test), so there is not much for dropout to suppress. In real projects, where models are large enough to overfit badly, dropout is important.
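For reference, here is a minimal NumPy sketch of the inverted-dropout scheme that drop_prob presumably controls in the chapter (the chapter itself works with mxnet.ndarray; the function name and usage here are illustrative, not the chapter's actual code):

```python
import numpy as np

def dropout(X, drop_prob):
    # Inverted dropout: zero each activation with probability
    # drop_prob, then rescale the survivors by 1/(1 - drop_prob)
    # so the expected value of every activation is unchanged.
    assert 0.0 <= drop_prob < 1.0
    if drop_prob == 0.0:
        return X  # drop_prob=0 is an exact no-op (the second run above)
    mask = np.random.uniform(0.0, 1.0, size=X.shape) > drop_prob
    return mask * X / (1.0 - drop_prob)

# Applied only at training time, e.g. h = dropout(h, drop_prob);
# at test time activations are passed through unchanged.
```

Since drop_prob=0 reduces to the identity, the two runs above are a fair ablation. If you want to see dropout pull ahead, the usual trick is to give the network more room to overfit, for example by training on a smaller subset of the data or for many more epochs.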