
Consistent loss and acc difference between shadow models and target model #158

Open
henrikfo opened this issue Oct 10, 2024 · 1 comment
Labels: bug (Something isn't working)

@henrikfo (Collaborator)

Issue

Problem Description

There is a consistent loss and accuracy difference between the shadow models and the target model during training.

Expected Behavior

Over many runs of LeakPro, training the target and shadow models from scratch, they should end up with the same average loss and accuracy.
Currently the target model is on average 2-3 percentage points better than the average of the trained shadow models.
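A toy check of the expected behavior, with illustrative numbers (not real measurements): averaged over from-scratch runs, the target's accuracy should match the shadow models' mean accuracy, but here it sits a few percentage points above it.

```python
from statistics import mean

# Illustrative accuracies only, not real LeakPro results.
target_acc = [0.91, 0.90, 0.92]          # one value per target-model run
shadow_acc = [0.88, 0.89, 0.87, 0.90]    # one value per shadow model

# Expected: gap ~ 0. Observed in this issue: roughly 0.02-0.03.
gap = mean(target_acc) - mean(shadow_acc)
print(f"target - shadow accuracy gap: {gap:.3f}")
```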

What Needs to be Done

Investigate the differences in the training loop, the amount of training data, which data the target model is trained on, etc.
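One way to start the investigation is to compare the training-set sizes and index overlap between the target and each shadow model. This is a hypothetical sketch: the names `target_indices` and `shadow_indices_list` are placeholders, not LeakPro's actual API.

```python
# Hypothetical diagnostic: if the target trains on noticeably more data
# than the shadow models, that alone could explain a consistent
# accuracy gap of a few percentage points.

def compare_training_data(target_indices, shadow_indices_list):
    """Report per-shadow-model size differences and overlap with the
    target model's training set."""
    target = set(target_indices)
    report = []
    for i, shadow in enumerate(map(set, shadow_indices_list)):
        overlap = len(target & shadow) / max(len(shadow), 1)
        report.append({
            "shadow_model": i,
            "n_train_target": len(target),
            "n_train_shadow": len(shadow),
            "overlap_with_target": round(overlap, 3),
        })
    return report

# Toy example: the target sees twice as many samples as each shadow model.
print(compare_training_data(range(1000), [range(500), range(250, 750)]))
```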

How Can It Be Tested or Reproduced

Run LeakPro with retrain=True, training the target model and some shadow models for 6 or more epochs, and repeat a few times.

@henrikfo henrikfo added the bug Something isn't working label Oct 10, 2024
@henrikfo henrikfo self-assigned this Oct 10, 2024
@henrikfo
Copy link
Collaborator Author

The target model is not using the same indices across multiple runs; the optimizer configuration is set correctly. One hint might be that the target model takes less than 50% of the time to train one epoch.
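The epoch-time hint can be quantified with a small helper that walks one epoch of a dataloader and records wall time, batch count, and sample count. This is a generic sketch, not LeakPro code; if the target's epoch shows roughly half the batches of a shadow model's, it is iterating over less data.

```python
import time

def epoch_stats(dataloader, step_fn=None):
    """Walk one epoch of an iterable of batches, returning wall time,
    batch count, and sample count. step_fn, if given, is called on each
    batch (e.g. the actual training step) so timing reflects real work."""
    start = time.perf_counter()
    n_batches = n_samples = 0
    for batch in dataloader:
        if step_fn is not None:
            step_fn(batch)
        n_batches += 1
        n_samples += len(batch)
    return {
        "seconds": time.perf_counter() - start,
        "batches": n_batches,
        "samples": n_samples,
    }

# Compare e.g. epoch_stats(target_loader) against epoch_stats(shadow_loader):
# a ~2x difference in "batches"/"samples" would match the <50% epoch time.
```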
