
fraction proposal network of FQF #6

Open
zhangpaipai opened this issue Jun 16, 2022 · 0 comments

Comments

@zhangpaipai

Hi,
I have some questions about the fraction proposal network of FQF:
1. Why set fraction_lr = 5e-5 * fqf_factor (0.000001) = 5e-11, which is very small? I also found that the tau_hats distribution barely changed during training.
2. Why apply initialize_weights_xavier(x, gain=0.01)? When I trained without this initialization, gradient explosion sometimes happened.
3. Why use RMSprop with alpha=0.95 and eps=0.00001, when the default values are 0.99 and 1e-8 respectively?
4. As in point 1, the tau_hats distribution barely changed while training on Qbert. Is the fraction proposal network the key to this algorithm?
Thanks!
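For context, here is a minimal PyTorch sketch of the setup the questions refer to: a linear fraction proposal layer with the small-gain Xavier initialization and the non-default RMSprop settings mentioned above. The class and function names, the 64-dim embedding, and N = 32 fractions are illustrative assumptions, not the repo's actual code.

```python
import torch
import torch.nn as nn
import torch.optim as optim


def initialize_weights_xavier(m, gain=1.0):
    # Xavier init with a configurable gain; the small gain (0.01) keeps the
    # initial fraction logits near zero, so the proposed taus start close to
    # a uniform grid.
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight, gain=gain)
        if m.bias is not None:
            nn.init.constant_(m.bias, 0.0)


class FractionProposalNetwork(nn.Module):
    """Proposes N quantile fractions tau_1..tau_N from a state embedding."""

    def __init__(self, num_taus=32, embedding_dim=64):
        super().__init__()
        self.net = nn.Linear(embedding_dim, num_taus)
        self.net.apply(lambda m: initialize_weights_xavier(m, gain=0.01))

    def forward(self, state_embedding):
        logits = self.net(state_embedding)
        probs = torch.softmax(logits, dim=-1)
        # Cumulative sum of the softmax output gives monotonically
        # increasing fractions in (0, 1]; prepend tau_0 = 0.
        taus = torch.cumsum(probs, dim=-1)
        taus = torch.cat([torch.zeros_like(taus[..., :1]), taus], dim=-1)
        # Midpoints tau_hat_i = (tau_i + tau_{i+1}) / 2 are where the
        # quantile function is evaluated.
        tau_hats = (taus[..., :-1] + taus[..., 1:]) / 2.0
        return taus, tau_hats


fpn = FractionProposalNetwork()
# The very small fraction learning rate and non-default RMSprop settings
# the questions above ask about:
fraction_optim = optim.RMSprop(
    fpn.parameters(), lr=5e-5 * 1e-6, alpha=0.95, eps=1e-5)
```

With gain=0.01, the initial logits are near zero, so the softmax is nearly uniform and the initial tau_hats sit close to an evenly spaced grid; combined with lr = 5e-11 they would move very little, which may explain the observation in points 1 and 4.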
