
Distance scaling in ExpNorm RBFs #28

Closed
nec4 opened this issue Jul 4, 2021 · 6 comments
Assignees
Labels
enhancement New feature or request

Comments

@nec4
Contributor

nec4 commented Jul 4, 2021

I have noticed that the ExpNorm basis functions might benefit from a new parameter that rescales the distances inside the inner exponential. Consider the default setup, where the distances are not rescaled and an upper cutoff of 35 angstroms is used (first figure): while there is a good density of basis functions at small distances, there are only a few at longer distances. By contrast, rescaling the distances by some factor, say alpha=0.2, yields a much more even distribution of basis functions without changing anything else (second figure). I think this might be at odds with how the distance expansions are implemented in the TorchMD_GN class, because it always expects a fixed number of parameters common to all distance expansion types. If this is desirable, I would be happy to make a PR and implement tests.

(Screenshots: ExpNorm basis functions over distance with default scaling, and with the distances rescaled by alpha=0.2.)
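For concreteness, the rescaling can be sketched like this (a minimal NumPy illustration, not the torchmd-net implementation; the extra `alpha` parameter and the exact mean/width initialization are assumptions based on the PhysNet-style ExpNorm form):

```python
import numpy as np

def expnorm_rbf(d, num_rbf=8, cutoff_lower=0.0, cutoff_upper=35.0, alpha=1.0):
    """PhysNet-style ExpNorm radial basis functions, with a hypothetical
    `alpha` that rescales the distances inside the inner exponential."""
    # Spread the means uniformly over the range of exp(-alpha * d)
    start = np.exp(-alpha * (cutoff_upper - cutoff_lower))
    means = np.linspace(start, 1.0, num_rbf)
    # One shared width, chosen so neighboring basis functions overlap
    beta = ((2.0 / num_rbf) * (1.0 - start)) ** -2
    inner = np.exp(-alpha * (d - cutoff_lower))
    return np.exp(-beta * (inner[..., None] - means) ** 2)

d = np.linspace(0.0, 35.0, 200)
phi_default = expnorm_rbf(d, alpha=1.0)  # basis functions crowd at short range
phi_scaled = expnorm_rbf(d, alpha=0.2)   # more even coverage of [0, 35]
```

With alpha=1 the peak of each basis function sits at d = -ln(mu)/1, so almost all of them land below ~2 Å; with alpha=0.2 those peaks spread out by a factor of five.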

@nec4 nec4 added the enhancement New feature or request label Jul 4, 2021
@nec4 nec4 self-assigned this Jul 4, 2021
@PhilippThoelke
Collaborator

This would be a nice addition! Is there a way to compute a good value for alpha from the cutoff range? Do you have an idea how to cleanly implement this additional argument for ExpNorm? It might then make sense to create the RBF module inside create_model instead of in the model's __init__ to avoid redundancy.

@nec4
Contributor Author

nec4 commented Jul 6, 2021

I think that is actually a problem. I'm not sure of an intuitive way to choose alpha without visually checking plots of the basis functions over distance. To me, it makes sense to leave the default value at 1 for consistency with previous works like PhysNet, but I would have to think a bit about how to choose alpha systematically based on the choice of cutoffs.

@PhilippThoelke
Collaborator

For now, we could also add a special rbf_arg option to the argparse setup in train.py and pass it to the constructors of the different RBF expansion classes. Each class can then handle it individually, or simply ignore it if it is not needed. This would be similar to how we are currently passing arguments to the dataset classes.

We have this argument:
https://github.com/compsciencelab/torchmd-net/blob/1759652a89c82dc90671a028316b015e1a27a2d5/scripts/train.py#L60
And it is passed to all dataset classes by default:
https://github.com/compsciencelab/torchmd-net/blob/1759652a89c82dc90671a028316b015e1a27a2d5/torchmdnet/data.py#L32-L35
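A rough sketch of that pass-through idea (the class names here mirror torchmd-net's RBF expansion classes, but the `--rbf-arg` flag and these stripped-down constructors are hypothetical):

```python
import argparse

# Hypothetical: a generic rbf_arg is forwarded to every RBF class,
# which can consume it or ignore it.
parser = argparse.ArgumentParser()
parser.add_argument('--rbf-arg', type=float, default=None,
                    help='extra parameter forwarded to the RBF expansion')

class GaussianSmearing:
    def __init__(self, cutoff_upper, rbf_arg=None):
        # this expansion has no use for rbf_arg, so it is ignored
        self.cutoff_upper = cutoff_upper

class ExpNormalSmearing:
    def __init__(self, cutoff_upper, rbf_arg=None):
        # interpret rbf_arg as the distance-rescaling factor alpha
        self.alpha = rbf_arg if rbf_arg is not None else 1.0
        self.cutoff_upper = cutoff_upper

args = parser.parse_args(['--rbf-arg', '0.2'])
rbf = ExpNormalSmearing(cutoff_upper=35.0, rbf_arg=args.rbf_arg)
```

The trade-off is that a single untyped extra argument keeps the constructor signatures uniform, at the cost of each class documenting what it means by rbf_arg.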

@nec4
Contributor Author

nec4 commented Jul 18, 2021

Hello - sorry for the delay on this. It turns out there is a very simple way to tune $\alpha$: it is just $\alpha = 5 / (\text{cutoff\_upper} - \text{cutoff\_lower})$. The 5 is chosen to reproduce the default behavior from PhysNet. Here are a few examples where only the cutoff distances are varied:
(Screenshot: basis functions for several different cutoff ranges, using the proposed $\alpha$.)

I will make a branch and open a PR, and if you like it we can discuss further.
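As a quick sketch, the rule above amounts to the following (the helper name `expnorm_alpha` is made up for illustration; only the formula comes from the discussion):

```python
def expnorm_alpha(cutoff_lower, cutoff_upper):
    """Distance-rescaling factor for the ExpNorm RBFs:
    alpha = 5 / (cutoff_upper - cutoff_lower), where the constant 5
    is chosen to reproduce the default PhysNet behavior."""
    return 5.0 / (cutoff_upper - cutoff_lower)

expnorm_alpha(0.0, 10.0)  # 0.5
expnorm_alpha(0.0, 35.0)  # 5/35, roughly 0.143
```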

@giadefa
Contributor

giadefa commented Jul 18, 2021 via email

@PhilippThoelke
Collaborator

Looks good, thanks!
