forked from ml-jku/UPT

Code for the paper Universal Physics Transformers


LeeLizuoLiu/UPT

 
 



[Project Page] [Paper (arXiv)] [BibTeX]

Train your own models

Instructions for setting up the codebase in your own environment are provided in SETUP_CODE and SETUP_DATA.

Configurations to train models can be found here.

Citation

If you like our work, please consider giving it a star ⭐ and citing us:

```bibtex
@article{alkin2024upt,
      title={Universal Physics Transformers},
      author={Benedikt Alkin and Andreas Fürst and Simon Schmid and Lukas Gruber and Markus Holzleitner and Johannes Brandstetter},
      journal={arXiv preprint arXiv:2402.12365},
      year={2024}
}
```
