Lagrangian Neural Networks

Miles Cranmer, Sam Greydanus, Stephan Hoyer, Peter Battaglia, David Spergel, Shirley Ho

Accepted to the ICLR 2020 Workshop on Deep Differential Equations

![overall-idea](overall-idea.png)

Warning: To use our implementation with more recent versions of JAX, change `jax.experimental.stax` to `jax.example_libraries.stax` and `jax.experimental.optimizers` to `jax.example_libraries.optimizers`. Please raise an issue if you run into other deprecated functionality.
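
For example, a minimal sketch of the updated import on newer JAX releases (the exact set of imports used in the code may differ):

```python
# Newer JAX releases moved these modules out of jax.experimental:
#   old: from jax.experimental import stax, optimizers
from jax.example_libraries import stax, optimizers
```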

Summary

In this project we propose Lagrangian Neural Networks (LNNs), which can parameterize arbitrary Lagrangians using neural networks. In contrast to Hamiltonian Neural Networks, these models do not require canonical coordinates and perform well in situations where generalized momentum is difficult to compute (e.g., the double pendulum). This is particularly appealing for use with a learned latent representation, a case where HNNs struggle. Unlike previous work on learning Lagrangians, LNNs are fully general and extend to non-holonomic systems such as the 1D wave equation.
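
To make the mechanism concrete, here is a minimal JAX sketch (not the repository's actual API) of how a scalar Lagrangian L(q, q_dot) is converted into accelerations by solving the Euler-Lagrange equations with automatic differentiation; the toy quadratic Lagrangian below is only a stand-in for the learned neural network:

```python
import jax
import jax.numpy as jnp

def toy_lagrangian(q, q_dot):
    # Stand-in for a learned neural-network Lagrangian: a unit-mass particle
    # in a quadratic potential, L = T - V = 0.5*|q_dot|^2 - 0.5*|q|^2.
    return 0.5 * jnp.dot(q_dot, q_dot) - 0.5 * jnp.dot(q, q)

def acceleration(lagrangian, q, q_dot):
    # Euler-Lagrange equations: (d2L/dq_dot2) q_ddot + (d2L/(dq_dot dq)) q_dot = dL/dq,
    # solved for q_ddot.
    hess = jax.hessian(lagrangian, argnums=1)(q, q_dot)                       # d2L/dq_dot2
    mixed = jax.jacfwd(jax.grad(lagrangian, argnums=1), argnums=0)(q, q_dot)  # d2L/(dq_dot dq)
    grad_q = jax.grad(lagrangian, argnums=0)(q, q_dot)                        # dL/dq
    return jnp.linalg.solve(hess, grad_q - mixed @ q_dot)

q, q_dot = jnp.array([1.0]), jnp.array([0.0])
print(acceleration(toy_lagrangian, q, q_dot))  # ~[-1.0], i.e. q_ddot = -q for this toy Lagrangian
```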

| | Neural Networks | Neural ODEs | HNN | DLN (ICLR'19) | LNN (this work) |
|---|---|---|---|---|---|
| Learns dynamics | ✔️ | ✔️ | ✔️ | ✔️ | ✔️ |
| Learns continuous-time dynamics | | ✔️ | ✔️ | ✔️ | ✔️ |
| Learns exact conservation laws | | | ✔️ | ✔️ | ✔️ |
| Learns from arbitrary coordinates | ✔️ | ✔️ | | ✔️ | ✔️ |
| Learns arbitrary Lagrangians | | | | | ✔️ |

Dependencies

- JAX
- NumPy
- MoviePy (for visualization)
- celluloid (for visualization)

This project is written in Python 3.