
Marathon Man

This repository explores the application of deep reinforcement learning to physics-based animation. It contains a set of high-dimensional continuous control benchmarks built on Unity’s native physics simulator, PhysX. The environments can be trained with Unity ML-Agents or with any OpenAI Gym-compatible algorithm (a minimal usage sketch follows the list below). This project may be useful for:

  • Video game researchers interested in applying bleeding-edge robotics research to locomotion and AI for video games.
  • Academic researchers looking to leverage the strengths of Unity and ML-Agents, along with the body of existing research and benchmarks provided by projects such as the DeepMind Control Suite or the OpenAI MuJoCo environments.
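
As an example of the Gym-compatible interface, here is a minimal, illustrative sketch (not taken from this repository). It assumes a single-agent environment binary (the path envs/MarathonEnvs is a placeholder) and the mlagents_envs and gym_unity Python packages that ship with ML-Agents.

```python
# Illustrative sketch only (not part of this repository): drive a single-agent
# environment build through the OpenAI Gym interface using ML-Agents' wrapper.
from mlagents_envs.environment import UnityEnvironment
from gym_unity.envs import UnityToGymWrapper

# "envs/MarathonEnvs" is a placeholder path to a compiled environment binary.
unity_env = UnityEnvironment(file_name="envs/MarathonEnvs")
env = UnityToGymWrapper(unity_env)

obs = env.reset()
for _ in range(1000):
    action = env.action_space.sample()          # random actions, just to exercise the API
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()
```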

The Unity project has two parts. Both can be found in UnitySDK > Assets:

  • The folder MarathonEnvs contains several physics-based animation benchmarks, implemented on the basis of different papers in the field. More details on the environments can be found here, and instructions on how to train them here (an illustrative training sketch also follows this list).

MarathonEnvs

  • The folder MarathonController contains resources to take a skinned character driven by a typical controller, such as Mecanim or motion matching, and generate a training environment from it. Further details can be found here.

Example-current-status

  • There are also instructions on how to export the outcome of the training here.
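
For context, training one of these environments with a Gym-compatible algorithm might look roughly like the sketch below. It is illustrative rather than taken from this repository: PPO from stable-baselines3 is an external library used here only as an example, and the build path and hyperparameters are placeholders; the training instructions linked above remain the reference.

```python
# Illustrative sketch only: train a Gym-wrapped environment build with a
# Gym-compatible algorithm. stable-baselines3 is an external library, not part
# of this repository; the build path and hyperparameters are placeholders.
from mlagents_envs.environment import UnityEnvironment
from gym_unity.envs import UnityToGymWrapper
from stable_baselines3 import PPO

unity_env = UnityEnvironment(file_name="envs/Hopper")  # placeholder build path
env = UnityToGymWrapper(unity_env)

model = PPO("MlpPolicy", env, verbose=1)
model.learn(total_timesteps=1_000_000)
model.save("hopper_ppo")  # saves the learned policy for later use

env.close()
```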

1. Getting started

If you have further questions, feel free to join our Discord server.

2. Contributors

v4.0 was created by:

v3.0 was created by:

Note: This project is the result of contributions from members of the Unity community (see below) who actively maintain the repository. As such, the contents of this repository are not officially supported by Unity Technologies.

3. Open issues

Currently, our main challenge is that the results still come with a "department of silly walks" effect (something that obviously does not appear in the demos of the papers). It is a significant problem, and we need to solve it before the results can be used in practice.

Weird Walks

4. Publications

5. Licensing

The project is licensed under the Apache License, Version 2.0, January 2004 (http://www.apache.org/licenses/LICENSE-2.0), with a single exception: the motion data for the quadruped is adapted from data available under the terms of the Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) license, as stated in its README.

6. References

Document last updated: 11.05.2021