# RL-with-CBF

Install the dependencies:

```shell
pip install -r requirements.txt
```

To run the PPO code (tested on the CartPole environment, which has a discrete action space):

```shell
python ppo.py --track --capture-video
```

For a continuous action space:

```shell
python ppo_continous.py --track --capture-video
```

Use the Weights & Biases (wandb) library to visualise the loss curves and model performance.

To test on custom environments:

  • Create an environment in the OpenAI Gym format
  • Wrap it using the gym.make() function and check it for Gym compatibility
  • Vectorise the environment and run the PPO code
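The steps above can be sketched as follows. This is a minimal, dependency-free illustration of the classic Gym interface (`reset`/`step`) and of what vectorisation means; `GridWalkEnv`, its dynamics, and `SimpleVecEnv` are illustrative names, not part of this repo. In practice you would subclass `gym.Env`, register the environment, create it with `gym.make()`, and use Gym's own vector wrappers before handing it to the PPO code.

```python
class GridWalkEnv:
    """Toy 1-D walk: start at 0, reach position `goal` within `max_steps`.

    Follows the classic OpenAI Gym step/reset convention:
    step(action) -> (observation, reward, done, info).
    """

    def __init__(self, goal=5, max_steps=20):
        self.goal = goal
        self.max_steps = max_steps
        self.pos = 0
        self.steps = 0

    def reset(self):
        # Return the initial observation.
        self.pos = 0
        self.steps = 0
        return self.pos

    def step(self, action):
        # action: 0 = move left, 1 = move right (a discrete action space).
        self.pos += 1 if action == 1 else -1
        self.steps += 1
        done = self.pos == self.goal or self.steps >= self.max_steps
        reward = 1.0 if self.pos == self.goal else 0.0
        return self.pos, reward, done, {}


class SimpleVecEnv:
    """Trivial "vectorised" wrapper: steps several env copies in lockstep,
    mirroring what Gym's vector wrappers do before PPO collects rollouts."""

    def __init__(self, env_fns):
        self.envs = [fn() for fn in env_fns]

    def reset(self):
        return [env.reset() for env in self.envs]

    def step(self, actions):
        results = [env.step(a) for env, a in zip(self.envs, actions)]
        obs, rewards, dones, infos = zip(*results)
        return list(obs), list(rewards), list(dones), list(infos)
```

With the real library, the same environment would be registered via `gym.envs.registration.register` so that `gym.make()` can construct it by id, then vectorised with Gym's built-in vector wrappers.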