
# Implemented Algorithms

DQN, PPO, and DDPG, evaluated on Lunar Lander and the Atari games Pong and Breakout.

# Results

## Lunar Lander (DQN, PPO, DDPG)

### Reward Curves

| Algorithm | Reward Curve |
| --- | --- |
| DQN | *(plot not preserved in this export)* |
| PPO | *(plot not preserved in this export)* |
| DDPG (continuous actions) | *(plot not preserved in this export)* |
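DDPG handles Lunar Lander's continuous-action variant, and its stability depends on slowly-moving target networks. A minimal numpy sketch of the soft (Polyak) target update, with made-up weight matrices standing in for the actual actor/critic parameters:

```python
import numpy as np

tau = 0.005  # Polyak coefficient; a common DDPG default, assumed here
rng = np.random.default_rng(1)

# Hypothetical stand-ins for a network's weight matrices
online = rng.normal(size=(4, 4))
target = rng.normal(size=(4, 4))

# Soft update applied after each gradient step, for both actor and critic:
# theta_target <- tau * theta_online + (1 - tau) * theta_target
target_before = target.copy()
target = tau * online + (1 - tau) * target
```

Each update nudges the target weights a small fraction of the way toward the online weights, which keeps the bootstrapped critic targets from chasing a fast-moving network.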

### Training Video

*(video/GIF not preserved in this export)*
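The DQN results above come down to one core computation: the bootstrapped TD target taken from a separate target network. A minimal numpy sketch, with hypothetical linear "networks" standing in for the real Q-network forward passes:

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions = 8, 4  # LunarLander: 8-dim observation, 4 discrete actions

# Hypothetical linear Q-functions for illustration only
W_online = rng.normal(size=(n_states, n_actions))
W_target = W_online.copy()  # target network initialized as a copy

def q_values(W, s):
    return s @ W  # Q(s, ·) for all actions

gamma = 0.99
s = rng.normal(size=n_states)       # sampled transition (s, a, r, s', done)
a, r, done = 2, 1.0, False
s_next = rng.normal(size=n_states)

# TD target: r + gamma * max_a' Q_target(s', a'), cut off at episode end
td_target = r + gamma * (0.0 if done else q_values(W_target, s_next).max())
td_error = td_target - q_values(W_online, s)[a]
```

The online network is then regressed toward `td_target` (e.g. by minimizing the squared `td_error`), while the target network is refreshed only periodically.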

## Atari Pong (DQN, PPO)

### Reward Curves

| Algorithm | Reward Curve |
| --- | --- |
| DQN | *(plot not preserved in this export)* |
| PPO | *(plot not preserved in this export)* |

### Training Video

Snapshots of the trained agent at step 0, step 900,000, and step 1,500,000 (final). *(videos/GIFs not preserved in this export)*
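The PPO runs above optimize the clipped surrogate objective, which limits how far each policy update can move from the data-collecting policy. A minimal numpy sketch with made-up probability ratios and advantages:

```python
import numpy as np

eps = 0.2  # clip range; the common PPO default, assumed here

# Hypothetical per-sample values: pi_new(a|s) / pi_old(a|s) and advantages
ratios = np.array([0.8, 1.0, 1.3])
advantages = np.array([1.0, -0.5, 2.0])

# L_clip = E[ min(r * A, clip(r, 1-eps, 1+eps) * A) ]
unclipped = ratios * advantages
clipped = np.clip(ratios, 1.0 - eps, 1.0 + eps) * advantages
ppo_objective = np.minimum(unclipped, clipped).mean()
```

Taking the elementwise minimum means a sample whose ratio has already drifted outside `[1 - eps, 1 + eps]` contributes no further gradient in the direction of larger drift, which is what keeps PPO updates conservative.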

## Atari Breakout (PPO)

### Training Video

Snapshots of the trained agent at step 0, step 900,000, and step 1,500,000 (final). *(videos/GIFs not preserved in this export)*