
Add DDPG agent #16

Open
lefnire opened this issue Feb 8, 2018 · 1 comment
@lefnire (Owner)

lefnire commented Feb 8, 2018

Per #6 (comment), I'd like to try the DDPG RL agent (compared to the PPO agent). DDPG hypers will need to be added to hypersearch, and likely some other code adjustments will be needed as well. I once had DQN support; when I removed it, I may have tailored the code to be too PPO-centric.

@methenol (Collaborator)

methenol commented Aug 17, 2018

This shouldn't be a problem. I have a tensorforce DDPG agent that I was hypersearching with hyperas in a different environment. I'll work on adapting it, since the learning characteristics vary between PPO and DDPG. I recommend a separate branch for this, as it will initially need to be hypersearched over some wide ranges, and to keep the number of search parameters in each run minimized. Since there is already a DDPG branch, would it make sense to copy the v0.2 PPO branch over to it and then modify the DDPG agent params?

Might as well make an ACKTR branch too. It would be a heavier modification, but I've seen at least one tensorflow implementation I was looking into, and I'd be interested in rigging up a non-tensorforce agent for testing.
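To illustrate the "wide initial ranges" idea for the DDPG hypersearch, here is a minimal stdlib-only sketch. The parameter names (`actor_lr`, `critic_lr`, `tau`, etc.) and ranges are illustrative assumptions, not the project's actual hypersearch keys, and the sampler is a stand-in for whatever search driver (hyperas or otherwise) ends up being used:

```python
import math
import random

# Hypothetical wide search space for a first DDPG pass.
# Tuples are (low, high) log-uniform float ranges; lists are discrete choices.
# These names and bounds are illustrative, not the repo's real hypersearch keys.
DDPG_SPACE = {
    "actor_lr": (1e-5, 1e-3),
    "critic_lr": (1e-4, 1e-2),
    "discount": (0.9, 0.999),
    "tau": (1e-3, 1e-1),        # target-network soft-update rate
    "batch_size": [32, 64, 128, 256],
}

def sample_hypers(space, rng=random):
    """Draw one candidate: log-uniform for ranges, uniform choice for lists."""
    cand = {}
    for key, spec in space.items():
        if isinstance(spec, tuple):
            lo, hi = spec
            cand[key] = 10 ** rng.uniform(math.log10(lo), math.log10(hi))
        else:
            cand[key] = rng.choice(spec)
    return cand
```

Once a few runs narrow things down, the tuples can be tightened around the best-performing region, which keeps each run's search volume small as suggested above.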

@methenol methenol self-assigned this Aug 17, 2018