
Minibatch iterator #403

Merged (2 commits) on Apr 3, 2023

Conversation

eluzhnica (Contributor)

This adds compatibility for the ILQL, SFT, and PPO trainers.
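As a rough illustration of what a minibatch iterator does for these trainers, here is a minimal, self-contained sketch: it slices one batch into `num_mb` equal minibatches so gradients can be accumulated across them before a single optimizer step. The function name and signature are illustrative, not the PR's actual API.

```python
def minibatch_iterator(batch, num_mb):
    """Yield `num_mb` contiguous, equal-size slices of `batch`.

    Hypothetical sketch of the minibatching idea in this PR; names
    and the evenly-divisible requirement are assumptions, not trlx's API.
    """
    assert len(batch) % num_mb == 0, "batch size must divide evenly into minibatches"
    mb_size = len(batch) // num_mb
    for i in range(num_mb):
        yield batch[i * mb_size : (i + 1) * mb_size]

# A batch of 8 samples split into 4 minibatches of 2.
batch = list(range(8))
mbs = list(minibatch_iterator(batch, num_mb=4))
```

In a training loop, each yielded minibatch would get its own forward/backward pass, with the optimizer stepping only after the last one.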

@eluzhnica eluzhnica changed the title Minibatch implementation Minibatch iterator Mar 29, 2023
@eluzhnica eluzhnica mentioned this pull request Mar 29, 2023
Dahoas (Collaborator) commented Apr 3, 2023

This looks good to me!

@Dahoas Dahoas merged commit 8485e78 into CarperAI:minibatch-impl Apr 3, 2023
cat-state pushed a commit that referenced this pull request Apr 6, 2023
This PR adds gradient accumulation and minibatching to the accelerate trainers.

* fixes half exp not implemented error

* added minibatching

* fix num_mb name

* fix minibatch indexing

* fixing style

* fixing style

* fixing style

* Minibatch iterator (#403)

* Add minibatch iterator

* Add tests

* Avoid gradient synchronization when accumulating (#396)

* Avoid gradient synchronization when accumulating

* Fix accumulation to account for dataloader

* Add some tests

---------

Co-authored-by: Enxhell <[email protected]>
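The "avoid gradient synchronization when accumulating" commits (#396) follow a standard data-parallel pattern: all-reducing gradients after every minibatch is wasted communication, since only the gradients present at the optimizer step matter. A minimal sketch of the scheduling logic, with illustrative names (not the PR's actual code):

```python
def should_sync(mb_index, num_mb):
    """Return True only on the last minibatch of an accumulation window.

    Illustrative sketch: in DDP-style training, gradient all-reduce is
    skipped on all but the final accumulation step, where the full
    accumulated gradient must be synchronized before optimizer.step().
    """
    return mb_index == num_mb - 1

# For 4 accumulation steps, only the last one triggers a sync.
sync_schedule = [should_sync(i, num_mb=4) for i in range(4)]
```

In practice this is typically implemented with a context manager that disables the backward-pass all-reduce (e.g. PyTorch DDP's `no_sync`) on the non-final minibatches, rather than a manual flag.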