This repository has been archived by the owner on Nov 22, 2022. It is now read-only.
Dynamic Batch Scheduler Implementation #1200
Closed
Conversation
This pull request was exported from Phabricator. Differential Revision: D18900677

Summary:
Pull Request resolved: facebookresearch#1200

This diff adds support for dynamic batch training in pytext. It creates a new batcher that computes the batch size based on the current epoch. The diff implements two schedulers:

* **Linear**: increases the batch size linearly
* **Exponential**: increases the batch size exponentially

# API

The dynamic batcher extends the pooling batcher, so most of the arguments are shared and stay consistent. Note that dynamic batch sizing affects only the training batch size, not eval or test.

The dynamic batcher holds a new configuration object, `scheduler_config`, which contains the information needed to compute dynamic batch sizes, namely:

```
class SchedulerConfig(ModuleConfig):
    # the initial batch size used for training
    start_batch_size: int = 32

    # the final or max batch size to use; any scheduler should
    # not go over this batch size
    end_batch_size: int = 256

    # the number of epochs to increase the batch size over
    epoch_period: int = 10

    # the batch size is kept constant for `step_size` number of epochs
    step_size: int = 1
```
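To make the two schedules concrete, here is a minimal sketch of how a batcher could map the current epoch to a batch size under this config. This is an illustrative assumption, not the code in this diff: the function name `compute_batch_size` and the exact interpolation and rounding are hypothetical.

```
def compute_batch_size(
    epoch: int,
    start_batch_size: int = 32,
    end_batch_size: int = 256,
    epoch_period: int = 10,
    step_size: int = 1,
    scheduler: str = "linear",
) -> int:
    # Hypothetical sketch of the linear and exponential schedules
    # described above; the real batcher may differ in details.
    # Hold the batch size constant for `step_size` epochs at a time.
    held_epoch = (epoch // step_size) * step_size
    # Fraction of the ramp-up period completed, capped at 1.0.
    progress = min(held_epoch / epoch_period, 1.0)
    if scheduler == "linear":
        # Linear interpolation between start and end batch sizes.
        size = start_batch_size + progress * (end_batch_size - start_batch_size)
    else:
        # Exponential ramp: grow by a constant ratio each epoch so the
        # size reaches end_batch_size after epoch_period epochs.
        ratio = end_batch_size / start_batch_size
        size = start_batch_size * ratio ** progress
    # Never go over the configured maximum.
    return min(int(size), end_batch_size)
```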
Paper: https://arxiv.org/abs/1711.00489

Reviewed By: seayoung1112, ArmenAg

Differential Revision: D18900677
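For intuition, this is the ramp the sketch above produces with the default config (start 32, end 256, period 10); these numbers come from the illustrative sketch, not from this diff:

```
for epoch in range(11):
    print(
        epoch,
        compute_batch_size(epoch, scheduler="linear"),
        compute_batch_size(epoch, scheduler="exponential"),
    )
# linear:      32, 54, 76, 99, 121, 144, 166, 188, 211, 233, 256
# exponential: 32, 39, 48, 59, 73, 90, 111, 137, 168, 207, 256
```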
AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request on Dec 17, 2019 (compare 598ea15 to dc80c82).
AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request on Dec 17, 2019 (compare dc80c82 to cfe4281).
AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request on Dec 17, 2019 (compare cfe4281 to 146cd50).
AkshatSh pushed a commit to AkshatSh/pytext that referenced this pull request on Dec 17, 2019 (compare 146cd50 to 0f0ac87).
The pull request was updated from 0f0ac87 to bc9ddd8.
This pull request has been merged in 3f8d293.