add step_limit to dynamicqueryattention (#159)
This adds a way for a given attention mechanism to stop before it has completed all of its steps. Even though an attention mechanism is trained for n steps, we may not always need to run all n of them. I'm currently using this to train for classification on the average of all steps. I think it should make it into shimmer, since we'll probably use this functionality quite a lot.
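A minimal sketch of what a `step_limit` argument could look like on a stepwise attention module. The class name `SteppedAttention` and its internals are illustrative placeholders, not shimmer's actual `DynamicQueryAttention` API; the only point is that the forward pass caps the number of refinement steps at `step_limit` instead of always running the full trained count.

```python
from typing import Optional

import torch
import torch.nn as nn


class SteppedAttention(nn.Module):
    """Illustrative attention module that refines its query over several steps.

    NOTE: this is a hypothetical stand-in, not shimmer's DynamicQueryAttention.
    `step_limit` caps how many of the trained `n_steps` are actually run.
    """

    def __init__(self, dim: int, n_steps: int):
        super().__init__()
        self.n_steps = n_steps
        self.query = nn.Parameter(torch.randn(1, dim))
        self.key_proj = nn.Linear(dim, dim)
        self.value_proj = nn.Linear(dim, dim)
        self.update = nn.Linear(dim, dim)

    def forward(
        self, x: torch.Tensor, step_limit: Optional[int] = None
    ) -> torch.Tensor:
        # Run at most `step_limit` steps, defaulting to the full trained count.
        steps = self.n_steps if step_limit is None else min(step_limit, self.n_steps)
        query = self.query.expand(x.size(0), -1)          # (batch, dim)
        keys, values = self.key_proj(x), self.value_proj(x)
        out = values.mean(dim=1)                          # fallback if steps == 0
        for _ in range(steps):
            # Scaled dot-product attention of the current query over the inputs.
            scores = torch.einsum("bd,bnd->bn", query, keys) / keys.size(-1) ** 0.5
            weights = scores.softmax(dim=-1)
            out = torch.einsum("bn,bnd->bd", weights, values)
            # Refine the query for the next step.
            query = query + self.update(out)
        return out
```

With something like this, `attn(x)` runs all trained steps while `attn(x, step_limit=1)` stops after a single step, which also makes it easy to collect per-step outputs and average them for classification as described above.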