add long sequence strategies #8076