add long sequence strategies #8076
43.16% of diff hit (target 80.00%)
Annotations
Check warnings reported by codecov / codecov/patch:

paddlenlp/transformers/bloom/modeling.py
- Added line #L951 was not covered by tests
- Added lines #L956 - L957 were not covered by tests
- Added line #L962 was not covered by tests
- Added lines #L967 - L968 were not covered by tests

paddlenlp/transformers/chatglm/modeling.py
- Added line #L449 was not covered by tests
- Added lines #L558 - L561 were not covered by tests
- Added lines #L564 - L565 were not covered by tests

paddlenlp/transformers/chatglm_v2/modeling.py
- Added lines #L656 - L657 were not covered by tests
- Added lines #L699 - L702 were not covered by tests

paddlenlp/transformers/llama/modeling.py
- Added line #L767 was not covered by tests
- Added lines #L983 - L986 were not covered by tests
- Added lines #L1498 - L1499 were not covered by tests
- Added line #L1504 was not covered by tests
- Added line #L1506 was not covered by tests

paddlenlp/transformers/long_sequence_strategies/attention_strategies.py
- Added line #L26 was not covered by tests
- Added lines #L29 - L31 were not covered by tests
- Added lines #L33 - L34 were not covered by tests
- Added lines #L36 - L37 were not covered by tests
- Added lines #L43 - L47 were not covered by tests
- Added lines #L50 - L51 were not covered by tests

paddlenlp/transformers/long_sequence_strategies/embedding_strategies.py
- Added lines #L28 - L33 were not covered by tests
- Added lines #L35 - L36 were not covered by tests
- Added line #L41 was not covered by tests
- Added lines #L44 - L45 were not covered by tests
- Added line #L48 was not covered by tests
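
Most of the uncovered lines sit in the two new long_sequence_strategies modules. A minimal import smoke test, sketched below, would at least exercise their module-level statements; the test file name and the importlib-based approach are assumptions, not part of this PR, and covering the annotated lines (for example attention_strategies.py #L26 - L51) still requires unit tests that instantiate and call the strategy classes themselves.

```python
# tests/transformers/test_long_sequence_strategies_smoke.py (hypothetical file name)
# Minimal sketch: importing the new modules runs their module-level code.
# Full coverage of the annotated lines still needs tests that call the
# strategy classes directly.
import importlib

import pytest

MODULES = [
    "paddlenlp.transformers.long_sequence_strategies.attention_strategies",
    "paddlenlp.transformers.long_sequence_strategies.embedding_strategies",
]


@pytest.mark.parametrize("module_path", MODULES)
def test_long_sequence_strategy_module_imports(module_path):
    # Import the module and check it exposes at least one public name.
    module = importlib.import_module(module_path)
    public_names = [name for name in dir(module) if not name.startswith("_")]
    assert public_names, f"{module_path} exposes no public symbols"
```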