
[BUG] Test failures with 1.7.0 #42

Open
cdeepali opened this issue Aug 2, 2022 · 1 comment
Labels
bug Something isn't working

Comments

cdeepali commented Aug 2, 2022

The following PyTorch Lightning tests fail with the opence v1.7.0 build:

FAILED tests/callbacks/test_finetuning_callback.py::test_callbacks_restore - ...
FAILED tests/callbacks/test_tqdm_progress_bar.py::test_tensor_to_float_conversion
FAILED tests/core/test_lightning_module.py::test_proper_refcount - assert 2 == 3
FAILED tests/strategies/test_common.py::test_evaluate[trainer_kwargs0] - Futu...
FAILED tests/strategies/test_common.py::test_evaluate[trainer_kwargs1] - Futu...
FAILED tests/strategies/test_common.py::test_evaluate[trainer_kwargs2] - Futu...
FAILED tests/strategies/test_ddp_spawn.py::test_ddp_all_dataloaders_passed_to_fit
FAILED tests/trainer/optimization/test_manual_optimization.py::test_multiple_optimizers_step
FAILED tests/trainer/optimization/test_multiple_optimizers.py::test_unbalanced_logging_with_multiple_optimizers
FAILED tests/utilities/test_auto_restart.py::test_auto_restart_within_validation_loop[1.0-train_datasets0-val_datasets0]
FAILED tests/utilities/test_auto_restart.py::test_auto_restart_within_validation_loop[1.0-train_datasets1-val_datasets1]
FAILED tests/utilities/test_auto_restart.py::test_fault_tolerant_manual_mode[0.5-train_dataset_cls0-val_dataset_cls0]
= 12 failed, 2211 passed, 358 skipped, 36 deselected, 11 xfailed, 4034 warnings in 557.87s (0:09:17) =

Most of these are failing with the error:

E   FutureWarning: torch.testing.assert_allclose() is deprecated since 1.12 and will be removed in 1.14. Use torch.testing.assert_close() instead. For detailed upgrade instructions see https://github.com/pytorch/pytorch/issues/61844
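This looks like pytest treating the FutureWarning as an error via its warning filters. A minimal sketch of the rename the warning asks for (the tensor values below are placeholders, not taken from the failing tests):

```python
import torch

expected = torch.tensor([1.0, 2.0, 3.0])
actual = torch.tensor([1.0, 2.0, 3.0])

# Deprecated since torch 1.12; emits the FutureWarning above, which the test
# suite's warning filters report as a failure.
torch.testing.assert_allclose(actual, expected)

# Drop-in replacement suggested by the warning message.
torch.testing.assert_close(actual, expected)
```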

cdeepali added the bug label on Aug 2, 2022
cdeepali commented Aug 2, 2022

Similarly, torchmetrics shows a few test failures:

FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[sum-pairwise_cosine_similarity-cosine_similarity-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[sum-pairwise_cosine_similarity-cosine_similarity-x1-y1]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[sum-pairwise_linear_similarity-linear_kernel-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[sum-pairwise_linear_similarity-linear_kernel-x1-y1]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[mean-pairwise_cosine_similarity-cosine_similarity-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[mean-pairwise_cosine_similarity-cosine_similarity-x1-y1]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[mean-pairwise_linear_similarity-linear_kernel-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[mean-pairwise_linear_similarity-linear_kernel-x1-y1]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[None-pairwise_cosine_similarity-cosine_similarity-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[None-pairwise_cosine_similarity-cosine_similarity-x1-y1]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[None-pairwise_linear_similarity-linear_kernel-x0-y0]
FAILED tests/pairwise/test_pairwise_distance.py::TestPairwise::test_pairwise_half_cpu[None-pairwise_linear_similarity-linear_kernel-x1-y1]
===== 12 failed, 1073 passed, 10 xfailed, 382 warnings in 86.95s (0:01:26) =====

with the error:
RuntimeError: "addmm_impl_cpu_" not implemented for 'Half'
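
For context, a rough sketch of what this error amounts to (shapes are illustrative, not taken from the torchmetrics tests): a CPU addmm on float16 inputs, which the PyTorch build used here does not implement, plus the usual workaround of upcasting to float32 on CPU and casting back:

```python
import torch

x = torch.randn(4, 8, dtype=torch.half)
y = torch.randn(6, 8, dtype=torch.half)
bias = torch.zeros(4, 6, dtype=torch.half)

try:
    # No Half kernel for CPU addmm on the affected build:
    # RuntimeError: "addmm_impl_cpu_" not implemented for 'Half'
    torch.addmm(bias, x, y.t())
except RuntimeError as err:
    print(err)

# Workaround sketch: compute in float32 on CPU, cast the result back to half.
out = torch.addmm(bias.float(), x.float(), y.float().t()).half()
print(out.dtype)  # torch.float16
```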

cdeepali mentioned this issue on Aug 3, 2022