
Make pytest fail on uncaught warnings #133

Merged: 6 commits merged into pymc-devs:main on Apr 12, 2023

Conversation

@michaelraczycki (Collaborator) commented Apr 3, 2023:

Introduced a config file (pytest.ini) at the root of the repository, with configuration that makes tests fail if they contain uncaught warnings. Added ignores for already-known, often-appearing warnings that won't be resolved in the near future (the namespace deprecation warning, and the ArviZ warning about a group not being registered in the InferenceData schema).

Closes #102
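
A minimal sketch of what such a pytest.ini can look like; the ignore patterns below are placeholders, not the exact entries added in this PR:

    [pytest]
    filterwarnings =
        error
        ignore:.*not registered in the InferenceData scheme.*:UserWarning
        ignore:.*pymc_experimental.*:DeprecationWarning

Entries listed later take precedence, so the ignore lines carve exceptions out of the blanket error filter. Each entry uses Python's warning-filter syntax (action:message:category), with the message field interpreted as a regular expression matched against the start of the warning text.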

@michaelraczycki (Collaborator, Author) commented:

Addresses the changes requested in #102.

@michaelraczycki (Collaborator, Author) commented:

I guess it's one of the few situations where the pytest / ubuntu job failing might signify that my work was a success :)

@michaelraczycki (Collaborator, Author) commented:

@ricardoV94 if you make a list of which warnings we want to ignore for now, rather than have them fail tests, I can add regex exceptions for them in pytest.ini. I think this step will be quite a shock: every single test has some warnings, so tests will start failing here unless we go one by one and adapt them all (did you consider that to be within the scope of that issue?)
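
One way to adapt a test that intentionally exercises deprecated behaviour is to assert the warning explicitly rather than letting it escape. A minimal sketch, in which deprecated_helper is a hypothetical stand-in, not code from this repository:

    import warnings

    import pytest

    def deprecated_helper():
        # Hypothetical stand-in for code that still emits a warning.
        warnings.warn("old API, use new_helper instead", FutureWarning)
        return 42

    def test_deprecated_helper():
        # Under filterwarnings = error, a bare call would fail the test;
        # pytest.warns both tolerates and asserts the expected warning.
        with pytest.warns(FutureWarning, match="old API"):
            assert deprecated_helper() == 42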

@ricardoV94 (Member) commented Apr 5, 2023:

The current failure should be fixed in the codebase (we are still using a deprecated feature).

How many other tests were failing? If there is other obvious stuff, like deprecated use of PyMC or PyTensor, we should fix it in this PR. That's the point of the original issue. It's a bit of trouble now, but it will be less so in the future, and we will avoid hard failures ahead of time.

=================================== FAILURES ===================================
_______________________________ test_pathfinder ________________________________

    @pytest.mark.skipif(sys.platform == "win32", reason="JAX not supported on windows.")
    def test_pathfinder():
        # Data of the Eight Schools Model
        J = 8
        y = np.array([28.0, 8.0, -3.0, 7.0, -1.0, 1.0, 18.0, 12.0])
        sigma = np.array([15.0, 10.0, 16.0, 11.0, 9.0, 11.0, 10.0, 18.0])
    
        with pm.Model() as model:
    
            mu = pm.Normal("mu", mu=0.0, sigma=10.0)
            tau = pm.HalfCauchy("tau", 5.0)
    
            theta = pm.Normal("theta", mu=0, sigma=1, shape=J)
            theta_1 = mu + tau * theta
            obs = pm.Normal("obs", mu=theta, sigma=sigma, shape=J, observed=y)
    
>           idata = pmx.fit(method="pathfinder", iterations=100)

pymc_experimental/tests/test_pathfinder.py:40: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
pymc_experimental/inference/fit.py:37: in fit
    return fit_pathfinder(**kwargs)
pymc_experimental/inference/pathfinder.py:141: in fit_pathfinder
    for var_name, dims in model.RV_dims.items()
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <pymc.model.Model object at 0x7f90a684d2b0>

    @property
    def RV_dims(self) -> Dict[str, Tuple[Union[str, None], ...]]:
        """Tuples of dimension names for specific model variables.
    
        Entries in the tuples may be ``None``, if the RV dimension was not given a name.
        """
>       warnings.warn(
            "Model.RV_dims is deprecated. User Model.named_vars_to_dims instead.",
            FutureWarning,
        )
E       FutureWarning: Model.RV_dims is deprecated. User Model.named_vars_to_dims instead.
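
The fix the warning itself suggests is mechanical. A minimal sketch, assuming a recent PyMC version (the toy model is only for illustration):

    import numpy as np
    import pymc as pm

    with pm.Model(coords={"school": np.arange(8)}) as model:
        theta = pm.Normal("theta", mu=0, sigma=1, dims="school")

    # Deprecated: emits the FutureWarning that now fails the test suite.
    # dims = model.RV_dims
    # Replacement suggested by the warning message:
    dims = model.named_vars_to_dims
    print(dims)  # e.g. {'theta': ('school',)}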

@michaelraczycki (Collaborator, Author) commented:

It's hard to say, because a lot of the errors break the test suite at collection time; in the model_builder-related tests, every test was breaking because of the namespace and ArviZ warnings, which is why I added the ignores. I contacted @OriolAbril about the ArviZ warning and got confirmation that it's nothing to worry about for now.

@ricardoV94 (Member) commented:

That's good. So it seems only the failing test needs to be addressed, and then this PR is good to go?

@michaelraczycki (Collaborator, Author) commented:

I can fix it tomorrow, and then hopefully it's good to go.

@michaelraczycki (Collaborator, Author) commented:

@ricardoV94 it seems like 2 things were still failing. I can change the pathfinder test code, but I can't check it locally, since it requires the x86_64 architecture and my Mac is arm64. I think we should extend the pytest.mark.skipif for Macs as well.
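
A minimal sketch of how the marker could be extended; the exact condition is an assumption, not necessarily what was merged:

    import platform
    import sys

    import pytest

    @pytest.mark.skipif(
        sys.platform == "win32"
        or (sys.platform == "darwin" and platform.machine() == "arm64"),
        reason="JAX not supported on Windows; pathfinder requires x86_64 on macOS.",
    )
    def test_pathfinder():
        ...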

@ricardoV94 merged commit 5f1c2bb into pymc-devs:main on Apr 12, 2023
@ricardoV94 (Member) commented:

Thanks @michaelraczycki

@michaelraczycki deleted the pytest_configuration branch on April 19, 2023