
Add initial testing functions #18

Merged
ekneale merged 18 commits into main from initial_testing
Jan 14, 2026
Conversation

@martinjohndyer
Collaborator

This PR adds a tests/ directory with two test scripts.

First, test_spectra.py is a proper unit testing module, with functions for testing the loading and manipulation of the spectra (scaling, combining etc.). This is mostly done using mock data (any of the test functions ending in _mock), since we can feed simple values in and compare against what we'd expect to get out with NumPy. There are also some _real functions that run on the included data files, to make sure those are loaded correctly as well.
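As a rough illustration of the mock-data pattern (a sketch only; the function and fixture names below are placeholders, not the actual ones in test_spectra.py): build a histogram with known contents, apply the operation under test, and check the result against the same operation done directly in NumPy.

    # Sketch of a _mock-style test; names are illustrative, not from the package.
    import numpy as np
    import ROOT

    def test_scale_spectrum_mock():
        mock_counts = np.array([1.0, 2.0, 3.0, 4.0])
        hist = ROOT.TH1D("mock", "mock", len(mock_counts), 0.0, 4.0)
        for i, value in enumerate(mock_counts, start=1):  # ROOT bins are 1-indexed
            hist.SetBinContent(i, value)

        hist.Scale(2.5)  # stands in for whichever scaling function is under test

        scaled = np.array([hist.GetBinContent(i) for i in range(1, len(mock_counts) + 1)])
        assert np.allclose(scaled, mock_counts * 2.5)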

Ultimately this is the first part of a full unit testing suite, which should be run through continuous integration whenever future changes are made to the code (see #6). As included, it can be run with pytest, and all tests currently pass. There are a few major limitations to address before it's ready for proper CI, though:

  • Many of the spec functions also have plotting code within them, which will pop up a ROOT window that you need to close manually before continuing on to the next test (running this with pytest therefore requires the -s flag).
  • There are several hardcoded paths in the code as it stands, which means it only runs if the script is called from within the SNF-simulations/snf_simulations directory (see the sketch after this list for one way this could be made location-independent). For example:
    Sr90 = np.genfromtxt("./antineutrino_spec_data/Sr90_an.txt",skip_header=1)[:,[7,10,11]]
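One possible fix for the hardcoded paths (a sketch only, not what the package currently does) would be to resolve data files relative to the module file rather than the working directory, so the code runs from anywhere:

    # Sketch: resolve the data directory relative to this module, not the CWD.
    from pathlib import Path
    import numpy as np

    DATA_DIR = Path(__file__).resolve().parent / "antineutrino_spec_data"
    Sr90 = np.genfromtxt(DATA_DIR / "Sr90_an.txt", skip_header=1)[:, [7, 10, 11]]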

Most of the test functions use NumPy arrays to verify the output of ROOT functions (TH1::Interpolate, TH1::Merge etc.). This was a deliberate choice, with the aim of eventually replacing ROOT in the main package code (see #14).
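For instance, TH1::Interpolate does linear interpolation between bin centres, which should agree with numpy.interp on the same points. A minimal sketch of that kind of check (the exact test names in test_spectra.py differ):

    # Sketch: verify ROOT's TH1::Interpolate against numpy.interp on mock data.
    import numpy as np
    import ROOT

    def test_interpolate_matches_numpy_mock():
        counts = np.array([10.0, 20.0, 15.0, 5.0])
        hist = ROOT.TH1D("interp", "interp", len(counts), 0.0, 4.0)
        for i, value in enumerate(counts, start=1):
            hist.SetBinContent(i, value)

        bin_centres = np.array([hist.GetBinCenter(i) for i in range(1, len(counts) + 1)])
        x = 1.75  # a point between two bin centres
        assert np.isclose(hist.Interpolate(x), np.interp(x, bin_centres, counts))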

The second script, test_commandline.py, is more of a functional test that verifies the results of the code against the "canonical" output of the main command_line.py script in the package. Those output files are included in tests/test_data/, so running test_commandline.py checks that what's calculated matches those files, for both Sizewell and Hartlepool.

As mentioned above, a lot of plotting and output functions are currently embedded in the main code, but the script will run the function, load the output, check it against the test output and then clean up the spare file. You still have to close each plot window as it opens, and while we can compare the contents of CSVs and some returned values, you can't easily do that for plots or for values that are only printed to the terminal.
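A rough sketch of that run / load / compare / clean-up pattern is below. run_step() is a stand-in for whichever command_line.py function is under test, and the file names are placeholders, not the ones actually used in test_commandline.py:

    # Sketch of the functional-test pattern; run_step() and paths are placeholders.
    import os
    import numpy as np

    def run_step(output_file):
        # stand-in for the real package call, which writes its results to a CSV
        np.savetxt(output_file, np.arange(6.0).reshape(2, 3), delimiter=",")

    def test_output_matches_canonical():
        output_file = "spectrum_output.csv"
        run_step(output_file)
        try:
            produced = np.genfromtxt(output_file, delimiter=",")
            canonical = np.genfromtxt("tests/test_data/spectrum_output.csv", delimiter=",")
            assert np.allclose(produced, canonical)
        finally:
            os.remove(output_file)  # clean up the spare file even if the check fails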

This isn't really meant for integrated testing in the way test_spectra.py is, although I've formatted it as such, and it does run and pass with pytest. Instead, it's there to verify later on that the output is still the same if larger structural/functional changes are made within the package. Eventually, once the entire package is rewritten, this script should be replaced with more structured unit and end-to-end testing.

When running pytest with code coverage enabled (from within SNF-simulations/snf_simulations, run pytest -s --cov=snf_simulations --cov-report html ../tests/test_spectra.py to generate the HTML report), the current coverage value is 93%, which is pretty good. I'd have liked to get to 100%, but the only function that isn't covered is plotting.plot(), which, since it only outputs a file, can't easily be tested. The actual command_line.py file also has 0% coverage, but that's because it's really a script rather than a module, and test_commandline.py duplicates everything it does. Otherwise everything is covered.


While writing these tests, I did my best to keep the main part of the code completely untouched, since the entire point was to get a baseline against which I can compare later changes. However, there were a few major bugs I found that I did include fixes for:

These all came from noticing issues when feeding mock data into the test functions, but they don't make any real difference to the outputs.

