diff --git a/benchmarks/README.md b/benchmarks/README.md
index 8dffd473f3..316c8f9e32 100644
--- a/benchmarks/README.md
+++ b/benchmarks/README.md
@@ -11,13 +11,33 @@ shifts in performance being flagged in a new GitHub issue.
## Running benchmarks
+On GitHub: a Pull Request can be benchmarked by adding the
+https://github.com/SciTools/iris/labels/benchmark_this
+label to the PR (to run a second time: just remove and re-add the label).
+Note that a benchmark run could take an hour or more to complete.
+This runs a comparison between the PR branch's ``HEAD`` and its merge-base with
+the PR's base branch, thus showing performance differences introduced
+by the PR. (This run is managed by
+[the aforementioned GitHub Action](../.github/workflows/benchmark.yml)).
+
`asv ...` commands must be run from this directory. You will need to have ASV
installed, as well as Nox (see
[Benchmark environments](#benchmark-environments)).
-[Iris' noxfile](../noxfile.py) includes a `benchmarks` session that provides
-conveniences for setting up before benchmarking, and can also replicate the
-automated overnight run locally. See the session docstring for detail.
+The benchmark runner ([bm_runner.py](./bm_runner.py)) provides conveniences for
+common benchmark setup and run tasks, including replicating the automated
+overnight run locally. See `python bm_runner.py --help` for detail.
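+
+For example (a sketch - `python bm_runner.py --help` lists the authoritative
+sub-commands and arguments):
+
+```shell
+# Benchmark the current branch against its merge-base with upstream/main:
+python bm_runner.py branch upstream/main
+```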
+
+A significant portion of benchmark run time is environment management. Run-time
+can be reduced by placing the benchmark environment on the same file system as
+your
+[Conda package cache](https://conda.io/projects/conda/en/latest/user-guide/configuration/use-condarc.html#specify-pkg-directories),
+if it is not already. You can achieve this by either:
+
+- Temporarily reconfiguring `delegated_env_commands` and `delegated_env_parent`
+ in [asv.conf.json](asv.conf.json) to reference a location on the same file
+ system as the Conda package cache.
+- Moving your Iris repo to the same file system as the Conda package cache.
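+
+For example, the first option might be a temporary edit along these lines in
+[asv.conf.json](asv.conf.json) (an illustrative sketch - the path shown is
+hypothetical; point it at the file system that holds your package cache):
+
+```json
+"delegated_env_parent": "/same-fs-as-pkg-cache/iris-benchmark-envs"
+```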
### Environment variables
@@ -26,8 +46,8 @@ automated overnight run locally. See the session docstring for detail.
benchmark scripts.
* `DATA_GEN_PYTHON` - required - path to a Python executable that can be
used to generate benchmark test objects/files; see
-[Data generation](#data-generation). The Nox session sets this automatically,
-but will defer to any value already set in the shell.
+[Data generation](#data-generation). The benchmark runner sets this
+automatically, but will defer to any value already set in the shell.
* `BENCHMARK_DATA` - optional - path to a directory for benchmark synthetic
test data, which the benchmark scripts will create if it doesn't already
exist. Defaults to `/benchmarks/.data/` if not set. Note that some of
@@ -36,7 +56,7 @@ plan accordingly.
* `ON_DEMAND_BENCHMARKS` - optional - when set (to any value): benchmarks
decorated with `@on_demand_benchmark` are included in the ASV run. Usually
coupled with the ASV `--bench` argument to only run the benchmark(s) of
-interest. Is set during the Nox `cperf` and `sperf` sessions.
+interest. Is set during the benchmark runner `cperf` and `sperf` sub-commands.
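+
+For example (illustrative - substitute the name of a real on-demand
+benchmark):
+
+```shell
+ON_DEMAND_BENCHMARKS=1 asv run --bench=MyOnDemandBenchmark
+```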
## Writing benchmarks
diff --git a/benchmarks/asv.conf.json b/benchmarks/asv.conf.json
index 7337eaa8c7..faa7f6daee 100644
--- a/benchmarks/asv.conf.json
+++ b/benchmarks/asv.conf.json
@@ -4,6 +4,7 @@
"project_url": "https://github.com/SciTools/iris",
"repo": "..",
"environment_type": "conda-delegated",
+ "conda_channels": ["conda-forge", "defaults"],
"show_commit_url": "http://github.com/scitools/iris/commit/",
"branches": ["upstream/main"],
@@ -19,8 +20,7 @@
// * No build-time environment variables.
// * Is run in the same environment as the ASV install itself.
"delegated_env_commands": [
- "sed -i 's/_PY_VERSIONS_ALL/_PY_VERSION_LATEST/g' noxfile.py",
- "nox --envdir={conf_dir}/.asv/env/nox01 --session=tests --install-only --no-error-on-external-run --verbose"
+ "PY_VER=3.11 nox --envdir={conf_dir}/.asv/env/nox01 --session=tests --install-only --no-error-on-external-run --verbose"
],
// The parent directory of the above environment.
// The most recently modified environment in the directory will be used.
diff --git a/benchmarks/benchmarks/experimental/ugrid/regions_combine.py b/benchmarks/benchmarks/experimental/ugrid/regions_combine.py
index 3b2d77a80a..c5f8fb564e 100644
--- a/benchmarks/benchmarks/experimental/ugrid/regions_combine.py
+++ b/benchmarks/benchmarks/experimental/ugrid/regions_combine.py
@@ -50,7 +50,7 @@ def _make_region_cubes(self, full_mesh_cube):
i_faces = np.concatenate([i_faces[:, 2:], i_faces[:, :2]], axis=1)
# flatten to get [2 3 4 0 1 (-) 8 9 10 6 7 (-) 13 14 15 11 12 ...]
i_faces = i_faces.flatten()
- # reduce back to orignal length, wrap any overflows into valid range
+ # reduce back to original length, wrap any overflows into valid range
i_faces = i_faces[:n_faces] % n_faces
# Divide into regions -- always slightly uneven, since 7 doesn't divide
diff --git a/benchmarks/benchmarks/sperf/combine_regions.py b/benchmarks/benchmarks/sperf/combine_regions.py
index d3d128c7d8..e27b3b1996 100644
--- a/benchmarks/benchmarks/sperf/combine_regions.py
+++ b/benchmarks/benchmarks/sperf/combine_regions.py
@@ -46,7 +46,7 @@ def _make_region_cubes(self, full_mesh_cube):
i_faces = np.concatenate([i_faces[:, 2:], i_faces[:, :2]], axis=1)
# flatten to get [2 3 4 0 1 (-) 8 9 10 6 7 (-) 13 14 15 11 12 ...]
i_faces = i_faces.flatten()
- # reduce back to orignal length, wrap any overflows into valid range
+ # reduce back to original length, wrap any overflows into valid range
i_faces = i_faces[:n_faces] % n_faces
# Divide into regions -- always slightly uneven, since 7 doesn't divide
diff --git a/benchmarks/bm_runner.py b/benchmarks/bm_runner.py
new file mode 100644
index 0000000000..f3efb0ea31
--- /dev/null
+++ b/benchmarks/bm_runner.py
@@ -0,0 +1,401 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Argparse conveniences for executing common types of benchmark runs.
+"""
+
+from abc import ABC, abstractmethod
+import argparse
+from argparse import ArgumentParser
+from datetime import datetime
+from importlib import import_module
+from os import environ
+from pathlib import Path
+import re
+import subprocess
+from tempfile import NamedTemporaryFile
+from typing import Literal
+
+# The threshold beyond which shifts are 'notable'. See `asv compare` docs
+# for more.
+COMPARE_FACTOR = 1.2
+
+BENCHMARKS_DIR = Path(__file__).parent
+
+# Common ASV arguments for all run_types except `custom`.
+ASV_HARNESS = (
+ "run {posargs} --attribute rounds=4 --interleave-rounds --strict "
+ "--show-stderr"
+)
+
+
+def _subprocess_run_print(args, **kwargs):
+ # Use subprocess for printing to reduce chance of printing out of sequence
+ # with the subsequent calls.
+ subprocess.run(["echo", f"BM_RUNNER DEBUG: {' '.join(args)}"])
+ return subprocess.run(args, **kwargs)
+
+
+def _subprocess_run_asv(args, **kwargs):
+ args.insert(0, "asv")
+ kwargs["cwd"] = BENCHMARKS_DIR
+ return _subprocess_run_print(args, **kwargs)
+
+
+def _check_requirements(package: str) -> None:
+ try:
+ import_module(package)
+ except ImportError as exc:
+ message = (
+ f"No {package} install detected. Benchmarks can only "
+ f"be run in an environment including {package}."
+ )
+ raise Exception(message) from exc
+
+
+def _prep_data_gen_env() -> None:
+ """
+ Create/access a separate, unchanging environment for generating test data.
+ """
+
+ root_dir = BENCHMARKS_DIR.parent
+ python_version = "3.11"
+ data_gen_var = "DATA_GEN_PYTHON"
+ if data_gen_var in environ:
+ print("Using existing data generation environment.")
+ else:
+ print("Setting up the data generation environment ...")
+ # Get Nox to build an environment for the `tests` session, but don't
+ # run the session. Will re-use a cached environment if appropriate.
+ _subprocess_run_print(
+ [
+ "nox",
+ f"--noxfile={root_dir / 'noxfile.py'}",
+ "--session=tests",
+ "--install-only",
+ f"--python={python_version}",
+ ]
+ )
+ # Find the environment built above, set it to be the data generation
+ # environment.
+ data_gen_python = next(
+ (root_dir / ".nox").rglob(f"tests*/bin/python{python_version}")
+ ).resolve()
+ environ[data_gen_var] = str(data_gen_python)
+
+ print("Installing Mule into data generation environment ...")
+ mule_dir = data_gen_python.parents[1] / "resources" / "mule"
+ if not mule_dir.is_dir():
+ _subprocess_run_print(
+ [
+ "git",
+ "clone",
+ "https://github.com/metomi/mule.git",
+ str(mule_dir),
+ ]
+ )
+ _subprocess_run_print(
+ [
+ str(data_gen_python),
+ "-m",
+ "pip",
+ "install",
+ str(mule_dir / "mule"),
+ ]
+ )
+
+ print("Data generation environment ready.")
+
+
+def _setup_common() -> None:
+ _check_requirements("asv")
+ _check_requirements("nox")
+
+ _prep_data_gen_env()
+
+ print("Setting up ASV ...")
+ _subprocess_run_asv(["machine", "--yes"])
+
+ print("Setup complete.")
+
+
+def _asv_compare(*commits: str, overnight_mode: bool = False) -> None:
+ """Run through a list of commits comparing each one to the next."""
+ commits = [commit[:8] for commit in commits]
+ shifts_dir = BENCHMARKS_DIR / ".asv" / "performance-shifts"
+ for i in range(len(commits) - 1):
+ before = commits[i]
+ after = commits[i + 1]
+ asv_command = (
+ f"compare {before} {after} --factor={COMPARE_FACTOR} --split"
+ )
+ _subprocess_run_asv(asv_command.split(" "))
+
+ if overnight_mode:
+ # Record performance shifts.
+ # Run the command again but limited to only showing performance
+ # shifts.
+ shifts = _subprocess_run_asv(
+ [*asv_command.split(" "), "--only-changed"],
+ capture_output=True,
+ text=True,
+ ).stdout
+ if shifts:
+ # Write the shifts report to a file.
+ # Dir is used by .github/workflows/benchmarks.yml,
+ # but not cached - intended to be discarded after run.
+ shifts_dir.mkdir(exist_ok=True, parents=True)
+ shifts_path = (shifts_dir / after).with_suffix(".txt")
+ with shifts_path.open("w") as shifts_file:
+ shifts_file.write(shifts)
+
+
+class _SubParserGenerator(ABC):
+ """Convenience for holding all the necessary argparse info in 1 place."""
+
+ name: str = NotImplemented
+ description: str = NotImplemented
+ epilog: str = NotImplemented
+
+ def __init__(self, subparsers: ArgumentParser.add_subparsers) -> None:
+ self.subparser: ArgumentParser = subparsers.add_parser(
+ self.name,
+ description=self.description,
+ epilog=self.epilog,
+ formatter_class=argparse.RawTextHelpFormatter,
+ )
+ self.add_arguments()
+ self.subparser.add_argument(
+ "asv_args",
+ nargs=argparse.REMAINDER,
+ help="Any number of arguments to pass down to ASV.",
+ )
+ self.subparser.set_defaults(func=self.func)
+
+ @abstractmethod
+ def add_arguments(self) -> None:
+ """All self.subparser.add_argument() calls."""
+ _ = NotImplemented
+
+ @staticmethod
+ @abstractmethod
+ def func(args: argparse.Namespace):
+ """
+ The function to return when the subparser is parsed.
+
+ `func` is then called, performing the user's selected sub-command.
+
+ """
+ _ = args
+ return NotImplemented
+
+
+class Overnight(_SubParserGenerator):
+ name = "overnight"
+ description = (
+ "Benchmarks all commits between the input **first_commit** to ``HEAD``, "
+ "comparing each to its parent for performance shifts. If a commit causes "
+ "shifts, the output is saved to a file:\n"
+ "``.asv/performance-shifts/``\n\n"
+ "Designed for checking the previous 24 hours' commits, typically in a "
+ "scheduled script."
+ )
+ epilog = (
+ "e.g. python bm_runner.py overnight a1b23d4\n"
+ "e.g. python bm_runner.py overnight a1b23d4 --bench=regridding"
+ )
+
+ def add_arguments(self) -> None:
+ self.subparser.add_argument(
+ "first_commit",
+ type=str,
+ help="The first commit in the benchmarking commit sequence.",
+ )
+
+ @staticmethod
+ def func(args: argparse.Namespace) -> None:
+ _setup_common()
+
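+        # `first_commit^^..` spans from first_commit's parent (inclusive) to
+        # HEAD, so first_commit itself has a benchmarked parent to be
+        # compared against.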
+ commit_range = f"{args.first_commit}^^.."
+ asv_command = ASV_HARNESS.format(posargs=commit_range)
+ _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])
+
+ # git rev-list --first-parent is the command ASV uses.
+ git_command = f"git rev-list --first-parent {commit_range}"
+ commit_string = _subprocess_run_print(
+ git_command.split(" "), capture_output=True, text=True
+ ).stdout
+ commit_list = commit_string.rstrip().split("\n")
+ _asv_compare(*reversed(commit_list), overnight_mode=True)
+
+
+class Branch(_SubParserGenerator):
+ name = "branch"
+ description = (
+ "Performs the same operations as ``overnight``, but always on two commits "
+ "only - ``HEAD``, and ``HEAD``'s merge-base with the input "
+ "**base_branch**. Output from this run is never saved to a file. Designed "
+ "for testing if the active branch's changes cause performance shifts - "
+ "anticipating what would be caught by ``overnight`` once merged.\n\n"
+ "**For maximum accuracy, avoid using the machine that is running this "
+ "session. Run time could be >1 hour for the full benchmark suite.**"
+ )
+ epilog = (
+ "e.g. python bm_runner.py branch upstream/main\n"
+ "e.g. python bm_runner.py branch upstream/main --bench=regridding"
+ )
+
+ def add_arguments(self) -> None:
+ self.subparser.add_argument(
+ "base_branch",
+ type=str,
+ help="A branch that has the merge-base with ``HEAD`` - ``HEAD`` will be benchmarked against that merge-base.",
+ )
+
+ @staticmethod
+ def func(args: argparse.Namespace) -> None:
+ _setup_common()
+
+ git_command = f"git merge-base HEAD {args.base_branch}"
+ merge_base = _subprocess_run_print(
+ git_command.split(" "), capture_output=True, text=True
+ ).stdout[:8]
+
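+        # ASV accepts a `HASHFILE:` commit range - a plain-text file listing
+        # one commit reference per line (here: the merge-base, then HEAD).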
+ with NamedTemporaryFile("w") as hashfile:
+ hashfile.writelines([merge_base, "\n", "HEAD"])
+ hashfile.flush()
+ commit_range = f"HASHFILE:{hashfile.name}"
+ asv_command = ASV_HARNESS.format(posargs=commit_range)
+ _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])
+
+ _asv_compare(merge_base, "HEAD")
+
+
+class _CSPerf(_SubParserGenerator, ABC):
+ """Common code used by both CPerf and SPerf."""
+
+ description = (
+ "Run the on-demand {} suite of benchmarks (part of the UK Met "
+ "Office NG-VAT project) for the ``HEAD`` of ``upstream/main`` only, "
+ "and publish the results to the input **publish_dir**, within a "
+ "unique subdirectory for this run."
+ )
+ epilog = (
+ "e.g. python bm_runner.py {0} my_publish_dir\n"
+ "e.g. python bm_runner.py {0} my_publish_dir --bench=regridding"
+ )
+
+ def add_arguments(self) -> None:
+ self.subparser.add_argument(
+ "publish_dir",
+ type=str,
+ help="HTML results will be published to a sub-dir in this dir.",
+ )
+
+ @staticmethod
+ def csperf(
+ args: argparse.Namespace, run_type: Literal["cperf", "sperf"]
+ ) -> None:
+ _setup_common()
+
+ publish_dir = Path(args.publish_dir)
+ if not publish_dir.is_dir():
+ message = (
+ f"Input 'publish directory' is not a directory: {publish_dir}"
+ )
+ raise NotADirectoryError(message)
+ publish_subdir = (
+ publish_dir
+ / f"{run_type}_{datetime.now().strftime('%Y%m%d_%H%M%S')}"
+ )
+ publish_subdir.mkdir()
+
+ # Activate on demand benchmarks (C/SPerf are deactivated for
+ # 'standard' runs).
+ environ["ON_DEMAND_BENCHMARKS"] = "True"
+ commit_range = "upstream/main^!"
+
+ asv_command = (
+ ASV_HARNESS.format(posargs=commit_range) + f" --bench={run_type}"
+ )
+ # C/SPerf benchmarks are much bigger than the CI ones:
+ # Don't fail the whole run if memory blows on 1 benchmark.
+ asv_command = asv_command.replace(" --strict", "")
+ # Only do a single round.
+ asv_command = re.sub(r"rounds=\d", "rounds=1", asv_command)
+ _subprocess_run_asv([*asv_command.split(" "), *args.asv_args])
+
+ asv_command = f"publish {commit_range} --html-dir={publish_subdir}"
+ _subprocess_run_asv(asv_command.split(" "))
+
+ # Print completion message.
+ location = BENCHMARKS_DIR / ".asv"
+ print(
+ f'New ASV results for "{run_type}".\n'
+ f'See "{publish_subdir}",'
+ f'\n or JSON files under "{location / "results"}".'
+ )
+
+
+class CPerf(_CSPerf):
+ name = "cperf"
+ description = _CSPerf.description.format("CPerf")
+ epilog = _CSPerf.epilog.format("cperf")
+
+ @staticmethod
+ def func(args: argparse.Namespace) -> None:
+ _CSPerf.csperf(args, "cperf")
+
+
+class SPerf(_CSPerf):
+ name = "sperf"
+ description = _CSPerf.description.format("SPerf")
+ epilog = _CSPerf.epilog.format("sperf")
+
+ @staticmethod
+ def func(args: argparse.Namespace) -> None:
+ _CSPerf.csperf(args, "sperf")
+
+
+class Custom(_SubParserGenerator):
+ name = "custom"
+ description = (
+ "Run ASV with the input **ASV sub-command**, without any preset "
+ "arguments - must all be supplied by the user. So just like running "
+ "ASV manually, with the convenience of re-using the runner's "
+ "scripted setup steps."
+ )
+ epilog = "e.g. python bm_runner.py custom continuous a1b23d4 HEAD --quick"
+
+ def add_arguments(self) -> None:
+ self.subparser.add_argument(
+ "asv_sub_command",
+ type=str,
+ help="The ASV command to run.",
+ )
+
+ @staticmethod
+ def func(args: argparse.Namespace) -> None:
+ _setup_common()
+ _subprocess_run_asv([args.asv_sub_command, *args.asv_args])
+
+
+def main():
+ parser = ArgumentParser(
+ description="Run the Iris performance benchmarks (using Airspeed Velocity).",
+ epilog="More help is available within each sub-command.",
+ )
+ subparsers = parser.add_subparsers(required=True)
+
+ for gen in (Overnight, Branch, CPerf, SPerf, Custom):
+ _ = gen(subparsers).subparser
+
+ parsed = parser.parse_args()
+ parsed.func(parsed)
+
+
+if __name__ == "__main__":
+ main()
diff --git a/codecov.yml b/codecov.yml
new file mode 100644
index 0000000000..a0efbb9997
--- /dev/null
+++ b/codecov.yml
@@ -0,0 +1,9 @@
+coverage:
+ # see https://docs.codecov.com/docs/commit-status
+ status:
+ project:
+ default:
+ target: auto
+      # coverage can drop by up to <threshold>% while still posting success
+ threshold: 3%
+ patch: off
diff --git a/docs/Makefile b/docs/Makefile
index fcb0ec0116..b6f52f58f9 100644
--- a/docs/Makefile
+++ b/docs/Makefile
@@ -1,39 +1,29 @@
SUBDIRS = src
+help:
+ @for i in $(SUBDIRS); do \
+ echo "make help in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) help); done
+
html:
@for i in $(SUBDIRS); do \
- echo "make html in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html); done
+ echo "make html in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html); done
html-noplot:
@for i in $(SUBDIRS); do \
- echo "make html-noplot in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-noplot); done
+ echo "make html-noplot in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-noplot); done
html-noapi:
@for i in $(SUBDIRS); do \
- echo "make html-noapi in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-noapi); done
+ echo "make html-noapi in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-noapi); done
html-quick:
@for i in $(SUBDIRS); do \
- echo "make html-quick in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-quick); done
-
-all:
- @for i in $(SUBDIRS); do \
- echo "make all in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) all); done
-
-install:
- @for i in $(SUBDIRS); do \
- echo "Installing in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) install); done
-
-build:
- @for i in $(SUBDIRS); do \
- echo "Clearing in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) build); done
+ echo "make html-quick in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-quick); done
clean:
@for i in $(SUBDIRS); do \
@@ -42,8 +32,8 @@ clean:
doctest:
@for i in $(SUBDIRS); do \
- echo "Running doctest in $$i..."; \
- (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) doctest); done
+ echo "Running doctest in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) doctest); done
linkcheck:
@for i in $(SUBDIRS); do \
@@ -55,3 +45,7 @@ show:
echo "Running show in $$i..."; \
(cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) show); done
+livehtml:
+ @for i in $(SUBDIRS); do \
+	echo "Running livehtml in $$i..."; \
+ (cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) livehtml); done
\ No newline at end of file
diff --git a/docs/gallery_code/general/plot_custom_aggregation.py b/docs/gallery_code/general/plot_custom_aggregation.py
index 5fba3669b6..6ef6075fb3 100644
--- a/docs/gallery_code/general/plot_custom_aggregation.py
+++ b/docs/gallery_code/general/plot_custom_aggregation.py
@@ -72,7 +72,7 @@ def main():
# Make an aggregator from the user function.
SPELL_COUNT = Aggregator(
- "spell_count", count_spells, units_func=lambda units: 1
+ "spell_count", count_spells, units_func=lambda units, **kwargs: 1
)
# Define the parameters of the test.
diff --git a/docs/gallery_code/general/plot_projections_and_annotations.py b/docs/gallery_code/general/plot_projections_and_annotations.py
index 2cf42e66e0..c4254ad544 100644
--- a/docs/gallery_code/general/plot_projections_and_annotations.py
+++ b/docs/gallery_code/general/plot_projections_and_annotations.py
@@ -78,7 +78,7 @@ def make_plot(projection_name, projection_crs):
y_points = y_lower + y_delta * np.concatenate(
(zeros, steps, ones, steps[::-1])
)
- # Get the Iris coordinate sytem of the X coordinate (Y should be the same).
+ # Get the Iris coordinate system of the X coordinate (Y should be the same).
cs_data1 = x_coord.coord_system
# Construct an equivalent Cartopy coordinate reference system ("crs").
crs_data1 = cs_data1.as_cartopy_crs()
diff --git a/docs/gallery_code/general/plot_rotated_pole_mapping.py b/docs/gallery_code/general/plot_rotated_pole_mapping.py
index 8a0c80c707..30975a4828 100644
--- a/docs/gallery_code/general/plot_rotated_pole_mapping.py
+++ b/docs/gallery_code/general/plot_rotated_pole_mapping.py
@@ -40,7 +40,7 @@ def main():
plt.gca().coastlines()
iplt.show()
- # Plot #3: Contourf overlayed by coloured point data
+ # Plot #3: Contourf overlaid by coloured point data
plt.figure()
qplt.contourf(air_pressure)
iplt.points(air_pressure, c=air_pressure.data)
diff --git a/docs/gallery_code/meteorology/plot_lagged_ensemble.py b/docs/gallery_code/meteorology/plot_lagged_ensemble.py
index e15aa0e6ef..0639c7ac1d 100644
--- a/docs/gallery_code/meteorology/plot_lagged_ensemble.py
+++ b/docs/gallery_code/meteorology/plot_lagged_ensemble.py
@@ -5,16 +5,16 @@
This example demonstrates the loading of a lagged ensemble dataset from the
GloSea4 model, which is then used to produce two types of plot:
- * The first shows the "postage stamp" style image with an array of 14 images,
- one for each ensemble member with a shared colorbar. (The missing image in
- this example represents ensemble member number 6 which was a failed run)
+* The first shows the "postage stamp" style image with an array of 14 images,
+ one for each ensemble member with a shared colorbar. (The missing image in
+ this example represents ensemble member number 6 which was a failed run)
- * The second plot shows the data limited to a region of interest, in this case
- a region defined for forecasting ENSO (El Nino-Southern Oscillation), which,
- for the purposes of this example, has had the ensemble mean subtracted from
- each ensemble member to give an anomaly surface temperature. In practice a
- better approach would be to take the climatological mean, calibrated to the
- model, from each ensemble member.
+* The second plot shows the data limited to a region of interest, in this case
+ a region defined for forecasting ENSO (El Nino-Southern Oscillation), which,
+ for the purposes of this example, has had the ensemble mean subtracted from
+ each ensemble member to give an anomaly surface temperature. In practice a
+ better approach would be to take the climatological mean, calibrated to the
+ model, from each ensemble member.
"""
@@ -115,7 +115,7 @@ def main():
# Get the time for the entire plot.
time = last_time_coord.units.num2date(last_time_coord.bounds[0, 0])
- # Set a global title for the postage stamps with the date formated by
+ # Set a global title for the postage stamps with the date formatted by
# "monthname year".
time_string = time.strftime("%B %Y")
plt.suptitle(f"Surface temperature ensemble forecasts for {time_string}")
diff --git a/docs/src/IEP/IEP001.adoc b/docs/src/IEP/IEP001.adoc
index d38b2e8478..e43969f3ce 100644
--- a/docs/src/IEP/IEP001.adoc
+++ b/docs/src/IEP/IEP001.adoc
@@ -119,7 +119,7 @@ cube.sel(height=1.5)
The semantics of position-based slices will continue to match that of normal Python slices. The start position is included, the end position is excluded.
-Value-based slices will be stricly inclusive, with both the start and end values included. This behaviour differs from normal Python slices but is in common with pandas.
+Value-based slices will be strictly inclusive, with both the start and end values included. This behaviour differs from normal Python slices but is in common with pandas.
Just as for normal Python slices, we do not need to provide the ability to control the include/exclude behaviour for slicing.
diff --git a/docs/src/Makefile b/docs/src/Makefile
index a75da5371b..8d652878f6 100644
--- a/docs/src/Makefile
+++ b/docs/src/Makefile
@@ -20,27 +20,18 @@ ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
help:
@echo "Please use \`make ' where is one of"
- @echo " html to make standalone HTML files"
- @echo " dirhtml to make HTML files named index.html in directories"
- @echo " singlehtml to make a single large HTML file"
- @echo " pickle to make pickle files"
- @echo " json to make JSON files"
- @echo " htmlhelp to make HTML files and a HTML help project"
- @echo " qthelp to make HTML files and a qthelp project"
- @echo " devhelp to make HTML files and a Devhelp project"
- @echo " epub to make an epub"
- @echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
- @echo " latexpdf to make LaTeX files and run them through pdflatex"
- @echo " text to make text files"
- @echo " man to make manual pages"
- @echo " changes to make an overview of all changed/added/deprecated items"
- @echo " linkcheck to check all external links for integrity"
- @echo " doctest to run all doctests embedded in the documentation (if enabled)"
- @echo " show to open the built documentation in the default browser"
-
-clean:
- -rm -rf $(BUILDDIR)
- -rm -rf $(SRCDIR)/generated
+ @echo " help to view this help"
+ @echo " html to make standalone HTML files"
+ @echo " html-noplot to make standalone HTML files, skip gallery"
+ @echo " html-noapi to make standalone HTML files, skip the API"
+ @echo " html-quick to make standalone HTML files, skip the gallery and API"
+ @echo " clean to remove all built files"
+ @echo " doctest to run all doctests embedded in the documentation (if enabled)"
+ @echo " linkcheck to check all external links for integrity"
+ @echo " show to open the built documentation in the default browser"
+ @echo " livehtml to auto build the docs when any file changes are detected."
+ @echo " You need to install sphinx-autobuild first:"
+ @echo " conda install -c conda-forge sphinx-autobuild"
html:
$(SPHINXBUILD) $(WARNING_TO_ERROR) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@@ -62,94 +53,23 @@ html-quick:
@echo
@echo "Build finished. The HTML (no gallery or api docs) pages are in $(BUILDDIR)/html"
-dirhtml:
- $(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
- @echo
- @echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml"
-
-singlehtml:
- $(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
- @echo
- @echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml"
-
-pickle:
- $(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
- @echo
- @echo "Build finished; now you can process the pickle files"
-
-json:
- $(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
- @echo
- @echo "Build finished; now you can process the JSON files"
-
-htmlhelp:
- $(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
- @echo
- @echo "Build finished; now you can run HTML Help Workshop with the" \
- ".hhp project file in $(BUILDDIR)/htmlhelp."
-
-qthelp:
- $(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
- @echo
- @echo "Build finished; now you can run "qcollectiongenerator" with the" \
- ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
- @echo "# qcollectiongenerator $(BUILDDIR)/qthelp/Iris.qhcp"
- @echo "To view the help file:"
- @echo "# assistant -collectionFile $(BUILDDIR)/qthelp/Iris.qhc"
-
-devhelp:
- $(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
- @echo
- @echo "Build finished."
- @echo "To view the help file:"
- @echo "# mkdir -p $$HOME/.local/share/devhelp/Iris"
- @echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/Iris"
- @echo "# devhelp"
-
-epub:
- $(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
- @echo
- @echo "Build finished. The epub file is in $(BUILDDIR)/epub"
-
-latex:
- $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
- @echo
- @echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
- @echo "Run \`make' in that directory to run these through (pdf)latex" \
- "(use \`make latexpdf' here to do that automatically)."
-
-latexpdf: latex
- $(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
- @echo "Running LaTeX files through pdflatex..."
- make -C $(BUILDDIR)/latex all-pdf
- @echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex"
-
-text:
- $(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
- @echo
- @echo "Build finished. The text files are in $(BUILDDIR)/text."
-
-man:
- $(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
- @echo
- @echo "Build finished. The manual pages are in $(BUILDDIR)/man"
+clean:
+ -rm -rf $(BUILDDIR)
+ -rm -rf $(SRCDIR)/generated
-changes:
- $(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
- @echo
- @echo "The overview file is in $(BUILDDIR)/changes."
+doctest:
+ $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
+ @echo "Testing of doctests in the sources finished, look at the "
+ @echo "results in $(BUILDDIR)/doctest/output.txt."
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
- @echo "Link check complete; look for any errors in the above output " \
- "or in $(BUILDDIR)/linkcheck/output.txt."
-
-doctest:
- $(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
- @echo "Testing of doctests in the sources finished, look at the " \
- "results in $(BUILDDIR)/doctest/output.txt."
+ @echo "Link check complete; look for any errors in the above output "
+ @echo "or in $(BUILDDIR)/linkcheck/output.txt."
show:
@python -c "import webbrowser; webbrowser.open_new_tab('file://$(shell pwd)/$(BUILDDIR)/html/index.html')"
+livehtml:
+ sphinx-autobuild "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) --ignore generated $(O)
\ No newline at end of file
diff --git a/docs/src/_static/icon_api.svg b/docs/src/_static/icon_api.svg
index 841b105973..bf2f8d67bb 100644
--- a/docs/src/_static/icon_api.svg
+++ b/docs/src/_static/icon_api.svg
@@ -13,14 +13,15 @@
 [SVG markup omitted: viewBox reduced from "0 0 511 511" to "0 0 508 511";
 Inkscape/sodipodi editor metadata removed; path fills set explicitly to
 #1b8fb7 and #cadc6d.]
diff --git a/docs/src/_static/iris-logo-title.svg b/docs/src/_static/iris-logo-title.svg
index 5bc38bfbda..98dd1a73d5 100644
--- a/docs/src/_static/iris-logo-title.svg
+++ b/docs/src/_static/iris-logo-title.svg
@@ -1,13 +1,13 @@
 [SVG markup omitted: adds the accessible description "Banner logo for the
 SciTools Iris project - https://github.com/SciTools/iris/" and updates the
 logo's element attributes, including the "Iris" title text.]
diff --git a/docs/src/_static/iris-logo.svg b/docs/src/_static/iris-logo.svg
index 6c4bdb0e5a..fe49411b45 100644
--- a/docs/src/_static/iris-logo.svg
+++ b/docs/src/_static/iris-logo.svg
@@ -1,12 +1,12 @@
 [SVG markup omitted: adds the accessible description "Logo for the SciTools
 Iris project - https://github.com/SciTools/iris/" and updates the logo's
 element attributes.]
diff --git a/docs/src/_static/theme_override.css b/docs/src/_static/theme_override.css
index 326c1d4d4a..355119f8a5 100644
--- a/docs/src/_static/theme_override.css
+++ b/docs/src/_static/theme_override.css
@@ -1,5 +1,5 @@
/* import the standard theme css */
-@import url("css/theme.css");
+@import url("styles/theme.css");
/* now we can add custom css.... */
diff --git a/docs/src/common_links.inc b/docs/src/common_links.inc
index 17278460dd..ba24141d87 100644
--- a/docs/src/common_links.inc
+++ b/docs/src/common_links.inc
@@ -6,6 +6,7 @@
.. _flake8: https://flake8.pycqa.org/en/stable/
.. _.flake8.yml: https://github.com/SciTools/iris/blob/main/.flake8
.. _cirrus-ci: https://cirrus-ci.com/github/SciTools/iris
+.. _codespell: https://github.com/codespell-project/codespell
.. _conda: https://docs.conda.io/en/latest/
.. _contributor: https://github.com/SciTools/scitools.org.uk/blob/master/contributors.json
.. _core developers: https://github.com/SciTools/scitools.org.uk/blob/master/contributors.json
@@ -30,17 +31,19 @@
.. _pull request: https://github.com/SciTools/iris/pulls
.. _pull requests: https://github.com/SciTools/iris/pulls
.. _Read the Docs: https://scitools-iris.readthedocs.io/en/latest/
-.. _readthedocs.yml: https://github.com/SciTools/iris/blob/main/requirements/ci/readthedocs.yml
+.. _readthedocs.yml: https://github.com/SciTools/iris/blob/main/requirements/readthedocs.yml
.. _SciTools: https://github.com/SciTools
.. _scitools-iris: https://pypi.org/project/scitools-iris/
.. _sphinx: https://www.sphinx-doc.org/en/master/
+.. _sphinx-apidoc: https://github.com/sphinx-contrib/apidoc
.. _test-iris-imagehash: https://github.com/SciTools/test-iris-imagehash
.. _using git: https://docs.github.com/en/github/using-git
-.. _requirements/ci/: https://github.com/SciTools/iris/tree/main/requirements/ci
+.. _requirements: https://github.com/SciTools/iris/tree/main/requirements
.. _CF-UGRID: https://ugrid-conventions.github.io/ugrid-conventions/
.. _issues on GitHub: https://github.com/SciTools/iris/issues?q=is%3Aopen+is%3Aissue+sort%3Areactions-%2B1-desc
.. _python-stratify: https://github.com/SciTools/python-stratify
.. _iris-esmf-regrid: https://github.com/SciTools-incubator/iris-esmf-regrid
+.. _netCDF4: https://github.com/Unidata/netcdf4-python
.. comment
diff --git a/docs/src/community/iris_xarray.rst b/docs/src/community/iris_xarray.rst
index 859597da78..2250e3c0a3 100644
--- a/docs/src/community/iris_xarray.rst
+++ b/docs/src/community/iris_xarray.rst
@@ -11,7 +11,7 @@ you can be prepared, and to help you choose the best package for your use case.
Overall Experience
------------------
-Iris is the more specialised package, focussed on making it as easy
+Iris is the more specialised package, focused on making it as easy
as possible to work with meteorological and climatological data. Iris
is built to natively handle many key concepts, such as the CF conventions,
coordinate systems and bounded coordinates. Iris offers a smaller toolkit of
diff --git a/docs/src/conf.py b/docs/src/conf.py
index 576a099b90..b7f87d4ebc 100644
--- a/docs/src/conf.py
+++ b/docs/src/conf.py
@@ -81,7 +81,7 @@ def autolog(message):
# add some sample files from the developers guide..
sys.path.append(os.path.abspath(os.path.join("developers_guide")))
-# why isnt the iris path added to it is discoverable too? We dont need to,
+# why isn't the iris path added so it is discoverable too? We don't need to,
# the sphinext to generate the api rst knows where the source is. If it
# is added then the travis build will likely fail.
@@ -122,7 +122,7 @@ def _dotv(version):
# Automate the discovery of the python versions tested with CI.
python_support = sorted(
- [fname.stem for fname in Path(".").glob("../../requirements/ci/py*.yml")]
+ [fname.stem for fname in Path(".").glob("../../requirements/py*.yml")]
)
if not python_support:
@@ -143,7 +143,7 @@ def _dotv(version):
"""
# Add any Sphinx extension module names here, as strings. They can be
-# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
+# extensions coming with Sphinx (named "sphinx.ext.*") or your custom
# ones.
extensions = [
"sphinx.ext.todo",
@@ -157,7 +157,7 @@ def _dotv(version):
"sphinx.ext.intersphinx",
"sphinx_copybutton",
"sphinx.ext.napoleon",
- "sphinx_panels",
+ "sphinx_design",
"sphinx_gallery.gen_gallery",
"matplotlib.sphinxext.mathmpl",
"matplotlib.sphinxext.plot_directive",
@@ -166,14 +166,8 @@ def _dotv(version):
if skip_api == "1":
autolog("Skipping the API docs generation (SKIP_API=1)")
else:
- # better api documentation (custom)
- extensions.extend(
- ["custom_class_autodoc", "custom_data_autodoc", "generate_package_rst"]
- )
-
-# -- panels extension ---------------------------------------------------------
-# See https://sphinx-panels.readthedocs.io/en/latest/
-panels_add_bootstrap_css = False
+ extensions.extend(["sphinxcontrib.apidoc"])
+ extensions.extend(["api_rst_formatting"])
# -- Napoleon extension -------------------------------------------------------
# See https://sphinxcontrib-napoleon.readthedocs.io/en/latest/sphinxcontrib.napoleon.html
@@ -204,13 +198,36 @@ def _dotv(version):
# api generation configuration
autodoc_member_order = "groupwise"
autodoc_default_flags = ["show-inheritance"]
+
+# https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html#confval-autodoc_typehints
autodoc_typehints = "none"
autosummary_generate = True
autosummary_imported_members = True
autopackage_name = ["iris"]
-autoclass_content = "init"
+autoclass_content = "both"
modindex_common_prefix = ["iris"]
+# -- apidoc extension ---------------------------------------------------------
+# See https://github.com/sphinx-contrib/apidoc
+source_code_root = (Path(__file__).parents[2]).absolute()
+module_dir = source_code_root / "lib"
+apidoc_module_dir = str(module_dir)
+apidoc_output_dir = str(Path(__file__).parent / "generated/api")
+apidoc_toc_file = False
+
+apidoc_excluded_paths = [
+ str(module_dir / "iris/tests"),
+ str(module_dir / "iris/experimental/raster.*"), # gdal conflicts
+]
+
+apidoc_module_first = True
+apidoc_separate_modules = True
+apidoc_extra_args = []
+
+autolog(f"[sphinx-apidoc] source_code_root = {source_code_root}")
+autolog(f"[sphinx-apidoc] apidoc_excluded_paths = {apidoc_excluded_paths}")
+autolog(f"[sphinx-apidoc] apidoc_output_dir = {apidoc_output_dir}")
+
# Add any paths that contain templates here, relative to this directory.
templates_path = ["_templates"]
@@ -218,6 +235,7 @@ def _dotv(version):
# See https://www.sphinx-doc.org/en/master/usage/extensions/intersphinx.html
intersphinx_mapping = {
"cartopy": ("https://scitools.org.uk/cartopy/docs/latest/", None),
+ "dask": ("https://docs.dask.org/en/stable/", None),
"matplotlib": ("https://matplotlib.org/stable/", None),
"numpy": ("https://numpy.org/doc/stable/", None),
"python": ("https://docs.python.org/3/", None),
@@ -239,11 +257,11 @@ def _dotv(version):
# See https://www.sphinx-doc.org/en/master/usage/extensions/extlinks.html
extlinks = {
- "issue": ("https://github.com/SciTools/iris/issues/%s", "Issue #"),
- "pull": ("https://github.com/SciTools/iris/pull/%s", "PR #"),
+ "issue": ("https://github.com/SciTools/iris/issues/%s", "Issue #%s"),
+ "pull": ("https://github.com/SciTools/iris/pull/%s", "PR #%s"),
"discussion": (
"https://github.com/SciTools/iris/discussions/%s",
- "Discussion #",
+ "Discussion #%s",
),
}
@@ -256,7 +274,6 @@ def _dotv(version):
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
-html_logo = "_static/iris-logo-title.svg"
html_favicon = "_static/iris-logo.svg"
html_theme = "pydata_sphinx_theme"
@@ -272,7 +289,8 @@ def _dotv(version):
# See https://pydata-sphinx-theme.readthedocs.io/en/latest/user_guide/configuring.html
html_theme_options = {
- "footer_items": ["copyright", "sphinx-version", "custom_footer"],
+ "footer_start": ["copyright", "sphinx-version"],
+ "footer_end": ["custom_footer"],
"collapse_navigation": True,
"navigation_depth": 3,
"show_prev_next": True,
@@ -299,9 +317,14 @@ def _dotv(version):
],
"use_edit_page_button": True,
"show_toc_level": 1,
- # Omitted `theme-switcher` below to disable it
+ # Omit `theme-switcher` from navbar_end below to disable it
# Info: https://pydata-sphinx-theme.readthedocs.io/en/stable/user_guide/light-dark.html#configure-default-theme-mode
- "navbar_end": ["navbar-icon-links"],
+ # "navbar_end": ["navbar-icon-links"],
+ # https://pydata-sphinx-theme.readthedocs.io/en/v0.11.0/user_guide/branding.html#different-logos-for-light-and-dark-mode
+ "logo": {
+ "image_light": "_static/iris-logo-title.svg",
+ "image_dark": "_static/iris-logo-title-dark.svg",
+ },
}
rev_parse = run(["git", "rev-parse", "--short", "HEAD"], capture_output=True)
@@ -315,7 +338,7 @@ def _dotv(version):
"doc_path": "docs/src",
# default theme. Also disabled the button in the html_theme_options.
# Info: https://pydata-sphinx-theme.readthedocs.io/en/stable/user_guide/light-dark.html#configure-default-theme-mode
- "default_mode": "light",
+ "default_mode": "auto",
# custom
"on_rtd": on_rtd,
"rtd_version": rtd_version,
@@ -332,13 +355,14 @@ def _dotv(version):
html_static_path = ["_static"]
html_style = "theme_override.css"
-# this allows for using datatables: https://datatables.net/
+# this allows for using datatables: https://datatables.net/.
+# the version can be manually upgraded by changing the urls below.
html_css_files = [
- "https://cdn.datatables.net/1.10.23/css/jquery.dataTables.min.css",
+ "https://cdn.datatables.net/1.13.4/css/jquery.dataTables.min.css",
]
html_js_files = [
- "https://cdn.datatables.net/1.10.23/js/jquery.dataTables.min.js",
+ "https://cdn.datatables.net/1.13.4/js/jquery.dataTables.min.js",
]
# url link checker. Some links work but report as broken, lets ignore them.
@@ -386,13 +410,8 @@ def _dotv(version):
}
# -----------------------------------------------------------------------------
-# Remove matplotlib agg warnings from generated doc when using plt.show
-warnings.filterwarnings(
- "ignore",
- category=UserWarning,
- message="Matplotlib is currently using agg, which is a"
- " non-GUI backend, so cannot show the figure.",
-)
+# Remove warnings
+warnings.filterwarnings("ignore")
# -- numfig options (built-in) ------------------------------------------------
# Enable numfig.
diff --git a/docs/src/developers_guide/contributing_benchmarks.rst b/docs/src/developers_guide/contributing_benchmarks.rst
index 65bc9635b6..b6d005df68 100644
--- a/docs/src/developers_guide/contributing_benchmarks.rst
+++ b/docs/src/developers_guide/contributing_benchmarks.rst
@@ -22,10 +22,11 @@ requests, instead it is **run overnight against each of the commits of the
previous day** to check if any commit has introduced performance shifts.
Detected shifts are reported in a new Iris GitHub issue.
+.. _on_demand_pr_benchmark:
+
If a pull request author/reviewer suspects their changes may cause performance
-shifts, a convenience is available (currently via Nox) to replicate the
-overnight benchmark run but comparing the current ``HEAD`` with a requested
-branch (e.g. ``upstream/main``). Read more in `benchmarks/README.md`_.
+shifts, they can manually order their pull request to be benchmarked by adding
+the ``benchmark_this`` label to the PR. Read more in `benchmarks/README.md`_.
Other Uses
----------
@@ -38,8 +39,8 @@ applications.
* Results for a series of commits can be visualised for an intuitive
understanding of when and why changes occurred.
- .. image:: asv_example_images/commits.png
- :width: 300
+ .. image:: asv_example_images/commits.png
+ :width: 300
* Parameterised benchmarks make it easy to visualise:
diff --git a/docs/src/developers_guide/contributing_ci_tests.rst b/docs/src/developers_guide/contributing_ci_tests.rst
index 1d06434843..bfe80cd760 100644
--- a/docs/src/developers_guide/contributing_ci_tests.rst
+++ b/docs/src/developers_guide/contributing_ci_tests.rst
@@ -48,18 +48,18 @@ GitHub Actions Test Environment
-------------------------------
The CI test environments for our GHA is determined from the requirement files
-in ``requirements/ci/pyXX.yml``. These are conda environment files list the top-level
+in ``requirements/pyXX.yml``. These conda environment files list the top-level
package dependencies for running and testing Iris.
For reproducible test results, these environments are resolved for all their dependencies
-and stored as conda lock files in the ``requirements/ci/nox.lock`` directory. The test environments
+and stored as conda lock files in the ``requirements/locks`` directory. The test environments
will not resolve the dependencies each time, instead they will use the lock files to reproduce the
exact same environment each time.
**If you have updated the requirement YAML files with new dependencies, you will need to
generate new lock files.** To do this, run the command::
- python tools/update_lockfiles.py -o requirements/ci/nox.lock requirements/ci/py*.yml
+ python tools/update_lockfiles.py -o requirements/locks requirements/py*.yml
or simply::
@@ -67,21 +67,14 @@ or simply::
and add the changed lockfiles to your pull request.
-.. note::
-
- If your installation of conda runs through Artifactory or another similar
- proxy then you will need to amend that lockfile to use URLs that Github
- Actions can access. A utility to strip out Artifactory exists in the
- ``ssstack`` tool.
-
New lockfiles are generated automatically each week to ensure that Iris continues to be
tested against the latest available version of its dependencies.
-Each week the yaml files in ``requirements/ci`` are resolved by a GitHub Action.
+Each week the yaml files in ``requirements`` are resolved by a GitHub Action.
If the resolved environment has changed, a pull request is created with the new lock files.
The CI test suite will run on this pull request. If the tests fail, a developer
-will need to create a new branch based off the ``auto-update-lockfiles`` branch
+will need to create a new branch based off the ``auto-update-lockfiles`` branch
and add the required fixes to this new branch. If the fixes are made to the
-``auto-update-lockfiles`` branch these will be overwritten the next time the
+``auto-update-lockfiles`` branch these will be overwritten the next time the
Github Action is run.
@@ -117,6 +110,14 @@ pull-requests given the `Iris`_ GitHub repository `.pre-commit-config.yaml`_.
See the `pre-commit.ci dashboard`_ for details of recent past and active Iris jobs.
+.. note::
+
+ The `codespell`_ ``pre-commit`` hook checks the spelling of the whole codebase
+ and documentation. This hook is configured in the ``[tool.codespell]`` section
+ of the ``pyproject.toml`` file.
+
+ Append to the ``ignore-words-list`` option any **valid words** that are
+ considered **not** a typo and should **not** be corrected by `codespell`_.
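+
+   For example (an illustrative sketch - the ignore list shown is
+   hypothetical)::
+
+      [tool.codespell]
+      ignore-words-list = "someword,otherword"
+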
.. _.pre-commit-config.yaml: https://github.com/SciTools/iris/blob/main/.pre-commit-config.yaml
.. _pre-commit.ci dashboard: https://results.pre-commit.ci/repo/github/5312648
diff --git a/docs/src/developers_guide/contributing_documentation_full.rst b/docs/src/developers_guide/contributing_documentation_full.rst
index a470def683..df850cb2c4 100755
--- a/docs/src/developers_guide/contributing_documentation_full.rst
+++ b/docs/src/developers_guide/contributing_documentation_full.rst
@@ -43,7 +43,7 @@ achieved via::
make html-noplot
-Another option is to skip the :ref:`iris` documentation creation. This can be
+Another option is to skip the :doc:`../generated/api/iris` documentation creation. This can be
useful as it reduces the time to build the documentation, however you may have
some build warnings as there may be references to the API documentation.
This can be achieved via::
@@ -51,8 +51,8 @@ This can be achieved via::
make html-noapi
You can combine both the above and skip the
-:ref:`contributing.documentation.gallery` and :ref:`iris` documentation
-completely. This can be achieved via::
+:ref:`contributing.documentation.gallery` and :doc:`../generated/api/iris`
+documentation completely. This can be achieved via::
make html-quick
diff --git a/docs/src/developers_guide/contributing_graphics_tests.rst b/docs/src/developers_guide/contributing_graphics_tests.rst
index 7964c008c5..1e42c35ae6 100644
--- a/docs/src/developers_guide/contributing_graphics_tests.rst
+++ b/docs/src/developers_guide/contributing_graphics_tests.rst
@@ -36,9 +36,9 @@ The results of the failing image tests will now be available in
Reviewing Failing Tests
-----------------------
-#. Run ``iris/lib/iris/tests/graphics/idiff.py`` with python, e.g.:
+#. Run ``iris/lib/iris/tests/graphics/idiff.py`` with python, e.g.::
- python idiff.py
+ python idiff.py
This will open a window for you to visually inspect
side-by-side **old**, **new** and **difference** images for each failed
diff --git a/docs/src/developers_guide/contributing_pull_request_checklist.rst b/docs/src/developers_guide/contributing_pull_request_checklist.rst
index 57bc9fd728..11d68ace46 100644
--- a/docs/src/developers_guide/contributing_pull_request_checklist.rst
+++ b/docs/src/developers_guide/contributing_pull_request_checklist.rst
@@ -6,12 +6,12 @@ Pull Request Checklist
======================
All pull requests will be reviewed by a core developer who will manage the
-process of merging. It is the responsibility of a developer submitting a
+process of merging. It is the responsibility of the contributor submitting a
pull request to do their best to deliver a pull request which meets the
requirements of the project it is submitted to.
-The check list summarises criteria which will be checked before a pull request
-is merged. Before submitting a pull request please consider this list.
+This checklist summarises criteria which will be checked before a pull request
+is merged. Before submitting a pull request please consider the following:
#. **Provide a helpful description** of the Pull Request. This should include:
@@ -29,7 +29,7 @@ is merged. Before submitting a pull request please consider this list.
#. **Check all modified and new source files conform to the required**
:ref:`code_formatting`.
-#. **Check all new dependencies added to the** `requirements/ci/`_ **yaml
+#. **Check all new dependencies added to the** `requirements`_ **yaml
files.** If dependencies have been added then new nox testing lockfiles
should be generated too, see :ref:`gha_test_env`.
diff --git a/docs/src/developers_guide/documenting/docstrings.rst b/docs/src/developers_guide/documenting/docstrings.rst
index eeefc71e40..86f2c839c1 100644
--- a/docs/src/developers_guide/documenting/docstrings.rst
+++ b/docs/src/developers_guide/documenting/docstrings.rst
@@ -6,7 +6,8 @@ Docstrings
Every public object in the Iris package should have an appropriate docstring.
This is important as the docstrings are used by developers to understand
-the code and may be read directly in the source or via the :ref:`Iris`.
+the code and may be read directly in the source or via the
+:doc:`../../generated/api/iris`.
.. note::
As of April 2022 we are looking to adopt `numpydoc`_ strings as standard.
diff --git a/docs/src/developers_guide/documenting/whats_new_contributions.rst b/docs/src/developers_guide/documenting/whats_new_contributions.rst
index aa19722a69..82569e57a0 100644
--- a/docs/src/developers_guide/documenting/whats_new_contributions.rst
+++ b/docs/src/developers_guide/documenting/whats_new_contributions.rst
@@ -53,8 +53,8 @@ merge from trunk.
Writing a Contribution
======================
-As introduced above, a contribution is the description of a change to Iris
-which improved Iris in some way. As such, a single Iris Pull Request may
+A contribution is the short description of a change introduced to Iris
+which improved it in some way. As such, a single Iris Pull Request may
contain multiple changes that are worth highlighting as contributions to the
what's new document.
@@ -67,23 +67,31 @@ exceed **column 80** and ensure that any subsequent lines of the same entry are
aligned with the first. The content should target an Iris user as the audience.
The required content, in order, is as follows:
-* Names of those who contributed the change. These should be their GitHub
- user name. Link the name to their GitHub profile. E.g.
- ```@tkknight <https://github.com/tkknight>`_ changed...``
+* Use your discretion to decide on the names of all those that you want to
+ acknowledge as part of your contribution. Also consider the efforts of the
+ reviewer. Please use GitHub user names that link to their GitHub profile
+ e.g.,
- * Bigger changes take a lot of effort to review, too! Make sure you credit
- the reviewer(s) where appropriate.
+ ```@tkknight`_ Lorem ipsum dolor sit amet ...``
-* The new/changed behaviour
+ Also add a full reference in the following section at the end of the ``latest.rst``::
+
+ .. comment
+ Whatsnew author names (@github name) in alphabetical order. Note that,
+ core dev names are automatically included by the common_links.inc:
+
+ .. _@tkknight: https://github.com/tkknight
+
+* A succinct summary of the new/changed behaviour.
* Context to the change. Possible examples include: what this fixes, why
something was added, issue references (e.g. ``:issue:`9999```), more specific
detail on the change itself.
-* Pull request references, bracketed, following the final period. E.g.
+* Pull request references, bracketed, following the final period e.g.,
``(:pull:`1111`, :pull:`9999`)``
-* A trailing blank line (standard reStructuredText list format)
+* A trailing blank line (standard reStructuredText list format).
For example::
diff --git a/docs/src/developers_guide/gitwash/configure_git.rst b/docs/src/developers_guide/gitwash/configure_git.rst
index fd8218487f..564ae51820 100644
--- a/docs/src/developers_guide/gitwash/configure_git.rst
+++ b/docs/src/developers_guide/gitwash/configure_git.rst
@@ -144,7 +144,7 @@ and it gives graph / text output something like this (but with color!)::
| * 4aff2a8 - fixed bug 35, and added a test in test_bugfixes (2 weeks ago) [Hugo]
|/
* a7ff2e5 - Added notes on discussion/proposal made during Data Array Summit. (2 weeks ago) [Corran Webster]
- * 68f6752 - Initial implimentation of AxisIndexer - uses 'index_by' which needs to be changed to a call on an Axes object - this is all very sketchy right now. (2 weeks ago) [Corr
+ * 68f6752 - Initial implementation of AxisIndexer - uses 'index_by' which needs to be changed to a call on an Axes object - this is all very sketchy right now. (2 weeks ago) [Corr
* 376adbd - Merge pull request #46 from terhorst/main (2 weeks ago) [Jonathan Terhorst]
|\
| * b605216 - updated joshu example to current api (3 weeks ago) [Jonathan Terhorst]
diff --git a/docs/src/developers_guide/release.rst b/docs/src/developers_guide/release.rst
index bae77a7d21..97d7918148 100644
--- a/docs/src/developers_guide/release.rst
+++ b/docs/src/developers_guide/release.rst
@@ -7,47 +7,112 @@ Releases
A release of Iris is a `tag on the SciTools/Iris`_ Github repository.
-The summary below is of the main areas that constitute the release. The final
-section details the :ref:`iris_development_releases_steps` to take.
+Below is :ref:`iris_development_releases_steps`, followed by some prose on the
+main areas that constitute the release.
+
+
+.. _iris_development_releases_steps:
+
+How to Create an Iris Release
+-----------------------------
+
+The step-by-step process is walked through by a script at:
+``/tools/release_do_nothing.py``, and also available here:
+:doc:`release_do_nothing`.
.. _release_manager:
Release Manager
---------------
+
A Release Manager will be nominated for each release of Iris. This role involves:
* deciding which features and bug fixes should be included in the release
-* managing the project board for the release
+* managing the `GitHub Projects`_ board for the release
+* using :discussion:`GitHub Discussion releases category <categories/releases>`
for documenting intent and capturing any
discussion about the release
+* holding a developer retrospective post release, to look for potential
+ future improvements
The Release Manager will make the release, ensuring that all the steps outlined
on this page are completed.
+Versioning
+----------
+
+Iris' version numbers conform to `Semantic Versioning`_ (``MAJOR.MINOR.PATCH``)
+and `PEP 440`_.
+
+Iris uses `setuptools-scm`_ to automatically manage versioning based on Git
+tags. No manual versioning work is required within the files themselves.
+
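+As a minimal, illustrative sketch (the version string shown here is
+hypothetical), the tag-derived version can be inspected from Python::
+
+    >>> import iris
+    >>> iris.__version__  # resolved from the latest Git tag by setuptools-scm
+    '1.9.0'
+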
+
+Release Candidate
+-----------------
+
+Prior to a release, a release candidate tag may be created, marked as a
+pre-release in GitHub, with a tag ending with :literal:`rc` followed by a
+number (0-based), e.g.:
+
+ :literal:`v1.9.0rc0`
+
+If created, the pre-release shall be available for a minimum of 2 weeks
+prior to the release being cut. However, a 4 week period should be the goal
+to allow user groups to be notified of the existence of the pre-release and
+encouraged to test the functionality.
+
+A pre-release is expected for a major or minor release, but not for a
+patch release.
+
+If new features are required for a release after a release candidate has been
+cut, a new pre-release shall be issued first.
+
+Release candidates are made available as a conda package on the
+`conda-forge Anaconda channel`_ using the `rc_iris`_ label. This is achieved via
+the `conda-forge iris-feedstock`_ following `CFEP-05`_. For further information
+see the `conda-forge User Documentation`_.
+
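+As an illustrative aside (not a release step), the ``rc`` tag scheme can be
+sanity-checked against `PEP 440`_ using the third-party ``packaging``
+library::
+
+    >>> from packaging.version import Version
+    >>> v = Version("v1.9.0rc0")  # the leading "v" is normalised away
+    >>> v.is_prerelease, v.pre
+    (True, ('rc', 0))
+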
+
+Patch Releases
+--------------
+
+Patch releases may be implemented to fix problems with previous major or minor
+releases. E.g. ``v1.9.1`` to fix a problem in ``v1.9.0``, both being part of
+the ``v1.9`` series.
+
+New features shall not be included in a patch release; these are for bug fixes.
+
+A patch release does not require a release candidate, but the rest of the
+release process is to be followed.
+
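+Using the same illustrative ``packaging`` sketch as above, patch versions
+order as expected under `PEP 440`_::
+
+    >>> from packaging.version import Version
+    >>> Version("1.9.1") > Version("1.9.0") > Version("1.9.0rc0")
+    True
+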
+
Before Release
--------------
Deprecations
~~~~~~~~~~~~
-Ensure that any behaviour which has been deprecated for the correct number of
+Any behaviour which has been deprecated for the correct number of
previous releases is now finally changed. More detail, including the correct
number of releases, is in :ref:`iris_development_deprecations`.
Standard Names
~~~~~~~~~~~~~~
-Update the file ``etc/cf-standard-name-table.xml`` to the latest CF standard names,
+The file ``etc/cf-standard-name-table.xml`` is updated to the latest CF standard names,
from the `latest CF standard names`_.
(This is used during build to automatically generate the source file
``lib/iris/std_names.py``.)
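
As a quick illustration (assuming a built Iris; the ``STD_NAMES`` mapping name
and its layout are assumptions based on the generated module)::

    >>> from iris.std_names import STD_NAMES  # generated at build time
    >>> STD_NAMES["air_temperature"]["canonical_units"]
    'K'
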
+The Release
+-----------
+
Release Branch
---------------
+~~~~~~~~~~~~~~
Once the features intended for the release are on ``main``, a release branch
should be created in the ``SciTools/iris`` repository. This will have the name:
@@ -61,73 +126,93 @@ for example:
This branch shall be used to finalise the release details in preparation for
the release candidate.
+Changes for a **patch release** should target the same release branch as the
+rest of the series. For example, a fix
+for a problem with the ``v1.9.0`` release will be merged into ``v1.9.x`` release
+branch, and then released with the tag ``v1.9.1``.
-Release Candidate
------------------
+Documentation
+~~~~~~~~~~~~~
-Prior to a release, a release candidate tag may be created, marked as a
-pre-release in GitHub, with a tag ending with :literal:`rc` followed by a
-number (0-based), e.g.,:
+The documentation should include a dedicated What's New file for this release
+series (e.g. ``v1.9.rst``), incorporating all of the What's New entries for the release.
+This content should be reviewed and adapted as required, including highlights
+at the top of the What's New document.
- :literal:`v1.9.0rc0`
+What's New entries for **patch releases** should be added to the existing file
+for that release series (e.g. a ``v1.9.1`` section in the ``v1.9.rst`` file).
-If created, the pre-release shall be available for a minimum of two weeks
-prior to the release being cut. However a 4 week period should be the goal
-to allow user groups to be notified of the existence of the pre-release and
-encouraged to test the functionality.
+A template for What's New formatting can be found in the
+``docs/src/whatsnew/latest.rst.template`` file.
-A pre-release is expected for a major or minor release, but not for a
-point release.
-If new features are required for a release after a release candidate has been
-cut, a new pre-release shall be issued first.
+Tagging
+~~~~~~~
-Make the release candidate available as a conda package on the
-`conda-forge Anaconda channel`_ using the `rc_iris`_ label. To do this visit
-the `conda-forge iris-feedstock`_ and follow `CFEP-05`_. For further information
-see the `conda-forge User Documentation`_.
+Once all checks are complete, the release is published from the release
+branch - via the GitHub release functionality in the ``SciTools/iris``
+repository - which simultaneously creates a Git tag for the release.
-Documentation
--------------
+Post Release
+------------
-The documentation should include all of the ``whatsnew`` entries for the release.
-This content should be reviewed and adapted as required.
+PyPI
+~~~~
+Iris is available on PyPI as ``scitools-iris``.
-Steps to achieve this can be found in the :ref:`iris_development_releases_steps`.
+Iris' Continuous-Integration (CI) includes the automatic building and publishing of
+PyPI artifacts in a dedicated GitHub Action.
+Legacy manual instructions are appended to this page for reference purposes
+(:ref:`update_pypi`).
-The Release
------------
+conda-forge
+~~~~~~~~~~~
+
+Iris is available on conda-forge as ``iris``.
-The final steps of the release are to ensure that the release date and details
-are correct in the relevant ``whatsnew`` page within the documentation.
+This is managed via the Iris conda recipe on the
+`conda-forge iris-feedstock`_, which is updated after the release is cut on
+GitHub, followed by automatic build and publish of the
+conda package on the `conda-forge Anaconda channel`_.
-There is no need to update the ``iris.__version__``, as this is managed
-automatically by `setuptools-scm`_.
+Announcement
+~~~~~~~~~~~~
-Once all checks are complete, the release is published on GitHub by
-creating a new tag in the ``SciTools/iris`` repository.
+Iris uses Twitter (`@scitools_iris`_) to announce new releases, as well as any
+internal message boards that are accessible (e.g. at the UK Met Office).
+Announcements usually include a highlighted feature to hook readers' attention.
+Citation
+~~~~~~~~
-Update conda-forge
-------------------
+``docs/src/userguide/citation.rst`` is updated to include
+the latest [non-release-candidate] version, date and `Zenodo DOI`_
+of the new release. Ideally this would be updated before the release, but
+the DOI for the new version is only available once the release has been
+created in GitHub.
+
+Merge Back
+~~~~~~~~~~
+
+After any release is published, **including patch releases**, the changes from the
+release branch should be merged back onto the ``SciTools/iris`` ``main`` branch.
-Once a release is cut on GitHub, update the Iris conda recipe on the
-`conda-forge iris-feedstock`_ for the release. This will build and publish the
-conda package on the `conda-forge Anaconda channel`_.
+Appendices
+----------
.. _update_pypi:
-Update PyPI
------------
+Updating PyPI Manually
+~~~~~~~~~~~~~~~~~~~~~~
.. note::
As part of our Continuous-Integration (CI), the building and publishing of
PyPI artifacts is now automated by a dedicated GitHub Action.
-
+
The following instructions **no longer** need to be performed manually,
but remain part of the documentation for reference purposes only.
@@ -138,18 +223,18 @@ To do this perform the following steps.
Create a conda environment with the appropriate conda packages to build the
source distribution (``sdist``) and pure Python wheel (``bdist_wheel``)::
- > conda create -n iris-pypi -c conda-forge --yes pip python setuptools twine wheel
+ > conda create -n iris-pypi -c conda-forge --yes build twine
> . activate iris-pypi
Checkout the appropriate Iris ``<release>`` tag from the appropriate ``<remote>``.
For example, to checkout tag ``v1.0`` from ``upstream``::
> git fetch upstream --tags
- > git checkout v1.0
+ > git checkout v1.0
Build the source distribution and wheel from the Iris root directory::
- > python setup.py sdist bdist_wheel
+ > python -m build
The ``./dist`` directory should now be populated with the source archive
``.tar.gz`` file and the built distribution ``.whl`` file.
@@ -173,9 +258,8 @@ Ensure that the artifacts are successfully uploaded and available on
from PyPI::
> conda deactivate
- > conda env create --file ./requrements/ci/iris.yml
+ > conda env create --file ./requirements/iris.yml
> . activate iris-dev
- > conda install -c conda-forge pip
> python -m pip install --no-deps scitools-iris
For further details on how to test Iris, see :ref:`developer_running_tests`.
@@ -185,105 +269,6 @@ For further details on how to test Iris, see :ref:`developer_running_tests`.
For further information on packaging and uploading a project to PyPI, please
refer to `Generating Distribution Archives`_ and `Packaging Your Project`_.
-
-Merge Back
-----------
-
-After the release is published, the changes from the release branch should be merged
-back onto the ``SciTools/iris`` ``main`` branch.
-
-To achieve this, first cut a local branch from the latest ``main`` branch,
-and `git merge` the :literal:`.x` release branch into it. Ensure that the
-``docs/src/whatsnew/index.rst`` and ``docs/src/whatsnew/latest.rst`` are
-correct, before committing these changes and then proposing a pull-request
-on the ``main`` branch of ``SciTools/iris``.
-
-
-Point Releases
---------------
-
-Bug fixes may be implemented and targeted on the :literal:`.x` release branch.
-These should lead to a new point release, and another tag. For example, a fix
-for a problem with the ``v1.9.0`` release will be merged into ``v1.9.x`` release
-branch, and then released by tagging ``v1.9.1``.
-
-New features shall not be included in a point release, these are for bug fixes.
-
-``whatsnew`` entries should be added to the existing
-``docs/src/whatsnew/v1.9.rst`` file in a new ``v1.9.1`` section. A template for
-this bugfix patches section can be found in the
-``docs/src/whatsnew/latest.rst.template`` file.
-
-A point release does not require a release candidate, but the rest of the
-release process is to be followed, including the merge back of changes into
-``main``.
-
-
-.. _iris_development_releases_steps:
-
-Maintainer Steps
-----------------
-
-These steps assume a release for ``1.9.0`` is to be created.
-
-Release Steps
-~~~~~~~~~~~~~
-
-#. Update the ``whatsnew`` for the release:
-
- * Use ``git`` to rename ``docs/src/whatsnew/latest.rst`` to the release
- version file ``v1.9.rst``
- * Use ``git`` to delete the ``docs/src/whatsnew/latest.rst.template`` file
- * In ``v1.9.rst`` remove the ``[unreleased]`` caption from the page title.
- Replace this with ``[release candidate]`` for the release candidate and
- remove this for the actual release.
- Note that, the Iris version and release date are updated automatically
- when the documentation is built
- * Review the file for correctness
- * Work with the development team to populate the ``Release Highlights``
- dropdown at the top of the file, which provides extra detail on notable
- changes
- * Use ``git`` to add and commit all changes, including removal of
- ``latest.rst.template``.
-
-#. Update the ``whatsnew`` index ``docs/src/whatsnew/index.rst``
-
- * Remove the reference to ``latest.rst``
- * Add a reference to ``v1.9.rst`` to the top of the list
-
-#. Check your changes by building the documentation and reviewing
-#. Once all the above steps are complete, the release is cut, using
- the :guilabel:`Draft a new release` button on the
- `Iris release page `_
- and targeting the release branch if it exists
-#. Create the release feature branch ``v1.9.x`` on `SciTools/iris`_ if it doesn't
- already exist. For point/bugfix releases use the branch which already exists
-
-
-Post Release Steps
-~~~~~~~~~~~~~~~~~~
-
-#. Check the documentation has built on `Read The Docs`_. The build is
- triggered by any commit to ``main``. Additionally check that the versions
- available in the pop out menu in the bottom right corner include the new
- release version. If it is not present you will need to configure the
- versions available in the **admin** dashboard in `Read The Docs`_.
-#. Review the `Active Versions`_ for the ``scitools-iris`` project on
- `Read The Docs`_ to ensure that the appropriate versions are ``Active``
- and/or ``Hidden``. To do this ``Edit`` the appropriate version e.g.,
- see `Editing v3.0.0rc0`_ (must be logged into Read the Docs).
-#. Merge back to ``main``. This should be done after all releases, including
- the release candidate, and also after major changes to the release branch.
-#. On main, make a new ``latest.rst`` from ``latest.rst.template`` and update
- the include statement and the toctree in ``index.rst`` to point at the new
- ``latest.rst``.
-#. Consider updating ``docs/src/userguide/citation.rst`` on ``main`` to include
- the version number, date and `Zenodo DOI `_
- of the new release. Ideally this would be updated before the release, but
- the DOI for the new version is only available once the release has been
- created in GitHub.
-
-
.. _SciTools/iris: https://github.com/SciTools/iris
.. _tag on the SciTools/Iris: https://github.com/SciTools/iris/releases
.. _conda-forge Anaconda channel: https://anaconda.org/conda-forge/iris
@@ -295,5 +280,10 @@ Post Release Steps
.. _rc_iris: https://anaconda.org/conda-forge/iris/labels
.. _Generating Distribution Archives: https://packaging.python.org/tutorials/packaging-projects/#generating-distribution-archives
.. _Packaging Your Project: https://packaging.python.org/guides/distributing-packages-using-setuptools/#packaging-your-project
-.. _latest CF standard names: http://cfconventions.org/standard-names.html
+.. _latest CF standard names: http://cfconventions.org/Data/cf-standard-names/current/src/cf-standard-name-table.xml
.. _setuptools-scm: https://github.com/pypa/setuptools_scm
+.. _Semantic Versioning: https://semver.org/
+.. _PEP 440: https://peps.python.org/pep-0440/
+.. _@scitools_iris: https://twitter.com/scitools_iris
+.. _GitHub Projects: https://github.com/SciTools/iris/projects
+.. _Zenodo DOI: https://doi.org/10.5281/zenodo.595182
diff --git a/docs/src/developers_guide/release_do_nothing.rst b/docs/src/developers_guide/release_do_nothing.rst
new file mode 100644
index 0000000000..1f72827184
--- /dev/null
+++ b/docs/src/developers_guide/release_do_nothing.rst
@@ -0,0 +1,12 @@
+:orphan:
+
+Release Do-Nothing Script
+-------------------------
+
+Rendered from the original ``/tools/release_do_nothing.py``.
+
+`Read more about do-nothing scripts
+`_
+
+.. literalinclude:: ../../../tools/release_do_nothing.py
+ :language: python
diff --git a/docs/src/further_topics/metadata.rst b/docs/src/further_topics/metadata.rst
index 4c55047d4c..a564b2ba68 100644
--- a/docs/src/further_topics/metadata.rst
+++ b/docs/src/further_topics/metadata.rst
@@ -120,7 +120,7 @@ For example, given the following :class:`~iris.cube.Cube`,
forecast_reference_time 1859-09-01 06:00:00
height 1.5 m
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'A1B'
diff --git a/docs/src/further_topics/ugrid/images/data_structured_grid.svg b/docs/src/further_topics/ugrid/images/data_structured_grid.svg
index 2f3a1ce342..28f088bd71 100644
--- a/docs/src/further_topics/ugrid/images/data_structured_grid.svg
+++ b/docs/src/further_topics/ugrid/images/data_structured_grid.svg
@@ -1 +1,1374 @@
[SVG markup elided: the one-line SVG source is rewritten as multi-line Inkscape SVG. Recoverable figure text: x/y "Coordinate Arrays" and "Bounds Arrays" panels; values "23, 28" and "-19, -21"; "derive point locations"; "derive area locations & shapes"; "assign data using dimensional indices, position in array == relative spatial position"; "Point Data"; "Area Data"; "Data Array" ("bounded coords always have points too"); "my_variable"; footnote "* x+y are not lons+lats, just a demonstration!".]
diff --git a/docs/src/further_topics/ugrid/images/data_ugrid_mesh.svg b/docs/src/further_topics/ugrid/images/data_ugrid_mesh.svg
index ab7302346b..c2b822fbcf 100644
--- a/docs/src/further_topics/ugrid/images/data_ugrid_mesh.svg
+++ b/docs/src/further_topics/ugrid/images/data_ugrid_mesh.svg
@@ -1 +1,2273 @@
[SVG markup elided: the one-line SVG source is rewritten as multi-line Inkscape SVG. Recoverable figure text: node indices "5, 7, 8, 14"; x/y "node_coordinates" ("every node has its own x + y coordinates"); "derive node locations"; "construct faces by connecting nodes"; "derive 'corner' node locations"; "assign data using 1D indexing, position in array unrelated to spatial position"; "match indices with faces"; "match indices with nodes"; "Node Data"; "Face Data"; "Data Array"; "my_variable"; "face_node_connectivity" ("face_nodes", 12 × 4).]
diff --git a/docs/src/further_topics/ugrid/images/iris-esmf-regrid.svg b/docs/src/further_topics/ugrid/images/iris-esmf-regrid.svg
index e70a9386a7..93e35cb21d 100644
--- a/docs/src/further_topics/ugrid/images/iris-esmf-regrid.svg
+++ b/docs/src/further_topics/ugrid/images/iris-esmf-regrid.svg
@@ -10,12 +10,12 @@
xmlns:xlink="http://www.w3.org/1999/xlink"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
- width="89.186264mm"
- height="35.112755mm"
- viewBox="0 0 89.186264 35.112755"
+ width="99.186264mm"
+ height="45.112755mm"
+ viewBox="0 0 99.186261 45.112755"
version="1.1"
id="svg1444"
- inkscape:version="0.92.2 (5c3e80d, 2017-08-06)"
+ inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="iris-esmf-regrid.svg">
@@ -27,20 +27,20 @@
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.979899"
- inkscape:cx="167.70776"
- inkscape:cy="25.61879"
+ inkscape:cx="125.49117"
+ inkscape:cy="44.51643"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
- fit-margin-top="0"
- fit-margin-left="0"
- fit-margin-right="0"
- fit-margin-bottom="0"
- inkscape:window-width="1920"
- inkscape:window-height="983"
- inkscape:window-x="0"
- inkscape:window-y="27"
- inkscape:window-maximized="1" />
+ fit-margin-top="5"
+ fit-margin-left="5"
+ fit-margin-right="5"
+ fit-margin-bottom="5"
+ inkscape:window-width="1718"
+ inkscape:window-height="1368"
+ inkscape:window-x="3633"
+ inkscape:window-y="0"
+ inkscape:window-maximized="0" />
@@ -57,7 +57,7 @@
inkscape:label="Layer 1"
inkscape:groupmode="layer"
id="layer1"
- transform="translate(-14.306365,-64.275871)">
+ transform="translate(-9.306365,-59.275871)">
[SVG markup elided: remainder of the Inkscape re-save. Recoverable figure text: "edge_node_connectivity", 12 × 2.]
diff --git a/docs/src/further_topics/ugrid/images/ugrid_element_centres.svg b/docs/src/further_topics/ugrid/images/ugrid_element_centres.svg
index 13b885d600..94ab6ec585 100644
--- a/docs/src/further_topics/ugrid/images/ugrid_element_centres.svg
+++ b/docs/src/further_topics/ugrid/images/ugrid_element_centres.svg
@@ -1 +1,1276 @@
[SVG markup elided: the one-line SVG source is rewritten as multi-line Inkscape SVG. Recoverable figure text: "face_node_connectivity" (12 × 4); x/y "node_coordinates" (15, 12); x/y "face_coordinates".]
diff --git a/docs/src/further_topics/ugrid/images/ugrid_node_independence.svg b/docs/src/further_topics/ugrid/images/ugrid_node_independence.svg
index ba72c42ffa..d63000da92 100644
--- a/docs/src/further_topics/ugrid/images/ugrid_node_independence.svg
+++ b/docs/src/further_topics/ugrid/images/ugrid_node_independence.svg
@@ -1 +1,865 @@
[SVG markup elided: the one-line SVG source is rewritten as multi-line Inkscape SVG; no recoverable figure text.]
diff --git a/docs/src/further_topics/ugrid/images/ugrid_variable_faces.svg b/docs/src/further_topics/ugrid/images/ugrid_variable_faces.svg
index 378978abc3..91223e269a 100644
--- a/docs/src/further_topics/ugrid/images/ugrid_variable_faces.svg
+++ b/docs/src/further_topics/ugrid/images/ugrid_variable_faces.svg
@@ -1 +1,943 @@
[SVG markup elided: the one-line SVG source is rewritten as multi-line Inkscape SVG. Recoverable figure text: "face_node_connectivity", 12 × 6.]
diff --git a/docs/src/further_topics/ugrid/operations.rst b/docs/src/further_topics/ugrid/operations.rst
index a4e0e593d7..f0638800fa 100644
--- a/docs/src/further_topics/ugrid/operations.rst
+++ b/docs/src/further_topics/ugrid/operations.rst
@@ -53,7 +53,8 @@ structured formats and non-UGRID mesh formats.
The objects created in this example will be used where possible in the
subsequent example operations on this page.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -145,7 +146,8 @@ Creating a :class:`~iris.cube.Cube` is unchanged; the
:class:`~iris.experimental.ugrid.Mesh` is linked via a
:class:`~iris.experimental.ugrid.MeshCoord` (see :ref:`ugrid MeshCoords`):
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -206,7 +208,8 @@ The Iris saving process automatically detects if the :class:`~iris.cube.Cube`
has an associated :class:`~iris.experimental.ugrid.Mesh` and automatically
saves the file in a UGRID-conformant format:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -281,7 +284,8 @@ The :func:`iris.experimental.ugrid.save_mesh` function allows
:class:`~iris.experimental.ugrid.Mesh`\es to be saved to file without
associated :class:`~iris.cube.Cube`\s:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -352,7 +356,8 @@ loading a file remains **optional**. To load UGRID data from a file into the
Iris mesh data model, use the
:const:`iris.experimental.ugrid.PARSE_UGRID_ON_LOAD` context manager:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -372,7 +377,8 @@ All the existing loading functionality still operates on UGRID-compliant
data - :class:`~iris.Constraint`\s, callbacks, :func:`~iris.load_cube`
etcetera:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -415,7 +421,8 @@ The :func:`iris.experimental.ugrid.load_mesh` and
:class:`~iris.experimental.ugrid.Mesh`\es to be loaded from a file without
creating any associated :class:`~iris.cube.Cube`\s:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -430,20 +437,20 @@ creating any associated :class:`~iris.cube.Cube`\s:
node
node_dimension: 'Mesh2d_node'
node coordinates
- shape(5,)>
- shape(5,)>
+
+
edge
edge_dimension: 'Mesh2d_edge'
- edge_node_connectivity: shape(6, 2)>
+ edge_node_connectivity:
edge coordinates
- shape(6,)>
- shape(6,)>
+
+
face
face_dimension: 'Mesh2d_face'
- face_node_connectivity: shape(2, 4)>
+ face_node_connectivity:
face coordinates
- shape(2,)>
- shape(2,)>
+
+
long_name: 'my_mesh'
var_name: 'my_mesh'
@@ -469,7 +476,8 @@ be added to API in the near future.
This first example uses GeoVista to plot the ``face_cube`` that we created
earlier:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -539,7 +547,8 @@ earlier:
Here's another example using a global cubed-sphere data set:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -566,8 +575,8 @@ Here's another example using a global cubed-sphere data set:
Auxiliary coordinates:
time x -
Cell methods:
- mean time (300 s)
- mean time_counter
+ 0 time: mean (interval: 300 s)
+ 1 time_counter: mean
Attributes:
Conventions UGRID
description Created by xios
@@ -614,7 +623,8 @@ therefore set to return an :class:`~iris.coords.AuxCoord` instead - breaking
the link between :class:`~iris.cube.Cube` and
:class:`~iris.experimental.ugrid.Mesh`:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. doctest:: ugrid_operations
@@ -657,7 +667,8 @@ mesh, we then reconstruct a :class:`~iris.experimental.ugrid.Mesh` from the
..
    Not using doctest here as we want to keep GeoVista as an optional dependency.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -687,7 +698,7 @@ mesh, we then reconstruct a :class:`~iris.experimental.ugrid.Mesh` from the
Auxiliary coordinates:
time x -
Cell methods:
- point time
+ 0 time: point
Attributes:
Conventions UGRID
description Created by xios
@@ -744,7 +755,7 @@ mesh, we then reconstruct a :class:`~iris.experimental.ugrid.Mesh` from the
Auxiliary coordinates:
time x -
Cell methods:
- point time
+ 0 time: point
Attributes:
Conventions UGRID
description Created by xios
@@ -784,7 +795,8 @@ with the
..
    Not using doctest here as we want to keep iris-esmf-regrid as an optional dependency.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -815,8 +827,8 @@ with the
Auxiliary coordinates:
time x -
Cell methods:
- mean time (300 s)
- mean time_counter
+ 0 time: mean (interval: 300 s)
+ 1 time_counter: mean
Attributes:
Conventions UGRID
description Created by xios
@@ -851,8 +863,8 @@ with the
Auxiliary coordinates:
time x - -
Cell methods:
- mean time (300 s)
- mean time_counter
+ 0 time: mean (interval: 300 s)
+ 1 time_counter: mean
Attributes:
Conventions UGRID
description Created by xios
@@ -880,11 +892,12 @@ Since calling a regridder is usually a lot faster than initialising, reusing
regridders can save a lot of time. We can demonstrate the reuse of the
previously initialised regridder:
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
- # Extract a different cube defined on te same Mesh.
+ # Extract a different cube defined on the same Mesh.
>>> mesh_cube2 = mesh_cubes.extract_cube("precipitation_flux")
>>> print(mesh_cube2)
precipitation_flux / (kg m-2 s-1) (-- : 1; -- : 13824)
@@ -894,8 +907,8 @@ previously initialised regridder:
Auxiliary coordinates:
time x -
Cell methods:
- mean time (300 s)
- mean time_counter
+ 0 time: mean (interval: 300 s)
+ 1 time_counter: mean
Attributes:
Conventions UGRID
description Created by xios
@@ -917,8 +930,8 @@ previously initialised regridder:
Auxiliary coordinates:
time x - -
Cell methods:
- mean time (300 s)
- mean time_counter
+ 0 time: mean (interval: 300 s)
+ 1 time_counter: mean
Attributes:
Conventions UGRID
description Created by xios
diff --git a/docs/src/further_topics/ugrid/other_meshes.rst b/docs/src/further_topics/ugrid/other_meshes.rst
index 38abeeca03..2fcbeda0d0 100644
--- a/docs/src/further_topics/ugrid/other_meshes.rst
+++ b/docs/src/further_topics/ugrid/other_meshes.rst
@@ -27,7 +27,8 @@ To represent the Voronoi Polygons as faces, the corner coordinates will be used
as the **nodes** when creating the Iris
:class:`~iris.experimental.ugrid.mesh.Mesh`.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -44,8 +45,8 @@ as the **nodes** when creating the Iris
latitude - x
longitude - x
Cell methods:
- mean where sea area
- mean time
+ 0 area: mean where sea
+ 1 time: mean
Attributes:
grid 'FESOM 1.4 (unstructured grid in the horizontal with 126859 wet nodes;...
...
@@ -77,8 +78,8 @@ as the **nodes** when creating the Iris
latitude - x
longitude - x
Cell methods:
- mean where sea area
- mean time
+ 0 area: mean where sea
+ 1 time: mean
Attributes:
grid 'FESOM 1.4 (unstructured grid in the horizontal with 126859 wet nodes;...
...
@@ -115,7 +116,8 @@ as the **nodes** when creating the Iris
:class:`~iris.experimental.ugrid.mesh.Mesh`.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -229,6 +231,7 @@ as the **nodes** when creating the Iris
.. figure:: images/orca_grid.png
:width: 300
:alt: Plot of ORCA-gridded data from NEMO.
+ :class: dark-light
NEMO can use various grids, but is frequently used with ORCA type grids.
ORCA grids store global data in 2-dimensional ny * nx arrays. All cells are
@@ -253,7 +256,8 @@ To make an unstructured cube, the data must be 'flattened' to convert the given
dimensions into a single mesh dimension. Since Iris cubes don't support "reshape" or
"flatten" operations, we create a new cube from the flattened data.
-.. dropdown:: :opticon:`code`
+.. dropdown:: Code
+ :icon: code
.. code-block:: python
@@ -275,7 +279,7 @@ dimensions into a single mesh dimension. Since Iris cubes don't support a "resh
depth 4.999938 m, bound=(0.0, 10.0) m
time 0001-01-01 12:00:00
Cell methods:
- mean time
+ 0 time: mean
Attributes:
Conventions 'CF-1.5'
@@ -350,7 +354,7 @@ dimensions into a single mesh dimension. Since Iris cubes don't support a "resh
name unknown
location face
Cell methods:
- mean time
+ 0 time: mean
Attributes:
Conventions 'CF-1.5'
diff --git a/docs/src/index.rst b/docs/src/index.rst
index 531c0e0b26..21971c2322 100644
--- a/docs/src/index.rst
+++ b/docs/src/index.rst
@@ -15,73 +15,114 @@ representations become unwieldy and inefficient.
For more information see :ref:`why_iris`.
-.. panels::
- :container: container-lg pb-3
- :column: col-lg-4 col-md-4 col-sm-6 col-xs-12 p-2 text-center
- :img-top-cls: w-50 m-auto px-1 py-2
-
- ---
- :img-top: _static/icon_shuttle.svg
-
- Information on Iris, how to install and a gallery of examples that
- create plots.
- +++
- .. link-button:: getting_started
- :type: ref
- :text: Getting Started
- :classes: btn-outline-info btn-block
-
-
- ---
- :img-top: _static/icon_instructions.svg
-
- Learn how to use Iris, including loading, navigating, saving,
- plotting and more.
- +++
- .. link-button:: user_guide_index
- :type: ref
- :text: User Guide
- :classes: btn-outline-info btn-block
-
- ---
- :img-top: _static/icon_development.svg
-
- As a developer you can contribute to Iris.
- +++
- .. link-button:: development_where_to_start
- :type: ref
- :text: Developers Guide
- :classes: btn-outline-info btn-block
-
- ---
- :img-top: _static/icon_api.svg
-
- Browse full Iris functionality by module.
- +++
- .. link-button:: Iris
- :type: ref
- :text: Iris API
- :classes: btn-outline-info btn-block
-
- ---
- :img-top: _static/icon_new_product.svg
-
- Find out what has recently changed in Iris.
- +++
- .. link-button:: iris_whatsnew
- :type: ref
- :text: What's New
- :classes: btn-outline-info btn-block
-
- ---
- :img-top: _static/icon_thumb.png
-
- Raise the profile of issues by voting on them.
- +++
- .. link-button:: voted_issues_top
- :type: ref
- :text: Voted Issues
- :classes: btn-outline-info btn-block
+.. grid:: 3
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_shuttle.svg
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Information on Iris, how to install and a gallery of examples that
+ create plots.
+
+ +++
+ .. button-ref:: getting_started_index
+ :ref-type: ref
+ :color: primary
+ :outline:
+ :expand:
+
+ Getting Started
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_instructions.svg
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Learn how to use Iris, including loading, navigating, saving,
+ plotting and more.
+
+ +++
+ .. button-ref:: user_guide_index
+ :ref-type: ref
+ :color: primary
+ :outline:
+ :expand:
+
+ User Guide
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_development.svg
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Information on how you can contribute to Iris as a developer.
+
+ +++
+ .. button-ref:: development_where_to_start
+ :ref-type: ref
+ :color: primary
+ :outline:
+ :expand:
+
+ Developers Guide
+
+
+.. grid:: 3
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_api.svg
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Browse full Iris functionality by module.
+
+ +++
+ .. button-ref:: generated/api/iris
+ :ref-type: doc
+ :color: primary
+ :outline:
+ :expand:
+
+ Iris API
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_new_product.svg
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Find out what has recently changed in Iris.
+
+ +++
+ .. button-ref:: iris_whatsnew
+ :ref-type: ref
+ :color: primary
+ :outline:
+ :expand:
+
+ What's New
+
+ .. grid-item-card::
+ :text-align: center
+ :img-top: _static/icon_thumb.png
+ :class-img-top: w-50 m-auto px-1 py-2 dark-light
+ :shadow: lg
+
+ Raise the profile of issues by voting on them.
+
+ +++
+ .. button-ref:: voted_issues_top
+ :ref-type: ref
+ :color: primary
+ :outline:
+ :expand:
+
+ Voted Issues
Icons made by `FreePik `_ from
@@ -150,7 +191,7 @@ The legacy support resources:
:maxdepth: 1
:hidden:
- generated/api/iris
+ Iris API
.. toctree::
diff --git a/docs/src/installing.rst b/docs/src/installing.rst
index b2481973c0..cff9a27952 100644
--- a/docs/src/installing.rst
+++ b/docs/src/installing.rst
@@ -82,7 +82,7 @@ Once conda is installed, you can install Iris using conda and then activate
it. The example commands below assume you are in the root directory of your
local copy of Iris::
- conda env create --force --file=requirements/ci/iris.yml
+ conda env create --force --file=requirements/iris.yml
conda activate iris-dev
.. note::
@@ -92,7 +92,7 @@ local copy of Iris::
particularly useful when rebuilding your environment due to a change in
requirements.
-The ``requirements/ci/iris.yml`` file defines the Iris development conda
+The ``requirements/iris.yml`` file defines the Iris development conda
environment *name* and all the relevant *top level* `conda-forge` package
dependencies that you need to **code**, **test**, and **build** the
documentation. If you wish to minimise the environment footprint, simply
@@ -100,17 +100,17 @@ remove any unwanted packages from the requirements file e.g., if you don't
intend to run the Iris tests locally or build the documentation, then remove
all the packages from the `testing` and `documentation` sections.
-.. note:: The ``requirements/ci/iris.yml`` file will always use the latest
+.. note:: The ``requirements/iris.yml`` file will always use the latest
Iris tested Python version available. For all Python versions that
are supported and tested against by Iris, view the contents of
- the `requirements/ci`_ directory.
+ the `requirements`_ directory.
-.. _requirements/ci: https://github.com/scitools/iris/tree/main/requirements/ci
+.. _requirements: https://github.com/scitools/iris/tree/main/requirements
-Finally you need to run the command to configure your shell environment
-to find your local Iris code::
+Finally you need to run the command to configure your environment
+to find your local Iris code. From your Iris directory run::
- python setup.py develop
+ pip install --no-deps --editable .
Running the Tests
diff --git a/docs/src/sphinxext/api_rst_formatting.py b/docs/src/sphinxext/api_rst_formatting.py
new file mode 100644
index 0000000000..8f1aa3c5f3
--- /dev/null
+++ b/docs/src/sphinxext/api_rst_formatting.py
@@ -0,0 +1,37 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+
+# This script will process all .rst files that have been created by
+# sphinxcontrib.apidoc extension and perform minor changes, specifically:
+#
+# - Remove the suffix for "package" and " module".
+#
+
+import ntpath
+from pathlib import Path
+
+
+def main_api_rst_formatting(app):
+    src_dir = Path("generated/api")
+
+    print(f"[{ntpath.basename(__file__)}] Processing RST files", end="")
+
+    for file in src_dir.iterdir():
+        if file.suffix == ".rst":
+            print(".", end="")
+
+            with open(file, "r") as f:
+                lines = f.read()
+
+            lines = lines.replace(" package\n=", "\n")
+            lines = lines.replace(" module\n=", "\n")
+
+            with open(file, "w") as f:
+                f.write(lines)
+    print("")
+
+
+def setup(app):
+    app.connect("builder-inited", main_api_rst_formatting)
diff --git a/docs/src/sphinxext/custom_class_autodoc.py b/docs/src/sphinxext/custom_class_autodoc.py
deleted file mode 100644
index cbde413f2d..0000000000
--- a/docs/src/sphinxext/custom_class_autodoc.py
+++ /dev/null
@@ -1,86 +0,0 @@
-# Copyright Iris contributors
-#
-# This file is part of Iris and is released under the LGPL license.
-# See COPYING and COPYING.LESSER in the root of the repository for full
-# licensing details.
-
-from sphinx.ext import autodoc
-from sphinx.ext.autodoc import *
-from sphinx.util import force_decode
-from sphinx.util.docstrings import prepare_docstring
-import inspect
-
-# stop warnings cluttering the make output
-import warnings
-warnings.filterwarnings("ignore")
-
-
-class ClassWithConstructorDocumenter(autodoc.ClassDocumenter):
- priority = 1000000
-
- def get_object_members(self, want_all):
- return autodoc.ClassDocumenter.get_object_members(self, want_all)
-
- @staticmethod
- def can_document_member(member, mname, isattr, self):
- return autodoc.ClassDocumenter.can_document_member(member, mname,
- isattr, self)
-
- def get_doc(self, encoding=None):
- content = self.env.config.autoclass_content
-
- docstrings = []
- docstring = self.get_attr(self.object, '__doc__', None)
- if docstring:
- docstrings.append(docstring)
-
- # for classes, what the "docstring" is can be controlled via a
- # config value; the default is only the class docstring
- if content in ('both', 'init'):
- constructor = self.get_constructor()
- if constructor:
- initdocstring = self.get_attr(constructor, '__doc__', None)
- else:
- initdocstring = None
- if initdocstring:
- if content == 'init':
- docstrings = [initdocstring]
- else:
- docstrings.append(initdocstring)
-
- return [prepare_docstring(force_decode(docstring, encoding))
- for docstring in docstrings]
-
- def get_constructor(self):
- # for classes, the relevant signature is the __init__ method's
- initmeth = self.get_attr(self.object, '__new__', None)
-
- if initmeth is None or initmeth is object.__new__ or not \
- (inspect.ismethod(initmeth) or inspect.isfunction(initmeth)):
- initmeth = None
-
- if initmeth is None:
- initmeth = self.get_attr(self.object, '__init__', None)
-
- if initmeth is None or initmeth is object.__init__ or \
- initmeth is object.__new__ or not \
- (inspect.ismethod(initmeth) or inspect.isfunction(initmeth)):
- initmeth = None
-
- return initmeth
-
- def format_args(self):
- initmeth = self.get_constructor()
- try:
- argspec = inspect.getargspec(initmeth)
- except TypeError:
- # still not possible: happens e.g. for old-style classes
- # with __init__ in C
- return None
- if argspec[0] and argspec[0][0] in ('cls', 'self'):
- del argspec[0][0]
- return inspect.formatargspec(*argspec)
-
-
-def setup(app):
- app.add_autodocumenter(ClassWithConstructorDocumenter, override=True)
diff --git a/docs/src/sphinxext/custom_data_autodoc.py b/docs/src/sphinxext/custom_data_autodoc.py
deleted file mode 100644
index eecd395101..0000000000
--- a/docs/src/sphinxext/custom_data_autodoc.py
+++ /dev/null
@@ -1,48 +0,0 @@
-# Copyright Iris contributors
-#
-# This file is part of Iris and is released under the LGPL license.
-# See COPYING and COPYING.LESSER in the root of the repository for full
-# licensing details.
-
-from sphinx.ext.autodoc import DataDocumenter, ModuleLevelDocumenter
-try:
- # Use 'object_description' in place of the former 'safe_repr' function.
- from sphinx.util.inspect import object_description as safe_repr
-except ImportError:
- # 'safe_repr' is the old usage, for Sphinx<1.3.
- from sphinx.util.inspect import safe_repr
-
-from iris.analysis import Aggregator
-
-
-class IrisDataDocumenter(DataDocumenter):
- priority = 100
-
- def add_directive_header(self, sig):
- ModuleLevelDocumenter.add_directive_header(self, sig)
- if not self.options.annotation:
- try:
- objrepr = safe_repr(self.object)
- except ValueError:
- pass
- else:
- self.add_line(u' :annotation:', '')
- elif self.options.annotation is object():
- pass
- else:
- self.add_line(
- u' :annotation: {}'.format(self.options.annotation),
- '')
-
-
-def handler(app, what, name, obj, options, signature, return_annotation):
- if what == 'data':
- if isinstance(obj, object) and issubclass(obj.__class__, Aggregator):
- signature = '()'
- return_annotation = '{} instance.'.format(obj.__class__.__name__)
- return signature, return_annotation
-
-
-def setup(app):
- app.add_autodocumenter(IrisDataDocumenter, override=True)
- app.connect('autodoc-process-signature', handler)
diff --git a/docs/src/sphinxext/generate_package_rst.py b/docs/src/sphinxext/generate_package_rst.py
deleted file mode 100644
index 8f4119944f..0000000000
--- a/docs/src/sphinxext/generate_package_rst.py
+++ /dev/null
@@ -1,373 +0,0 @@
-# Copyright Iris contributors
-#
-# This file is part of Iris and is released under the LGPL license.
-# See COPYING and COPYING.LESSER in the root of the repository for full
-# licensing details.
-
-import os
-import sys
-import re
-import inspect
-import ntpath
-
-# list of tuples for modules to exclude. Useful if the documentation throws
-# warnings, especially for experimental modules.
-exclude_modules = [
- ("experimental/raster", "iris.experimental.raster"), # gdal conflicts
-]
-
-
-# print to stdout, including the name of the python file
-def autolog(message):
- print("[{}] {}".format(ntpath.basename(__file__), message))
-
-
-document_dict = {
- # Use autoclass for classes.
- "class": """
-{object_docstring}
-
-..
-
- .. autoclass:: {object_name}
- :members:
- :undoc-members:
- :inherited-members:
-
-""",
- "function": """
-.. autofunction:: {object_name}
-
-""",
- # For everything else, let automodule do some magic...
- None: """
-
-.. autodata:: {object_name}
-
-""",
-}
-
-
-horizontal_sep = """
-.. raw:: html
-
-
-
-
-"""
-
-
-def lookup_object_type(obj):
- if inspect.isclass(obj):
- return "class"
- elif inspect.isfunction(obj):
- return "function"
- else:
- return None
-
-
-def auto_doc_module(
- file_path, import_name, root_package, package_toc=None, title=None
-):
- doc = r""".. _{import_name}:
-
-{title_underline}
-{title}
-{title_underline}
-
-{sidebar}
-
-.. currentmodule:: {root_package}
-
-.. automodule:: {import_name}
-
-In this module:
-
-{module_elements}
-
-"""
-
- if package_toc:
- sidebar = """
-{package_toc_tree}
-
- """.format(
- package_toc_tree=package_toc
- )
- else:
- sidebar = ""
-
- try:
- mod = __import__(import_name)
- except ImportError as e:
- message = r""".. error::
-
- This module could not be imported. Some dependencies are missing::
-
- """ + str(
- e
- )
- return doc.format(
- title=title or import_name,
- title_underline="=" * len(title or import_name),
- import_name=import_name,
- root_package=root_package,
- sidebar=sidebar,
- module_elements=message,
- )
-
- mod = sys.modules[import_name]
- elems = dir(mod)
-
- if "__all__" in elems:
- document_these = [
- (attr_name, getattr(mod, attr_name)) for attr_name in mod.__all__
- ]
- else:
- document_these = [
- (attr_name, getattr(mod, attr_name))
- for attr_name in elems
- if (
- not attr_name.startswith("_")
- and not inspect.ismodule(getattr(mod, attr_name))
- )
- ]
-
- def is_from_this_module(arg):
- # name = arg[0]
- obj = arg[1]
- return (
- hasattr(obj, "__module__") and obj.__module__ == mod.__name__
- )
-
- sort_order = {"class": 2, "function": 1}
-
- # Sort them according to sort_order dict.
- def sort_key(arg):
- # name = arg[0]
- obj = arg[1]
- return sort_order.get(lookup_object_type(obj), 0)
-
- document_these = filter(is_from_this_module, document_these)
- document_these = sorted(document_these, key=sort_key)
-
- lines = []
- for element, obj in document_these:
- object_name = import_name + "." + element
- obj_content = document_dict[lookup_object_type(obj)].format(
- object_name=object_name,
- object_name_header_line="+" * len(object_name),
- object_docstring=inspect.getdoc(obj),
- )
- lines.append(obj_content)
-
- lines = horizontal_sep.join(lines)
-
- module_elements = "\n".join(
- " * :py:obj:`{}`".format(element) for element, obj in document_these
- )
-
- lines = doc + lines
- return lines.format(
- title=title or import_name,
- title_underline="=" * len(title or import_name),
- import_name=import_name,
- root_package=root_package,
- sidebar=sidebar,
- module_elements=module_elements,
- )
-
-
-def auto_doc_package(file_path, import_name, root_package, sub_packages):
- max_depth = 1 if import_name == "iris" else 2
- package_toc = "\n ".join(sub_packages)
-
- package_toc = """
- .. toctree::
- :maxdepth: {:d}
- :titlesonly:
- :hidden:
-
- {}
-
-
-""".format(
- max_depth, package_toc
- )
-
- if "." in import_name:
- title = None
- else:
- title = import_name.capitalize() + " API"
-
- return auto_doc_module(
- file_path,
- import_name,
- root_package,
- package_toc=package_toc,
- title=title,
- )
-
-
-def auto_package_build(app):
- root_package = app.config.autopackage_name
- if root_package is None:
- raise ValueError(
- "set the autopackage_name variable in the " "conf.py file"
- )
-
- if not isinstance(root_package, list):
- raise ValueError(
- "autopackage was expecting a list of packages to "
- 'document e.g. ["itertools"]'
- )
-
- for package in root_package:
- do_package(package)
-
-
-def do_package(package_name):
- out_dir = "generated/api" + os.path.sep
-
- # Import the root package. If this fails then an import error will be
- # raised.
- module = __import__(package_name)
- root_package = package_name
- rootdir = os.path.dirname(module.__file__)
-
- package_folder = []
- module_folders = {}
-
- for root, subFolders, files in os.walk(rootdir):
- for fname in files:
- name, ext = os.path.splitext(fname)
-
- # Skip some non-relevant files.
- if (
- fname.startswith(".")
- or fname.startswith("#")
- or re.search("^_[^_]", fname)
- or fname.find(".svn") >= 0
- or not (ext in [".py", ".so"])
- ):
- continue
-
- # Handle new shared library naming conventions
- if ext == ".so":
- name = name.split(".", 1)[0]
-
- rel_path = (
- root_package + os.path.join(root, fname).split(rootdir)[-1]
- )
- mod_folder = root_package + os.path.join(root).split(rootdir)[
- -1
- ].replace("/", ".")
-
- # Only add this package to folder list if it contains an __init__
- # script.
- if name == "__init__":
- package_folder.append([mod_folder, rel_path])
- else:
- import_name = mod_folder + "." + name
- mf_list = module_folders.setdefault(mod_folder, [])
- mf_list.append((import_name, rel_path))
- if not os.path.exists(out_dir):
- os.makedirs(out_dir)
-
- for package, package_path in package_folder:
- if "._" in package or "test" in package:
- continue
-
- paths = []
- for spackage, spackage_path in package_folder:
- # Ignore this packages, packages that are not children of this
- # one, test packages, private packages, and packages that are
- # subpackages of subpackages (they'll be part of the subpackage).
- if spackage == package:
- continue
- if not spackage.startswith(package):
- continue
- if spackage.count(".") > package.count(".") + 1:
- continue
- if "test" in spackage:
- continue
-
- split_path = spackage.rsplit(".", 2)[-2:]
- if any(part[0] == "_" for part in split_path):
- continue
-
- paths.append(os.path.join(*split_path) + ".rst")
-
- paths.extend(
- os.path.join(
- os.path.basename(os.path.dirname(path)),
- os.path.basename(path).split(".", 1)[0],
- )
- for imp_name, path in module_folders.get(package, [])
- )
-
- paths.sort()
- excluded_paths = [item[0] for item in exclude_modules]
-
- # check for any modules to exclude
- for excluded_path in excluded_paths:
- if excluded_path in paths:
- autolog(f"Excluding module in package: {excluded_path!r}")
- paths.remove(excluded_path)
-
- doc = auto_doc_package(package_path, package, root_package, paths)
-
- package_dir = out_dir + package.replace(".", os.path.sep)
- if not os.path.exists(package_dir):
- os.makedirs(out_dir + package.replace(".", os.path.sep))
-
- out_path = package_dir + ".rst"
- if not os.path.exists(out_path):
- autolog("Creating {} ...".format(out_path))
- with open(out_path, "w") as fh:
- fh.write(doc)
- else:
- with open(out_path, "r") as fh:
- existing_content = "".join(fh.readlines())
- if doc != existing_content:
- autolog("Creating {} ...".format(out_path))
- with open(out_path, "w") as fh:
- fh.write(doc)
-
- excluded_imports = [item[1] for item in exclude_modules]
-
- for import_name, module_path in module_folders.get(package, []):
- # check for any modules to exclude
- if import_name in excluded_imports:
- autolog(f"Excluding module file: {import_name!r}")
- continue
- doc = auto_doc_module(
- module_path, import_name, root_package
- )
- out_path = (
- out_dir
- + import_name.replace(".", os.path.sep)
- + ".rst"
- )
- if not os.path.exists(out_path):
- autolog("Creating {} ...".format(out_path))
- with open(out_path, "w") as fh:
- fh.write(doc)
- else:
- with open(out_path, "r") as fh:
- existing_content = "".join(fh.readlines())
- if doc != existing_content:
- autolog("Creating {} ...".format(out_path))
- with open(out_path, "w") as fh:
- fh.write(doc)
-
-
-def setup(app):
- app.connect("builder-inited", auto_package_build)
- app.add_config_value("autopackage_name", None, "env")
diff --git a/docs/src/userguide/citation.rst b/docs/src/userguide/citation.rst
index 1498b9dfe1..62af43c94f 100644
--- a/docs/src/userguide/citation.rst
+++ b/docs/src/userguide/citation.rst
@@ -15,12 +15,12 @@ For example::
@manual{Iris,
author = {{Met Office}},
- title = {Iris: A powerful, format-agnostic, and community-driven Python package for analysing and visualising Earth science data },
- edition = {v3.4},
- year = {2010 - 2022},
- address = {Exeter, Devon },
+ title = {Iris: A powerful, format-agnostic, and community-driven Python package for analysing and visualising Earth science data},
+ edition = {v3.6},
+ year = {2010 - 2023},
+ address = {Exeter, Devon},
url = {http://scitools.org.uk/},
- doi = {10.5281/zenodo.7386117}
+ doi = {10.5281/zenodo.7948293}
}
@@ -34,7 +34,7 @@ Suggested format::
For example::
- Iris. v3.4. 1-Dec-2022. Met Office. UK. https://doi.org/10.5281/zenodo.7386117 22-12-2022
+ Iris. v3.5. 27-Apr-2023. Met Office. UK. https://doi.org/10.5281/zenodo.7871017 22-12-2022
********************
diff --git a/docs/src/userguide/concat.png b/docs/src/userguide/concat.png
deleted file mode 100644
index eb3d84046e..0000000000
Binary files a/docs/src/userguide/concat.png and /dev/null differ
diff --git a/docs/src/userguide/concat.svg b/docs/src/userguide/concat.svg
index 0234b37bfa..f32fc0030b 100644
--- a/docs/src/userguide/concat.svg
+++ b/docs/src/userguide/concat.svg
@@ -9,11 +9,11 @@
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
- width="750"
- height="250"
+ width="772.70679"
+ height="285.20804"
id="svg2834"
version="1.1"
- inkscape:version="0.47 r22583"
+ inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="concat.svg"
inkscape:export-xdpi="90"
inkscape:export-ydpi="90">
@@ -98,7 +98,8 @@
id="path3666"
style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
- transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)"
+ inkscape:connector-curvature="0" />
[SVG markup elided: remainder of the Inkscape 0.92.4 re-save of concat.svg. It adds "inkscape:connector-curvature" attributes and arrow-marker path data, sets page margins and grid origin, adjusts the layer transform, and restyles the figure text (axis labels "x", "y", "t"; tick labels "0" to "3") with explicit font-size/line-height.]
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 262.74034,41.553863 H 380.93512"
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0" />
+ sodipodi:nodetypes="cc"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 84.558085,118.283 c 0.890086,0.0975 178.182255,0.0975 178.182255,0.0975 v 94.55582 l -176.564075,0.81932 v 0"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 144.54556,118.38048 V 212.9363"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 203.76402,118.82239 -0.12106,94.11391"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 85.636872,117.7436 -0.188701,95.1927"
+ inkscape:connector-curvature="0" />
diff --git a/docs/src/userguide/cube_diagram.dia b/docs/src/userguide/cube_diagram.dia
deleted file mode 100644
index 8edc611782..0000000000
Binary files a/docs/src/userguide/cube_diagram.dia and /dev/null differ
diff --git a/docs/src/userguide/cube_diagram.png b/docs/src/userguide/cube_diagram.png
deleted file mode 100644
index 80f5328c3b..0000000000
Binary files a/docs/src/userguide/cube_diagram.png and /dev/null differ
diff --git a/docs/src/userguide/cube_statistics.rst b/docs/src/userguide/cube_statistics.rst
index 08297c2a51..fb389a5229 100644
--- a/docs/src/userguide/cube_statistics.rst
+++ b/docs/src/userguide/cube_statistics.rst
@@ -14,7 +14,7 @@ Cube Statistics
Collapsing Entire Data Dimensions
---------------------------------
-.. testsetup::
+.. testsetup:: collapsing
import iris
filename = iris.sample_data_path('uk_hires.pp')
@@ -86,7 +86,7 @@ we can pass the coordinate name and the aggregation definition to the
model_level_number 10, bound=(1, 19)
sigma 0.92292976, bound=(0.8458596, 1.0)
Cell methods:
- mean model_level_number
+ 0 model_level_number: mean
Attributes:
STASH m01s00i004
source 'Data from Met Office Unified Model'
@@ -125,7 +125,7 @@ in order to calculate the area of the grid boxes::
These areas can now be passed to the ``collapsed`` method as weights:
-.. doctest::
+.. doctest:: collapsing
>>> new_cube = cube.collapsed(['grid_longitude', 'grid_latitude'], iris.analysis.MEAN, weights=grid_areas)
>>> print(new_cube)
@@ -141,11 +141,11 @@ These areas can now be passed to the ``collapsed`` method as weights:
altitude - x
Scalar coordinates:
forecast_reference_time 2009-11-19 04:00:00
- grid_latitude 1.5145501 degrees, bound=(0.14430022, 2.8848) degrees
- grid_longitude 358.74948 degrees, bound=(357.494, 360.00497) degrees
+ grid_latitude 1.5145501 degrees, bound=(0.13755022, 2.89155) degrees
+ grid_longitude 358.74948 degrees, bound=(357.48724, 360.01172) degrees
surface_altitude 399.625 m, bound=(-14.0, 813.25) m
Cell methods:
- mean grid_longitude, grid_latitude
+ 0 grid_longitude: grid_latitude: mean
Attributes:
STASH m01s00i004
source 'Data from Met Office Unified Model'
@@ -155,6 +155,50 @@ Several examples of area averaging exist in the gallery which may be of interest
including an example on taking a :ref:`global area-weighted mean
`.
+In addition to plain arrays, weights can also be given as cubes or (names of)
+:meth:`~iris.cube.Cube.coords`, :meth:`~iris.cube.Cube.cell_measures`, or
+:meth:`~iris.cube.Cube.ancillary_variables`.
+This has the advantage of correct unit handling, e.g., for area-weighted sums
+the units of the resulting cube are multiplied by an area unit:
+
+.. doctest:: collapsing
+
+ >>> from iris.coords import CellMeasure
+ >>> cell_areas = CellMeasure(
+ ... grid_areas,
+ ... standard_name='cell_area',
+ ... units='m2',
+ ... measure='area',
+ ... )
+ >>> cube.add_cell_measure(cell_areas, (0, 1, 2, 3))
+ >>> area_weighted_sum = cube.collapsed(
+ ... ['grid_longitude', 'grid_latitude'],
+ ... iris.analysis.SUM,
+ ... weights='cell_area'
+ ... )
+ >>> print(area_weighted_sum)
+ air_potential_temperature / (m2.K) (time: 3; model_level_number: 7)
+ Dimension coordinates:
+ time x -
+ model_level_number - x
+ Auxiliary coordinates:
+ forecast_period x -
+ level_height - x
+ sigma - x
+ Derived coordinates:
+ altitude - x
+ Scalar coordinates:
+ forecast_reference_time 2009-11-19 04:00:00
+ grid_latitude 1.5145501 degrees, bound=(0.13755022, 2.89155) degrees
+ grid_longitude 358.74948 degrees, bound=(357.48724, 360.01172) degrees
+ surface_altitude 399.625 m, bound=(-14.0, 813.25) m
+ Cell methods:
+ 0 grid_longitude: grid_latitude: sum
+ Attributes:
+ STASH m01s00i004
+ source 'Data from Met Office Unified Model'
+ um_version '7.3'
+
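+A coordinate name works in the same way. As a hypothetical sketch (not a
+rendered doctest), collapsing with the ``level_height`` coordinate as weights
+would multiply the cube units by metres::
+
+    # 'level_height' is an existing auxiliary coordinate of this cube, so its
+    # points act as the weights and its units ('m') scale the result units.
+    height_weighted_sum = cube.collapsed(
+        'model_level_number',
+        iris.analysis.SUM,
+        weights='level_height',
+    )
+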
.. _cube-statistics-aggregated-by:
Partially Reducing Data Dimensions
@@ -232,7 +276,7 @@ Printing this cube now shows that two extra coordinates exist on the cube:
Scalar coordinates:
forecast_period 0 hours
Cell methods:
- mean month, year
+ 0 month: year: mean
Attributes:
Conventions 'CF-1.5'
STASH m01s00i024
@@ -338,3 +382,44 @@ from jja-2006 to jja-2010:
mam 2010
jja 2010
+Moreover, :meth:`Cube.aggregated_by ` supports
+weighted aggregation.
+For example, this is helpful for an aggregation over a monthly time
+coordinate that consists of months with different numbers of days.
+Similar to :meth:`Cube.collapsed `, weights can be
+given as arrays, cubes, or as (names of) :meth:`~iris.cube.Cube.coords`,
+:meth:`~iris.cube.Cube.cell_measures`, or
+:meth:`~iris.cube.Cube.ancillary_variables`.
+When weights are not given as arrays, units are correctly handled for weighted
+sums, i.e., the original unit of the cube is multiplied by the units of the
+weights.
+The following example shows a weighted sum (notice the change of the units):
+
+.. doctest:: aggregation
+
+ >>> from iris.coords import AncillaryVariable
+ >>> time_weights = AncillaryVariable(
+ ... cube.coord("time").bounds[:, 1] - cube.coord("time").bounds[:, 0],
+ ... long_name="Time Weights",
+ ... units="hours",
+ ... )
+ >>> cube.add_ancillary_variable(time_weights, 0)
+ >>> seasonal_sum = cube.aggregated_by("clim_season", iris.analysis.SUM, weights="Time Weights")
+ >>> print(seasonal_sum)
+ surface_temperature / (3600 s.K) (-- : 4; latitude: 18; longitude: 432)
+ Dimension coordinates:
+ latitude - x -
+ longitude - - x
+ Auxiliary coordinates:
+ clim_season x - -
+ forecast_reference_time x - -
+ season_year x - -
+ time x - -
+ Scalar coordinates:
+ forecast_period 0 hours
+ Cell methods:
+ 0 month: year: mean
+ 1 clim_season: sum
+ Attributes:
+ Conventions 'CF-1.5'
+ STASH m01s00i024
diff --git a/docs/src/userguide/glossary.rst b/docs/src/userguide/glossary.rst
index 818ef0c7ad..5c24f03372 100644
--- a/docs/src/userguide/glossary.rst
+++ b/docs/src/userguide/glossary.rst
@@ -1,3 +1,5 @@
+.. include:: ../common_links.inc
+
.. _glossary:
Glossary
@@ -125,7 +127,7 @@ Glossary
of formats.
| **Related:** :term:`CartoPy` **|** :term:`NumPy`
- | **More information:** `Matplotlib `_
+ | **More information:** `matplotlib`_
|
Metadata
@@ -143,9 +145,11 @@ Glossary
When Iris loads this format, it also especially recognises and interprets data
encoded according to the :term:`CF Conventions`.
+ __ `NetCDF4`_
+
| **Related:** :term:`Fields File (FF) Format`
**|** :term:`GRIB Format` **|** :term:`Post Processing (PP) Format`
- | **More information:** `NetCDF-4 Python Git `_
+ | **More information:** `NetCDF-4 Python Git`__
|
NumPy
diff --git a/docs/src/userguide/interpolation_and_regridding.rst b/docs/src/userguide/interpolation_and_regridding.rst
index deae4427ed..cba7d778d5 100644
--- a/docs/src/userguide/interpolation_and_regridding.rst
+++ b/docs/src/userguide/interpolation_and_regridding.rst
@@ -75,8 +75,8 @@ Let's take the air temperature cube we've seen previously:
pressure 1000.0 hPa
time 1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
Cell methods:
- mean within years time
- mean over years time
+ 0 time: mean within years
+ 1 time: mean over years
Attributes:
STASH m01s16i203
source 'Data from Met Office Unified Model'
@@ -94,8 +94,8 @@ We can interpolate specific values from the coordinates of the cube:
pressure 1000.0 hPa
time 1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
Cell methods:
- mean within years time
- mean over years time
+ 0 time: mean within years
+ 1 time: mean over years
Attributes:
STASH m01s16i203
source 'Data from Met Office Unified Model'
diff --git a/docs/src/userguide/iris_cubes.rst b/docs/src/userguide/iris_cubes.rst
index 29d8f3cefc..267f97b0fc 100644
--- a/docs/src/userguide/iris_cubes.rst
+++ b/docs/src/userguide/iris_cubes.rst
@@ -10,9 +10,7 @@ metadata about a phenomenon.
In Iris, a cube is an interpretation of the *Climate and Forecast (CF)
Metadata Conventions* whose purpose is to:
-.. panels::
- :container: container-lg pb-3
- :column: col-lg-12 p-2
+.. card::
*require conforming datasets to contain sufficient metadata that they are
self-describing... including physical units if appropriate, and that each
@@ -104,7 +102,7 @@ Suppose we have some gridded data which has 24 air temperature readings
(in Kelvin) which is located at 4 different longitudes, 2 different latitudes
and 3 different heights. Our data array can be represented pictorially:
-.. image:: multi_array.png
+.. image:: multi_array.svg
Where dimensions 0, 1, and 2 have lengths 3, 2 and 4 respectively.
@@ -134,7 +132,7 @@ The Iris cube to represent this data would consist of:
Pictorially the cube has taken on more information than a simple array:
-.. image:: multi_array_to_cube.png
+.. image:: multi_array_to_cube.svg
Additionally further information may be optionally attached to the cube.
diff --git a/docs/src/userguide/loading_iris_cubes.rst b/docs/src/userguide/loading_iris_cubes.rst
index 33ad932d70..b71f033c30 100644
--- a/docs/src/userguide/loading_iris_cubes.rst
+++ b/docs/src/userguide/loading_iris_cubes.rst
@@ -234,7 +234,7 @@ A single cube is loaded in the following example::
longitude - x
...
Cell methods:
- mean time
+ 0 time: mean
However, when attempting to load data which would result in anything other than
one cube, an exception is raised::
diff --git a/docs/src/userguide/merge.png b/docs/src/userguide/merge.png
deleted file mode 100644
index cafaa370da..0000000000
Binary files a/docs/src/userguide/merge.png and /dev/null differ
diff --git a/docs/src/userguide/merge.svg b/docs/src/userguide/merge.svg
index 9326bc332b..0f0d37a1ca 100644
--- a/docs/src/userguide/merge.svg
+++ b/docs/src/userguide/merge.svg
@@ -9,11 +9,11 @@
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
- width="750"
- height="250"
+ width="734.55884"
+ height="280.94952"
id="svg2834"
version="1.1"
- inkscape:version="0.47 r22583"
+ inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="merge.svg"
inkscape:export-xdpi="90"
inkscape:export-ydpi="90">
@@ -38,7 +38,8 @@
id="path3666"
style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
- transform="matrix(-1.1,0,0,-1.1,-1.1,0)" />
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="scale(-0.8)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,10,0)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ borderlayer="true"
+ fit-margin-top="20"
+ fit-margin-left="20"
+ fit-margin-bottom="20"
+ fit-margin-right="20">
+ snapvisiblegridlinesonly="true"
+ originx="-4.2538044"
+ originy="11.570523" />
@@ -394,7 +407,7 @@
image/svg+xml
-
+
@@ -402,10 +415,10 @@
id="layer1"
inkscape:label="Layer 1"
inkscape:groupmode="layer"
- transform="translate(-22.109375,-210.54913)">
+ transform="translate(-26.363179,-191.17014)">
y
+ style="font-size:17.92656898px;line-height:1.25;font-family:'URW Palladio L';-inkscape-font-specification:'URW Palladio L'">y
@@ -422,19 +435,22 @@
inkscape:path-effect="#path-effect3640"
id="path3638"
d="M 60,420 320,260"
- style="fill:none;stroke:#000000;stroke-width:1.42801106px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)" />
+ style="fill:none;stroke:#000000;stroke-width:1.42801106px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ d="M 60,420 V 50"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ d="M 60,420 H 480"
+ style="fill:none;stroke:#000000;stroke-width:1.16397536px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:1.42801106px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ d="M 60,420 V 50"
+ style="fill:none;stroke:#000000;stroke-width:1px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ d="M 60,420 H 480"
+ style="fill:none;stroke:#000000;stroke-width:1.16397536px;stroke-linecap:butt;stroke-linejoin:miter;stroke-opacity:1;marker-end:url(#Arrow1Lend)"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 67.5722,390.18197 H 61.970147"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 67.5722,345.36554 c -5.602053,0 -5.602053,0 -5.602053,0"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 67.5722,300.54913 c -5.602053,0 -5.602053,0 -5.602053,0"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 445.15053,390.18197 h -5.60205"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 445.15053,345.36554 h -5.60205"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 445.15053,300.54913 h -5.60205"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 88.077823,390.18196 202.02147,322.95733 H 362.5784 l -88.04735,67.22463 z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 203.20315,211.22392 c 0,134.44926 0,134.44926 0,134.44926"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 88.394922,345.36553 202.33857,278.1409 H 362.8955 l -88.04735,67.22463 z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 88.077823,300.54913 202.02147,233.3245 H 362.5784 l -88.04735,67.22463 z"
+ inkscape:connector-curvature="0" />
0
+ style="font-size:17.92656898px;line-height:1.25">0
+ inkscape:original-d="m 67.5722,434.99838 c -5.602053,0 -5.602053,0 -5.602053,0 v 0"
+ inkscape:connector-curvature="0" />
1
+ style="font-size:17.92656898px;line-height:1.25">1
2
+ style="font-size:17.92656898px;line-height:1.25">2
3
+ style="font-size:17.92656898px;line-height:1.25">3
0
+ style="font-size:17.92656898px;line-height:1.25">0
1
+ style="font-size:17.92656898px;line-height:1.25">1
2
+ style="font-size:17.92656898px;line-height:1.25">2
3
+ style="font-size:17.92656898px;line-height:1.25">3
+ inkscape:original-d="m 89.980408,278.14092 c 0,134.44926 0,134.44926 0,134.44926"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 274.84815,278.14092 c 0,134.44926 0,134.44926 0,134.44926"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 445.15053,434.99838 h -5.60205"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 465.65616,390.18197 113.94362,-67.22463 h 160.55695 l -88.04736,67.22463 H 465.65615 Z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="M 465.65616,300.54913 579.59978,233.3245 h 160.55695 l -88.04736,67.22463 H 465.65615 Z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 465.65617,300.54913 v 89.63284 H 652.10938 V 300.54913 H 465.65616 Z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 652.10939,390.18197 88.04735,-67.22463 V 233.3245 l -88.04736,67.22463 v 89.63284 z"
+ inkscape:connector-curvature="0" />
+ inkscape:original-d="m 363.42185,211.17014 c 0,134.44926 0,134.44926 0,134.44926"
+ inkscape:connector-curvature="0" />
x
+ style="font-size:17.92656898px;line-height:1.25;font-family:'URW Palladio L';-inkscape-font-specification:'URW Palladio L'">x
z
+ style="font-size:17.92656898px;line-height:1.25;font-family:'URW Palladio L';-inkscape-font-specification:'URW Palladio L'">z
x
+ style="font-size:17.92656898px;line-height:1.25;font-family:'URW Palladio L';-inkscape-font-specification:'URW Palladio L'">x
z
+ style="font-size:17.92656898px;line-height:1.25;font-family:'URW Palladio L';-inkscape-font-specification:'URW Palladio L'">z
+ inkscape:original-d="m 376.86678,323.26496 v 11.20411 h 22.4082 v 5.60205 l 16.80616,-11.2041 -16.80616,-11.20411 v 5.60205 z"
+ inkscape:connector-curvature="0" />
diff --git a/docs/src/userguide/merge_and_concat.png b/docs/src/userguide/merge_and_concat.png
deleted file mode 100644
index 48238287b4..0000000000
Binary files a/docs/src/userguide/merge_and_concat.png and /dev/null differ
diff --git a/docs/src/userguide/merge_and_concat.rst b/docs/src/userguide/merge_and_concat.rst
index b521d49a59..d754e08cc1 100644
--- a/docs/src/userguide/merge_and_concat.rst
+++ b/docs/src/userguide/merge_and_concat.rst
@@ -16,7 +16,7 @@ issues from occurring.
Both ``merge`` and ``concatenate`` take multiple cubes as input and
result in fewer cubes as output. The following diagram illustrates the two processes:
-.. image:: merge_and_concat.png
+.. image:: merge_and_concat.svg
:alt: Pictographic of merge and concatenation.
:align: center
@@ -128,7 +128,7 @@ make a new ``z`` dimension coordinate:
The following diagram illustrates what has taken place in this example:
-.. image:: merge.png
+.. image:: merge.svg
:alt: Pictographic of merge.
:align: center
@@ -294,7 +294,7 @@ cubes to form a new cube with an extended ``t`` coordinate:
The following diagram illustrates what has taken place in this example:
-.. image:: concat.png
+.. image:: concat.svg
:alt: Pictographic of concatenate.
:align: center
diff --git a/docs/src/userguide/multi_array.png b/docs/src/userguide/multi_array.png
deleted file mode 100644
index 54a2688d2a..0000000000
Binary files a/docs/src/userguide/multi_array.png and /dev/null differ
diff --git a/docs/src/userguide/multi_array.svg b/docs/src/userguide/multi_array.svg
index d28f6d71d6..38ba58744f 100644
--- a/docs/src/userguide/multi_array.svg
+++ b/docs/src/userguide/multi_array.svg
@@ -13,244 +13,250 @@
height="300"
id="svg2"
version="1.1"
- inkscape:version="0.47 r22583"
+ inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="multi_array.svg"
inkscape:export-xdpi="90"
- inkscape:export-ydpi="90">
+ inkscape:export-ydpi="90"
+ viewBox="0 0 470 320">
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,4.8,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,-4.8,0)"
+ inkscape:connector-curvature="0" />
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(1.1,0,0,1.1,1.1,0)"
+ inkscape:connector-curvature="0" />
+ style="overflow:visible">
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ style="overflow:visible">
+ style="font-size:12px;fill-rule:evenodd;stroke-width:0.625;stroke-linejoin:round"
+ d="M 8.7185878,4.0337352 -2.2072895,0.01601326 8.7185884,-4.0017078 c -1.7454984,2.3720609 -1.7354408,5.6174519 -6e-7,8.035443 z"
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)"
+ inkscape:connector-curvature="0" />
+ style="color-interpolation-filters:sRGB">
@@ -322,17 +328,24 @@
+
+ transform="translate(-22.275317,-306.43968)"
+ style="display:inline">
+ style="opacity:1;fill:none;stroke:#000000;stroke-width:1.23286104;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ d="M 227.83333,331.9969 H 460.74875"
+ id="path2967"
+ inkscape:connector-curvature="0" />
2
+ style="font-size:20px;line-height:1.25;font-family:Arial;-inkscape-font-specification:Arial">2
1
+ style="font-size:20px;line-height:1.25;font-family:Arial;-inkscape-font-specification:Arial">1
0
+ style="font-size:20px;line-height:1.25;font-family:Arial;-inkscape-font-specification:Arial">0
+ d="m 35.115947,477.76237 v 104.4593"
+ style="fill:none;stroke:#000000;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-start:url(#EmptyTriangleInL)"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-1"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:1.90165818;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-end:url(#EmptyTriangleOutL)"
+ d="M 34.20344,581.46304 H 134.30678"
+ id="path2967-5"
+ inkscape:connector-curvature="0" />
+ d="m 276.64236,373.53613 v 187.701"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 218.10752,373.53612 V 561.23713"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 339.27837,373.53612 V 561.23713"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 164.28582,434.95011 H 397.20125"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 164.28582,500.94163 H 397.20124"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 429.87688,350.79329 V 538.4943"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ d="M 197.00473,350.14458 H 429.92015"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2.03630161;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
+ inkscape:connector-curvature="0" />
+ id="path2965-9"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-1"
+ inkscape:connector-curvature="0" />
+ style="opacity:1;fill:none;stroke:#000000;stroke-width:1.40908241;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ d="M 460.83518,331.99456 V 519.69557"
+ id="path3940"
+ inkscape:connector-curvature="0" />
diff --git a/docs/src/userguide/multi_array_to_cube.png b/docs/src/userguide/multi_array_to_cube.png
deleted file mode 100644
index 1144ee6715..0000000000
Binary files a/docs/src/userguide/multi_array_to_cube.png and /dev/null differ
diff --git a/docs/src/userguide/multi_array_to_cube.svg b/docs/src/userguide/multi_array_to_cube.svg
index a2fc2f5e26..8b0cc529dd 100644
--- a/docs/src/userguide/multi_array_to_cube.svg
+++ b/docs/src/userguide/multi_array_to_cube.svg
@@ -9,11 +9,11 @@
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
- width="600"
- height="400"
+ width="588.37256"
+ height="379.43076"
id="svg2"
version="1.1"
- inkscape:version="0.47 r22583"
+ inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="multi_array_to_cube.svg"
inkscape:export-xdpi="90"
inkscape:export-ydpi="90">
@@ -28,9 +28,10 @@
style="overflow:visible">
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,4.8,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,-4.8,0)"
+ inkscape:connector-curvature="0" />
+ transform="matrix(1.1,0,0,1.1,1.1,0)"
+ inkscape:connector-curvature="0" />
+ d="M 0,0 5,-5 -12.5,0 5,5 Z"
+ style="fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,-10,0)"
+ inkscape:connector-curvature="0" />
+ transform="matrix(-1.1,0,0,-1.1,-1.1,0)"
+ inkscape:connector-curvature="0" />
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ style="color-interpolation-filters:sRGB">
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,-4.8,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,-4.8,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(0.8,0,0,0.8,-4.8,0)"
+ inkscape:connector-curvature="0" />
+ d="M 5.77,0 -2.88,5 V -5 Z"
+ style="fill:#ffffff;fill-rule:evenodd;stroke:#000000;stroke-width:1.00000003pt;marker-start:none"
+ transform="matrix(-0.8,0,0,-0.8,4.8,0)"
+ inkscape:connector-curvature="0" />
+ inkscape:guide-bbox="true"
+ borderlayer="false"
+ inkscape:pagecheckerboard="true"
+ showborder="true"
+ fit-margin-top="10"
+ fit-margin-left="10"
+ fit-margin-bottom="10"
+ fit-margin-right="10" />
@@ -1012,86 +1028,108 @@
+
+ transform="translate(-24.340681,-213.92624)">
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ d="M 279.33268,257.02949 H 512.2481"
+ id="path2967"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="m 334.56237,297.14191 v 187.701"
+ id="path3940-3"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 276.02753,297.1419 V 484.84291"
+ id="path3940-2"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 397.19838,297.1419 V 484.84291"
+ id="path3940-38"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-8"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-7"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-9"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 222.20583,358.55589 H 455.12126"
+ id="path2967-1"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 222.20583,424.54741 H 455.12125"
+ id="path2967-4"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-0"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-91"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 487.79689,274.39907 V 462.10008"
+ id="path3940-6"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330)"
+ d="M 254.92474,273.75036 H 487.84016"
+ id="path2967-8"
+ inkscape:connector-curvature="0" />
+ id="path2965-9"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3"
+ inkscape:connector-curvature="0" />
+ id="path2965-9-3-1"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ d="M 512.33453,257.02715 V 444.72816"
+ id="path3940"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:4.65429401;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-5-0)"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:4.86589146;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-5-0)"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:4.77046061;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-5-0)"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ d="M 180.66394,434.79892 H 414.8872 m 0.13354,0.19234 23.6025,-16.61686"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ id="path2967-45-1"
+ inkscape:connector-curvature="0" />
+ d="M 205.47014,418.47977 H 438.38556"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:4.46560764;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-2)"
+ inkscape:connector-curvature="0" />
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:4.46560764;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-2)"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ d="M 607.67297,166.52524 V 354.22625"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ d="M 632.44481,148.8327 V 336.23717"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
2
+ style="font-size:20px;line-height:1.25;font-family:Arial;-inkscape-font-specification:Arial">2
+ style="fill:none;stroke:#000000;stroke-width:0.72307718;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-end:url(#EmptyTriangleOutL)"
+ d="m 218.65167,564.61456 h 36.19122"
+ id="path2967-5-8"
+ inkscape:connector-curvature="0" />
-18090
+ style="font-size:16px;line-height:1.25">90
0
+ style="font-size:16px;line-height:1.25">0
90
+ style="font-size:16px;line-height:1.25">90
Longitude (degrees)
+ style="font-size:16px;line-height:1.25">Longitude (degrees)
Height (meters)
+ style="font-size:16px;line-height:1.25">Height (meters)
+ d="m 654.20331,319.24177 h 58.79787"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ d="M 454.91905,335.96264 H 687.83447"
+ style="fill:#b3b3b3;stroke:#999999;stroke-width:8.0860281;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;filter:url(#filter4330-0)"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
+ d="m 597.7085,358.87212 h 58.79787"
+ style="fill:none;stroke:#000000;stroke-width:2;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-mid:none"
+ inkscape:connector-curvature="0" />
Latitude (degrees)
+ style="font-size:16px;line-height:1.25">Latitude (degrees)
@@ -1281,9 +1337,9 @@
id="text4028-7"
y="378.68555"
x="17.929327"
- style="font-size:22.62892342px;font-style:normal;font-weight:normal;fill:#000000;fill-opacity:1;stroke:none;font-family:Bitstream Vera Sans"
+ style="font-style:normal;font-weight:normal;line-height:0%;font-family:'Bitstream Vera Sans';fill:#000000;fill-opacity:1;stroke:none"
xml:space="preserve">
+ style="fill:none;stroke:#000000;stroke-width:0.99685019;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-end:url(#EmptyTriangleOutL)"
+ inkscape:connector-curvature="0" />
-45
+ style="font-size:16px;line-height:1.25">-45
45
+ style="font-size:16px;line-height:1.25">45
+ d="m 588.51962,415.76898 v 48.81029"
+ style="fill:none;stroke:#000000;stroke-width:0.93453234;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1;marker-start:url(#EmptyTriangleInL)"
+ inkscape:connector-curvature="0" />
0Air temperature (kelvin)
+ style="font-size:20px;line-height:1.25">Air temperature (kelvin)
2
+ style="font-size:16px;line-height:1.25">2
10
+ style="font-size:16px;line-height:1.25">10
25
+ style="font-size:16px;line-height:1.25">25
diff --git a/docs/src/userguide/plotting_examples/1d_with_legend.py b/docs/src/userguide/plotting_examples/1d_with_legend.py
index 626335af45..6b29fc9e76 100644
--- a/docs/src/userguide/plotting_examples/1d_with_legend.py
+++ b/docs/src/userguide/plotting_examples/1d_with_legend.py
@@ -31,7 +31,7 @@
plt.grid(True)
# Provide some axis labels
-plt.ylabel("Temerature / kelvin")
+plt.ylabel("Temperature / kelvin")
plt.xlabel("Longitude / degrees")
# And a sensible title
diff --git a/docs/src/userguide/real_and_lazy_data.rst b/docs/src/userguide/real_and_lazy_data.rst
index 9d66a2f086..ef4de0c429 100644
--- a/docs/src/userguide/real_and_lazy_data.rst
+++ b/docs/src/userguide/real_and_lazy_data.rst
@@ -6,6 +6,7 @@
import dask.array as da
import iris
+ from iris.cube import CubeList
import numpy as np
@@ -188,17 +189,17 @@ coordinates' lazy points and bounds:
.. doctest::
- >>> cube = iris.load_cube(iris.sample_data_path('hybrid_height.nc'), 'air_potential_temperature')
+ >>> cube = iris.load_cube(iris.sample_data_path('orca2_votemper.nc'), 'votemper')
- >>> dim_coord = cube.coord('model_level_number')
+ >>> dim_coord = cube.coord('depth')
>>> print(dim_coord.has_lazy_points())
False
>>> print(dim_coord.has_bounds())
- False
+ True
>>> print(dim_coord.has_lazy_bounds())
False
- >>> aux_coord = cube.coord('sigma')
+ >>> aux_coord = cube.coord('longitude')
>>> print(aux_coord.has_lazy_points())
True
>>> print(aux_coord.has_bounds())
@@ -213,7 +214,9 @@ coordinates' lazy points and bounds:
>>> print(aux_coord.has_lazy_bounds())
True
- >>> derived_coord = cube.coord('altitude')
+ # Fetch a derived coordinate from a different file: these can also have lazy data.
+ >>> cube2 = iris.load_cube(iris.sample_data_path('hybrid_height.nc'), 'air_potential_temperature')
+ >>> derived_coord = cube2.coord('altitude')
>>> print(derived_coord.has_lazy_points())
True
>>> print(derived_coord.has_bounds())
@@ -221,17 +224,51 @@ coordinates' lazy points and bounds:
>>> print(derived_coord.has_lazy_bounds())
True
-.. note::
- Printing a lazy :class:`~iris.coords.AuxCoord` will realise its points and bounds arrays!
-
Dask Processing Options
-----------------------
-Iris uses dask to provide lazy data arrays for both Iris cubes and coordinates,
-and for computing deferred operations on lazy arrays.
+Iris uses `Dask `_ to provide lazy data arrays for
+both Iris cubes and coordinates, and for computing deferred operations on lazy arrays.
Dask provides processing options to control how deferred operations on lazy arrays
are computed. This is provided via the ``dask.set_options`` interface. See the
`dask documentation `_
for more information on setting dask processing options.
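+
+As a brief sketch (assuming ``cube`` is a lazily-loaded Iris cube; note that
+recent Dask versions expose these options via ``dask.config.set`` rather than
+``dask.set_options``)::
+
+    import dask
+
+    # Force deferred operations onto the single-threaded scheduler while
+    # realising this cube's data.
+    with dask.config.set(scheduler="synchronous"):
+        data = cube.data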
+
+
+.. _delayed_netcdf_save:
+
+Delayed NetCDF Saving
+---------------------
+
+When saving data to NetCDF files, it is possible to *delay* writing lazy content to the
+output file, to be performed by `Dask `_ later,
+thus enabling parallel save operations.
+
+This works in the following way:
+ 1. an :func:`iris.save` call is made, with a NetCDF file output and the additional
+ keyword ``compute=False``.
+ This is currently *only* available when saving to NetCDF, so it is documented in
+ the Iris NetCDF file format API. See: :func:`iris.fileformats.netcdf.save`.
+
+ 2. the call creates the output file, but does not fill in variables' data, where
+ the data is a lazy array in the Iris object. Instead, these variables are
+ initially created "empty".
+
+ 3. the :func:`iris.save` call returns a ``result`` which is a
+ :class:`~dask.delayed.Delayed` object.
+
+ 4. the save can be completed later by calling ``result.compute()``, or by passing it
+ to the :func:`dask.compute` call.
+
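+For example, a minimal sketch of the workflow above (assuming ``cube`` holds
+lazy data)::
+
+    import iris
+
+    # Steps 1-3: create the file, leaving lazy variables "empty", and keep
+    # the returned dask ``Delayed`` object.
+    result = iris.save(cube, "output.nc", compute=False)
+
+    # ... other saves or computations can proceed here ...
+
+    # Step 4: complete the save by computing and writing the lazy data.
+    result.compute()
+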
+The benefit of this is that costly data transfer operations can be performed in
+parallel with writes to other data files. Also, where array contents are calculated
+from shared lazy input data, Dask can compute them efficiently in parallel
+(i.e. without re-fetching), similar to what :meth:`iris.cube.CubeList.realise_data`
+can do.
+
+.. note::
+ This feature does **not** enable parallel writes to the *same* NetCDF output file.
+ That can only be done on certain operating systems, with a specially configured
+ build of the NetCDF C library, and is not supported by Iris at present.
diff --git a/docs/src/voted_issues.rst b/docs/src/voted_issues.rst
index 7d983448b9..0c99638bbd 100644
--- a/docs/src/voted_issues.rst
+++ b/docs/src/voted_issues.rst
@@ -20,7 +20,7 @@ the below table.
.. raw:: html
-
+
👍
@@ -42,7 +42,8 @@ the below table.
"ajax": 'https://raw.githubusercontent.com/scitools/voted_issues/main/voted-issues.json',
"lengthMenu": [10, 25, 50, 100],
"pageLength": 10,
- "order": [[ 0, "desc" ]]
+ "order": [[ 0, "desc" ]],
+ "bJQueryUI": true,
} );
} );
diff --git a/docs/src/whatsnew/1.7.rst b/docs/src/whatsnew/1.7.rst
index 44ebe9ec60..1d7c7c3f60 100644
--- a/docs/src/whatsnew/1.7.rst
+++ b/docs/src/whatsnew/1.7.rst
@@ -21,14 +21,14 @@ Features
transparent; for example, before the introduction of biggus, MemoryErrors
were likely for very large datasets::
- >>> result = extremely_large_cube.collapsed('time', iris.analyis.MEAN)
+ >>> result = extremely_large_cube.collapsed('time', iris.analysis.MEAN)
MemoryError
Now, for supported operations, the evaluation is lazy (i.e. it doesn't take
place until the actual data is subsequently requested) and can handle data
larger than available system memory::
- >>> result = extremely_large_cube.collapsed('time', iris.analyis.MEAN)
+ >>> result = extremely_large_cube.collapsed('time', iris.analysis.MEAN)
>>> print(type(result))
diff --git a/docs/src/whatsnew/2.0.rst b/docs/src/whatsnew/2.0.rst
index 400a395e90..4ef50a4101 100644
--- a/docs/src/whatsnew/2.0.rst
+++ b/docs/src/whatsnew/2.0.rst
@@ -36,7 +36,7 @@ Features
* The *new* in-place arithmetic operators :data:`__iadd__`, :data:`__idiv__`,
:data:`__imul__`, :data:`__isub__`, and :data:`__itruediv__` have been
added to support :class:`~iris.cube.Cube` operations :data:`+=`,
- :data:`/=`, :data:`*=`, and :data:`-=`. Note that, for **divison**
+ :data:`/=`, :data:`*=`, and :data:`-=`. Note that, for **division**
*__future__.division* is always in effect.
* Changes to the :class:`iris.coords.Coord`:
diff --git a/docs/src/whatsnew/2.1.rst b/docs/src/whatsnew/2.1.rst
index 18c562d3da..3613bc0c23 100644
--- a/docs/src/whatsnew/2.1.rst
+++ b/docs/src/whatsnew/2.1.rst
@@ -1,3 +1,5 @@
+.. include:: ../common_links.inc
+
v2.1 (06 Jun 2018)
******************
@@ -67,7 +69,7 @@ Incompatible Changes
as an alternative.
* This release of Iris contains a number of updated metadata translations.
- See this
+ See this
`changelist `_
for further information.
@@ -84,7 +86,7 @@ Internal
calendar.
* Iris updated its time-handling functionality from the
- `netcdf4-python `_
+ `netcdf4-python`__
``netcdftime`` implementation to the standalone module
`cftime `_.
cftime is entirely compatible with netcdftime, but some issues may
@@ -92,6 +94,8 @@ Internal
In this situation, simply replacing ``netcdftime.datetime`` with
``cftime.datetime`` should be sufficient.
+__ `netCDF4`_
+
* Iris now requires version 2 of Matplotlib, and ``>=1.14`` of NumPy.
- Full requirements can be seen in the `requirements `_
+ Full requirements can be seen in the `requirements`_
directory of the Iris source.
diff --git a/docs/src/whatsnew/3.0.rst b/docs/src/whatsnew/3.0.rst
index 223ef60011..4107ae5d2b 100644
--- a/docs/src/whatsnew/3.0.rst
+++ b/docs/src/whatsnew/3.0.rst
@@ -6,10 +6,9 @@ v3.0 (25 Jan 2021)
This document explains the changes made to Iris for this release
(:doc:`View all changes `.)
-.. dropdown:: :opticon:`report` v3.0.0 Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.0.0 Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
@@ -42,10 +41,9 @@ This document explains the changes made to Iris for this release
v3.0.1 (27 Jan 2021)
====================
-.. dropdown:: :opticon:`alert` v3.0.1 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.0.1 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
The patches included in this release include:
@@ -61,10 +59,9 @@ v3.0.1 (27 Jan 2021)
v3.0.2 (27 May 2021)
====================
-.. dropdown:: :opticon:`alert` v3.0.2 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.0.2 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
The patches included in this release include:
@@ -115,10 +112,9 @@ v3.0.2 (27 May 2021)
v3.0.3 (07 July 2021)
=====================
-.. dropdown:: :opticon:`alert` v3.0.3 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.0.3 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
The patches included in this release include:
@@ -133,10 +129,9 @@ v3.0.3 (07 July 2021)
v3.0.4 (22 July 2021)
=====================
-.. dropdown:: :opticon:`alert` v3.0.4 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.0.4 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
The patches included in this release include:
@@ -147,7 +142,7 @@ v3.0.4 (22 July 2021)
Firstly, ancillary-variables or cell-measures with long names can now widen the cube "dimensions map" to fit,
whereas previously printing these cases caused an Exception.
Secondly, cube units are now always printed, whereas previously they were missed out any time that the
- "dimensions map" was widened to accomodate long coordinate names.
+ "dimensions map" was widened to accommodate long coordinate names.
(:pull:`4233`)(:pull:`4238`)
💼 **Internal**
diff --git a/docs/src/whatsnew/3.1.rst b/docs/src/whatsnew/3.1.rst
index 1f076572bc..744543f514 100644
--- a/docs/src/whatsnew/3.1.rst
+++ b/docs/src/whatsnew/3.1.rst
@@ -7,10 +7,9 @@ This document explains the changes made to Iris for this release
(:doc:`View all changes `.)
-.. dropdown:: :opticon:`report` v3.1.0 Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.1.0 Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
diff --git a/docs/src/whatsnew/3.2.rst b/docs/src/whatsnew/3.2.rst
index 723f26345e..87a85f9061 100644
--- a/docs/src/whatsnew/3.2.rst
+++ b/docs/src/whatsnew/3.2.rst
@@ -6,11 +6,9 @@ v3.2 (15 Feb 2022)
This document explains the changes made to Iris for this release
(:doc:`View all changes `.)
-
-.. dropdown:: :opticon:`report` v3.2.0 Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.2.0 Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
@@ -28,10 +26,9 @@ This document explains the changes made to Iris for this release
v3.2.1 (11 Mar 2022)
====================
-.. dropdown:: :opticon:`alert` v3.2.1 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.2.1 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
📢 **Welcome** to `@dennissergeev`_, who made his first contribution to Iris. Nice work!
@@ -170,7 +167,7 @@ v3.2.1 (11 Mar 2022)
as well as some long-standing bugs with vertical coordinates and number
formats. (:pull:`4411`)
-#. `@rcomer`_ fixed :meth:`~iris.cube.Cube.subset` to alway return ``None`` if
+#. `@rcomer`_ fixed :meth:`~iris.cube.Cube.subset` to always return ``None`` if
no value match is found. (:pull:`4417`)
#. `@wjbenfold`_ changed :meth:`iris.util.points_step` to stop it from warning
diff --git a/docs/src/whatsnew/3.3.rst b/docs/src/whatsnew/3.3.rst
index c2e47f298a..4ab5a2e973 100644
--- a/docs/src/whatsnew/3.3.rst
+++ b/docs/src/whatsnew/3.3.rst
@@ -6,11 +6,9 @@ v3.3 (1 Sep 2022)
This document explains the changes made to Iris for this release
(:doc:`View all changes `.)
-
-.. dropdown:: :opticon:`report` v3.3.0 Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: |iris_version| v3.3.0 Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
@@ -34,31 +32,30 @@ This document explains the changes made to Iris for this release
v3.3.1 (29 Sep 2022)
====================
-.. dropdown:: :opticon:`alert` v3.3.1 Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.3.1 Patches
+ :color: secondary
+ :icon: alert
:animate: fade-in
The patches in this release of Iris include:
- #. `@pp-mo`_ fixed the Jupyter notebook display of :class:`~iris.cube.CubeList`.
- (:issue:`4973`, :pull:`4976`)
+ #. `@pp-mo`_ fixed the Jupyter notebook display of :class:`~iris.cube.CubeList`.
+ (:issue:`4973`, :pull:`4976`)
- #. `@pp-mo`_ fixed a bug in NAME loaders where data with no associated statistic would
- load as a cube with invalid cell-methods, which cannot be printed or saved to netcdf.
- (:issue:`3288`, :pull:`4933`)
+ #. `@pp-mo`_ fixed a bug in NAME loaders where data with no associated statistic would
+ load as a cube with invalid cell-methods, which cannot be printed or saved to netcdf.
+ (:issue:`3288`, :pull:`4933`)
- #. `@pp-mo`_ ensured that :data:`iris.cube.Cube.cell_methods` must always be an iterable
- of :class:`iris.coords.CellMethod` objects (:pull:`4933`).
+ #. `@pp-mo`_ ensured that :data:`iris.cube.Cube.cell_methods` must always be an iterable
+ of :class:`iris.coords.CellMethod` objects (:pull:`4933`).
- #. `@trexfeathers`_ advanced the Cartopy pin to ``>=0.21``, as Cartopy's
- change to default Transverse Mercator projection affects an Iris test.
- See `SciTools/cartopy@fcb784d`_ and `SciTools/cartopy@8860a81`_ for more
- details. (:pull:`4992`)
+ #. `@trexfeathers`_ advanced the Cartopy pin to ``>=0.21``, as Cartopy's
+ change to default Transverse Mercator projection affects an Iris test.
+ See `SciTools/cartopy@fcb784d`_ and `SciTools/cartopy@8860a81`_ for more
+ details. (:pull:`4992`)
- #. `@trexfeathers`_ introduced the ``netcdf4!=1.6.1`` pin to avoid a
- problem with segfaults. (:pull:`4992`)
+ #. `@trexfeathers`_ introduced the ``netcdf4!=1.6.1`` pin to avoid a
+ problem with segfaults. (:pull:`4992`)
📢 Announcements
diff --git a/docs/src/whatsnew/3.4.rst b/docs/src/whatsnew/3.4.rst
index 1ad676c049..e8d4f0fd2b 100644
--- a/docs/src/whatsnew/3.4.rst
+++ b/docs/src/whatsnew/3.4.rst
@@ -7,10 +7,9 @@ This document explains the changes made to Iris for this release
(:doc:`View all changes `.)
-.. dropdown:: :opticon:`report` v3.4.0 Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: v3.4.0 Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
@@ -26,15 +25,29 @@ This document explains the changes made to Iris for this release
* We have **begun refactoring Iris' regridding**, which has already improved
performance and functionality, with more potential in future!
* We have made several other significant `🚀 Performance Enhancements`_.
- * Please note that **Iris cannot currently work with the latest NetCDF4
- releases**. The pin is set to ``
And finally, get in touch with us on :issue:`GitHub` if you have
any issues or feature requests for improving Iris. Enjoy!
+
+v3.4.1 (21 Feb 2023)
+====================
+
+.. dropdown:: v3.4.1 Patches
+ :color: secondary
+ :icon: alert
+ :animate: fade-in
+
+ The patches in this release of Iris include:
+
+ #. `@trexfeathers`_ and `@pp-mo`_ made Iris' use of the `netCDF4`_ library
+ thread-safe. (:pull:`5095`)
+
+ #. `@trexfeathers`_ and `@pp-mo`_ removed the netCDF4 pin mentioned in
+ `🔗 Dependencies`_ point 3. (:pull:`5095`)
+
+
📢 Announcements
================
diff --git a/docs/src/whatsnew/3.5.rst b/docs/src/whatsnew/3.5.rst
new file mode 100644
index 0000000000..c6699ee842
--- /dev/null
+++ b/docs/src/whatsnew/3.5.rst
@@ -0,0 +1,214 @@
+.. include:: ../common_links.inc
+
+v3.5 (27 Apr 2023)
+****************************************
+
+This document explains the changes made to Iris for this release
+(:doc:`View all changes <index>`.)
+
+.. dropdown:: v3.5 Release Highlights
+ :color: primary
+ :icon: info
+ :animate: fade-in
+ :open:
+
+ The highlights for this major/minor release of Iris include:
+
+ * We added support for plugins.
+ * We allowed the usage of Iris objects as weights
+ for cube aggregations.
+ * We made Iris' use of the `netCDF4`_ library
+ thread-safe.
+ * We improved performance by changing the netCDF loader to
+ fetch data immediately from small netCDF
+ variables, instead of creating a dask array.
+ * We added notes within docstrings clarifying whether operations
+ maintain lazy data or not.
+ * We're so proud to fully support `@ed-hawkins`_ and `#ShowYourStripes`_ ❤️
+
+ And finally, get in touch with us on :issue:`GitHub<new/choose>` if you have
+ any issues or feature requests for improving Iris. Enjoy!
+
+
+📢 Announcements
+================
+
+#. Congratulations to `@ESadek-MO`_ who has become a core developer for Iris! 🎉
+#. Welcome and congratulations to `@HGWright`_, `@scottrobinson02`_ and
+ `@agriyakhetarpal`_ who made their first contributions to Iris! 🎉
+
+
+✨ Features
+===========
+
+#. `@bsherratt`_ added support for plugins - see the corresponding
+ :ref:`documentation page <community_plugins>` for further information.
+ (:pull:`5144`)
+
+#. `@rcomer`_ enabled lazy evaluation of :obj:`~iris.analysis.RMS` calculations
+ with weights. (:pull:`5017`)
+
+#. `@schlunma`_ allowed the usage of cubes, coordinates, cell measures, or
+ ancillary variables as weights for cube aggregations
+ (:meth:`iris.cube.Cube.collapsed`, :meth:`iris.cube.Cube.aggregated_by`, and
+ :meth:`iris.cube.Cube.rolling_window`). This automatically adapts cube units
+ if necessary; a brief sketch follows this list. (:pull:`5084`)
+
+#. `@lbdreyer`_ and `@trexfeathers`_ (reviewer) added :func:`iris.plot.hist`
+ and :func:`iris.quickplot.hist`; a brief sketch also follows this list.
+ (:pull:`5189`)
+
+#. `@tinyendian`_ edited :func:`~iris.analysis.cartography.rotate_winds` to
+ enable lazy computation of rotated wind vector components (:issue:`4934`,
+ :pull:`4972`)
+
+#. `@ESadek-MO`_ updated to the latest CF Standard Names Table v80
+ (07 February 2023). (:pull:`5244`)
+
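+ A minimal sketch of the new weighted aggregation, using a cube as
+ ``weights`` (the names, values and units here are illustrative, not taken
+ from the PR)::
+
+     import iris.analysis
+     import numpy as np
+     from iris.coords import DimCoord
+     from iris.cube import Cube
+
+     time = DimCoord(np.arange(3.0), standard_name="time",
+                     units="days since 2000-01-01")
+     cube = Cube(np.arange(3.0), var_name="x", units="kg",
+                 dim_coords_and_dims=[(time, 0)])
+     weights = Cube(np.array([1.0, 2.0, 1.0]), var_name="w", units="s",
+                    dim_coords_and_dims=[(time, 0)])
+
+     # Weighted SUM: the result units are adapted automatically (kg times s).
+     total = cube.collapsed("time", iris.analysis.SUM, weights=weights)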
+
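+ And a sketch of the new histogram plotting (random data, for illustration
+ only)::
+
+     import matplotlib.pyplot as plt
+     import numpy as np
+     import iris.quickplot as qplt
+     from iris.cube import Cube
+
+     cube = Cube(np.random.default_rng(0).normal(size=100), var_name="x")
+     qplt.hist(cube, bins=10)  # extra keywords pass through to matplotlib
+     plt.show()
+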
+🐛 Bugs Fixed
+=============
+
+#. `@schlunma`_ fixed :meth:`iris.cube.CubeList.concatenate` so that it
+ preserves derived coordinates. (:issue:`2478`, :pull:`5096`)
+
+#. `@trexfeathers`_ and `@pp-mo`_ made Iris' use of the `netCDF4`_ library
+ thread-safe. (:pull:`5095`)
+
+#. `@ESadek-MO`_ removed the check and error raise when saving
+ cubes with masked :class:`iris.coords.CellMeasure`.
+ (:issue:`5147`, :pull:`5181`)
+
+#. `@scottrobinson02`_ fixed :func:`iris.util.new_axis` creating an anonymous new
+ dimension, when the scalar coord provided is already a dim coord.
+ (:issue:`4415`, :pull:`5194`)
+
+#. `@HGWright`_ and `@trexfeathers`_ (reviewer) changed the way
+ :class:`~iris.coords.CellMethod` objects are printed, to be more CF compliant.
+ (:pull:`5224`)
+
+#. `@stephenworsley`_ fixed the way discontiguities were discovered for 2D coords.
+ Previously, the only bounds being compared were the bottom right bound in one
+ cell with the bottom left bound in the cell to its right, and the top left bound
+ in a cell with the bottom left bound in the cell above it. Now all bounds are
+ compared with all adjacent bounds from neighbouring cells. This affects
+ :meth:`~iris.coords.Coord.is_contiguous` and :func:`iris.util.find_discontiguities`
+ where additional discontiguities may be detected which previously were not.
+
+
+💣 Incompatible Changes
+=======================
+
+#. N/A
+
+
+🚀 Performance Enhancements
+===========================
+
+#. `@pp-mo`_ changed the netCDF loader to fetch data immediately from small netCDF
+ variables, instead of creating a dask array: This saves both time and memory.
+ Note that some cubes, coordinates etc loaded from netCDF will now have real data
+ where previously it was lazy. (:pull:`5229`)
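+
+ For example (a small self-contained sketch; the file name is illustrative)::
+
+     import numpy as np
+     import iris
+     from iris.cube import Cube
+
+     iris.save(Cube(np.arange(4.0), var_name="x"), "tiny.nc")
+     cube = iris.load_cube("tiny.nc")
+     print(cube.has_lazy_data())  # small variables may now load as real data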
+
+
+🔥 Deprecations
+===============
+
+#. N/A
+
+
+🔗 Dependencies
+===============
+
+#. `@trexfeathers`_ introduced the ``libnetcdf <4.9`` pin. (:pull:`5242`)
+
+
+📚 Documentation
+================
+
+#. `@rcomer`_ clarified instructions for updating gallery tests. (:pull:`5100`)
+#. `@tkknight`_ unpinned ``pydata-sphinx-theme`` and set the default to use
+ the light version (not dark) while we make the docs dark mode friendly
+ (:pull:`5129`)
+
+#. `@jonseddon`_ updated the citation to a more recent version of Iris. (:pull:`5116`)
+
+#. `@rcomer`_ linked the :obj:`~iris.analysis.PERCENTILE` aggregator from the
+ :obj:`~iris.analysis.MEDIAN` docstring, noting that the former handles lazy
+ data. (:pull:`5128`)
+
+#. `@trexfeathers`_ updated the WSL link to Microsoft's latest documentation,
+ and removed an ECMWF link in the ``v1.0`` What's New that was failing the
+ linkcheck CI. (:pull:`5109`)
+
+#. `@trexfeathers`_ added a new top-level :doc:`/community/index` section,
+ as a one-stop place to find out about getting involved, and how we relate
+ to other projects. (:pull:`5025`)
+
+#. The **Iris community**, with help from the **Xarray community**, produced
+ the :doc:`/community/iris_xarray` page, highlighting the similarities and
+ differences between the two packages. (:pull:`5025`)
+
+#. `@bjlittle`_ added a new section to the `README.md`_ to show our support
+ for the outstanding work of `@ed-hawkins`_ et al for `#ShowYourStripes`_.
+ (:pull:`5141`)
+
+#. `@HGWright`_ fixed some typos in Gitwash. (:pull:`5145`)
+
+#. `@ESadek-MO`_ added notes to function docstrings to
+ clarify whether the function preserves laziness or not. (:pull:`5137`)
+
+💼 Internal
+===========
+
+#. `@bouweandela`_ and `@trexfeathers`_ (reviewer) modernized and simplified
+ the code of ``iris.analysis._Groupby``. (:pull:`5015`)
+
+#. `@fnattino`_ changed the order of ``ncgen`` arguments in the command to
+ create NetCDF files for testing (caused errors on OS X). (:pull:`5105`)
+
+#. `@rcomer`_ removed some old infrastructure that printed test timings.
+ (:pull:`5101`)
+
+#. `@lbdreyer`_ and `@trexfeathers`_ (reviewer) added coverage testing. This
+ can be enabled by using the "--coverage" flag when running the tests with
+ nox i.e. ``nox --session tests -- --coverage``. (:pull:`4765`)
+
+#. `@lbdreyer`_ and `@trexfeathers`_ (reviewer) removed the ``--coding-tests``
+ option from Iris' test runner. (:pull:`4765`)
+
+#. `@lbdreyer`_ removed the Iris TestRunner. Tests are now run via nox or
+ pytest. (:pull:`5205`)
+
+#. `@agriyakhetarpal`_ and `@trexfeathers`_ prevented the GitHub action for
+ publishing releases to PyPI from running in forks.
+ (:pull:`5220`, :pull:`5248`)
+
+#. `@trexfeathers`_ moved the benchmark runner conveniences from ``noxfile.py``
+ to a dedicated ``benchmarks/bm_runner.py``. (:pull:`5215`)
+
+#. `@bjlittle`_, in a follow-up to :pull:`4972`, enforced the ``dask>=2022.09.0``
+ minimum pin for the first use of `dask.array.ma.empty_like`_ and replaced the
+ `@tinyendian`_ workaround. (:pull:`5225`)
+
+#. `@HGWright`_, `@bjlittle`_ and `@trexfeathers`_ removed the legacy pin for
+ ``numpy`` array printing and replaced the test results files to match default
+ ``numpy`` output. (:pull:`5235`)
+
+
+.. comment
+ Whatsnew author names (@github name) in alphabetical order. Note that,
+ core dev names are automatically included by the common_links.inc:
+
+.. _@agriyakhetarpal: https://github.com/agriyakhetarpal
+.. _@ed-hawkins: https://github.com/ed-hawkins
+.. _@fnattino: https://github.com/fnattino
+.. _@scottrobinson02: https://github.com/scottrobinson02
+.. _@tinyendian: https://github.com/tinyendian
+
+
+.. comment
+ Whatsnew resources in alphabetical order:
+
+.. _#ShowYourStripes: https://showyourstripes.info/s/globe/
+.. _README.md: https://github.com/SciTools/iris#-----
+.. _dask.array.ma.empty_like: https://docs.dask.org/en/stable/generated/dask.array.ma.empty_like.html
diff --git a/docs/src/whatsnew/3.6.rst b/docs/src/whatsnew/3.6.rst
new file mode 100644
index 0000000000..151c63ef51
--- /dev/null
+++ b/docs/src/whatsnew/3.6.rst
@@ -0,0 +1,183 @@
+.. include:: ../common_links.inc
+
+v3.6 (18 May 2023)
+******************
+
+This document explains the changes made to Iris for this release
+(:doc:`View all changes <index>`.)
+
+
+.. dropdown:: v3.6 Release Highlights
+ :color: primary
+ :icon: info
+ :animate: fade-in
+ :open:
+
+ We're so excited about our recent support for **delayed saving of lazy data
+ to netCDF** (:pull:`5191`) that we're celebrating this important step change
+ in behaviour with its very own dedicated release 🥳
+
+ By using ``iris.save(..., compute=False)`` you can now save to multiple NetCDF files
+ in parallel. See the new ``compute`` keyword in :func:`iris.fileformats.netcdf.save`.
+ This can share and re-use any common (lazy) result computations, and it makes much
+ better use of resources during any file-system waiting (i.e., it can use such periods
+ to progress the *other* saves).
+
+ Usage example::
+
+ # Create output files with delayed data saving.
+ delayeds = [
+     iris.save(cubes, filepath, compute=False)
+     for cubes, filepath in zip(output_cubesets, output_filepaths)
+ ]
+ # Complete saves in parallel.
+ dask.compute(*delayeds)
+
+ This advance also includes **another substantial benefit**, because NetCDF saves can
+ now use a
+ `Dask.distributed scheduler `_.
+ With `Distributed `_ you can parallelise the
+ saves across a whole cluster. Whereas previously, the NetCDF saving *only* worked with
+ a "threaded" scheduler, limiting it to a single CPU.
+
+ We're so super keen for the community to leverage the benefit of this new
+ feature within Iris that we've brought this release forward several months.
+ As a result, this minor release of Iris is intentionally light in content.
+ However, there are some other goodies available for you to enjoy, such as:
+
+ * Performing lazy arithmetic with an Iris :class:`~iris.cube.Cube` and a
+ :class:`dask.array.Array`, and
+ * Various improvements to our documentation resulting from adoption of
+ `sphinx-design`_ and `sphinx-apidoc`_.
+
+ As always, get in touch with us on :issue:`GitHub<new/choose>`, particularly
+ if you have any feedback with regards to delayed saving, or have any issues
+ or feature requests for improving Iris. Enjoy!
+
+
+📢 Announcements
+================
+
+#. `@bjlittle`_ added the community `Contributor Covenant`_ code of conduct.
+ (:pull:`5291`)
+
+
+✨ Features
+===========
+
+#. `@pp-mo`_ and `@lbdreyer`_ supported delayed saving of lazy data, when writing to
+ the netCDF file format. See :ref:`delayed netCDF saves `.
+ Also with significant input from `@fnattino`_.
+ (:pull:`5191`)
+
+#. `@rcomer`_ tweaked binary operations so that dask arrays may safely be passed
+ to arithmetic operations and :func:`~iris.util.mask_cube`. (:pull:`4929`)
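+
+ A tiny sketch of the now-safe mixing (values illustrative)::
+
+     import dask.array as da
+     import numpy as np
+     from iris.cube import Cube
+
+     cube = Cube(np.arange(4.0), var_name="x")
+     result = cube + da.ones(4)  # returns a new Cube; safe as of this change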
+
+
+🐛 Bugs Fixed
+=============
+
+#. `@rcomer`_ enabled automatic replacement of a Matplotlib
+ :class:`~matplotlib.axes.Axes` with a Cartopy
+ :class:`~cartopy.mpl.geoaxes.GeoAxes` when the ``Axes`` is on a
+ :class:`~matplotlib.figure.SubFigure`. (:issue:`5282`, :pull:`5288`)
+
+
+💣 Incompatible Changes
+=======================
+
+#. N/A
+
+
+🚀 Performance Enhancements
+===========================
+
+#. N/A
+
+
+🔥 Deprecations
+===============
+
+#. N/A
+
+
+🔗 Dependencies
+===============
+
+#. `@rcomer`_ and `@bjlittle`_ (reviewer) added testing support for python
+ 3.11. (:pull:`5226`)
+
+#. `@rcomer`_ dropped support for python 3.8, in accordance with the NEP29_
+ recommendations (:pull:`5226`)
+
+#. `@trexfeathers`_ introduced the ``libnetcdf !=4.9.1`` and ``numpy !=1.24.3``
+ pins (:pull:`5274`)
+
+
+📚 Documentation
+================
+
+#. `@tkknight`_ migrated to `sphinx-design`_ over the legacy `sphinx-panels`_.
+ (:pull:`5127`)
+
+#. `@tkknight`_ updated the ``make`` target for ``help`` and added
+ ``livehtml`` to auto generate the documentation when changes are detected
+ during development. (:pull:`5258`)
+
+#. `@tkknight`_ updated the :ref:`installing_from_source` instructions to use
+ ``pip``. (:pull:`5273`)
+
+#. `@tkknight`_ removed the legacy custom sphinx extensions that generate the
+ API documentation. Instead use a less complex approach via
+ `sphinx-apidoc`_. (:pull:`5264`)
+
+#. `@trexfeathers`_ re-wrote the :ref:`iris_development_releases` documentation
+ for clarity, and wrote a step-by-step
+ :doc:`/developers_guide/release_do_nothing` for the release process.
+ (:pull:`5134`)
+
+#. `@trexfeathers`_ and `@tkknight`_ added a dark-mode friendly logo.
+ (:pull:`5278`)
+
+
+💼 Internal
+===========
+
+#. `@bjlittle`_ added the `codespell`_ `pre-commit`_ ``git-hook`` to automate
+ spell checking within the code-base. (:pull:`5186`)
+
+#. `@bjlittle`_ and `@trexfeathers`_ (reviewer) added a `check-manifest`_
+ GitHub Action and `pre-commit`_ ``git-hook`` to automate verification
+ of assets bundled within a ``sdist`` and binary ``wheel`` of our
+ `scitools-iris`_ PyPI package. (:pull:`5259`)
+
+#. `@rcomer`_ removed a now redundant copying workaround from Resolve testing.
+ (:pull:`5267`)
+
+#. `@bjlittle`_ and `@trexfeathers`_ (reviewer) migrated ``setup.cfg`` to
+ ``pyproject.toml``, as motivated by `PEP-0621`_. (:pull:`5262`)
+
+#. `@bjlittle`_ adopted `pypa/build`_ recommended best practice to build a
+ binary ``wheel`` from the ``sdist``. (:pull:`5266`)
+
+#. `@trexfeathers`_ enabled on-demand benchmarking of Pull Requests; see
+ :ref:`here `. (:pull:`5286`)
+
+
+.. comment
+ Whatsnew author names (@github name) in alphabetical order. Note that,
+ core dev names are automatically included by the common_links.inc:
+
+.. _@fnattino: https://github.com/fnattino
+
+
+.. comment
+ Whatsnew resources in alphabetical order:
+
+.. _check-manifest: https://github.com/mgedmin/check-manifest
+.. _Contributor Covenant: https://www.contributor-covenant.org/version/2/1/code_of_conduct/
+.. _NEP29: https://numpy.org/neps/nep-0029-deprecation_policy.html
+.. _PEP-0621: https://peps.python.org/pep-0621/
+.. _pypa/build: https://pypa-build.readthedocs.io/en/stable/
+.. _sphinx-design: https://github.com/executablebooks/sphinx-design
+.. _sphinx-panels: https://github.com/executablebooks/sphinx-panels
\ No newline at end of file
diff --git a/docs/src/whatsnew/index.rst b/docs/src/whatsnew/index.rst
index 005fac70c4..dce7458a13 100644
--- a/docs/src/whatsnew/index.rst
+++ b/docs/src/whatsnew/index.rst
@@ -12,6 +12,8 @@ What's New in Iris
:hidden:
latest.rst
+ 3.6.rst
+ 3.5.rst
3.4.rst
3.3.rst
3.2.rst
diff --git a/docs/src/whatsnew/latest.rst b/docs/src/whatsnew/latest.rst
index a38e426e6a..0e2896b7a1 100644
--- a/docs/src/whatsnew/latest.rst
+++ b/docs/src/whatsnew/latest.rst
@@ -7,16 +7,15 @@ This document explains the changes made to Iris for this release
(:doc:`View all changes <index>`.)
-.. dropdown:: :opticon:`report` |iris_version| Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: |iris_version| Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
The highlights for this major/minor release of Iris include:
- * We're so proud to fully support `@ed-hawkins`_ and `#ShowYourStripes`_ ❤️
+ * N/A
And finally, get in touch with us on :issue:`GitHub<new/choose>` if you have
any issues or feature requests for improving Iris. Enjoy!
@@ -25,16 +24,14 @@ This document explains the changes made to Iris for this release
📢 Announcements
================
-#. Congratulations to `@ESadek-MO`_ who has become a core developer for Iris! 🎉
-#. Welcome and congratulations to `@HGWright`_ for making his first contribution to Iris! 🎉
+#. N/A
✨ Features
===========
-#. `@bsherratt`_ added support for plugins - see the corresponding
- :ref:`documentation page <community_plugins>` for further information.
- (:pull:`5144`)
+#. `@rcomer`_ rewrote :func:`~iris.util.broadcast_to_shape` so it now handles
+ lazy data. (:pull:`5307`)
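+
+ A small sketch of the lazy path (shapes illustrative)::
+
+     import dask.array as da
+     from iris.util import broadcast_to_shape
+
+     lazy = da.arange(3)
+     # dim_map maps the array's dim 0 onto dim 1 of the target shape
+     result = broadcast_to_shape(lazy, (4, 3), (1,))
+     print(type(result))  # with this change, still a lazy dask array
+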
🐛 Bugs Fixed
@@ -52,8 +49,8 @@ This document explains the changes made to Iris for this release
🚀 Performance Enhancements
===========================
-#. N/A
-
+#. `@rcomer`_ made :meth:`~iris.cube.Cube.aggregated_by` faster. (:pull:`4970`)
+#. `@rsdavies`_ modified the CF compliant standard name for m01s00i023. (:issue:`4566`)
🔥 Deprecations
===============
@@ -70,54 +67,26 @@ This document explains the changes made to Iris for this release
📚 Documentation
================
-#. `@rcomer`_ clarified instructions for updating gallery tests. (:pull:`5100`)
-#. `@tkknight`_ unpinned ``pydata-sphinx-theme`` and set the default to use
- the light version (not dark) while we make the docs dark mode friendly
- (:pull:`5129`)
-
-#. `@jonseddon`_ updated the citation to a more recent version of Iris. (:pull:`5116`)
-
-#. `@rcomer`_ linked the :obj:`~iris.analysis.PERCENTILE` aggregator from the
- :obj:`~iris.analysis.MEDIAN` docstring, noting that the former handles lazy
- data. (:pull:`5128`)
-
-#. `@trexfeathers`_ updated the WSL link to Microsoft's latest documentation,
- and removed an ECMWF link in the ``v1.0`` What's New that was failing the
- linkcheck CI. (:pull:`5109`)
+#. `@tkknight`_ prepared the documentation for dark mode and enabled the option
+ to use it. By default the theme will be based on the user's system settings,
+ defaulting to ``light`` if no system setting is found. (:pull:`5299`)
-#. `@trexfeathers`_ added a new top-level :doc:`/community/index` section,
- as a one-stop place to find out about getting involved, and how we relate
- to other projects. (:pull:`5025`)
-
-#. The **Iris community**, with help from the **Xarray community**, produced
- the :doc:`/community/iris_xarray` page, highlighting the similarities and
- differences between the two packages. (:pull:`5025`)
-
-#. `@bjlittle`_ added a new section to the `README.md`_ to show our support
- for the outstanding work of `@ed-hawkins`_ et al for `#ShowYourStripes`_.
- (:pull:`5141`)
-
-#. `@HGWright`_ fixed some typos in Gitwash. (:pull:`5145`)
💼 Internal
===========
-#. `@fnattino`_ changed the order of ``ncgen`` arguments in the command to
- create NetCDF files for testing (caused errors on OS X). (:pull:`5105`)
-
-#. `@rcomer`_ removed some old infrastructure that printed test timings.
- (:pull:`5101`)
+#. `@pp-mo`_ supported loading and saving netcdf :class:`netCDF4.Dataset` compatible
+ objects in place of file-paths, as hooks for a forthcoming
+ `"Xarray bridge" `_ facility.
+ (:pull:`5214`)
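+
+ A hedged sketch (assuming, per the entry above, that :func:`iris.load`
+ accepts an open dataset; the file name is illustrative)::
+
+     import iris
+     import netCDF4
+     import numpy as np
+     from iris.cube import Cube
+
+     iris.save(Cube(np.arange(3.0), var_name="x"), "demo.nc")
+     ds = netCDF4.Dataset("demo.nc", mode="r")
+     cubes = iris.load(ds)  # an open Dataset in place of a file-path
+     ds.close()
+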
.. comment
Whatsnew author names (@github name) in alphabetical order. Note that,
core dev names are automatically included by the common_links.inc:
+.. _@rsdavies: https://github.com/rsdavies
+
-.. _@fnattino: https://github.com/fnattino
-.. _@ed-hawkins: https://github.com/ed-hawkins
.. comment
Whatsnew resources in alphabetical order:
-
-.. _#ShowYourStripes: https://showyourstripes.info/s/globe/
-.. _README.md: https://github.com/SciTools/iris#-----
diff --git a/docs/src/whatsnew/latest.rst.template b/docs/src/whatsnew/latest.rst.template
index a0ce415a65..966a91e976 100644
--- a/docs/src/whatsnew/latest.rst.template
+++ b/docs/src/whatsnew/latest.rst.template
@@ -7,10 +7,9 @@ This document explains the changes made to Iris for this release
(:doc:`View all changes <index>`.)
-.. dropdown:: :opticon:`report` |iris_version| Release Highlights
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: |iris_version| Release Highlights
+ :color: primary
+ :icon: info
:animate: fade-in
:open:
@@ -22,29 +21,27 @@ This document explains the changes made to Iris for this release
any issues or feature requests for improving Iris. Enjoy!
-NOTE: section below is a template for bugfix patches
+NOTE: section BELOW is a template for bugfix patches
====================================================
- (Please remove this section when creating an initial 'latest.rst')
+ (Please remove this section when creating an initial 'latest.rst')
-v3.X.X (DD MMM YYYY)
-====================
+|iris_version| |build_date|
+===========================
-.. dropdown:: :opticon:`alert` v3.X.X Patches
- :container: + shadow
- :title: text-primary text-center font-weight-bold
- :body: bg-light
+.. dropdown:: |iris_version| Patches
+ :color: primary
+ :icon: alert
:animate: fade-in
The patches in this release of Iris include:
#. N/A
-NOTE: section above is a template for bugfix patches
+NOTE: section ABOVE is a template for bugfix patches
====================================================
(Please remove this section when creating an initial 'latest.rst')
-
📢 Announcements
================
@@ -108,4 +105,3 @@ NOTE: section above is a template for bugfix patches
.. comment
Whatsnew resources in alphabetical order:
-
diff --git a/etc/cf-standard-name-table.xml b/etc/cf-standard-name-table.xml
index 9c5fcd9cf0..3b145ae86e 100644
--- a/etc/cf-standard-name-table.xml
+++ b/etc/cf-standard-name-table.xml
@@ -1,11 +1,18 @@
- 79
- 2022-03-19T15:25:54Z
+ 81
+ 2023-04-25T10:43:33Z
Centre for Environmental Data Analysis
support@ceda.ac.uk
+
+ 1
+
+
+ Acoustic area backscattering strength is 10 times the log10 of the ratio of the area backscattering coefficient to the reference value, 1 (m2 m-2). Area backscattering coefficient is the integral of the volume backscattering coefficient over a defined distance. Volume backscattering coefficient is the linear form of acoustic_volume_backscattering_strength_in_sea_water. For further details see MacLennan et. al (2002) doi:10.1006/jmsc.2001.1158.
+
+
s
@@ -13,6 +20,20 @@
The quantity with standard name acoustic_signal_roundtrip_travel_time_in_sea_water is the time taken for an acoustic signal to propagate from the emitting instrument to a reflecting surface and back again to the instrument. In the case of an instrument based on the sea floor and measuring the roundtrip time to the sea surface, the data are commonly used as a measure of ocean heat content.
+
+ 1
+
+
+ Target strength is 10 times the log10 of the ratio of backscattering cross-section to the reference value, 1 m2. Backscattering cross-section is a parameter computed from the intensity of the backscattered sound wave relative to the intensity of the incident sound wave. For further details see MacLennan et. al (2002) doi:10.1006/jmsc.2001.1158.
+
+
+
+ 1
+
+
+ Acoustic volume backscattering strength is 10 times the log10 of the ratio of the volume backscattering coefficient to the reference value, 1 m-1. Volume backscattering coefficient is the integral of the backscattering cross-section divided by the volume sampled. Backscattering cross-section is a parameter computed from the intensity of the backscattered sound wave relative to the intensity of the incident sound wave. The parameter is computed to provide a measurement that is proportional to biomass density per unit volume in the field of fisheries acoustics. For further details see MacLennan et. al (2002) doi:10.1006/jmsc.2001.1158.
+
+
m
@@ -27,6 +48,13 @@
The "aerodynamic_resistance" is the resistance to mixing through the boundary layer toward the surface by means of the dominant process, turbulent transport. Reference: Wesely, M. L., 1989, doi:10.1016/0004-6981(89)90153-4.
+
+ 1
+
+
+ A variable with the standard_name of aerosol_type_in_atmosphere_layer_in_air contains either strings which indicate the type of the aerosol determined following a certain aerosol typing schema, or flags which can be translated to strings using flag_values and flag_meanings attributes. "Layer" means any layer with upper and lower boundaries that have constant values in some vertical coordinate. There must be a vertical coordinate variable indicating the extent of the layer(s).
+
+
year
@@ -237,6 +265,20 @@
Altitude is the (geometric) height above the geoid, which is the reference geopotential surface. The geoid is similar to mean sea level.
+
+ m
+
+
+ The altitude at top of atmosphere boundary layer is the elevation above sea level of the top of the (atmosphere) planetary boundary layer. The phrase "defined_by" provides the information of the tracer used for identifying the atmospheric boundary layer top. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. "By ranging instrument" means that the backscattering is obtained through ranging techniques like lidar and radar.
+
+
+
+ m
+
+
+ The altitude at top of atmosphere mixed layer is the elevation above sea level of the top of the (atmosphere) mixed layer or convective boundary layer. The phrase "defined_by" provides the information of the tracer used for identifying the atmospheric boundary layer top. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. "By ranging instrument" means that the volume backscattering coefficient is obtained through ranging techniques like lidar and radar.
+
+
m
@@ -300,6 +342,13 @@
The "Angstrom exponent" appears in the formula relating aerosol optical thickness to the wavelength of incident radiation: T(lambda) = T(lambda0) * [lambda/lambda0] ** (-1 * alpha) where alpha is the Angstrom exponent, lambda is the wavelength of incident radiation, lambda0 is a reference wavelength, T(lambda) and T(lambda0) are the values of aerosol optical thickness at wavelengths lambda and lambda0, respectively. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. To specify the relative humidity and temperature at which the quantity described by the standard name applies, provide scalar coordinate variables with standard names of "relative_humidity" and "air_temperature".
+
+ 1
+
+
+ The Angstrom exponent of volume backwards scattering is the Angstrom exponent related only to the aerosol backwards scattering component. It is alpha in the following equation relating volume backwards scattering (back) at the wavelength lambda to volume backwards scattering at a different wavelength lambda0: back(lambda) = back(lambda0) * [lambda/lambda0] ** (-1 * alpha). "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+
K
@@ -2568,6 +2617,13 @@
"Area fraction" is the fraction of a grid cell's horizontal area that has some characteristic of interest. It is evaluated as the area of interest divided by the grid cell area. It may be expressed as a fraction, a percentage, or any other dimensionless representation of a fraction. "Burned area" means the area of burned vegetation.
+
+ 1
+
+
+ The Canadian Fire Weather Index (CFWI) is a numerical rating of potential frontal fire intensity from the Canadian Forest Fire Index System. It indicates fire intensity by combining the rate of spread with the amount of fuel being consumed and is also used for general public information about fire danger conditions. It is a function of wind speed, temperature, relative humidity, and precipitation. The calculation accounts for multiple layers of flammable material on the ground as well as fine fuels above the surface, combined with the expected rate of spread of fire. The index is open ended.
+
+
1
@@ -2932,6 +2988,13 @@
cloud_top refers to the top of the highest cloud. Altitude is the (geometric) height above the geoid, which is the reference geopotential surface. The geoid is similar to mean sea level.
+
+ 1
+
+
+ A variable with the standard_name of cloud_type contains either strings which indicate the cloud type, or flags which can be translated to strings using flag_values and flag_meanings attributes.
+
+
m-3
@@ -2946,6 +3009,20 @@
"Compressive strength" is a measure of the capacity of a material to withstand compressive forces. If compressive forces are exerted on a material in excess of its compressive strength, fracturing will occur. "Sea ice" means all ice floating in the sea which has formed from freezing sea water, rather than by other processes such as calving of land ice to form icebergs.
+
+ Pa
+
+
+ The maximum force applied as axial strain to an unconfined frozen soil sample before failure.
+
+
+
+ Pa
+
+
+ The maximum force applied as axial strain to an unconfined soil sample before failure.
+
+
1
@@ -3086,6 +3163,13 @@
Covariance refers to the sample covariance rather than the population covariance. The quantity with standard name covariance_over_longitude_of_northward_wind_and_air_temperature is the covariance of the deviations of meridional air velocity and air temperature about their respective zonal mean values. The data variable must be accompanied by a vertical coordinate variable or scalar coordinate variable and is calculated on an isosurface of that vertical coordinate. "Northward" indicates a vector component which is positive when directed northward (negative southward). Wind is defined as a two-dimensional (horizontal) air velocity vector, with no vertical component. (Vertical motion in the atmosphere has the standard name "upward_air_velocity"). Air temperature is the bulk temperature of the air, not the surface (skin) temperature.
+
+ 1
+
+
+ The phrase "ratio_of_X_to_Y" means X/Y. It may be expressed as a fraction, a percentage, or any other dimensionless representation of a fraction. Also known as specific gravity, where soil represents a dry soil sample. The density of a substance is its mass per unit volume.
+
+
m
@@ -3632,6 +3716,13 @@
Downwelling radiation is radiation from above. It does not mean "net downward". The sign convention is that "upwelling" is positive upwards and "downwelling" is positive downwards. Spherical irradiance is the radiation incident on unit area of a hemispherical (or "2-pi") collector. It is sometimes called "scalar irradiance". The direction (up/downwelling) is specified. Radiation incident on a 4-pi collector has standard names of "omnidirectional spherical irradiance". A coordinate variable for radiation wavelength should be given the standard name radiation_wavelength.
+
+ kg m-2
+
+
+ The quantity with standard name drainage_amount_through_base_of_soil_model is the amount of water that drains through the bottom of a soil column extending from the surface to a specified depth. “Drainage” is the process of removal of excess water from soil by gravitational flow. "Amount" means mass per unit area. A vertical coordinate variable or scalar coordinate with standard name "depth" should be used to specify the depth to which the soil column extends.
+
+
1
@@ -3653,6 +3744,13 @@
"Content" indicates a quantity per unit area. "Layer" means any layer with upper and lower boundaries that have constant values in some vertical coordinate. There must be a vertical coordinate variable indicating the extent of the layer(s). If the layers are model layers, the vertical coordinate can be model_level_number, but it is recommended to specify a physical coordinate (in a scalar or auxiliary coordinate variable) as well. Dry energy is the sum of dry static energy and kinetic energy. Dry static energy is the sum of enthalpy and potential energy (itself the sum of gravitational and centripetal potential energy). Enthalpy can be written either as (1) CpT, where Cp is heat capacity at constant pressure, T is absolute temperature, or (2) U+pV, where U is internal energy, p is pressure and V is volume.
+
+ kg m-3
+
+
+ The density of the soil after oven drying until constant mass is reached. Volume is determined from the field sample volume. The density of a substance is its mass per unit volume.
+
+
J m-2
@@ -3968,6 +4066,13 @@
The diameter of an aerosol particle as selected by its electrical mobility. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. To specify the relative humidity and temperature at which the quantity described by the standard name applies, provide scalar coordinate variables with standard names of "relative_humidity" and "air_temperature".
+
+ 1
+
+
+ Isotopic enrichment of 13C, often called delta 13C, is a measure of the ratio of stable isotopes 13C:12C. It is a parameterisation of the 13C/12C isotopic ratio in the sample with respect to the isotopic ratio in a reference standard (in this case Vienna Pee Dee Belemnite). It is computed using the formula (((13C/12C)sample / (13C/12C)standard) - 1) * 1000. Particulate means suspended solids of all sizes.
+
+
1e-3
@@ -3975,6 +4080,13 @@
Isotopic enrichment of 14C, often called d14C or delta14C (lower case delta), is used to calculate the fossil fuel contribution to atmospheric carbon dioxide using isotopic ratios of carbon. It is a parameterisation of the 14C/12C isotopic ratio in the sample with respect to the isotopic ratio in a reference standard. It is computed using the formula (((14C/12C)sample / (14C/12C)standard) - 1) * 1000. The quantity called D14C, or Delta14C (upper case delta) is d14C corrected for isotopic fractionation using the 13C/12C ratio as follows: D14C = d14C - 2(dC13 + 25)(1+d14C/1000). If the sample is enriched in 14C relative to the standard, then the data value is positive. Reference: Stuiver, M. and H.A. Polach, 1977, Discussion reporting of 14C data, Radiocarbon, Volume 19, No. 3, 355-363, doi: 10.1017/S0033822200003672. The reference standard used in the calculation of delta14C should be specified by attaching a long_name attribute to the data variable. "C" means the element carbon and "14C" is the radioactive isotope "carbon-14", having six protons and eight neutrons and used in radiocarbon dating.
+
+ 1
+
+
+ Isotopic enrichment of 15N, often called delta 15N, is a measure of the ratio of stable isotopes 15N:14N. It is a parameterisation of the 15N/14N isotopic ratio in the sample with respect to the isotopic ratio in a reference standard (in this case atmospheric nitrogen). It is computed using the formula (((15N/14N)sample / (15N/14N)standard) - 1) * 1000. Particulate means suspended solids of all sizes.
+
+
J m-2
@@ -4164,6 +4276,13 @@
A lightning flash is a compound event, usually consisting of several discharges. Frequency is the number of oscillations of a wave, or the number of occurrences of an event, per unit time.
+
+ kg m-3
+
+
+ The density of the soil in its naturally frozen condition. Also known as frozen bulk density. The density of a substance is its mass per unit volume.
+
+
kg m-2
@@ -4381,6 +4500,20 @@
The ground_level_altitude is the geometric height of the upper boundary of the solid Earth above the geoid, which is the reference geopotential surface. The geoid is similar to mean sea level.
+
+ degree
+
+
+ The slope angle is the angle (in degrees) measured between the ground (earth) surface plane and a flat, horizontal surface.
+
+
+
+ degree
+
+
+ Commonly known as aspect, it is the azimuth (in degrees) of a terrain slope, taken as the direction with the greatest downslope change in elevation on the ground (earth) surface. The direction is a bearing in the usual geographical sense, measured positive clockwise from due north.
+
+
1
@@ -7412,6 +7545,13 @@
"shortwave" means shortwave radiation. Radiance is the radiative flux in a particular direction, per unit of solid angle. If radiation is isotropic, the radiance is independent of direction, so the direction should not be specified. If the radiation is directionally dependent, a standard name of upwelling or downwelling radiance should be chosen instead.
+
+ 1
+
+
+ The Keetch Byram Drought Index (KBDI) is a numerical drought index ranging from 0 to 800 that estimates the cumulative moisture deficiency in soil. It is a cumulative index. It is a function of maximum temperature and precipitation over the previous 24 hours.
+
+
J m-2
@@ -7804,6 +7944,20 @@
A quality flag that reports the result of the Location test, which checks that a location is within reasonable bounds. The linkage between the data variable and this variable is achieved using the ancillary_variables attribute. There are standard names for other specific quality tests which take the form of X_quality_flag. Quality information that does not match any of the specific quantities should be given the more general standard name of quality_flag.
+
+ m-3
+
+
+ The aerosol particle number size distribution is the number concentration of aerosol particles, normalised to the decadal logarithmic size interval the concentration applies to, as a function of particle diameter. A coordinate variable with the standard name of electrical_mobility_particle_diameter, aerodynamic_particle_diameter, or optical_particle_diameter should be specified to indicate that the property applies at specific particle sizes selected by the indicated method. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. "log10_X" means common logarithm (i.e. base 10) of X. "stp" means standard temperature (0 degC) and pressure (101325 Pa).
+
+
+
+ m-3
+
+
+ The aerosol particle number size distribution is the number concentration of aerosol particles, normalised to the decadal logarithmic size interval the concentration applies to, as a function of particle diameter. A coordinate variable with the standard name of electrical_mobility_particle_diameter, aerodynamic_particle_diameter, or optical_particle_diameter should be specified to indicate that the property applies at specific particle sizes selected by the indicated method. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. "log10_X" means common logarithm (i.e. base 10) of X.
+
+
m-3
@@ -7811,6 +7965,13 @@
The cloud condensation nuclei number size distribution is the number concentration of aerosol particles, normalised to the decadal logarithmic size interval the concentration applies to, as a function of particle diameter, where the particle acts as condensation nucleus for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. A coordinate variable with the standard name of electrical_mobility_particle_diameter should be specified to indicate that the property applies at specific mobility particle sizes. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology. "log10_X" means common logarithm (i.e. base 10) of X. "stp" means standard temperature (0 degC) and pressure (101325 Pa).
+
+ m-3
+
+
+ The cloud condensation nuclei number size distribution is the number concentration of aerosol particles, normalised to the decadal logarithmic size interval the concentration applies to, as a function of particle diameter, where the particle acts as condensation nucleus for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. A coordinate variable with the standard name of electrical_mobility_particle_diameter should be specified to indicate that the property applies at specific mobility particle sizes. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology. "log10_X" means common logarithm (i.e. base 10) of X.
+
+
degree_east
@@ -8028,6 +8189,34 @@
"Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula of 19'-hexanoyloxyfucoxanthin is C48H68O8. The equivalent term in the NERC P01 Parameter Usage Vocabulary may be found at http://vocab.nerc.ac.uk/collection/P01/current/HEXAXXXX/2/.
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. The absorption equivalent black carbon mass concentration is obtained by conversion from the particle light absorption coefficient with a suitable mass absorption cross-section. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm10 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 10 micrometers. The absorption equivalent black carbon mass concentration is obtained by conversion from the particle light absorption coefficient with a suitable mass absorption cross-section. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm1 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 1 micrometer. The absorption equivalent black carbon mass concentration is obtained by conversion from the particle light absorption coefficient with a suitable mass absorption cross-section. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm2p5 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 2.5 micrometers. The absorption equivalent black carbon mass concentration is obtained by conversion from the particle light absorption coefficient with a suitable mass absorption cross-section. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
kg m-3
@@ -8238,6 +8427,34 @@
Mass concentration means mass per unit volume and is used in the construction mass_concentration_of_X_in_Y, where X is a material constituent of Y. A chemical species denoted by X may be described by a single term such as 'nitrogen' or a phrase such as 'nox_expressed_as_nitrogen'. The chemical formula for carbon dioxide is CO2.
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. Chemically, "carbon" is the total sum of elemental, organic, and inorganic carbon. In measurements of carbonaceous aerosols, inorganic carbon is neglected and its mass is assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm10 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 10 micrometers. Chemically, "carbon" is the total sum of elemental, organic, and inorganic carbon. In measurements of carbonaceous aerosols, inorganic carbon is neglected and its mass is assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm1 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 1 micrometer. Chemically, "carbon" is the total sum of elemental, organic, and inorganic carbon. In measurements of carbonaceous aerosols, inorganic carbon is neglected and its mass is assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm2p5 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 2.5 micrometers. Chemically, "carbon" is the total sum of elemental, organic, and inorganic carbon. In measurements of carbonaceous aerosols, inorganic carbon is neglected and its mass is assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
kg m-3
@@ -8350,6 +8567,13 @@
"Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". Chlorophylls are the green pigments found in most plants, algae and cyanobacteria; their presence is essential for photosynthesis to take place. There are several different forms of chlorophyll that occur naturally. All contain a chlorin ring (chemical formula C20H16N4) which gives the green pigment and a side chain whose structure varies. The naturally occurring forms of chlorophyll contain between 35 and 55 carbon atoms. The chemical formula of chlorophyll c3 is C36H44MgN4O7. The equivalent term in the NERC P01 Parameter Usage Vocabulary may be found at http://vocab.nerc.ac.uk/collection/P01/current/CHLC03PX/2/.
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". Chlorophylls are the green pigments found in most plants, algae and cyanobacteria; their presence is essential for photosynthesis to take place. There are several different forms of chlorophyll that occur naturally. All contain a chlorin ring (chemical formula C20H16N4) which gives the green pigment and a side chain whose structure varies. The naturally occurring forms of chlorophyll contain between 35 and 55 carbon atoms.
+
+
kg m-3
@@ -8490,6 +8714,34 @@
Mass concentration means mass per unit volume and is used in the construction mass_concentration_of_X_in_Y, where X is a material constituent of Y. A chemical species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol takes up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the aerosol. "Dry aerosol particles" means aerosol particles without any water uptake. Chemically, "elemental carbon" is the carbonaceous fraction of particulate matter that is thermally stable in an inert atmosphere to high temperatures near 4000K and can only be gasified by oxidation starting at temperatures above 340 C. It is assumed to be inert and non-volatile under atmospheric conditions and insoluble in any solvent (Ogren and Charlson, 1983).
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. Chemically, "elemental carbon" is the carbonaceous fraction of particulate matter that is thermally stable in an inert atmosphere to high temperatures near 4000K and can only be gasified by oxidation starting at temperatures above 340 C. It is assumed to be inert and non-volatile under atmospheric conditions and insoluble in any solvent (Ogren and Charlson, 1983). In measurements of carbonaceous aerosols, elemental carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm10 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 10 micrometers. Chemically, "elemental carbon" is the carbonaceous fraction of particulate matter that is thermally stable in an inert atmosphere to high temperatures near 4000K and can only be gasified by oxidation starting at temperatures above 340 C. It is assumed to be inert and non-volatile under atmospheric conditions and insoluble in any solvent (Ogren and Charlson, 1983). In measurements of carbonaceous aerosols, elemental carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm1 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 1 micrometer. Chemically, "elemental carbon" is the carbonaceous fraction of particulate matter that is thermally stable in an inert atmosphere to high temperatures near 4000K and can only be gasified by oxidation starting at temperatures above 340 C. It is assumed to be inert and non-volatile under atmospheric conditions and insoluble in any solvent (Ogren and Charlson, 1983). In measurements of carbonaceous aerosols, elemental carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm2p5 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 2.5 micrometers. Chemically, "elemental carbon" is the carbonaceous fraction of particulate matter that is thermally stable in an inert atmosphere to high temperatures near 4000K and can only be gasified by oxidation starting at temperatures above 340 C. It is assumed to be inert and non-volatile under atmospheric conditions and insoluble in any solvent (Ogren and Charlson, 1983). In measurements of carbonaceous aerosols, elemental carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
kg m-3
@@ -8903,6 +9155,34 @@
Mass concentration means mass per unit volume and is used in the construction mass_concentration_of_X_in_Y, where X is a material constituent of Y. A chemical species denoted by X may be described by a single term such as 'nitrogen' or a phrase such as 'nox_expressed_as_nitrogen'. "Noy" describes a family of chemical species. The family usually includes atomic nitrogen (N), nitrogen monoxide (NO), nitrogen dioxide (NO2), dinitrogen pentoxide (N2O5), nitric acid (HNO3), peroxynitric acid (HNO4), bromine nitrate (BrONO2) , chlorine nitrate (ClONO2) and organic nitrates (most notably peroxyacetyl nitrate, sometimes referred to as PAN, (CH3COO2NO2)). The list of individual species that are included in a quantity having a group chemical standard name can vary between models. Where possible, the data variable should be accompanied by a complete description of the species represented, for example, by using a comment attribute. The phrase 'expressed_as' is used in the construction A_expressed_as_B, where B is a chemical constituent of A. It means that the quantity indicated by the standard name is calculated solely with respect to the B contained in A, neglecting all other chemical constituents of A.
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. Chemically, "organic carbon aerosol" refers to the carbonaceous fraction of particulate matter contained in any of the vast number of compounds where carbon is chemically combined with hydrogen and other elements like O, S, N, P, Cl, etc. In measurements of carbonaceous aerosols, organic carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm10 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 10 micrometers. Chemically, "organic carbon aerosol" refers to the carbonaceous fraction of particulate matter contained in any of the vast number of compounds where carbon is chemically combined with hydrogen and other elements like O, S, N, P, Cl, etc. In measurements of carbonaceous aerosols, organic carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm1 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 1 micrometer. Chemically, "organic carbon aerosol" refers to the carbonaceous fraction of particulate matter contained in any of the vast number of compounds where carbon is chemically combined with hydrogen and other elements like O, S, N, P, Cl, etc. In measurements of carbonaceous aerosols, organic carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
+
+ kg m-3
+
+
+ "Mass concentration" means mass per unit volume and is used in the construction "mass_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. Aerosol particles take up ambient water (a process known as hygroscopic growth) depending on the relative humidity and the composition of the particles. "Dry aerosol particles" means aerosol particles without any water uptake. "Pm2p5 aerosol" means atmospheric particulate compounds with an aerodynamic diameter of less than or equal to 2.5 micrometers. Chemically, "organic carbon aerosol" refers to the carbonaceous fraction of particulate matter contained in any of the vast number of compounds where carbon is chemically combined with hydrogen and other elements like O, S, N, P, Cl, etc. In measurements of carbonaceous aerosols, organic carbon samples may also include some inorganic carbon compounds, whose mass is neglected and assumed to be distributed between the elemental and organic carbon components of the aerosol particles. Reference: Petzold, A., Ogren, J. A., Fiebig, M., Laj, P., Li, S.-M., Baltensperger, U., Holzer-Popp, T., Kinne, S., Pappalardo, G., Sugimoto, N., Wehrli, C., Wiedensohler, A., and Zhang, X.-Y.: Recommendations for reporting "black carbon" measurements, Atmos. Chem. Phys., 13, 8365–8379, https://doi.org/10.5194/acp-13-8365-2013, 2013.
+
+
kg m-3
@@ -10555,6 +10835,13 @@
"Mass fraction" is used in the construction "mass_fraction_of_X_in_Y", where X is a material constituent of Y. It means the ratio of the mass of X to the mass of Y (including X). A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for xylene is C6H4C2H6. In chemistry, xylene is a generic term for a group of three isomers of dimethylbenzene. The IUPAC names for the isomers are 1,2-dimethylbenzene, 1,3-dimethylbenzene and 1,4-dimethylbenzene. Xylene is an aromatic hydrocarbon. There are standard names that refer to aromatic compounds as a group, as well as those for individual species.
+
+ 1
+
+
+ The quantity with standard name mass_ratio_of_moisture_to_dry_soil is also known as the water content of a soil or the dry-basis gravimetric moisture content. It is the ratio of the mass of water (liquid and solid) to the mass of the dried sample. The phrase "ratio_of_X_to_Y" means X/Y. It may be expressed as a fraction, a percentage, or any other dimensionless representation of a fraction.
+
+
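Since this quantity is a plain ratio, a worked example may help. The helper below is a hypothetical illustration (not part of any library) of deriving the ratio from a sample weighed wet and again after oven drying:

```python
def mass_ratio_of_moisture_to_dry_soil(wet_mass_kg, dry_mass_kg):
    """Ratio of the mass of water (liquid and solid) to the mass of the dried sample."""
    water_mass = wet_mass_kg - dry_mass_kg  # moisture driven off by drying
    return water_mass / dry_mass_kg  # dimensionless (canonical units "1")

# A 0.125 kg field sample that dries to 0.100 kg held 0.025 kg of water,
# giving a ratio of 0.25, i.e. 25% when expressed as a percentage.
print(mass_ratio_of_moisture_to_dry_soil(0.125, 0.100))  # 0.25
```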
s-1
@@ -10597,6 +10884,20 @@
Depth is the vertical distance below the surface. 'Undersaturation' means that a solution is unsaturated with respect to a solute. Calcite is a mineral that is a polymorph of calcium carbonate. The chemical formula of calcite is CaCO3. Standard names also exist for aragonite, another polymorph of calcium carbonate. The "minimum depth of undersaturation", sometimes called the "saturation horizon", is the shallowest depth at which a body of water is an undersaturated solution of a named solute.
+
+ 1
+
+
+ The phrase "ratio_of_X_to_Y" means X/Y. It may be expressed as a fraction, a percentage, or any other dimensionless representation of a fraction. It is the lower limit of the water content at which a 3 mm diameter cylindrical soil sample will break in 3 to 10 mm pieces. It is the lower limit of the plastic state, which has the liquid limit as the upper bound. Known as the plastic limit.
+
+
+
+ 1
+
+
+ The phrase "ratio_of_X_to_Y" means X/Y. It may be expressed as a fraction, a percentage, or any other dimensionless representation of a fraction. It is the lower limit of the water content at which a soil sample will flow in a viscous manner. Known as the liquid limit.
+
+
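The two limits above bound the plastic state, so a sample's consistency follows from comparing its water-content ratio against them. A minimal sketch, with illustrative (not authoritative) limit values:

```python
def soil_consistency(water_content, plastic_limit, liquid_limit):
    """Classify a soil sample from its water content ratio and the two limits."""
    if water_content < plastic_limit:
        return "semi-solid"  # below the lower bound of the plastic state
    if water_content <= liquid_limit:
        return "plastic"     # within the plastic state
    return "liquid"          # flows in a viscous manner

# Illustrative limits of 0.20 (plastic) and 0.45 (liquid), as dimensionless fractions:
print(soil_consistency(0.30, plastic_limit=0.20, liquid_limit=0.45))  # "plastic"
```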
W m-2
@@ -10863,6 +11164,13 @@
Model level number should be understood as equivalent to layer number.
+
+ 1
+
+
+ The modified Fosberg Fire Weather Index (mFFWI) is a measure of the potential effect of weather conditions on wildland fire. The Fosberg Fire Weather Index is a function of temperature, wind, and humidity. It is modified with a fuel availability factor based on the Keetch-Byram Drought Index.
+
+
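As a hedged sketch of the dependence on temperature, wind and humidity described above: the commonly cited Fosberg formulation combines an equilibrium moisture content (Simard's piecewise fit; temperature in degrees Fahrenheit, relative humidity in percent, wind in miles per hour) with a moisture damping polynomial. The coefficients below are assumptions drawn from that literature, and the KBDI-based fuel availability factor of the modified index is left as a caller-supplied input because its exact form is not given here.

```python
def fosberg_fwi(temp_f, rh_pct, wind_mph, fuel_availability=1.0):
    """Sketch of the (modified) Fosberg Fire Weather Index.

    fuel_availability: KBDI-derived factor for the modified index (mFFWI);
    1.0 reproduces the unmodified FFWI. All coefficients are assumptions.
    """
    # Equilibrium moisture content (percent), piecewise in relative humidity.
    if rh_pct < 10.0:
        emc = 0.03229 + 0.281073 * rh_pct - 0.000578 * rh_pct * temp_f
    elif rh_pct <= 50.0:
        emc = 2.22749 + 0.160107 * rh_pct - 0.014784 * temp_f
    else:
        emc = (21.0606 + 0.005565 * rh_pct**2
               - 0.00035 * rh_pct * temp_f - 0.483199 * rh_pct)
    m = emc / 30.0
    eta = max(0.0, 1.0 - 2.0 * m + 1.5 * m**2 - 0.5 * m**3)  # moisture damping
    return eta * (1.0 + wind_mph**2) ** 0.5 / 0.3002 * fuel_availability

print(round(fosberg_fwi(temp_f=90.0, rh_pct=20.0, wind_mph=15.0), 1))
```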
kg m-2
@@ -11346,6 +11654,13 @@
"Mole concentration" means number of moles per unit volume, also called "molarity", and is used in the construction "mole_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Dissolved nitrogen" means the sum of all nitrogen in solution: inorganic nitrogen (nitrite, nitrate and ammonium) plus nitrogen in carbon compounds.
+
+ mol m-3
+
+
+ The sum of dissolved organic carbon-13 component concentrations. "Mole concentration" means number of moles per unit volume, also called "molarity", and is used in the construction "mole_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Organic carbon" describes a family of chemical species and is the term used in standard names for all species belonging to the family that are represented within a given model. The list of individual species that are included in a quantity having a group chemical standard name can vary between models. Where possible, the data variable should be accompanied by a complete description of the species represented, for example, by using a comment attribute. "C" means the element carbon and "13C" is the stable isotope "carbon-13", having six protons and seven neutrons.
+
+
mol m-3
@@ -11430,6 +11745,13 @@
Mole concentration means number of moles per unit volume, also called "molarity", and is used in the construction mole_concentration_of_X_in_Y, where X is a material constituent of Y. A chemical species denoted by X may be described by a single term such as 'nitrogen' or a phrase such as 'nox_expressed_as_nitrogen'. The chemical symbol for mercury is Hg.
+
+ mol m-3
+
+
+ "Mole concentration" means number of moles per unit volume, also called "molarity", and is used in the construction "mole_concentration_of_X_in_Y", where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula of guanosine triphosphate is C10H16N5O14P3.
+
+
mol m-3
@@ -12165,6 +12487,20 @@
Mole fraction is used in the construction mole_fraction_of_X_in_Y, where X is a material constituent of Y. The chemical formula of bromine nitrate is BrONO2.
+
+ mol mol-1
+
+
+ "Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for bromochloromethane is CH2BrCl. The IUPAC name is bromochloromethane.
+
+
+
+ mol mol-1
+
+
+ "Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for bromodichloromethane is CHBrCl2. The IUPAC name is bromodichloromethane.
+
+
1
@@ -12298,6 +12634,20 @@
"Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The phrase "expressed_as" is used in the construction A_expressed_as_B, where B is a chemical constituent of A. It means that the quantity indicated by the standard name is calculated solely with respect to the B contained in A, neglecting all other chemical constituents of A. "Clox" describes a family of chemical species consisting of inorganic chlorine compounds with the exception of hydrogen chloride (HCl) and chlorine nitrate (ClONO2). "Clox" is the term used in standard names for all species belonging to the family that are represented within a given model. The list of individual species that are included in a quantity with a group chemical standard name can vary between models. Where possible, the data variable should be accompanied by a complete description of the species represented, for example, by using a comment attribute. "Inorganic chlorine", sometimes referred to as Cly, describes a family of chemical species which result from the degradation of source gases containing chlorine (CFCs, HCFCs, VSLS) and natural inorganic chlorine sources such as sea salt and other aerosols. Standard names that use the term "inorganic_chlorine" are used for quantities that contain all inorganic chlorine species including HCl and ClONO2.
+
+ mol mol-1
+
+
+ "Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for dibromochloromethane is CHBr2Cl. The IUPAC name is dibromochloromethane.
+
+
+
+ mol mol-1
+
+
+ "Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for dibromomethane is CH2Br2. The IUPAC name is dibromomethane.
+
+
1
@@ -12886,6 +13236,13 @@
"Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for toluene is C6H5CH3. Toluene has the same structure as benzene, except that one of the hydrogen atoms is replaced by a methyl group. The IUPAC name for toluene is methylbenzene.
+
+ mol mol-1
+
+
+ "Mole fraction" is used in the construction "mole_fraction_of_X_in_Y", where X is a material constituent of Y. A chemical or biological species denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for tribromomethane is CHBr3. The IUPAC name is tribromomethane.
+
+
1
@@ -12921,6 +13278,13 @@
The construction "moles_of_X_per_unit_mass_in_Y" is also called "molality" of X in Y, where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". "Dissolved inorganic carbon" describes a family of chemical species in solution, including carbon dioxide, carbonic acid and the carbonate and bicarbonate anions. "Dissolved inorganic carbon" is the term used in standard names for all species belonging to the family that are represented within a given model. The list of individual species that are included in a quantity having a group chemical standard name can vary between models. Where possible, the data variable should be accompanied by a complete description of the species represented, for example, by using a comment attribute.
+
+ mol kg-1
+
+
+ The construction "moles_of_X_per_unit_mass_in_Y" is also called "molality" of X in Y, where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for hydrogen peroxide is H2O2.
+
+
mol kg-1
@@ -12942,6 +13306,13 @@
moles_of_X_per_unit_mass_inY is also called "molality" of X in Y, where X is a material constituent of Y.
+
+ mol kg-1
+
+
+ The construction "moles_of_X_per_unit_mass_in_Y" is also called "molality" of X in Y, where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". The chemical formula for nitrous oxide is N2O. The chemical formula for nitrous oxide is N2O. The equivalent term in the NERC P01 Parameter Usage Vocabulary may be found at http://vocab.nerc.ac.uk/collection/P01/current/DN2OZZ01/.
+
+
mol kg-1
@@ -12949,6 +13320,20 @@
moles_of_X_per_unit_mass_inY is also called "molality" of X in Y, where X is a material constituent of Y.
+
+ mol kg-1
+
+
+ The construction "moles_of_X_per_unit_mass_in_Y" is also called "molality" of X in Y, where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". Particulate means suspended solids of all sizes. Biogenic silica is a hydrated form of silica (silicon dioxide) with the chemical formula SiO2.nH2O sometimes referred to as opaline silica or opal. It is created by biological processes and in sea water it is predominantly the skeletal material of diatoms.
+
+
+
+ mol kg-1
+
+
+ The construction "moles_of_X_per_unit_mass_in_Y" is also called "molality" of X in Y, where X is a material constituent of Y. A chemical species or biological group denoted by X may be described by a single term such as "nitrogen" or a phrase such as "nox_expressed_as_nitrogen". Particulate means suspended solids of all sizes. Particulate inorganic carbon is carbon bound in molecules ionically that may be liberated from the particles as carbon dioxide by acidification.
+
+
mol kg-1
@@ -13166,6 +13551,41 @@
A phrase assuming_condition indicates that the named quantity is the value which would obtain if all aspects of the system were unaltered except for the assumption of the circumstances specified by the condition. "shortwave" means shortwave radiation. "Upward" indicates a vector component which is positive when directed upward (negative downward). Net upward radiation is the difference between radiation from below (upwelling) and radiation from above (downwelling). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ %
+
+
+ 1000 hour fuel moisture (FM1000) represents the modelled moisture content in the dead fuels in the 3 to 8 inch diameter class and the layer of the forest floor about 4 inches below the surface. The value is based on a running 7-day average. The 1000-hour time lag fuel moisture is a function of length of day (as influenced by latitude and calendar date), daily temperature and relative humidity extremes (maximum and minimum values) and the 24-hour precipitation duration values for a 7-day period. It is a component in the US National Fire Danger Rating System. The US National Fire Danger Rating System comprises several numeric indexes that rate the potential over a large area for wildland fires to ignite, spread, and require action to suppress or manage. It was designed for use in the continental United States, and all its components are relative, not absolute.
+
+
+
+ %
+
+
+ 100 hour fuel moisture (FM100) represents the modelled moisture content of dead fuels in the 1 to 3 inch diameter class. It can also be used as a very rough estimate of the average moisture content of the forest floor from three-fourths inch to 4 inches below the surface. The 100-hour time lag fuel moisture is a function of length of day (as influenced by latitude and calendar date), maximum and minimum temperature and relative humidity, and precipitation duration in the previous 24 hours. It is a component in the US National Fire Danger Rating System. The US National Fire Danger Rating System comprises several numeric indexes that rate the potential over a large area for wildland fires to ignite, spread, and require action to suppress or manage. It was designed for use in the continental United States, and all its components are relative, not absolute.
+
+
+
+ 1
+
+
+ The Burning Index (BI) is a numeric value closely related to the flame length in feet multiplied by 10, which is related to the contribution of fire behaviour to the effort of containing a fire. The BI is a function of fire spread and fire intensity and is derived from a combination of Spread and Energy Release Components. The Spread Component is a rating of the forward rate of spread of a head fire and wind is a key input. The scale is open-ended, which allows the range of numbers to adequately define fire problems, even in time of low to moderate fire danger. Computed BI values represent the near upper limit to be expected on the rating area. In other words, if a fire occurs in the worst fuel, weather and topography conditions of the rating area, these numbers indicate its expected fire line intensities and flame length. It is an index in the US National Fire Danger Rating System. The US National Fire Danger Rating System comprises several numeric indexes that rate the potential over a large area for wildland fires to ignite, spread, and require action to suppress or manage. It was designed for use in the continental United States, and all its components are relative, not absolute.
+
+
+
+ J m-2
+
+
+ The Energy Release Component (ERC) is a number related to the available energy per unit area within the flaming front at the head of a fire. It is usually given in BTU ft-2. Daily variations in ERC are due to changes in moisture content of the various fuels present, both live and dead. It may also be considered a composite fuel moisture value as it reflects the contribution that all live and dead fuels have to potential fire intensity. Energy Release Component is a cumulative index. The scale is open-ended and relative. Energy Release Component values depend on the fuel model input into the calculations and interpretation of precise values varies with ecology and region. It is an index in the US National Fire Danger Rating System. The US National Fire Danger Rating System comprises several numeric indexes that rate the potential over a large area for wildland fires to ignite, spread, and require action to suppress or manage. It was designed for use in the continental United States, and all its components are relative, not absolute.
+
+
+
+ 1
+
+
+ Severe Fire Danger Index (SFDI) is the normalized product of normalized Energy Release Component (ERC) and normalized Burning Index (BI) from the United States National Fire Danger Rating System (NFDRS). While SFDI is not officially part of the National Fire Danger Rating System, it is related to and intended to supplement NFDRS. It is commonly categorized into five classes based on percentile: low (0-60), moderate (60-80), high (80-90), very high (90-97), and extreme (97-100). It can be extended to future conditions by introducing an unprecedented category for values above the historical 100th percentile. As it is locally normalized, its interpretation remains the same across space.
+
+
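The "normalized product of normalized components" recipe above translates directly into code. A minimal sketch, assuming the normalizations are empirical percentiles against a local historical record (the helper names are hypothetical):

```python
import numpy as np

def empirical_percentile(value, record):
    """Fraction of the historical record at or below the given value (0..1)."""
    return float((np.asarray(record) <= value).mean())

def sfdi(erc_today, bi_today, erc_history, bi_history):
    """Sketch: normalized product of percentile-normalized ERC and BI."""
    product_history = [empirical_percentile(e, erc_history) *
                       empirical_percentile(b, bi_history)
                       for e, b in zip(erc_history, bi_history)]
    product_today = (empirical_percentile(erc_today, erc_history) *
                     empirical_percentile(bi_today, bi_history))
    return empirical_percentile(product_today, product_history)

def sfdi_class(p):
    """Five classes from the percentile bands in the description."""
    bands = [(0.60, "low"), (0.80, "moderate"), (0.90, "high"),
             (0.97, "very high"), (1.00, "extreme")]
    return next(label for upper, label in bands if p <= upper)
```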
1
@@ -13614,6 +14034,13 @@
"Number concentration" means the number of particles or other specified objects per unit volume. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. "stp" means standard temperature (0 degC) and pressure (101325 Pa). The surface called "surface" means the lower boundary of the atmosphere.
+
+ m-3
+
+
+ "Number concentration" means the number of particles or other specified objects per unit volume. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself.
+
+
m-3
@@ -13628,6 +14055,13 @@
"Number concentration" means the number of particles or other specified objects per unit volume. "Biological taxon" is a name or other label identifying an organism or a group of organisms as belonging to a unit of classification in a hierarchical taxonomy. There must be an auxiliary coordinate variable with standard name biological_taxon_name to identify the taxon in human readable format and optionally an auxiliary coordinate variable with standard name biological_taxon_lsid to provide a machine-readable identifier. See Section 6.1.2 of the CF convention (version 1.8 or later) for information about biological taxon auxiliary coordinate variables.
+
+ m-3
+
+
+ "Number concentration" means the number of particles or other specified objects per unit volume. "Pollen grain" refers to the male gametophyte of seed plants (either angiosperms or gymnosperms). The number concentration of pollen grains refers to the number of individual pollen grains per unit volume. "Biological taxon" is a name or other label identifying an organism or a group of organisms as belonging to a unit of classification in a hierarchical taxonomy. There must be an auxiliary coordinate variable with standard name biological_taxon_name to identify the taxon in human readable format and optionally an auxiliary coordinate variable with standard name biological_taxon_identifier to provide a machine-readable identifier. See Section 6.1.2 of the CF convention (version 1.8 or later) for information about biological taxon auxiliary coordinate variables.
+
+
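The required taxon bookkeeping maps onto an auxiliary coordinate in Iris. A minimal sketch with invented data and taxon label, assuming an Iris version whose standard-name table includes the names used here:

```python
import numpy as np
from iris.coords import AuxCoord
from iris.cube import Cube

# Hypothetical pollen number concentrations for one taxon at three times.
cube = Cube(
    np.array([120.0, 95.0, 240.0]),
    long_name="pollen grain number concentration",  # placeholder name
    units="m-3",
)
# Human-readable taxon label, as the description requires.
cube.add_aux_coord(
    AuxCoord(["Betula pendula"], standard_name="biological_taxon_name")
)
```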
m-3
@@ -13635,6 +14069,13 @@
The cloud condensation nuclei number concentration is the total number of aerosol particles per unit volume independent of and integrated over particle size that act as condensation nuclei for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology. "stp" means standard temperature (0 degC) and pressure (101325 Pa).
+
+ m-3
+
+
+ "Number concentration" means the number of particles or other specified objects per unit volume. The cloud condensation nuclei number concentration is the total number of aerosol particles per unit volume independent of and integrated over particle size that act as condensation nuclei for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology.
+
+
m-3
@@ -13768,6 +14209,34 @@
A variable with the standard name of number_of_observations contains the number of discrete observations or measurements from which the values of another data variable have been derived. The linkage between the data variable and the variable with a standard_name of number_of_observations is achieved using the ancillary_variables attribute.
+
+ m-3
+
+
+ The aerosol particle number size distribution is the number concentration of aerosol particles as a function of particle diameter. A coordinate variable with the standard name of electrical_mobility_particle_diameter, aerodynamic_particle_diameter, or optical_particle_diameter should be specified to indicate that the property applies at specific particle sizes selected by the indicated method. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. "log10_X" means common logarithm (i.e. base 10) of X. "stp" means standard temperature (0 degC) and pressure (101325 Pa).
+
+
+
+ m-3
+
+
+ The aerosol particle number size distribution is the number concentration of aerosol particles as a function of particle diameter. A coordinate variable with the standard name of electrical_mobility_particle_diameter, aerodynamic_particle_diameter, or optical_particle_diameter should be specified to indicate that the property applies at specific particle sizes selected by the indicated method. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection.
+
+
+
+ m-3
+
+
+ The cloud condensation nuclei number size distribution is the number concentration of aerosol particles as a function of particle diameter, where the particle acts as condensation nucleus for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. A coordinate variable with the standard name of electrical_mobility_particle_diameter should be specified to indicate that the property applies at specific mobility particle sizes. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology. "stp" means standard temperature (0 degC) and pressure (101325 Pa).
+
+
+
+ m-3
+
+
+ The cloud condensation nuclei number size distribution is the number concentration of aerosol particles as a function of particle diameter, where the particle acts as condensation nucleus for liquid-phase clouds. A coordinate variable with the standard name of relative_humidity should be specified to indicate that the property refers to a specific supersaturation with respect to liquid water. A coordinate variable with the standard name of electrical_mobility_particle_diameter should be specified to indicate that the property applies at specific mobility particle sizes. To specify the relative humidity at which the particle sizes were selected, provide a scalar coordinate variable with the standard name of relative_humidity_for_aerosol_particle_size_selection. The ability of a particle to act as a condensation nucleus is determined by its size, chemical composition, and morphology.
+
+
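These size-distribution entries prescribe which coordinate variables must accompany the data variable. A minimal Iris sketch follows, with invented diameter bins and values and a placeholder long_name (the entries' own standard names are not shown in this extract); it assumes an Iris version whose standard-name table includes the coordinate names:

```python
import numpy as np
from iris.coords import AuxCoord, DimCoord
from iris.cube import Cube

diameter = DimCoord(
    np.array([1.0e-8, 2.0e-8, 5.0e-8, 1.0e-7]),  # invented bin diameters
    standard_name="electrical_mobility_particle_diameter",
    units="m",
)
cube = Cube(
    np.array([1.2e9, 3.4e9, 2.1e9, 0.7e9]),  # invented concentrations
    long_name="aerosol particle number size distribution",  # placeholder
    units="m-3",
    dim_coords_and_dims=[(diameter, 0)],
)
# Relative humidity at which the particle sizes were selected, as a scalar coordinate.
cube.add_aux_coord(AuxCoord(
    40.0,
    standard_name="relative_humidity_for_aerosol_particle_size_selection",
    units="%",
))
```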
kg s-1
@@ -14342,6 +14811,13 @@
The partial pressure of a dissolved gas in sea water is the partial pressure in air with which it would be in equilibrium. The partial pressure of a gaseous constituent of air is the pressure that it would exert if all other gaseous constituents were removed, assuming the volume, the temperature, and its number of moles remain unchanged. The chemical formula for methane is CH4.
+
+ degree_C
+
+
+ Perceived temperature (PT) is an equivalent air temperature of the actual thermal condition. It is the air temperature of a reference condition causing the same thermal perception in a human body considering air temperature, wind speed, humidity, solar and thermal radiation as well as clothing and activity level. It is not the perceived air temperature, which derives from either wind chill or heat index and has the standard name apparent_air_temperature.
+
+
m
@@ -14398,6 +14874,13 @@
"Photolysis" is a chemical reaction in which a chemical compound is broken down by photons. The "reaction rate" is the rate at which the reactants of a chemical reaction form the products. The chemical formula for ozone is O3. The IUPAC name for ozone is trioxygen. "1D oxygen atom" means the singlet D state, an excited state, of the oxygen atom. The combined photolysis rate of ozone to both excited and ground state oxygen atoms has the standard name photolysis_rate_of_ozone.
+
+ degree_C
+
+
+ Physiological equivalent temperature (PET) is an equivalent air temperature of the actual thermal condition. It is the air temperature of a reference condition without wind and solar radiation at which the heat budget of the human body is balanced with the same core and skin temperature. Note that PET here is not potential evapotranspiration.
+
+
1
@@ -17583,6 +18066,13 @@
"Radioactivity" means the number of radioactive decays of a material per second. "Radioactivity concentration" means radioactivity per unit volume of the medium. "Tc" means the element "technetium" and "99Tc" is the isotope "technetium-99" with a half-life of 7.79e+07 days.
+
+ s
+
+
+ The quantity with standard name radio_signal_roundtrip_travel_time_in_air is the time taken for an electromagnetic signal to propagate from an emitting instrument such as a radar or lidar to a reflecting volume and back again. The signal returned to the instrument is the sum of all scattering from a given volume of air regardless of mechanism (examples are scattering by aerosols, hydrometeors and refractive index irregularities, or whatever else the instrument detects).
+
+
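Because the quantity is a two-way travel time, the range to the reflecting volume follows from halving the path. A minimal sketch (the sample time is invented, and the speed of light in air is approximated by its vacuum value):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m s-1, vacuum value used as an approximation for air

def range_from_roundtrip_time(t_seconds):
    """Distance to the reflecting volume: half the roundtrip path length."""
    return SPEED_OF_LIGHT * t_seconds / 2.0

print(range_from_roundtrip_time(6.67e-5))  # ~10 km for a 66.7 microsecond echo
```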
m
@@ -17660,6 +18150,13 @@
The quantity with standard name ratio_of_sea_water_practical_salinity_anomaly_to_relaxation_timescale is a correction term applied to modelled sea water practical salinity. The term is estimated as the deviation of model local sea water practical salinity from an observation-based climatology (e.g. World Ocean Database) weighted by a user-specified relaxation coefficient in s-1 (1/(relaxation timescale)). The phrase "ratio_of_X_to_Y" means X/Y. The term "anomaly" means difference from climatology. Practical Salinity, S_P, is a determination of the salinity of sea water, based on its electrical conductance. The measured conductance, corrected for temperature and pressure, is compared to the conductance of a standard potassium chloride solution, producing a value on the Practical Salinity Scale of 1978 (PSS-78). This name should not be used to describe salinity observations made before 1978, or ones not based on conductance measurements. Conversion of Practical Salinity to other precisely defined salinity measures should use the appropriate formulas specified by TEOS-10. Other standard names for precisely defined salinity quantities are sea_water_absolute_salinity (S_A); sea_water_preformed_salinity (S_*), sea_water_reference_salinity (S_R); sea_water_cox_salinity (S_C), used for salinity observations between 1967 and 1977; and sea_water_knudsen_salinity (S_K), used for salinity observations between 1901 and 1966. Salinity quantities that do not match any of the precise definitions should be given the more general standard name of sea_water_salinity. Reference: www.teos-10.org; Lewis, 1980 doi:10.1109/JOE.1980.1145448.
+
+ sr
+
+
+ The ratio of volume extinction coefficient to volume backwards scattering coefficient by ranging instrument in air due to ambient aerosol particles (often called "lidar ratio") is the ratio of the "volume extinction coefficient" to the "volume backwards scattering coefficient of radiative flux by ranging instrument in air due to ambient aerosol particles". Both coefficients are assumed to refer to the same wavelength as the incident radiation. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+
m s-2
@@ -17681,6 +18178,13 @@
Realization is used to label a dimension that can be thought of as a statistical sample, e.g., labelling members of a model ensemble.
+
+ W
+
+
+ The quantity with standard name received_power_of_radio_wave_in_air_scattered_by_air refers to the received power of the signal at an instrument such as a radar or lidar. The signal returned to the instrument is the sum of all scattering from a given volume of air regardless of mechanism (examples are scattering by aerosols, hydrometeors and refractive index irregularities, or whatever else the instrument detects).
+
+
Pa
@@ -18262,6 +18766,13 @@
The specification of a physical process by the phrase due_to_process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. "Air pressure at low frequency" means variations in air pressure with periods longer than 20 days. These give rise to corresponding variations in sea surface topography. The quantity sea_surface_height_correction_due_to_air_pressure_at_low_frequency is commonly called the "inverted barometer effect" and the correction should be applied by adding it to the quantity with standard name altimeter_range. Additional altimeter range corrections are given by the quantities with standard names altimeter_range_correction_due_to_wet_troposphere, altimeter_range_correction_due_to_dry_troposphere, altimeter_range_correction_due_to_ionosphere and sea_surface_height_correction_due_to_air_pressure_and_wind_at_high_frequency.
+
+ m
+
+
+ Significant wave height is a statistic computed from wave measurements and corresponds to the average height of the highest one third of the waves, where the height is defined as the vertical distance from a wave trough to the following wave crest. Infragravity waves are waves occurring in the frequency range 0.04 to 0.004 s^-1 (wave periods of 25 to 250 seconds).
+
+
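"Average height of the highest one third of the waves" is itself an algorithm. A minimal sketch, assuming individual trough-to-crest heights have already been extracted from the wave record (the sample values are invented):

```python
import numpy as np

def significant_wave_height(heights):
    """Mean of the highest one third of trough-to-crest wave heights."""
    ordered = np.sort(np.asarray(heights, dtype=float))[::-1]  # tallest first
    n = max(1, ordered.size // 3)
    return float(ordered[:n].mean())

print(significant_wave_height([0.5, 1.2, 0.8, 2.0, 1.5, 0.9]))  # 1.75 (mean of 2.0 and 1.5)
```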
1
@@ -18563,6 +19074,13 @@
The wave directional spectrum can be written as a five dimensional function S(t,x,y,f,theta) where t is time, x and y are horizontal coordinates (such as longitude and latitude), f is frequency and theta is direction. S has the standard name sea_surface_wave_directional_variance_spectral_density. S can be integrated over direction to give S1= integral(S dtheta) and this quantity has the standard name sea_surface_wave_variance_spectral_density. The quantity with standard name sea_surface_wave_energy_at_variance_spectral_density_maximum, sometimes called peak wave energy, is the maximum value of the variance spectral density (max(S1)).
+
+ s-1
+
+
+ Frequency is the number of oscillations of a wave per unit time. The sea_surface_wave_frequency_at_variance_spectral_density_maximum is the frequency of the most energetic waves in the total wave spectrum at a specific location. The wave directional spectrum can be written as a five dimensional function S(t,x,y,f,theta) where t is time, x and y are horizontal coordinates (such as longitude and latitude), f is frequency and theta is direction. S has the standard name sea_surface_wave_directional_variance_spectral_density. S can be integrated over direction to give S1= integral(S dtheta) and this quantity has the standard name sea_surface_wave_variance_spectral_density.
+
+
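The reduction from the directional spectrum S to S1 and then to the peak frequency is a two-step numerical recipe. A minimal sketch with invented frequency and direction grids:

```python
import numpy as np

def peak_frequency(S, freq, theta):
    """Frequency at the maximum of S1(f) = integral of S(f, theta) over direction."""
    S1 = np.trapz(S, x=theta, axis=1)  # integrate out direction
    return float(freq[np.argmax(S1)])

freq = np.linspace(0.05, 0.5, 46)              # Hz (invented grid)
theta = np.linspace(0.0, 2.0 * np.pi, 36)      # radians (invented grid)
S = np.random.default_rng(0).random((46, 36))  # stand-in spectral densities
print(peak_frequency(S, freq, theta))
```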
degree
@@ -18682,6 +19200,13 @@
Wave slope describes an aspect of sea surface wave geometry related to sea surface roughness. Mean square slope describes a derivation over multiple waves within a sea-state, for example calculated from moments of the wave directional spectrum. The phrase "y_slope" indicates that slope values are derived from vector components along the grid y-axis.
+
+ m
+
+
+ The wave directional spectrum can be written as a five dimensional function S(t,x,y,k,theta) where t is time, x and y are horizontal coordinates (such as longitude and latitude), k is wavenumber and theta is direction. S has the standard name sea_surface_wave_directional_variance_spectral_density. S can be integrated over direction to give S1= integral(S dtheta) and this quantity has the standard name sea_surface_wave_variance_spectral_density. Wavenumber is the number of oscillations of a wave per unit distance. Wavenumber moments, M(n) of S1 can then be calculated as follows: M(n) = integral(S1 k^n dk), where k^n is k to the power of n. The inverse wavenumber, k(m-1), is calculated as the ratio M(-1)/M(0). The wavelength is the horizontal distance between repeated features on the waveform such as crests, troughs or upward passes through the mean level.
+
+
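The moment recipe M(n) = integral(S1 k^n dk) and the ratio M(-1)/M(0) can be coded directly. A minimal sketch; whether a factor of 2 pi enters the conversion to wavelength depends on the wavenumber convention (cycles versus radians per metre), so that step is deliberately left out:

```python
import numpy as np

def wavenumber_moment(S1, k, n):
    """M(n) = integral of S1(k) * k**n over wavenumber."""
    return float(np.trapz(S1 * k**n, x=k))

def inverse_wavenumber(S1, k):
    """M(-1)/M(0): a variance-weighted mean of 1/k, with units of length."""
    return wavenumber_moment(S1, k, -1) / wavenumber_moment(S1, k, 0)
```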
m-1
@@ -18693,7 +19218,7 @@
s
- A period is an interval of time, or the time-period of an oscillation. The sea_surface_wave_period_at_variance_spectral_density_maximum, sometimes called peak wave period, is the period of the most energetic waves in the total wave spectrum at a specific location.
+ A period is an interval of time, or the time-period of an oscillation. Wave period is the interval of time between repeated features on the waveform such as crests, troughs or upward passes through the mean level. The sea_surface_wave_period_at_variance_spectral_density_maximum, sometimes called peak wave period, is the period of the most energetic waves in the total wave spectrum at a specific location. The wave directional spectrum can be written as a five dimensional function S(t,x,y,f,theta) where t is time, x and y are horizontal coordinates (such as longitude and latitude), f is frequency and theta is direction. S has the standard name sea_surface_wave_directional_variance_spectral_density. S can be integrated over direction to give S1= integral(S dtheta) and this quantity has the standard name sea_surface_wave_variance_spectral_density.
@@ -18948,6 +19473,13 @@
+
+ S m-1
+
+
+ The electrical conductivity of sea water in a sample measured at a defined reference temperature. The reference temperature should be recorded in a scalar coordinate variable, or a coordinate variable with a single dimension of size one, and the standard name of temperature_of_analysis_of_sea_water. This quantity is sometimes called 'specific conductivity' when the reference temperature is 25 degrees Celsius.
+
+
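The reference temperature bookkeeping above corresponds to a scalar coordinate in Iris. A minimal sketch with an invented conductivity value and a placeholder long_name (the entry's own standard name is not shown in this extract):

```python
import numpy as np
from iris.coords import AuxCoord
from iris.cube import Cube

cube = Cube(
    np.array(4.2),  # invented conductivity value
    long_name="sea water electrical conductivity at reference temperature",  # placeholder
    units="S m-1",
)
# Record the reference temperature as a scalar coordinate, per the description.
cube.add_aux_coord(AuxCoord(
    25.0, standard_name="temperature_of_analysis_of_sea_water", units="degC",
))
```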
1e-3
@@ -19361,6 +19893,20 @@
Convective precipitation is that produced by the convection schemes in an atmosphere model. Some atmosphere models differentiate between shallow and deep convection. "Precipitation" in the earth's atmosphere means precipitation of water in all phases. In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ Pa
+
+
+ Shear strength is the shear stress, applied along a tangential plane, required to bring a frozen soil to failure along that plane. Shear strength depends on the angle of friction and cohesion of the soil.
+
+
+
+ Pa
+
+
+ Shear strength is the shear stress, applied along a tangential plane, required to bring the soil to failure along that plane. Shear strength depends on the angle of friction and cohesion of the soil.
+
+
1
@@ -19375,6 +19921,48 @@
"Single scattering albedo" is the fraction of radiation in an incident light beam scattered by the particles of an aerosol reference volume for a given wavelength. It is the ratio of the scattering and the extinction coefficients of the aerosol particles in the reference volume. A coordinate variable with a standard name of radiation_wavelength or radiation_frequency should be included to specify either the wavelength or frequency. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. To specify the relative humidity and temperature at which the quantity described by the standard name applies, provide scalar coordinate variables with standard names of "relative_humidity" and "air_temperature". The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid. Particulate means suspended solids of all sizes. Biogenic silica is a hydrated form of silica (silicon dioxide) with the chemical formula SiO2.nH2O, sometimes referred to as opaline silica or opal. It is created by biological processes and in sea water it is predominantly the skeletal material of diatoms.
+
+
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid. Particulate means suspended solids of all sizes.
+
+
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid. Particulate means suspended solids of all sizes. Particulate inorganic carbon is carbon bound in molecules ionically that may be liberated from the particles as carbon dioxide by acidification.
+
+
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid.
+
+
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid. Particulate means suspended solids of all sizes.
+
+
+
+ kg m-2 s-1
+
+
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. "Sinking" is the gravitational settling of particulate matter suspended in a liquid. A sinking flux is positive downwards and is calculated relative to the movement of the surrounding fluid. Particulate means suspended solids of all sizes.
+
+
mol m-2 s-1
@@ -19473,6 +20061,13 @@
Soil albedo is the albedo of the soil surface assuming no snow. Albedo is the ratio of outgoing to incoming shortwave irradiance, where 'shortwave irradiance' means that both the incoming and outgoing radiation are integrated across the solar spectrum.
+
+ kg m-3
+
+
+ The density of the soil in its natural condition, also known as the bulk density. The density of a substance is its mass per unit volume.
+
+
kg m-2
@@ -19606,6 +20201,13 @@
soil_water_ph is the measure of acidity of soil moisture, defined as the negative logarithm of the concentration of dissolved hydrogen ions in soil water.
+
+ 1e-3
+
+
+ The quantity with standard name soil_water_salinity is the salt content of soil water, often on the Practical Salinity Scale of 1978. However, the unqualified term 'salinity' is generic and does not necessarily imply any particular method of calculation. The units of salinity are dimensionless and normally given as 1e-3 or 0.001, i.e. parts per thousand.
+
+
degree
@@ -19809,6 +20411,13 @@
"Specific" means per unit mass. "Turbulent kinetic energy" is the kinetic energy of chaotic fluctuations of the fluid flow.
+
+ Hz
+
+
+ The quantity with standard name spectral_width_of_radio_wave_in_air_scattered_by_air is the frequency width of the signal received by an instrument such as a radar or lidar. The signal returned to the instrument is the sum of all scattering from a given volume of air regardless of mechanism (examples are scattering by aerosols, hydrometeors and refractive index irregularities, or whatever else the instrument detects).
+
+
m s-1
@@ -19949,6 +20558,13 @@
"Upward" indicates a vector component which is positive when directed upward (negative downward). Ocean transport means transport by all processes, both sea water and sea ice. "square_of_X" means X*X.
+
+ K
+
+
+ In thermodynamics and fluid mechanics, stagnation temperature is the temperature at a stagnation point in a fluid flow. At a stagnation point the speed of the fluid is zero and all of the kinetic energy has been converted to internal energy and is added to the local static enthalpy. In both compressible and incompressible fluid flow, the stagnation temperature is equal to the total temperature at all points on the streamline leading to the stagnation point. In aviation, stagnation temperature is known as total air temperature and is measured by a temperature probe mounted on the surface of the aircraft. The probe is designed to bring the air to rest relative to the aircraft. As the air is brought to rest, kinetic energy is converted to internal energy. The air is compressed and experiences an adiabatic increase in temperature. Therefore, total air temperature is higher than the static (or ambient) air temperature. Total air temperature is an essential input to an air data computer in order to enable computation of static air temperature and hence true airspeed.
+
+
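The static-to-total conversion described there follows the standard adiabatic relation T0 = T * (1 + (gamma - 1)/2 * M^2). A minimal sketch under that assumption; the formula and gamma value are textbook compressible-flow facts, not taken from the table:

```python
GAMMA = 1.4  # ratio of specific heats for dry air (assumed)

def stagnation_temperature(static_temp_k: float, mach: float) -> float:
    """Total air temperature (K) from static temperature (K) and Mach number."""
    return static_temp_k * (1.0 + 0.5 * (GAMMA - 1.0) * mach ** 2)

# At 220 K static temperature and Mach 0.8 the probe reads about 248 K,
# illustrating why total air temperature exceeds static air temperature.
print(stagnation_temperature(220.0, 0.8))
```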
1
@@ -20383,6 +20999,13 @@
The surface called "surface" means the lower boundary of the atmosphere. "Downward" indicates a vector component which is positive when directed downward (negative upward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. In ocean biogeochemistry models, a "natural analogue" is used to simulate the effect on a modelled variable of imposing preindustrial atmospheric carbon dioxide concentrations, even when the model as a whole may be subjected to varying forcings. The phrase "expressed_as" is used in the construction A_expressed_as_B, where B is a chemical constituent of A. It means that the quantity indicated by the standard name is calculated solely with respect to the B contained in A, neglecting all other chemical constituents of A. The chemical formula for carbon dioxide is CO2.
+
+ kg m-2 s-1
+
+
+ The surface called "surface" means the lower boundary of the atmosphere. "Downward" indicates a vector component which is positive when directed downward (negative upward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The chemical formula for methane is CH4. The mass is the total mass of the molecules. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Non-wetland soils are all soils except for wetlands. Wetlands are areas where water covers the soil, or is present either at or near the surface of the soil all year or for varying periods of time during the year, including during the growing season. The precise conditions under which non-wetland soils produce and consume methane can vary between models.
+
+
kg m-2 s-1
@@ -23596,6 +24219,34 @@
The surface called "surface" means the lower boundary of the atmosphere. Runoff is the liquid water which drains from land. If not specified, "runoff" refers to the sum of surface runoff and subsurface drainage. In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ m s-1
+
+
+ A velocity is a vector quantity. "x" indicates a vector component along the grid x-axis, positive with increasing x. Ocean currents are related to phenomena and processes of different natures, such as density currents, wind-driven currents, tides, wave propagation, mass flow in estuaries, etc. This standard name refers to the sum of currents of all origins.
+
+
+
+ m s-1
+
+
+ A velocity is a vector quantity. "x" indicates a vector component along the grid x-axis, positive with increasing x. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Tides are the rise and fall of sea levels caused by the combined effects of the gravitational forces exerted by the Moon and the Sun, and the rotation of the Earth. This rise in water level is accompanied by a horizontal movement of water called the tidal current.
+
+
+
+ m s-1
+
+
+ A velocity is a vector quantity. "y" indicates a vector component along the grid y-axis, positive with increasing y. Ocean currents are related to phenomena and processes of different natures, such as density currents, wind-driven currents, tides, wave propagation, mass flow in estuaries, etc. This standard name refers to the sum of currents of all origins.
+
+
+
+ m s-1
+
+
+ A velocity is a vector quantity. "y" indicates a vector component along the grid y-axis, positive with increasing y. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Tides are the rise and fall of sea levels caused by the combined effects of the gravitational forces exerted by the Moon and the Sun, and the rotation of the Earth. This rise in water level is accompanied by a horizontal movement of water called the tidal current.
+
+
kg m-2
@@ -23743,6 +24394,13 @@
The surface called "surface" means the lower boundary of the atmosphere. "Upward" indicates a vector component which is positive when directed upward (negative downward). The surface latent heat flux is the exchange of heat between the surface and the air on account of evaporation (including sublimation). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ W m-2
+
+
+ The quantity with standard name surface_upward_latent_heat_flux_due_to_evaporation does not include transpiration from vegetation. The surface called "surface" means the lower boundary of the atmosphere. "Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Evaporation is the conversion of liquid or solid into vapor. (The conversion of solid alone into vapor is called "sublimation"). The surface latent heat flux is the exchange of heat between the surface and the air on account of evaporation (including sublimation).
+
+
W m-2
@@ -23932,6 +24590,28 @@
"Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Heterotrophic respiration is respiration by heterotrophs ("consumers"), which are organisms (including animals and decomposers) that consume other organisms or dead organic material, rather than synthesising organic material from inorganic precursors using energy from the environment (especially sunlight) as autotrophs ("producers") do. Heterotrophic respiration goes on within both the soil and litter pools.
+
+ kg m-2 s-1
+
+
+ Methane emitted from the surface, generated by biomass burning (fires). Positive direction upwards. The surface called "surface" means the lower boundary of the atmosphere. "Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The chemical formula for methane is CH4. The mass is the total mass of the molecules. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. "Emission" means emission from a primary source located anywhere within the atmosphere, including at the lower boundary (i.e. the surface of the earth). "Emission" is a process entirely distinct from "re-emission" which is used in some standard names. The term "fires" means all biomass fires, whether naturally occurring or ignited by humans. The precise conditions under which fires produce and consume methane can vary between models.
+
+
+
+ kg m-2 s-1
+
+
+ The surface called "surface" means the lower boundary of the atmosphere. "Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The chemical formula for methane is CH4. The mass is the total mass of the molecules. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. "Emission" means emission from a primary source located anywhere within the atmosphere, including at the lower boundary (i.e. the surface of the earth). "Emission" is a process entirely distinct from "re-emission" which is used in some standard names. Herbivores are animals that feed on vegetation. Mammals are any vertebrates within the class Mammalia. Examples of large herbivorous mammals include cows, elks, and buffalos. These animals eat grass, tree bark, aquatic vegetation, and shrubby growth. Herbivores can also be medium-sized animals such as sheep and goats, which eat shrubby vegetation and grasses. Small herbivores include rabbits, chipmunks, squirrels, and mice. The precise conditions under which herbivorous mammals produce and consume methane can vary between models.
+
+
+
+ kg m-2 s-1
+
+
+ The surface called "surface" means the lower boundary of the atmosphere. "Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The chemical formula for methane is CH4. The mass is the total mass of the molecules. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. "Emission" means emission from a primary source located anywhere within the atmosphere, including at the lower boundary (i.e. the surface of the earth). "Emission" is a process entirely distinct from "re-emission" which is used in some standard names. Termites belong to any of a group of cellulose-eating insects, the social system of which shows remarkable parallels with those of ants and bees, although it has evolved independently. The precise conditions under which termites produce and consume methane can vary between models.
+
+
kg m-2 s-1
@@ -30120,6 +30800,13 @@
"Amount" means mass per unit area. The construction thickness_of_[X_]snowfall_amount means the accumulated "depth" of snow which fell i.e. the thickness of the layer of snow at its own density. There are corresponding standard names for liquid water equivalent (lwe) thickness.
+
+ m
+
+
+ Depth or height of the organic soil horizon (O or H horizons per the World Reference Base soil classification system), measured from the soil surface down to the mineral horizon. Organic layers are commonly composed of a succession of litter of recognizable origin, of partly decomposed litter, and of highly decomposed (humic) organic material.
+
+
m
@@ -30509,14 +31196,14 @@
kg m-2
- "Amount" means mass per unit area.
+ "Amount" means mass per unit area. Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere.kg m-2 s-1
- In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+ In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere.
@@ -30701,6 +31388,13 @@
The "Ultraviolet Index" (UVI) is a measure of the amount of solar ultraviolet radiation that reaches the surface of the earth depending on factors such as time of day and cloud cover. It is often used to alert the public of the need to limit sun exposure and use sun creams to protect the skin. Each point on the Index scale is equivalent to 25 mW m-2 of UV radiation (reference: Australian Bureau of Meteorology, http://www.bom.gov.au/uv/about_uv_index.shtml). The UVI range is expressed as a numeric value from 0 to 20 and sometimes graphically as bands of color indicating the attendant risk of skin damage. A UVI of 0-2 is described as 'Low' (represented graphically in green); a UVI of 11 or greater is described as "Extreme" (represented graphically in purple). The higher the UVI, the greater the potential health risk to humans and the less time it takes for harm to occur. A phrase "assuming_condition" indicates that the named quantity is the value which would obtain if all aspects of the system were unaltered except for the assumption of the circumstances specified by the condition. "Overcast" means a fractional sky cover of 95% or more when at least a portion of this amount is attributable to clouds or obscuring phenomena (such as haze, dust, smoke, fog, etc.) aloft. (Reference: AMS Glossary: http://glossary.ametsoc.org/wiki/Main_Page). Standard names are also defined for the quantities ultraviolet_index and ultraviolet_index_assuming_clear_sky.
+
+ degree_C
+
+
+ The Universal Thermal Comfort Index (UTCI) is an equivalent temperature of the actual thermal condition: the air temperature of a reference condition that would cause the same dynamic physiological response in a human body, considering its energy budget, physiology and clothing adaptation (reference: utci.org).
+
+
m s-1
@@ -30820,6 +31514,13 @@
"Upward" indicates a vector component which is positive when directed upward (negative downward). The latent heat flux is the exchange of heat across a surface on account of evaporation and condensation (including sublimation and deposition). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ W m-2
+
+
+ "Upward" indicates a vector component which is positive when directed upward (negative downward). In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase. Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere. The latent heat flux due to transpiration is the release of latent heat from plant surfaces to the air due to the release of water vapor.
+
+
kg m-2 s-1
@@ -31023,6 +31724,13 @@
The vertical_component_of_ocean_xy_tracer_diffusivity means the vertical component of the diffusivity of tracers in the ocean due to lateral mixing. This quantity could appear in formulations of lateral diffusivity in which "lateral" does not mean "iso-level", e.g. it would not be used for isopycnal diffusivity. "Tracer diffusivity" means the diffusivity of heat and salinity due to motion which is not resolved on the grid scale of the model.
+
+ kg m-2
+
+
+ "Drainage" is the process of removal of excess water from soil by gravitational flow. "Amount" means mass per unit area. The vertical drainage amount in soil is the amount of water that drains through the bottom of a soil column extending from the surface to a specified depth.
+
+
m
@@ -31142,6 +31850,13 @@
The volume scattering/absorption/attenuation coefficient is the fractional change of radiative flux per unit path length due to the stated process. Coefficients with canonical units of m2 s-1 i.e. multiplied by density have standard names with specific_ instead of volume_. Backwards scattering refers to the sum of scattering into all backward angles i.e. scattering_angle exceeds pi/2 radians. A scattering_angle should not be specified with this quantity. The scattering/absorption/attenuation coefficient is assumed to be an integral over all wavelengths, unless a coordinate of radiation_wavelength is included to specify the wavelength. "Aerosol" means the system of suspended liquid or solid particles in air (except cloud droplets) and their carrier gas, the air itself. "Dried_aerosol" means that the aerosol sample has been dried from the ambient state, but that the dry state (relative humidity less than 40 per cent) has not necessarily been reached. To specify the relative humidity at which the sample was measured, provide a scalar coordinate variable with the standard name of "relative_humidity". The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+ m-1 sr-1
+
+
+ Volume backwards scattering coefficient by ranging instrument is the fraction of radiative flux, per unit path length and per unit solid angle, scattered at an angle of 180 degrees with respect to the incident radiation and obtained through ranging techniques such as lidar and radar. The backwards scattering coefficient is assumed to refer to the same wavelength as the incident radiation. "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+
m-1
@@ -31163,6 +31878,13 @@
Radiative flux is the sum of shortwave and longwave radiative fluxes. In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. The volume scattering/absorption/attenuation coefficient is the fractional change of radiative flux per unit path length due to the stated process. The scattering/absorption/attenuation coefficient is assumed to be an integral over all wavelengths, unless a coordinate of radiation_wavelength is included to specify the wavelength. Attenuation is the sum of absorption and scattering. Attenuation is sometimes called "extinction". Beam attenuation refers to the decrease of radiative flux along the direction of the incident path. It is distinguished from attenuation of the downwelling component of radiative flux from any incident direction, also called "diffuse" attenuation. The phrase "corrected for pure water attenuance" means the attenuation coefficient has been adjusted/calibrated to remove the influence of absorption/scattering by the water itself. Coefficients with canonical units of m2 s-1 i.e. multiplied by density have standard names with specific_ instead of volume_.
+
+ 1
+
+
+ The volume extinction Angstrom exponent is the Angstrom exponent obtained for the aerosol extinction instead of the aerosol optical thickness. It is alpha in the following equation relating aerosol extinction (ext) at the wavelength lambda to aerosol extinction at a different wavelength lambda0: ext(lambda) = ext(lambda0) * [lambda/lambda0] ** (-1 * alpha). "Ambient_aerosol" means that the aerosol is measured or modelled at the ambient state of pressure, temperature and relative humidity that exists in its immediate environment. "Ambient aerosol particles" are aerosol particles that have taken up ambient water through hygroscopic growth. The extent of hygroscopic growth depends on the relative humidity and the composition of the particles. The specification of a physical process by the phrase "due_to_" process means that the quantity named is a single term in a sum of terms which together compose the general quantity named by omitting the phrase.
+
+
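Rearranging the quoted relation for alpha gives alpha = -ln(ext(lambda)/ext(lambda0)) / ln(lambda/lambda0). A minimal sketch of that rearrangement (function name illustrative; the two wavelengths may be in any common unit):

```python
import math

# ext(lambda) = ext(lambda0) * (lambda / lambda0) ** (-alpha), solved for
# alpha using extinction coefficients measured at two wavelengths.
def extinction_angstrom_exponent(ext0, lambda0, ext1, lambda1):
    return -math.log(ext1 / ext0) / math.log(lambda1 / lambda0)

# e.g. 1.0e-4 m-1 at 450 nm and 0.5e-4 m-1 at 700 nm -> alpha ~ 1.57
print(extinction_angstrom_exponent(1.0e-4, 450.0, 0.5e-4, 700.0))
```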
m-1
@@ -31317,11 +32039,18 @@
"Water" means water in all phases. Evaporation is the conversion of liquid or solid into vapor. (The conversion of solid alone into vapor is called "sublimation".) In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ kg m-2
+
+
+ "Evapotranspiration" means all water vapor fluxes into the atmosphere from the surface: liquid evaporation, sublimation, and transpiration. "Amount" means mass per unit area. Evaporation is the conversion of liquid or solid into vapor. (The conversion of solid alone into vapor is called "sublimation".) Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere. Unless indicated in the cell_methods attribute, a quantity is assumed to apply to the whole area of each horizontal grid box.
+
+
kg m-2 s-1
evspsbl
- Water means water in all phases. "Evapotranspiration" means all water vapor fluxes into the atmosphere from the surface: liquid evaporation, sublimation and transpiration. Evaporation is the conversion of liquid or solid into vapor. Transpiration is the process by which water is carried from the roots of plants and evaporates from the stomata. (The conversion of solid alone into vapor is called "sublimation".) In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. Unless indicated in the cell_methods attribute, a quantity is assumed to apply to the whole area of each horizontal grid box.
+ Water means water in all phases. "Evapotranspiration" means all water vapor fluxes into the atmosphere from the surface: liquid evaporation, sublimation and transpiration. Evaporation is the conversion of liquid or solid into vapor. Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere. (The conversion of solid alone into vapor is called "sublimation".) In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics. Unless indicated in the cell_methods attribute, a quantity is assumed to apply to the whole area of each horizontal grid box.
@@ -31429,6 +32158,13 @@
"Water" means water in all phases. Evaporation is the conversion of liquid or solid into vapor. (The conversion of solid alone into vapor is called "sublimation".) Potential evaporation is the rate at which evaporation would take place under unaltered ambient conditions (temperature, relative humidity, wind, etc.) if the supply of water were unlimited, as if from an open water surface. In accordance with common usage in geophysical disciplines, "flux" implies per unit area, called "flux density" in physics.
+
+ kg m-2
+
+
+ Potential evapotranspiration is the rate at which evapotranspiration would occur under ambient conditions from a uniformly vegetated area when the water supply is not limiting. "Evapotranspiration" means all water vapor fluxes into the atmosphere from the surface: liquid evaporation, sublimation and transpiration. Transpiration is the process by which liquid water in plant stomata is transferred as water vapor into the atmosphere. Evaporation is the conversion of liquid or solid into vapor. (The conversion of solid alone into vapor is called "sublimation".) "Amount" means mass per unit area.
+
+
kg m-2 s-1
@@ -31654,6 +32390,10 @@
+
+ moles_of_particulate_inorganic_carbon_per_unit_mass_in_sea_water
+
+
temperature_in_ground
@@ -31662,6 +32402,62 @@
biological_taxon_lsid
+
+ tendency_of_atmosphere_number_content_of_aerosol_particles_due_to_turbulent_deposition
+
+
+
+ lagrangian_tendency_of_atmosphere_sigma_coordinate
+
+
+
+ lagrangian_tendency_of_atmosphere_sigma_coordinate
+
+
+
+ electrical_mobility_diameter_of_ambient_aerosol_particles
+
+
+
+ diameter_of_ambient_aerosol_particles
+
+
+
+ mass_concentration_of_biomass_burning_dry_aerosol_particles_in_air
+
+
+
+ effective_radius_of_stratiform_cloud_rain_particles
+
+
+
+ effective_radius_of_stratiform_cloud_ice_particles
+
+
+
+ effective_radius_of_stratiform_cloud_graupel_particles
+
+
+
+ effective_radius_of_convective_cloud_snow_particles
+
+
+
+ effective_radius_of_convective_cloud_rain_particles
+
+
+
+ effective_radius_of_convective_cloud_ice_particles
+
+
+
+ histogram_of_backscattering_ratio_in_air_over_height_above_reference_ellipsoid
+
+
+
+ backscattering_ratio_in_air
+
+
soot_content_of_surface_snow
@@ -31690,86 +32486,6 @@
integral_wrt_time_of_surface_downward_northward_stress
-
- tendency_of_atmosphere_mass_content_of_water_vapor_due_to_sublimation_of_surface_snow_and_ice
-
-
-
- surface_snow_density
-
-
-
- atmosphere_upward_relative_vorticity
-
-
-
- atmosphere_upward_absolute_vorticity
-
-
-
- area_type
-
-
-
- area_type
-
-
-
- mass_fraction_of_liquid_precipitation_in_air
-
-
-
- mass_fraction_of_liquid_precipitation_in_air
-
-
-
- tendency_of_mole_concentration_of_particulate_organic_matter_expressed_as_carbon_in_sea_water_due_to_net_primary_production_by_diazotrophic_phytoplankton
-
-
-
- nitrogen_growth_limitation_of_diazotrophic_phytoplankton
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diazotrophic_phytoplankton
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diazotrophic_phytoplankton
-
-
-
- mole_concentration_of_diazotrophic_phytoplankton_expressed_as_carbon_in_sea_water
-
-
-
- mass_concentration_of_diazotrophic_phytoplankton_expressed_as_chlorophyll_in_sea_water
-
-
-
- iron_growth_limitation_of_diazotrophic_phytoplankton
-
-
-
- growth_limitation_of_diazotrophic_phytoplankton_due_to_solar_irradiance
-
-
-
- air_pseudo_equivalent_potential_temperature
-
-
-
- tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_melting_to_cloud_liquid_water
-
-
-
- tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_heterogeneous_nucleation_from_cloud_liquid_water
-
-
-
- tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_riming_from_cloud_liquid_water
-
-
sea_water_velocity_from_direction
@@ -31842,368 +32558,272 @@
tendency_of_sea_water_conservative_temperature_expressed_as_heat_content_due_to_parameterized_dianeutral_mixing
-
- effective_radius_of_stratiform_cloud_snow_particles
-
-
-
- tendency_of_atmosphere_moles_of_cfc11
-
-
-
- moles_of_cfc11_per_unit_mass_in_sea_water
-
-
-
- atmosphere_moles_of_cfc11
-
-
-
- tendency_of_atmosphere_moles_of_cfc113
-
-
-
- atmosphere_moles_of_cfc113
-
-
-
- tendency_of_atmosphere_moles_of_cfc114
-
-
-
- atmosphere_moles_of_cfc114
-
-
-
- tendency_of_atmosphere_moles_of_cfc115
-
-
-
- atmosphere_moles_of_cfc115
-
-
-
- tendency_of_atmosphere_moles_of_cfc12
-
-
-
- atmosphere_moles_of_cfc12
-
-
-
- tendency_of_atmosphere_moles_of_halon1202
-
-
-
- atmosphere_moles_of_halon1202
-
-
-
- tendency_of_atmosphere_moles_of_halon1211
-
-
-
- atmosphere_moles_of_halon1211
-
-
-
- tendency_of_atmosphere_moles_of_halon1301
-
-
-
- atmosphere_moles_of_halon1301
-
-
-
- tendency_of_atmosphere_moles_of_halon2402
-
-
-
- atmosphere_moles_of_halon2402
-
-
-
- tendency_of_atmosphere_moles_of_hcc140a
-
-
-
- atmosphere_moles_of_hcc140a
-
-
-
- tendency_of_troposphere_moles_of_hcc140a
-
-
-
- tendency_of_middle_atmosphere_moles_of_hcc140a
-
-
-
- tendency_of_troposphere_moles_of_hcfc22
+
+ rate_of_hydroxyl_radical_destruction_due_to_reaction_with_nmvoc
-
- tendency_of_atmosphere_moles_of_hcfc22
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_miscellaneous_phytoplankton
-
- atmosphere_moles_of_hcfc22
+
+ mole_fraction_of_inorganic_bromine_in_air
-
- tendency_of_atmosphere_number_content_of_aerosol_particles_due_to_turbulent_deposition
+
+ water_vapor_saturation_deficit_in_air
-
- lagrangian_tendency_of_atmosphere_sigma_coordinate
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_agricultural_waste_burning
-
- lagrangian_tendency_of_atmosphere_sigma_coordinate
+
+ tendency_of_atmosphere_moles_of_carbon_tetrachloride
-
- electrical_mobility_diameter_of_ambient_aerosol_particles
+
+ tendency_of_atmosphere_moles_of_carbon_monoxide
-
- diameter_of_ambient_aerosol_particles
+
+ platform_yaw
-
- mass_concentration_of_biomass_burning_dry_aerosol_particles_in_air
+
+ platform_pitch
-
- effective_radius_of_stratiform_cloud_rain_particles
+
+ platform_roll
-
- effective_radius_of_stratiform_cloud_ice_particles
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_due_to_nitrate_utilization
-
- effective_radius_of_stratiform_cloud_graupel_particles
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_picophytoplankton
-
- effective_radius_of_convective_cloud_snow_particles
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_phytoplankton
-
- effective_radius_of_convective_cloud_rain_particles
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diatoms
-
- effective_radius_of_convective_cloud_ice_particles
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_calcareous_phytoplankton
-
- histogram_of_backscattering_ratio_in_air_over_height_above_reference_ellipsoid
+
+ mole_concentration_of_diatoms_expressed_as_nitrogen_in_sea_water
-
- backscattering_ratio_in_air
+
+ tendency_of_mole_concentration_of_dissolved_inorganic_silicon_in_sea_water_due_to_biological_processes
-
- product_of_northward_wind_and_lagrangian_tendency_of_air_pressure
+
+ tendency_of_mole_concentration_of_dissolved_inorganic_phosphorus_in_sea_water_due_to_biological_processes
-
- product_of_eastward_wind_and_lagrangian_tendency_of_air_pressure
+
+ tendency_of_atmosphere_mole_concentration_of_carbon_monoxide_due_to_chemical_destruction
-
- carbon_mass_flux_into_litter_and_soil_due_to_anthropogenic_land_use_or_land_cover_change
+
+ volume_extinction_coefficient_in_air_due_to_ambient_aerosol_particles
-
- floating_ice_shelf_area_fraction
+
+ mole_fraction_of_noy_expressed_as_nitrogen_in_air
-
- atmosphere_moles_of_carbon_tetrachloride
+
+ tendency_of_atmosphere_moles_of_methane
-
- mole_fraction_of_methylglyoxal_in_air
+
+ tendency_of_specific_humidity_due_to_stratiform_precipitation
-
- mole_fraction_of_dichlorine_peroxide_in_air
+
+ tendency_of_air_temperature_due_to_stratiform_precipitation
-
- atmosphere_mass_content_of_convective_cloud_liquid_water
+
+ stratiform_precipitation_flux
-
- effective_radius_of_cloud_liquid_water_particles_at_liquid_water_cloud_top
+
+ stratiform_precipitation_amount
-
- air_equivalent_temperature
+
+ lwe_thickness_of_stratiform_precipitation_amount
-
- air_pseudo_equivalent_temperature
+
+ lwe_stratiform_precipitation_rate
-
- mass_content_of_cloud_liquid_water_in_atmosphere_layer
+
+ water_evaporation_amount_from_canopy
-
- air_equivalent_potential_temperature
+
+ water_evaporation_flux_from_canopy
-
- number_concentration_of_stratiform_cloud_liquid_water_particles_at_stratiform_liquid_water_cloud_top
+
+ precipitation_flux_onto_canopy
-
- number_concentration_of_convective_cloud_liquid_water_particles_at_convective_liquid_water_cloud_top
+
+ outgoing_water_volume_transport_along_river_channel
-
- effective_radius_of_stratiform_cloud_liquid_water_particles_at_stratiform_liquid_water_cloud_top
+
+ tendency_of_sea_ice_amount_due_to_conversion_of_snow_to_sea_ice
-
- effective_radius_of_stratiform_cloud_liquid_water_particles
+
+ tendency_of_atmosphere_mass_content_of_mercury_dry_aerosol_particles_due_to_emission
-
- effective_radius_of_convective_cloud_liquid_water_particles_at_convective_liquid_water_cloud_top
+
+ mass_fraction_of_mercury_dry_aerosol_particles_in_air
-
- effective_radius_of_convective_cloud_liquid_water_particles
+
+ tendency_of_atmosphere_mass_content_of_water_vapor_due_to_sublimation_of_surface_snow_and_ice
-
- effective_radius_of_cloud_liquid_water_particles
+
+ surface_snow_density
-
- atmosphere_mass_content_of_cloud_liquid_water
+
+ atmosphere_upward_relative_vorticity
-
- mole_fraction_of_noy_expressed_as_nitrogen_in_air
+
+ atmosphere_upward_absolute_vorticity
-
- tendency_of_atmosphere_moles_of_methane
+
+ area_type
-
- rate_of_hydroxyl_radical_destruction_due_to_reaction_with_nmvoc
+
+ area_type
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_miscellaneous_phytoplankton
+
+ mass_fraction_of_liquid_precipitation_in_air
-
- mole_fraction_of_inorganic_bromine_in_air
+
+ mass_fraction_of_liquid_precipitation_in_air
-
- water_vapor_saturation_deficit_in_air
+
+ tendency_of_mole_concentration_of_particulate_organic_matter_expressed_as_carbon_in_sea_water_due_to_net_primary_production_by_diazotrophic_phytoplankton
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_agricultural_waste_burning
+
+ nitrogen_growth_limitation_of_diazotrophic_phytoplankton
-
- tendency_of_atmosphere_moles_of_carbon_tetrachloride
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diazotrophic_phytoplankton
-
- tendency_of_atmosphere_moles_of_carbon_monoxide
+
+ net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diazotrophic_phytoplankton
-
- platform_yaw
+
+ mole_concentration_of_diazotrophic_phytoplankton_expressed_as_carbon_in_sea_water
-
- platform_pitch
+
+ mass_concentration_of_diazotrophic_phytoplankton_expressed_as_chlorophyll_in_sea_water
-
- platform_roll
+
+ iron_growth_limitation_of_diazotrophic_phytoplankton
-
- tendency_of_specific_humidity_due_to_stratiform_precipitation
+
+ growth_limitation_of_diazotrophic_phytoplankton_due_to_solar_irradiance
-
- tendency_of_air_temperature_due_to_stratiform_precipitation
+
+ air_pseudo_equivalent_potential_temperature
-
- stratiform_precipitation_flux
+
+ tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_melting_to_cloud_liquid_water
-
- stratiform_precipitation_amount
+
+ tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_heterogeneous_nucleation_from_cloud_liquid_water
-
- lwe_thickness_of_stratiform_precipitation_amount
+
+ tendency_of_mass_fraction_of_stratiform_cloud_ice_in_air_due_to_riming_from_cloud_liquid_water
-
- lwe_stratiform_precipitation_rate
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_wet_deposition
-
- water_evaporation_amount_from_canopy
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_wet_deposition
-
- water_evaporation_flux_from_canopy
+
+ stratiform_cloud_area_fraction
-
- precipitation_flux_onto_canopy
+
+ surface_upwelling_radiance_per_unit_wavelength_in_air_reflected_by_sea_water
-
- outgoing_water_volume_transport_along_river_channel
+
+ surface_upwelling_radiance_per_unit_wavelength_in_air_emerging_from_sea_water
-
- tendency_of_sea_ice_amount_due_to_conversion_of_snow_to_sea_ice
+
+ surface_upwelling_radiance_per_unit_wavelength_in_air
-
- tendency_of_atmosphere_mass_content_of_mercury_dry_aerosol_particles_due_to_emission
+
+ surface_upwelling_longwave_flux_in_air
-
- mass_fraction_of_mercury_dry_aerosol_particles_in_air
+
+ incoming_water_volume_transport_along_river_channel
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_wet_deposition
+
+ sea_water_potential_temperature_expressed_as_heat_content
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_wet_deposition
+
+ sea_water_potential_temperature_expressed_as_heat_content
-
- stratiform_cloud_area_fraction
+
+ sea_ice_temperature_expressed_as_heat_content
-
- magnitude_of_sea_ice_displacement
+
+ sea_ice_temperature_expressed_as_heat_content
@@ -32338,472 +32958,380 @@
surface_upwelling_radiance_per_unit_wavelength_in_sea_water
-
- volume_scattering_coefficient_of_radiative_flux_in_air_due_to_ambient_aerosol_particles
+
+ platform_name
-
- volume_scattering_coefficient_of_radiative_flux_in_air_due_to_dried_aerosol_particles
+
+ water_vapor_partial_pressure_in_air
-
- soil_mass_content_of_carbon
+
+ effective_radius_of_stratiform_cloud_snow_particles
-
- slow_soil_pool_mass_content_of_carbon
+
+ tendency_of_atmosphere_moles_of_cfc11
-
- root_mass_content_of_carbon
+
+ moles_of_cfc11_per_unit_mass_in_sea_water
-
- miscellaneous_living_matter_mass_content_of_carbon
+
+ atmosphere_moles_of_cfc11
-
- fast_soil_pool_mass_content_of_carbon
+
+ tendency_of_atmosphere_moles_of_cfc113
-
- medium_soil_pool_mass_content_of_carbon
+
+ atmosphere_moles_of_cfc113
-
- leaf_mass_content_of_carbon
+
+ tendency_of_atmosphere_moles_of_cfc114
-
- carbon_mass_content_of_forestry_and_agricultural_products
+
+ atmosphere_moles_of_cfc114
-
- carbon_mass_content_of_forestry_and_agricultural_products
+
+ tendency_of_atmosphere_moles_of_cfc115
-
- surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration_for_biomass_maintenance
+
+ atmosphere_moles_of_cfc115
-
- surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration_for_biomass_growth
+
+ tendency_of_atmosphere_moles_of_cfc12
-
- surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration
+
+ tendency_of_atmosphere_moles_of_halon2402
-
- surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_respiration_in_soil
+
+ atmosphere_moles_of_halon2402
-
- surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_heterotrophic_respiration
+
+ tendency_of_atmosphere_moles_of_hcc140a
-
- northward_transformed_eulerian_mean_air_velocity
+
+ atmosphere_moles_of_hcc140a
-
- eastward_transformed_eulerian_mean_air_velocity
-
-
-
- surface_litter_mass_content_of_carbon
-
-
-
- litter_mass_content_of_carbon
-
-
-
- tendency_of_atmosphere_mass_content_of_nitrogen_compounds_expressed_as_nitrogen_due_to_wet_deposition
-
-
-
- mole_concentration_of_phytoplankton_expressed_as_nitrogen_in_sea_water
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_due_to_nitrate_utilization
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_picophytoplankton
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_phytoplankton
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_diatoms
-
-
-
- net_primary_mole_productivity_of_biomass_expressed_as_carbon_by_calcareous_phytoplankton
-
-
-
- mole_concentration_of_diatoms_expressed_as_nitrogen_in_sea_water
-
-
-
- tendency_of_mole_concentration_of_dissolved_inorganic_silicon_in_sea_water_due_to_biological_processes
-
-
-
- tendency_of_mole_concentration_of_dissolved_inorganic_phosphorus_in_sea_water_due_to_biological_processes
-
-
-
- tendency_of_atmosphere_mole_concentration_of_carbon_monoxide_due_to_chemical_destruction
-
-
-
- volume_extinction_coefficient_in_air_due_to_ambient_aerosol_particles
-
-
-
- water_vapor_partial_pressure_in_air
-
-
-
- platform_name
-
-
-
- platform_id
-
-
-
- mass_flux_of_carbon_into_litter_from_vegetation
-
-
-
- subsurface_litter_mass_content_of_carbon
-
-
-
- stem_mass_content_of_carbon
-
-
-
- mole_concentration_of_dissolved_inorganic_14C_in_sea_water
-
-
-
- surface_downward_mass_flux_of_14C_dioxide_abiotic_analogue_expressed_as_carbon
-
-
-
- surface_downward_mass_flux_of_13C_dioxide_abiotic_analogue_expressed_as_13C
+
+ tendency_of_troposphere_moles_of_hcc140a
-
- mole_concentration_of_dissolved_inorganic_13C_in_sea_water
+
+ tendency_of_middle_atmosphere_moles_of_hcc140a
-
- surface_upwelling_radiance_per_unit_wavelength_in_air_reflected_by_sea_water
+
+ tendency_of_troposphere_moles_of_hcfc22
-
- surface_upwelling_radiance_per_unit_wavelength_in_air_emerging_from_sea_water
+
+ tendency_of_atmosphere_moles_of_hcfc22
-
- surface_upwelling_radiance_per_unit_wavelength_in_air
+
+ atmosphere_moles_of_hcfc22
-
- surface_upwelling_longwave_flux_in_air
+
+ product_of_northward_wind_and_lagrangian_tendency_of_air_pressure
-
- incoming_water_volume_transport_along_river_channel
+
+ product_of_eastward_wind_and_lagrangian_tendency_of_air_pressure
-
- sea_water_potential_temperature_expressed_as_heat_content
+
+ carbon_mass_flux_into_litter_and_soil_due_to_anthropogenic_land_use_or_land_cover_change
-
- sea_water_potential_temperature_expressed_as_heat_content
+
+ floating_ice_shelf_area_fraction
-
- sea_ice_temperature_expressed_as_heat_content
+
+ atmosphere_moles_of_carbon_tetrachloride
-
- sea_ice_temperature_expressed_as_heat_content
+
+ mole_fraction_of_methylglyoxal_in_air
-
- water_evapotranspiration_flux
+
+ mole_fraction_of_dichlorine_peroxide_in_air
-
- surface_water_evaporation_flux
+
+ volume_scattering_coefficient_of_radiative_flux_in_air_due_to_ambient_aerosol_particles
-
- water_volume_transport_into_sea_water_from_rivers
+
+ volume_scattering_coefficient_of_radiative_flux_in_air_due_to_dried_aerosol_particles
-
- stratiform_graupel_flux
+
+ soil_mass_content_of_carbon
-
- wood_debris_mass_content_of_carbon
+
+ slow_soil_pool_mass_content_of_carbon
-
- toa_outgoing_shortwave_flux_assuming_clear_sky_and_no_aerosol
+
+ root_mass_content_of_carbon
-
- water_flux_into_sea_water_from_rivers
+
+ miscellaneous_living_matter_mass_content_of_carbon
-
- integral_wrt_height_of_product_of_northward_wind_and_specific_humidity
+
+ fast_soil_pool_mass_content_of_carbon
-
- integral_wrt_height_of_product_of_eastward_wind_and_specific_humidity
+
+ medium_soil_pool_mass_content_of_carbon
-
- integral_wrt_depth_of_sea_water_temperature
+
+ leaf_mass_content_of_carbon
-
- integral_wrt_depth_of_sea_water_temperature
+
+ carbon_mass_content_of_forestry_and_agricultural_products
-
- integral_wrt_depth_of_sea_water_temperature
+
+ carbon_mass_content_of_forestry_and_agricultural_products
-
- integral_wrt_depth_of_sea_water_temperature
+
+ surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration_for_biomass_maintenance
-
- integral_wrt_depth_of_sea_water_practical_salinity
+
+ surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration_for_biomass_growth
-
- northward_ocean_heat_transport_due_to_parameterized_eddy_advection
+
+ surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_plant_respiration
-
- tendency_of_ocean_eddy_kinetic_energy_content_due_to_parameterized_eddy_advection
+
+ surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_respiration_in_soil
-
- ocean_tracer_laplacian_diffusivity_due_to_parameterized_mesoscale_eddy_advection
+
+ surface_upward_mass_flux_of_carbon_dioxide_expressed_as_carbon_due_to_heterotrophic_respiration
-
- ocean_tracer_biharmonic_diffusivity_due_to_parameterized_mesoscale_eddy_advection
+
+ eastward_transformed_eulerian_mean_air_velocity
-
- upward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
+
+ surface_litter_mass_content_of_carbon
-
- sea_water_y_velocity_due_to_parameterized_mesoscale_eddies
+
+ litter_mass_content_of_carbon
-
- sea_water_x_velocity_due_to_parameterized_mesoscale_eddies
+
+ tendency_of_atmosphere_mass_content_of_nitrogen_compounds_expressed_as_nitrogen_due_to_wet_deposition
-
- eastward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
+
+ mole_concentration_of_phytoplankton_expressed_as_nitrogen_in_sea_water
-
- northward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
+
+ atmosphere_mass_content_of_convective_cloud_liquid_water
-
- tendency_of_sea_water_temperature_due_to_parameterized_eddy_advection
+
+ effective_radius_of_cloud_liquid_water_particles_at_liquid_water_cloud_top
-
- tendency_of_sea_water_salinity_due_to_parameterized_eddy_advection
+
+ air_equivalent_temperature
-
- ocean_y_overturning_mass_streamfunction_due_to_parameterized_eddy_advection
+
+ air_pseudo_equivalent_temperature
-
- ocean_meridional_overturning_mass_streamfunction_due_to_parameterized_eddy_advection
+
+ mass_content_of_cloud_liquid_water_in_atmosphere_layer
-
- ocean_mass_y_transport_due_to_advection_and_parameterized_eddy_advection
+
+ air_equivalent_potential_temperature
-
- ocean_mass_x_transport_due_to_advection_and_parameterized_eddy_advection
+
+ number_concentration_of_stratiform_cloud_liquid_water_particles_at_stratiform_liquid_water_cloud_top
-
- ocean_heat_y_transport_due_to_parameterized_eddy_advection
+
+ number_concentration_of_convective_cloud_liquid_water_particles_at_convective_liquid_water_cloud_top
-
- ocean_heat_x_transport_due_to_parameterized_eddy_advection
+
+ effective_radius_of_stratiform_cloud_liquid_water_particles_at_stratiform_liquid_water_cloud_top
-
- northward_ocean_salt_transport_due_to_parameterized_eddy_advection
+
+ effective_radius_of_stratiform_cloud_liquid_water_particles
-
- northward_ocean_freshwater_transport_due_to_parameterized_eddy_advection
+
+ effective_radius_of_convective_cloud_liquid_water_particles_at_convective_liquid_water_cloud_top
-
- integral_wrt_time_of_toa_outgoing_longwave_flux
+
+ effective_radius_of_convective_cloud_liquid_water_particles
-
- integral_wrt_time_of_toa_net_downward_shortwave_flux
+
+ effective_radius_of_cloud_liquid_water_particles
-
- integral_wrt_time_of_surface_net_downward_shortwave_flux
+
+ atmosphere_mass_content_of_cloud_liquid_water
-
- integral_wrt_time_of_surface_net_downward_longwave_flux
+
+ atmosphere_moles_of_cfc12
-
- integral_wrt_time_of_surface_downward_sensible_heat_flux
+
+ tendency_of_atmosphere_moles_of_halon1202
-
- integral_wrt_time_of_surface_downward_latent_heat_flux
+
+ atmosphere_moles_of_halon1202
-
- integral_wrt_time_of_air_temperature_excess
+
+ tendency_of_atmosphere_moles_of_halon1211
-
- integral_wrt_time_of_air_temperature_deficit
+
+ atmosphere_moles_of_halon1211
-
- tendency_of_mass_concentration_of_elemental_carbon_dry_aerosol_particles_in_air_due_to_emission_from_aviation
+
+ tendency_of_atmosphere_moles_of_halon1301
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_wet_deposition
+
+ atmosphere_moles_of_halon1301
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_turbulent_deposition
+
+ platform_id
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_gravitational_settling
+
+ mass_flux_of_carbon_into_litter_from_vegetation
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_waste_treatment_and_disposal
+
+ subsurface_litter_mass_content_of_carbon
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_savanna_and_grassland_fires
+
+ stem_mass_content_of_carbon
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_residential_and_commercial_combustion
+
+ mole_concentration_of_dissolved_inorganic_14C_in_sea_water
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_maritime_transport
+
+ surface_downward_mass_flux_of_14C_dioxide_abiotic_analogue_expressed_as_carbon
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_land_transport
+
+ surface_downward_mass_flux_of_13C_dioxide_abiotic_analogue_expressed_as_13C
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_industrial_processes_and_combustion
+
+ mole_concentration_of_dissolved_inorganic_13C_in_sea_water
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_forest_fires
+
+ northward_transformed_eulerian_mean_air_velocity
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_energy_production_and_distribution
+
+ surface_water_evaporation_flux
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission
+
+ water_volume_transport_into_sea_water_from_rivers
-
- tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_dry_deposition
+
+ stratiform_graupel_flux
-
- mass_fraction_of_elemental_carbon_dry_aerosol_particles_in_air
+
+ wood_debris_mass_content_of_carbon
-
- atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles
+
+ toa_outgoing_shortwave_flux_assuming_clear_sky_and_no_aerosol
-
- mass_concentration_of_elemental_carbon_dry_aerosol_particles_in_air
+
+ water_flux_into_sea_water_from_rivers
-
- lagrangian_tendency_of_air_pressure
+
+ integral_wrt_height_of_product_of_northward_wind_and_specific_humidity
-
- lagrangian_tendency_of_air_pressure
+
+ integral_wrt_height_of_product_of_eastward_wind_and_specific_humidity
-
- air_pressure_at_mean_sea_level
+
+ integral_wrt_depth_of_sea_water_temperature
-
- sea_floor_depth_below_geoid
+
+ integral_wrt_depth_of_sea_water_temperature
-
- sea_surface_height_above_geoid
+
+ integral_wrt_depth_of_sea_water_temperature
-
- sea_surface_height_above_geoid
+
+ integral_wrt_depth_of_sea_water_temperature
-
- tendency_of_atmosphere_mass_content_of_sea_salt_dry_aerosol_particles_due_to_emission
+
+ integral_wrt_depth_of_sea_water_practical_salinity
-
- tendency_of_atmosphere_mass_content_of_sea_salt_dry_aerosol_particles_due_to_emission
+
+ magnitude_of_sea_ice_displacement
@@ -33002,68 +33530,244 @@
ocean_mixed_layer_thickness_defined_by_vertical_tracer_diffusivity_deficit
-
- sea_surface_swell_wave_mean_period
+
+ northward_ocean_heat_transport_due_to_parameterized_eddy_advection
-
- sea_surface_wind_wave_mean_period
+
+ tendency_of_ocean_eddy_kinetic_energy_content_due_to_parameterized_eddy_advection
-
- sea_surface_wave_mean_period
+
+ ocean_tracer_laplacian_diffusivity_due_to_parameterized_mesoscale_eddy_advection
-
- sea_surface_wind_wave_to_direction
+
+ ocean_tracer_biharmonic_diffusivity_due_to_parameterized_mesoscale_eddy_advection
-
- sea_surface_swell_wave_to_direction
+
+ upward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
-
- mass_content_of_water_in_soil
+
+ sea_water_y_velocity_due_to_parameterized_mesoscale_eddies
-
- mass_content_of_water_in_soil_layer
+
+ sea_water_x_velocity_due_to_parameterized_mesoscale_eddies
-
- sea_surface_wave_significant_height
+
+ eastward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
-
- sea_surface_wind_wave_significant_height
+
+ northward_sea_water_velocity_due_to_parameterized_mesoscale_eddies
-
- sea_surface_swell_wave_significant_height
+
+ tendency_of_sea_water_temperature_due_to_parameterized_eddy_advection
-
- tendency_of_atmosphere_moles_of_sulfate_dry_aerosol_particles
+
+ tendency_of_sea_water_salinity_due_to_parameterized_eddy_advection
-
- tendency_of_atmosphere_moles_of_nitric_acid_trihydrate_ambient_aerosol_particles
+
+ ocean_y_overturning_mass_streamfunction_due_to_parameterized_eddy_advection
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_turbulent_deposition
+
+ ocean_meridional_overturning_mass_streamfunction_due_to_parameterized_eddy_advection
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_turbulent_deposition
+
+ ocean_mass_y_transport_due_to_advection_and_parameterized_eddy_advection
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_gravitational_settling
+
+ ocean_mass_x_transport_due_to_advection_and_parameterized_eddy_advection
-
- tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_gravitational_settling
+
+ ocean_heat_y_transport_due_to_parameterized_eddy_advection
+
+
+
+ ocean_heat_x_transport_due_to_parameterized_eddy_advection
+
+
+
+ northward_ocean_salt_transport_due_to_parameterized_eddy_advection
+
+
+
+ northward_ocean_freshwater_transport_due_to_parameterized_eddy_advection
+
+
+
+ integral_wrt_time_of_toa_outgoing_longwave_flux
+
+
+
+ integral_wrt_time_of_toa_net_downward_shortwave_flux
+
+
+
+ integral_wrt_time_of_surface_net_downward_shortwave_flux
+
+
+
+ integral_wrt_time_of_surface_net_downward_longwave_flux
+
+
+
+ tendency_of_mass_concentration_of_elemental_carbon_dry_aerosol_particles_in_air_due_to_emission_from_aviation
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_wet_deposition
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_turbulent_deposition
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_gravitational_settling
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_waste_treatment_and_disposal
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_savanna_and_grassland_fires
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_residential_and_commercial_combustion
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_maritime_transport
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_land_transport
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_industrial_processes_and_combustion
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_forest_fires
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission_from_energy_production_and_distribution
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_emission
+
+
+
+ tendency_of_atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles_due_to_dry_deposition
+
+
+
+ mass_fraction_of_elemental_carbon_dry_aerosol_particles_in_air
+
+
+
+ atmosphere_mass_content_of_elemental_carbon_dry_aerosol_particles
+
+
+
+ mass_concentration_of_elemental_carbon_dry_aerosol_particles_in_air
+
+
+
+ lagrangian_tendency_of_air_pressure
+
+
+
+ lagrangian_tendency_of_air_pressure
+
+
+
+ air_pressure_at_mean_sea_level
+
+
+
+ sea_floor_depth_below_geoid
+
+
+
+ sea_surface_height_above_geoid
+
+
+
+ sea_surface_height_above_geoid
+
+
+
+ tendency_of_atmosphere_mass_content_of_sea_salt_dry_aerosol_particles_due_to_emission
+
+
+
+ tendency_of_atmosphere_mass_content_of_sea_salt_dry_aerosol_particles_due_to_emission
+
+
+
+ sea_surface_swell_wave_mean_period
+
+
+
+ sea_surface_wind_wave_mean_period
+
+
+
+ sea_surface_wave_mean_period
+
+
+
+ sea_surface_wind_wave_to_direction
+
+
+
+ atmosphere_moles_of_carbon_monoxide
+
+
+
+ tendency_of_atmosphere_mass_content_of_water_vapor_due_to_advection
+
+
+
+ tendency_of_atmosphere_moles_of_nitrous_oxide
+
+
+
+ tendency_of_atmosphere_moles_of_molecular_hydrogen
+
+
+
+ tendency_of_atmosphere_moles_of_methyl_chloride
+
+
+
+ tendency_of_atmosphere_moles_of_methyl_bromide
+
+
+
+ y_wind
+
+
+
+ x_wind
@@ -33410,6 +34114,118 @@
atmosphere_convective_available_potential_energy
+
+ integral_wrt_time_of_surface_downward_sensible_heat_flux
+
+
+
+ integral_wrt_time_of_surface_downward_latent_heat_flux
+
+
+
+ integral_wrt_time_of_air_temperature_excess
+
+
+
+ integral_wrt_time_of_air_temperature_deficit
+
+
+
+ sea_water_y_velocity
+
+
+
+ sea_water_x_velocity
+
+
+
+ mole_concentration_of_organic_detritus_expressed_as_silicon_in_sea_water
+
+
+
+ mole_concentration_of_organic_detritus_expressed_as_nitrogen_in_sea_water
+
+
+
+ mole_concentration_of_microzooplankton_expressed_as_nitrogen_in_sea_water
+
+
+
+ mole_concentration_of_mesozooplankton_expressed_as_nitrogen_in_sea_water
+
+
+
+ atmosphere_moles_of_nitrous_oxide
+
+
+
+ atmosphere_moles_of_molecular_hydrogen
+
+
+
+ atmosphere_moles_of_methyl_chloride
+
+
+
+ atmosphere_moles_of_methyl_bromide
+
+
+
+ atmosphere_moles_of_methane
+
+
+
+ equivalent_thickness_at_stp_of_atmosphere_ozone_content
+
+
+
+ tendency_of_atmosphere_moles_of_nitric_acid_trihydrate_ambient_aerosol_particles
+
+
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_turbulent_deposition
+
+
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_turbulent_deposition
+
+
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_gravitational_settling
+
+
+
+ tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_expressed_as_sulfur_due_to_gravitational_settling
+
+
+
+ sea_surface_swell_wave_to_direction
+
+
+
+ mass_content_of_water_in_soil
+
+
+
+ mass_content_of_water_in_soil_layer
+
+
+
+ sea_surface_wave_significant_height
+
+
+
+ sea_surface_wind_wave_significant_height
+
+
+
+ sea_surface_swell_wave_significant_height
+
+
+
+ tendency_of_atmosphere_moles_of_sulfate_dry_aerosol_particles
+
+
mass_concentration_of_chlorophyll_in_sea_water
@@ -33438,14 +34254,6 @@
land_ice_surface_specific_mass_balance_rate
-
- tendency_of_atmosphere_mass_content_of_water_vapor_due_to_advection
-
-
-
- equivalent_thickness_at_stp_of_atmosphere_ozone_content
-
-
tendency_of_atmosphere_mass_content_of_particulate_organic_matter_dry_aerosol_particles_expressed_as_carbon_due_to_emission_from_industrial_processes_and_combustion
@@ -33502,22 +34310,6 @@
tendency_of_atmosphere_mass_content_of_sulfate_dry_aerosol_particles_due_to_emission
-
- tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_turbulence
-
-
-
- tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_shallow_convection
-
-
-
- tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_deep_convection
-
-
-
- atmosphere_net_upward_convective_mass_flux
-
-
tendency_of_troposphere_moles_of_molecular_hydrogen
@@ -33558,78 +34350,6 @@
tendency_of_middle_atmosphere_moles_of_carbon_monoxide
-
- tendency_of_atmosphere_moles_of_nitrous_oxide
-
-
-
- tendency_of_atmosphere_moles_of_molecular_hydrogen
-
-
-
- tendency_of_atmosphere_moles_of_methyl_chloride
-
-
-
- tendency_of_atmosphere_moles_of_methyl_bromide
-
-
-
- y_wind
-
-
-
- x_wind
-
-
-
- sea_water_y_velocity
-
-
-
- sea_water_x_velocity
-
-
-
- mole_concentration_of_organic_detritus_expressed_as_silicon_in_sea_water
-
-
-
- mole_concentration_of_organic_detritus_expressed_as_nitrogen_in_sea_water
-
-
-
- mole_concentration_of_microzooplankton_expressed_as_nitrogen_in_sea_water
-
-
-
- mole_concentration_of_mesozooplankton_expressed_as_nitrogen_in_sea_water
-
-
-
- atmosphere_moles_of_nitrous_oxide
-
-
-
- atmosphere_moles_of_molecular_hydrogen
-
-
-
- atmosphere_moles_of_methyl_chloride
-
-
-
- atmosphere_moles_of_methyl_bromide
-
-
-
- atmosphere_moles_of_methane
-
-
-
- atmosphere_moles_of_carbon_monoxide
-
-
tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_convection
@@ -33770,46 +34490,6 @@
eastward_water_vapor_flux_in_air
-
- surface_upward_sensible_heat_flux
-
-
-
- surface_temperature
-
-
-
- surface_temperature
-
-
-
- surface_temperature
-
-
-
- surface_net_downward_radiative_flux
-
-
-
- mole_fraction_of_hypochlorous_acid_in_air
-
-
-
- mole_fraction_of_chlorine_monoxide_in_air
-
-
-
- mole_fraction_of_chlorine_dioxide_in_air
-
-
-
- wind_mixing_energy_flux_into_sea_water
-
-
-
- water_flux_into_sea_water
-
-
upward_eastward_momentum_flux_in_air_due_to_orographic_gravity_waves
@@ -33838,6 +34518,30 @@
wave_frequency
+
+ tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_turbulence
+
+
+
+ tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_shallow_convection
+
+
+
+ tendency_of_mass_content_of_water_vapor_in_atmosphere_layer_due_to_deep_convection
+
+
+
+ atmosphere_net_upward_convective_mass_flux
+
+
+
+ mass_fraction_of_ozone_in_air
+
+
+
+ mass_fraction_of_convective_cloud_condensed_water_in_air
+
+
sea_surface_wind_wave_period
@@ -33850,6 +34554,46 @@
mass_concentration_of_suspended_matter_in_sea_water
+
+ surface_upward_sensible_heat_flux
+
+
+
+ surface_temperature
+
+
+
+ surface_temperature
+
+
+
+ surface_temperature
+
+
+
+ surface_net_downward_radiative_flux
+
+
+
+ mole_fraction_of_hypochlorous_acid_in_air
+
+
+
+ mole_fraction_of_chlorine_monoxide_in_air
+
+
+
+ mole_fraction_of_chlorine_dioxide_in_air
+
+
+
+ wind_mixing_energy_flux_into_sea_water
+
+
+
+ water_flux_into_sea_water
+
+
surface_drag_coefficient_in_air
@@ -33878,6 +34622,10 @@
mole_fraction_of_ozone_in_air
+
+ water_evapotranspiration_flux
+
+
isotropic_shortwave_radiance_in_air
@@ -33885,14 +34633,6 @@
isotropic_longwave_radiance_in_air
-
-
- mass_fraction_of_ozone_in_air
-
-
-
- mass_fraction_of_convective_cloud_condensed_water_in_air
-
diff --git a/lib/iris/__init__.py b/lib/iris/__init__.py
index 38465472ee..0e6670533f 100644
--- a/lib/iris/__init__.py
+++ b/lib/iris/__init__.py
@@ -89,12 +89,12 @@ def callback(cube, field, filename):
"""
+from collections.abc import Iterable
import contextlib
import glob
import importlib
import itertools
import os.path
-import pathlib
import threading
import iris._constraints
@@ -256,7 +256,8 @@ def context(self, **kwargs):
def _generate_cubes(uris, callback, constraints):
"""Returns a generator of cubes given the URIs and a callback."""
- if isinstance(uris, (str, pathlib.PurePath)):
+ if isinstance(uris, str) or not isinstance(uris, Iterable):
+ # Make a string, or other single item, into an iterable.
uris = [uris]
# Group collections of uris by their iris handler
@@ -273,6 +274,10 @@ def _generate_cubes(uris, callback, constraints):
urls = [":".join(x) for x in groups]
for cube in iris.io.load_http(urls, callback):
yield cube
+ elif scheme == "data":
+ data_objects = [x[1] for x in groups]
+ for cube in iris.io.load_data_objects(data_objects, callback):
+ yield cube
else:
raise ValueError("Iris cannot handle the URI scheme: %s" % scheme)
diff --git a/lib/iris/_concatenate.py b/lib/iris/_concatenate.py
index 5debc452ee..01a1bb689b 100644
--- a/lib/iris/_concatenate.py
+++ b/lib/iris/_concatenate.py
@@ -22,7 +22,7 @@
#
# * Cope with auxiliary coordinate factories.
#
-# * Allow concatentation over a user specified dimension.
+# * Allow concatenation over a user specified dimension.
#
@@ -160,6 +160,39 @@ def name(self):
return self.defn.name()
+class _DerivedCoordAndDims(
+ namedtuple("DerivedCoordAndDims", ["coord", "dims", "aux_factory"])
+):
+ """
+ Container for a derived coordinate, the associated AuxCoordFactory, and the
+ associated data dimension(s) spanned over a :class:`iris.cube.Cube`.
+
+ Args:
+
+ * coord:
+ A :class:`iris.coords.DimCoord` or :class:`iris.coords.AuxCoord`
+ coordinate instance.
+
+ * dims:
+ A tuple of the data dimension(s) spanned by the coordinate.
+
+ * aux_factory:
+ A :class:`iris.aux_factory.AuxCoordFactory` instance.
+
+ """
+
+ __slots__ = ()
+
+ def __eq__(self, other):
+ """Do not take aux factories into account for equality."""
+ result = NotImplemented
+ if isinstance(other, _DerivedCoordAndDims):
+ equal_coords = self.coord == other.coord
+ equal_dims = self.dims == other.dims
+ result = equal_coords and equal_dims
+ return result
+
+
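A small illustration of the equality semantics above, using placeholder string values for the fields (real usage passes coordinate and factory instances):

```python
# Two containers that differ only in their factory compare equal,
# because __eq__ deliberately ignores the aux_factory field.
a = _DerivedCoordAndDims("altitude_coord", (0, 1), "factory_a")
b = _DerivedCoordAndDims("altitude_coord", (0, 1), "factory_b")
assert a == b
```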
class _OtherMetaData(namedtuple("OtherMetaData", ["defn", "dims"])):
"""
Container for the metadata that defines a cell measure or ancillary
@@ -280,6 +313,7 @@ def concatenate(
check_aux_coords=True,
check_cell_measures=True,
check_ancils=True,
+ check_derived_coords=True,
):
"""
Concatenate the provided cubes over common existing dimensions.
@@ -296,6 +330,30 @@ def concatenate(
If True, raise an informative
        :class:`~iris.exceptions.ConcatenateError` if registration fails.
+ * check_aux_coords
+ Checks if the points and bounds of auxiliary coordinates of the cubes
+ match. This check is not applied to auxiliary coordinates that span the
+ dimension the concatenation is occurring along. Defaults to True.
+
+ * check_cell_measures
+ Checks if the data of cell measures of the cubes match. This check is
+ not applied to cell measures that span the dimension the concatenation
+ is occurring along. Defaults to True.
+
+ * check_ancils
+ Checks if the data of ancillary variables of the cubes match. This
+ check is not applied to ancillary variables that span the dimension the
+ concatenation is occurring along. Defaults to True.
+
+ * check_derived_coords
+ Checks if the points and bounds of derived coordinates of the cubes
+ match. This check is not applied to derived coordinates that span the
+ dimension the concatenation is occurring along. Note that differences
+ in scalar coordinates and dimensional coordinates used to derive the
+ coordinate are still checked. Checks for auxiliary coordinates used to
+ derive the coordinates can be ignored with `check_aux_coords`. Defaults
+ to True.
+
Returns:
A :class:`iris.cube.CubeList` of concatenated :class:`iris.cube.Cube`
instances.
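For example (a sketch, assuming ``cubes`` is a list of compatible :class:`~iris.cube.Cube` instances), the new check can be relaxed in the same way as the existing ones:

```python
from iris._concatenate import concatenate

# Default: derived coordinates are compared (check_derived_coords=True).
result = concatenate(cubes)

# Relaxed: skip the derived-coordinate comparison, e.g. when the
# factories are known to be consistent and the check is unnecessary.
result = concatenate(cubes, check_derived_coords=False)
```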
@@ -321,6 +379,7 @@ def concatenate(
check_aux_coords,
check_cell_measures,
check_ancils,
+ check_derived_coords,
)
if registered:
axis = proto_cube.axis
@@ -378,6 +437,8 @@ def __init__(self, cube):
self.cm_metadata = []
self.ancillary_variables_and_dims = []
self.av_metadata = []
+ self.derived_coords_and_dims = []
+ self.derived_metadata = []
self.dim_mapping = []
# Determine whether there are any anonymous cube dimensions.
@@ -437,6 +498,17 @@ def meta_key_func(dm):
av_and_dims = _CoordAndDims(av, tuple(dims))
self.ancillary_variables_and_dims.append(av_and_dims)
+ def name_key_func(factory):
+ return factory.name()
+
+ for factory in sorted(cube.aux_factories, key=name_key_func):
+ coord = factory.make_coord(cube.coord_dims)
+ dims = cube.coord_dims(coord)
+ metadata = _CoordMetaData(coord, dims)
+ self.derived_metadata.append(metadata)
+ coord_and_dims = _DerivedCoordAndDims(coord, tuple(dims), factory)
+ self.derived_coords_and_dims.append(coord_and_dims)
+
def _coordinate_differences(self, other, attr, reason="metadata"):
"""
Determine the names of the coordinates that differ between `self` and
@@ -544,6 +616,14 @@ def match(self, other, error_on_mismatch):
msgs.append(
msg_template.format("Ancillary variables", *differences)
)
+ # Check derived coordinates.
+ if self.derived_metadata != other.derived_metadata:
+ differences = self._coordinate_differences(
+ other, "derived_metadata"
+ )
+ msgs.append(
+ msg_template.format("Derived coordinates", *differences)
+ )
# Check scalar coordinates.
if self.scalar_coords != other.scalar_coords:
differences = self._coordinate_differences(
@@ -597,6 +677,7 @@ def __init__(self, cube_signature):
self.ancillary_variables_and_dims = (
cube_signature.ancillary_variables_and_dims
)
+ self.derived_coords_and_dims = cube_signature.derived_coords_and_dims
self.dim_coords = cube_signature.dim_coords
self.dim_mapping = cube_signature.dim_mapping
self.dim_extents = []
@@ -779,6 +860,11 @@ def concatenate(self):
# Concatenate the new ancillary variables
ancillary_variables_and_dims = self._build_ancillary_variables()
+ # Concatenate the new aux factories
+ aux_factories = self._build_aux_factories(
+ dim_coords_and_dims, aux_coords_and_dims
+ )
+
# Concatenate the new data payload.
data = self._build_data()
@@ -790,6 +876,7 @@ def concatenate(self):
aux_coords_and_dims=aux_coords_and_dims,
cell_measures_and_dims=cell_measures_and_dims,
ancillary_variables_and_dims=ancillary_variables_and_dims,
+ aux_factories=aux_factories,
**kwargs,
)
else:
@@ -807,6 +894,7 @@ def register(
check_aux_coords=False,
check_cell_measures=False,
check_ancils=False,
+ check_derived_coords=False,
):
"""
Determine whether the given source-cube is suitable for concatenation
@@ -827,6 +915,31 @@ def register(
* error_on_mismatch:
If True, raise an informative error if registration fails.
+ * check_aux_coords
+ Checks if the points and bounds of auxiliary coordinates of the
+ cubes match. This check is not applied to auxiliary coordinates
+ that span the dimension the concatenation is occurring along.
+ Defaults to False.
+
+ * check_cell_measures
+ Checks if the data of cell measures of the cubes match. This check
+ is not applied to cell measures that span the dimension the
+ concatenation is occurring along. Defaults to False.
+
+ * check_ancils
+ Checks if the data of ancillary variables of the cubes match. This
+ check is not applied to ancillary variables that span the dimension
+ the concatenation is occurring along. Defaults to False.
+
+ * check_derived_coords
+ Checks if the points and bounds of derived coordinates of the cubes
+ match. This check is not applied to derived coordinates that span
+ the dimension the concatenation is occurring along. Note that
+ differences in scalar coordinates and dimensional coordinates used
+ to derive the coordinate are still checked. Checks for auxiliary
+ coordinates used to derive the coordinates can be ignored with
+ `check_aux_coords`. Defaults to False.
+
Returns:
Boolean.
@@ -905,6 +1018,21 @@ def register(
if not coord_a == coord_b:
match = False
+ # Check for compatible derived coordinates.
+ if match:
+ if check_derived_coords:
+ for coord_a, coord_b in zip(
+ self._cube_signature.derived_coords_and_dims,
+ cube_signature.derived_coords_and_dims,
+ ):
+ # Derived coords that span the candidate axis can differ
+ if (
+ candidate_axis not in coord_a.dims
+ or candidate_axis not in coord_b.dims
+ ):
+ if not coord_a == coord_b:
+ match = False
+
if match:
# Register the cube as a source-cube for this proto-cube.
self._add_skeleton(coord_signature, cube.lazy_data())
@@ -1088,6 +1216,64 @@ def _build_ancillary_variables(self):
return ancillary_variables_and_dims
+ def _build_aux_factories(self, dim_coords_and_dims, aux_coords_and_dims):
+ """
+ Generate the aux factories for the new concatenated cube.
+
+ Args:
+
+ * dim_coords_and_dims:
+ A list of dimension coordinate and dimension tuple pairs from the
+ concatenated cube.
+
+ * aux_coords_and_dims:
+ A list of auxiliary coordinates and dimension(s) tuple pairs from
+ the concatenated cube.
+
+ Returns:
+ A list of :class:`iris.aux_factory.AuxCoordFactory`.
+
+ """
+        # Set up convenience hooks.
+ cube_signature = self._cube_signature
+ old_dim_coords = cube_signature.dim_coords
+ old_aux_coords = [a[0] for a in cube_signature.aux_coords_and_dims]
+ new_dim_coords = [d[0] for d in dim_coords_and_dims]
+ new_aux_coords = [a[0] for a in aux_coords_and_dims]
+ scalar_coords = cube_signature.scalar_coords
+
+ aux_factories = []
+
+ # Generate all the factories for the new concatenated cube.
+ for i, (coord, dims, factory) in enumerate(
+ cube_signature.derived_coords_and_dims
+ ):
+ # Check whether the derived coordinate of the factory spans the
+ # nominated dimension of concatenation.
+ if self.axis in dims:
+ # Update the dependencies of the factory with coordinates of
+ # the concatenated cube. We need to check all coordinate types
+ # here (dim coords, aux coords, and scalar coords).
+ new_dependencies = {}
+ for old_dependency in factory.dependencies.values():
+ if old_dependency in old_dim_coords:
+ dep_idx = old_dim_coords.index(old_dependency)
+ new_dependency = new_dim_coords[dep_idx]
+ elif old_dependency in old_aux_coords:
+ dep_idx = old_aux_coords.index(old_dependency)
+ new_dependency = new_aux_coords[dep_idx]
+ else:
+ dep_idx = scalar_coords.index(old_dependency)
+ new_dependency = scalar_coords[dep_idx]
+ new_dependencies[id(old_dependency)] = new_dependency
+
+ # Create new factory with the updated dependencies.
+ factory = factory.updated(new_dependencies)
+
+ aux_factories.append(factory)
+
+ return aux_factories
+
def _build_data(self):
"""
Generate the data payload for the new concatenated cube.
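The practical effect of the ``_build_aux_factories`` step above (a sketch, assuming ``cube_t0`` and ``cube_t1`` are time-split cubes carrying a hybrid-height factory) is that the derived coordinate is rebuilt on the concatenated result rather than being dropped:

```python
from iris.cube import CubeList

result = CubeList([cube_t0, cube_t1]).concatenate_cube()
# The factory's dependencies were remapped onto the concatenated cube's
# coordinates, so the derived coordinate is still available:
print(result.coord("altitude").shape)
```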
diff --git a/lib/iris/_lazy_data.py b/lib/iris/_lazy_data.py
index e0566fc8f2..4c294a7d2f 100644
--- a/lib/iris/_lazy_data.py
+++ b/lib/iris/_lazy_data.py
@@ -47,6 +47,15 @@ def is_lazy_data(data):
return result
+def is_lazy_masked_data(data):
+ """
+ Return True if the argument is both an Iris 'lazy' data array and the
+ underlying array is of masked type. Otherwise return False.
+
+ """
+ return is_lazy_data(data) and ma.isMA(da.utils.meta_from_array(data))
+
+
@lru_cache
def _optimum_chunksize_internals(
chunks,
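The new ``is_lazy_masked_data`` helper above can be exercised directly; for example:

```python
import dask.array as da
import numpy as np
import numpy.ma as ma

from iris._lazy_data import is_lazy_masked_data

plain = da.from_array(np.arange(4), chunks=2)
masked = da.from_array(ma.masked_array(np.arange(4), mask=[0, 1, 0, 1]), chunks=2)

print(is_lazy_masked_data(plain))         # False: lazy, but not masked
print(is_lazy_masked_data(masked))        # True: lazy and masked
print(is_lazy_masked_data(np.arange(4)))  # False: not lazy at all
```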
diff --git a/lib/iris/_merge.py b/lib/iris/_merge.py
index 5ca5f31a8e..0f748d6d34 100644
--- a/lib/iris/_merge.py
+++ b/lib/iris/_merge.py
@@ -298,7 +298,7 @@ class _CoordSignature(
):
"""
Criterion for identifying a specific type of :class:`iris.cube.Cube`
- based on its scalar and vector coorinate data and metadata, and
+ based on its scalar and vector coordinate data and metadata, and
auxiliary coordinate factories.
Args:
@@ -516,7 +516,7 @@ class _Relation(namedtuple("Relation", ["separable", "inseparable"])):
* separable:
A set of independent candidate dimension names.
- * inseperable:
+ * inseparable:
A set of dependent candidate dimension names.
"""
@@ -1419,7 +1419,7 @@ def _define_space(self, space, positions, indexes, function_matrix):
"""
- # Heuristic reordering of coordinate defintion indexes into
+ # Heuristic reordering of coordinate definition indexes into
# preferred dimension order.
def axis_and_name(name):
axis_dict = {"T": 1, "Z": 2, "Y": 3, "X": 4}
@@ -1467,7 +1467,7 @@ def axis_and_name(name):
}
else:
# TODO: Consider appropriate sort order (ascending,
- # decending) i.e. use CF positive attribute.
+ # descending) i.e. use CF positive attribute.
cells = sorted(indexes[name])
points = np.array(
[cell.point for cell in cells],
diff --git a/lib/iris/_representation/cube_summary.py b/lib/iris/_representation/cube_summary.py
index 6b0d4cf0f3..4e0fcfb1ea 100644
--- a/lib/iris/_representation/cube_summary.py
+++ b/lib/iris/_representation/cube_summary.py
@@ -264,13 +264,11 @@ def __init__(self, title, cell_methods):
self.names = []
self.values = []
self.contents = []
- for method in cell_methods:
- name = method.method
- # Remove "method: " from the front of the string, leaving the value.
- value = str(method)[len(name + ": ") :]
- self.names.append(name)
+ for index, method in enumerate(cell_methods):
+ value = str(method)
+ self.names.append(str(index))
self.values.append(value)
- content = "{}: {}".format(name, value)
+ content = "{}: {}".format(index, value)
self.contents.append(content)
diff --git a/lib/iris/analysis/__init__.py b/lib/iris/analysis/__init__.py
index f34cda1402..4cd9ccbe05 100644
--- a/lib/iris/analysis/__init__.py
+++ b/lib/iris/analysis/__init__.py
@@ -35,12 +35,18 @@
"""
-from collections import OrderedDict
+from __future__ import annotations
+
from collections.abc import Iterable
import functools
from functools import wraps
+from inspect import getfullargspec
+import itertools
+from numbers import Number
+from typing import Optional, Union
import warnings
+from cf_units import Unit
import dask.array as da
import numpy as np
import numpy.ma as ma
@@ -55,7 +61,9 @@
)
from iris.analysis._regrid import CurvilinearRegridder, RectilinearRegridder
import iris.coords
+from iris.coords import _DimensionalMetadata
from iris.exceptions import LazyAggregatorError
+import iris.util
__all__ = (
"Aggregator",
@@ -467,11 +475,13 @@ def __init__(
Kwargs:
* units_func (callable):
- | *Call signature*: (units)
+ | *Call signature*: (units, \**kwargs)
If provided, called to convert a cube's units.
Returns an :class:`cf_units.Unit`, or a
value that can be made into one.
+ To ensure backwards-compatibility, also accepts a callable with
+ call signature (units).
* lazy_func (callable or None):
An alternative to :data:`call_func` implementing a lazy
@@ -479,7 +489,8 @@ def __init__(
main operation, but should raise an error in unhandled cases.
Additional kwargs::
- Passed through to :data:`call_func` and :data:`lazy_func`.
+ Passed through to :data:`call_func`, :data:`lazy_func`, and
+ :data:`units_func`.
Aggregators are used by cube aggregation methods such as
:meth:`~iris.cube.Cube.collapsed` and
@@ -625,7 +636,11 @@ def update_metadata(self, cube, coords, **kwargs):
"""
# Update the units if required.
if self.units_func is not None:
- cube.units = self.units_func(cube.units)
+ argspec = getfullargspec(self.units_func)
+ if argspec.varkw is None: # old style
+ cube.units = self.units_func(cube.units)
+ else: # new style (preferred)
+ cube.units = self.units_func(cube.units, **kwargs)
def post_process(self, collapsed_cube, data_result, coords, **kwargs):
"""
@@ -693,13 +708,13 @@ class PercentileAggregator(_Aggregator):
"""
def __init__(self, units_func=None, **kwargs):
- """
+ r"""
Create a percentile aggregator.
Kwargs:
* units_func (callable):
- | *Call signature*: (units)
+ | *Call signature*: (units, \**kwargs)
If provided, called to convert a cube's units.
Returns an :class:`cf_units.Unit`, or a
@@ -934,13 +949,13 @@ class WeightedPercentileAggregator(PercentileAggregator):
"""
def __init__(self, units_func=None, lazy_func=None, **kwargs):
- """
+ r"""
Create a weighted percentile aggregator.
Kwargs:
* units_func (callable):
- | *Call signature*: (units)
+ | *Call signature*: (units, \**kwargs)
If provided, called to convert a cube's units.
Returns an :class:`cf_units.Unit`, or a
@@ -1172,8 +1187,112 @@ def post_process(self, collapsed_cube, data_result, coords, **kwargs):
return result
+class _Weights(np.ndarray):
+ """Class for handling weights for weighted aggregation.
+
+ This subclasses :class:`numpy.ndarray`; thus, all methods and properties of
+ :class:`numpy.ndarray` (e.g., `shape`, `ndim`, `view()`, etc.) are
+ available.
+
+ Details on subclassing :class:`numpy.ndarray` are given here:
+ https://numpy.org/doc/stable/user/basics.subclassing.html
+
+ """
+
+ def __new__(cls, weights, cube, units=None):
+ """Create class instance.
+
+ Args:
+
+ * weights (Cube, string, _DimensionalMetadata, array-like):
+ If given as a :class:`iris.cube.Cube`, use its data and units. If
+ given as a :obj:`str` or :class:`iris.coords._DimensionalMetadata`,
+ assume this is (the name of) a
+ :class:`iris.coords._DimensionalMetadata` object of the cube (i.e.,
+ one of :meth:`iris.cube.Cube.coords`,
+ :meth:`iris.cube.Cube.cell_measures`, or
+ :meth:`iris.cube.Cube.ancillary_variables`). If given as an
+ array-like object, use this directly and assume units of `1`. If
+ `units` is given, ignore all units derived above and use the ones
+ given by `units`.
+ * cube (Cube):
+ Input cube for aggregation. If weights is given as :obj:`str` or
+ :class:`iris.coords._DimensionalMetadata`, try to extract the
+ :class:`iris.coords._DimensionalMetadata` object and corresponding
+ dimensional mappings from this cube. Otherwise, this argument is
+ ignored.
+ * units (string, Unit):
+ If ``None``, use units derived from `weights`. Otherwise, overwrite
+ the units derived from `weights` and use `units`.
+
+ """
+ # `weights` is a cube
+        # Note: to avoid circular imports of Cube, we use duck typing via the
+        # "hasattr" check here
+ # --> Extract data and units from cube
+ if hasattr(weights, "add_aux_coord"):
+ obj = np.asarray(weights.data).view(cls)
+ obj.units = weights.units
+
+        # `weights` is a string or _DimensionalMetadata object
+ # --> Extract _DimensionalMetadata object from cube, broadcast it to
+ # correct shape using the corresponding dimensional mapping, and use
+ # its data and units
+ elif isinstance(weights, (str, _DimensionalMetadata)):
+ dim_metadata = cube._dimensional_metadata(weights)
+ arr = dim_metadata._values
+ if dim_metadata.shape != cube.shape:
+ arr = iris.util.broadcast_to_shape(
+ arr,
+ cube.shape,
+ dim_metadata.cube_dims(cube),
+ )
+ obj = np.asarray(arr).view(cls)
+ obj.units = dim_metadata.units
+
+ # Remaining types (e.g., np.ndarray): try to convert to ndarray.
+ else:
+ obj = np.asarray(weights).view(cls)
+ obj.units = Unit("1")
+
+ # Overwrite units from units argument if necessary
+ if units is not None:
+ obj.units = units
+
+ return obj
+
+ def __array_finalize__(self, obj):
+ """See https://numpy.org/doc/stable/user/basics.subclassing.html.
+
+ Note
+ ----
+ `obj` cannot be `None` here since ``_Weights.__new__`` does not call
+ ``super().__new__`` explicitly.
+
+ """
+ self.units = getattr(obj, "units", Unit("1"))
+
+ @classmethod
+ def update_kwargs(cls, kwargs, cube):
+ """Update ``weights`` keyword argument in-place.
+
+ Args:
+
+ * kwargs (dict):
+ Keyword arguments that will be updated in-place if a `weights`
+ keyword is present which is not ``None``.
+ * cube (Cube):
+ Input cube for aggregation. If weights is given as :obj:`str`, try
+ to extract a cell measure with the corresponding name from this
+ cube. Otherwise, this argument is ignored.
+
+ """
+ if kwargs.get("weights") is not None:
+ kwargs["weights"] = cls(kwargs["weights"], cube)
+
+
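A usage sketch for the class above (assuming ``cube`` carries a ``cell_area`` cell measure; the names are illustrative):

```python
import numpy as np
from cf_units import Unit

w_plain = _Weights(np.ones(cube.shape), cube)     # units default to '1'
w_area = _Weights("cell_area", cube)              # units from the cell measure
w_custom = _Weights(np.ones(cube.shape), cube, units=Unit("m2"))

print(w_plain.units, w_area.units, w_custom.units)
```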
def create_weighted_aggregator_fn(aggregator_fn, axis, **kwargs):
- """Return an aggregator function that can explicitely handle weights.
+ """Return an aggregator function that can explicitly handle weights.
Args:
@@ -1398,7 +1517,7 @@ def _weighted_quantile_1D(data, weights, quantiles, **kwargs):
array or float. Calculated quantile values (set to np.nan wherever sum
of weights is zero or masked)
"""
- # Return np.nan if no useable points found
+ # Return np.nan if no usable points found
if np.isclose(weights.sum(), 0.0) or ma.is_masked(weights.sum()):
return np.resize(np.array(np.nan), len(quantiles))
# Sort the data
@@ -1535,7 +1654,7 @@ def _proportion(array, function, axis, **kwargs):
# Otherwise, it is possible for numpy to return a masked array that has
# a dtype for its data that is different to the dtype of the fill-value,
# which can cause issues outside this function.
- # Reference - tests/unit/analyis/test_PROPORTION.py Test_masked.test_ma
+ # Reference - tests/unit/analysis/test_PROPORTION.py Test_masked.test_ma
numerator = _count(array, axis=axis, function=function, **kwargs)
result = ma.asarray(numerator / total_non_masked)
@@ -1583,27 +1702,19 @@ def _lazy_max_run(array, axis=-1, **kwargs):
def _rms(array, axis, **kwargs):
- # XXX due to the current limitations in `da.average` (see below), maintain
- # an explicit non-lazy aggregation function for now.
- # Note: retaining this function also means that if weights are passed to
- # the lazy aggregator, the aggregation will fall back to using this
- # non-lazy aggregator.
- rval = np.sqrt(ma.average(np.square(array), axis=axis, **kwargs))
- if not ma.isMaskedArray(array):
- rval = np.asarray(rval)
+ rval = np.sqrt(ma.average(array**2, axis=axis, **kwargs))
+
return rval
-@_build_dask_mdtol_function
def _lazy_rms(array, axis, **kwargs):
- # XXX This should use `da.average` and not `da.mean`, as does the above.
- # However `da.average` current doesn't handle masked weights correctly
- # (see https://github.com/dask/dask/issues/3846).
- # To work around this we use da.mean, which doesn't support weights at
- # all. Thus trying to use this aggregator with weights will currently
- # raise an error in dask due to the unexpected keyword `weights`,
- # rather than silently returning the wrong answer.
- return da.sqrt(da.mean(array**2, axis=axis, **kwargs))
+ # Note that, since we specifically need the ma version of average to handle
+ # weights correctly with masked data, we cannot rely on NEP13/18 and need
+ # to implement a separate lazy RMS function.
+
+ rval = da.sqrt(da.ma.average(array**2, axis=axis, **kwargs))
+
+ return rval
def _sum(array, **kwargs):
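A quick check (a sketch) that the lazy path above now matches the eager one for masked, weighted input, which the previous ``da.mean``-based version could not handle:

```python
import dask.array as da
import numpy as np
import numpy.ma as ma

data = ma.masked_array([1.0, 2.0, 3.0, 4.0], mask=[0, 0, 1, 0])
weights = np.array([1.0, 1.0, 1.0, 2.0])

eager = np.sqrt(ma.average(data**2, axis=0, weights=weights))
lazy = da.sqrt(
    da.ma.average(da.from_array(data, chunks=2) ** 2, axis=0, weights=weights)
)

print(eager, lazy.compute())  # the two results agree
```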
@@ -1638,6 +1749,18 @@ def _sum(array, **kwargs):
return rvalue
+def _sum_units_func(units, **kwargs):
+ """Multiply original units with weight units if possible."""
+ weights = kwargs.get("weights")
+ if weights is None: # no weights given or weights are None
+ result = units
+ elif hasattr(weights, "units"): # weights are _Weights
+ result = units * weights.units
+ else: # weights are regular np.ndarrays
+ result = units
+ return result
+
+
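With ``SUM`` wired to this units function (see below), weighted sums can carry physically meaningful units. A sketch, assuming a flux cube in ``kg m-2 s-1``, a ``cell_area`` cell measure in ``m2``, and string weights being resolved via ``_Weights.update_kwargs`` as above:

```python
import iris.analysis

total = cube.collapsed(
    ["latitude", "longitude"], iris.analysis.SUM, weights="cell_area"
)
# Units are multiplied: kg m-2 s-1 * m2, i.e. equivalent to kg s-1.
print(total.units)
```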
def _peak(array, **kwargs):
def column_segments(column):
nan_indices = np.where(np.isnan(column))[0]
@@ -1753,7 +1876,7 @@ def interp_order(length):
COUNT = Aggregator(
"count",
_count,
- units_func=lambda units: 1,
+ units_func=lambda units, **kwargs: 1,
lazy_func=_build_dask_mdtol_function(_count),
)
"""
@@ -1785,7 +1908,7 @@ def interp_order(length):
MAX_RUN = Aggregator(
None,
iris._lazy_data.non_lazy(_lazy_max_run),
- units_func=lambda units: 1,
+ units_func=lambda units, **kwargs: 1,
lazy_func=_build_dask_mdtol_function(_lazy_max_run),
)
"""
@@ -1913,6 +2036,7 @@ def interp_order(length):
result = cube.collapsed('longitude', iris.analysis.MEDIAN)
+
This aggregator handles masked data, but NOT lazy data. For lazy aggregation,
please try :obj:`~.PERCENTILE`.
@@ -2029,7 +2153,11 @@ def interp_order(length):
"""
-PROPORTION = Aggregator("proportion", _proportion, units_func=lambda units: 1)
+PROPORTION = Aggregator(
+ "proportion",
+ _proportion,
+ units_func=lambda units, **kwargs: 1,
+)
"""
An :class:`~iris.analysis.Aggregator` instance that calculates the
proportion, as a fraction, of :class:`~iris.cube.Cube` data occurrences
@@ -2071,14 +2199,16 @@ def interp_order(length):
the root mean square over a :class:`~iris.cube.Cube`, as computed by
((x0**2 + x1**2 + ... + xN-1**2) / N) ** 0.5.
-Additional kwargs associated with the use of this aggregator:
+Parameters
+----------
-* weights (float ndarray):
+weights : array-like, optional
Weights matching the shape of the cube or the length of the window for
rolling window operations. The weights are applied to the squares when
taking the mean.
-**For example**:
+Example
+-------
To compute the zonal root mean square over the *longitude* axis of a cube::
@@ -2128,6 +2258,7 @@ def interp_order(length):
SUM = WeightedAggregator(
"sum",
_sum,
+ units_func=_sum_units_func,
lazy_func=_build_dask_mdtol_function(_sum),
)
"""
@@ -2165,7 +2296,7 @@ def interp_order(length):
VARIANCE = Aggregator(
"variance",
ma.var,
- units_func=lambda units: units * units,
+ units_func=lambda units, **kwargs: units * units,
lazy_func=_build_dask_mdtol_function(da.var),
ddof=1,
)
@@ -2257,8 +2388,11 @@ class _Groupby:
"""
def __init__(
- self, groupby_coords, shared_coords=None, climatological=False
- ):
+ self,
+ groupby_coords: list[iris.coords.Coord],
+ shared_coords: Optional[list[tuple[iris.coords.Coord, int]]] = None,
+ climatological: bool = False,
+ ) -> None:
"""
Determine the group slices over the group-by coordinates.
@@ -2282,15 +2416,15 @@ def __init__(
"""
#: Group-by and shared coordinates that have been grouped.
- self.coords = []
- self._groupby_coords = []
- self._shared_coords = []
- self._slices_by_key = OrderedDict()
+ self.coords: list[iris.coords.Coord] = []
+ self._groupby_coords: list[iris.coords.Coord] = []
+ self._shared_coords: list[tuple[iris.coords.Coord, int]] = []
+ self._groupby_indices: list[tuple[int, ...]] = []
self._stop = None
# Ensure group-by coordinates are iterable.
if not isinstance(groupby_coords, Iterable):
raise TypeError(
- "groupby_coords must be a " "`collections.Iterable` type."
+ "groupby_coords must be a `collections.Iterable` type."
)
# Add valid group-by coordinates.
@@ -2302,7 +2436,7 @@ def __init__(
# Ensure shared coordinates are iterable.
if not isinstance(shared_coords, Iterable):
raise TypeError(
- "shared_coords must be a " "`collections.Iterable` type."
+ "shared_coords must be a `collections.Iterable` type."
)
# Add valid shared coordinates.
for coord, dim in shared_coords:
@@ -2313,9 +2447,11 @@ def __init__(
# Stores mapping from original cube coords to new ones, as metadata may
# not match
- self.coord_replacement_mapping = []
+ self.coord_replacement_mapping: list[
+ tuple[iris.coords.Coord, iris.coords.Coord]
+ ] = []
- def _add_groupby_coord(self, coord):
+ def _add_groupby_coord(self, coord: iris.coords.Coord) -> None:
if coord.ndim != 1:
raise iris.exceptions.CoordinateMultiDimError(coord)
if self._stop is None:
@@ -2324,12 +2460,12 @@ def _add_groupby_coord(self, coord):
raise ValueError("Group-by coordinates have different lengths.")
self._groupby_coords.append(coord)
- def _add_shared_coord(self, coord, dim):
+ def _add_shared_coord(self, coord: iris.coords.Coord, dim: int) -> None:
if coord.shape[dim] != self._stop and self._stop is not None:
raise ValueError("Shared coordinates have different lengths.")
self._shared_coords.append((coord, dim))
- def group(self):
+ def group(self) -> list[tuple[int, ...]]:
"""
Calculate the groups and associated slices over one or more group-by
coordinates.
@@ -2338,147 +2474,84 @@ def group(self):
group slices.
Returns:
- A generator of the coordinate group slices.
-
- """
- if self._groupby_coords:
- if not self._slices_by_key:
- items = []
- groups = []
-
- for coord in self._groupby_coords:
- groups.append(iris.coords._GroupIterator(coord.points))
- items.append(next(groups[-1]))
-
- # Construct the group slice for each group over the group-by
- # coordinates. Keep constructing until all group-by coordinate
- # groups are exhausted.
- while any([item is not None for item in items]):
- # Determine the extent (start, stop) of the group given
- # each current group-by coordinate group.
- start = max(
- [
- item.groupby_slice.start
- for item in items
- if item is not None
- ]
- )
- stop = min(
- [
- item.groupby_slice.stop
- for item in items
- if item is not None
- ]
- )
- # Construct composite group key for the group using the
- # start value from each group-by coordinate.
- key = tuple(
- [coord.points[start] for coord in self._groupby_coords]
- )
- # Associate group slice with group key within the ordered
- # dictionary.
- self._slices_by_key.setdefault(key, []).append(
- slice(start, stop)
- )
- # Prepare for the next group slice construction over the
- # group-by coordinates.
- for item_index, item in enumerate(items):
- if item is None:
- continue
- # Get coordinate current group slice.
- groupby_slice = item.groupby_slice
- # Determine whether coordinate has spanned all its
- # groups i.e. its full length
- # or whether we need to get the coordinates next group.
- if groupby_slice.stop == self._stop:
- # This coordinate has exhausted all its groups,
- # so remove it.
- items[item_index] = None
- elif groupby_slice.stop == stop:
- # The current group of this coordinate is
- # exhausted, so get the next one.
- items[item_index] = next(groups[item_index])
-
- # Merge multiple slices together into one tuple.
- self._slice_merge()
- # Calculate the new group-by coordinates.
- self._compute_groupby_coords()
- # Calculate the new shared coordinates.
- self._compute_shared_coords()
- # Generate the group-by slices/groups.
- for groupby_slice in self._slices_by_key.values():
- yield groupby_slice
-
- return
-
- def _slice_merge(self):
- """
- Merge multiple slices into one tuple and collapse items from
- containing list.
-
- """
- # Iterate over the ordered dictionary in order to reduce
- # multiple slices into a single tuple and collapse
- # all items from containing list.
- for key, groupby_slices in self._slices_by_key.items():
- if len(groupby_slices) > 1:
- # Compress multiple slices into tuple representation.
- groupby_indicies = []
-
- for groupby_slice in groupby_slices:
- groupby_indicies.extend(
- range(groupby_slice.start, groupby_slice.stop)
- )
-
- self._slices_by_key[key] = tuple(groupby_indicies)
- else:
- # Remove single inner slice from list.
- self._slices_by_key[key] = groupby_slices[0]
-
- def _compute_groupby_coords(self):
+ A list of the coordinate group slices.
+
+ """
+ if not self._groupby_indices:
+ # Construct the group indices for each group over the group-by
+ # coordinates. Keep constructing until all group-by coordinate
+ # groups are exhausted.
+
+ def group_iterator(points):
+ start = 0
+ for _, group in itertools.groupby(points):
+ stop = sum((1 for _ in group), start)
+ yield slice(start, stop)
+ start = stop
+
+ groups = [group_iterator(c.points) for c in self._groupby_coords]
+ groupby_slices = [next(group) for group in groups]
+ indices_by_key: dict[
+ tuple[Union[Number, str], ...], list[int]
+ ] = {}
+ while any(s is not None for s in groupby_slices):
+ # Determine the extent (start, stop) of the group given
+ # each current group-by coordinate group.
+ start = max(s.start for s in groupby_slices if s is not None)
+ stop = min(s.stop for s in groupby_slices if s is not None)
+ # Construct composite group key for the group using the
+ # start value from each group-by coordinate.
+ key = tuple(
+ coord.points[start] for coord in self._groupby_coords
+ )
+                # Associate the group's indices with its group key in the
+                # dictionary.
+ indices_by_key.setdefault(key, []).extend(range(start, stop))
+ # Prepare for the next group slice construction over the
+ # group-by coordinates.
+ for index, groupby_slice in enumerate(groupby_slices):
+ if groupby_slice is None:
+ continue
+ # Determine whether coordinate has spanned all its
+ # groups i.e. its full length
+ # or whether we need to get the coordinates next group.
+ if groupby_slice.stop == self._stop:
+ # This coordinate has exhausted all its groups,
+ # so remove it.
+ groupby_slices[index] = None
+ elif groupby_slice.stop == stop:
+ # The current group of this coordinate is
+ # exhausted, so get the next one.
+ groupby_slices[index] = next(groups[index])
+
+ # Cache the indices
+ self._groupby_indices = [tuple(i) for i in indices_by_key.values()]
+ # Calculate the new group-by coordinates.
+ self._compute_groupby_coords()
+ # Calculate the new shared coordinates.
+ self._compute_shared_coords()
+
+ # Return the group-by indices/groups.
+ return self._groupby_indices
+
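The new run-length helper used by ``group()`` can be exercised standalone; a rendering of the same logic:

```python
import itertools

def group_iterator(points):
    # Yield one slice per run of equal, adjacent values.
    start = 0
    for _, group in itertools.groupby(points):
        stop = sum((1 for _ in group), start)
        yield slice(start, stop)
        start = stop

print(list(group_iterator([2000, 2000, 2001, 2001, 2001, 2002])))
# [slice(0, 2, None), slice(2, 5, None), slice(5, 6, None)]
```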
+ def _compute_groupby_coords(self) -> None:
"""Create new group-by coordinates given the group slices."""
-
- groupby_slice = []
-
- # Iterate over the ordered dictionary in order to construct
- # a group-by slice that samples the first element from each group.
- for key_slice in self._slices_by_key.values():
- if isinstance(key_slice, tuple):
- groupby_slice.append(key_slice[0])
- else:
- groupby_slice.append(key_slice.start)
-
- groupby_slice = np.array(groupby_slice)
+ # Construct a group-by slice that samples the first element from each
+ # group.
+ groupby_slice = np.array([i[0] for i in self._groupby_indices])
# Create new group-by coordinates from the group-by slice.
self.coords = [coord[groupby_slice] for coord in self._groupby_coords]
- def _compute_shared_coords(self):
+ def _compute_shared_coords(self) -> None:
"""Create the new shared coordinates given the group slices."""
-
- groupby_indices = []
- groupby_bounds = []
-
- # Iterate over the ordered dictionary in order to construct a list of
- # tuple group indices, and a list of the respective bounds of those
- # indices.
- for key_slice in self._slices_by_key.values():
- if isinstance(key_slice, tuple):
- indices = key_slice
- else:
- indices = tuple(range(*key_slice.indices(self._stop)))
-
- groupby_indices.append(indices)
- groupby_bounds.append((indices[0], indices[-1]))
-
- # Create new shared bounded coordinates.
for coord, dim in self._shared_coords:
climatological_coord = (
self.climatological and coord.units.is_time_reference()
)
if coord.points.dtype.kind in "SU":
if coord.bounds is None:
- new_points = []
+ new_points_list = []
new_bounds = None
# np.apply_along_axis does not work with str.join, so we
# need to loop through the array directly. First move axis
@@ -2486,32 +2559,32 @@ def _compute_shared_coords(self):
work_arr = np.moveaxis(coord.points, dim, -1)
shape = work_arr.shape
work_shape = (-1, shape[-1])
- new_shape = (len(self),)
+ new_shape: tuple[int, ...] = (len(self),)
if coord.ndim > 1:
new_shape += shape[:-1]
work_arr = work_arr.reshape(work_shape)
- for indices in groupby_indices:
+ for indices in self._groupby_indices:
for arr in work_arr:
- new_points.append("|".join(arr.take(indices)))
+ new_points_list.append("|".join(arr.take(indices)))
# Reinstate flattened dimensions. Aggregated dim now leads.
- new_points = np.array(new_points).reshape(new_shape)
+ new_points = np.array(new_points_list).reshape(new_shape)
# Move aggregated dimension back to position it started in.
new_points = np.moveaxis(new_points, 0, dim)
else:
msg = (
- "collapsing the bounded string coordinate {0!r}"
- " is not supported".format(coord.name())
+ "collapsing the bounded string coordinate"
+ f" {coord.name()!r} is not supported"
)
raise ValueError(msg)
else:
- new_bounds = []
+ new_bounds_list = []
if coord.has_bounds():
# Derive new coord's bounds from bounds.
item = coord.bounds
- maxmin_axis = (dim, -1)
+ maxmin_axis: Union[int, tuple[int, int]] = (dim, -1)
first_choices = coord.bounds.take(0, -1)
last_choices = coord.bounds.take(1, -1)
@@ -2528,12 +2601,13 @@ def _compute_shared_coords(self):
# Construct list of coordinate group boundary pairs.
if monotonic:
# Use first and last bound or point for new bounds.
- for start, stop in groupby_bounds:
+ for indices in self._groupby_indices:
+ start, stop = indices[0], indices[-1]
if (
getattr(coord, "circular", False)
and (stop + 1) == self._stop
):
- new_bounds.append(
+ new_bounds_list.append(
[
first_choices.take(start, dim),
first_choices.take(0, dim)
@@ -2541,7 +2615,7 @@ def _compute_shared_coords(self):
]
)
else:
- new_bounds.append(
+ new_bounds_list.append(
[
first_choices.take(start, dim),
last_choices.take(stop, dim),
@@ -2549,9 +2623,9 @@ def _compute_shared_coords(self):
)
else:
# Use min and max bound or point for new bounds.
- for indices in groupby_indices:
+ for indices in self._groupby_indices:
item_slice = item.take(indices, dim)
- new_bounds.append(
+ new_bounds_list.append(
[
item_slice.min(axis=maxmin_axis),
item_slice.max(axis=maxmin_axis),
@@ -2562,7 +2636,7 @@ def _compute_shared_coords(self):
# dimension last, and the aggregated dimension back in its
# original position.
new_bounds = np.moveaxis(
- np.array(new_bounds), (0, 1), (dim, -1)
+ np.array(new_bounds_list), (0, 1), (dim, -1)
)
# Now create the new bounded group shared coordinate.
@@ -2574,8 +2648,8 @@ def _compute_shared_coords(self):
new_points = new_bounds.mean(-1)
except TypeError:
msg = (
- "The {0!r} coordinate on the collapsing dimension"
- " cannot be collapsed.".format(coord.name())
+ f"The {coord.name()!r} coordinate on the collapsing"
+ " dimension cannot be collapsed."
)
raise ValueError(msg)
@@ -2593,29 +2667,16 @@ def _compute_shared_coords(self):
self.coords.append(new_coord)
- def __len__(self):
+ def __len__(self) -> int:
"""Calculate the number of groups given the group-by coordinates."""
+ return len(self.group())
- if self._slices_by_key:
- value = len(self._slices_by_key)
- else:
- value = len([s for s in self.group()])
-
- return value
-
- def __repr__(self):
+ def __repr__(self) -> str:
groupby_coords = [coord.name() for coord in self._groupby_coords]
-
- if self._shared_coords_by_name:
- shared_coords = [coord.name() for coord in self._shared_coords]
- shared_string = ", shared_coords=%r)" % shared_coords
- else:
- shared_string = ")"
-
- return "%s(%r%s" % (
- self.__class__.__name__,
- groupby_coords,
- shared_string,
+ shared_coords = [coord.name() for coord, _ in self._shared_coords]
+ return (
+ f"{self.__class__.__name__}({groupby_coords!r}"
+ f", shared_coords={shared_coords!r})"
)
@@ -2807,7 +2868,7 @@ def __init__(self, mdtol=1):
        Both source and target cubes must have an XY grid defined by
separate X and Y dimensions with dimension coordinates.
All of the XY dimension coordinates must also be bounded, and have
- the same cooordinate system.
+ the same coordinate system.
"""
if not (0 <= mdtol <= 1):
diff --git a/lib/iris/analysis/_area_weighted.py b/lib/iris/analysis/_area_weighted.py
index 3b728e9a43..edead3948a 100644
--- a/lib/iris/analysis/_area_weighted.py
+++ b/lib/iris/analysis/_area_weighted.py
@@ -433,7 +433,7 @@ def _spherical_area(y_bounds, x_bounds, radius=1.0):
Args:
* y_bounds:
- An (n, 2) shaped NumPy array of latitide bounds in radians.
+ An (n, 2) shaped NumPy array of latitude bounds in radians.
* x_bounds:
An (m, 2) shaped NumPy array of longitude bounds in radians.
* radius:
@@ -586,7 +586,7 @@ def _regrid_area_weighted_array(
y_dim = src_data.ndim - 2
# Create empty "pre-averaging" data array that will enable the
- # src_data data coresponding to a given target grid point,
+ # src_data data corresponding to a given target grid point,
# to be stacked per point.
# Note that dtype is not preserved and that the array mask
# allows for regions that do not overlap.
diff --git a/lib/iris/analysis/_interpolation.py b/lib/iris/analysis/_interpolation.py
index f5e89a9e51..34dcae3026 100644
--- a/lib/iris/analysis/_interpolation.py
+++ b/lib/iris/analysis/_interpolation.py
@@ -213,7 +213,7 @@ def __init__(self, src_cube, coords, method, extrapolation_mode):
# Trigger any deferred loading of the source cube's data and snapshot
# its state to ensure that the interpolator is impervious to external
# changes to the original source cube. The data is loaded to prevent
- # the snaphot having lazy data, avoiding the potential for the
+ # the snapshot having lazy data, avoiding the potential for the
# same data to be loaded again and again.
if src_cube.has_lazy_data():
src_cube.data
diff --git a/lib/iris/analysis/_regrid.py b/lib/iris/analysis/_regrid.py
index f1891a48e4..4592a0ede7 100644
--- a/lib/iris/analysis/_regrid.py
+++ b/lib/iris/analysis/_regrid.py
@@ -239,7 +239,7 @@ def _regrid_indices(cells, depth, points):
x_indices = _regrid_indices(tx_cells, tx_depth, sx_points)
y_indices = _regrid_indices(ty_cells, ty_depth, sy_points)
- # Now construct a sparse M x N matix, where M is the flattened target
+ # Now construct a sparse M x N matrix, where M is the flattened target
# space, and N is the flattened source space. The sparse matrix will then
# be populated with those source cube points that contribute to a specific
# target cube cell.
@@ -1021,7 +1021,7 @@ def _create_cube(
The dimensions of the X and Y coordinate within the source Cube.
tgt_coords : tuple of :class:`iris.coords.Coord`\\ 's
Either two 1D :class:`iris.coords.DimCoord`\\ 's, two 1D
- :class:`iris.experimental.ugrid.DimCoord`\\ 's or two ND
+ :class:`iris.experimental.ugrid.DimCoord`\\ 's or two n-D
:class:`iris.coords.AuxCoord`\\ 's representing the new grid's
X and Y coordinates.
num_tgt_dims : int
diff --git a/lib/iris/analysis/cartography.py b/lib/iris/analysis/cartography.py
index a8e90a63ad..0d17f0b38a 100644
--- a/lib/iris/analysis/cartography.py
+++ b/lib/iris/analysis/cartography.py
@@ -15,6 +15,7 @@
import cartopy.crs as ccrs
import cartopy.img_transform
import cf_units
+import dask.array as da
import numpy as np
import numpy.ma as ma
@@ -1012,7 +1013,7 @@ def _transform_distance_vectors_tolerance_mask(
"""
Return a mask that can be applied to data array to mask elements
where the magnitude of vectors are not preserved due to numerical
- errors introduced by the tranformation between coordinate systems.
+ errors introduced by the transformation between coordinate systems.
Args:
* src_crs (`cartopy.crs.Projection`):
@@ -1206,9 +1207,15 @@ def rotate_winds(u_cube, v_cube, target_cs):
x = x.transpose()
y = y.transpose()
- # Create resulting cubes.
- ut_cube = u_cube.copy()
- vt_cube = v_cube.copy()
+ # Create resulting cubes - produce lazy output data if at least
+ # one input cube has lazy data
+ lazy_output = u_cube.has_lazy_data() or v_cube.has_lazy_data()
+ if lazy_output:
+ ut_cube = u_cube.copy(data=da.empty_like(u_cube.lazy_data()))
+ vt_cube = v_cube.copy(data=da.empty_like(v_cube.lazy_data()))
+ else:
+ ut_cube = u_cube.copy()
+ vt_cube = v_cube.copy()
ut_cube.rename("transformed_{}".format(u_cube.name()))
vt_cube.rename("transformed_{}".format(v_cube.name()))
@@ -1236,8 +1243,12 @@ def rotate_winds(u_cube, v_cube, target_cs):
apply_mask = mask.any()
if apply_mask:
# Make masked arrays to accept masking.
- ut_cube.data = ma.asanyarray(ut_cube.data)
- vt_cube.data = ma.asanyarray(vt_cube.data)
+ if lazy_output:
+ ut_cube = ut_cube.copy(data=da.ma.empty_like(ut_cube.core_data()))
+ vt_cube = vt_cube.copy(data=da.ma.empty_like(vt_cube.core_data()))
+ else:
+ ut_cube.data = ma.asanyarray(ut_cube.data)
+ vt_cube.data = ma.asanyarray(vt_cube.data)
# Project vectors with u, v components one horiz slice at a time and
# insert into the resulting cubes.
@@ -1250,16 +1261,20 @@ def rotate_winds(u_cube, v_cube, target_cs):
for dim in dims:
index[dim] = slice(None, None)
index = tuple(index)
- u = u_cube.data[index]
- v = v_cube.data[index]
+ u = u_cube.core_data()[index]
+ v = v_cube.core_data()[index]
ut, vt = _transform_distance_vectors(u, v, ds, dx2, dy2)
if apply_mask:
- ut = ma.asanyarray(ut)
- ut[mask] = ma.masked
- vt = ma.asanyarray(vt)
- vt[mask] = ma.masked
- ut_cube.data[index] = ut
- vt_cube.data[index] = vt
+ if lazy_output:
+ ut = da.ma.masked_array(ut, mask=mask)
+ vt = da.ma.masked_array(vt, mask=mask)
+ else:
+ ut = ma.asanyarray(ut)
+ ut[mask] = ma.masked
+ vt = ma.asanyarray(vt)
+ vt[mask] = ma.masked
+ ut_cube.core_data()[index] = ut
+ vt_cube.core_data()[index] = vt
# Calculate new coords of locations in target coordinate system.
xyz_tran = target_crs.transform_points(src_crs, x, y)
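Laziness is now preserved end-to-end; a sketch, assuming lazily loaded wind cubes and a target coordinate system:

```python
from iris.analysis.cartography import rotate_winds

ut_cube, vt_cube = rotate_winds(u_cube, v_cube, target_cs)
print(ut_cube.has_lazy_data())  # True when either input was lazy
```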
diff --git a/lib/iris/analysis/maths.py b/lib/iris/analysis/maths.py
index 09a02ad51c..b77c6cd80f 100644
--- a/lib/iris/analysis/maths.py
+++ b/lib/iris/analysis/maths.py
@@ -225,33 +225,35 @@ def _assert_is_cube(cube):
@_lenient_client(services=SERVICES)
def add(cube, other, dim=None, in_place=False):
"""
- Calculate the sum of two cubes, or the sum of a cube and a
- coordinate or scalar value.
+ Calculate the sum of two cubes, or the sum of a cube and a coordinate or
+ array or scalar value.
- When summing two cubes, they must both have the same coordinate
- systems & data resolution.
+ When summing two cubes, they must both have the same coordinate systems and
+ data resolution.
- When adding a coordinate to a cube, they must both share the same
- number of elements along a shared axis.
+ When adding a coordinate to a cube, they must both share the same number of
+ elements along a shared axis.
- Args:
+ Parameters
+ ----------
- * cube:
- An instance of :class:`iris.cube.Cube`.
- * other:
- An instance of :class:`iris.cube.Cube` or :class:`iris.coords.Coord`,
- or a number or :class:`numpy.ndarray`.
+ cube : iris.cube.Cube
+ First operand to add.
- Kwargs:
+    other : iris.cube.Cube, iris.coords.Coord, number, numpy.ndarray or dask.array.Array
+ Second operand to add.
- * dim:
- If supplying a coord with no match on the cube, you must supply
- the dimension to process.
- * in_place:
- Whether to create a new Cube, or alter the given "cube".
+ dim : int, optional
+ If `other` is a coord which does not exist on the cube, specify the
+ dimension to which it should be mapped.
- Returns:
- An instance of :class:`iris.cube.Cube`.
+ in_place : bool, default=False
+ If `True`, alters the input cube. Otherwise a new cube is created.
+
+ Returns
+ -------
+
+ iris.cube.Cube
Notes
------
@@ -280,32 +282,34 @@ def add(cube, other, dim=None, in_place=False):
def subtract(cube, other, dim=None, in_place=False):
"""
Calculate the difference between two cubes, or the difference between
- a cube and a coordinate or scalar value.
+ a cube and a coordinate or array or scalar value.
- When subtracting two cubes, they must both have the same coordinate
- systems & data resolution.
+ When differencing two cubes, they must both have the same coordinate systems
+ and data resolution.
- When subtracting a coordinate to a cube, they must both share the
- same number of elements along a shared axis.
+ When subtracting a coordinate from a cube, they must both share the same
+ number of elements along a shared axis.
- Args:
+ Parameters
+ ----------
- * cube:
- An instance of :class:`iris.cube.Cube`.
- * other:
- An instance of :class:`iris.cube.Cube` or :class:`iris.coords.Coord`,
- or a number or :class:`numpy.ndarray`.
+ cube : iris.cube.Cube
+ Cube from which to subtract.
- Kwargs:
+    other : iris.cube.Cube, iris.coords.Coord, number, numpy.ndarray or dask.array.Array
+ Object to subtract from the cube.
- * dim:
- If supplying a coord with no match on the cube, you must supply
- the dimension to process.
- * in_place:
- Whether to create a new Cube, or alter the given "cube".
+ dim : int, optional
+ If `other` is a coord which does not exist on the cube, specify the
+ dimension to which it should be mapped.
- Returns:
- An instance of :class:`iris.cube.Cube`.
+ in_place : bool, default=False
+ If `True`, alters the input cube. Otherwise a new cube is created.
+
+ Returns
+ -------
+
+ iris.cube.Cube
Notes
------
@@ -348,8 +352,8 @@ def _add_subtract_common(
operation_name - the public name of the operation (e.g. 'divide')
cube - the cube whose data is used as the first argument
to `operation_function`
- other - the cube, coord, ndarray or number whose data is
- used as the second argument
+ other - the cube, coord, ndarray, dask array or number whose
+ data is used as the second argument
new_dtype - the expected dtype of the output. Used in the
case of scalar masked arrays
dim - dimension along which to apply `other` if it's a
@@ -384,24 +388,35 @@ def _add_subtract_common(
@_lenient_client(services=SERVICES)
def multiply(cube, other, dim=None, in_place=False):
"""
- Calculate the product of a cube and another cube or coordinate.
+ Calculate the product of two cubes, or the product of a cube and a coordinate
+ or array or scalar value.
- Args:
+ When multiplying two cubes, they must both have the same coordinate systems
+ and data resolution.
- * cube:
- An instance of :class:`iris.cube.Cube`.
- * other:
- An instance of :class:`iris.cube.Cube` or :class:`iris.coords.Coord`,
- or a number or :class:`numpy.ndarray`.
+    When multiplying a cube by a coordinate, they must both share the same number
+ of elements along a shared axis.
- Kwargs:
+ Parameters
+ ----------
- * dim:
- If supplying a coord with no match on the cube, you must supply
- the dimension to process.
+ cube : iris.cube.Cube
+ First operand to multiply.
- Returns:
- An instance of :class:`iris.cube.Cube`.
+    other : iris.cube.Cube, iris.coords.Coord, number, numpy.ndarray or dask.array.Array
+ Second operand to multiply.
+
+ dim : int, optional
+ If `other` is a coord which does not exist on the cube, specify the
+ dimension to which it should be mapped.
+
+ in_place : bool, default=False
+ If `True`, alters the input cube. Otherwise a new cube is created.
+
+ Returns
+ -------
+
+ iris.cube.Cube
Notes
------
@@ -461,24 +476,35 @@ def _inplace_common_checks(cube, other, math_op):
@_lenient_client(services=SERVICES)
def divide(cube, other, dim=None, in_place=False):
"""
- Calculate the division of a cube by a cube or coordinate.
+ Calculate the ratio of two cubes, or the ratio of a cube and a coordinate
+ or array or scalar value.
- Args:
+ When dividing a cube by another cube, they must both have the same coordinate
+ systems and data resolution.
- * cube:
- An instance of :class:`iris.cube.Cube`.
- * other:
- An instance of :class:`iris.cube.Cube` or :class:`iris.coords.Coord`,
- or a number or :class:`numpy.ndarray`.
+ When dividing a cube by a coordinate, they must both share the same number
+ of elements along a shared axis.
- Kwargs:
+ Parameters
+ ----------
- * dim:
- If supplying a coord with no match on the cube, you must supply
- the dimension to process.
+ cube : iris.cube.Cube
+ Numerator.
- Returns:
- An instance of :class:`iris.cube.Cube`.
+    other : iris.cube.Cube, iris.coords.Coord, number, numpy.ndarray or dask.array.Array
+ Denominator.
+
+ dim : int, optional
+ If `other` is a coord which does not exist on the cube, specify the
+ dimension to which it should be mapped.
+
+ in_place : bool, default=False
+ If `True`, alters the input cube. Otherwise a new cube is created.
+
+ Returns
+ -------
+
+ iris.cube.Cube
Notes
------
@@ -842,8 +868,8 @@ def _binary_op_common(
operation_name - the public name of the operation (e.g. 'divide')
cube - the cube whose data is used as the first argument
to `operation_function`
- other - the cube, coord, ndarray or number whose data is
- used as the second argument
+ other - the cube, coord, ndarray, dask array or number whose
+ data is used as the second argument
new_dtype - the expected dtype of the output. Used in the
case of scalar masked arrays
new_unit - unit for the resulting quantity
@@ -883,7 +909,10 @@ def _binary_op_common(
rhs = other.core_data()
else:
# The rhs must be an array.
- rhs = np.asanyarray(other)
+ if iris._lazy_data.is_lazy_data(other):
+ rhs = other
+ else:
+ rhs = np.asanyarray(other)
def unary_func(lhs):
data = operation_function(lhs, rhs)
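With this branch in place, a lazy right-hand operand stays lazy instead of being realised by ``np.asanyarray``; a sketch, assuming ``cube`` has shape ``(3, 4)``:

```python
import dask.array as da
from iris.analysis.maths import add

offset = da.ones((3, 4), chunks=(3, 2))
result = add(cube, offset)  # rhs is passed through as a dask array
```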
@@ -1194,7 +1223,7 @@ def __call__(
Kwargs:
* other
- A cube, coord, ndarray or number whose data is used as the
+ A cube, coord, ndarray, dask array or number whose data is used as the
second argument to the data function.
* new_name:
diff --git a/lib/iris/analysis/trajectory.py b/lib/iris/analysis/trajectory.py
index 24f7a9dede..84ce89ab6f 100644
--- a/lib/iris/analysis/trajectory.py
+++ b/lib/iris/analysis/trajectory.py
@@ -347,7 +347,7 @@ def interpolate(cube, sample_points, method=None):
for columns_coord in columns.dim_coords + columns.aux_coords:
src_dims = cube.coord_dims(columns_coord)
if not squish_my_dims.isdisjoint(src_dims):
- # Mapping the cube indicies onto the coord
+ # Mapping the cube indices onto the coord
initial_coord_inds = [initial_inds[ind] for ind in src_dims]
# Making the final ones the same way as for the cube
# 0 will always appear in the initial ones because we know this
@@ -660,7 +660,7 @@ def _nearest_neighbour_indices_ndcoords(cube, sample_points, cache=None):
for c, (coord, coord_dims) in enumerate(
sample_space_coords_and_dims
):
- # Index of this datum along this coordinate (could be nD).
+ # Index of this datum along this coordinate (could be n-D).
if coord_dims:
keys = tuple(ndi[ind] for ind in coord_dims)
else:
diff --git a/lib/iris/common/metadata.py b/lib/iris/common/metadata.py
index cb3149fe58..7def79f51e 100644
--- a/lib/iris/common/metadata.py
+++ b/lib/iris/common/metadata.py
@@ -52,7 +52,7 @@
def hexdigest(item):
"""
- Calculate a hexidecimal string hash representation of the provided item.
+ Calculate a hexadecimal string hash representation of the provided item.
Calculates a 64-bit non-cryptographic hash of the provided item, using
the extremely fast ``xxhash`` hashing algorithm, and returns the hexdigest
@@ -67,7 +67,7 @@ def hexdigest(item):
The item that requires to have its hexdigest calculated.
Returns:
- The string hexidecimal representation of the item's 64-bit hash.
+ The string hexadecimal representation of the item's 64-bit hash.
"""
# Special case: deal with numpy arrays.
diff --git a/lib/iris/common/resolve.py b/lib/iris/common/resolve.py
index a0c97dfc00..8d5d57d4a4 100644
--- a/lib/iris/common/resolve.py
+++ b/lib/iris/common/resolve.py
@@ -144,7 +144,7 @@ class Resolve:
forecast_reference_time 1859-09-01 06:00:00
height 1.5 m
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'A1B'
@@ -162,7 +162,7 @@ class Resolve:
height 1.5 m
time 1860-06-01 00:00:00, bound=(1859-12-01 00:00:00, 1860-12-01 00:00:00)
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'E1'
@@ -185,7 +185,7 @@ class Resolve:
forecast_reference_time 1859-09-01 06:00:00
height 1.5 m
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
STASH m01s03i236
@@ -726,7 +726,7 @@ def _create_prepared_item(
If container or type(coord) is DimCoord/AuxCoord (i.e. not
MeshCoord), then points+bounds define the built AuxCoord/DimCoord.
- Theses points+bounds come either from those args, or the 'coord'.
+ These points+bounds come either from those args, or the 'coord'.
Alternatively, when container or type(coord) is MeshCoord, then
     points==bounds==None and the prepared item contains
mesh/location/axis properties for the resulting MeshCoord.
@@ -1014,7 +1014,7 @@ def _assign_mapping(extent, unmapped_local_items, free_items=None):
# Map to the first available unmapped local dimension or
# the first available free dimension.
# Dimension shape doesn't matter here as the extent is 1,
- # therefore broadcasting will take care of any discrepency
+ # therefore broadcasting will take care of any discrepancy
# between src and tgt dimension extent.
if unmapped_local_items:
result, _ = unmapped_local_items.pop(0)
@@ -2542,7 +2542,7 @@ def mapped(self):
forecast_reference_time 1859-09-01 06:00:00
height 1.5 m
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'A1B'
@@ -2559,7 +2559,7 @@ def mapped(self):
height 1.5 m
time 1860-06-01 00:00:00, bound=(1859-12-01 00:00:00, 1860-12-01 00:00:00)
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'E1'
@@ -2610,7 +2610,7 @@ def shape(self):
forecast_reference_time 1859-09-01 06:00:00
height 1.5 m
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'A1B'
@@ -2627,7 +2627,7 @@ def shape(self):
height 1.5 m
time 1860-06-01 00:00:00, bound=(1859-12-01 00:00:00, 1860-12-01 00:00:00)
Cell methods:
- mean time (6 hour)
+ 0 time: mean (interval: 6 hour)
Attributes:
Conventions 'CF-1.5'
Model scenario 'E1'
diff --git a/lib/iris/config.py b/lib/iris/config.py
index 3659ac7dcd..79d141e53f 100644
--- a/lib/iris/config.py
+++ b/lib/iris/config.py
@@ -171,8 +171,7 @@ def get_dir_option(section, option, default=None):
)
# Override the data repository if the appropriate environment variable
-# has been set. This is used in setup.py in the TestRunner command to
-# enable us to simulate the absence of external data.
+# has been set.
override = os.environ.get("OVERRIDE_TEST_DATA_REPOSITORY")
if override:
TEST_DATA_DIR = None
diff --git a/lib/iris/coords.py b/lib/iris/coords.py
index 91bb786ae8..63bc524637 100644
--- a/lib/iris/coords.py
+++ b/lib/iris/coords.py
@@ -10,10 +10,10 @@
from abc import ABCMeta, abstractmethod
from collections import namedtuple
-from collections.abc import Container, Iterator
+from collections.abc import Container
import copy
from functools import lru_cache
-from itertools import chain, zip_longest
+from itertools import zip_longest
import operator
import warnings
import zlib
@@ -1218,10 +1218,6 @@ def __new__(
BOUND_POSITION_END = 1
-# Private named tuple class for coordinate groups.
-_GroupbyItem = namedtuple("GroupbyItem", "groupby_point, groupby_slice")
-
-
def _get_2d_coord_bound_grid(bounds):
"""
Creates a grid using the bounds of a 2D coordinate with 4 sided cells.
@@ -1936,11 +1932,12 @@ def _discontiguity_in_bounds(self, rtol=1e-5, atol=1e-8):
* contiguous: (boolean)
True if there are no discontiguities.
* diffs: (array or tuple of arrays)
- The diffs along the bounds of the coordinate. If self is a 2D
- coord of shape (Y, X), a tuple of arrays is returned, where the
- first is an array of differences along the x-axis, of the shape
- (Y, X-1) and the second is an array of differences along the
- y-axis, of the shape (Y-1, X).
+ A boolean array or tuple of boolean arrays which are true where
+ there are discontiguities between neighbouring bounds. If self is
+ a 2D coord of shape (Y, X), a pair of arrays is returned, where
+ the first is an array of differences along the x-axis, of the
+ shape (Y, X-1) and the second is an array of differences along
+ the y-axis, of the shape (Y-1, X).
"""
self._sanity_check_bounds()
@@ -1949,7 +1946,9 @@ def _discontiguity_in_bounds(self, rtol=1e-5, atol=1e-8):
contiguous = np.allclose(
self.bounds[1:, 0], self.bounds[:-1, 1], rtol=rtol, atol=atol
)
- diffs = np.abs(self.bounds[:-1, 1] - self.bounds[1:, 0])
+ diffs = ~np.isclose(
+ self.bounds[1:, 0], self.bounds[:-1, 1], rtol=rtol, atol=atol
+ )
elif self.ndim == 2:
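The reworked 1-D branch above reports *where* neighbouring bounds fail to meet, rather than their raw differences; a minimal sketch with hypothetical bounds:

```python
import numpy as np

# Bounds of three 1-D cells; the upper bound of the 2nd cell (2.0) does
# not meet the lower bound of the 3rd (2.5).
bounds = np.array([[0.0, 1.0], [1.0, 2.0], [2.5, 3.0]])

diffs = ~np.isclose(bounds[1:, 0], bounds[:-1, 1], rtol=1e-5, atol=1e-8)
# -> array([False,  True]) : one discontiguity, between cells 2 and 3
```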
@@ -1957,31 +1956,55 @@ def mod360_adjust(compare_axis):
bounds = self.bounds.copy()
if compare_axis == "x":
- upper_bounds = bounds[:, :-1, 1]
- lower_bounds = bounds[:, 1:, 0]
+ # Extract the pairs of upper bounds and lower bounds which
+ # connect along the "x" axis. These connect along indices
+ # as shown by the following diagram:
+ #
+ # 3---2 + 3---2
+ # | | | |
+ # 0---1 + 0---1
+ upper_bounds = np.stack(
+ (bounds[:, :-1, 1], bounds[:, :-1, 2])
+ )
+ lower_bounds = np.stack(
+ (bounds[:, 1:, 0], bounds[:, 1:, 3])
+ )
elif compare_axis == "y":
- upper_bounds = bounds[:-1, :, 3]
- lower_bounds = bounds[1:, :, 0]
+ # Extract the pairs of upper bounds and lower bounds which
+ # connect along the "y" axis. These connect along indices
+ # as shown by the following diagram:
+ #
+ # 3---2
+ # | |
+ # 0---1
+ # + +
+ # 3---2
+ # | |
+ # 0---1
+ upper_bounds = np.stack(
+ (bounds[:-1, :, 3], bounds[:-1, :, 2])
+ )
+ lower_bounds = np.stack(
+ (bounds[1:, :, 0], bounds[1:, :, 1])
+ )
if self.name() in ["longitude", "grid_longitude"]:
# If longitude, adjust for longitude wrapping
diffs = upper_bounds - lower_bounds
- index = diffs > 180
+ index = np.abs(diffs) > 180
if index.any():
sign = np.sign(diffs)
modification = (index.astype(int) * 360) * sign
upper_bounds -= modification
- diffs_between_cells = np.abs(upper_bounds - lower_bounds)
- cell_size = lower_bounds - upper_bounds
- diffs_along_axis = diffs_between_cells > (
- atol + rtol * cell_size
+ diffs_along_bounds = ~np.isclose(
+ upper_bounds, lower_bounds, rtol=rtol, atol=atol
)
-
- points_close_enough = diffs_along_axis <= (
- atol + rtol * cell_size
+ diffs_along_axis = np.logical_or(
+ diffs_along_bounds[0], diffs_along_bounds[1]
)
- contiguous_along_axis = np.all(points_close_enough)
+
+ contiguous_along_axis = ~np.any(diffs_along_axis)
return diffs_along_axis, contiguous_along_axis
diffs_along_x, match_cell_x1 = mod360_adjust(compare_axis="x")
@@ -3078,23 +3101,23 @@ def __init__(self, method, coords=None, intervals=None, comments=None):
def __str__(self):
"""Return a custom string representation of CellMethod"""
# Group related coord names intervals and comments together
- cell_components = zip_longest(
- self.coord_names, self.intervals, self.comments, fillvalue=""
+ coord_string = " ".join([f"{coord}:" for coord in self.coord_names])
+ method_string = str(self.method)
+ interval_string = " ".join(
+ [f"interval: {interval}" for interval in self.intervals]
)
+ comment_string = " ".join([comment for comment in self.comments])
- collection_summaries = []
- cm_summary = "%s: " % self.method
-
- for coord_name, interval, comment in cell_components:
- other_info = ", ".join(filter(None, chain((interval, comment))))
- if other_info:
- coord_summary = "%s (%s)" % (coord_name, other_info)
- else:
- coord_summary = "%s" % coord_name
+ if interval_string and comment_string:
+ comment_string = "".join(
+ [f" comment: {comment}" for comment in self.comments]
+ )
+ cm_summary = f"{coord_string} {method_string}"
- collection_summaries.append(coord_summary)
+ if interval_string or comment_string:
+ cm_summary += f" ({interval_string}{comment_string})"
- return cm_summary + ", ".join(collection_summaries)
+ return cm_summary
def __add__(self, other):
# Disable the default tuple behaviour of tuple concatenation
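The reworked `__str__` produces the `coord: method (interval/comment)` form seen in the updated doctests elsewhere in this diff; a minimal sketch:

```python
from iris.coords import CellMethod

print(CellMethod("mean", coords="time", intervals="1 hour"))
# -> time: mean (interval: 1 hour)

print(CellMethod("mean", coords=("month", "year")))
# -> month: year: mean
```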
@@ -3131,26 +3154,3 @@ def xml_element(self, doc):
cellMethod_xml_element.appendChild(coord_xml_element)
return cellMethod_xml_element
-
-
-# See ExplicitCoord._group() for the description/context.
-class _GroupIterator(Iterator):
- def __init__(self, points):
- self._points = points
- self._start = 0
-
- def __next__(self):
- num_points = len(self._points)
- if self._start >= num_points:
- raise StopIteration
-
- stop = self._start + 1
- m = self._points[self._start]
- while stop < num_points and self._points[stop] == m:
- stop += 1
-
- group = _GroupbyItem(m, slice(self._start, stop))
- self._start = stop
- return group
-
- next = __next__
diff --git a/lib/iris/cube.py b/lib/iris/cube.py
index abe37c35fb..4c52303b2f 100644
--- a/lib/iris/cube.py
+++ b/lib/iris/cube.py
@@ -28,6 +28,7 @@
import iris._lazy_data as _lazy
import iris._merge
import iris.analysis
+from iris.analysis import _Weights
from iris.analysis.cartography import wrap_lons
import iris.analysis.maths
import iris.aux_factory
@@ -541,6 +542,7 @@ def concatenate_cube(
check_aux_coords=True,
check_cell_measures=True,
check_ancils=True,
+ check_derived_coords=True,
):
"""
Return the concatenated contents of the :class:`CubeList` as a single
@@ -553,20 +555,30 @@ def concatenate_cube(
Kwargs:
* check_aux_coords
- Checks the auxiliary coordinates of the cubes match. This check
- is not applied to auxiliary coordinates that span the dimension
- the concatenation is occurring along. Defaults to True.
+ Checks if the points and bounds of auxiliary coordinates of the
+ cubes match. This check is not applied to auxiliary coordinates
+ that span the dimension the concatenation is occurring along.
+ Defaults to True.
* check_cell_measures
- Checks the cell measures of the cubes match. This check
- is not applied to cell measures that span the dimension
- the concatenation is occurring along. Defaults to True.
+ Checks if the data of cell measures of the cubes match. This check
+ is not applied to cell measures that span the dimension the
+ concatenation is occurring along. Defaults to True.
* check_ancils
- Checks the ancillary variables of the cubes match. This check
- is not applied to ancillary variables that span the dimension
+ Checks if the data of ancillary variables of the cubes match. This
+ check is not applied to ancillary variables that span the dimension
the concatenation is occurring along. Defaults to True.
+ * check_derived_coords
+ Checks if the points and bounds of derived coordinates of the cubes
+ match. This check is not applied to derived coordinates that span
+ the dimension the concatenation is occurring along. Note that
+ differences in scalar coordinates and dimensional coordinates used
+ to derive the coordinate are still checked. Checks for auxiliary
+ coordinates used to derive the coordinates can be ignored with
+ `check_aux_coords`. Defaults to True.
+
.. note::
Concatenation cannot occur along an anonymous dimension.
@@ -586,6 +598,7 @@ def concatenate_cube(
check_aux_coords=check_aux_coords,
check_cell_measures=check_cell_measures,
check_ancils=check_ancils,
+ check_derived_coords=check_derived_coords,
)
n_res_cubes = len(res)
if n_res_cubes == 1:
@@ -612,6 +625,7 @@ def concatenate(
check_aux_coords=True,
check_cell_measures=True,
check_ancils=True,
+ check_derived_coords=True,
):
"""
Concatenate the cubes over their common dimensions.
@@ -619,20 +633,30 @@ def concatenate(
Kwargs:
* check_aux_coords
- Checks the auxiliary coordinates of the cubes match. This check
- is not applied to auxiliary coordinates that span the dimension
- the concatenation is occurring along. Defaults to True.
+ Checks if the points and bounds of auxiliary coordinates of the
+ cubes match. This check is not applied to auxiliary coordinates
+ that span the dimension the concatenation is occurring along.
+ Defaults to True.
* check_cell_measures
- Checks the cell measures of the cubes match. This check
- is not applied to cell measures that span the dimension
- the concatenation is occurring along. Defaults to True.
+ Checks if the data of cell measures of the cubes match. This check
+ is not applied to cell measures that span the dimension the
+ concatenation is occurring along. Defaults to True.
* check_ancils
- Checks the ancillary variables of the cubes match. This check
- is not applied to ancillary variables that span the dimension
+ Checks if the data of ancillary variables of the cubes match. This
+ check is not applied to ancillary variables that span the dimension
the concatenation is occurring along. Defaults to True.
+ * check_derived_coords
+ Checks if the points and bounds of derived coordinates of the cubes
+ match. This check is not applied to derived coordinates that span
+ the dimension the concatenation is occurring along. Note that
+ differences in scalar coordinates and dimensional coordinates used
+ to derive the coordinate are still checked. Checks for auxiliary
+ coordinates used to derive the coordinates can be ignored with
+ `check_aux_coords`. Defaults to True.
+
Returns:
A new :class:`iris.cube.CubeList` of concatenated
:class:`iris.cube.Cube` instances.
@@ -717,6 +741,7 @@ def concatenate(
check_aux_coords=check_aux_coords,
check_cell_measures=check_cell_measures,
check_ancils=check_ancils,
+ check_derived_coords=check_derived_coords,
)
def realise_data(self):
@@ -787,8 +812,8 @@ class Cube(CFVariableMixin):
time \
1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
Cell methods:
- mean within years time
- mean over years time
+ 0 time: mean within years
+ 1 time: mean over years
Attributes:
STASH m01s16i203
source 'Data from Met Office Unified Model'
@@ -3721,9 +3746,15 @@ def collapsed(self, coords, aggregator, **kwargs):
sum :data:`~iris.analysis.SUM`.
Weighted aggregations support an optional *weights* keyword argument.
- If set, this should be supplied as an array of weights whose shape
- matches the cube. Values for latitude-longitude area weights may be
- calculated using :func:`iris.analysis.cartography.area_weights`.
+ If set, this can be supplied as an array, cube, or (names of)
+ :meth:`~iris.cube.Cube.coords`, :meth:`~iris.cube.Cube.cell_measures`,
+ or :meth:`~iris.cube.Cube.ancillary_variables`. In all cases, the
+ weights should be 1d (for collapsing over a 1d coordinate) or match the
+ shape of the cube. When weights are not given as arrays, units are
+ correctly handled for weighted sums, i.e., the original unit of the
+ cube is multiplied by the units of the weights. Values for
+ latitude-longitude area weights may be calculated using
+ :func:`iris.analysis.cartography.area_weights`.
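For example, area weights can now be passed either as an array or by name; a minimal sketch (this assumes the `iris-sample-data` package is installed, and a hypothetical cell measure name for the second form):

```python
import iris
import iris.analysis
from iris.analysis.cartography import area_weights

cube = iris.load_cube(iris.sample_data_path("air_temp.pp"))
for name in ("latitude", "longitude"):
    if not cube.coord(name).has_bounds():
        cube.coord(name).guess_bounds()

# Explicit array weights, as before:
mean = cube.collapsed(
    ["latitude", "longitude"], iris.analysis.MEAN, weights=area_weights(cube)
)

# New: weights may instead name an attached coordinate, cell measure or
# ancillary variable, e.g. weights="cell_area" (hypothetical cell measure).
```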
Some Iris aggregators support "lazy" evaluation, meaning that
cubes resulting from this method may represent data arrays which are
@@ -3768,8 +3799,8 @@ def collapsed(self, coords, aggregator, **kwargs):
longitude \
180.0 degrees, bound=(0.0, 360.0) degrees
Cell methods:
- mean month, year
- mean longitude
+ 0 month: year: mean
+ 1 longitude: mean
Attributes:
Conventions 'CF-1.5'
STASH m01s00i024
@@ -3802,6 +3833,10 @@ def collapsed(self, coords, aggregator, **kwargs):
cube.collapsed(['latitude', 'longitude'],
iris.analysis.VARIANCE)
"""
+ # Update weights kwargs (if necessary) to handle different types of
+ # weights
+ _Weights.update_kwargs(kwargs, self)
+
# Convert any coordinate names to coordinates
coords = self._as_list_of_coords(coords)
@@ -3970,10 +4005,14 @@ def aggregated_by(
also be supplied. These include :data:`~iris.analysis.MEAN` and
:data:`~iris.analysis.SUM`.
- Weighted aggregations support an optional *weights* keyword argument. If
- set, this should be supplied as an array of weights whose shape matches
- the cube or as 1D array whose length matches the dimension over which is
- aggregated.
+ Weighted aggregations support an optional *weights* keyword argument.
+ If set, this can be supplied as an array, cube, or (names of)
+ :meth:`~iris.cube.Cube.coords`, :meth:`~iris.cube.Cube.cell_measures`,
+ or :meth:`~iris.cube.Cube.ancillary_variables`. In all cases, the
+ weights should be 1d or match the shape of the cube. When weights are
+ not given as arrays, units are correctly handled for weighted sums,
+ i.e., the original unit of the cube is multiplied by the units of the
+ weights.
Parameters
----------
@@ -4025,13 +4064,17 @@ def aggregated_by(
Scalar coordinates:
forecast_period 0 hours
Cell methods:
- mean month, year
- mean year
+ 0 month: year: mean
+ 1 year: mean
Attributes:
Conventions 'CF-1.5'
STASH m01s00i024
"""
+ # Update weights kwargs (if necessary) to handle different types of
+ # weights
+ _Weights.update_kwargs(kwargs, self)
+
groupby_coords = []
dimension_to_groupby = None
@@ -4070,10 +4113,16 @@ def aggregated_by(
f"that is aggregated, got {len(weights):d}, expected "
f"{self.shape[dimension_to_groupby]:d}"
)
- weights = iris.util.broadcast_to_shape(
- weights,
- self.shape,
- (dimension_to_groupby,),
+
+ # iris.util.broadcast_to_shape does not preserve _Weights type
+ weights = _Weights(
+ iris.util.broadcast_to_shape(
+ weights,
+ self.shape,
+ (dimension_to_groupby,),
+ ),
+ self,
+ units=weights.units,
)
if weights.shape != self.shape:
raise ValueError(
@@ -4129,98 +4178,65 @@ def aggregated_by(
data_shape = list(self.shape + aggregator.aggregate_shape(**kwargs))
data_shape[dimension_to_groupby] = len(groupby)
- # Aggregate the group-by data.
+ # Choose appropriate data and functions for data aggregation.
if aggregator.lazy_func is not None and self.has_lazy_data():
- front_slice = (slice(None, None),) * dimension_to_groupby
- back_slice = (slice(None, None),) * (
- len(data_shape) - dimension_to_groupby - 1
- )
+ stack = da.stack
+ input_data = self.lazy_data()
+ agg_method = aggregator.lazy_aggregate
+ else:
+ input_data = self.data
+ # Note numpy.stack does not preserve masks.
+ stack = ma.stack if ma.isMaskedArray(input_data) else np.stack
+ agg_method = aggregator.aggregate
+
+ # Create data and weights slices.
+ front_slice = (slice(None),) * dimension_to_groupby
+ back_slice = (slice(None),) * (
+ len(data_shape) - dimension_to_groupby - 1
+ )
+
+ groupby_subarrs = map(
+ lambda groupby_slice: iris.util._slice_data_with_keys(
+ input_data, front_slice + (groupby_slice,) + back_slice
+ )[1],
+ groupby.group(),
+ )
- # Create cube and weights slices
- groupby_subcubes = map(
- lambda groupby_slice: self[
+ if weights is not None:
+ groupby_subweights = map(
+ lambda groupby_slice: weights[
front_slice + (groupby_slice,) + back_slice
- ].lazy_data(),
+ ],
groupby.group(),
)
- if weights is not None:
- groupby_subweights = map(
- lambda groupby_slice: weights[
- front_slice + (groupby_slice,) + back_slice
- ],
- groupby.group(),
- )
- else:
- groupby_subweights = (None for _ in range(len(groupby)))
+ else:
+ groupby_subweights = (None for _ in range(len(groupby)))
- agg = iris.analysis.create_weighted_aggregator_fn(
- aggregator.lazy_aggregate, axis=dimension_to_groupby, **kwargs
+ # Aggregate data slices.
+ agg = iris.analysis.create_weighted_aggregator_fn(
+ agg_method, axis=dimension_to_groupby, **kwargs
+ )
+ result = list(map(agg, groupby_subarrs, groupby_subweights))
+
+ # If weights are returned, "result" is a list of tuples (each tuple
+ # contains two elements; the first is the aggregated data, the
+ # second is the aggregated weights). Convert these to two lists
+ # (one for the aggregated data and one for the aggregated weights)
+ # before combining the different slices.
+ if return_weights:
+ result, weights_result = list(zip(*result))
+ aggregateby_weights = stack(
+ weights_result, axis=dimension_to_groupby
)
- result = list(map(agg, groupby_subcubes, groupby_subweights))
-
- # If weights are returned, "result" is a list of tuples (each tuple
- # contains two elements; the first is the aggregated data, the
- # second is the aggregated weights). Convert these to two lists
- # (one for the aggregated data and one for the aggregated weights)
- # before combining the different slices.
- if return_weights:
- result, weights_result = list(zip(*result))
- aggregateby_weights = da.stack(
- weights_result, axis=dimension_to_groupby
- )
- else:
- aggregateby_weights = None
- aggregateby_data = da.stack(result, axis=dimension_to_groupby)
else:
- cube_slice = [slice(None, None)] * len(data_shape)
- for i, groupby_slice in enumerate(groupby.group()):
- # Slice the cube with the group-by slice to create a group-by
- # sub-cube.
- cube_slice[dimension_to_groupby] = groupby_slice
- groupby_sub_cube = self[tuple(cube_slice)]
-
- # Slice the weights
- if weights is not None:
- groupby_sub_weights = weights[tuple(cube_slice)]
- kwargs["weights"] = groupby_sub_weights
-
- # Perform the aggregation over the group-by sub-cube and
- # repatriate the aggregated data into the aggregate-by cube
- # data. If weights are also returned, handle them separately.
- result = aggregator.aggregate(
- groupby_sub_cube.data, axis=dimension_to_groupby, **kwargs
- )
- if return_weights:
- weights_result = result[1]
- result = result[0]
- else:
- weights_result = None
-
- # Determine aggregation result data type for the aggregate-by
- # cube data on first pass.
- if i == 0:
- if ma.isMaskedArray(self.data):
- aggregateby_data = ma.zeros(
- data_shape, dtype=result.dtype
- )
- else:
- aggregateby_data = np.zeros(
- data_shape, dtype=result.dtype
- )
- if weights_result is not None:
- aggregateby_weights = np.zeros(
- data_shape, dtype=weights_result.dtype
- )
- else:
- aggregateby_weights = None
- cube_slice[dimension_to_groupby] = i
- aggregateby_data[tuple(cube_slice)] = result
- if weights_result is not None:
- aggregateby_weights[tuple(cube_slice)] = weights_result
+ aggregateby_weights = None
- # Restore original weights.
- if weights is not None:
- kwargs["weights"] = weights
+ aggregateby_data = stack(result, axis=dimension_to_groupby)
+ # Ensure plain ndarray is output if plain ndarray was input.
+ if ma.isMaskedArray(aggregateby_data) and not ma.isMaskedArray(
+ input_data
+ ):
+ aggregateby_data = ma.getdata(aggregateby_data)
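The `zip(*result)` step above splits the per-group `(data, weights)` pairs into two aligned sequences; in isolation:

```python
# Stand-ins for three aggregated groups, each a (data, weights) pair.
result = [(1.0, 10.0), (2.0, 20.0), (3.0, 30.0)]

data_parts, weight_parts = zip(*result)
assert data_parts == (1.0, 2.0, 3.0)
assert weight_parts == (10.0, 20.0, 30.0)
```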
# Add the aggregation meta data to the aggregate-by cube.
aggregator.update_metadata(
@@ -4289,8 +4305,11 @@ def rolling_window(self, coord, aggregator, window, **kwargs):
* kwargs:
Aggregator and aggregation function keyword arguments. The weights
- argument to the aggregator, if any, should be a 1d array with the
- same length as the chosen window.
+ argument to the aggregator, if any, should be a 1d array, cube, or
+ (names of) :meth:`~iris.cube.Cube.coords`,
+ :meth:`~iris.cube.Cube.cell_measures`, or
+ :meth:`~iris.cube.Cube.ancillary_variables` with the same length as
+ the chosen window.
Returns:
:class:`iris.cube.Cube`.
@@ -4321,7 +4340,7 @@ def rolling_window(self, coord, aggregator, window, **kwargs):
forecast_reference_time 2011-07-23 00:00:00
realization 10
Cell methods:
- mean time (1 hour)
+ 0 time: mean (interval: 1 hour)
Attributes:
STASH m01s00i024
source \
@@ -4346,8 +4365,8 @@ def rolling_window(self, coord, aggregator, window, **kwargs):
forecast_reference_time 2011-07-23 00:00:00
realization 10
Cell methods:
- mean time (1 hour)
- mean time
+ 0 time: mean (interval: 1 hour)
+ 1 time: mean
Attributes:
STASH m01s00i024
source \
@@ -4358,6 +4377,10 @@ def rolling_window(self, coord, aggregator, window, **kwargs):
possible windows of size 3 from the original cube.
"""
+ # Update weights kwargs (if necessary) to handle different types of
+ # weights
+ _Weights.update_kwargs(kwargs, self)
+
coord = self._as_list_of_coords(coord)[0]
if getattr(coord, "circular", False):
@@ -4459,8 +4482,14 @@ def rolling_window(self, coord, aggregator, window, **kwargs):
"as the window."
)
kwargs = dict(kwargs)
- kwargs["weights"] = iris.util.broadcast_to_shape(
- weights, rolling_window_data.shape, (dimension + 1,)
+
+ # iris.util.broadcast_to_shape does not preserve _Weights type
+ kwargs["weights"] = _Weights(
+ iris.util.broadcast_to_shape(
+ weights, rolling_window_data.shape, (dimension + 1,)
+ ),
+ self,
+ units=weights.units,
)
data_result = aggregator.aggregate(
rolling_window_data, axis=dimension + 1, **kwargs
diff --git a/lib/iris/experimental/regrid.py b/lib/iris/experimental/regrid.py
index 7c5d8e99cc..76c6002d2b 100644
--- a/lib/iris/experimental/regrid.py
+++ b/lib/iris/experimental/regrid.py
@@ -295,23 +295,23 @@ def __init__(self, src_cube, tgt_grid_cube, method, projection=None):
if src_x_coord.coord_system != src_y_coord.coord_system:
raise ValueError(
"'src_cube' lateral geographic coordinates have "
- "differing coordinate sytems."
+ "differing coordinate systems."
)
if src_x_coord.coord_system is None:
raise ValueError(
"'src_cube' lateral geographic coordinates have "
- "no coordinate sytem."
+ "no coordinate system."
)
tgt_x_coord, tgt_y_coord = get_xy_dim_coords(tgt_grid_cube)
if tgt_x_coord.coord_system != tgt_y_coord.coord_system:
raise ValueError(
"'tgt_grid_cube' lateral geographic coordinates "
- "have differing coordinate sytems."
+ "have differing coordinate systems."
)
if tgt_x_coord.coord_system is None:
raise ValueError(
"'tgt_grid_cube' lateral geographic coordinates "
- "have no coordinate sytem."
+ "have no coordinate system."
)
if projection is None:
@@ -572,12 +572,12 @@ def __call__(self, src_cube):
if src_x_coord.coord_system != src_y_coord.coord_system:
raise ValueError(
"'src' lateral geographic coordinates have "
- "differing coordinate sytems."
+ "differing coordinate systems."
)
if src_cs is None:
raise ValueError(
"'src' lateral geographic coordinates have "
- "no coordinate sytem."
+ "no coordinate system."
)
# Check the source grid units.
diff --git a/lib/iris/experimental/ugrid/load.py b/lib/iris/experimental/ugrid/load.py
index a522d91313..cfa3935991 100644
--- a/lib/iris/experimental/ugrid/load.py
+++ b/lib/iris/experimental/ugrid/load.py
@@ -209,7 +209,8 @@ def load_meshes(uris, var_name=None):
result = {}
for source in valid_sources:
- meshes_dict = _meshes_from_cf(CFUGridReader(source))
+ with CFUGridReader(source) as cf_reader:
+ meshes_dict = _meshes_from_cf(cf_reader)
meshes = list(meshes_dict.values())
if var_name is not None:
meshes = list(filter(lambda m: m.var_name == var_name, meshes))
diff --git a/lib/iris/experimental/ugrid/mesh.py b/lib/iris/experimental/ugrid/mesh.py
index 0d566da73f..af557c345c 100644
--- a/lib/iris/experimental/ugrid/mesh.py
+++ b/lib/iris/experimental/ugrid/mesh.py
@@ -2855,7 +2855,7 @@ def __init__(
# N.B. at present, coords in a Mesh are stored+accessed by 'axis', which
# means they must have a standard_name. So ...
- # (a) the 'location' (face/edge) coord *always* has a useable phenomenon
+ # (a) the 'location' (face/edge) coord *always* has a usable phenomenon
# identity.
# (b) we still want to check that location+node coords have the same
# phenomenon (i.e. physical meaning identity + units), **but** ...
diff --git a/lib/iris/experimental/ugrid/utils.py b/lib/iris/experimental/ugrid/utils.py
index 4efab6490b..a13a43d3fd 100644
--- a/lib/iris/experimental/ugrid/utils.py
+++ b/lib/iris/experimental/ugrid/utils.py
@@ -220,7 +220,7 @@ def recombine_submeshes(
# Use the mesh_dim to transpose inputs + outputs, if required, as it is
# simpler for all the array operations to always have the mesh dim *last*.
if mesh_dim == mesh_cube.ndim - 1:
- # Mesh dim is already the last one : no tranpose required
+ # Mesh dim is already the last one : no transpose required
untranspose_dims = None
else:
dim_range = np.arange(mesh_cube.ndim, dtype=int)
diff --git a/lib/iris/fileformats/__init__.py b/lib/iris/fileformats/__init__.py
index 96a848deb0..86b304b82c 100644
--- a/lib/iris/fileformats/__init__.py
+++ b/lib/iris/fileformats/__init__.py
@@ -9,6 +9,7 @@
"""
from iris.io.format_picker import (
+ DataSourceObjectProtocol,
FileExtension,
FormatAgent,
FormatSpecification,
@@ -125,16 +126,34 @@ def _load_grib(*args, **kwargs):
)
-_nc_dap = FormatSpecification(
- "NetCDF OPeNDAP",
- UriProtocol(),
- lambda protocol: protocol in ["http", "https"],
- netcdf.load_cubes,
- priority=6,
- constraint_aware_handler=True,
+FORMAT_AGENT.add_spec(
+ FormatSpecification(
+ "NetCDF OPeNDAP",
+ UriProtocol(),
+ lambda protocol: protocol in ["http", "https"],
+ netcdf.load_cubes,
+ priority=6,
+ constraint_aware_handler=True,
+ )
+)
+
+# NetCDF file presented as an open, readable netCDF4 dataset (or mimic).
+FORMAT_AGENT.add_spec(
+ FormatSpecification(
+ "NetCDF dataset",
+ DataSourceObjectProtocol(),
+ lambda object: all(
+ hasattr(object, x)
+ for x in ("variables", "dimensions", "groups", "ncattrs")
+ ),
+ # Note: this uses the same call as the above "NetCDF_v4" (and "NetCDF OPeNDAP")
+ # The handler itself needs to detect what is passed + handle it appropriately.
+ netcdf.load_cubes,
+ priority=4,
+ constraint_aware_handler=True,
+ )
)
-FORMAT_AGENT.add_spec(_nc_dap)
-del _nc_dap
+
#
# UM Fieldsfiles.
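With the new `DataSourceObjectProtocol` spec, the format picker can recognise an already-open dataset as well as a filepath; a minimal sketch (hypothetical filename, and assuming `iris.load` forwards such objects to the picker, as this spec suggests):

```python
import iris
import netCDF4

# Any object exposing variables/dimensions/groups/ncattrs duck-types here.
ds = netCDF4.Dataset("my_data.nc")
cubes = iris.load(ds)
```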
diff --git a/lib/iris/fileformats/_nc_load_rules/helpers.py b/lib/iris/fileformats/_nc_load_rules/helpers.py
index 35163c47d5..bbf9c660c5 100644
--- a/lib/iris/fileformats/_nc_load_rules/helpers.py
+++ b/lib/iris/fileformats/_nc_load_rules/helpers.py
@@ -13,6 +13,8 @@
build routines, and which it does not use.
"""
+import re
+from typing import List
import warnings
import cf_units
@@ -28,10 +30,6 @@
import iris.exceptions
import iris.fileformats.cf as cf
import iris.fileformats.netcdf
-from iris.fileformats.netcdf import (
- UnknownCellMethodWarning,
- parse_cell_methods,
-)
from iris.fileformats.netcdf.loader import _get_cf_var_data
import iris.std_names
import iris.util
@@ -184,6 +182,210 @@
CF_VALUE_STD_NAME_PROJ_Y = "projection_y_coordinate"
+################################################################################
+# Handling of cell-methods.
+
+_CM_COMMENT = "comment"
+_CM_EXTRA = "extra"
+_CM_INTERVAL = "interval"
+_CM_METHOD = "method"
+_CM_NAME = "name"
+_CM_PARSE_NAME = re.compile(r"([\w_]+\s*?:\s+)+")
+_CM_PARSE = re.compile(
+ r"""
+    (?P<name>([\w_]+\s*?:\s+)+)
+    (?P<method>[\w_\s]+(?![\w_]*\s*?:))\s*
+    (?:
+        \(\s*
+        (?P<extra>.+)
+        \)\s*
+    )?
+ """,
+ re.VERBOSE,
+)
+
+# Cell methods.
+_CM_KNOWN_METHODS = [
+ "point",
+ "sum",
+ "mean",
+ "maximum",
+ "minimum",
+ "mid_range",
+ "standard_deviation",
+ "variance",
+ "mode",
+ "median",
+]
+
+
+def _split_cell_methods(nc_cell_methods: str) -> List[re.Match]:
+ """
+ Split a CF cell_methods attribute string into a list of zero or more cell
+ methods, each of which is then parsed with a regex to return a list of match
+ objects.
+
+ Args:
+
+ * nc_cell_methods: The value of the cell methods attribute to be split.
+
+ Returns:
+
+ * nc_cell_methods_matches: A list of the re.Match objects associated with
+      each parsed cell method.
+
+ Splitting is done based on words followed by colons outside of any brackets.
+ Validation of anything other than being laid out in the expected format is
+ left to the calling function.
+ """
+
+ # Find name candidates
+ name_start_inds = []
+ for m in _CM_PARSE_NAME.finditer(nc_cell_methods):
+ name_start_inds.append(m.start())
+
+ # Remove those that fall inside brackets
+ bracket_depth = 0
+ for ind, cha in enumerate(nc_cell_methods):
+ if cha == "(":
+ bracket_depth += 1
+ elif cha == ")":
+ bracket_depth -= 1
+ if bracket_depth < 0:
+ msg = (
+ "Cell methods may be incorrectly parsed due to mismatched "
+ "brackets"
+ )
+ warnings.warn(msg, UserWarning, stacklevel=2)
+ if bracket_depth > 0 and ind in name_start_inds:
+ name_start_inds.remove(ind)
+
+ # List tuples of indices of starts and ends of the cell methods in the string
+ method_indices = []
+ for ii in range(len(name_start_inds) - 1):
+ method_indices.append((name_start_inds[ii], name_start_inds[ii + 1]))
+ method_indices.append((name_start_inds[-1], len(nc_cell_methods)))
+
+ # Index the string and match against each substring
+ nc_cell_methods_matches = []
+ for start_ind, end_ind in method_indices:
+ nc_cell_method_str = nc_cell_methods[start_ind:end_ind]
+ nc_cell_method_match = _CM_PARSE.match(nc_cell_method_str.strip())
+ if not nc_cell_method_match:
+ msg = (
+ f"Failed to fully parse cell method string: {nc_cell_methods}"
+ )
+ warnings.warn(msg, UserWarning, stacklevel=2)
+ continue
+ nc_cell_methods_matches.append(nc_cell_method_match)
+
+ return nc_cell_methods_matches
+
+
+class UnknownCellMethodWarning(Warning):
+ pass
+
+
+def parse_cell_methods(nc_cell_methods):
+ """
+ Parse a CF cell_methods attribute string into a tuple of zero or
+ more CellMethod instances.
+
+ Args:
+
+ * nc_cell_methods (str):
+ The value of the cell methods attribute to be parsed.
+
+ Returns:
+
+ * cell_methods
+ An iterable of :class:`iris.coords.CellMethod`.
+
+ Multiple coordinates, intervals and comments are supported.
+ If a method has a non-standard name a warning will be issued, but the
+ results are not affected.
+
+ """
+
+ cell_methods = []
+ if nc_cell_methods is not None:
+ for m in _split_cell_methods(nc_cell_methods):
+ d = m.groupdict()
+ method = d[_CM_METHOD]
+ method = method.strip()
+ # Check validity of method, allowing for multi-part methods
+ # e.g. mean over years.
+ method_words = method.split()
+ if method_words[0].lower() not in _CM_KNOWN_METHODS:
+ msg = "NetCDF variable contains unknown cell method {!r}"
+ warnings.warn(
+ msg.format("{}".format(method_words[0])),
+ UnknownCellMethodWarning,
+ )
+ d[_CM_METHOD] = method
+ name = d[_CM_NAME]
+ name = name.replace(" ", "")
+ name = name.rstrip(":")
+ d[_CM_NAME] = tuple([n for n in name.split(":")])
+ interval = []
+ comment = []
+ if d[_CM_EXTRA] is not None:
+ #
+ # tokenise the key words and field colon marker
+ #
+                d[_CM_EXTRA] = d[_CM_EXTRA].replace(
+                    "comment:", "<<comment>><<:>>"
+                )
+                d[_CM_EXTRA] = d[_CM_EXTRA].replace(
+                    "interval:", "<<interval>><<:>>"
+                )
+ d[_CM_EXTRA] = d[_CM_EXTRA].split("<<:>>")
+ if len(d[_CM_EXTRA]) == 1:
+ comment.extend(d[_CM_EXTRA])
+ else:
+ next_field_type = comment
+ for field in d[_CM_EXTRA]:
+ field_type = next_field_type
+                        index = field.rfind("<<interval>>")
+ if index == 0:
+ next_field_type = interval
+ continue
+ elif index > 0:
+ next_field_type = interval
+ else:
+                            index = field.rfind("<<comment>>")
+ if index == 0:
+ next_field_type = comment
+ continue
+ elif index > 0:
+ next_field_type = comment
+ if index != -1:
+ field = field[:index]
+ field_type.append(field.strip())
+ #
+ # cater for a shared interval over multiple axes
+ #
+ if len(interval):
+ if len(d[_CM_NAME]) != len(interval) and len(interval) == 1:
+ interval = interval * len(d[_CM_NAME])
+ #
+ # cater for a shared comment over multiple axes
+ #
+ if len(comment):
+ if len(d[_CM_NAME]) != len(comment) and len(comment) == 1:
+ comment = comment * len(d[_CM_NAME])
+ d[_CM_INTERVAL] = tuple(interval)
+ d[_CM_COMMENT] = tuple(comment)
+ cell_method = iris.coords.CellMethod(
+ d[_CM_METHOD],
+ coords=d[_CM_NAME],
+ intervals=d[_CM_INTERVAL],
+ comments=d[_CM_COMMENT],
+ )
+ cell_methods.append(cell_method)
+ return tuple(cell_methods)
+
+
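The relocated parser behaves as before; a minimal sketch of a round trip through the public re-export:

```python
from iris.fileformats.netcdf import parse_cell_methods

cms = parse_cell_methods("time: mean (interval: 1 hr) area: sum")
# -> roughly (CellMethod("mean", coords="time", intervals="1 hr"),
#             CellMethod("sum", coords="area"))
for cm in cms:
    print(cm)
```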
################################################################################
def build_cube_metadata(engine):
"""Add the standard meta data to the cube."""
@@ -347,7 +549,7 @@ def build_transverse_mercator_coordinate_system(engine, cf_grid_var):
cf_grid_var, CF_ATTR_GRID_SCALE_FACTOR_AT_CENT_MERIDIAN, None
)
- # The following accounts for the inconsistancy in the transverse
+ # The following accounts for the inconsistency in the transverse
# mercator description within the CF spec.
if longitude_of_central_meridian is None:
longitude_of_central_meridian = getattr(
@@ -670,7 +872,7 @@ def get_attr_units(cf_var, attributes):
):
attr_units = cf_units._NO_UNIT_STRING
- # Get any assoicated calendar for a time reference coordinate.
+ # Get any associated calendar for a time reference coordinate.
if cf_units.as_unit(attr_units).is_time_reference():
attr_calendar = getattr(cf_var, CF_ATTR_CALENDAR, None)
@@ -727,7 +929,7 @@ def get_cf_bounds_var(cf_coord_var):
attr_bounds = getattr(cf_coord_var, CF_ATTR_BOUNDS, None)
attr_climatology = getattr(cf_coord_var, CF_ATTR_CLIMATOLOGY, None)
- # Determine bounds, prefering standard bounds over climatology.
+ # Determine bounds, preferring standard bounds over climatology.
# NB. No need to raise a warning if the bounds/climatology
# variable is missing, as that will already have been done by
# iris.fileformats.cf.
@@ -1270,7 +1472,7 @@ def _is_rotated(engine, cf_name, cf_attr_value):
################################################################################
def is_rotated_latitude(engine, cf_name):
- """Determine whether the CF coodinate variable is rotated latitude."""
+ """Determine whether the CF coordinate variable is rotated latitude."""
return _is_rotated(engine, cf_name, CF_VALUE_STD_NAME_GRID_LAT)
diff --git a/lib/iris/fileformats/cf.py b/lib/iris/fileformats/cf.py
index a3a23dc323..2ed01846bd 100644
--- a/lib/iris/fileformats/cf.py
+++ b/lib/iris/fileformats/cf.py
@@ -20,10 +20,10 @@
import re
import warnings
-import netCDF4
import numpy as np
import numpy.ma as ma
+from iris.fileformats.netcdf import _thread_safe_nc
import iris.util
#
@@ -1043,15 +1043,25 @@ class CFReader:
# TODO: remove once iris.experimental.ugrid.CFUGridReader is folded in.
CFGroup = CFGroup
- def __init__(self, filename, warn=False, monotonic=False):
- self._dataset = None
- self._filename = os.path.expanduser(filename)
+ def __init__(self, file_source, warn=False, monotonic=False):
+ # Ensure safe operation for destructor, should init fail.
+ self._own_file = False
+ if isinstance(file_source, str):
+ # Create from filepath : open it + own it (=close when we die).
+ self._filename = os.path.expanduser(file_source)
+ self._dataset = _thread_safe_nc.DatasetWrapper(
+ self._filename, mode="r"
+ )
+ self._own_file = True
+ else:
+ # We have been passed an open dataset.
+ # We use it but don't own it (don't close it).
+ self._dataset = file_source
+ self._filename = self._dataset.filepath()
#: Collection of CF-netCDF variables associated with this netCDF file
self.cf_group = self.CFGroup()
- self._dataset = netCDF4.Dataset(self._filename, mode="r")
-
# Issue load optimisation warning.
if warn and self._dataset.file_format in [
"NETCDF3_CLASSIC",
@@ -1068,6 +1078,19 @@ def __init__(self, filename, warn=False, monotonic=False):
self._build_cf_groups()
self._reset()
+ def __enter__(self):
+ # Enable use as a context manager
+        # N.B. this **guarantees** closure of the file when the context is exited.
+ # Note: ideally, the class would not do so much work in the __init__ call, and
+ # would do all that here, after acquiring necessary permissions/locks.
+ # But for legacy reasons, we can't do that. So **effectively**, the context
+ # (in terms of access control) already started, when we created the object.
+ return self
+
+ def __exit__(self, exc_type, exc_value, traceback):
+ # When used as a context-manager, **always** close the file on exit.
+ self._close()
+
@property
def filename(self):
"""The file that the CFReader is reading."""
@@ -1294,10 +1317,15 @@ def _reset(self):
for nc_var_name in self._dataset.variables.keys():
self.cf_group[nc_var_name].cf_attrs_reset()
- def __del__(self):
+ def _close(self):
# Explicitly close dataset to prevent file remaining open.
- if self._dataset is not None:
+ if self._own_file and self._dataset is not None:
self._dataset.close()
+ self._dataset = None
+
+ def __del__(self):
+ # Be sure to close dataset when CFReader is destroyed / garbage-collected.
+ self._close()
def _getncattr(dataset, attr, default=None):
diff --git a/lib/iris/fileformats/name.py b/lib/iris/fileformats/name.py
index a0b799697d..9a779cc92d 100644
--- a/lib/iris/fileformats/name.py
+++ b/lib/iris/fileformats/name.py
@@ -8,7 +8,7 @@
def _get_NAME_loader(filename):
"""
- Return the approriate load function for a NAME file based
+ Return the appropriate load function for a NAME file based
on the contents of its header.
"""
diff --git a/lib/iris/fileformats/name_loaders.py b/lib/iris/fileformats/name_loaders.py
index b9b64a343e..0189a8806f 100644
--- a/lib/iris/fileformats/name_loaders.py
+++ b/lib/iris/fileformats/name_loaders.py
@@ -588,7 +588,7 @@ def _build_cell_methods(av_or_ints, coord):
Args:
* av_or_ints (iterable of strings):
- An iterable of strings containing the colummn heading entries
+ An iterable of strings containing the column heading entries
to be parsed.
* coord (string or :class:`iris.coords.Coord`):
The coordinate name (or :class:`iris.coords.Coord` instance)
@@ -1079,7 +1079,7 @@ def load_NAMEIII_version2(filename):
elif zunits == "Pa":
z_name = "air_pressure"
else:
- ValueError("Vertical coordinate unkown")
+            raise ValueError("Vertical coordinate unknown")
zindex = data.index(zgrid[0])
dim_coords.append("Z")
diff --git a/lib/iris/fileformats/netcdf/__init__.py b/lib/iris/fileformats/netcdf/__init__.py
index 505e173b0b..b696b200ff 100644
--- a/lib/iris/fileformats/netcdf/__init__.py
+++ b/lib/iris/fileformats/netcdf/__init__.py
@@ -18,6 +18,11 @@
# Note: *must* be done before importing from submodules, as they also use this !
logger = iris.config.get_logger(__name__)
+# Note: these probably shouldn't be public, but for now they are.
+from .._nc_load_rules.helpers import (
+ UnknownCellMethodWarning,
+ parse_cell_methods,
+)
from .loader import DEBUG, NetCDFDataProxy, load_cubes
from .saver import (
CF_CONVENTIONS_VERSION,
@@ -25,8 +30,6 @@
SPATIO_TEMPORAL_AXES,
CFNameCoordMap,
Saver,
- UnknownCellMethodWarning,
- parse_cell_methods,
save,
)
diff --git a/lib/iris/fileformats/netcdf/_dask_locks.py b/lib/iris/fileformats/netcdf/_dask_locks.py
new file mode 100644
index 0000000000..15ac117a8b
--- /dev/null
+++ b/lib/iris/fileformats/netcdf/_dask_locks.py
@@ -0,0 +1,140 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Module containing code to create locks enabling dask workers to co-operate.
+
+This matter is complicated by needing different solutions for different dask scheduler
+types, i.e. local 'threads' scheduler, local 'processes' or distributed.
+
+In any case, an "iris.fileformats.netcdf.saver.Saver" object contains a netCDF4.Dataset
+targeting an output file, and creates a Saver.file_write_lock object to serialise
+write-accesses to the file from dask tasks: all dask-task file writes go via a
+"iris.fileformats.netcdf.saver.NetCDFWriteProxy" object, which also contains a link
+to the Saver.file_write_lock, and uses it to prevent workers from fouling each other.
+
+For each chunk written, the NetCDFWriteProxy acquires the common per-file lock;
+opens a Dataset on the file; performs a write to the relevant variable; closes the
+Dataset and then releases the lock. This process is obviously very similar to what the
+NetCDFDataProxy does for reading lazy chunks.
+
+For a threaded scheduler, the Saver.file_write_lock is a simple threading.Lock(). The workers
+(threads) execute tasks which contain a NetCDFWriteProxy, as above. All of those
+contain the common lock, and this is simply **the same object** for all workers, since
+they share an address space.
+
+For a distributed scheduler, the Saver.file_write_lock is a `distributed.Lock()` which is
+identified with the output filepath. This is distributed to the workers by
+serialising the task function arguments, which will include the NetCDFWriteProxy.
+A worker behaves like a process, though it may execute on a remote machine. When a
+distributed.Lock is deserialised to reconstruct the worker task, this creates an object
+that communicates with the scheduler. These objects behave as a single common lock,
+as they all have the same string 'identity', so the scheduler implements inter-process
+communication so that they can mutually exclude each other.
+
+It is also *conceivable* that multiple processes could write to the same file in
+parallel, if the operating system supports it. However, this also requires that the
+libnetcdf C library is built with parallel access option, which is not common.
+With the "ordinary" libnetcdf build, a process which attempts to open for writing a file
+which is _already_ open for writing simply raises an access error.
+In any case, Iris netcdf saver will not support this mode of operation, at present.
+
+We don't currently support a local "processes" type scheduler. If we did, the
+behaviour should be very similar to a distributed scheduler. It would need to use some
+other serialisable shared-lock solution in place of 'distributed.Lock', which requires
+a distributed scheduler to function.
+
+"""
+import threading
+
+import dask.array
+import dask.base
+import dask.multiprocessing
+import dask.threaded
+
+
+# A dedicated error class, allowing filtering and testing of errors raised here.
+class DaskSchedulerTypeError(ValueError):
+ pass
+
+
+def dask_scheduler_is_distributed():
+ """Return whether a distributed.Client is active."""
+ # NOTE: this replicates logic in `dask.base.get_scheduler` : if a distributed client
+ # has been created + is still active, then the default scheduler will always be
+ # "distributed".
+ is_distributed = False
+ # NOTE: must still work when 'distributed' is not available.
+ try:
+ import distributed
+
+ client = distributed.get_client()
+ is_distributed = client is not None
+ except (ImportError, ValueError):
+ pass
+ return is_distributed
+
+
+def get_dask_array_scheduler_type():
+ """
+ Work out what type of scheduler an array.compute*() will use.
+
+ Returns one of 'distributed', 'threads' or 'processes'.
+ The return value is a valid argument for dask.config.set(scheduler=).
+ This cannot distinguish between distributed local and remote clusters -- both of
+ those simply return 'distributed'.
+
+ NOTE: this takes account of how dask is *currently* configured. It will be wrong
+ if the config changes before the compute actually occurs.
+
+ """
+ if dask_scheduler_is_distributed():
+ result = "distributed"
+ else:
+ # Call 'get_scheduler', which respects the config settings, but pass an array
+ # so we default to the default scheduler for that type of object.
+ trial_dask_array = dask.array.zeros(1)
+ get_function = dask.base.get_scheduler(collections=[trial_dask_array])
+ # Detect the ones which we recognise.
+ if get_function == dask.threaded.get:
+ result = "threads"
+ elif get_function == dask.local.get_sync:
+ result = "single-threaded"
+ elif get_function == dask.multiprocessing.get:
+ result = "processes"
+ else:
+ msg = f"Dask default scheduler for arrays is unrecognised : {get_function}"
+ raise DaskSchedulerTypeError(msg)
+
+ return result
+
+
+def get_worker_lock(identity: str):
+ """
+ Return a mutex Lock which can be shared by multiple Dask workers.
+
+ The type of Lock generated depends on the dask scheduler type, which must therefore
+ be set up before this is called.
+
+ """
+ scheduler_type = get_dask_array_scheduler_type()
+ if scheduler_type in ("threads", "single-threaded"):
+ # N.B. the "identity" string is never used in this case, as the same actual
+ # lock object is used by all workers.
+ lock = threading.Lock()
+ elif scheduler_type == "distributed":
+ from dask.distributed import Lock as DistributedLock
+
+ lock = DistributedLock(identity)
+ else:
+ msg = (
+ "The configured dask array scheduler type is "
+ f'"{scheduler_type}", '
+ "which is not supported by the Iris netcdf saver."
+ )
+ raise DaskSchedulerTypeError(msg)
+
+ # NOTE: not supporting 'processes' scheduler, for now.
+ return lock
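A minimal sketch of how the saver is expected to use this module (the path is hypothetical):

```python
from iris.fileformats.netcdf import _dask_locks

# The lock type depends on the *currently configured* dask scheduler, so
# configure the scheduler before requesting the lock.
lock = _dask_locks.get_worker_lock("/tmp/output.nc")
with lock:
    pass  # e.g. open the dataset, write one chunk, close it again
```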
diff --git a/lib/iris/fileformats/netcdf/_thread_safe_nc.py b/lib/iris/fileformats/netcdf/_thread_safe_nc.py
new file mode 100644
index 0000000000..21c697acab
--- /dev/null
+++ b/lib/iris/fileformats/netcdf/_thread_safe_nc.py
@@ -0,0 +1,403 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Module to ensure all calls to the netCDF4 library are thread-safe.
+
+Intention is that no other Iris module should import the netCDF4 module.
+
+"""
+from abc import ABC
+from threading import Lock
+import typing
+
+import netCDF4
+import numpy as np
+
+_GLOBAL_NETCDF4_LOCK = Lock()
+
+# Doesn't need thread protection, but this allows all netCDF4 refs to be
+# replaced with thread_safe refs.
+default_fillvals = netCDF4.default_fillvals
+
+
+class _ThreadSafeWrapper(ABC):
+ """
+    Contains a netCDF4 class instance, wrapping all API calls within _GLOBAL_NETCDF4_LOCK.
+
+ Designed to 'gate keep' all the instance's API calls, but allowing the
+ same API as if working directly with the instance itself.
+
+ Using a contained object instead of inheritance, as we cannot successfully
+ subclass or monkeypatch netCDF4 classes, because they are only wrappers for
+ the C-layer.
+ """
+
+ # Note: this is only used to create a "contained" from passed args.
+ CONTAINED_CLASS = NotImplemented
+ # Note: this defines how we identify/check that a contained is of the expected type
+ # (in a duck-type way).
+ _DUCKTYPE_CHECK_PROPERTIES: typing.List[str] = [NotImplemented]
+
+ # Allows easy type checking, avoiding difficulties with isinstance and mocking.
+ THREAD_SAFE_FLAG = True
+
+ @classmethod
+ def is_contained_type(cls, instance):
+ return all(
+ hasattr(instance, attr) for attr in cls._DUCKTYPE_CHECK_PROPERTIES
+ )
+
+ @classmethod
+ def from_existing(cls, instance):
+ """Pass an existing instance to __init__, where it is contained."""
+ assert cls.is_contained_type(instance)
+ return cls(instance)
+
+ def __init__(self, *args, **kwargs):
+ """Contain an existing instance, or generate a new one from arguments."""
+ if len(args) == 1 and self.is_contained_type(args[0]):
+ # Passed a contained-type object : Wrap ourself around that.
+ instance = args[0]
+ # We should never find ourselves "wrapping a wrapper".
+ assert not hasattr(instance, "THREAD_SAFE_FLAG")
+ else:
+ # Create a contained object of the intended type from passed args.
+ with _GLOBAL_NETCDF4_LOCK:
+ instance = self.CONTAINED_CLASS(*args, **kwargs)
+
+ self._contained_instance = instance
+
+ def __getattr__(self, item):
+ if item == "_contained_instance":
+ # Special behaviour when accessing the _contained_instance itself.
+ return object.__getattribute__(self, item)
+ else:
+ with _GLOBAL_NETCDF4_LOCK:
+ return getattr(self._contained_instance, item)
+
+ def __setattr__(self, key, value):
+ if key == "_contained_instance":
+ # Special behaviour when accessing the _contained_instance itself.
+ object.__setattr__(self, key, value)
+ else:
+ with _GLOBAL_NETCDF4_LOCK:
+ return setattr(self._contained_instance, key, value)
+
+ def __getitem__(self, item):
+ with _GLOBAL_NETCDF4_LOCK:
+ return self._contained_instance.__getitem__(item)
+
+ def __setitem__(self, key, value):
+ with _GLOBAL_NETCDF4_LOCK:
+ return self._contained_instance.__setitem__(key, value)
+
+
+class DimensionWrapper(_ThreadSafeWrapper):
+ """
+ Accessor for a netCDF4.Dimension, always acquiring _GLOBAL_NETCDF4_LOCK.
+
+ All API calls should be identical to those for netCDF4.Dimension.
+ """
+
+ CONTAINED_CLASS = netCDF4.Dimension
+ _DUCKTYPE_CHECK_PROPERTIES = ["isunlimited"]
+
+
+class VariableWrapper(_ThreadSafeWrapper):
+ """
+ Accessor for a netCDF4.Variable, always acquiring _GLOBAL_NETCDF4_LOCK.
+
+ All API calls should be identical to those for netCDF4.Variable.
+ """
+
+ CONTAINED_CLASS = netCDF4.Variable
+ _DUCKTYPE_CHECK_PROPERTIES = ["dimensions", "dtype"]
+
+ def setncattr(self, *args, **kwargs) -> None:
+ """
+ Calls netCDF4.Variable.setncattr within _GLOBAL_NETCDF4_LOCK.
+
+ Only defined explicitly in order to get some mocks to work.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ return self._contained_instance.setncattr(*args, **kwargs)
+
+ @property
+ def dimensions(self) -> typing.List[str]:
+ """
+ Calls netCDF4.Variable.dimensions within _GLOBAL_NETCDF4_LOCK.
+
+ Only defined explicitly in order to get some mocks to work.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ # Return value is a list of strings so no need for
+ # DimensionWrapper, unlike self.get_dims().
+ return self._contained_instance.dimensions
+
+ # All Variable API that returns Dimension(s) is wrapped to instead return
+ # DimensionWrapper(s).
+
+ def get_dims(self, *args, **kwargs) -> typing.Tuple[DimensionWrapper]:
+ """
+ Calls netCDF4.Variable.get_dims() within _GLOBAL_NETCDF4_LOCK, returning DimensionWrappers.
+
+ The original returned netCDF4.Dimensions are simply replaced with their
+ respective DimensionWrappers, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ dimensions_ = list(
+ self._contained_instance.get_dims(*args, **kwargs)
+ )
+ return tuple([DimensionWrapper.from_existing(d) for d in dimensions_])
+
+
+class GroupWrapper(_ThreadSafeWrapper):
+ """
+ Accessor for a netCDF4.Group, always acquiring _GLOBAL_NETCDF4_LOCK.
+
+ All API calls should be identical to those for netCDF4.Group.
+ """
+
+ CONTAINED_CLASS = netCDF4.Group
+ # Note: will also accept a whole Dataset object, but that is OK.
+ _DUCKTYPE_CHECK_PROPERTIES = ["createVariable"]
+
+ # All Group API that returns Dimension(s) is wrapped to instead return
+ # DimensionWrapper(s).
+
+ @property
+ def dimensions(self) -> typing.Dict[str, DimensionWrapper]:
+ """
+ Calls dimensions of netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning DimensionWrappers.
+
+ The original returned netCDF4.Dimensions are simply replaced with their
+ respective DimensionWrappers, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ dimensions_ = self._contained_instance.dimensions
+ return {
+ k: DimensionWrapper.from_existing(v)
+ for k, v in dimensions_.items()
+ }
+
+ def createDimension(self, *args, **kwargs) -> DimensionWrapper:
+ """
+ Calls createDimension() from netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning DimensionWrapper.
+
+ The original returned netCDF4.Dimension is simply replaced with its
+ respective DimensionWrapper, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ new_dimension = self._contained_instance.createDimension(
+ *args, **kwargs
+ )
+ return DimensionWrapper.from_existing(new_dimension)
+
+ # All Group API that returns Variable(s) is wrapped to instead return
+ # VariableWrapper(s).
+
+ @property
+ def variables(self) -> typing.Dict[str, VariableWrapper]:
+ """
+ Calls variables of netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning VariableWrappers.
+
+ The original returned netCDF4.Variables are simply replaced with their
+ respective VariableWrappers, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ variables_ = self._contained_instance.variables
+ return {
+ k: VariableWrapper.from_existing(v) for k, v in variables_.items()
+ }
+
+ def createVariable(self, *args, **kwargs) -> VariableWrapper:
+ """
+ Calls createVariable() from netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning VariableWrapper.
+
+ The original returned netCDF4.Variable is simply replaced with its
+ respective VariableWrapper, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ new_variable = self._contained_instance.createVariable(
+ *args, **kwargs
+ )
+ return VariableWrapper.from_existing(new_variable)
+
+ def get_variables_by_attributes(
+ self, *args, **kwargs
+ ) -> typing.List[VariableWrapper]:
+ """
+ Calls get_variables_by_attributes() from netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning VariableWrappers.
+
+ The original returned netCDF4.Variables are simply replaced with their
+ respective VariableWrappers, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ variables_ = list(
+ self._contained_instance.get_variables_by_attributes(
+ *args, **kwargs
+ )
+ )
+ return [VariableWrapper.from_existing(v) for v in variables_]
+
+ # All Group API that returns Group(s) is wrapped to instead return
+ # GroupWrapper(s).
+
+ @property
+ def groups(self):
+ """
+ Calls groups of netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning GroupWrappers.
+
+ The original returned netCDF4.Groups are simply replaced with their
+ respective GroupWrappers, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ groups_ = self._contained_instance.groups
+ return {k: GroupWrapper.from_existing(v) for k, v in groups_.items()}
+
+ @property
+ def parent(self):
+ """
+ Calls parent of netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning a GroupWrapper.
+
+ The original returned netCDF4.Group is simply replaced with its
+ respective GroupWrapper, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ parent_ = self._contained_instance.parent
+ return GroupWrapper.from_existing(parent_)
+
+ def createGroup(self, *args, **kwargs):
+ """
+ Calls createGroup() from netCDF4.Group/Dataset within _GLOBAL_NETCDF4_LOCK, returning GroupWrapper.
+
+ The original returned netCDF4.Group is simply replaced with its
+ respective GroupWrapper, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ new_group = self._contained_instance.createGroup(*args, **kwargs)
+ return GroupWrapper.from_existing(new_group)
+
+
+class DatasetWrapper(GroupWrapper):
+ """
+ Accessor for a netCDF4.Dataset, always acquiring _GLOBAL_NETCDF4_LOCK.
+
+ All API calls should be identical to those for netCDF4.Dataset.
+ """
+
+ CONTAINED_CLASS = netCDF4.Dataset
+ # Note: 'close' exists on Dataset but not Group (though a rather weak distinction).
+ _DUCKTYPE_CHECK_PROPERTIES = ["createVariable", "close"]
+
+ @classmethod
+ def fromcdl(cls, *args, **kwargs):
+ """
+ Calls netCDF4.Dataset.fromcdl() within _GLOBAL_NETCDF4_LOCK, returning a DatasetWrapper.
+
+ The original returned netCDF4.Dataset is simply replaced with its
+ respective DatasetWrapper, ensuring that downstream calls are
+ also performed within _GLOBAL_NETCDF4_LOCK.
+ """
+ with _GLOBAL_NETCDF4_LOCK:
+ instance = cls.CONTAINED_CLASS.fromcdl(*args, **kwargs)
+ return cls.from_existing(instance)
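
A rough usage sketch of the wrapper pattern (the filename and variable name are
hypothetical): construction, attribute access and item access all acquire
_GLOBAL_NETCDF4_LOCK, and container-returning API is re-wrapped so chained
calls stay serialised.

    from iris.fileformats.netcdf._thread_safe_nc import DatasetWrapper

    # Construction opens the file under _GLOBAL_NETCDF4_LOCK.
    ds = DatasetWrapper("example.nc")  # hypothetical file

    # 'variables' re-acquires the lock and returns VariableWrappers, so the
    # data read below is also lock-protected.
    var = ds.variables["air_temperature"]  # hypothetical variable
    data = var[:]

    ds.close()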
+
+
+class NetCDFDataProxy:
+ """A reference to the data payload of a single NetCDF file variable."""
+
+ __slots__ = ("shape", "dtype", "path", "variable_name", "fill_value")
+
+ def __init__(self, shape, dtype, path, variable_name, fill_value):
+ self.shape = shape
+ self.dtype = dtype
+ self.path = path
+ self.variable_name = variable_name
+ self.fill_value = fill_value
+
+ @property
+ def ndim(self):
+ return len(self.shape)
+
+ def __getitem__(self, keys):
+        # Using a DatasetWrapper causes problems with invalid IDs in the
+        # netCDF4 library, presumably because __getitem__ gets called so many
+ # times by Dask. Use _GLOBAL_NETCDF4_LOCK directly instead.
+ with _GLOBAL_NETCDF4_LOCK:
+ dataset = netCDF4.Dataset(self.path)
+ try:
+ variable = dataset.variables[self.variable_name]
+ # Get the NetCDF variable data and slice.
+ var = variable[keys]
+ finally:
+ dataset.close()
+ return np.asanyarray(var)
+
+ def __repr__(self):
+ fmt = (
+ "<{self.__class__.__name__} shape={self.shape}"
+ " dtype={self.dtype!r} path={self.path!r}"
+ " variable_name={self.variable_name!r}>"
+ )
+ return fmt.format(self=self)
+
+ def __getstate__(self):
+ return {attr: getattr(self, attr) for attr in self.__slots__}
+
+ def __setstate__(self, state):
+ for key, value in state.items():
+ setattr(self, key, value)
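
For orientation, a sketch of how a proxy like this becomes a lazy array. Iris
does this via its own `as_lazy_data` helper; plain `dask.array.from_array`
stands in here, and the file/variable names are hypothetical.

    import dask.array as da
    import numpy as np

    from iris.fileformats.netcdf._thread_safe_nc import NetCDFDataProxy

    # shape/dtype/ndim plus numpy-style __getitem__ is all dask needs to
    # treat the proxy as an array-like and defer every read.
    proxy = NetCDFDataProxy(
        shape=(1000, 1000),
        dtype=np.dtype("f8"),
        path="example.nc",  # hypothetical file
        variable_name="air_pressure",  # hypothetical variable
        fill_value=None,
    )
    meta = np.empty((0, 0), dtype=proxy.dtype)
    lazy = da.from_array(proxy, chunks=(500, 500), meta=meta)
    # Computing any chunk opens the file, slices the variable and closes it
    # again, all inside _GLOBAL_NETCDF4_LOCK.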
+
+
+class NetCDFWriteProxy:
+ """
+ The "opposite" of a NetCDFDataProxy : An object mimicking the data access of a
+    netCDF4.Variable, but where the data is to be **written to**.
+
+ It encapsulates the netcdf file and variable which are actually to be written to.
+ This opens the file each time, to enable writing the data chunk, then closes it.
+ TODO: could be improved with a caching scheme, but this just about works.
+ """
+
+ def __init__(self, filepath, cf_var, file_write_lock):
+ self.path = filepath
+ self.varname = cf_var.name
+ self.lock = file_write_lock
+
+ def __setitem__(self, keys, array_data):
+ # Write to the variable.
+ # First acquire a file-specific lock for all workers writing to this file.
+ self.lock.acquire()
+ # Open the file for writing + write to the specific file variable.
+ # Exactly as above, in NetCDFDataProxy : a DatasetWrapper causes problems with
+        # invalid IDs in the netCDF4 library, for so-far unknown reasons.
+ # Instead, use _GLOBAL_NETCDF4_LOCK, and netCDF4 _directly_.
+ with _GLOBAL_NETCDF4_LOCK:
+ dataset = None
+ try:
+ dataset = netCDF4.Dataset(self.path, "r+")
+ var = dataset.variables[self.varname]
+ var[keys] = array_data
+ finally:
+ try:
+ if dataset:
+ dataset.close()
+ finally:
+ # *ALWAYS* let go !
+ self.lock.release()
+
+ def __repr__(self):
+ return f"<{self.__class__.__name__} path={self.path!r} var={self.varname!r}>"
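
A sketch of the intended use with `dask.array.store` (the real call site is
`delayed_completion`, later in this patch). The filename is hypothetical, the
file variable is assumed to already exist (created empty), and since only
`.name` is read from `cf_var` here, a stand-in object suffices.

    import threading
    from types import SimpleNamespace

    import dask.array as da

    from iris.fileformats.netcdf._thread_safe_nc import NetCDFWriteProxy

    cf_var = SimpleNamespace(name="air_pressure")  # stand-in: only .name is used
    target = NetCDFWriteProxy("example.nc", cf_var, threading.Lock())

    lazy = da.zeros((1000, 1000), chunks=(500, 500))
    # compute=False defers the writes; computing then does, per chunk:
    # acquire the lock, reopen the file "r+", write the slice, close, release.
    store_op = da.store([lazy], [target], compute=False, lock=False)
    store_op.compute()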
diff --git a/lib/iris/fileformats/netcdf/loader.py b/lib/iris/fileformats/netcdf/loader.py
index 95f394c70d..20d255ea44 100644
--- a/lib/iris/fileformats/netcdf/loader.py
+++ b/lib/iris/fileformats/netcdf/loader.py
@@ -13,9 +13,9 @@
Also : `CF Conventions <https://cfconventions.org/>`_.
"""
+from collections.abc import Iterable
import warnings
-import netCDF4
import numpy as np
from iris._lazy_data import as_lazy_data
@@ -34,6 +34,7 @@
import iris.coords
import iris.exceptions
import iris.fileformats.cf
+from iris.fileformats.netcdf import _thread_safe_nc
from iris.fileformats.netcdf.saver import _CF_ATTRS
import iris.io
import iris.util
@@ -44,6 +45,10 @@
# Get the logger : shared logger for all in 'iris.fileformats.netcdf'.
from . import logger
+# An expected part of the public loader API, but includes thread safety
+# concerns so is housed in _thread_safe_nc.
+NetCDFDataProxy = _thread_safe_nc.NetCDFDataProxy
+
def _actions_engine():
# Return an 'actions engine', which provides a pyke-rules-like interface to
@@ -55,48 +60,6 @@ def _actions_engine():
return engine
-class NetCDFDataProxy:
- """A reference to the data payload of a single NetCDF file variable."""
-
- __slots__ = ("shape", "dtype", "path", "variable_name", "fill_value")
-
- def __init__(self, shape, dtype, path, variable_name, fill_value):
- self.shape = shape
- self.dtype = dtype
- self.path = path
- self.variable_name = variable_name
- self.fill_value = fill_value
-
- @property
- def ndim(self):
- return len(self.shape)
-
- def __getitem__(self, keys):
- dataset = netCDF4.Dataset(self.path)
- try:
- variable = dataset.variables[self.variable_name]
- # Get the NetCDF variable data and slice.
- var = variable[keys]
- finally:
- dataset.close()
- return np.asanyarray(var)
-
- def __repr__(self):
- fmt = (
- "<{self.__class__.__name__} shape={self.shape}"
- " dtype={self.dtype!r} path={self.path!r}"
- " variable_name={self.variable_name!r}>"
- )
- return fmt.format(self=self)
-
- def __getstate__(self):
- return {attr: getattr(self, attr) for attr in self.__slots__}
-
- def __setstate__(self, state):
- for key, value in state.items():
- setattr(self, key, value)
-
-
def _assert_case_specific_facts(engine, cf, cf_group):
# Initialise a data store for built cube elements.
# This is used to patch element attributes *not* setup by the actions
@@ -211,26 +174,61 @@ def _get_actual_dtype(cf_var):
return dummy_data.dtype
+# An arbitrary variable array size, below which we will fetch real data from a variable
+# rather than making a lazy array for deferred access.
+# Set by experiment at roughly the point where it begins to save us memory, but actually
+# mostly done for speed improvement. See https://github.com/SciTools/iris/pull/5069
+_LAZYVAR_MIN_BYTES = 5000
+
+
def _get_cf_var_data(cf_var, filename):
- # Get lazy chunked data out of a cf variable.
- dtype = _get_actual_dtype(cf_var)
-
- # Create cube with deferred data, but no metadata
- fill_value = getattr(
- cf_var.cf_data,
- "_FillValue",
- netCDF4.default_fillvals[cf_var.dtype.str[1:]],
- )
- proxy = NetCDFDataProxy(
- cf_var.shape, dtype, filename, cf_var.cf_name, fill_value
- )
- # Get the chunking specified for the variable : this is either a shape, or
- # maybe the string "contiguous".
- chunks = cf_var.cf_data.chunking()
- # In the "contiguous" case, pass chunks=None to 'as_lazy_data'.
- if chunks == "contiguous":
- chunks = None
- return as_lazy_data(proxy, chunks=chunks)
+ """
+ Get an array representing the data of a CF variable.
+
+ This is typically a lazy array based around a NetCDFDataProxy, but if the variable
+ is "sufficiently small", we instead fetch the data as a real (numpy) array.
+ The latter is especially valuable for scalar coordinates, which are otherwise
+ unnecessarily slow + wasteful of memory.
+
+ """
+ if hasattr(cf_var, "_data_array"):
+ # The variable is not an actual netCDF4 file variable, but an emulating
+ # object with an attached data array (either numpy or dask), which can be
+ # returned immediately as-is. This is used as a hook to translate data to/from
+ # netcdf data container objects in other packages, such as xarray.
+ # See https://github.com/SciTools/iris/issues/4994 "Xarray bridge".
+ result = cf_var._data_array
+ else:
+ total_bytes = cf_var.size * cf_var.dtype.itemsize
+ if total_bytes < _LAZYVAR_MIN_BYTES:
+ # Don't make a lazy array, as it will cost more memory AND more time to access.
+ # Instead fetch the data immediately, as a real array, and return that.
+ result = cf_var[:]
+
+ else:
+ # Get lazy chunked data out of a cf variable.
+ dtype = _get_actual_dtype(cf_var)
+
+ # Make a data-proxy that mimics array access and can fetch from the file.
+ fill_value = getattr(
+ cf_var.cf_data,
+ "_FillValue",
+ _thread_safe_nc.default_fillvals[cf_var.dtype.str[1:]],
+ )
+ proxy = NetCDFDataProxy(
+ cf_var.shape, dtype, filename, cf_var.cf_name, fill_value
+ )
+ # Get the chunking specified for the variable : this is either a shape, or
+ # maybe the string "contiguous".
+ chunks = cf_var.cf_data.chunking()
+ # In the "contiguous" case, pass chunks=None to 'as_lazy_data'.
+ if chunks == "contiguous":
+ chunks = None
+
+ # Return a dask array providing deferred access.
+ result = as_lazy_data(proxy, chunks=chunks)
+
+ return result
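
The threshold test is plain arithmetic on `size * itemsize`; a quick sketch of
where the 5000-byte boundary falls for float64 data:

    import numpy as np

    _LAZYVAR_MIN_BYTES = 5000

    for shape in [(), (100,), (25, 25), (1000,)]:
        nbytes = int(np.prod(shape, dtype=np.int64)) * np.dtype("f8").itemsize
        mode = "real (eager)" if nbytes < _LAZYVAR_MIN_BYTES else "lazy"
        print(shape, nbytes, mode)
    # ()        8     real (eager)  e.g. a scalar coordinate
    # (100,)    800   real (eager)
    # (25, 25)  5000  lazy          (not *below* the threshold)
    # (1000,)   8000  lazy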
class _OrderedAddableList(list):
@@ -494,14 +492,15 @@ def inner(cf_datavar):
return result
-def load_cubes(filenames, callback=None, constraints=None):
+def load_cubes(file_sources, callback=None, constraints=None):
"""
Loads cubes from a list of NetCDF filenames/OPeNDAP URLs.
Args:
- * filenames (string/list):
+ * file_sources (string/list):
One or more NetCDF filenames/OPeNDAP URLs to load from.
+ OR open datasets.
Kwargs:
@@ -529,66 +528,69 @@ def load_cubes(filenames, callback=None, constraints=None):
# Create an actions engine.
engine = _actions_engine()
- if isinstance(filenames, str):
- filenames = [filenames]
+ if isinstance(file_sources, str) or not isinstance(file_sources, Iterable):
+ file_sources = [file_sources]
- for filename in filenames:
- # Ingest the netCDF file.
+ for file_source in file_sources:
+ # Ingest the file. At present may be a filepath or an open netCDF4.Dataset.
meshes = {}
if PARSE_UGRID_ON_LOAD:
- cf = CFUGridReader(filename)
- meshes = _meshes_from_cf(cf)
+ cf_reader_class = CFUGridReader
else:
- cf = iris.fileformats.cf.CFReader(filename)
+ cf_reader_class = iris.fileformats.cf.CFReader
- # Process each CF data variable.
- data_variables = list(cf.cf_group.data_variables.values()) + list(
- cf.cf_group.promoted.values()
- )
- for cf_var in data_variables:
- if var_callback and not var_callback(cf_var):
- # Deliver only selected results.
- continue
-
- # cf_var-specific mesh handling, if a mesh is present.
- # Build the mesh_coords *before* loading the cube - avoids
- # mesh-related attributes being picked up by
- # _add_unused_attributes().
- mesh_name = None
- mesh = None
- mesh_coords, mesh_dim = [], None
+ with cf_reader_class(file_source) as cf:
if PARSE_UGRID_ON_LOAD:
- mesh_name = getattr(cf_var, "mesh", None)
- if mesh_name is not None:
+ meshes = _meshes_from_cf(cf)
+
+ # Process each CF data variable.
+ data_variables = list(cf.cf_group.data_variables.values()) + list(
+ cf.cf_group.promoted.values()
+ )
+ for cf_var in data_variables:
+ if var_callback and not var_callback(cf_var):
+ # Deliver only selected results.
+ continue
+
+ # cf_var-specific mesh handling, if a mesh is present.
+ # Build the mesh_coords *before* loading the cube - avoids
+ # mesh-related attributes being picked up by
+ # _add_unused_attributes().
+ mesh_name = None
+ mesh = None
+ mesh_coords, mesh_dim = [], None
+ if PARSE_UGRID_ON_LOAD:
+ mesh_name = getattr(cf_var, "mesh", None)
+ if mesh_name is not None:
+ try:
+ mesh = meshes[mesh_name]
+ except KeyError:
+ message = (
+ f"File does not contain mesh: '{mesh_name}' - "
+ f"referenced by variable: '{cf_var.cf_name}' ."
+ )
+ logger.debug(message)
+ if mesh is not None:
+ mesh_coords, mesh_dim = _build_mesh_coords(mesh, cf_var)
+
+ cube = _load_cube(engine, cf, cf_var, cf.filename)
+
+ # Attach the mesh (if present) to the cube.
+ for mesh_coord in mesh_coords:
+ cube.add_aux_coord(mesh_coord, mesh_dim)
+
+ # Process any associated formula terms and attach
+ # the corresponding AuxCoordFactory.
try:
- mesh = meshes[mesh_name]
- except KeyError:
- message = (
- f"File does not contain mesh: '{mesh_name}' - "
- f"referenced by variable: '{cf_var.cf_name}' ."
- )
- logger.debug(message)
- if mesh is not None:
- mesh_coords, mesh_dim = _build_mesh_coords(mesh, cf_var)
-
- cube = _load_cube(engine, cf, cf_var, filename)
-
- # Attach the mesh (if present) to the cube.
- for mesh_coord in mesh_coords:
- cube.add_aux_coord(mesh_coord, mesh_dim)
-
- # Process any associated formula terms and attach
- # the corresponding AuxCoordFactory.
- try:
- _load_aux_factory(engine, cube)
- except ValueError as e:
- warnings.warn("{}".format(e))
-
- # Perform any user registered callback function.
- cube = run_callback(callback, cube, cf_var, filename)
-
- # Callback mechanism may return None, which must not be yielded
- if cube is None:
- continue
-
- yield cube
+ _load_aux_factory(engine, cube)
+ except ValueError as e:
+ warnings.warn("{}".format(e))
+
+ # Perform any user registered callback function.
+ cube = run_callback(callback, cube, cf_var, file_source)
+
+ # Callback mechanism may return None, which must not be yielded
+ if cube is None:
+ continue
+
+ yield cube
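
With this change `load_cubes` accepts open datasets as well as paths/URLs. A
hedged sketch, with a hypothetical filename, assuming the CF reader's dataset
support added elsewhere in this changeset:

    from iris.fileformats.netcdf._thread_safe_nc import DatasetWrapper
    from iris.fileformats.netcdf.loader import load_cubes

    dataset = DatasetWrapper("example.nc")  # hypothetical file
    # A bare (non-string, non-iterable) source is wrapped into a 1-element list.
    cubes = list(load_cubes(dataset))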
diff --git a/lib/iris/fileformats/netcdf/saver.py b/lib/iris/fileformats/netcdf/saver.py
index 650c5e3338..312eea9c43 100644
--- a/lib/iris/fileformats/netcdf/saver.py
+++ b/lib/iris/fileformats/netcdf/saver.py
@@ -23,10 +23,10 @@
import warnings
import cf_units
+import dask
import dask.array as da
-import netCDF4
+from dask.delayed import Delayed
import numpy as np
-import numpy.ma as ma
from iris._lazy_data import _co_realise_lazy_arrays, is_lazy_data
from iris.aux_factory import (
@@ -45,6 +45,7 @@
from iris.coords import AncillaryVariable, AuxCoord, CellMeasure, DimCoord
import iris.exceptions
import iris.fileformats.cf
+from iris.fileformats.netcdf import _dask_locks, _thread_safe_nc
import iris.io
import iris.util
@@ -156,207 +157,6 @@
}
-# Cell methods.
-_CM_KNOWN_METHODS = [
- "point",
- "sum",
- "mean",
- "maximum",
- "minimum",
- "mid_range",
- "standard_deviation",
- "variance",
- "mode",
- "median",
-]
-
-_CM_COMMENT = "comment"
-_CM_EXTRA = "extra"
-_CM_INTERVAL = "interval"
-_CM_METHOD = "method"
-_CM_NAME = "name"
-_CM_PARSE_NAME = re.compile(r"([\w_]+\s*?:\s+)+")
-_CM_PARSE = re.compile(
- r"""
-    (?P<name>([\w_]+\s*?:\s+)+)
-    (?P<method>[\w_\s]+(?![\w_]*\s*?:))\s*
-    (?:
-        \(\s*
-        (?P<extra>.+)
-        \)\s*
-    )?
- """,
- re.VERBOSE,
-)
-
-
-class UnknownCellMethodWarning(Warning):
- pass
-
-
-def _split_cell_methods(nc_cell_methods: str) -> List[re.Match]:
- """
- Split a CF cell_methods attribute string into a list of zero or more cell
- methods, each of which is then parsed with a regex to return a list of match
- objects.
-
- Args:
-
- * nc_cell_methods: The value of the cell methods attribute to be split.
-
- Returns:
-
- * nc_cell_methods_matches: A list of the re.Match objects associated with
- each parsed cell method
-
- Splitting is done based on words followed by colons outside of any brackets.
- Validation of anything other than being laid out in the expected format is
- left to the calling function.
- """
-
- # Find name candidates
- name_start_inds = []
- for m in _CM_PARSE_NAME.finditer(nc_cell_methods):
- name_start_inds.append(m.start())
-
- # Remove those that fall inside brackets
- bracket_depth = 0
- for ind, cha in enumerate(nc_cell_methods):
- if cha == "(":
- bracket_depth += 1
- elif cha == ")":
- bracket_depth -= 1
- if bracket_depth < 0:
- msg = (
- "Cell methods may be incorrectly parsed due to mismatched "
- "brackets"
- )
- warnings.warn(msg, UserWarning, stacklevel=2)
- if bracket_depth > 0 and ind in name_start_inds:
- name_start_inds.remove(ind)
-
- # List tuples of indices of starts and ends of the cell methods in the string
- method_indices = []
- for ii in range(len(name_start_inds) - 1):
- method_indices.append((name_start_inds[ii], name_start_inds[ii + 1]))
- method_indices.append((name_start_inds[-1], len(nc_cell_methods)))
-
- # Index the string and match against each substring
- nc_cell_methods_matches = []
- for start_ind, end_ind in method_indices:
- nc_cell_method_str = nc_cell_methods[start_ind:end_ind]
- nc_cell_method_match = _CM_PARSE.match(nc_cell_method_str.strip())
- if not nc_cell_method_match:
- msg = (
- f"Failed to fully parse cell method string: {nc_cell_methods}"
- )
- warnings.warn(msg, UserWarning, stacklevel=2)
- continue
- nc_cell_methods_matches.append(nc_cell_method_match)
-
- return nc_cell_methods_matches
-
-
-def parse_cell_methods(nc_cell_methods):
- """
- Parse a CF cell_methods attribute string into a tuple of zero or
- more CellMethod instances.
-
- Args:
-
- * nc_cell_methods (str):
- The value of the cell methods attribute to be parsed.
-
- Returns:
-
- * cell_methods
- An iterable of :class:`iris.coords.CellMethod`.
-
- Multiple coordinates, intervals and comments are supported.
- If a method has a non-standard name a warning will be issued, but the
- results are not affected.
-
- """
-
- cell_methods = []
- if nc_cell_methods is not None:
- for m in _split_cell_methods(nc_cell_methods):
- d = m.groupdict()
- method = d[_CM_METHOD]
- method = method.strip()
- # Check validity of method, allowing for multi-part methods
- # e.g. mean over years.
- method_words = method.split()
- if method_words[0].lower() not in _CM_KNOWN_METHODS:
- msg = "NetCDF variable contains unknown cell method {!r}"
- warnings.warn(
- msg.format("{}".format(method_words[0])),
- UnknownCellMethodWarning,
- )
- d[_CM_METHOD] = method
- name = d[_CM_NAME]
- name = name.replace(" ", "")
- name = name.rstrip(":")
- d[_CM_NAME] = tuple([n for n in name.split(":")])
- interval = []
- comment = []
- if d[_CM_EXTRA] is not None:
- #
- # tokenise the key words and field colon marker
- #
- d[_CM_EXTRA] = d[_CM_EXTRA].replace(
-                    "comment:", "<<comment>><<:>>"
-                )
-                d[_CM_EXTRA] = d[_CM_EXTRA].replace(
-                    "interval:", "<<interval>><<:>>"
- )
- d[_CM_EXTRA] = d[_CM_EXTRA].split("<<:>>")
- if len(d[_CM_EXTRA]) == 1:
- comment.extend(d[_CM_EXTRA])
- else:
- next_field_type = comment
- for field in d[_CM_EXTRA]:
- field_type = next_field_type
-                        index = field.rfind("<<interval>>")
- if index == 0:
- next_field_type = interval
- continue
- elif index > 0:
- next_field_type = interval
- else:
-                            index = field.rfind("<<comment>>")
- if index == 0:
- next_field_type = comment
- continue
- elif index > 0:
- next_field_type = comment
- if index != -1:
- field = field[:index]
- field_type.append(field.strip())
- #
- # cater for a shared interval over multiple axes
- #
- if len(interval):
- if len(d[_CM_NAME]) != len(interval) and len(interval) == 1:
- interval = interval * len(d[_CM_NAME])
- #
- # cater for a shared comment over multiple axes
- #
- if len(comment):
- if len(d[_CM_NAME]) != len(comment) and len(comment) == 1:
- comment = comment * len(d[_CM_NAME])
- d[_CM_INTERVAL] = tuple(interval)
- d[_CM_COMMENT] = tuple(comment)
- cell_method = iris.coords.CellMethod(
- d[_CM_METHOD],
- coords=d[_CM_NAME],
- intervals=d[_CM_INTERVAL],
- comments=d[_CM_COMMENT],
- )
- cell_methods.append(cell_method)
- return tuple(cell_methods)
-
-
class CFNameCoordMap:
"""Provide a simple CF name to CF coordinate mapping."""
@@ -459,63 +259,159 @@ def _setncattr(variable, name, attribute):
Put the given attribute on the given netCDF4 Data type, casting
attributes as we go to bytes rather than unicode.
+    NOTE: 'variable' must be an instance of a _thread_safe_nc._ThreadSafeWrapper subclass.
+
"""
+ assert hasattr(variable, "THREAD_SAFE_FLAG")
attribute = _bytes_if_ascii(attribute)
return variable.setncattr(name, attribute)
-class _FillValueMaskCheckAndStoreTarget:
+# NOTE : this matches :class:`iris.experimental.ugrid.mesh.Mesh.ELEMENTS`,
+# but in the preferred order for coord/connectivity variables in the file.
+MESH_ELEMENTS = ("node", "edge", "face")
+
+
+_FillvalueCheckInfo = collections.namedtuple(
+ "_FillvalueCheckInfo", ["user_value", "check_value", "dtype", "varname"]
+)
+
+
+def _data_fillvalue_check(arraylib, data, check_value):
"""
- To be used with da.store. Remembers whether any element was equal to a
- given value and whether it was masked, before passing the chunk to the
- given target.
+ Check whether an array is masked, and whether it contains a fill-value.
+
+ Parameters
+ ----------
+ arraylib : module
+ Either numpy or dask.array : When dask, results are lazy computations.
+ data : array-like
+ Array to check (numpy or dask)
+ check_value : number or None
+ If not None, fill-value to check for existence in the array.
+        If None, do not do the value-in-array check.
+
+ Returns
+ -------
+ is_masked : bool
+ True if array has any masked points.
+ contains_value : bool
+ True if array contains check_value.
+ Always False if check_value is None.
"""
+ is_masked = arraylib.any(arraylib.ma.getmaskarray(data))
+ if check_value is None:
+ contains_value = False
+ else:
+ contains_value = arraylib.any(data == check_value)
+ return is_masked, contains_value
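
The same helper serves both the eager and lazy paths, selected by `arraylib`;
a small sketch:

    import dask
    import dask.array as da
    import numpy as np

    data = np.array([1.0, 2.0, -999.0])

    # numpy: evaluated immediately.
    print(_data_fillvalue_check(np, data, -999.0))
    # -> (False, True) : nothing masked, but the fill-value occurs in the data

    # dask: the identical call just builds lazy graphs instead.
    checks = _data_fillvalue_check(da, da.from_array(data, chunks=2), -999.0)
    print(dask.compute(*checks))  # -> (False, True)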
- def __init__(self, target, fill_value=None):
- self.target = target
- self.fill_value = fill_value
- self.contains_value = False
- self.is_masked = False
- def __setitem__(self, keys, arr):
- if self.fill_value is not None:
- self.contains_value = self.contains_value or self.fill_value in arr
- self.is_masked = self.is_masked or ma.is_masked(arr)
- self.target[keys] = arr
+class SaverFillValueWarning(UserWarning):
+ pass
-# NOTE : this matches :class:`iris.experimental.ugrid.mesh.Mesh.ELEMENTS`,
-# but in the preferred order for coord/connectivity variables in the file.
-MESH_ELEMENTS = ("node", "edge", "face")
+def _fillvalue_report(fill_info, is_masked, contains_fill_value, warn=False):
+ """
+ From the given information, work out whether there was a possible or actual
+ fill-value collision, and if so construct a warning.
+
+ Parameters
+ ----------
+ fill_info : _FillvalueCheckInfo
+ A named-tuple containing the context of the fill-value check
+ is_masked : bool
+ whether the data array was masked
+ contains_fill_value : bool
+ whether the data array contained the fill-value
+ warn : bool
+ if True, also issue any resulting warning immediately.
+
+ Returns
+ -------
+ None or :class:`Warning`
+ If not None, indicates a known or possible problem with filling
+
+ """
+ varname = fill_info.varname
+ user_value = fill_info.user_value
+ check_value = fill_info.check_value
+ is_byte_data = fill_info.dtype.itemsize == 1
+ result = None
+ if is_byte_data and is_masked and user_value is None:
+ result = SaverFillValueWarning(
+ f"CF var '{varname}' contains byte data with masked points, but "
+ "no fill_value keyword was given. As saved, these "
+ "points will read back as valid values. To save as "
+ "masked byte data, `_FillValue` needs to be explicitly "
+ "set. For Cube data this can be done via the 'fill_value' "
+ "keyword during saving, otherwise use ncedit/equivalent."
+ )
+ elif contains_fill_value:
+ result = SaverFillValueWarning(
+ f"CF var '{varname}' contains unmasked data points equal to the "
+ f"fill-value, {check_value}. As saved, these points will read back "
+ "as missing data. To save these as normal values, "
+ "`_FillValue` needs to be set to not equal any valid data "
+ "points. For Cube data this can be done via the 'fill_value' "
+ "keyword during saving, otherwise use ncedit/equivalent."
+ )
+
+ if warn and result is not None:
+ warnings.warn(result)
+ return result
class Saver:
"""A manager for saving netcdf files."""
- def __init__(self, filename, netcdf_format):
+ def __init__(self, filename, netcdf_format, compute=True):
"""
A manager for saving netcdf files.
- Args:
-
- * filename (string):
+ Parameters
+ ----------
+ filename : string or netCDF4.Dataset
Name of the netCDF file to save the cube.
+            OR a writeable object supporting the :class:`netCDF4.Dataset` API.
- * netcdf_format (string):
+ netcdf_format : string
Underlying netCDF file format, one of 'NETCDF4', 'NETCDF4_CLASSIC',
'NETCDF3_CLASSIC' or 'NETCDF3_64BIT'. Default is 'NETCDF4' format.
- Returns:
- None.
-
- For example::
+ compute : bool, default=True
+ If ``True``, delayed variable saves will be completed on exit from the Saver
+ context (after first closing the target file), equivalent to
+ :meth:`complete()`.
+
+ If ``False``, the file is created and closed without writing the data of
+ variables for which the source data was lazy. These writes can be
+ completed later, see :meth:`delayed_completion`.
+
+ .. Note::
+ If ``filename`` is an open dataset, rather than a filepath, then the
+ caller must specify ``compute=False``, **close the dataset**, and
+ complete delayed saving afterwards.
+ If ``compute`` is ``True`` in this case, an error is raised.
+ This is because lazy content must be written by delayed save operations,
+ which will only succeed if the dataset can be (re-)opened for writing.
+ See :func:`save`.
+
+ Returns
+ -------
+ None
+
+ Example
+ -------
+ >>> import iris
+ >>> from iris.fileformats.netcdf.saver import Saver
+ >>> cubes = iris.load(iris.sample_data_path('atlantic_profiles.nc'))
+ >>> with Saver("tmp.nc", "NETCDF4") as sman:
+ ... # Iterate through the cubelist.
+ ... for cube in cubes:
+ ... sman.write(cube)
- # Initialise Manager for saving
- with Saver(filename, netcdf_format) as sman:
- # Iterate through the cubelist.
- for cube in cubes:
- sman.write(cube)
"""
if netcdf_format not in [
@@ -542,30 +438,77 @@ def __init__(self, filename, netcdf_format):
self._mesh_dims = {}
#: A dictionary, mapping formula terms to owner cf variable name
self._formula_terms_cache = {}
- #: NetCDF dataset
- try:
- self._dataset = netCDF4.Dataset(
- filename, mode="w", format=netcdf_format
- )
- except RuntimeError:
- dir_name = os.path.dirname(filename)
- if not os.path.isdir(dir_name):
- msg = "No such file or directory: {}".format(dir_name)
- raise IOError(msg)
- if not os.access(dir_name, os.R_OK | os.W_OK):
- msg = "Permission denied: {}".format(filename)
- raise IOError(msg)
- else:
- raise
+ #: Target filepath
+ self.filepath = (
+ None # this line just for the API page -- value is set later
+ )
+ #: Whether to complete delayed saves on exit (and raise associated warnings).
+ self.compute = compute
+ # N.B. the file-write-lock *type* actually depends on the dask scheduler type.
+ #: A per-file write lock to prevent dask attempting overlapping writes.
+ self.file_write_lock = (
+ None # this line just for the API page -- value is set later
+ )
+
+ # A list of delayed writes for lazy saving
+ # a list of triples (source, target, fill-info).
+ self._delayed_writes = []
+
+ # Detect if we were passed a pre-opened dataset (or something like one)
+ self._to_open_dataset = hasattr(filename, "createVariable")
+ if self._to_open_dataset:
+ # We were passed a *dataset*, so we don't open (or close) one of our own.
+ self._dataset = filename
+ if compute:
+ msg = (
+ "Cannot save to a user-provided dataset with 'compute=True'. "
+ "Please use 'compute=False' and complete delayed saving in the "
+ "calling code after the file is closed."
+ )
+ raise ValueError(msg)
+
+ # Put it inside a _thread_safe_nc wrapper to ensure thread-safety.
+ # Except if it already is one, since they forbid "re-wrapping".
+ if not hasattr(self._dataset, "THREAD_SAFE_FLAG"):
+ self._dataset = _thread_safe_nc.DatasetWrapper.from_existing(
+ self._dataset
+ )
+
+ # In this case the dataset gives a filepath, not the other way around.
+ self.filepath = self._dataset.filepath()
+
+ else:
+ # Given a filepath string/path : create a dataset from that
+ try:
+ self.filepath = os.path.abspath(filename)
+ self._dataset = _thread_safe_nc.DatasetWrapper(
+ self.filepath, mode="w", format=netcdf_format
+ )
+ except RuntimeError:
+ dir_name = os.path.dirname(self.filepath)
+ if not os.path.isdir(dir_name):
+ msg = "No such file or directory: {}".format(dir_name)
+ raise IOError(msg)
+ if not os.access(dir_name, os.R_OK | os.W_OK):
+ msg = "Permission denied: {}".format(self.filepath)
+ raise IOError(msg)
+ else:
+ raise
+
+ self.file_write_lock = _dask_locks.get_worker_lock(self.filepath)
def __enter__(self):
return self
def __exit__(self, type, value, traceback):
"""Flush any buffered data to the CF-netCDF file before closing."""
-
self._dataset.sync()
- self._dataset.close()
+ if not self._to_open_dataset:
+ # Only close if the Saver created it.
+ self._dataset.close()
+ # Complete after closing, if required
+ if self.compute:
+ self.complete()
def write(
self,
@@ -1030,7 +973,7 @@ def _add_inner_related_vars(
"""
if coordlike_elements:
- # Choose the approriate parent attribute
+ # Choose the appropriate parent attribute
elem_type = type(coordlike_elements[0])
if elem_type in (AuxCoord, DimCoord):
role_attribute_name = "coordinates"
@@ -1184,7 +1127,7 @@ def _add_aux_factories(self, cube, cf_var_cube, dimension_names):
warnings.warn(msg)
else:
# Override `standard_name`, `long_name`, and `axis` of the
- # primary coord that signals the presense of a dimensionless
+ # primary coord that signals the presence of a dimensionless
# vertical coord, then set the `formula_terms` attribute.
primary_coord = factory.dependencies[factory_defn.primary]
if primary_coord in primaries:
@@ -1491,7 +1434,7 @@ def cf_valid_var_name(var_name):
A var_name suitable for passing through for variable creation.
"""
- # Replace invalid charaters with an underscore ("_").
+ # Replace invalid characters with an underscore ("_").
var_name = re.sub(r"[^a-zA-Z0-9]", "_", var_name)
# Ensure the variable name starts with a letter.
if re.match(r"^[^a-zA-Z]", var_name):
@@ -1927,16 +1870,6 @@ def _create_generic_cf_array_var(
# Check if this is a dim-coord.
is_dimcoord = cube is not None and element in cube.dim_coords
- if isinstance(element, iris.coords.CellMeasure):
- # Disallow saving of *masked* cell measures.
- # NOTE: currently, this is the only functional difference in
- # variable creation between an ancillary and a cell measure.
- if iris.util.is_masked(data):
- # We can't save masked points properly, as we don't maintain
- # a fill_value. (Load will not record one, either).
- msg = "Cell measures with missing data are not supported."
- raise ValueError(msg)
-
if is_dimcoord:
# By definition of a CF-netCDF coordinate variable this
# coordinate must be 1-D and the name of the CF-netCDF variable
@@ -2331,7 +2264,13 @@ def _create_cf_data_variable(
dtype = data.dtype.newbyteorder("=")
def set_packing_ncattrs(cfvar):
- """Set netCDF packing attributes."""
+ """
+ Set netCDF packing attributes.
+
+            NOTE: 'cfvar' must be an instance of a _thread_safe_nc._ThreadSafeWrapper subclass.
+
+ """
+ assert hasattr(cfvar, "THREAD_SAFE_FLAG")
if packing:
if scale_factor:
_setncattr(cfvar, "scale_factor", scale_factor)
@@ -2442,8 +2381,7 @@ def _increment_name(self, varname):
return "{}_{}".format(varname, num)
- @staticmethod
- def _lazy_stream_data(data, fill_value, fill_warn, cf_var):
+ def _lazy_stream_data(self, data, fill_value, fill_warn, cf_var):
if hasattr(data, "shape") and data.shape == (1,) + cf_var.shape:
# (Don't do this check for string data).
# Reduce dimensionality where the data array has an extra dimension
@@ -2452,62 +2390,192 @@ def _lazy_stream_data(data, fill_value, fill_warn, cf_var):
# contains just 1 row, so the cf_var is 1D.
data = data.squeeze(axis=0)
- if is_lazy_data(data):
+ if hasattr(cf_var, "_data_array"):
+ # The variable is not an actual netCDF4 file variable, but an emulating
+ # object with an attached data array (either numpy or dask), which should be
+ # copied immediately to the target. This is used as a hook to translate
+ # data to/from netcdf data container objects in other packages, such as
+ # xarray.
+ # See https://github.com/SciTools/iris/issues/4994 "Xarray bridge".
+ # N.B. also, in this case there is no need for fill-value checking as the
+ # data is not being translated to an in-file representation.
+ cf_var._data_array = data
+ else:
+ # Decide whether we are checking for fill-value collisions.
+ dtype = cf_var.dtype
+ # fill_warn allows us to skip warning if packing attributes have been
+ # specified. It would require much more complex operations to work out
+ # what the values and fill_value _would_ be in such a case.
+ if fill_warn:
+ if fill_value is not None:
+ fill_value_to_check = fill_value
+ else:
+ # Retain 'fill_value == None', to show that no specific value was given.
+ # But set 'fill_value_to_check' to a calculated value
+ fill_value_to_check = _thread_safe_nc.default_fillvals[
+ dtype.str[1:]
+ ]
+ # Cast the check-value to the correct dtype.
+ # NOTE: In the case of 'S1' dtype (at least), the default (Python) value
+ # does not have a compatible type. This causes a deprecation warning at
+ # numpy 1.24, *and* was preventing correct fill-value checking of character
+ # data, since they are actually bytes (dtype 'S1').
+ fill_value_to_check = np.array(
+ fill_value_to_check, dtype=dtype
+ )
+ else:
+ # A None means we will NOT check for collisions.
+ fill_value_to_check = None
+
+ fill_info = _FillvalueCheckInfo(
+ user_value=fill_value,
+ check_value=fill_value_to_check,
+ dtype=dtype,
+ varname=cf_var.name,
+ )
+
+ doing_delayed_save = is_lazy_data(data)
+ if doing_delayed_save:
+ # save lazy data with a delayed operation. For now, we just record the
+ # necessary information -- a single, complete delayed action is constructed
+ # later by a call to delayed_completion().
+ def store(data, cf_var, fill_info):
+ # Create a data-writeable object that we can stream into, which
+ # encapsulates the file to be opened + variable to be written.
+ write_wrapper = _thread_safe_nc.NetCDFWriteProxy(
+ self.filepath, cf_var, self.file_write_lock
+ )
+ # Add to the list of delayed writes, used in delayed_completion().
+ self._delayed_writes.append(
+ (data, write_wrapper, fill_info)
+ )
+ # In this case, fill-value checking is done later. But return 2 dummy
+ # values, to be consistent with the non-streamed "store" signature.
+ is_masked, contains_value = False, False
+ return is_masked, contains_value
- def store(data, cf_var, fill_value):
- # Store lazy data and check whether it is masked and contains
- # the fill value
- target = _FillValueMaskCheckAndStoreTarget(cf_var, fill_value)
- da.store([data], [target])
- return target.is_masked, target.contains_value
+ else:
+ # Real data is always written directly, i.e. not via lazy save.
+ # We also check it immediately for any fill-value problems.
+ def store(data, cf_var, fill_info):
+ cf_var[:] = data
+ return _data_fillvalue_check(
+ np, data, fill_info.check_value
+ )
- else:
+ # Store the data and check if it is masked and contains the fill value.
+ is_masked, contains_fill_value = store(data, cf_var, fill_info)
- def store(data, cf_var, fill_value):
- cf_var[:] = data
- is_masked = np.ma.is_masked(data)
- contains_value = fill_value is not None and fill_value in data
- return is_masked, contains_value
+ if not doing_delayed_save:
+ # Issue a fill-value warning immediately, if appropriate.
+ _fillvalue_report(
+ fill_info, is_masked, contains_fill_value, warn=True
+ )
- dtype = cf_var.dtype
+ def delayed_completion(self) -> Delayed:
+ """
+ Create and return a :class:`dask.delayed.Delayed` to perform file completion
+ for delayed saves.
+
+ This contains all the delayed writes, which complete the file by filling out
+ the data of variables initially created empty, and also the checks for
+ potential fill-value collisions.
+ When computed, it returns a list of any warnings which were generated in the
+ save operation.
+
+ Returns
+ -------
+ completion : :class:`dask.delayed.Delayed`
+
+ Notes
+ -----
+ The dataset *must* be closed (saver has exited its context) before the
+ result can be computed, otherwise computation will hang (never return).
+ """
+ if self._delayed_writes:
+ # Create a single delayed da.store operation to complete the file.
+ sources, targets, fill_infos = zip(*self._delayed_writes)
+ store_op = da.store(sources, targets, compute=False, lock=False)
+
+ # Construct a delayed fill-check operation for each (lazy) source array.
+ delayed_fillvalue_checks = [
+ # NB with arraylib=dask.array, this routine does lazy array computation
+ _data_fillvalue_check(da, source, fillinfo.check_value)
+ for source, fillinfo in zip(sources, fill_infos)
+ ]
+
+ # Return a single delayed object which completes the delayed saves and
+ # returns a list of any fill-value warnings.
+ @dask.delayed
+ def compute_and_return_warnings(store_op, fv_infos, fv_checks):
+ # Note: we don't actually *do* anything with the 'store_op' argument,
+ # but including it here ensures that dask will compute it (thus
+ # performing all the delayed saves), before calling this function.
+ results = []
+ # Pair each fill_check result (is_masked, contains_value) with its
+ # fillinfo and construct a suitable Warning if needed.
+ for fillinfo, (is_masked, contains_value) in zip(
+ fv_infos, fv_checks
+ ):
+ fv_warning = _fillvalue_report(
+ fill_info=fillinfo,
+ is_masked=is_masked,
+ contains_fill_value=contains_value,
+ )
+ if fv_warning is not None:
+ # Collect the warnings and return them.
+ results.append(fv_warning)
+ return results
+
+ result = compute_and_return_warnings(
+ store_op,
+ fv_infos=fill_infos,
+ fv_checks=delayed_fillvalue_checks,
+ )
- # fill_warn allows us to skip warning if packing attributes have been
- # specified. It would require much more complex operations to work out
- # what the values and fill_value _would_ be in such a case.
- if fill_warn:
- if fill_value is not None:
- fill_value_to_check = fill_value
- else:
- fill_value_to_check = netCDF4.default_fillvals[dtype.str[1:]]
else:
- fill_value_to_check = None
+ # Return a delayed, which returns an empty list, for usage consistency.
+ @dask.delayed
+ def no_op():
+ return []
- # Store the data and check if it is masked and contains the fill value.
- is_masked, contains_fill_value = store(
- data, cf_var, fill_value_to_check
- )
+ result = no_op()
- if dtype.itemsize == 1 and fill_value is None:
- if is_masked:
- msg = (
- "CF var '{}' contains byte data with masked points, but "
- "no fill_value keyword was given. As saved, these "
- "points will read back as valid values. To save as "
- "masked byte data, `_FillValue` needs to be explicitly "
- "set. For Cube data this can be done via the 'fill_value' "
- "keyword during saving, otherwise use ncedit/equivalent."
- )
- warnings.warn(msg.format(cf_var.name))
- elif contains_fill_value:
+ return result
+
+ def complete(self, issue_warnings=True) -> List[Warning]:
+ """
+ Complete file by computing any delayed variable saves.
+
+ This requires that the Saver has closed the dataset (exited its context).
+
+ Parameters
+ ----------
+        issue_warnings : bool, default=True
+            If ``True``, issue all the resulting warnings with :func:`warnings.warn`.
+
+ Returns
+ -------
+ warnings : list of Warning
+ Any warnings that were raised while writing delayed data.
+
+ """
+ if self._dataset.isopen():
msg = (
- "CF var '{}' contains unmasked data points equal to the "
- "fill-value, {}. As saved, these points will read back "
- "as missing data. To save these as normal values, "
- "`_FillValue` needs to be set to not equal any valid data "
- "points. For Cube data this can be done via the 'fill_value' "
- "keyword during saving, otherwise use ncedit/equivalent."
+ "Cannot call Saver.complete() until its dataset is closed, "
+ "i.e. the saver's context has exited."
)
- warnings.warn(msg.format(cf_var.name, fill_value))
+ raise ValueError(msg)
+
+ delayed_write = self.delayed_completion()
+ # Complete the saves now, and handle any delayed warnings that occurred
+ result_warnings = delayed_write.compute()
+ if issue_warnings:
+ # Issue any delayed warnings from the compute.
+ for delayed_warning in result_warnings:
+ warnings.warn(delayed_warning)
+
+ return result_warnings
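
Putting `compute=False`, a caller-owned dataset and `complete()` together, as
described in the docstrings above (a sketch; `tmp.nc` is a scratch file):

    import iris
    import netCDF4

    from iris.fileformats.netcdf.saver import Saver

    cubes = iris.load(iris.sample_data_path("atlantic_profiles.nc"))

    dataset = netCDF4.Dataset("tmp.nc", "w")  # the caller owns this dataset
    with Saver(dataset, "NETCDF4", compute=False) as sman:  # compute=True raises
        for cube in cubes:
            sman.write(cube)

    dataset.close()  # must close *before* completing
    fill_warnings = sman.complete()  # perform the delayed writes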
def save(
@@ -2526,6 +2594,7 @@ def save(
least_significant_digit=None,
packing=None,
fill_value=None,
+ compute=True,
):
"""
Save cube(s) to a netCDF file, given the cube and the filename.
@@ -2550,6 +2619,11 @@ def save(
* filename (string):
Name of the netCDF file to save the cube(s).
+ **Or** an open, writeable :class:`netCDF4.Dataset`, or compatible object.
+
+ .. Note::
+ When saving to a dataset, ``compute`` **must** be ``False`` :
+ See the ``compute`` parameter.
Kwargs:
@@ -2648,8 +2722,34 @@ def save(
     :class:`iris.cube.CubeList`, or a single element, and each element of
this argument will be applied to each cube separately.
+ * compute (bool):
+ Default is ``True``, meaning complete the file immediately, and return ``None``.
+
+ When ``False``, create the output file but don't write any lazy array content to
+ its variables, such as lazy cube data or aux-coord points and bounds.
+ Instead return a :class:`dask.delayed.Delayed` which, when computed, will
+      stream all the lazy content via :func:`dask.array.store`, to complete the file.
+ Several such data saves can be performed in parallel, by passing a list of them
+ into a :func:`dask.compute` call.
+
+ .. Note::
+ when computed, the returned :class:`dask.delayed.Delayed` object returns
+ a list of :class:`Warning`\\s : These are any warnings which *would* have
+ been issued in the save call, if ``compute`` had been ``True``.
+
+ .. Note::
+ If saving to an open dataset instead of a filepath, then the caller
+ **must** specify ``compute=False``, and complete delayed saves **after
+ closing the dataset**.
+ This is because delayed saves may be performed in other processes : These
+ must (re-)open the dataset for writing, which will fail if the file is
+ still open for writing by the caller.
+
Returns:
- None.
+ result (None, or dask.delayed.Delayed):
+ If `compute=True`, returns `None`.
+ Otherwise returns a :class:`dask.delayed.Delayed`, which implements delayed
+ writing to fill in the variables data.
.. note::
@@ -2748,7 +2848,9 @@ def is_valid_packspec(p):
raise ValueError(msg)
# Initialise Manager for saving
- with Saver(filename, netcdf_format) as sman:
+ # N.B. make the Saver compute=False, as we want control over creation of the
+ # delayed-completion object.
+ with Saver(filename, netcdf_format, compute=compute) as sman:
# Iterate through the cubelist.
for cube, packspec, fill_value in zip(cubes, packspecs, fill_values):
sman.write(
@@ -2793,3 +2895,12 @@ def is_valid_packspec(p):
# Add conventions attribute.
sman.update_global_attributes(Conventions=conventions)
+
+ if compute:
+ # No more to do, since we used Saver(compute=True).
+ result = None
+ else:
+ # Return a delayed completion object.
+ result = sman.delayed_completion()
+
+ return result
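
End-to-end, the new `compute` keyword reads as follows (a sketch; `tmp.nc` is
a scratch file):

    import dask
    import iris

    cubes = iris.load(iris.sample_data_path("atlantic_profiles.nc"))

    delayed = iris.save(cubes, "tmp.nc", compute=False)
    # The file now exists, but lazy variable content has not been written.

    fill_warnings = delayed.compute()  # stream the lazy data into the file
    # Several delayed saves can also be combined: dask.compute(d1, d2, ...)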
diff --git a/lib/iris/fileformats/pp.py b/lib/iris/fileformats/pp.py
index cff088cf89..ad0c6272ad 100644
--- a/lib/iris/fileformats/pp.py
+++ b/lib/iris/fileformats/pp.py
@@ -1678,7 +1678,7 @@ def load(filename, read_data=False, little_ended=False):
def _interpret_fields(fields):
"""
- Turn the fields read with load and FF2PP._extract_field into useable
+ Turn the fields read with load and FF2PP._extract_field into usable
fields. One of the primary purposes of this function is to either convert
"deferred bytes" into "deferred arrays" or "loaded bytes" into actual
numpy arrays (via the _create_field_data) function.
diff --git a/lib/iris/fileformats/rules.py b/lib/iris/fileformats/rules.py
index 51940b7c4d..707fd58757 100644
--- a/lib/iris/fileformats/rules.py
+++ b/lib/iris/fileformats/rules.py
@@ -404,15 +404,15 @@ def _load_pairs_from_fields_and_filenames(
def load_pairs_from_fields(fields, converter):
"""
Convert an iterable of fields into an iterable of Cubes using the
- provided convertor.
+ provided converter.
Args:
* fields:
An iterable of fields.
- * convertor:
- An Iris convertor function, suitable for use with the supplied fields.
+ * converter:
+ An Iris converter function, suitable for use with the supplied fields.
See the description in :class:`iris.fileformats.rules.Loader`.
Returns:
diff --git a/lib/iris/fileformats/um_cf_map.py b/lib/iris/fileformats/um_cf_map.py
index 01539960a5..b93b192bbd 100644
--- a/lib/iris/fileformats/um_cf_map.py
+++ b/lib/iris/fileformats/um_cf_map.py
@@ -88,7 +88,7 @@
'm01s00i012': CFName('mass_fraction_of_cloud_ice_in_air', None, 'kg kg-1'),
'm01s00i013': CFName('convective_cloud_area_fraction', None, '1'),
'm01s00i020': CFName('soil_temperature', None, 'K'),
- 'm01s00i023': CFName('snowfall_amount', None, 'kg m-2'),
+ 'm01s00i023': CFName('surface_snow_amount', None, 'kg m-2'),
'm01s00i024': CFName('surface_temperature', None, 'K'),
'm01s00i025': CFName('atmosphere_boundary_layer_thickness', None, 'm'),
'm01s00i026': CFName('surface_roughness_length', None, 'm'),
@@ -1207,7 +1207,7 @@
CFName('sea_ice_thickness', None, 'm'): 687,
CFName('sea_surface_elevation', None, 'm'): 608,
CFName('snow_grain_size', None, '1e-6 m'): 1507,
- CFName('snowfall_amount', None, 'kg m-2'): 93,
+ CFName('surface_snow_amount', None, 'kg m-2'): 93,
CFName('snowfall_flux', None, 'kg m-2 s-1'): 108,
CFName('soil_albedo', None, '1'): 1395,
CFName('soil_carbon_content', None, 'kg m-2'): 1397,
diff --git a/lib/iris/io/__init__.py b/lib/iris/io/__init__.py
index 7dd08c723c..4e5004ff10 100644
--- a/lib/iris/io/__init__.py
+++ b/lib/iris/io/__init__.py
@@ -94,6 +94,8 @@ def decode_uri(uri, default="file"):
In addition to well-formed URIs, it also supports bare file paths as strings
or :class:`pathlib.PurePath`. Both Windows and UNIX style paths are
accepted.
+ It also supports 'bare objects', i.e. anything which is not a string.
+ These are identified with a scheme of 'data', and returned unchanged.
.. testsetup::
@@ -119,20 +121,31 @@ def decode_uri(uri, default="file"):
>>> print(decode_uri('dataZoo/...'))
('file', 'dataZoo/...')
+ >>> print(decode_uri({}))
+ ('data', {})
+
"""
if isinstance(uri, pathlib.PurePath):
uri = str(uri)
- # make sure scheme has at least 2 letters to avoid windows drives
- # put - last in the brackets so it refers to the character, not a range
- # reference on valid schemes: http://tools.ietf.org/html/std66#section-3.1
- match = re.match(r"^([a-zA-Z][a-zA-Z0-9+.-]+):(.+)", uri)
- if match:
- scheme = match.group(1)
- part = match.group(2)
+
+ if isinstance(uri, str):
+ # make sure scheme has at least 2 letters to avoid windows drives
+ # put - last in the brackets so it refers to the character, not a range
+ # reference on valid schemes: http://tools.ietf.org/html/std66#section-3.1
+ match = re.match(r"^([a-zA-Z][a-zA-Z0-9+.-]+):(.+)", uri)
+ if match:
+ scheme = match.group(1)
+ part = match.group(2)
+ else:
+ # Catch bare UNIX and Windows paths
+ scheme = default
+ part = uri
else:
- # Catch bare UNIX and Windows paths
- scheme = default
+ # We can pass things other than strings, like open files.
+ # These are simply identified as 'data objects'.
+ scheme = "data"
part = uri
+
return scheme, part
@@ -216,7 +229,7 @@ def load_files(filenames, callback, constraints=None):
)
handler_map[handling_format_spec].append(fn)
- # Call each iris format handler with the approriate filenames
+ # Call each iris format handler with the appropriate filenames
for handling_format_spec in sorted(handler_map):
fnames = handler_map[handling_format_spec]
if handling_format_spec.constraint_aware_handler:
@@ -240,6 +253,13 @@ def load_http(urls, callback):
intended interface for loading is :func:`iris.load`.
"""
+ #
+ # NOTE: this routine is *also* called by "load_data_objects", in which case the
+ # 'urls' will actually be 'data objects'.
+ # In principle, however, their scopes are different, so it's just an implementation
+ # detail that right now the same code will do for both.
+ # If that changes sometime, the two routines may go their separate ways.
+
# Create default dict mapping iris format handler to its associated filenames
from iris.fileformats import FORMAT_AGENT
@@ -255,6 +275,26 @@ def load_http(urls, callback):
yield cube
+def load_data_objects(urls, callback):
+ """
+ Takes a list of data-source objects and a callback function, and returns a
+ generator of Cubes.
+ The 'objects' take the place of 'uris' in the load calls.
+ The appropriate types of the data-source objects are expected to be
+ recognised by the handlers : This is done in the usual way by passing the
+ context to the format picker to get a handler for each.
+
+ .. note::
+
+ Typically, this function should not be called directly; instead, the
+ intended interface for loading is :func:`iris.load`.
+
+ """
+ # NOTE: this operation is currently *identical* to the http one. But it seems
+ # sensible to provide a distinct handler function for this scheme.
+ yield from load_http(urls, callback)
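
Combined with the `decode_uri` change above, a data object can flow through
the ordinary load path. A sketch with a hypothetical filename, assuming the
top-level loader dispatches the new 'data' scheme to this routine:

    import iris
    import netCDF4

    ds = netCDF4.Dataset("example.nc")  # hypothetical file
    # decode_uri(ds) -> ('data', ds), which routes to load_data_objects.
    cubes = iris.load(ds)
    ds.close()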
+
+
def _dot_save(cube, target):
# A simple wrapper for `iris.fileformats.dot.save` which allows the
# saver to be registered without triggering the import of
@@ -454,7 +494,7 @@ def save(source, target, saver=None, **kwargs):
# Single cube?
if isinstance(source, Cube):
- saver(source, target, **kwargs)
+ result = saver(source, target, **kwargs)
# CubeList or sequence of cubes?
elif isinstance(source, CubeList) or (
@@ -477,9 +517,13 @@ def save(source, target, saver=None, **kwargs):
if i != 0:
kwargs["append"] = True
saver(cube, target, **kwargs)
+
+ result = None
# Netcdf saver.
else:
- saver(source, target, **kwargs)
+ result = saver(source, target, **kwargs)
else:
raise ValueError("Cannot save; non Cube found in source")
+
+ return result
diff --git a/lib/iris/io/format_picker.py b/lib/iris/io/format_picker.py
index a8e333c566..9def0ada98 100644
--- a/lib/iris/io/format_picker.py
+++ b/lib/iris/io/format_picker.py
@@ -331,3 +331,22 @@ def get_element(self, basename, file_handle):
from iris.io import decode_uri
return decode_uri(basename)[0]
+
+
+class DataSourceObjectProtocol(FileElement):
+ """
+ A :class:`FileElement` that simply returns the URI entry itself.
+
+    This enables an arbitrary non-string data object to be passed, subject to
+ subsequent checks on the object itself (specified in the handler).
+
+ """
+
+ def __init__(self):
+ super().__init__(requires_fh=False)
+
+ def get_element(self, basename, file_handle):
+ # In this context, there should *not* be a file opened by the handler.
+ # Just return 'basename', which in this case is not a name, or even a
+ # string, but a passed 'data object'.
+ return basename
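
A hedged sketch of how a handler might register against this element. The
spec name, duck-type test and handler below are illustrative assumptions (and
assume the picker accepts a callable as the match value), not necessarily the
registration this changeset actually adds:

    from iris.fileformats import FORMAT_AGENT
    from iris.io.format_picker import (
        DataSourceObjectProtocol,
        FormatSpecification,
    )

    def my_load_cubes(objects, callback):  # hypothetical handler
        ...

    FORMAT_AGENT.add_spec(
        FormatSpecification(
            "NetCDF dataset",
            DataSourceObjectProtocol(),
            # Duck-type "an open dataset" on a 'variables' attribute.
            lambda obj: hasattr(obj, "variables"),
            handler=my_load_cubes,
            priority=4,
        )
    )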
diff --git a/lib/iris/iterate.py b/lib/iris/iterate.py
index d6bac77d3b..cf16c9cbe6 100644
--- a/lib/iris/iterate.py
+++ b/lib/iris/iterate.py
@@ -233,7 +233,7 @@ def __init__(self, cubes, requested_dims_by_cube, ordered, coords_by_cube):
break
# If a coordinate with an equivalent definition (i.e. same
# metadata) is not found in the master_dimensioned_coord_list,
- # add the coords assocaited with the dimension to the list,
+ # add the coords associated with the dimension to the list,
# add the size of the dimension to the master_dims_index and
# store the offset.
if not found:
diff --git a/lib/iris/palette.py b/lib/iris/palette.py
index a1c0a1e878..3ba17ffc97 100644
--- a/lib/iris/palette.py
+++ b/lib/iris/palette.py
@@ -57,7 +57,7 @@ def is_brewer(cmap):
def _default_cmap_norm(args, kwargs):
"""
- This function injects default cmap and norm behavour into the keyword
+ This function injects default cmap and norm behaviour into the keyword
arguments, based on the cube referenced within the positional arguments.
"""
cube = None
diff --git a/lib/iris/pandas.py b/lib/iris/pandas.py
index 417b6b11de..4d6681e94e 100644
--- a/lib/iris/pandas.py
+++ b/lib/iris/pandas.py
@@ -159,6 +159,8 @@ def as_cube(
as_cube(series, calendars={0: cf_units.CALENDAR_360_DAY})
as_cube(data_frame, calendars={1: cf_units.CALENDAR_STANDARD})
+ Since this function converts to/from a Pandas object, laziness will not be preserved.
+
"""
message = (
"iris.pandas.as_cube has been deprecated, and will be removed in a "
@@ -170,7 +172,7 @@ def as_cube(
if pandas_array.ndim not in [1, 2]:
raise ValueError(
"Only 1D or 2D Pandas arrays "
- "can currently be conveted to Iris cubes."
+ "can currently be converted to Iris cubes."
)
# Make the copy work consistently across NumPy 1.6 and 1.7.
@@ -240,6 +242,8 @@ def as_cubes(
:class:`dask.dataframe.DataFrame`\\ s are not supported.
+ Since this function converts to/from a Pandas object, laziness will not be preserved.
+
Examples
--------
>>> from iris.pandas import as_cubes
@@ -341,12 +345,13 @@ def as_cubes(
... var_name="longitude",
... value_name="air_temperature"
... )
+ >>> my_df["longitude"] = my_df["longitude"].infer_objects()
>>> print(my_df)
- latitude longitude air_temperature
- 0 35 0 300
- 1 25 0 301
- 2 35 10 302
- 3 25 10 303
+ latitude longitude air_temperature
+ 0 35 0 300
+ 1 25 0 301
+ 2 35 10 302
+ 3 25 10 303
>>> my_df = my_df.set_index(["latitude", "longitude"])
>>> my_df = my_df.sort_index()
>>> converted_cube = as_cubes(my_df)[0]
@@ -599,6 +604,10 @@ def as_series(cube, copy=True):
If you have a large array that cannot be copied,
make sure it is not masked and use copy=False.
+ Notes
+ ------
+ Since this function converts to/from a Pandas object, laziness will not be preserved.
+
"""
message = (
"iris.pandas.as_series has been deprecated, and will be removed in a "
@@ -809,6 +818,10 @@ def as_data_frame(
419903 298.995148
Name: surface_temperature, Length: 419904, dtype: float32
+ Notes
+ ------
+ Since this function converts to/from a Pandas object, laziness will not be preserved.
+
"""
def merge_metadata(meta_var_list):
diff --git a/lib/iris/plot.py b/lib/iris/plot.py
index 8cd849b716..d319c1361b 100644
--- a/lib/iris/plot.py
+++ b/lib/iris/plot.py
@@ -904,7 +904,7 @@ def _replace_axes_with_cartopy_axes(cartopy_proj):
ax = plt.gca()
if not isinstance(ax, cartopy.mpl.geoaxes.GeoAxes):
- fig = plt.gcf()
+ fig = ax.get_figure()
if isinstance(ax, matplotlib.axes.SubplotBase):
_ = fig.add_subplot(
ax.get_subplotspec(),
@@ -1112,6 +1112,11 @@ def contour(cube, *args, **kwargs):
See :func:`matplotlib.pyplot.contour` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
result = _draw_2d_from_points("contour", None, cube, *args, **kwargs)
return result
@@ -1136,6 +1141,11 @@ def contourf(cube, *args, **kwargs):
See :func:`matplotlib.pyplot.contourf` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
coords = kwargs.get("coords")
kwargs.setdefault("antialiased", True)
@@ -1200,6 +1210,11 @@ def default_projection(cube):
import matplotlib.pyplot as plt
    ax = plt.axes(projection=default_projection(cube))
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
# XXX logic seems flawed, but it is what map_setup did...
cs = cube.coord_system("CoordSystem")
@@ -1218,6 +1233,11 @@ def default_projection_extent(cube, mode=iris.coords.POINT_MODE):
points, or the limits of the cell's bounds.
The default is iris.coords.POINT_MODE.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
extents = cartography._xy_range(cube, mode)
xlim = extents[0]
@@ -1255,7 +1275,13 @@ def _fill_orography(cube, coords, mode, vert_plot, horiz_plot, style_args):
def orography_at_bounds(cube, facecolor="#888888", coords=None, axes=None):
- """Plots orography defined at cell boundaries from the given Cube."""
+ """Plots orography defined at cell boundaries from the given Cube.
+
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+ """
# XXX Needs contiguous orography corners to work.
raise NotImplementedError(
@@ -1288,7 +1314,13 @@ def horiz_plot(v_coord, orography, style_args):
def orography_at_points(cube, facecolor="#888888", coords=None, axes=None):
- """Plots orography defined at sample points from the given Cube."""
+ """Plots orography defined at sample points from the given Cube.
+
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+ """
style_args = {"facecolor": facecolor}
@@ -1334,6 +1366,11 @@ def outline(cube, coords=None, color="k", linewidth=None, axes=None):
The axes to use for drawing. Defaults to the current axes if none
provided.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
result = _draw_2d_from_bounds(
"pcolormesh",
@@ -1376,6 +1413,11 @@ def pcolor(cube, *args, **kwargs):
See :func:`matplotlib.pyplot.pcolor` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
kwargs.setdefault("antialiased", True)
kwargs.setdefault("snap", False)
@@ -1410,6 +1452,11 @@ def pcolormesh(cube, *args, **kwargs):
See :func:`matplotlib.pyplot.pcolormesh` for details of other
valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
result = _draw_2d_from_bounds("pcolormesh", cube, *args, **kwargs)
return result
@@ -1435,6 +1482,11 @@ def points(cube, *args, **kwargs):
See :func:`matplotlib.pyplot.scatter` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
def _scatter_args(u, v, data, *args, **kwargs):
@@ -1526,6 +1578,11 @@ def barbs(u_cube, v_cube, *args, **kwargs):
See :func:`matplotlib.pyplot.barbs` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
#
# TODO: check u + v cubes for compatibility.
@@ -1576,6 +1633,11 @@ def quiver(u_cube, v_cube, *args, **kwargs):
See :func:`matplotlib.pyplot.quiver` for details of other valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
#
# TODO: check u + v cubes for compatibility.
@@ -1622,6 +1684,11 @@ def plot(*args, **kwargs):
See :func:`matplotlib.pyplot.plot` for details of additional valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
if "coords" in kwargs:
raise TypeError(
@@ -1654,6 +1721,11 @@ def scatter(x, y, *args, **kwargs):
See :func:`matplotlib.pyplot.scatter` for details of additional
valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
# here we are more specific about argument types than generic 1d plotting
if not isinstance(x, (iris.cube.Cube, iris.coords.Coord)):
@@ -1689,6 +1761,11 @@ def fill_between(x, y1, y2, *args, **kwargs):
See :func:`matplotlib.pyplot.fill_between` for details of additional valid
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
# here we are more specific about argument types than generic 1d plotting
if not isinstance(x, (iris.cube.Cube, iris.coords.Coord)):
@@ -1704,6 +1781,41 @@ def fill_between(x, y1, y2, *args, **kwargs):
)
+def hist(x, *args, **kwargs):
+ """
+ Compute and plot a histogram.
+
+ Args:
+
+ * x:
+ A :class:`~iris.cube.Cube`, :class:`~iris.coords.Coord`,
+ :class:`~iris.coords.CellMeasure`, or :class:`~iris.coords.AncillaryVariable`
+        providing the values used to create the histogram.
+ Note that if a coordinate is given, the points are used, ignoring the
+ bounds.
+
+ See :func:`matplotlib.pyplot.hist` for details of additional valid
+ keyword arguments.
+
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
+ """
+ if isinstance(x, iris.cube.Cube):
+ data = x.data
+ elif isinstance(x, iris.coords._DimensionalMetadata):
+ data = x._values
+ else:
+ raise TypeError(
+ "x must be a cube, coordinate, cell measure or "
+ "ancillary variable."
+ )
+ return plt.hist(data, *args, **kwargs)
+
+
# Provide convenience show method from pyplot
show = plt.show
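
A possible usage sketch for the new `hist` function (illustrative only; assumes
the iris-sample-data package):

    import iris
    import iris.plot as iplt
    import matplotlib.pyplot as plt

    cube = iris.load_cube(iris.sample_data_path("air_temp.pp"))
    # Histogram a coordinate: its points are used and its bounds are ignored.
    iplt.hist(cube.coord("latitude"), bins=10)
    plt.show()
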
@@ -1737,6 +1849,11 @@ def symbols(x, y, symbols, size, axes=None, units="inches"):
* units: ['inches', 'points']
The unit for the symbol size.
+ Notes
+ ------
+ This function does maintain laziness when called; it doesn't realise data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
if axes is None:
axes = plt.gca()
@@ -1862,6 +1979,11 @@ def animate(cube_iterator, plot_func, fig=None, **kwargs):
>>> ani = iplt.animate(cube_iter, qplt.contourf)
>>> iplt.show()
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
kwargs.setdefault("interval", 100)
coords = kwargs.pop("coords", None)
diff --git a/lib/iris/quickplot.py b/lib/iris/quickplot.py
index 6006314265..9209d4b3b7 100644
--- a/lib/iris/quickplot.py
+++ b/lib/iris/quickplot.py
@@ -174,6 +174,11 @@ def contour(cube, *args, **kwargs):
See :func:`iris.plot.contour` for details of valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
coords = kwargs.get("coords")
axes = kwargs.get("axes")
@@ -201,6 +206,10 @@ def contourf(cube, *args, **kwargs):
See :func:`iris.plot.contourf` for details of valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
"""
coords = kwargs.get("coords")
axes = kwargs.get("axes")
@@ -229,6 +238,11 @@ def outline(cube, coords=None, color="k", linewidth=None, axes=None):
The width of the lines showing the cell outlines. If None, the default
width in patch.linewidth in matplotlibrc is used.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
result = iplt.outline(
cube, color=color, linewidth=linewidth, coords=coords, axes=axes
@@ -244,6 +258,10 @@ def pcolor(cube, *args, **kwargs):
See :func:`iris.plot.pcolor` for details of valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
"""
coords = kwargs.get("coords")
axes = kwargs.get("axes")
@@ -258,6 +276,11 @@ def pcolormesh(cube, *args, **kwargs):
See :func:`iris.plot.pcolormesh` for details of valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
coords = kwargs.get("coords")
axes = kwargs.get("axes")
@@ -272,6 +295,11 @@ def points(cube, *args, **kwargs):
See :func:`iris.plot.points` for details of valid keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
coords = kwargs.get("coords")
axes = kwargs.get("axes")
@@ -288,6 +316,11 @@ def plot(*args, **kwargs):
See :func:`iris.plot.plot` for details of valid arguments and
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
axes = kwargs.get("axes")
result = iplt.plot(*args, **kwargs)
@@ -303,6 +336,11 @@ def scatter(x, y, *args, **kwargs):
See :func:`iris.plot.scatter` for details of valid arguments and
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+
"""
axes = kwargs.get("axes")
result = iplt.scatter(x, y, *args, **kwargs)
@@ -317,6 +355,10 @@ def fill_between(x, y1, y2, *args, **kwargs):
See :func:`iris.plot.fill_between` for details of valid arguments and
keyword arguments.
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
"""
axes = kwargs.get("axes")
result = iplt.fill_between(x, y1, y2, *args, **kwargs)
@@ -324,5 +366,35 @@ def fill_between(x, y1, y2, *args, **kwargs):
return result
+def hist(x, *args, **kwargs):
+ """
+ Compute and plot a labelled histogram.
+
+ See :func:`iris.plot.hist` for details of valid arguments and
+ keyword arguments.
+
+ Notes
+ ------
+ This function does not maintain laziness when called; it realises data.
+ See more at :doc:`/userguide/real_and_lazy_data`.
+ """
+ axes = kwargs.get("axes")
+ result = iplt.hist(x, *args, **kwargs)
+ title = _title(x, with_units=False)
+ label = _title(x, with_units=True)
+
+ if axes is None:
+ axes = plt.gca()
+
+ orientation = kwargs.get("orientation")
+ if orientation == "horizontal":
+ axes.set_ylabel(label)
+ else:
+ axes.set_xlabel(label)
+ axes.set_title(title)
+
+ return result
+
+
# Provide a convenience show method from pyplot.
show = plt.show
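
And correspondingly for the quickplot variant, which additionally sets a title
and a units label (a sketch; assumes the iris-sample-data package):

    import iris
    import iris.quickplot as qplt
    import matplotlib.pyplot as plt

    cube = iris.load_cube(iris.sample_data_path("air_temp.pp"))
    # With orientation="horizontal", the units label goes on the y-axis.
    qplt.hist(cube.coord("latitude"), orientation="horizontal")
    plt.show()
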
diff --git a/lib/iris/tests/experimental/test_raster.py b/lib/iris/tests/experimental/test_raster.py
index c654483bfd..ffd03e6f4d 100644
--- a/lib/iris/tests/experimental/test_raster.py
+++ b/lib/iris/tests/experimental/test_raster.py
@@ -42,7 +42,7 @@ def check_tiff_header(self, tiff_filename, expect_keys, expect_entries):
def check_tiff(self, cube, header_keys, header_items):
# Check that the cube saves correctly to TIFF :
# * the header contains expected keys and (some) values
- # * the data array retrives correctly
+ # * the data array retrieves correctly
import iris.experimental.raster
with self.temp_filename(".tif") as temp_filename:
diff --git a/lib/iris/tests/graphics/__init__.py b/lib/iris/tests/graphics/__init__.py
index 544d989564..5ee555cb6e 100755
--- a/lib/iris/tests/graphics/__init__.py
+++ b/lib/iris/tests/graphics/__init__.py
@@ -113,10 +113,10 @@ def write_repo_json(data: Dict[str, str]) -> None:
string_data = {}
for key, val in data.items():
string_data[key] = str(val)
- with open(IMAGE_REPO_PATH, "wb") as fo:
+ with open(IMAGE_REPO_PATH, "wb") as fout:
json.dump(
string_data,
- codecs.getwriter("utf-8")(fo),
+ codecs.getwriter("utf-8")(fout),
indent=4,
sort_keys=True,
)
diff --git a/lib/iris/tests/integration/concatenate/test_concatenate.py b/lib/iris/tests/integration/concatenate/test_concatenate.py
index 091ecd4378..1f39b2589d 100644
--- a/lib/iris/tests/integration/concatenate/test_concatenate.py
+++ b/lib/iris/tests/integration/concatenate/test_concatenate.py
@@ -16,13 +16,43 @@
import cf_units
import numpy as np
-from iris._concatenate import concatenate
+from iris._concatenate import _DerivedCoordAndDims, concatenate
+import iris.aux_factory
import iris.coords
import iris.cube
import iris.tests.stock as stock
from iris.util import unify_time_units
+class Test_DerivedCoordAndDims:
+ def test_equal(self):
+ assert _DerivedCoordAndDims(
+ "coord", "dims", "aux_factory"
+ ) == _DerivedCoordAndDims("coord", "dims", "aux_factory")
+
+ def test_non_equal_coord(self):
+ assert _DerivedCoordAndDims(
+ "coord_0", "dims", "aux_factory"
+ ) != _DerivedCoordAndDims("coord_1", "dims", "aux_factory")
+
+ def test_non_equal_dims(self):
+ assert _DerivedCoordAndDims(
+ "coord", "dims_0", "aux_factory"
+ ) != _DerivedCoordAndDims("coord", "dims_1", "aux_factory")
+
+ def test_non_equal_aux_factory(self):
+ # Note: aux factories are not taken into account for equality!
+ assert _DerivedCoordAndDims(
+ "coord", "dims", "aux_factory_0"
+ ) == _DerivedCoordAndDims("coord", "dims", "aux_factory_1")
+
+ def test_non_equal_types(self):
+ assert (
+ _DerivedCoordAndDims("coord", "dims", "aux_factory")
+ != "I am not a _DerivedCoordAndDims"
+ )
+
+
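
The equality behaviour pinned down above (aux factories deliberately excluded
from the comparison) is consistent with a namedtuple carrying a custom __eq__,
along these lines; an illustrative sketch, not necessarily the actual Iris
implementation:

    from collections import namedtuple

    class _DerivedCoordAndDims(
        namedtuple("DerivedCoordAndDims", ["coord", "dims", "aux_factory"])
    ):
        def __eq__(self, other):
            # Equality ignores the aux factory, by design.
            equal = hasattr(other, "coord") and hasattr(other, "dims")
            if equal:
                equal = self.coord == other.coord and self.dims == other.dims
            return equal

        # Defining __eq__ disables inherited hashing; restore the tuple one.
        __hash__ = tuple.__hash__
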
class Test_concatenate__epoch(tests.IrisTest):
def simple_1d_time_cubes(self, reftimes, coords_points):
cubes = []
@@ -187,6 +217,127 @@ def test_ignore_diff_ancillary_variables(self):
self.assertEqual(result[0].shape, (4, 2))
+class Test_cubes_with_derived_coord(tests.IrisTest):
+ def create_cube(self):
+ data = np.arange(4).reshape(2, 2)
+ aux_factories = []
+
+ # DimCoords
+ sigma = iris.coords.DimCoord([0.0, 10.0], var_name="sigma", units="1")
+ t_unit = cf_units.Unit(
+ "hours since 1970-01-01 00:00:00", calendar="standard"
+ )
+ time = iris.coords.DimCoord([0, 6], standard_name="time", units=t_unit)
+
+ # AtmosphereSigmaFactory (does not span concatenated dim)
+ ptop = iris.coords.AuxCoord(100.0, var_name="ptop", units="Pa")
+ surface_p = iris.coords.AuxCoord([1.0, 2.0], var_name="ps", units="Pa")
+ aux_factories.append(
+ iris.aux_factory.AtmosphereSigmaFactory(ptop, sigma, surface_p)
+ )
+
+ # HybridHeightFactory (span concatenated dim)
+ delta = iris.coords.AuxCoord(10.0, var_name="delta", units="m")
+ orog = iris.coords.AuxCoord(data, var_name="orog", units="m")
+ aux_factories.append(
+ iris.aux_factory.HybridHeightFactory(delta, sigma, orog)
+ )
+
+ dim_coords_and_dims = [(time, 0), (sigma, 1)]
+ aux_coords_and_dims = [
+ (ptop, ()),
+ (delta, ()),
+ (surface_p, 1),
+ (orog, (0, 1)),
+ ]
+
+ cube = iris.cube.Cube(
+ data,
+ standard_name="air_temperature",
+ units="K",
+ dim_coords_and_dims=dim_coords_and_dims,
+ aux_coords_and_dims=aux_coords_and_dims,
+ aux_factories=aux_factories,
+ )
+ return cube
+
+ def test_equal_derived_coords(self):
+ cube_a = self.create_cube()
+ cube_b = cube_a.copy()
+ cube_b.coord("time").points = [12, 18]
+
+ result = concatenate([cube_a, cube_b])
+ self.assertEqual(len(result), 1)
+ self.assertEqual(result[0].shape, (4, 2))
+
+ np.testing.assert_allclose(
+ result[0].coord("air_pressure").points, [100.0, -880.0]
+ )
+ np.testing.assert_allclose(
+ result[0].coord("altitude").points,
+ [[10.0, 20.0], [10.0, 40.0], [10.0, 20.0], [10.0, 40.0]],
+ )
+
+ def test_equal_derived_coords_with_bounds(self):
+ cube_a = self.create_cube()
+ cube_a.coord("sigma").bounds = [[0.0, 5.0], [5.0, 20.0]]
+ cube_b = cube_a.copy()
+ cube_b.coord("time").points = [12, 18]
+
+ result = concatenate([cube_a, cube_b])
+ self.assertEqual(len(result), 1)
+ self.assertEqual(result[0].shape, (4, 2))
+
+ np.testing.assert_allclose(
+ result[0].coord("air_pressure").bounds,
+ [[100.0, -395.0], [-390.0, -1860.0]],
+ )
+
+ def test_diff_altitude(self):
+ """Gives one cube since altitude spans concatenation dim."""
+ cube_a = self.create_cube()
+ cube_b = cube_a.copy()
+ cube_b.coord("time").points = [12, 18]
+ cube_b.coord("orog").points = [[0, 0], [0, 0]]
+
+ result = concatenate([cube_a, cube_b])
+ self.assertEqual(len(result), 1)
+ self.assertEqual(result[0].shape, (4, 2))
+
+ np.testing.assert_allclose(
+ result[0].coord("altitude").points,
+ [[10.0, 20.0], [10.0, 40.0], [10.0, 10.0], [10.0, 10.0]],
+ )
+
+ def test_diff_air_pressure(self):
+ """Gives two cubes since altitude does not span concatenation dim."""
+ cube_a = self.create_cube()
+ cube_b = cube_a.copy()
+ cube_b.coord("time").points = [12, 18]
+ cube_b.coord("ps").points = [10.0, 20.0]
+
+ result = concatenate([cube_a, cube_b], check_aux_coords=False)
+ self.assertEqual(len(result), 2)
+
+ def test_ignore_diff_air_pressure(self):
+ cube_a = self.create_cube()
+ cube_b = cube_a.copy()
+ cube_b.coord("time").points = [12, 18]
+ cube_b.coord("ps").points = [10.0, 20.0]
+
+ result = concatenate(
+ [cube_a, cube_b],
+ check_aux_coords=False,
+ check_derived_coords=False,
+ )
+ self.assertEqual(len(result), 1)
+ self.assertEqual(result[0].shape, (4, 2))
+
+ np.testing.assert_allclose(
+ result[0].coord("air_pressure").points, [100.0, -880.0]
+ )
+
+
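
In summary: derived coordinates that span the concatenation axis may differ
between the input cubes, while those that do not span it must match unless the
comparison is explicitly skipped. At the public API level that looks roughly
like the following (a sketch, assuming `CubeList.concatenate` forwards the same
keywords as `iris._concatenate.concatenate`):

    from iris.cube import CubeList

    joined = CubeList([cube_a, cube_b]).concatenate(
        check_aux_coords=False,
        check_derived_coords=False,  # skip the derived-coordinate comparison
    )
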
class Test_anonymous_dims(tests.IrisTest):
def setUp(self):
data = np.arange(12).reshape(2, 3, 2)
diff --git a/lib/iris/tests/integration/fast_load/test_fast_load.py b/lib/iris/tests/integration/fast_load/test_fast_load.py
index a510ef7257..318292615b 100644
--- a/lib/iris/tests/integration/fast_load/test_fast_load.py
+++ b/lib/iris/tests/integration/fast_load/test_fast_load.py
@@ -419,7 +419,7 @@ def test_load_raw(self):
expected = CubeList(fldset_1 + fldset_2)
else:
# 'Raw' cubes have combined (vector) times within each file.
- # The 'other' phenomenon appears seperately.
+ # The 'other' phenomenon appears separately.
expected = CubeList(
[
CubeList(fldset_1[:2]).merge_cube(),
diff --git a/lib/iris/tests/runner/__init__.py b/lib/iris/tests/integration/netcdf/__init__.py
similarity index 75%
rename from lib/iris/tests/runner/__init__.py
rename to lib/iris/tests/integration/netcdf/__init__.py
index b561e1cf87..f500b52520 100644
--- a/lib/iris/tests/runner/__init__.py
+++ b/lib/iris/tests/integration/netcdf/__init__.py
@@ -3,7 +3,4 @@
# This file is part of Iris and is released under the LGPL license.
# See COPYING and COPYING.LESSER in the root of the repository for full
# licensing details.
-"""
-Empty file to allow import.
-
-"""
+"""Integration tests for loading and saving netcdf files."""
diff --git a/lib/iris/tests/integration/netcdf/test__dask_locks.py b/lib/iris/tests/integration/netcdf/test__dask_locks.py
new file mode 100644
index 0000000000..c41af1b356
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test__dask_locks.py
@@ -0,0 +1,115 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Integration tests for the :mod:`iris.fileformats.netcdf._dask_locks` module.
+
+Note: these integration tests replace any unit testing of this module, because it
+depends entirely on Dask, and even on Dask's implementation details rather than its
+supported, documented API and behaviour.
+So (a) it is essential to check the module's behaviour against actual Dask operation,
+and (b) mock-based testing of the implementation code in isolation would add little
+value.
+"""
+import dask
+import dask.config
+import distributed
+import pytest
+
+from iris.fileformats.netcdf._dask_locks import (
+ DaskSchedulerTypeError,
+ dask_scheduler_is_distributed,
+ get_dask_array_scheduler_type,
+ get_worker_lock,
+)
+
+
+@pytest.fixture(
+ params=[
+ "UnspecifiedScheduler",
+ "ThreadedScheduler",
+ "SingleThreadScheduler",
+ "ProcessScheduler",
+ "DistributedScheduler",
+ ]
+)
+def dask_scheduler(request):
+ # Control Dask to enable a specific scheduler type.
+ sched_typename = request.param
+ if sched_typename == "UnspecifiedScheduler":
+ config_name = None
+ elif sched_typename == "SingleThreadScheduler":
+ config_name = "single-threaded"
+ elif sched_typename == "ThreadedScheduler":
+ config_name = "threads"
+ elif sched_typename == "ProcessScheduler":
+ config_name = "processes"
+ else:
+ assert sched_typename == "DistributedScheduler"
+ config_name = "distributed"
+
+ if config_name == "distributed":
+ _distributed_client = distributed.Client()
+
+ if config_name is None:
+ context = None
+ else:
+ context = dask.config.set(scheduler=config_name)
+ context.__enter__()
+
+ yield sched_typename
+
+ if context:
+ context.__exit__(None, None, None)
+
+ if config_name == "distributed":
+ _distributed_client.close()
+
+
+def test_dask_scheduler_is_distributed(dask_scheduler):
+ result = dask_scheduler_is_distributed()
+ # Should return 'True' only with a distributed scheduler.
+ expected = dask_scheduler == "DistributedScheduler"
+ assert result == expected
+
+
+def test_get_dask_array_scheduler_type(dask_scheduler):
+ result = get_dask_array_scheduler_type()
+ expected = {
+ "UnspecifiedScheduler": "threads",
+ "ThreadedScheduler": "threads",
+ "ProcessScheduler": "processes",
+ "SingleThreadScheduler": "single-threaded",
+ "DistributedScheduler": "distributed",
+ }[dask_scheduler]
+ assert result == expected
+
+
+def test_get_worker_lock(dask_scheduler):
+ test_identity = ""
+ error = None
+ try:
+ result = get_worker_lock(test_identity)
+ except DaskSchedulerTypeError as err:
+ error = err
+ result = None
+
+ if dask_scheduler == "ProcessScheduler":
+ assert result is None
+ assert isinstance(error, DaskSchedulerTypeError)
+ msg = 'scheduler type is "processes", which is not supported'
+ assert msg in error.args[0]
+ else:
+ assert error is None
+ assert result is not None
+ if dask_scheduler == "DistributedScheduler":
+ assert isinstance(result, distributed.Lock)
+ assert result.name == test_identity
+ else:
+            # The low-level lock object has no readily importable class for isinstance checks.
+ assert all(
+ hasattr(result, att)
+ for att in ("acquire", "release", "locked")
+ )
diff --git a/lib/iris/tests/integration/netcdf/test_attributes.py b/lib/iris/tests/integration/netcdf/test_attributes.py
new file mode 100644
index 0000000000..a73d6c7d49
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_attributes.py
@@ -0,0 +1,119 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Integration tests for attribute-related loading and saving netcdf files."""
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests # isort:skip
+
+from contextlib import contextmanager
+from unittest import mock
+
+import iris
+from iris.cube import Cube, CubeList
+from iris.fileformats.netcdf import CF_CONVENTIONS_VERSION
+
+
+class TestUmVersionAttribute(tests.IrisTest):
+ def test_single_saves_as_global(self):
+ cube = Cube(
+ [1.0],
+ standard_name="air_temperature",
+ units="K",
+ attributes={"um_version": "4.3"},
+ )
+ with self.temp_filename(".nc") as nc_path:
+ iris.save(cube, nc_path)
+ self.assertCDL(nc_path)
+
+ def test_multiple_same_saves_as_global(self):
+ cube_a = Cube(
+ [1.0],
+ standard_name="air_temperature",
+ units="K",
+ attributes={"um_version": "4.3"},
+ )
+ cube_b = Cube(
+ [1.0],
+ standard_name="air_pressure",
+ units="hPa",
+ attributes={"um_version": "4.3"},
+ )
+ with self.temp_filename(".nc") as nc_path:
+ iris.save(CubeList([cube_a, cube_b]), nc_path)
+ self.assertCDL(nc_path)
+
+ def test_multiple_different_saves_on_variables(self):
+ cube_a = Cube(
+ [1.0],
+ standard_name="air_temperature",
+ units="K",
+ attributes={"um_version": "4.3"},
+ )
+ cube_b = Cube(
+ [1.0],
+ standard_name="air_pressure",
+ units="hPa",
+ attributes={"um_version": "4.4"},
+ )
+ with self.temp_filename(".nc") as nc_path:
+ iris.save(CubeList([cube_a, cube_b]), nc_path)
+ self.assertCDL(nc_path)
+
+
+@contextmanager
+def _patch_site_configuration():
+ def cf_patch_conventions(conventions):
+ return ", ".join([conventions, "convention1, convention2"])
+
+ def update(config):
+ config["cf_profile"] = mock.Mock(name="cf_profile")
+ config["cf_patch"] = mock.Mock(name="cf_patch")
+ config["cf_patch_conventions"] = cf_patch_conventions
+
+ orig_site_config = iris.site_configuration.copy()
+ update(iris.site_configuration)
+ yield
+ iris.site_configuration = orig_site_config
+
+
+class TestConventionsAttributes(tests.IrisTest):
+ def test_patching_conventions_attribute(self):
+        # Ensure that user-defined conventions are discarded, and that the
+        # conventions patched in via the site configuration are saved and can
+        # be loaded without an exception being raised.
+ cube = Cube(
+ [1.0],
+ standard_name="air_temperature",
+ units="K",
+ attributes={"Conventions": "some user defined conventions"},
+ )
+
+ # Patch the site configuration dictionary.
+ with _patch_site_configuration(), self.temp_filename(".nc") as nc_path:
+ iris.save(cube, nc_path)
+ res = iris.load_cube(nc_path)
+
+ self.assertEqual(
+ res.attributes["Conventions"],
+ "{}, {}, {}".format(
+ CF_CONVENTIONS_VERSION, "convention1", "convention2"
+ ),
+ )
+
+
+class TestStandardName(tests.IrisTest):
+ def test_standard_name_roundtrip(self):
+ standard_name = "air_temperature detection_minimum"
+ cube = iris.cube.Cube(1, standard_name=standard_name)
+ with self.temp_filename(suffix=".nc") as fout:
+ iris.save(cube, fout)
+ detection_limit_cube = iris.load_cube(fout)
+ self.assertEqual(detection_limit_cube.standard_name, standard_name)
+
+
+if __name__ == "__main__":
+ tests.main()
diff --git a/lib/iris/tests/integration/netcdf/test_aux_factories.py b/lib/iris/tests/integration/netcdf/test_aux_factories.py
new file mode 100644
index 0000000000..d89f275336
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_aux_factories.py
@@ -0,0 +1,160 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Integration tests for aux-factory-related loading and saving netcdf files."""
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests # isort:skip
+
+import iris
+from iris.tests import stock as stock
+
+
+@tests.skip_data
+class TestAtmosphereSigma(tests.IrisTest):
+ def setUp(self):
+        # Modify stock cube so it is suitable to have an atmosphere sigma
+        # factory added to it.
+ cube = stock.realistic_4d_no_derived()
+ cube.coord("surface_altitude").rename("surface_air_pressure")
+ cube.coord("surface_air_pressure").units = "Pa"
+ cube.coord("sigma").units = "1"
+ ptop_coord = iris.coords.AuxCoord(1000.0, var_name="ptop", units="Pa")
+ cube.add_aux_coord(ptop_coord, ())
+ cube.remove_coord("level_height")
+ # Construct and add atmosphere sigma factory.
+ factory = iris.aux_factory.AtmosphereSigmaFactory(
+ cube.coord("ptop"),
+ cube.coord("sigma"),
+ cube.coord("surface_air_pressure"),
+ )
+ cube.add_aux_factory(factory)
+ self.cube = cube
+
+ def test_save(self):
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(self.cube, filename)
+ self.assertCDL(filename)
+
+ def test_save_load_loop(self):
+ # Ensure that the AtmosphereSigmaFactory is automatically loaded
+ # when loading the file.
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(self.cube, filename)
+ cube = iris.load_cube(filename, "air_potential_temperature")
+ assert cube.coords("air_pressure")
+
+
+@tests.skip_data
+class TestHybridPressure(tests.IrisTest):
+ def setUp(self):
+ # Modify stock cube so it is suitable to have a
+ # hybrid pressure factory added to it.
+ cube = stock.realistic_4d_no_derived()
+ cube.coord("surface_altitude").rename("surface_air_pressure")
+ cube.coord("surface_air_pressure").units = "Pa"
+ cube.coord("level_height").rename("level_pressure")
+ cube.coord("level_pressure").units = "Pa"
+ # Construct and add hybrid pressure factory.
+ factory = iris.aux_factory.HybridPressureFactory(
+ cube.coord("level_pressure"),
+ cube.coord("sigma"),
+ cube.coord("surface_air_pressure"),
+ )
+ cube.add_aux_factory(factory)
+ self.cube = cube
+
+ def test_save(self):
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(self.cube, filename)
+ self.assertCDL(filename)
+
+ def test_save_load_loop(self):
+        # Tests an issue where the variable names in the formula terms
+        # changed to the corresponding standard_names when loading a
+        # previously saved cube.
+ with self.temp_filename(suffix=".nc") as filename, self.temp_filename(
+ suffix=".nc"
+ ) as other_filename:
+ iris.save(self.cube, filename)
+ cube = iris.load_cube(filename, "air_potential_temperature")
+ iris.save(cube, other_filename)
+ other_cube = iris.load_cube(
+ other_filename, "air_potential_temperature"
+ )
+ self.assertEqual(cube, other_cube)
+
+
+@tests.skip_data
+class TestSaveMultipleAuxFactories(tests.IrisTest):
+ def test_hybrid_height_and_pressure(self):
+ cube = stock.realistic_4d()
+ cube.add_aux_coord(
+ iris.coords.DimCoord(
+ 1200.0, long_name="level_pressure", units="hPa"
+ )
+ )
+ cube.add_aux_coord(
+ iris.coords.DimCoord(0.5, long_name="other sigma", units="1")
+ )
+ cube.add_aux_coord(
+ iris.coords.DimCoord(
+ 1000.0, long_name="surface_air_pressure", units="hPa"
+ )
+ )
+ factory = iris.aux_factory.HybridPressureFactory(
+ cube.coord("level_pressure"),
+ cube.coord("other sigma"),
+ cube.coord("surface_air_pressure"),
+ )
+ cube.add_aux_factory(factory)
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(cube, filename)
+ self.assertCDL(filename)
+
+ def test_shared_primary(self):
+ cube = stock.realistic_4d()
+ factory = iris.aux_factory.HybridHeightFactory(
+ cube.coord("level_height"),
+ cube.coord("sigma"),
+ cube.coord("surface_altitude"),
+ )
+ factory.rename("another altitude")
+ cube.add_aux_factory(factory)
+ with self.temp_filename(
+ suffix=".nc"
+ ) as filename, self.assertRaisesRegex(
+ ValueError, "multiple aux factories"
+ ):
+ iris.save(cube, filename)
+
+ def test_hybrid_height_cubes(self):
+ hh1 = stock.simple_4d_with_hybrid_height()
+ hh1.attributes["cube"] = "hh1"
+ hh2 = stock.simple_4d_with_hybrid_height()
+ hh2.attributes["cube"] = "hh2"
+ sa = hh2.coord("surface_altitude")
+ sa.points = sa.points * 10
+ with self.temp_filename(".nc") as fname:
+ iris.save([hh1, hh2], fname)
+ cubes = iris.load(fname, "air_temperature")
+ cubes = sorted(cubes, key=lambda cube: cube.attributes["cube"])
+ self.assertCML(cubes)
+
+ def test_hybrid_height_cubes_on_dimension_coordinate(self):
+ hh1 = stock.hybrid_height()
+ hh2 = stock.hybrid_height()
+ sa = hh2.coord("surface_altitude")
+ sa.points = sa.points * 10
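+        # N.B. "dimensonless" (sic) deliberately matches the spelling in the
+        # error message raised by Iris.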
+ emsg = "Unable to create dimensonless vertical coordinate."
+ with self.temp_filename(".nc") as fname, self.assertRaisesRegex(
+ ValueError, emsg
+ ):
+ iris.save([hh1, hh2], fname)
+
+
+if __name__ == "__main__":
+ tests.main()
diff --git a/lib/iris/tests/integration/netcdf/test_coord_systems.py b/lib/iris/tests/integration/netcdf/test_coord_systems.py
new file mode 100644
index 0000000000..3175664b4c
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_coord_systems.py
@@ -0,0 +1,281 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Integration tests for coord-system-related loading and saving netcdf files."""
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests # isort:skip
+
+from os.path import join as path_join
+import shutil
+import tempfile
+
+import iris
+from iris.coords import DimCoord
+from iris.cube import Cube
+from iris.tests import stock as stock
+from iris.tests.stock.netcdf import ncgen_from_cdl
+from iris.tests.unit.fileformats.netcdf.loader import test_load_cubes as tlc
+
+
+@tests.skip_data
+class TestCoordSystem(tests.IrisTest):
+ def setUp(self):
+ tlc.setUpModule()
+
+ def tearDown(self):
+ tlc.tearDownModule()
+
+ def test_load_laea_grid(self):
+ cube = iris.load_cube(
+ tests.get_data_path(
+ ("NetCDF", "lambert_azimuthal_equal_area", "euro_air_temp.nc")
+ )
+ )
+ self.assertCML(cube, ("netcdf", "netcdf_laea.cml"))
+
+ datum_cf_var_cdl = """
+ netcdf output {
+ dimensions:
+ y = 4 ;
+ x = 3 ;
+ variables:
+ float data(y, x) ;
+ data :standard_name = "toa_brightness_temperature" ;
+ data :units = "K" ;
+ data :grid_mapping = "mercator" ;
+ int mercator ;
+ mercator:grid_mapping_name = "mercator" ;
+ mercator:longitude_of_prime_meridian = 0. ;
+ mercator:earth_radius = 6378169. ;
+ mercator:horizontal_datum_name = "OSGB36" ;
+ float y(y) ;
+ y:axis = "Y" ;
+ y:units = "m" ;
+ y:standard_name = "projection_y_coordinate" ;
+ float x(x) ;
+ x:axis = "X" ;
+ x:units = "m" ;
+ x:standard_name = "projection_x_coordinate" ;
+
+ // global attributes:
+ :Conventions = "CF-1.7" ;
+ :standard_name_vocabulary = "CF Standard Name Table v27" ;
+
+ data:
+
+ data =
+ 0, 1, 2,
+ 3, 4, 5,
+ 6, 7, 8,
+ 9, 10, 11 ;
+
+ mercator = _ ;
+
+ y = 1, 2, 3, 5 ;
+
+ x = -6, -4, -2 ;
+
+ }
+ """
+
+ datum_wkt_cdl = """
+netcdf output5 {
+dimensions:
+ y = 4 ;
+ x = 3 ;
+variables:
+ float data(y, x) ;
+ data :standard_name = "toa_brightness_temperature" ;
+ data :units = "K" ;
+ data :grid_mapping = "mercator" ;
+ int mercator ;
+ mercator:grid_mapping_name = "mercator" ;
+ mercator:longitude_of_prime_meridian = 0. ;
+ mercator:earth_radius = 6378169. ;
+ mercator:longitude_of_projection_origin = 0. ;
+ mercator:false_easting = 0. ;
+ mercator:false_northing = 0. ;
+ mercator:scale_factor_at_projection_origin = 1. ;
+ mercator:crs_wkt = "PROJCRS[\\"unknown\\",BASEGEOGCRS[\\"unknown\\",DATUM[\\"OSGB36\\",ELLIPSOID[\\"unknown\\",6378169,0,LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]]],PRIMEM[\\"Greenwich\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8901]]],CONVERSION[\\"unknown\\",METHOD[\\"Mercator (variant B)\\",ID[\\"EPSG\\",9805]],PARAMETER[\\"Latitude of 1st standard parallel\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8823]],PARAMETER[\\"Longitude of natural origin\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8802]],PARAMETER[\\"False easting\\",0,LENGTHUNIT[\\"metre\\",1],ID[\\"EPSG\\",8806]],PARAMETER[\\"False northing\\",0,LENGTHUNIT[\\"metre\\",1],ID[\\"EPSG\\",8807]]],CS[Cartesian,2],AXIS[\\"(E)\\",east,ORDER[1],LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]],AXIS[\\"(N)\\",north,ORDER[2],LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]]]" ;
+ float y(y) ;
+ y:axis = "Y" ;
+ y:units = "m" ;
+ y:standard_name = "projection_y_coordinate" ;
+ float x(x) ;
+ x:axis = "X" ;
+ x:units = "m" ;
+ x:standard_name = "projection_x_coordinate" ;
+
+// global attributes:
+ :standard_name_vocabulary = "CF Standard Name Table v27" ;
+ :Conventions = "CF-1.7" ;
+data:
+
+ data =
+ 0, 1, 2,
+ 3, 4, 5,
+ 6, 7, 8,
+ 9, 10, 11 ;
+
+ mercator = _ ;
+
+ y = 1, 2, 3, 5 ;
+
+ x = -6, -4, -2 ;
+}
+ """
+
+ def test_load_datum_wkt(self):
+ expected = "OSGB 1936"
+ nc_path = tlc.cdl_to_nc(self.datum_wkt_cdl)
+ with iris.FUTURE.context(datum_support=True):
+ cube = iris.load_cube(nc_path)
+ test_crs = cube.coord("projection_y_coordinate").coord_system
+ actual = str(test_crs.as_cartopy_crs().datum)
+ self.assertMultiLineEqual(expected, actual)
+
+ def test_no_load_datum_wkt(self):
+ nc_path = tlc.cdl_to_nc(self.datum_wkt_cdl)
+ with self.assertWarnsRegex(FutureWarning, "iris.FUTURE.datum_support"):
+ cube = iris.load_cube(nc_path)
+ test_crs = cube.coord("projection_y_coordinate").coord_system
+ actual = str(test_crs.as_cartopy_crs().datum)
+ self.assertMultiLineEqual(actual, "unknown")
+
+ def test_load_datum_cf_var(self):
+ expected = "OSGB 1936"
+ nc_path = tlc.cdl_to_nc(self.datum_cf_var_cdl)
+ with iris.FUTURE.context(datum_support=True):
+ cube = iris.load_cube(nc_path)
+ test_crs = cube.coord("projection_y_coordinate").coord_system
+ actual = str(test_crs.as_cartopy_crs().datum)
+ self.assertMultiLineEqual(expected, actual)
+
+ def test_no_load_datum_cf_var(self):
+ nc_path = tlc.cdl_to_nc(self.datum_cf_var_cdl)
+ with self.assertWarnsRegex(FutureWarning, "iris.FUTURE.datum_support"):
+ cube = iris.load_cube(nc_path)
+ test_crs = cube.coord("projection_y_coordinate").coord_system
+ actual = str(test_crs.as_cartopy_crs().datum)
+ self.assertMultiLineEqual(actual, "unknown")
+
+ def test_save_datum(self):
+ expected = "OSGB 1936"
+ saved_crs = iris.coord_systems.Mercator(
+ ellipsoid=iris.coord_systems.GeogCS.from_datum("OSGB36")
+ )
+
+ base_cube = stock.realistic_3d()
+ base_lat_coord = base_cube.coord("grid_latitude")
+ test_lat_coord = DimCoord(
+ base_lat_coord.points,
+ standard_name="projection_y_coordinate",
+ coord_system=saved_crs,
+ )
+ base_lon_coord = base_cube.coord("grid_longitude")
+ test_lon_coord = DimCoord(
+ base_lon_coord.points,
+ standard_name="projection_x_coordinate",
+ coord_system=saved_crs,
+ )
+ test_cube = Cube(
+ base_cube.data,
+ standard_name=base_cube.standard_name,
+ units=base_cube.units,
+ dim_coords_and_dims=(
+ (base_cube.coord("time"), 0),
+ (test_lat_coord, 1),
+ (test_lon_coord, 2),
+ ),
+ )
+
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(test_cube, filename)
+ with iris.FUTURE.context(datum_support=True):
+ cube = iris.load_cube(filename)
+
+ test_crs = cube.coord("projection_y_coordinate").coord_system
+ actual = str(test_crs.as_cartopy_crs().datum)
+ self.assertMultiLineEqual(expected, actual)
+
+
+class TestLoadMinimalGeostationary(tests.IrisTest):
+ """
+ Check we can load data with a geostationary grid-mapping, even when the
+    'false_easting' and 'false_northing' properties are missing.
+
+ """
+
+ _geostationary_problem_cdl = """
+netcdf geostationary_problem_case {
+dimensions:
+ y = 2 ;
+ x = 3 ;
+variables:
+ short radiance(y, x) ;
+ radiance:standard_name = "toa_outgoing_radiance_per_unit_wavelength" ;
+ radiance:units = "W m-2 sr-1 um-1" ;
+ radiance:coordinates = "y x" ;
+ radiance:grid_mapping = "imager_grid_mapping" ;
+ short y(y) ;
+ y:units = "rad" ;
+ y:axis = "Y" ;
+ y:long_name = "fixed grid projection y-coordinate" ;
+ y:standard_name = "projection_y_coordinate" ;
+ short x(x) ;
+ x:units = "rad" ;
+ x:axis = "X" ;
+ x:long_name = "fixed grid projection x-coordinate" ;
+ x:standard_name = "projection_x_coordinate" ;
+ int imager_grid_mapping ;
+ imager_grid_mapping:grid_mapping_name = "geostationary" ;
+ imager_grid_mapping:perspective_point_height = 35786023. ;
+ imager_grid_mapping:semi_major_axis = 6378137. ;
+ imager_grid_mapping:semi_minor_axis = 6356752.31414 ;
+ imager_grid_mapping:latitude_of_projection_origin = 0. ;
+ imager_grid_mapping:longitude_of_projection_origin = -75. ;
+ imager_grid_mapping:sweep_angle_axis = "x" ;
+
+data:
+
+ // coord values, just so these can be dim-coords
+ y = 0, 1 ;
+ x = 0, 1, 2 ;
+
+}
+"""
+
+ @classmethod
+ def setUpClass(cls):
+ # Create a temp directory for transient test files.
+ cls.temp_dir = tempfile.mkdtemp()
+ cls.path_test_cdl = path_join(cls.temp_dir, "geos_problem.cdl")
+ cls.path_test_nc = path_join(cls.temp_dir, "geos_problem.nc")
+ # Create reference CDL and netcdf files from the CDL text.
+ ncgen_from_cdl(
+ cdl_str=cls._geostationary_problem_cdl,
+ cdl_path=cls.path_test_cdl,
+ nc_path=cls.path_test_nc,
+ )
+
+ @classmethod
+ def tearDownClass(cls):
+ # Destroy the temp directory.
+ shutil.rmtree(cls.temp_dir)
+
+ def test_geostationary_no_false_offsets(self):
+        # Check we can load the test data, and that the coordinate system
+        # properties are correct.
+        cube = iris.load_cube(self.path_test_nc)
+        # Check the coordinate system has the correct default properties.
+ cs = cube.coord_system()
+ self.assertIsInstance(cs, iris.coord_systems.Geostationary)
+ self.assertEqual(cs.false_easting, 0.0)
+ self.assertEqual(cs.false_northing, 0.0)
+
+
+if __name__ == "__main__":
+ tests.main()
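
The datum-handling pattern these tests exercise, in brief (a sketch;
"some_file.nc" is a placeholder path):

    import iris

    # Without the FUTURE flag, any datum in the file is ignored and a
    # FutureWarning is issued; with it, the datum is applied to the
    # coordinate system.
    with iris.FUTURE.context(datum_support=True):
        cube = iris.load_cube("some_file.nc")
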
diff --git a/lib/iris/tests/integration/netcdf/test_delayed_save.py b/lib/iris/tests/integration/netcdf/test_delayed_save.py
new file mode 100644
index 0000000000..616feb3b0e
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_delayed_save.py
@@ -0,0 +1,339 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Integration tests for delayed saving.
+"""
+import warnings
+
+from cf_units import Unit
+import dask.array as da
+import dask.config
+from dask.delayed import Delayed
+import distributed
+import numpy as np
+import pytest
+
+import iris
+from iris.fileformats.netcdf._thread_safe_nc import default_fillvals
+from iris.fileformats.netcdf.saver import SaverFillValueWarning
+import iris.tests
+from iris.tests.stock import realistic_4d
+
+
+class Test__lazy_stream_data:
+ @pytest.fixture(autouse=True)
+ def output_path(self, tmp_path):
+ # A temporary output netcdf-file path, **unique to each test call**.
+ self.temp_output_filepath = tmp_path / "tmp.nc"
+ yield self.temp_output_filepath
+
+ @pytest.fixture(autouse=True, scope="module")
+ def all_vars_lazy(self):
+ # For the operation of these tests, we want to force all netcdf variables
+ # to load as lazy data, i.e. **don't** use real data for 'small' ones.
+ old_value = iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES
+ iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES = 0
+ yield
+ iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES = old_value
+
+ @staticmethod
+ @pytest.fixture(params=[False, True], ids=["SaveImmediate", "SaveDelayed"])
+ def save_is_delayed(request):
+ return request.param
+
+ @staticmethod
+ def make_testcube(
+ include_lazy_content=True,
+ ensure_fillvalue_collision=False,
+ data_is_maskedbytes=False,
+ include_extra_coordlikes=False,
+ ):
+ cube = realistic_4d()
+
+ def fix_array(array):
+ """
+ Make a new, custom array to replace the provided cube/coord data.
+ Optionally provide default-fill-value collisions, and/or replace with lazy
+ content.
+ """
+ if array is not None:
+ if data_is_maskedbytes:
+ dmin, dmax = 0, 255
+ else:
+ dmin, dmax = array.min(), array.max()
+ array = np.random.uniform(dmin, dmax, size=array.shape)
+
+ if data_is_maskedbytes:
+ array = array.astype("u1")
+ array = np.ma.masked_array(array)
+ # To trigger, it must also have at least one *masked point*.
+ array[tuple([0] * array.ndim)] = np.ma.masked
+
+ if ensure_fillvalue_collision:
+ # Set point at midpoint index = default-fill-value
+ fill_value = default_fillvals[array.dtype.str[1:]]
+ inds = tuple(dim // 2 for dim in array.shape)
+ array[inds] = fill_value
+
+ if include_lazy_content:
+ # Make the array lazy.
+ # Ensure we always have multiple chunks (relatively small ones).
+ chunks = list(array.shape)
+ chunks[0] = 1
+ array = da.from_array(array, chunks=chunks)
+
+ return array
+
+ # Replace the cube data, and one aux-coord, according to the control settings.
+ cube.data = fix_array(cube.data)
+ auxcoord = cube.coord("surface_altitude")
+ auxcoord.points = fix_array(auxcoord.points)
+
+ if include_extra_coordlikes:
+ # Also concoct + attach an ancillary variable and a cell-measure, so we can
+ # check that they behave the same as coordinates.
+ ancil_dims = [0, 2]
+ cm_dims = [0, 3]
+ ancil_shape = [cube.shape[idim] for idim in ancil_dims]
+ cm_shape = [cube.shape[idim] for idim in cm_dims]
+ from iris.coords import AncillaryVariable, CellMeasure
+
+ ancil = AncillaryVariable(
+ fix_array(np.zeros(ancil_shape)), long_name="sample_ancil"
+ )
+ cube.add_ancillary_variable(ancil, ancil_dims)
+ cm = CellMeasure(
+ fix_array(np.zeros(cm_shape)), long_name="sample_cm"
+ )
+ cube.add_cell_measure(cm, cm_dims)
+ return cube
+
+ def test_realfile_loadsave_equivalence(self, save_is_delayed, output_path):
+ input_filepath = iris.tests.get_data_path(
+ ["NetCDF", "global", "xyz_t", "GEMS_CO2_Apr2006.nc"]
+ )
+ original_cubes = iris.load(input_filepath)
+
+ # Preempt some standard changes that an iris save will impose.
+ for cube in original_cubes:
+ if cube.units == Unit("-"):
+ # replace 'unknown unit' with 'no unit'.
+ cube.units = Unit("?")
+ # Fix conventions attribute to what iris.save outputs.
+ cube.attributes["Conventions"] = "CF-1.7"
+
+ original_cubes = sorted(original_cubes, key=lambda cube: cube.name())
+ result = iris.save(
+ original_cubes, output_path, compute=not save_is_delayed
+ )
+ if save_is_delayed:
+ # In this case, must also "complete" the save.
+ result.compute()
+ reloaded_cubes = iris.load(output_path)
+ reloaded_cubes = sorted(reloaded_cubes, key=lambda cube: cube.name())
+ assert reloaded_cubes == original_cubes
+        # NOTE: it might be nicer to use assertCDL, but unfortunately deriving
+        # from unittest.TestCase prevents the use of pytest fixtures.
+
+ @classmethod
+ @pytest.fixture(
+ params=[
+ "ThreadedScheduler",
+ "DistributedScheduler",
+ "SingleThreadScheduler",
+ ]
+ )
+ def scheduler_type(cls, request):
+ sched_typename = request.param
+ if sched_typename == "ThreadedScheduler":
+ config_name = "threads"
+ elif sched_typename == "SingleThreadScheduler":
+ config_name = "single-threaded"
+ else:
+ assert sched_typename == "DistributedScheduler"
+ config_name = "distributed"
+
+ if config_name == "distributed":
+ _distributed_client = distributed.Client()
+
+ with dask.config.set(scheduler=config_name):
+ yield sched_typename
+
+ if config_name == "distributed":
+ _distributed_client.close()
+
+ def test_scheduler_types(
+ self, output_path, scheduler_type, save_is_delayed
+ ):
+ # Check operation works and behaves the same with different schedulers,
+ # especially including distributed.
+
+        # Just check that the dask scheduler is set up as expected.
+ if scheduler_type == "ThreadedScheduler":
+ expected_dask_scheduler = "threads"
+ elif scheduler_type == "SingleThreadScheduler":
+ expected_dask_scheduler = "single-threaded"
+ else:
+ assert scheduler_type == "DistributedScheduler"
+ expected_dask_scheduler = "distributed"
+
+ assert dask.config.get("scheduler") == expected_dask_scheduler
+
+ # Use a testcase that produces delayed warnings (and check those too).
+ cube = self.make_testcube(
+ include_lazy_content=True, ensure_fillvalue_collision=True
+ )
+ with warnings.catch_warnings(record=True) as logged_warnings:
+ result = iris.save(cube, output_path, compute=not save_is_delayed)
+
+ if not save_is_delayed:
+ assert result is None
+ assert len(logged_warnings) == 2
+ issued_warnings = [log.message for log in logged_warnings]
+ else:
+ assert result is not None
+ assert len(logged_warnings) == 0
+ warnings.simplefilter("error")
+ issued_warnings = result.compute()
+
+ assert len(issued_warnings) == 2
+ expected_msg = "contains unmasked data points equal to the fill-value"
+ assert all(
+ expected_msg in warning.args[0] for warning in issued_warnings
+ )
+
+ def test_time_of_writing(
+ self, save_is_delayed, output_path, scheduler_type
+ ):
+ # Check when lazy data is *actually* written :
+ # - in 'immediate' mode, on initial file write
+ # - in 'delayed' mode, only when the delayed-write is computed.
+ original_cube = self.make_testcube(include_extra_coordlikes=True)
+ assert original_cube.has_lazy_data()
+ assert original_cube.coord("surface_altitude").has_lazy_points()
+ assert original_cube.cell_measure("sample_cm").has_lazy_data()
+ assert original_cube.ancillary_variable("sample_ancil").has_lazy_data()
+
+ result = iris.save(
+ original_cube,
+ output_path,
+ compute=not save_is_delayed,
+ )
+ assert save_is_delayed == (result is not None)
+
+        # Read back: NOTE avoid loading the separate surface-altitude cube.
+ readback_cube = iris.load_cube(
+ output_path, "air_potential_temperature"
+ )
+ # Check the components to be tested *are* lazy. See: self.all_vars_lazy().
+ assert readback_cube.has_lazy_data()
+ assert readback_cube.coord("surface_altitude").has_lazy_points()
+ assert readback_cube.cell_measure("sample_cm").has_lazy_data()
+ assert readback_cube.ancillary_variable("sample_ancil").has_lazy_data()
+
+ # If 'delayed', the lazy content should all be masked, otherwise none of it.
+ def getmask(cube_or_coord):
+ cube_or_coord = (
+ cube_or_coord.copy()
+ ) # avoid realising the original
+ if hasattr(cube_or_coord, "points"):
+ data = cube_or_coord.points
+ else:
+ data = cube_or_coord.data
+ return np.ma.getmaskarray(data)
+
+ test_components = [
+ readback_cube,
+ readback_cube.coord("surface_altitude"),
+ readback_cube.ancillary_variable("sample_ancil"),
+ readback_cube.cell_measure("sample_cm"),
+ ]
+
+ def fetch_masks():
+ data_mask, coord_mask, ancil_mask, cm_mask = [
+ getmask(data) for data in test_components
+ ]
+ return data_mask, coord_mask, ancil_mask, cm_mask
+
+ data_mask, coord_mask, ancil_mask, cm_mask = fetch_masks()
+ if save_is_delayed:
+ assert np.all(data_mask)
+ assert np.all(coord_mask)
+ assert np.all(ancil_mask)
+ assert np.all(cm_mask)
+ else:
+ assert np.all(~data_mask)
+ assert np.all(~coord_mask)
+ assert np.all(~ancil_mask)
+ assert np.all(~cm_mask)
+
+ if save_is_delayed:
+ # Complete the write.
+ result.compute()
+
+ # Re-fetch the lazy arrays. The data should now **not be masked**.
+ data_mask, coord_mask, ancil_mask, cm_mask = fetch_masks()
+            # All written now?
+ assert np.all(~data_mask)
+ assert np.all(~coord_mask)
+ assert np.all(~ancil_mask)
+ assert np.all(~cm_mask)
+
+ @pytest.mark.parametrize(
+ "warning_type", ["WarnMaskedBytes", "WarnFillvalueCollision"]
+ )
+ def test_fill_warnings(self, warning_type, output_path, save_is_delayed):
+ # Test collision warnings for data with fill-value collisions, or for masked
+ # byte data.
+ if warning_type == "WarnFillvalueCollision":
+ make_fv_collide = True
+ make_maskedbytes = False
+ expected_msg = (
+ "contains unmasked data points equal to the fill-value"
+ )
+ else:
+ assert warning_type == "WarnMaskedBytes"
+ make_fv_collide = False
+ make_maskedbytes = True
+ expected_msg = "contains byte data with masked points"
+
+ cube = self.make_testcube(
+ include_lazy_content=True,
+ ensure_fillvalue_collision=make_fv_collide,
+ data_is_maskedbytes=make_maskedbytes,
+ )
+ with warnings.catch_warnings(record=True) as logged_warnings:
+ result = iris.save(cube, output_path, compute=not save_is_delayed)
+
+ result_warnings = [
+ log.message
+ for log in logged_warnings
+ if isinstance(log.message, SaverFillValueWarning)
+ ]
+
+ if save_is_delayed:
+ # Should have had *no* fill-warnings in the initial save.
+ assert len(result_warnings) == 0
+ # Complete the operation now
+ with warnings.catch_warnings():
+                # NOTE: warnings should *not* be issued here; instead they are returned.
+ warnings.simplefilter("error", category=SaverFillValueWarning)
+ result_warnings = result.compute()
+
+ # Either way, we should now have 2 similar warnings.
+ assert len(result_warnings) == 2
+ assert all(
+ expected_msg in warning.args[0] for warning in result_warnings
+ )
+
+ def test_no_delayed_writes(self, output_path):
+ # Just check that a delayed save returns a usable 'delayed' object, even when
+ # there is no lazy content = no delayed writes to perform.
+ cube = self.make_testcube(include_lazy_content=False)
+ warnings.simplefilter("error")
+ result = iris.save(cube, output_path, compute=False)
+ assert isinstance(result, Delayed)
+ assert result.compute() == []
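
Taken together, these tests pin down the delayed-save workflow, roughly as
follows (a sketch; "input.nc" and "output.nc" are placeholder paths):

    import iris

    cubes = iris.load("input.nc")
    delayed = iris.save(cubes, "output.nc", compute=False)
    # The file now exists, but its lazy variables are still unwritten;
    # reading them back at this point yields fully masked data.
    save_warnings = delayed.compute()  # perform the deferred writes
    # compute() returns any deferred fill-value warnings instead of issuing them.
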
diff --git a/lib/iris/tests/integration/netcdf/test_general.py b/lib/iris/tests/integration/netcdf/test_general.py
new file mode 100644
index 0000000000..dc0c29455f
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_general.py
@@ -0,0 +1,495 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Integration tests for loading and saving netcdf files."""
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests # isort:skip
+
+from itertools import repeat
+import os.path
+from pathlib import Path
+import shutil
+import tempfile
+from unittest import mock
+import warnings
+
+import numpy as np
+import numpy.ma as ma
+import pytest
+
+import iris
+import iris.coord_systems
+from iris.coords import CellMethod
+from iris.cube import Cube, CubeList
+import iris.exceptions
+from iris.fileformats.netcdf import Saver, UnknownCellMethodWarning
+
+# Get the netCDF4 module, but in a sneaky way that avoids triggering the "do not import
+# netCDF4" check in "iris.tests.test_coding_standards.test_netcdf4_import()".
+import iris.fileformats.netcdf._thread_safe_nc as threadsafe_nc
+
+nc = threadsafe_nc.netCDF4
+
+from iris.tests.stock.netcdf import ncgen_from_cdl
+
+
+class TestLazySave(tests.IrisTest):
+ @tests.skip_data
+ def test_lazy_preserved_save(self):
+ fpath = tests.get_data_path(
+ ("NetCDF", "label_and_climate", "small_FC_167_mon_19601101.nc")
+ )
+ # While loading, "turn off" loading small variables as real data.
+ with mock.patch(
+ "iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES", 0
+ ):
+ acube = iris.load_cube(fpath, "air_temperature")
+ self.assertTrue(acube.has_lazy_data())
+ # Also check a coord with lazy points + bounds.
+ self.assertTrue(acube.coord("forecast_period").has_lazy_points())
+ self.assertTrue(acube.coord("forecast_period").has_lazy_bounds())
+ with self.temp_filename(".nc") as nc_path:
+ with Saver(nc_path, "NETCDF4") as saver:
+ saver.write(acube)
+ # Check that cube data is not realised, also coord points + bounds.
+ self.assertTrue(acube.has_lazy_data())
+ self.assertTrue(acube.coord("forecast_period").has_lazy_points())
+ self.assertTrue(acube.coord("forecast_period").has_lazy_bounds())
+
+
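
The `Saver` pattern used above, distilled (a sketch; `cube` is any cube with
lazy data, and "out.nc" a placeholder path):

    from iris.fileformats.netcdf import Saver

    # Writing via Saver streams lazy data into the file without realising it
    # on the source cube.
    with Saver("out.nc", "NETCDF4") as saver:
        saver.write(cube)
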
+@tests.skip_data
+class TestCellMeasures(tests.IrisTest):
+ def setUp(self):
+ self.fname = tests.get_data_path(("NetCDF", "ORCA2", "votemper.nc"))
+
+ def test_load_raw(self):
+ (cube,) = iris.load_raw(self.fname)
+ self.assertEqual(len(cube.cell_measures()), 1)
+ self.assertEqual(cube.cell_measures()[0].measure, "area")
+
+ def test_load(self):
+ cube = iris.load_cube(self.fname)
+ self.assertEqual(len(cube.cell_measures()), 1)
+ self.assertEqual(cube.cell_measures()[0].measure, "area")
+
+ def test_merge_cell_measure_aware(self):
+ (cube1,) = iris.load_raw(self.fname)
+ (cube2,) = iris.load_raw(self.fname)
+ cube2._cell_measures_and_dims[0][0].var_name = "not_areat"
+ cubes = CubeList([cube1, cube2]).merge()
+ self.assertEqual(len(cubes), 2)
+
+ def test_concatenate_cell_measure_aware(self):
+ (cube1,) = iris.load_raw(self.fname)
+ cube1 = cube1[:, :, 0, 0]
+ cm_and_dims = cube1._cell_measures_and_dims
+ (cube2,) = iris.load_raw(self.fname)
+ cube2 = cube2[:, :, 0, 0]
+ cube2._cell_measures_and_dims[0][0].var_name = "not_areat"
+ cube2.coord("time").points = cube2.coord("time").points + 1
+ cubes = CubeList([cube1, cube2]).concatenate()
+ self.assertEqual(cubes[0]._cell_measures_and_dims, cm_and_dims)
+ self.assertEqual(len(cubes), 2)
+
+ def test_concatenate_cell_measure_match(self):
+ (cube1,) = iris.load_raw(self.fname)
+ cube1 = cube1[:, :, 0, 0]
+ cm_and_dims = cube1._cell_measures_and_dims
+ (cube2,) = iris.load_raw(self.fname)
+ cube2 = cube2[:, :, 0, 0]
+ cube2.coord("time").points = cube2.coord("time").points + 1
+ cubes = CubeList([cube1, cube2]).concatenate()
+ self.assertEqual(cubes[0]._cell_measures_and_dims, cm_and_dims)
+ self.assertEqual(len(cubes), 1)
+
+ def test_round_trip(self):
+ (cube,) = iris.load(self.fname)
+ with self.temp_filename(suffix=".nc") as filename:
+ iris.save(cube, filename, unlimited_dimensions=[])
+ (round_cube,) = iris.load_raw(filename)
+ self.assertEqual(len(round_cube.cell_measures()), 1)
+ self.assertEqual(round_cube.cell_measures()[0].measure, "area")
+
+ def test_print(self):
+ cube = iris.load_cube(self.fname)
+ printed = cube.__str__()
+ self.assertIn(
+ (
+ "Cell measures:\n"
+ " cell_area - - "
+ " x x"
+ ),
+ printed,
+ )
+
+
+class TestCellMethod_unknown(tests.IrisTest):
+ def test_unknown_method(self):
+ cube = Cube([1, 2], long_name="odd_phenomenon")
+ cube.add_cell_method(CellMethod(method="oddity", coords=("x",)))
+ temp_dirpath = tempfile.mkdtemp()
+ try:
+ temp_filepath = os.path.join(temp_dirpath, "tmp.nc")
+ iris.save(cube, temp_filepath)
+ with warnings.catch_warnings(record=True) as warning_records:
+ iris.load(temp_filepath)
+ # Filter to get the warning we are interested in.
+ warning_messages = [record.message for record in warning_records]
+ warning_messages = [
+ warn
+ for warn in warning_messages
+ if isinstance(warn, UnknownCellMethodWarning)
+ ]
+ self.assertEqual(len(warning_messages), 1)
+ message = warning_messages[0].args[0]
+ msg = (
+ "NetCDF variable 'odd_phenomenon' contains unknown cell "
+ "method 'oddity'"
+ )
+ self.assertIn(msg, message)
+ finally:
+ shutil.rmtree(temp_dirpath)
+
+
+def _get_scale_factor_add_offset(cube, datatype):
+ """Utility function used by netCDF data packing tests."""
+ if isinstance(datatype, dict):
+ dt = np.dtype(datatype["dtype"])
+ else:
+ dt = np.dtype(datatype)
+ cmax = cube.data.max()
+ cmin = cube.data.min()
+ n = dt.itemsize * 8
+ if ma.isMaskedArray(cube.data):
+ masked = True
+ else:
+ masked = False
+ if masked:
+ scale_factor = (cmax - cmin) / (2**n - 2)
+ else:
+ scale_factor = (cmax - cmin) / (2**n - 1)
+ if dt.kind == "u":
+ add_offset = cmin
+ elif dt.kind == "i":
+ if masked:
+ add_offset = (cmax + cmin) / 2
+ else:
+ add_offset = cmin + 2 ** (n - 1) * scale_factor
+ return (scale_factor, add_offset)
+
+
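
As a worked example of the arithmetic this helper implements, for unmasked data
packed into signed 16-bit integers (a sketch, not part of the patch):

    import numpy as np

    data = np.array([250.0, 320.0])  # example unmasked values
    dt = np.dtype("i2")
    n = dt.itemsize * 8  # 16 bits
    scale_factor = (data.max() - data.min()) / (2**n - 1)  # ~0.001068
    add_offset = data.min() + 2 ** (n - 1) * scale_factor  # ~285.0005
    packed = np.round((data - add_offset) / scale_factor).astype(dt)
    # packed == [-32768, 32767]; unpacking is accurate to within half a
    # scale factor (~0.0005 here):
    unpacked = packed * scale_factor + add_offset
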
+@tests.skip_data
+class TestPackedData(tests.IrisTest):
+ def _single_test(self, datatype, CDLfilename, manual=False):
+ # Read PP input file.
+ file_in = tests.get_data_path(
+ (
+ "PP",
+ "cf_processing",
+ "000003000000.03.236.000128.1990.12.01.00.00.b.pp",
+ )
+ )
+ cube = iris.load_cube(file_in)
+ scale_factor, offset = _get_scale_factor_add_offset(cube, datatype)
+ if manual:
+ packspec = dict(
+ dtype=datatype, scale_factor=scale_factor, add_offset=offset
+ )
+ else:
+ packspec = datatype
+ # Write Cube to netCDF file.
+ with self.temp_filename(suffix=".nc") as file_out:
+ iris.save(cube, file_out, packing=packspec)
+ decimal = int(-np.log10(scale_factor))
+ packedcube = iris.load_cube(file_out)
+ # Check that packed cube is accurate to expected precision
+ self.assertArrayAlmostEqual(
+ cube.data, packedcube.data, decimal=decimal
+ )
+ # Check the netCDF file against CDL expected output.
+ self.assertCDL(
+ file_out,
+ (
+ "integration",
+ "netcdf",
+ "general",
+ "TestPackedData",
+ CDLfilename,
+ ),
+ )
+
+ def test_single_packed_signed(self):
+ """Test saving a single CF-netCDF file with packing."""
+ self._single_test("i2", "single_packed_signed.cdl")
+
+ def test_single_packed_unsigned(self):
+ """Test saving a single CF-netCDF file with packing into unsigned."""
+ self._single_test("u1", "single_packed_unsigned.cdl")
+
+ def test_single_packed_manual_scale(self):
+ """Test saving a single CF-netCDF file with packing with scale
+ factor and add_offset set manually."""
+ self._single_test("i2", "single_packed_manual.cdl", manual=True)
+
+ def _multi_test(self, CDLfilename, multi_dtype=False):
+ """Test saving multiple packed cubes with pack_dtype list."""
+ # Read PP input file.
+ file_in = tests.get_data_path(
+ ("PP", "cf_processing", "abcza_pa19591997_daily_29.b.pp")
+ )
+ cubes = iris.load(file_in)
+ # ensure cube order is the same:
+ cubes.sort(key=lambda cube: cube.cell_methods[0].method)
+ datatype = "i2"
+ scale_factor, offset = _get_scale_factor_add_offset(cubes[0], datatype)
+ if multi_dtype:
+ packdict = dict(
+ dtype=datatype, scale_factor=scale_factor, add_offset=offset
+ )
+ packspec = [packdict, None, "u2"]
+ dtypes = packspec
+ else:
+ packspec = datatype
+ dtypes = repeat(packspec)
+
+ # Write Cube to netCDF file.
+ with self.temp_filename(suffix=".nc") as file_out:
+ iris.save(cubes, file_out, packing=packspec)
+ # Check the netCDF file against CDL expected output.
+ self.assertCDL(
+ file_out,
+ (
+ "integration",
+ "netcdf",
+ "general",
+ "TestPackedData",
+ CDLfilename,
+ ),
+ )
+ packedcubes = iris.load(file_out)
+ packedcubes.sort(key=lambda cube: cube.cell_methods[0].method)
+ for cube, packedcube, dtype in zip(cubes, packedcubes, dtypes):
+ if dtype:
+ sf, ao = _get_scale_factor_add_offset(cube, dtype)
+ decimal = int(-np.log10(sf))
+ # Check that packed cube is accurate to expected precision
+ self.assertArrayAlmostEqual(
+ cube.data, packedcube.data, decimal=decimal
+ )
+ else:
+ self.assertArrayEqual(cube.data, packedcube.data)
+
+ def test_multi_packed_single_dtype(self):
+ """Test saving multiple packed cubes with the same pack_dtype."""
+        self._multi_test("multi_packed_single_dtype.cdl")
+
+ def test_multi_packed_multi_dtype(self):
+ """Test saving multiple packed cubes with pack_dtype list."""
+        self._multi_test("multi_packed_multi_dtype.cdl", multi_dtype=True)
+
+
+class TestScalarCube(tests.IrisTest):
+ def test_scalar_cube_save_load(self):
+ cube = iris.cube.Cube(1, long_name="scalar_cube")
+ with self.temp_filename(suffix=".nc") as fout:
+ iris.save(cube, fout)
+ scalar_cube = iris.load_cube(fout)
+ self.assertEqual(scalar_cube.name(), "scalar_cube")
+
+
+@tests.skip_data
+class TestConstrainedLoad(tests.IrisTest):
+ filename = tests.get_data_path(
+ ("NetCDF", "label_and_climate", "A1B-99999a-river-sep-2070-2099.nc")
+ )
+
+ def test_netcdf_with_NameConstraint(self):
+ constr = iris.NameConstraint(var_name="cdf_temp_dmax_tmean_abs")
+ cubes = iris.load(self.filename, constr)
+ self.assertEqual(len(cubes), 1)
+ self.assertEqual(cubes[0].var_name, "cdf_temp_dmax_tmean_abs")
+
+ def test_netcdf_with_no_constraint(self):
+ cubes = iris.load(self.filename)
+ self.assertEqual(len(cubes), 3)
+
+
+class TestSkippedCoord:
+ # If a coord/cell measure/etcetera cannot be added to the loaded Cube, a
+ # Warning is raised and the coord is skipped.
+ # This 'catching' is generic to all CannotAddErrors, but currently the only
+ # such problem that can exist in a NetCDF file is a mismatch of dimensions
+ # between phenomenon and coord.
+
+ cdl_core = """
+dimensions:
+ length_scale = 1 ;
+ lat = 3 ;
+variables:
+ float lat(lat) ;
+ lat:standard_name = "latitude" ;
+ lat:units = "degrees_north" ;
+ short lst_unc_sys(length_scale) ;
+ lst_unc_sys:long_name = "uncertainty from large-scale systematic
+ errors" ;
+ lst_unc_sys:units = "kelvin" ;
+ lst_unc_sys:coordinates = "lat" ;
+
+data:
+ lat = 0, 1, 2;
+ """
+
+ @pytest.fixture(autouse=True)
+ def create_nc_file(self, tmp_path):
+ file_name = "dim_mismatch"
+ cdl = f"netcdf {file_name}" + "{\n" + self.cdl_core + "\n}"
+ self.nc_path = (tmp_path / file_name).with_suffix(".nc")
+ ncgen_from_cdl(
+ cdl_str=cdl,
+ cdl_path=None,
+ nc_path=str(self.nc_path),
+ )
+ yield
+ self.nc_path.unlink()
+
+ def test_lat_not_loaded(self):
+ # iris#5068 includes discussion of possible retention of the skipped
+ # coords in the future.
+ with pytest.warns(
+ match="Missing data dimensions for multi-valued DimCoord"
+ ):
+ cube = iris.load_cube(self.nc_path)
+ with pytest.raises(iris.exceptions.CoordinateNotFoundError):
+ _ = cube.coord("lat")
+
+
+@tests.skip_data
+class TestDatasetAndPathLoads(tests.IrisTest):
+ @classmethod
+ def setUpClass(cls):
+ cls.filepath = tests.get_data_path(
+ ["NetCDF", "global", "xyz_t", "GEMS_CO2_Apr2006.nc"]
+ )
+ cls.phenom_id = "Carbon Dioxide"
+ cls.expected = iris.load_cube(cls.filepath, cls.phenom_id)
+
+ def test_basic_load(self):
+ # test loading from an open Dataset, in place of a filepath spec.
+ ds = nc.Dataset(self.filepath)
+ result = iris.load_cube(ds, self.phenom_id)
+ # It should still be open (!)
+ self.assertTrue(ds.isopen())
+ ds.close()
+
+ # Check that result is just the same as a 'direct' load.
+ self.assertEqual(self.expected, result)
+
+ def test_path_string_load_same(self):
+ # Check that loading from a Path is the same as passing a filepath string.
+ # Apart from general utility, checks that we won't mistake a Path for a Dataset.
+ path = Path(self.filepath)
+ result = iris.load_cube(path, self.phenom_id)
+ self.assertEqual(result, self.expected)
+
+
+@tests.skip_data
+class TestDatasetAndPathSaves(tests.IrisTest):
+ @classmethod
+ def setUpClass(cls):
+ # Create a temp directory for transient test files.
+ cls.temp_dir = tempfile.mkdtemp()
+ cls.testpath = tests.get_data_path(
+ ["NetCDF", "global", "xyz_t", "GEMS_CO2_Apr2006.nc"]
+ )
+ # Load some test data for save testing.
+ testdata = iris.load(cls.testpath)
+ # Sort to ensure non-random cube order.
+ testdata = sorted(testdata, key=lambda cube: cube.name())
+ cls.testdata = testdata
+
+ @classmethod
+ def tearDownClass(cls):
+ # Destroy the temp directory.
+ shutil.rmtree(cls.temp_dir)
+
+ def test_basic_save(self):
+        # Test saving to an open Dataset, in place of a filepath spec.
+        # NOTE: this requires 'compute=False', since the deferred data writes
+        # can only be completed once the dataset has been closed.
+
+ # Save to netcdf file in the usual way.
+ filepath_direct = f"{self.temp_dir}/tmp_direct.nc"
+ iris.save(self.testdata, filepath_direct)
+ # Check against test-specific CDL result file.
+ self.assertCDL(filepath_direct)
+
+ # Save same data indirectly via a netcdf dataset.
+ filepath_indirect = f"{self.temp_dir}/tmp_indirect.nc"
+ nc_dataset = nc.Dataset(filepath_indirect, "w")
+ # NOTE: we **must** use delayed saving here, as we cannot do direct saving to
+ # a user-owned dataset.
+ result = iris.save(
+ self.testdata, nc_dataset, saver="nc", compute=False
+ )
+
+ # Do some very basic sanity checks on the resulting Dataset.
+ # It should still be open (!)
+ self.assertTrue(nc_dataset.isopen())
+ self.assertEqual(
+ ["time", "levelist", "latitude", "longitude"],
+ list(nc_dataset.dimensions),
+ )
+ self.assertEqual(
+ ["co2", "time", "levelist", "latitude", "longitude", "lnsp"],
+ list(nc_dataset.variables),
+ )
+ nc_dataset.close()
+
+ # Check the saved file against the same CDL as the 'normal' save.
+ self.assertCDL(filepath_indirect)
+
+        # Confirm, however, that the cube data itself is not yet written.
+ ds = nc.Dataset(filepath_indirect)
+ for cube in self.testdata:
+ assert np.all(ds.variables[cube.var_name][:].mask)
+ ds.close()
+
+ # Complete the delayed saves.
+ result.compute()
+
+ # Check that data now *is* written.
+ ds = nc.Dataset(filepath_indirect)
+ for cube in self.testdata:
+ assert np.all(ds.variables[cube.var_name][:] == cube.data)
+ ds.close()
+
+ def test_computed_delayed_save__fail(self):
+        # As in 'test_basic_save' above, but with "compute=True": this
+        # should raise an error.
+ filepath_indirect = f"{self.temp_dir}/tmp_indirect_complete.nc"
+ nc_dataset = nc.Dataset(filepath_indirect, "w")
+
+ # NOTE: a "normal" compute=True call should raise an error.
+ msg = "Cannot save to a user-provided dataset with 'compute=True'"
+ with pytest.raises(ValueError, match=msg):
+ iris.save(self.testdata, nc_dataset, saver="nc")
+
+ def test_path_string_save_same(self):
+ # Ensure that save to a Path is the same as passing a filepath string.
+ # Apart from general utility, checks that we won't mistake a Path for a Dataset.
+ tempfile_fromstr = f"{self.temp_dir}/tmp_fromstr.nc"
+ iris.save(self.testdata, tempfile_fromstr)
+ tempfile_frompath = f"{self.temp_dir}/tmp_frompath.nc"
+ path = Path(tempfile_frompath)
+ iris.save(self.testdata, path)
+ self.assertCDL(tempfile_fromstr)
+ self.assertCDL(tempfile_frompath)
+
+
+if __name__ == "__main__":
+ tests.main()
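
The saves in `TestDatasetAndPathSaves` above pivot on `compute=False`. A minimal sketch (paths hypothetical) of that delayed-save pattern, assuming an Iris version with this PR applied:

```python
import iris
import netCDF4 as nc

cubes = iris.load("input.nc")  # hypothetical source data
dataset = nc.Dataset("output.nc", "w")
# Direct (compute=True) saving to a user-owned, open dataset raises a
# ValueError, so request a delayed save and keep the returned object.
delayed = iris.save(cubes, dataset, saver="nc", compute=False)
dataset.close()  # the deferred writes require a closed file
delayed.compute()  # now the variable data is actually written
```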
diff --git a/lib/iris/tests/integration/netcdf/test_self_referencing.py b/lib/iris/tests/integration/netcdf/test_self_referencing.py
new file mode 100644
index 0000000000..3395296e11
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_self_referencing.py
@@ -0,0 +1,126 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""Integration tests for iris#3367 - loading a self-referencing NetCDF file."""
+
+# Import iris.tests first so that some things can be initialised before
+# importing anything else.
+import iris.tests as tests # isort:skip
+
+import os
+import tempfile
+from unittest import mock
+
+import numpy as np
+
+import iris
+from iris.fileformats.netcdf import _thread_safe_nc
+
+
+@tests.skip_data
+class TestCMIP6VolcelloLoad(tests.IrisTest):
+ def setUp(self):
+ self.fname = tests.get_data_path(
+ (
+ "NetCDF",
+ "volcello",
+ "volcello_Ofx_CESM2_deforest-globe_r1i1p1f1_gn.nc",
+ )
+ )
+
+ def test_cmip6_volcello_load_issue_3367(self):
+        # Ensure that a file which references itself in `cell_measures`
+        # can be read. At the same time, ensure that we still receive a
+        # warning about other variables mentioned in `cell_measures`,
+        # i.e. a warning should be raised about the missing areacello.
+ areacello_str = "areacello"
+ volcello_str = "volcello"
+ expected_msg = (
+ "Missing CF-netCDF measure variable %r, "
+ "referenced by netCDF variable %r" % (areacello_str, volcello_str)
+ )
+
+ with mock.patch("warnings.warn") as warn:
+ # ensure file loads without failure
+ cube = iris.load_cube(self.fname)
+ warn.assert_has_calls([mock.call(expected_msg)])
+
+ # extra check to ensure correct variable was found
+ assert cube.standard_name == "ocean_volume"
+
+
+class TestSelfReferencingVarLoad(tests.IrisTest):
+ def setUp(self):
+ self.temp_dir_path = os.path.join(
+ tempfile.mkdtemp(), "issue_3367_volcello_test_file.nc"
+ )
+ dataset = _thread_safe_nc.DatasetWrapper(self.temp_dir_path, "w")
+
+ dataset.createDimension("lat", 4)
+ dataset.createDimension("lon", 5)
+ dataset.createDimension("lev", 3)
+
+ latitudes = dataset.createVariable("lat", np.float64, ("lat",))
+ longitudes = dataset.createVariable("lon", np.float64, ("lon",))
+ levels = dataset.createVariable("lev", np.float64, ("lev",))
+ volcello = dataset.createVariable(
+ "volcello", np.float32, ("lat", "lon", "lev")
+ )
+
+ latitudes.standard_name = "latitude"
+ latitudes.units = "degrees_north"
+ latitudes.axis = "Y"
+ latitudes[:] = np.linspace(-90, 90, 4)
+
+ longitudes.standard_name = "longitude"
+ longitudes.units = "degrees_east"
+ longitudes.axis = "X"
+ longitudes[:] = np.linspace(0, 360, 5)
+
+ levels.standard_name = "olevel"
+ levels.units = "centimeters"
+ levels.positive = "down"
+ levels.axis = "Z"
+ levels[:] = np.linspace(0, 10**5, 3)
+
+ volcello.id = "volcello"
+ volcello.out_name = "volcello"
+ volcello.standard_name = "ocean_volume"
+ volcello.units = "m3"
+ volcello.realm = "ocean"
+ volcello.frequency = "fx"
+ volcello.cell_measures = "area: areacello volume: volcello"
+        volcello[:] = np.arange(4 * 5 * 3).reshape((4, 5, 3))
+
+ dataset.close()
+
+ def test_self_referencing_load_issue_3367(self):
+        # Ensure that a file which references itself in `cell_measures`
+        # can be read. At the same time, ensure that we still receive a
+        # warning about other variables mentioned in `cell_measures`,
+        # i.e. a warning should be raised about the missing areacello.
+ areacello_str = "areacello"
+ volcello_str = "volcello"
+ expected_msg = (
+ "Missing CF-netCDF measure variable %r, "
+ "referenced by netCDF variable %r" % (areacello_str, volcello_str)
+ )
+
+ with mock.patch("warnings.warn") as warn:
+ # ensure file loads without failure
+ cube = iris.load_cube(self.temp_dir_path)
+ warn.assert_called_with(expected_msg)
+
+ # extra check to ensure correct variable was found
+ assert cube.standard_name == "ocean_volume"
+
+ def tearDown(self):
+ os.remove(self.temp_dir_path)
+
+
+if __name__ == "__main__":
+ tests.main()
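
For orientation, the essential ingredient in both test classes above is a data variable whose `cell_measures` attribute names itself. A condensed sketch (file name and sizes hypothetical):

```python
import netCDF4 as nc
import numpy as np

import iris

with nc.Dataset("self_ref.nc", "w") as ds:
    ds.createDimension("x", 3)
    var = ds.createVariable("volcello", np.float32, ("x",))
    var.standard_name = "ocean_volume"
    var.units = "m3"
    # "volcello" references itself; "areacello" is genuinely absent, so
    # loading should warn about the missing areacello only.
    var.cell_measures = "area: areacello volume: volcello"

cube = iris.load_cube("self_ref.nc")  # loads despite the self-reference
assert cube.standard_name == "ocean_volume"
```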
diff --git a/lib/iris/tests/integration/netcdf/test_thread_safety.py b/lib/iris/tests/integration/netcdf/test_thread_safety.py
new file mode 100644
index 0000000000..5ed32d0671
--- /dev/null
+++ b/lib/iris/tests/integration/netcdf/test_thread_safety.py
@@ -0,0 +1,124 @@
+# Copyright Iris contributors
+#
+# This file is part of Iris and is released under the LGPL license.
+# See COPYING and COPYING.LESSER in the root of the repository for full
+# licensing details.
+"""
+Integration tests covering thread safety during loading/saving netcdf files.
+
+These tests are intended to catch non-thread-safe behaviour by producing CI
+'irregularities' that are noticed and investigated. They cannot reliably
+produce standard pytest failures, since the tools for 'correctly'
+testing non-thread-safe behaviour are not available at the Python layer.
+Thread safety problems can either produce errors (like a normal test) OR
+segfaults (the test doesn't complete, pytest-xdist starts a new group worker,
+and the end exit code is still non-0), and some problems do not occur in
+every test run.
+
+Token assertions are included after the line that is expected to reveal
+a thread safety problem, as this seems to be good testing practice.
+
+"""
+from pathlib import Path
+
+import dask
+from dask import array as da
+import numpy as np
+import pytest
+
+import iris
+from iris.cube import Cube, CubeList
+from iris.tests import get_data_path
+
+
+@pytest.fixture
+def tiny_chunks():
+ """Guarantee that Dask will use >1 thread by guaranteeing >1 chunk."""
+
+ def _check_tiny_loaded_chunks(cube: Cube):
+ assert cube.has_lazy_data()
+ cube_lazy_data = cube.core_data()
+        assert np.prod(cube_lazy_data.chunksize) < cube_lazy_data.size
+
+ with dask.config.set({"array.chunk-size": "1KiB"}):
+ yield _check_tiny_loaded_chunks
+
+
+@pytest.fixture
+def save_common(tmp_path):
+ save_path = tmp_path / "tmp.nc"
+
+ def _func(cube: Cube):
+ assert not save_path.exists()
+ iris.save(cube, save_path)
+ assert save_path.exists()
+
+ yield _func
+
+
+@pytest.fixture
+def get_cubes_from_netcdf():
+ load_dir_path = Path(get_data_path(["NetCDF", "global", "xyt"]))
+ loaded = iris.load(load_dir_path.glob("*"), "tcco2")
+ smaller = CubeList([c[0] for c in loaded])
+ yield smaller
+
+
+def test_realise_data(tiny_chunks, get_cubes_from_netcdf):
+ cube = get_cubes_from_netcdf[0]
+ tiny_chunks(cube)
+ _ = cube.data # Any problems are expected here.
+ assert not cube.has_lazy_data()
+
+
+def test_realise_data_multisource(get_cubes_from_netcdf):
+ """Load from multiple sources to force Dask to use multiple threads."""
+ cubes = get_cubes_from_netcdf
+ final_cube = sum(cubes)
+ _ = final_cube.data # Any problems are expected here.
+ assert not final_cube.has_lazy_data()
+
+
+def test_save(tiny_chunks, save_common):
+ cube = Cube(da.ones(10000))
+ tiny_chunks(cube)
+ save_common(cube) # Any problems are expected here.
+
+
+def test_stream(tiny_chunks, get_cubes_from_netcdf, save_common):
+ cube = get_cubes_from_netcdf[0]
+ tiny_chunks(cube)
+ save_common(cube) # Any problems are expected here.
+
+
+def test_stream_multisource(get_cubes_from_netcdf, save_common):
+ """Load from multiple sources to force Dask to use multiple threads."""
+ cubes = get_cubes_from_netcdf
+ final_cube = sum(cubes)
+ save_common(final_cube) # Any problems are expected here.
+
+
+def test_stream_multisource__manychunks(
+ tiny_chunks, get_cubes_from_netcdf, save_common
+):
+ """
+ As above, but with many more small chunks.
+
+    This case previously showed additional, sporadic problems which only
+    emerge (statistically) with larger numbers of chunks.
+
+ """
+ cubes = get_cubes_from_netcdf
+ final_cube = sum(cubes)
+ save_common(final_cube) # Any problems are expected here.
+
+
+def test_comparison(get_cubes_from_netcdf):
+ """
+ Comparing multiple loaded files forces co-realisation.
+
+ See :func:`iris._lazy_data._co_realise_lazy_arrays` .
+ """
+ cubes = get_cubes_from_netcdf
+ _ = cubes[:-1] == cubes[1:] # Any problems are expected here.
+ assert all([c.has_lazy_data() for c in cubes])
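
The `tiny_chunks` fixture above relies on Dask's configurable chunk-size limit: shrinking it guarantees more than one chunk, and hence more than one worker thread, for any realisation. A standalone sketch of the same trick (array size hypothetical):

```python
import dask
from dask import array as da
import numpy as np

with dask.config.set({"array.chunk-size": "1KiB"}):
    lazy = da.ones(10000)  # 80 kB of float64, so several chunks
assert np.prod(lazy.chunksize) < lazy.size  # confirm >1 chunk
_ = lazy.compute()  # realisation now spans multiple threads
```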
diff --git a/lib/iris/tests/integration/test_Datums.py b/lib/iris/tests/integration/test_Datums.py
index 6953534f2d..43287c7040 100755
--- a/lib/iris/tests/integration/test_Datums.py
+++ b/lib/iris/tests/integration/test_Datums.py
@@ -3,7 +3,7 @@
# This file is part of Iris and is released under the LGPL license.
# See COPYING and COPYING.LESSER in the root of the repository for full
# licensing details.
-"""Integration tests for :class:`iris.coord_systems` datum suppport."""
+"""Integration tests for :class:`iris.coord_systems` datum support."""
# Import iris.tests first so that some things can be initialised before
# importing anything else.
diff --git a/lib/iris/tests/integration/test_cube.py b/lib/iris/tests/integration/test_cube.py
index 996362f594..ad6666d28e 100644
--- a/lib/iris/tests/integration/test_cube.py
+++ b/lib/iris/tests/integration/test_cube.py
@@ -9,6 +9,8 @@
# importing anything else.
import iris.tests as tests # isort:skip
+from unittest import mock
+
import numpy as np
import iris
@@ -23,7 +25,13 @@ def test_agg_by_aux_coord(self):
problem_test_file = tests.get_data_path(
("NetCDF", "testing", "small_theta_colpex.nc")
)
- cube = iris.load_cube(problem_test_file, "air_potential_temperature")
+ # While loading, "turn off" loading small variables as real data.
+ with mock.patch(
+ "iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES", 0
+ ):
+ cube = iris.load_cube(
+ problem_test_file, "air_potential_temperature"
+ )
# Test aggregating by aux coord, notably the `forecast_period` aux
# coord on `cube`, whose `_points` attribute is a lazy array.
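
The `mock.patch` in the hunk above temporarily zeroes the byte threshold below which Iris loads variables as real (non-lazy) data. A minimal sketch of the pattern (file name hypothetical), assuming this PR's `_LAZYVAR_MIN_BYTES` constant is present:

```python
from unittest import mock

import iris

with mock.patch("iris.fileformats.netcdf.loader._LAZYVAR_MIN_BYTES", 0):
    # With the threshold at 0, even tiny variables stay lazy on load.
    cube = iris.load_cube(
        "small_theta_colpex.nc", "air_potential_temperature"
    )
```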
diff --git a/lib/iris/tests/integration/test_netcdf.py b/lib/iris/tests/integration/test_netcdf.py
deleted file mode 100644
index 851c539ade..0000000000
--- a/lib/iris/tests/integration/test_netcdf.py
+++ /dev/null
@@ -1,958 +0,0 @@
-# Copyright Iris contributors
-#
-# This file is part of Iris and is released under the LGPL license.
-# See COPYING and COPYING.LESSER in the root of the repository for full
-# licensing details.
-"""Integration tests for loading and saving netcdf files."""
-
-# Import iris.tests first so that some things can be initialised before
-# importing anything else.
-import iris.tests as tests # isort:skip
-
-from contextlib import contextmanager
-from itertools import repeat
-import os.path
-from os.path import join as path_join
-import shutil
-import tempfile
-from unittest import mock
-import warnings
-
-import netCDF4 as nc
-import numpy as np
-import numpy.ma as ma
-import pytest
-
-import iris
-import iris.coord_systems
-from iris.coords import CellMethod, DimCoord
-from iris.cube import Cube, CubeList
-import iris.exceptions
-from iris.fileformats.netcdf import (
- CF_CONVENTIONS_VERSION,
- Saver,
- UnknownCellMethodWarning,
-)
-import iris.tests.stock as stock
-from iris.tests.stock.netcdf import ncgen_from_cdl
-import iris.tests.unit.fileformats.netcdf.test_load_cubes as tlc
-
-
-@tests.skip_data
-class TestAtmosphereSigma(tests.IrisTest):
- def setUp(self):
- # Modify stock cube so it is suitable to have a atmosphere sigma
- # factory added to it.
- cube = stock.realistic_4d_no_derived()
- cube.coord("surface_altitude").rename("surface_air_pressure")
- cube.coord("surface_air_pressure").units = "Pa"
- cube.coord("sigma").units = "1"
- ptop_coord = iris.coords.AuxCoord(1000.0, var_name="ptop", units="Pa")
- cube.add_aux_coord(ptop_coord, ())
- cube.remove_coord("level_height")
- # Construct and add atmosphere sigma factory.
- factory = iris.aux_factory.AtmosphereSigmaFactory(
- cube.coord("ptop"),
- cube.coord("sigma"),
- cube.coord("surface_air_pressure"),
- )
- cube.add_aux_factory(factory)
- self.cube = cube
-
- def test_save(self):
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(self.cube, filename)
- self.assertCDL(filename)
-
- def test_save_load_loop(self):
- # Ensure that the AtmosphereSigmaFactory is automatically loaded
- # when loading the file.
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(self.cube, filename)
- cube = iris.load_cube(filename, "air_potential_temperature")
- assert cube.coords("air_pressure")
-
-
-@tests.skip_data
-class TestHybridPressure(tests.IrisTest):
- def setUp(self):
- # Modify stock cube so it is suitable to have a
- # hybrid pressure factory added to it.
- cube = stock.realistic_4d_no_derived()
- cube.coord("surface_altitude").rename("surface_air_pressure")
- cube.coord("surface_air_pressure").units = "Pa"
- cube.coord("level_height").rename("level_pressure")
- cube.coord("level_pressure").units = "Pa"
- # Construct and add hybrid pressure factory.
- factory = iris.aux_factory.HybridPressureFactory(
- cube.coord("level_pressure"),
- cube.coord("sigma"),
- cube.coord("surface_air_pressure"),
- )
- cube.add_aux_factory(factory)
- self.cube = cube
-
- def test_save(self):
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(self.cube, filename)
- self.assertCDL(filename)
-
- def test_save_load_loop(self):
- # Tests an issue where the variable names in the formula
- # terms changed to the standard_names instead of the variable names
- # when loading a previously saved cube.
- with self.temp_filename(suffix=".nc") as filename, self.temp_filename(
- suffix=".nc"
- ) as other_filename:
- iris.save(self.cube, filename)
- cube = iris.load_cube(filename, "air_potential_temperature")
- iris.save(cube, other_filename)
- other_cube = iris.load_cube(
- other_filename, "air_potential_temperature"
- )
- self.assertEqual(cube, other_cube)
-
-
-@tests.skip_data
-class TestSaveMultipleAuxFactories(tests.IrisTest):
- def test_hybrid_height_and_pressure(self):
- cube = stock.realistic_4d()
- cube.add_aux_coord(
- iris.coords.DimCoord(
- 1200.0, long_name="level_pressure", units="hPa"
- )
- )
- cube.add_aux_coord(
- iris.coords.DimCoord(0.5, long_name="other sigma", units="1")
- )
- cube.add_aux_coord(
- iris.coords.DimCoord(
- 1000.0, long_name="surface_air_pressure", units="hPa"
- )
- )
- factory = iris.aux_factory.HybridPressureFactory(
- cube.coord("level_pressure"),
- cube.coord("other sigma"),
- cube.coord("surface_air_pressure"),
- )
- cube.add_aux_factory(factory)
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(cube, filename)
- self.assertCDL(filename)
-
- def test_shared_primary(self):
- cube = stock.realistic_4d()
- factory = iris.aux_factory.HybridHeightFactory(
- cube.coord("level_height"),
- cube.coord("sigma"),
- cube.coord("surface_altitude"),
- )
- factory.rename("another altitude")
- cube.add_aux_factory(factory)
- with self.temp_filename(
- suffix=".nc"
- ) as filename, self.assertRaisesRegex(
- ValueError, "multiple aux factories"
- ):
- iris.save(cube, filename)
-
- def test_hybrid_height_cubes(self):
- hh1 = stock.simple_4d_with_hybrid_height()
- hh1.attributes["cube"] = "hh1"
- hh2 = stock.simple_4d_with_hybrid_height()
- hh2.attributes["cube"] = "hh2"
- sa = hh2.coord("surface_altitude")
- sa.points = sa.points * 10
- with self.temp_filename(".nc") as fname:
- iris.save([hh1, hh2], fname)
- cubes = iris.load(fname, "air_temperature")
- cubes = sorted(cubes, key=lambda cube: cube.attributes["cube"])
- self.assertCML(cubes)
-
- def test_hybrid_height_cubes_on_dimension_coordinate(self):
- hh1 = stock.hybrid_height()
- hh2 = stock.hybrid_height()
- sa = hh2.coord("surface_altitude")
- sa.points = sa.points * 10
- emsg = "Unable to create dimensonless vertical coordinate."
- with self.temp_filename(".nc") as fname, self.assertRaisesRegex(
- ValueError, emsg
- ):
- iris.save([hh1, hh2], fname)
-
-
-class TestUmVersionAttribute(tests.IrisTest):
- def test_single_saves_as_global(self):
- cube = Cube(
- [1.0],
- standard_name="air_temperature",
- units="K",
- attributes={"um_version": "4.3"},
- )
- with self.temp_filename(".nc") as nc_path:
- iris.save(cube, nc_path)
- self.assertCDL(nc_path)
-
- def test_multiple_same_saves_as_global(self):
- cube_a = Cube(
- [1.0],
- standard_name="air_temperature",
- units="K",
- attributes={"um_version": "4.3"},
- )
- cube_b = Cube(
- [1.0],
- standard_name="air_pressure",
- units="hPa",
- attributes={"um_version": "4.3"},
- )
- with self.temp_filename(".nc") as nc_path:
- iris.save(CubeList([cube_a, cube_b]), nc_path)
- self.assertCDL(nc_path)
-
- def test_multiple_different_saves_on_variables(self):
- cube_a = Cube(
- [1.0],
- standard_name="air_temperature",
- units="K",
- attributes={"um_version": "4.3"},
- )
- cube_b = Cube(
- [1.0],
- standard_name="air_pressure",
- units="hPa",
- attributes={"um_version": "4.4"},
- )
- with self.temp_filename(".nc") as nc_path:
- iris.save(CubeList([cube_a, cube_b]), nc_path)
- self.assertCDL(nc_path)
-
-
-@contextmanager
-def _patch_site_configuration():
- def cf_patch_conventions(conventions):
- return ", ".join([conventions, "convention1, convention2"])
-
- def update(config):
- config["cf_profile"] = mock.Mock(name="cf_profile")
- config["cf_patch"] = mock.Mock(name="cf_patch")
- config["cf_patch_conventions"] = cf_patch_conventions
-
- orig_site_config = iris.site_configuration.copy()
- update(iris.site_configuration)
- yield
- iris.site_configuration = orig_site_config
-
-
-class TestConventionsAttributes(tests.IrisTest):
- def test_patching_conventions_attribute(self):
- # Ensure that user defined conventions are wiped and those which are
- # saved patched through site_config can be loaded without an exception
- # being raised.
- cube = Cube(
- [1.0],
- standard_name="air_temperature",
- units="K",
- attributes={"Conventions": "some user defined conventions"},
- )
-
- # Patch the site configuration dictionary.
- with _patch_site_configuration(), self.temp_filename(".nc") as nc_path:
- iris.save(cube, nc_path)
- res = iris.load_cube(nc_path)
-
- self.assertEqual(
- res.attributes["Conventions"],
- "{}, {}, {}".format(
- CF_CONVENTIONS_VERSION, "convention1", "convention2"
- ),
- )
-
-
-class TestLazySave(tests.IrisTest):
- @tests.skip_data
- def test_lazy_preserved_save(self):
- fpath = tests.get_data_path(
- ("NetCDF", "label_and_climate", "small_FC_167_mon_19601101.nc")
- )
- acube = iris.load_cube(fpath, "air_temperature")
- self.assertTrue(acube.has_lazy_data())
- # Also check a coord with lazy points + bounds.
- self.assertTrue(acube.coord("forecast_period").has_lazy_points())
- self.assertTrue(acube.coord("forecast_period").has_lazy_bounds())
- with self.temp_filename(".nc") as nc_path:
- with Saver(nc_path, "NETCDF4") as saver:
- saver.write(acube)
- # Check that cube data is not realised, also coord points + bounds.
- self.assertTrue(acube.has_lazy_data())
- self.assertTrue(acube.coord("forecast_period").has_lazy_points())
- self.assertTrue(acube.coord("forecast_period").has_lazy_bounds())
-
-
-@tests.skip_data
-class TestCellMeasures(tests.IrisTest):
- def setUp(self):
- self.fname = tests.get_data_path(("NetCDF", "ORCA2", "votemper.nc"))
-
- def test_load_raw(self):
- (cube,) = iris.load_raw(self.fname)
- self.assertEqual(len(cube.cell_measures()), 1)
- self.assertEqual(cube.cell_measures()[0].measure, "area")
-
- def test_load(self):
- cube = iris.load_cube(self.fname)
- self.assertEqual(len(cube.cell_measures()), 1)
- self.assertEqual(cube.cell_measures()[0].measure, "area")
-
- def test_merge_cell_measure_aware(self):
- (cube1,) = iris.load_raw(self.fname)
- (cube2,) = iris.load_raw(self.fname)
- cube2._cell_measures_and_dims[0][0].var_name = "not_areat"
- cubes = CubeList([cube1, cube2]).merge()
- self.assertEqual(len(cubes), 2)
-
- def test_concatenate_cell_measure_aware(self):
- (cube1,) = iris.load_raw(self.fname)
- cube1 = cube1[:, :, 0, 0]
- cm_and_dims = cube1._cell_measures_and_dims
- (cube2,) = iris.load_raw(self.fname)
- cube2 = cube2[:, :, 0, 0]
- cube2._cell_measures_and_dims[0][0].var_name = "not_areat"
- cube2.coord("time").points = cube2.coord("time").points + 1
- cubes = CubeList([cube1, cube2]).concatenate()
- self.assertEqual(cubes[0]._cell_measures_and_dims, cm_and_dims)
- self.assertEqual(len(cubes), 2)
-
- def test_concatenate_cell_measure_match(self):
- (cube1,) = iris.load_raw(self.fname)
- cube1 = cube1[:, :, 0, 0]
- cm_and_dims = cube1._cell_measures_and_dims
- (cube2,) = iris.load_raw(self.fname)
- cube2 = cube2[:, :, 0, 0]
- cube2.coord("time").points = cube2.coord("time").points + 1
- cubes = CubeList([cube1, cube2]).concatenate()
- self.assertEqual(cubes[0]._cell_measures_and_dims, cm_and_dims)
- self.assertEqual(len(cubes), 1)
-
- def test_round_trip(self):
- (cube,) = iris.load(self.fname)
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(cube, filename, unlimited_dimensions=[])
- (round_cube,) = iris.load_raw(filename)
- self.assertEqual(len(round_cube.cell_measures()), 1)
- self.assertEqual(round_cube.cell_measures()[0].measure, "area")
-
- def test_print(self):
- cube = iris.load_cube(self.fname)
- printed = cube.__str__()
- self.assertIn(
- (
- "Cell measures:\n"
- " cell_area - - "
- " x x"
- ),
- printed,
- )
-
-
-@tests.skip_data
-class TestCMIP6VolcelloLoad(tests.IrisTest):
- def setUp(self):
- self.fname = tests.get_data_path(
- (
- "NetCDF",
- "volcello",
- "volcello_Ofx_CESM2_deforest-globe_r1i1p1f1_gn.nc",
- )
- )
-
- def test_cmip6_volcello_load_issue_3367(self):
- # Ensure that reading a file which references itself in
- # `cell_measures` can be read. At the same time, ensure that we
- # still receive a warning about other variables mentioned in
- # `cell_measures` i.e. a warning should be raised about missing
- # areacello.
- areacello_str = "areacello"
- volcello_str = "volcello"
- expected_msg = (
- "Missing CF-netCDF measure variable %r, "
- "referenced by netCDF variable %r" % (areacello_str, volcello_str)
- )
-
- with mock.patch("warnings.warn") as warn:
- # ensure file loads without failure
- cube = iris.load_cube(self.fname)
- warn.assert_has_calls([mock.call(expected_msg)])
-
- # extra check to ensure correct variable was found
- assert cube.standard_name == "ocean_volume"
-
-
-class TestSelfReferencingVarLoad(tests.IrisTest):
- def setUp(self):
- self.temp_dir_path = os.path.join(
- tempfile.mkdtemp(), "issue_3367_volcello_test_file.nc"
- )
- dataset = nc.Dataset(self.temp_dir_path, "w")
-
- dataset.createDimension("lat", 4)
- dataset.createDimension("lon", 5)
- dataset.createDimension("lev", 3)
-
- latitudes = dataset.createVariable("lat", np.float64, ("lat",))
- longitudes = dataset.createVariable("lon", np.float64, ("lon",))
- levels = dataset.createVariable("lev", np.float64, ("lev",))
- volcello = dataset.createVariable(
- "volcello", np.float32, ("lat", "lon", "lev")
- )
-
- latitudes.standard_name = "latitude"
- latitudes.units = "degrees_north"
- latitudes.axis = "Y"
- latitudes[:] = np.linspace(-90, 90, 4)
-
- longitudes.standard_name = "longitude"
- longitudes.units = "degrees_east"
- longitudes.axis = "X"
- longitudes[:] = np.linspace(0, 360, 5)
-
- levels.standard_name = "olevel"
- levels.units = "centimeters"
- levels.positive = "down"
- levels.axis = "Z"
- levels[:] = np.linspace(0, 10**5, 3)
-
- volcello.id = "volcello"
- volcello.out_name = "volcello"
- volcello.standard_name = "ocean_volume"
- volcello.units = "m3"
- volcello.realm = "ocean"
- volcello.frequency = "fx"
- volcello.cell_measures = "area: areacello volume: volcello"
- volcello = np.arange(4 * 5 * 3).reshape((4, 5, 3))
-
- dataset.close()
-
- def test_self_referencing_load_issue_3367(self):
- # Ensure that reading a file which references itself in
- # `cell_measures` can be read. At the same time, ensure that we
- # still receive a warning about other variables mentioned in
- # `cell_measures` i.e. a warning should be raised about missing
- # areacello.
- areacello_str = "areacello"
- volcello_str = "volcello"
- expected_msg = (
- "Missing CF-netCDF measure variable %r, "
- "referenced by netCDF variable %r" % (areacello_str, volcello_str)
- )
-
- with mock.patch("warnings.warn") as warn:
- # ensure file loads without failure
- cube = iris.load_cube(self.temp_dir_path)
- warn.assert_called_with(expected_msg)
-
- # extra check to ensure correct variable was found
- assert cube.standard_name == "ocean_volume"
-
- def tearDown(self):
- os.remove(self.temp_dir_path)
-
-
-class TestCellMethod_unknown(tests.IrisTest):
- def test_unknown_method(self):
- cube = Cube([1, 2], long_name="odd_phenomenon")
- cube.add_cell_method(CellMethod(method="oddity", coords=("x",)))
- temp_dirpath = tempfile.mkdtemp()
- try:
- temp_filepath = os.path.join(temp_dirpath, "tmp.nc")
- iris.save(cube, temp_filepath)
- with warnings.catch_warnings(record=True) as warning_records:
- iris.load(temp_filepath)
- # Filter to get the warning we are interested in.
- warning_messages = [record.message for record in warning_records]
- warning_messages = [
- warn
- for warn in warning_messages
- if isinstance(warn, UnknownCellMethodWarning)
- ]
- self.assertEqual(len(warning_messages), 1)
- message = warning_messages[0].args[0]
- msg = (
- "NetCDF variable 'odd_phenomenon' contains unknown cell "
- "method 'oddity'"
- )
- self.assertIn(msg, message)
- finally:
- shutil.rmtree(temp_dirpath)
-
-
-@tests.skip_data
-class TestCoordSystem(tests.IrisTest):
- def setUp(self):
- tlc.setUpModule()
-
- def tearDown(self):
- tlc.tearDownModule()
-
- def test_load_laea_grid(self):
- cube = iris.load_cube(
- tests.get_data_path(
- ("NetCDF", "lambert_azimuthal_equal_area", "euro_air_temp.nc")
- )
- )
- self.assertCML(cube, ("netcdf", "netcdf_laea.cml"))
-
- datum_cf_var_cdl = """
- netcdf output {
- dimensions:
- y = 4 ;
- x = 3 ;
- variables:
- float data(y, x) ;
- data :standard_name = "toa_brightness_temperature" ;
- data :units = "K" ;
- data :grid_mapping = "mercator" ;
- int mercator ;
- mercator:grid_mapping_name = "mercator" ;
- mercator:longitude_of_prime_meridian = 0. ;
- mercator:earth_radius = 6378169. ;
- mercator:horizontal_datum_name = "OSGB36" ;
- float y(y) ;
- y:axis = "Y" ;
- y:units = "m" ;
- y:standard_name = "projection_y_coordinate" ;
- float x(x) ;
- x:axis = "X" ;
- x:units = "m" ;
- x:standard_name = "projection_x_coordinate" ;
-
- // global attributes:
- :Conventions = "CF-1.7" ;
- :standard_name_vocabulary = "CF Standard Name Table v27" ;
-
- data:
-
- data =
- 0, 1, 2,
- 3, 4, 5,
- 6, 7, 8,
- 9, 10, 11 ;
-
- mercator = _ ;
-
- y = 1, 2, 3, 5 ;
-
- x = -6, -4, -2 ;
-
- }
- """
-
- datum_wkt_cdl = """
-netcdf output5 {
-dimensions:
- y = 4 ;
- x = 3 ;
-variables:
- float data(y, x) ;
- data :standard_name = "toa_brightness_temperature" ;
- data :units = "K" ;
- data :grid_mapping = "mercator" ;
- int mercator ;
- mercator:grid_mapping_name = "mercator" ;
- mercator:longitude_of_prime_meridian = 0. ;
- mercator:earth_radius = 6378169. ;
- mercator:longitude_of_projection_origin = 0. ;
- mercator:false_easting = 0. ;
- mercator:false_northing = 0. ;
- mercator:scale_factor_at_projection_origin = 1. ;
- mercator:crs_wkt = "PROJCRS[\\"unknown\\",BASEGEOGCRS[\\"unknown\\",DATUM[\\"OSGB36\\",ELLIPSOID[\\"unknown\\",6378169,0,LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]]],PRIMEM[\\"Greenwich\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8901]]],CONVERSION[\\"unknown\\",METHOD[\\"Mercator (variant B)\\",ID[\\"EPSG\\",9805]],PARAMETER[\\"Latitude of 1st standard parallel\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8823]],PARAMETER[\\"Longitude of natural origin\\",0,ANGLEUNIT[\\"degree\\",0.0174532925199433],ID[\\"EPSG\\",8802]],PARAMETER[\\"False easting\\",0,LENGTHUNIT[\\"metre\\",1],ID[\\"EPSG\\",8806]],PARAMETER[\\"False northing\\",0,LENGTHUNIT[\\"metre\\",1],ID[\\"EPSG\\",8807]]],CS[Cartesian,2],AXIS[\\"(E)\\",east,ORDER[1],LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]],AXIS[\\"(N)\\",north,ORDER[2],LENGTHUNIT[\\"metre\\",1,ID[\\"EPSG\\",9001]]]]" ;
- float y(y) ;
- y:axis = "Y" ;
- y:units = "m" ;
- y:standard_name = "projection_y_coordinate" ;
- float x(x) ;
- x:axis = "X" ;
- x:units = "m" ;
- x:standard_name = "projection_x_coordinate" ;
-
-// global attributes:
- :standard_name_vocabulary = "CF Standard Name Table v27" ;
- :Conventions = "CF-1.7" ;
-data:
-
- data =
- 0, 1, 2,
- 3, 4, 5,
- 6, 7, 8,
- 9, 10, 11 ;
-
- mercator = _ ;
-
- y = 1, 2, 3, 5 ;
-
- x = -6, -4, -2 ;
-}
- """
-
- def test_load_datum_wkt(self):
- expected = "OSGB 1936"
- nc_path = tlc.cdl_to_nc(self.datum_wkt_cdl)
- with iris.FUTURE.context(datum_support=True):
- cube = iris.load_cube(nc_path)
- test_crs = cube.coord("projection_y_coordinate").coord_system
- actual = str(test_crs.as_cartopy_crs().datum)
- self.assertMultiLineEqual(expected, actual)
-
- def test_no_load_datum_wkt(self):
- nc_path = tlc.cdl_to_nc(self.datum_wkt_cdl)
- with self.assertWarnsRegex(FutureWarning, "iris.FUTURE.datum_support"):
- cube = iris.load_cube(nc_path)
- test_crs = cube.coord("projection_y_coordinate").coord_system
- actual = str(test_crs.as_cartopy_crs().datum)
- self.assertMultiLineEqual(actual, "unknown")
-
- def test_load_datum_cf_var(self):
- expected = "OSGB 1936"
- nc_path = tlc.cdl_to_nc(self.datum_cf_var_cdl)
- with iris.FUTURE.context(datum_support=True):
- cube = iris.load_cube(nc_path)
- test_crs = cube.coord("projection_y_coordinate").coord_system
- actual = str(test_crs.as_cartopy_crs().datum)
- self.assertMultiLineEqual(expected, actual)
-
- def test_no_load_datum_cf_var(self):
- nc_path = tlc.cdl_to_nc(self.datum_cf_var_cdl)
- with self.assertWarnsRegex(FutureWarning, "iris.FUTURE.datum_support"):
- cube = iris.load_cube(nc_path)
- test_crs = cube.coord("projection_y_coordinate").coord_system
- actual = str(test_crs.as_cartopy_crs().datum)
- self.assertMultiLineEqual(actual, "unknown")
-
- def test_save_datum(self):
- expected = "OSGB 1936"
- saved_crs = iris.coord_systems.Mercator(
- ellipsoid=iris.coord_systems.GeogCS.from_datum("OSGB36")
- )
-
- base_cube = stock.realistic_3d()
- base_lat_coord = base_cube.coord("grid_latitude")
- test_lat_coord = DimCoord(
- base_lat_coord.points,
- standard_name="projection_y_coordinate",
- coord_system=saved_crs,
- )
- base_lon_coord = base_cube.coord("grid_longitude")
- test_lon_coord = DimCoord(
- base_lon_coord.points,
- standard_name="projection_x_coordinate",
- coord_system=saved_crs,
- )
- test_cube = Cube(
- base_cube.data,
- standard_name=base_cube.standard_name,
- units=base_cube.units,
- dim_coords_and_dims=(
- (base_cube.coord("time"), 0),
- (test_lat_coord, 1),
- (test_lon_coord, 2),
- ),
- )
-
- with self.temp_filename(suffix=".nc") as filename:
- iris.save(test_cube, filename)
- with iris.FUTURE.context(datum_support=True):
- cube = iris.load_cube(filename)
-
- test_crs = cube.coord("projection_y_coordinate").coord_system
- actual = str(test_crs.as_cartopy_crs().datum)
- self.assertMultiLineEqual(expected, actual)
-
-
-def _get_scale_factor_add_offset(cube, datatype):
- """Utility function used by netCDF data packing tests."""
- if isinstance(datatype, dict):
- dt = np.dtype(datatype["dtype"])
- else:
- dt = np.dtype(datatype)
- cmax = cube.data.max()
- cmin = cube.data.min()
- n = dt.itemsize * 8
- if ma.isMaskedArray(cube.data):
- masked = True
- else:
- masked = False
- if masked:
- scale_factor = (cmax - cmin) / (2**n - 2)
- else:
- scale_factor = (cmax - cmin) / (2**n - 1)
- if dt.kind == "u":
- add_offset = cmin
- elif dt.kind == "i":
- if masked:
- add_offset = (cmax + cmin) / 2
- else:
- add_offset = cmin + 2 ** (n - 1) * scale_factor
- return (scale_factor, add_offset)
-
-
-@tests.skip_data
-class TestPackedData(tests.IrisTest):
- def _single_test(self, datatype, CDLfilename, manual=False):
- # Read PP input file.
- file_in = tests.get_data_path(
- (
- "PP",
- "cf_processing",
- "000003000000.03.236.000128.1990.12.01.00.00.b.pp",
- )
- )
- cube = iris.load_cube(file_in)
- scale_factor, offset = _get_scale_factor_add_offset(cube, datatype)
- if manual:
- packspec = dict(
- dtype=datatype, scale_factor=scale_factor, add_offset=offset
- )
- else:
- packspec = datatype
- # Write Cube to netCDF file.
- with self.temp_filename(suffix=".nc") as file_out:
- iris.save(cube, file_out, packing=packspec)
- decimal = int(-np.log10(scale_factor))
- packedcube = iris.load_cube(file_out)
- # Check that packed cube is accurate to expected precision
- self.assertArrayAlmostEqual(
- cube.data, packedcube.data, decimal=decimal
- )
- # Check the netCDF file against CDL expected output.
- self.assertCDL(
- file_out,
- ("integration", "netcdf", "TestPackedData", CDLfilename),
- )
-
- def test_single_packed_signed(self):
- """Test saving a single CF-netCDF file with packing."""
- self._single_test("i2", "single_packed_signed.cdl")
-
- def test_single_packed_unsigned(self):
- """Test saving a single CF-netCDF file with packing into unsigned."""
- self._single_test("u1", "single_packed_unsigned.cdl")
-
- def test_single_packed_manual_scale(self):
- """Test saving a single CF-netCDF file with packing with scale
- factor and add_offset set manually."""
- self._single_test("i2", "single_packed_manual.cdl", manual=True)
-
- def _multi_test(self, CDLfilename, multi_dtype=False):
- """Test saving multiple packed cubes with pack_dtype list."""
- # Read PP input file.
- file_in = tests.get_data_path(
- ("PP", "cf_processing", "abcza_pa19591997_daily_29.b.pp")
- )
- cubes = iris.load(file_in)
- # ensure cube order is the same:
- cubes.sort(key=lambda cube: cube.cell_methods[0].method)
- datatype = "i2"
- scale_factor, offset = _get_scale_factor_add_offset(cubes[0], datatype)
- if multi_dtype:
- packdict = dict(
- dtype=datatype, scale_factor=scale_factor, add_offset=offset
- )
- packspec = [packdict, None, "u2"]
- dtypes = packspec
- else:
- packspec = datatype
- dtypes = repeat(packspec)
-
- # Write Cube to netCDF file.
- with self.temp_filename(suffix=".nc") as file_out:
- iris.save(cubes, file_out, packing=packspec)
- # Check the netCDF file against CDL expected output.
- self.assertCDL(
- file_out,
- ("integration", "netcdf", "TestPackedData", CDLfilename),
- )
- packedcubes = iris.load(file_out)
- packedcubes.sort(key=lambda cube: cube.cell_methods[0].method)
- for cube, packedcube, dtype in zip(cubes, packedcubes, dtypes):
- if dtype:
- sf, ao = _get_scale_factor_add_offset(cube, dtype)
- decimal = int(-np.log10(sf))
- # Check that packed cube is accurate to expected precision
- self.assertArrayAlmostEqual(
- cube.data, packedcube.data, decimal=decimal
- )
- else:
- self.assertArrayEqual(cube.data, packedcube.data)
-
- def test_multi_packed_single_dtype(self):
- """Test saving multiple packed cubes with the same pack_dtype."""
- # Read PP input file.
- self._multi_test("multi_packed_single_dtype.cdl")
-
- def test_multi_packed_multi_dtype(self):
- """Test saving multiple packed cubes with pack_dtype list."""
- # Read PP input file.
- self._multi_test("multi_packed_multi_dtype.cdl", multi_dtype=True)
-
-
-class TestScalarCube(tests.IrisTest):
- def test_scalar_cube_save_load(self):
- cube = iris.cube.Cube(1, long_name="scalar_cube")
- with self.temp_filename(suffix=".nc") as fout:
- iris.save(cube, fout)
- scalar_cube = iris.load_cube(fout)
- self.assertEqual(scalar_cube.name(), "scalar_cube")
-
-
-class TestStandardName(tests.IrisTest):
- def test_standard_name_roundtrip(self):
- standard_name = "air_temperature detection_minimum"
- cube = iris.cube.Cube(1, standard_name=standard_name)
- with self.temp_filename(suffix=".nc") as fout:
- iris.save(cube, fout)
- detection_limit_cube = iris.load_cube(fout)
- self.assertEqual(detection_limit_cube.standard_name, standard_name)
-
-
-class TestLoadMinimalGeostationary(tests.IrisTest):
- """
- Check we can load data with a geostationary grid-mapping, even when the
- 'false-easting' and 'false_northing' properties are missing.
-
- """
-
- _geostationary_problem_cdl = """
-netcdf geostationary_problem_case {
-dimensions:
- y = 2 ;
- x = 3 ;
-variables:
- short radiance(y, x) ;
- radiance:standard_name = "toa_outgoing_radiance_per_unit_wavelength" ;
- radiance:units = "W m-2 sr-1 um-1" ;
- radiance:coordinates = "y x" ;
- radiance:grid_mapping = "imager_grid_mapping" ;
- short y(y) ;
- y:units = "rad" ;
- y:axis = "Y" ;
- y:long_name = "fixed grid projection y-coordinate" ;
- y:standard_name = "projection_y_coordinate" ;
- short x(x) ;
- x:units = "rad" ;
- x:axis = "X" ;
- x:long_name = "fixed grid projection x-coordinate" ;
- x:standard_name = "projection_x_coordinate" ;
- int imager_grid_mapping ;
- imager_grid_mapping:grid_mapping_name = "geostationary" ;
- imager_grid_mapping:perspective_point_height = 35786023. ;
- imager_grid_mapping:semi_major_axis = 6378137. ;
- imager_grid_mapping:semi_minor_axis = 6356752.31414 ;
- imager_grid_mapping:latitude_of_projection_origin = 0. ;
- imager_grid_mapping:longitude_of_projection_origin = -75. ;
- imager_grid_mapping:sweep_angle_axis = "x" ;
-
-data:
-
- // coord values, just so these can be dim-coords
- y = 0, 1 ;
- x = 0, 1, 2 ;
-
-}
-"""
-
- @classmethod
- def setUpClass(cls):
- # Create a temp directory for transient test files.
- cls.temp_dir = tempfile.mkdtemp()
- cls.path_test_cdl = path_join(cls.temp_dir, "geos_problem.cdl")
- cls.path_test_nc = path_join(cls.temp_dir, "geos_problem.nc")
- # Create reference CDL and netcdf files from the CDL text.
- ncgen_from_cdl(
- cdl_str=cls._geostationary_problem_cdl,
- cdl_path=cls.path_test_cdl,
- nc_path=cls.path_test_nc,
- )
-
- @classmethod
- def tearDownClass(cls):
- # Destroy the temp directory.
- shutil.rmtree(cls.temp_dir)
-
- def test_geostationary_no_false_offsets(self):
- # Check we can load the test data and coordinate system properties are correct.
- cube = iris.load_cube(self.path_test_nc)
- # Check the coordinate system properties has the correct default properties.
- cs = cube.coord_system()
- self.assertIsInstance(cs, iris.coord_systems.Geostationary)
- self.assertEqual(cs.false_easting, 0.0)
- self.assertEqual(cs.false_northing, 0.0)
-
-
-@tests.skip_data
-class TestConstrainedLoad(tests.IrisTest):
- filename = tests.get_data_path(
- ("NetCDF", "label_and_climate", "A1B-99999a-river-sep-2070-2099.nc")
- )
-
- def test_netcdf_with_NameConstraint(self):
- constr = iris.NameConstraint(var_name="cdf_temp_dmax_tmean_abs")
- cubes = iris.load(self.filename, constr)
- self.assertEqual(len(cubes), 1)
- self.assertEqual(cubes[0].var_name, "cdf_temp_dmax_tmean_abs")
-
- def test_netcdf_with_no_constraint(self):
- cubes = iris.load(self.filename)
- self.assertEqual(len(cubes), 3)
-
-
-class TestSkippedCoord:
- # If a coord/cell measure/etcetera cannot be added to the loaded Cube, a
- # Warning is raised and the coord is skipped.
- # This 'catching' is generic to all CannotAddErrors, but currently the only
- # such problem that can exist in a NetCDF file is a mismatch of dimensions
- # between phenomenon and coord.
-
- cdl_core = """
-dimensions:
- length_scale = 1 ;
- lat = 3 ;
-variables:
- float lat(lat) ;
- lat:standard_name = "latitude" ;
- lat:units = "degrees_north" ;
- short lst_unc_sys(length_scale) ;
- lst_unc_sys:long_name = "uncertainty from large-scale systematic
- errors" ;
- lst_unc_sys:units = "kelvin" ;
- lst_unc_sys:coordinates = "lat" ;
-
-data:
- lat = 0, 1, 2;
- """
-
- @pytest.fixture(autouse=True)
- def create_nc_file(self, tmp_path):
- file_name = "dim_mismatch"
- cdl = f"netcdf {file_name}" + "{\n" + self.cdl_core + "\n}"
- self.nc_path = (tmp_path / file_name).with_suffix(".nc")
- ncgen_from_cdl(
- cdl_str=cdl,
- cdl_path=None,
- nc_path=str(self.nc_path),
- )
- yield
- self.nc_path.unlink()
-
- def test_lat_not_loaded(self):
- # iris#5068 includes discussion of possible retention of the skipped
- # coords in the future.
- with pytest.warns(
- match="Missing data dimensions for multi-valued DimCoord"
- ):
- cube = iris.load_cube(self.nc_path)
- with pytest.raises(iris.exceptions.CoordinateNotFoundError):
- _ = cube.coord("lat")
-
-
-if __name__ == "__main__":
- tests.main()
diff --git a/lib/iris/tests/results/COLPEX/small_colpex_theta_p_alt.cml b/lib/iris/tests/results/COLPEX/small_colpex_theta_p_alt.cml
index da315c36af..07bdb02725 100644
--- a/lib/iris/tests/results/COLPEX/small_colpex_theta_p_alt.cml
+++ b/lib/iris/tests/results/COLPEX/small_colpex_theta_p_alt.cml
diff --git a/lib/iris/tests/results/FF/air_temperature_1.cml b/lib/iris/tests/results/FF/air_temperature_1.cml
index 043b9acc16..99c30075a0 100644
--- a/lib/iris/tests/results/FF/air_temperature_1.cml
+++ b/lib/iris/tests/results/FF/air_temperature_1.cml
diff --git a/lib/iris/tests/results/FF/air_temperature_2.cml b/lib/iris/tests/results/FF/air_temperature_2.cml
index 200a80b54a..c94604b516 100644
--- a/lib/iris/tests/results/FF/air_temperature_2.cml
+++ b/lib/iris/tests/results/FF/air_temperature_2.cml
diff --git a/lib/iris/tests/results/FF/soil_temperature_1.cml b/lib/iris/tests/results/FF/soil_temperature_1.cml
index 57303636c1..e014ac6b6f 100644
--- a/lib/iris/tests/results/FF/soil_temperature_1.cml
+++ b/lib/iris/tests/results/FF/soil_temperature_1.cml
diff --git a/lib/iris/tests/results/FF/surface_altitude_1.cml b/lib/iris/tests/results/FF/surface_altitude_1.cml
index 2669624d37..e64c146e1a 100644
--- a/lib/iris/tests/results/FF/surface_altitude_1.cml
+++ b/lib/iris/tests/results/FF/surface_altitude_1.cml
diff --git a/lib/iris/tests/results/PP/extra_char_data.w_data_loaded.pp.txt b/lib/iris/tests/results/PP/extra_char_data.w_data_loaded.pp.txt
new file mode 100644
index 0000000000..9e1bfa95bf
--- /dev/null
+++ b/lib/iris/tests/results/PP/extra_char_data.w_data_loaded.pp.txt
@@ -0,0 +1,641 @@
+[PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27870
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 56
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 5
+ lblev: 1
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 897024, 0, 2, 0, 0, 1)
+ brsvd: (20.000338, 0.9977165, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 9.998206
+ brlev: 0.0
+ bhlev: 0.99885815
+ bhrlev: 1.0
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -0.9375
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[ 0.8562012 0.9094238 0.9614258 ... 0.6916504 0.74731445
+ 0.8022461 ]
+ [-0.29174805 -0.2397461 -0.18725586 ... -0.36645508 -0.34594727
+ -0.32763672]
+ [-0.76000977 -0.6833496 -0.6347656 ... -0.9243164 -0.8911133
+ -0.7675781 ]
+ ...
+ [-4.647461 -4.7456055 -4.8171387 ... -4.3222656 -4.428955
+ -4.536133 ]
+ [-4.4577637 -4.5183105 -4.580078 ... -4.283203 -4.350342
+ -4.4038086 ]
+ [-4.2226562 -4.284668 -4.342041 ... -4.01001 -4.085205
+ -4.15625 ]]
+ field_title: AJHQA Time mean !C Atmos u compnt of wind after timestep at 9.998 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27678
+ lbcode: 1
+ lbhem: 0
+ lbrow: 144
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 57
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 6
+ lblev: 1
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1208320, 0, 3, 0, 0, 1)
+ brsvd: (20.000338, 0.9977165, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 9.998206
+ brlev: 0.0
+ bhlev: 0.99885815
+ bhrlev: 1.0
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -90.625
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[-1.2304688 -1.2202148 -1.2077637 ... -1.2546387 -1.246582
+ -1.2387695 ]
+ [-1.0026855 -1.0119629 -1.0195312 ... -0.9663086 -0.9802246
+ -0.9904785 ]
+ [-0.76538086 -0.8845215 -1.0141602 ... -0.72143555 -0.7011719
+ -0.71118164]
+ ...
+ [-2.1013184 -1.9470215 -1.7893066 ... -2.564209 -2.4177246
+ -2.2590332 ]
+ [-2.0922852 -1.9360352 -1.7756348 ... -2.5288086 -2.3864746
+ -2.2421875 ]
+ [-2.0959473 -1.9523926 -1.8071289 ... -2.5092773 -2.3747559
+ -2.2368164 ]]
+ field_title: AJHQA Time mean !C Atmos v compnt of wind after timestep at 9.998 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27867
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 27
+ lbpack: 0
+ lbrel: 2
+ lbfc: 19
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 1
+ lblev: 1
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1519616, 0, 4, 0, 0, 1)
+ brsvd: (49.998882, 0.99429625, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -10.0
+ blev: 20.000338
+ brlev: 0.0
+ bhlev: 0.9977165
+ bhrlev: 1.0
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[282.4619 282.4619 282.4619 ... 282.4619 282.4619 282.4619 ]
+ [282.3506 282.37598 282.40234 ... 282.27344 282.29883 282.3252 ]
+ [281.95508 282.03418 282.10938 ... 281.7578 281.81348 281.87988]
+ ...
+ [245.83203 245.84277 245.83398 ... 245.82031 245.82129 245.82324]
+ [244.42969 244.4248 244.42383 ... 244.45312 244.45215 244.44043]
+ [243.26758 243.26758 243.26758 ... 243.26758 243.26758 243.26758]]
+ field_title: AJHQA Time mean !C Atmos theta after timestep at 20.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27870
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 95
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 13
+ lblev: 1
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1789952, 0, 10, 0, 0, 1)
+ brsvd: (49.998882, 0.99429625, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -99.0
+ blev: 20.000338
+ brlev: 0.0
+ bhlev: 0.9977165
+ bhrlev: 1.0
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[0.00079939 0.00079939 0.00079939 ... 0.00079939 0.00079939 0.00079939]
+ [0.00087261 0.00087106 0.00086934 ... 0.00087724 0.00087613 0.00087428]
+ [0.00093523 0.00092579 0.00091752 ... 0.00095657 0.00094989 0.00094373]
+ ...
+ [0.00037911 0.0003811 0.00038037 ... 0.00037897 0.00037865 0.0003793 ]
+ [0.00033554 0.0003354 0.00033541 ... 0.0003389 0.00033855 0.00033566]
+ [0.00030907 0.00030907 0.00030907 ... 0.00030907 0.00030907 0.00030907]]
+ field_title: AJHQA Time mean !C Atmos specific humidity after timestep at 20.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27870
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 56
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 5
+ lblev: 2
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 905216, 0, 2, 0, 0, 1)
+ brsvd: (80.00135, 0.9908815, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 49.998882
+ brlev: 20.000338
+ bhlev: 0.99429625
+ bhrlev: 0.9977165
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -0.9375
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[ 1.0332031 1.0991211 1.1638184 ... 0.82910156 0.89819336
+ 0.96606445]
+ [-0.46777344 -0.41455078 -0.35766602 ... -0.5932617 -0.5517578
+ -0.51293945]
+ [-1.072998 -1.005127 -0.9387207 ... -1.3034668 -1.2263184
+ -1.1523438 ]
+ ...
+ [-5.9941406 -6.099365 -6.1816406 ... -5.6379395 -5.7575684
+ -5.8745117 ]
+ [-5.8913574 -5.9609375 -6.027832 ... -5.675537 -5.7558594
+ -5.8239746 ]
+ [-5.727051 -5.7910156 -5.848633 ... -5.4992676 -5.581299
+ -5.6572266 ]]
+ field_title: AJHQA Time mean !C Atmos u compnt of wind after timestep at 50.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27678
+ lbcode: 1
+ lbhem: 0
+ lbrow: 144
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 57
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 6
+ lblev: 2
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1216512, 0, 3, 0, 0, 1)
+ brsvd: (80.00135, 0.9908815, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 49.998882
+ brlev: 20.000338
+ bhlev: 0.99429625
+ bhrlev: 0.9977165
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -90.625
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[-1.5361328 -1.5249023 -1.5117188 ... -1.5610352 -1.5537109
+ -1.5454102 ]
+ [-1.2714844 -1.2890625 -1.3078613 ... -1.2194824 -1.2355957
+ -1.2526855 ]
+ [-1.0349121 -1.1855469 -1.3476562 ... -0.96240234 -0.94018555
+ -0.9621582 ]
+ ...
+ [-2.333252 -2.1430664 -1.9562988 ... -2.888916 -2.708252
+ -2.5219727 ]
+ [-2.2441406 -2.0427246 -1.8383789 ... -2.8112793 -2.6252441
+ -2.4382324 ]
+ [-2.1965332 -2.0041504 -1.809082 ... -2.755127 -2.5720215
+ -2.3859863 ]]
+ field_title: AJHQA Time mean !C Atmos v compnt of wind after timestep at 50.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27867
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 27
+ lbpack: 0
+ lbrel: 2
+ lbfc: 19
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 1
+ lblev: 2
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1527808, 0, 4, 0, 0, 1)
+ brsvd: (130.00023, 0.98520386, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -10.0
+ blev: 80.00135
+ brlev: 49.998882
+ bhlev: 0.9908815
+ bhrlev: 0.99429625
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[282.4961 282.4961 282.4961 ... 282.4961 282.4961 282.4961 ]
+ [282.38672 282.4121 282.4375 ... 282.31152 282.33594 282.36133]
+ [282.0957 282.16992 282.2422 ... 281.9121 281.96582 282.02734]
+ ...
+ [246.62598 246.63086 246.625 ... 246.59863 246.60938 246.61816]
+ [245.46387 245.46582 245.4707 ... 245.45703 245.46191 245.46387]
+ [244.5625 244.5625 244.5625 ... 244.5625 244.5625 244.5625 ]]
+ field_title: AJHQA Time mean !C Atmos theta after timestep at 80.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27870
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 95
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 13
+ lblev: 2
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1818624, 0, 10, 0, 0, 1)
+ brsvd: (130.00023, 0.98520386, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -99.0
+ blev: 80.00135
+ brlev: 49.998882
+ bhlev: 0.9908815
+ bhrlev: 0.99429625
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[0.00077913 0.00077913 0.00077913 ... 0.00077913 0.00077913 0.00077913]
+ [0.00085118 0.0008495 0.00084755 ... 0.00085498 0.00085392 0.00085248]
+ [0.00091165 0.00090317 0.00089486 ... 0.00092995 0.00092435 0.00091926]
+ ...
+ [0.00038609 0.00038648 0.00038594 ... 0.00038624 0.00038616 0.00038646]
+ [0.00034904 0.00034909 0.0003492 ... 0.0003502 0.00035007 0.00034903]
+ [0.00032891 0.00032891 0.00032891 ... 0.00032891 0.00032891 0.00032891]]
+ field_title: AJHQA Time mean !C Atmos specific humidity after timestep at 80.00 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27870
+ lbcode: 1
+ lbhem: 0
+ lbrow: 145
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 56
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 5
+ lblev: 3
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 913408, 0, 2, 0, 0, 1)
+ brsvd: (179.99911, 0.97954255, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 130.00023
+ brlev: 80.00135
+ bhlev: 0.98520386
+ bhrlev: 0.9908815
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -91.25
+ bdy: 1.25
+ bzx: -0.9375
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[ 1.0524902 1.1252441 1.1967773 ... 0.8273926 0.90356445
+ 0.9785156 ]
+ [-0.6694336 -0.61328125 -0.5529785 ... -0.8195801 -0.7685547
+ -0.72021484]
+ [-1.3225098 -1.2358398 -1.1459961 ... -1.5771484 -1.4953613
+ -1.4130859 ]
+ ...
+ [-6.96875 -7.027832 -7.0776367 ... -6.741455 -6.8256836
+ -6.900879 ]
+ [-7.010498 -7.0480957 -7.0776367 ... -6.8447266 -6.9067383
+ -6.963135 ]
+ [-6.9716797 -7.010254 -7.04126 ... -6.8120117 -6.8725586
+ -6.9257812 ]]
+ field_title: AJHQA Time mean !C Atmos u compnt of wind after timestep at 130.0 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+, PP Field
+ lbyr: 2007
+ lbmon: 12
+ lbdat: 1
+ lbhr: 0
+ lbmin: 0
+ lbday: 336
+ lbyrd: 2008
+ lbmond: 1
+ lbdatd: 1
+ lbhrd: 0
+ lbmind: 0
+ lbdayd: 1
+ lbtim: 121
+ lbft: 26280
+ lblrec: 27678
+ lbcode: 1
+ lbhem: 0
+ lbrow: 144
+ lbnpt: 192
+ lbext: 30
+ lbpack: 0
+ lbrel: 2
+ lbfc: 57
+ lbcfc: 0
+ lbproc: 128
+ lbvc: 65
+ lbrvc: 0
+ lbexp: 2388992
+ lbegin: 0
+ lbnrec: 0
+ lbproj: 802
+ lbtyp: 6
+ lblev: 3
+ lbrsvd: (0, 0, 0, 0)
+ lbsrce: 6061111
+ lbuser: (1, 1224704, 0, 3, 0, 0, 1)
+ brsvd: (179.99911, 0.97954255, 0.0, 0.0)
+ bdatum: 0.0
+ bacc: -12.0
+ blev: 130.00023
+ brlev: 80.00135
+ bhlev: 0.98520386
+ bhrlev: 0.9908815
+ bplat: 90.0
+ bplon: 0.0
+ bgor: 0.0
+ bzy: -90.625
+ bdy: 1.25
+ bzx: -1.875
+ bdx: 1.875
+ bmdi: -1073741800.0
+ bmks: 1.0
+ data: [[-1.7414551 -1.7321777 -1.7211914 ... -1.7590332 -1.7546387 -1.7485352]
+ [-1.482666 -1.5065918 -1.5327148 ... -1.4162598 -1.4372559 -1.4589844]
+ [-1.3601074 -1.5227051 -1.6989746 ... -1.2714844 -1.2514648 -1.2753906]
+ ...
+ [-1.7216797 -1.4643555 -1.2097168 ... -2.4348145 -2.1984863 -1.9648438]
+ [-1.529541 -1.295166 -1.0639648 ... -2.2402344 -2.0048828 -1.7670898]
+ [-1.4748535 -1.2502441 -1.0231934 ... -2.137207 -1.9177246 -1.6970215]]
+ field_title: AJHQA Time mean !C Atmos v compnt of wind after timestep at 130.0 metres !C 01/12/2007 00:00 -> 01/01/2008 00:00
+]
\ No newline at end of file
diff --git a/lib/iris/tests/results/abf/load.cml b/lib/iris/tests/results/abf/load.cml
index e7954ab229..bf15e4499c 100644
--- a/lib/iris/tests/results/abf/load.cml
+++ b/lib/iris/tests/results/abf/load.cml
diff --git a/lib/iris/tests/results/analysis/abs.cml b/lib/iris/tests/results/analysis/abs.cml
index b0a37b6074..524e05a09a 100644
--- a/lib/iris/tests/results/analysis/abs.cml
+++ b/lib/iris/tests/results/analysis/abs.cml
diff --git a/lib/iris/tests/results/analysis/addition.cml b/lib/iris/tests/results/analysis/addition.cml
index 4f9600694d..a0f4db9e58 100644
--- a/lib/iris/tests/results/analysis/addition.cml
+++ b/lib/iris/tests/results/analysis/addition.cml
diff --git a/lib/iris/tests/results/analysis/addition_coord_x.cml b/lib/iris/tests/results/analysis/addition_coord_x.cml
index a086b8ad8b..4259c2d621 100644
--- a/lib/iris/tests/results/analysis/addition_coord_x.cml
+++ b/lib/iris/tests/results/analysis/addition_coord_x.cml
diff --git a/lib/iris/tests/results/analysis/addition_coord_y.cml b/lib/iris/tests/results/analysis/addition_coord_y.cml
index 266e81c912..7b11e214fe 100644
--- a/lib/iris/tests/results/analysis/addition_coord_y.cml
+++ b/lib/iris/tests/results/analysis/addition_coord_y.cml
diff --git a/lib/iris/tests/results/analysis/addition_different_std_name.cml b/lib/iris/tests/results/analysis/addition_different_std_name.cml
index 14b0b42dd8..b137858af8 100644
--- a/lib/iris/tests/results/analysis/addition_different_std_name.cml
+++ b/lib/iris/tests/results/analysis/addition_different_std_name.cml
diff --git a/lib/iris/tests/results/analysis/addition_in_place.cml b/lib/iris/tests/results/analysis/addition_in_place.cml
index 4f9600694d..a0f4db9e58 100644
--- a/lib/iris/tests/results/analysis/addition_in_place.cml
+++ b/lib/iris/tests/results/analysis/addition_in_place.cml
diff --git a/lib/iris/tests/results/analysis/addition_in_place_coord.cml b/lib/iris/tests/results/analysis/addition_in_place_coord.cml
index 00dee609eb..8559128b63 100644
--- a/lib/iris/tests/results/analysis/addition_in_place_coord.cml
+++ b/lib/iris/tests/results/analysis/addition_in_place_coord.cml
diff --git a/lib/iris/tests/results/analysis/addition_scalar.cml b/lib/iris/tests/results/analysis/addition_scalar.cml
index daf0050069..69853fa215 100644
--- a/lib/iris/tests/results/analysis/addition_scalar.cml
+++ b/lib/iris/tests/results/analysis/addition_scalar.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/easy.cml b/lib/iris/tests/results/analysis/aggregated_by/easy.cml
index d02c3f12d1..87b10a52cd 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/easy.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/easy.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/multi.cml b/lib/iris/tests/results/analysis/aggregated_by/multi.cml
index 75cb67c054..6542b915a1 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/multi.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/multi.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/multi_missing.cml b/lib/iris/tests/results/analysis/aggregated_by/multi_missing.cml
index dc9bdd0df8..1558d17a9a 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/multi_missing.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/multi_missing.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/multi_shared.cml b/lib/iris/tests/results/analysis/aggregated_by/multi_shared.cml
index 81d775e741..aa6fefc293 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/multi_shared.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/multi_shared.cml
@@ -4,52 +4,55 @@
+ [16, 15],
+ [14, 0],
+ [13, 11],
+ [10, 10],
+ [ 9, 8],
+ [ 7, 5],
+ [ 4, 4],
+ [ 3, 2]]" id="35dc92ed" long_name="gamma" points="[18. , 15.5, 7. , 12. , 10. , 8.5, 6. , 4. ,
+ 2.5]" shape="(9,)" units="Unit('1')" value_type="float64"/>
diff --git a/lib/iris/tests/results/analysis/aggregated_by/single.cml b/lib/iris/tests/results/analysis/aggregated_by/single.cml
index 3f2ea6fce2..bc6cbd0301 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/single.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/single.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/single_missing.cml b/lib/iris/tests/results/analysis/aggregated_by/single_missing.cml
index 51e1ae4ff1..df1a9861d4 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/single_missing.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/single_missing.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/single_rms.cml b/lib/iris/tests/results/analysis/aggregated_by/single_rms.cml
index 2961a6b48d..34bd38240e 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/single_rms.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/single_rms.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/single_shared.cml b/lib/iris/tests/results/analysis/aggregated_by/single_shared.cml
index adbf893864..a554a1083d 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/single_shared.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/single_shared.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/single_shared_circular.cml b/lib/iris/tests/results/analysis/aggregated_by/single_shared_circular.cml
index eba017837d..ec1d9b5780 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/single_shared_circular.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/single_shared_circular.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_easy.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_easy.cml
index 8c434479c9..f6ffc02b55 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_easy.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_easy.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi.cml
index cca744ff87..78703b47eb 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_missing.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_missing.cml
index 8c11bdb505..120084b030 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_missing.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_missing.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_shared.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_shared.cml
index ab7a7195fd..1758cf3791 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_shared.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_multi_shared.cml
@@ -4,52 +4,55 @@
+ [16, 15],
+ [14, 0],
+ [13, 11],
+ [10, 10],
+ [ 9, 8],
+ [ 7, 5],
+ [ 4, 4],
+ [ 3, 2]]" id="35dc92ed" long_name="gamma" points="[18. , 15.5, 7. , 12. , 10. , 8.5, 6. , 4. ,
+ 2.5]" shape="(9,)" units="Unit('1')" value_type="float64"/>
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_single.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_single.cml
index d5bb9775fe..96a7e4ec85 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_single.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_single.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_missing.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_missing.cml
index f7d57a9828..8d11643346 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_missing.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_missing.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared.cml
index 50a2c44a98..dad52ae602 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared.cml
diff --git a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared_circular.cml b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared_circular.cml
index 657fb43414..e371728745 100644
--- a/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared_circular.cml
+++ b/lib/iris/tests/results/analysis/aggregated_by/weighted_single_shared_circular.cml
diff --git a/lib/iris/tests/results/analysis/apply_ifunc.cml b/lib/iris/tests/results/analysis/apply_ifunc.cml
index fe0e394ee6..e2f5658832 100644
--- a/lib/iris/tests/results/analysis/apply_ifunc.cml
+++ b/lib/iris/tests/results/analysis/apply_ifunc.cml
diff --git a/lib/iris/tests/results/analysis/apply_ifunc_frompyfunc.cml b/lib/iris/tests/results/analysis/apply_ifunc_frompyfunc.cml
index 29cb6f611e..d3405f401f 100644
--- a/lib/iris/tests/results/analysis/apply_ifunc_frompyfunc.cml
+++ b/lib/iris/tests/results/analysis/apply_ifunc_frompyfunc.cml
diff --git a/lib/iris/tests/results/analysis/apply_ifunc_original.cml b/lib/iris/tests/results/analysis/apply_ifunc_original.cml
index 62a569f7cc..b01e2134af 100644
--- a/lib/iris/tests/results/analysis/apply_ifunc_original.cml
+++ b/lib/iris/tests/results/analysis/apply_ifunc_original.cml
diff --git a/lib/iris/tests/results/analysis/apply_ufunc.cml b/lib/iris/tests/results/analysis/apply_ufunc.cml
index fe0e394ee6..e2f5658832 100644
--- a/lib/iris/tests/results/analysis/apply_ufunc.cml
+++ b/lib/iris/tests/results/analysis/apply_ufunc.cml
diff --git a/lib/iris/tests/results/analysis/apply_ufunc_frompyfunc.cml b/lib/iris/tests/results/analysis/apply_ufunc_frompyfunc.cml
index 7b1511f028..670f74a9ba 100644
--- a/lib/iris/tests/results/analysis/apply_ufunc_frompyfunc.cml
+++ b/lib/iris/tests/results/analysis/apply_ufunc_frompyfunc.cml
diff --git a/lib/iris/tests/results/analysis/apply_ufunc_original.cml b/lib/iris/tests/results/analysis/apply_ufunc_original.cml
index 62a569f7cc..b01e2134af 100644
--- a/lib/iris/tests/results/analysis/apply_ufunc_original.cml
+++ b/lib/iris/tests/results/analysis/apply_ufunc_original.cml
diff --git a/lib/iris/tests/results/analysis/areaweights_original.cml b/lib/iris/tests/results/analysis/areaweights_original.cml
index 651bb648dd..dab90dcfd5 100644
--- a/lib/iris/tests/results/analysis/areaweights_original.cml
+++ b/lib/iris/tests/results/analysis/areaweights_original.cml
diff --git a/lib/iris/tests/results/analysis/calculus/cos_simple.xml b/lib/iris/tests/results/analysis/calculus/cos_simple.xml
index 478902833f..2b624df1c4 100644
--- a/lib/iris/tests/results/analysis/calculus/cos_simple.xml
+++ b/lib/iris/tests/results/analysis/calculus/cos_simple.xml
diff --git a/lib/iris/tests/results/analysis/calculus/cos_simple_radians.xml b/lib/iris/tests/results/analysis/calculus/cos_simple_radians.xml
index 478902833f..2b624df1c4 100644
--- a/lib/iris/tests/results/analysis/calculus/cos_simple_radians.xml
+++ b/lib/iris/tests/results/analysis/calculus/cos_simple_radians.xml
diff --git a/lib/iris/tests/results/analysis/calculus/curl_contrived_cartesian2.cml b/lib/iris/tests/results/analysis/calculus/curl_contrived_cartesian2.cml
index a744dfc782..96ea1ecc60 100644
--- a/lib/iris/tests/results/analysis/calculus/curl_contrived_cartesian2.cml
+++ b/lib/iris/tests/results/analysis/calculus/curl_contrived_cartesian2.cml
diff --git a/lib/iris/tests/results/analysis/calculus/delta_handmade_simple_wrt_x.cml b/lib/iris/tests/results/analysis/calculus/delta_handmade_simple_wrt_x.cml
index ee1301b11d..b4f065084b 100644
--- a/lib/iris/tests/results/analysis/calculus/delta_handmade_simple_wrt_x.cml
+++ b/lib/iris/tests/results/analysis/calculus/delta_handmade_simple_wrt_x.cml
diff --git a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lat.cml b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lat.cml
index 0693498989..86f407a6f2 100644
--- a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lat.cml
+++ b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lat.cml
diff --git a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lon.cml b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lon.cml
index 376c624265..5b624bf398 100644
--- a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lon.cml
+++ b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_lon.cml
diff --git a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_x.cml b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_x.cml
index d54dae3424..30441d8a56 100644
--- a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_x.cml
+++ b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_x.cml
diff --git a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_y.cml b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_y.cml
index 7561c7b02f..2ce91bd232 100644
--- a/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_y.cml
+++ b/lib/iris/tests/results/analysis/calculus/delta_handmade_wrt_y.cml
diff --git a/lib/iris/tests/results/analysis/calculus/grad_contrived1.cml b/lib/iris/tests/results/analysis/calculus/grad_contrived1.cml
index 0696e1be75..d4fffd150a 100644
--- a/lib/iris/tests/results/analysis/calculus/grad_contrived1.cml
+++ b/lib/iris/tests/results/analysis/calculus/grad_contrived1.cml
diff --git a/lib/iris/tests/results/analysis/calculus/grad_contrived2.cml b/lib/iris/tests/results/analysis/calculus/grad_contrived2.cml
index ffa976d4a4..7433be8bc2 100644
--- a/lib/iris/tests/results/analysis/calculus/grad_contrived2.cml
+++ b/lib/iris/tests/results/analysis/calculus/grad_contrived2.cml
diff --git a/lib/iris/tests/results/analysis/calculus/grad_contrived_non_spherical1.cml b/lib/iris/tests/results/analysis/calculus/grad_contrived_non_spherical1.cml
index 077e3df4ab..c01b94e6db 100644
--- a/lib/iris/tests/results/analysis/calculus/grad_contrived_non_spherical1.cml
+++ b/lib/iris/tests/results/analysis/calculus/grad_contrived_non_spherical1.cml
diff --git a/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lat.cml b/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lat.cml
index 7cc09660ab..eda0fd2036 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lat.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lat.cml
@@ -4,25 +4,27 @@
+ -60.75, -56.25, -51.75, -47.25, -42.75, -38.25,
+ -33.75, -29.25, -24.75, -20.25, -15.75, -11.25,
+ -6.75, -2.25, 2.25, 6.75, 11.25, 15.75,
+ 20.25, 24.75, 29.25, 33.75, 38.25, 42.75,
+ 47.25, 51.75, 56.25, 60.75, 65.25, 69.75,
+ 74.25, 78.75, 83.25, 87.75, 92.25, 96.75,
+ 101.25, 105.75, 110.25, 114.75, 119.25, 123.75,
+ 128.25]" shape="(49,)" standard_name="latitude" units="Unit('degrees')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lon.cml b/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lon.cml
index ced788b5c6..6e929a2e79 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lon.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade2_wrt_lon.cml
@@ -3,26 +3,28 @@
+ -155.25, -150.75, -146.25, -141.75, -137.25,
+ -132.75, -128.25, -123.75, -119.25, -114.75,
+ -110.25, -105.75, -101.25, -96.75, -92.25,
+ -87.75, -83.25, -78.75, -74.25, -69.75,
+ -65.25, -60.75, -56.25, -51.75, -47.25,
+ -42.75, -38.25, -33.75, -29.25, -24.75,
+ -20.25, -15.75, -11.25, -6.75, -2.25,
+ 2.25, 6.75, 11.25, 15.75, 20.25,
+ 24.75, 29.25, 33.75, 38.25]" shape="(49,)" standard_name="longitude" units="Unit('degrees')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/calculus/handmade_simple_wrt_x.cml b/lib/iris/tests/results/analysis/calculus/handmade_simple_wrt_x.cml
index c055a46e59..adbd8c4dac 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade_simple_wrt_x.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade_simple_wrt_x.cml
diff --git a/lib/iris/tests/results/analysis/calculus/handmade_wrt_lat.cml b/lib/iris/tests/results/analysis/calculus/handmade_wrt_lat.cml
index 98612df27b..39db8cb583 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade_wrt_lat.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade_wrt_lat.cml
diff --git a/lib/iris/tests/results/analysis/calculus/handmade_wrt_lon.cml b/lib/iris/tests/results/analysis/calculus/handmade_wrt_lon.cml
index ceeb537ac6..fb80441bd7 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade_wrt_lon.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade_wrt_lon.cml
diff --git a/lib/iris/tests/results/analysis/calculus/handmade_wrt_x.cml b/lib/iris/tests/results/analysis/calculus/handmade_wrt_x.cml
index cbe823b2e0..b43273a21f 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade_wrt_x.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade_wrt_x.cml
diff --git a/lib/iris/tests/results/analysis/calculus/handmade_wrt_y.cml b/lib/iris/tests/results/analysis/calculus/handmade_wrt_y.cml
index b0eaa31da8..9698f9ec8d 100644
--- a/lib/iris/tests/results/analysis/calculus/handmade_wrt_y.cml
+++ b/lib/iris/tests/results/analysis/calculus/handmade_wrt_y.cml
diff --git a/lib/iris/tests/results/analysis/count_bar_2d.cml b/lib/iris/tests/results/analysis/count_bar_2d.cml
index 3457187d4e..49d25934a2 100644
--- a/lib/iris/tests/results/analysis/count_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/count_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/count_foo_1d.cml b/lib/iris/tests/results/analysis/count_foo_1d.cml
index 6a76951959..f611029fa2 100644
--- a/lib/iris/tests/results/analysis/count_foo_1d.cml
+++ b/lib/iris/tests/results/analysis/count_foo_1d.cml
diff --git a/lib/iris/tests/results/analysis/count_foo_2d.cml b/lib/iris/tests/results/analysis/count_foo_2d.cml
index af4ee81c3f..9fcac4a5bd 100644
--- a/lib/iris/tests/results/analysis/count_foo_2d.cml
+++ b/lib/iris/tests/results/analysis/count_foo_2d.cml
diff --git a/lib/iris/tests/results/analysis/count_foo_bar_2d.cml b/lib/iris/tests/results/analysis/count_foo_bar_2d.cml
index 47a25bbd84..73ca30312b 100644
--- a/lib/iris/tests/results/analysis/count_foo_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/count_foo_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/delta_one_element_explicit.xml b/lib/iris/tests/results/analysis/delta_and_midpoint/delta_one_element_explicit.xml
index 494d198e64..41e7d6453a 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/delta_one_element_explicit.xml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/delta_one_element_explicit.xml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/midpoint_one_element_explicit.xml b/lib/iris/tests/results/analysis/delta_and_midpoint/midpoint_one_element_explicit.xml
index 8b68a16b47..a09710eaf3 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/midpoint_one_element_explicit.xml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/midpoint_one_element_explicit.xml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1.cml
index b4c123e294..5927c572e4 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_delta.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_delta.cml
index c81ccfc9e8..a87393f917 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_delta.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_delta.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_midpoint.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_midpoint.cml
index f97d74bff8..020cff992d 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_midpoint.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple1_midpoint.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2.cml
index 50bc4d77c1..c56488a758 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_delta.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_delta.cml
index a4621734d3..7e965ff4d5 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_delta.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_delta.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_midpoint.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_midpoint.cml
index a981e2b79c..c04f72ce52 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_midpoint.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple2_midpoint.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3.cml
index f088c97634..467e78de62 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_delta.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_delta.cml
index 74d7546592..3d7e1bc12d 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_delta.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_delta.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_midpoint.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_midpoint.cml
index 961a953ea5..b193b0015c 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_midpoint.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple3_midpoint.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4.cml
index fd4e8ed6bf..c07dcbc18a 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_delta.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_delta.cml
index dc6b09a87a..d59b173304 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_delta.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_delta.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_midpoint.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_midpoint.cml
index e413c214e9..d954504f42 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_midpoint.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple4_midpoint.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5.cml
index 0aad76ca07..05770d2c52 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_delta.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_delta.cml
index 73ee9c9070..aec0ded3f3 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_delta.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_delta.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_midpoint.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_midpoint.cml
index 3e93c682ba..591ba00330 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_midpoint.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple5_midpoint.cml
diff --git a/lib/iris/tests/results/analysis/delta_and_midpoint/simple6.cml b/lib/iris/tests/results/analysis/delta_and_midpoint/simple6.cml
index 6413204d03..fe47685acd 100644
--- a/lib/iris/tests/results/analysis/delta_and_midpoint/simple6.cml
+++ b/lib/iris/tests/results/analysis/delta_and_midpoint/simple6.cml
diff --git a/lib/iris/tests/results/analysis/division.cml b/lib/iris/tests/results/analysis/division.cml
index 762f51ec0a..90fe592390 100644
--- a/lib/iris/tests/results/analysis/division.cml
+++ b/lib/iris/tests/results/analysis/division.cml
diff --git a/lib/iris/tests/results/analysis/division_by_array.cml b/lib/iris/tests/results/analysis/division_by_array.cml
index 14b0b42dd8..b137858af8 100644
--- a/lib/iris/tests/results/analysis/division_by_array.cml
+++ b/lib/iris/tests/results/analysis/division_by_array.cml
diff --git a/lib/iris/tests/results/analysis/division_by_latitude.cml b/lib/iris/tests/results/analysis/division_by_latitude.cml
index 42437d1e36..c05e82f7f3 100644
--- a/lib/iris/tests/results/analysis/division_by_latitude.cml
+++ b/lib/iris/tests/results/analysis/division_by_latitude.cml
diff --git a/lib/iris/tests/results/analysis/division_by_longitude.cml b/lib/iris/tests/results/analysis/division_by_longitude.cml
index 264ce9b793..243b4158af 100644
--- a/lib/iris/tests/results/analysis/division_by_longitude.cml
+++ b/lib/iris/tests/results/analysis/division_by_longitude.cml
diff --git a/lib/iris/tests/results/analysis/division_by_singular_coord.cml b/lib/iris/tests/results/analysis/division_by_singular_coord.cml
index 4c9c58d760..6e91963596 100644
--- a/lib/iris/tests/results/analysis/division_by_singular_coord.cml
+++ b/lib/iris/tests/results/analysis/division_by_singular_coord.cml
diff --git a/lib/iris/tests/results/analysis/division_scalar.cml b/lib/iris/tests/results/analysis/division_scalar.cml
index 14b0b42dd8..b137858af8 100644
--- a/lib/iris/tests/results/analysis/division_scalar.cml
+++ b/lib/iris/tests/results/analysis/division_scalar.cml
diff --git a/lib/iris/tests/results/analysis/exp.cml b/lib/iris/tests/results/analysis/exp.cml
index 357a84363e..120a71e587 100644
--- a/lib/iris/tests/results/analysis/exp.cml
+++ b/lib/iris/tests/results/analysis/exp.cml
diff --git a/lib/iris/tests/results/analysis/exponentiate.cml b/lib/iris/tests/results/analysis/exponentiate.cml
index bb825f6714..066e7c3749 100644
--- a/lib/iris/tests/results/analysis/exponentiate.cml
+++ b/lib/iris/tests/results/analysis/exponentiate.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_1d.cml b/lib/iris/tests/results/analysis/first_quartile_foo_1d.cml
index f027f2d9f8..eb2a08de76 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_1d.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_1d.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_1d_fast_percentile.cml b/lib/iris/tests/results/analysis/first_quartile_foo_1d_fast_percentile.cml
index f027f2d9f8..eb2a08de76 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_1d_fast_percentile.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_1d_fast_percentile.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_2d.cml b/lib/iris/tests/results/analysis/first_quartile_foo_2d.cml
index 1bc809ce63..ca83009959 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_2d.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_2d.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_2d_fast_percentile.cml b/lib/iris/tests/results/analysis/first_quartile_foo_2d_fast_percentile.cml
index 1bc809ce63..ca83009959 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_2d_fast_percentile.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_2d_fast_percentile.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d.cml b/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d.cml
index cadd1e8b65..16f8ec2d69 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d_fast_percentile.cml b/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d_fast_percentile.cml
index cadd1e8b65..16f8ec2d69 100644
--- a/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d_fast_percentile.cml
+++ b/lib/iris/tests/results/analysis/first_quartile_foo_bar_2d_fast_percentile.cml
diff --git a/lib/iris/tests/results/analysis/gmean_latitude.cml b/lib/iris/tests/results/analysis/gmean_latitude.cml
index 26b7fdc8af..ca4a5a39f2 100644
--- a/lib/iris/tests/results/analysis/gmean_latitude.cml
+++ b/lib/iris/tests/results/analysis/gmean_latitude.cml
diff --git a/lib/iris/tests/results/analysis/gmean_latitude_longitude.cml b/lib/iris/tests/results/analysis/gmean_latitude_longitude.cml
index 94ed36ac88..a31a89ab34 100644
--- a/lib/iris/tests/results/analysis/gmean_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/gmean_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/gmean_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/gmean_latitude_longitude_1call.cml
index 1db977312b..dd97d15f27 100644
--- a/lib/iris/tests/results/analysis/gmean_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/gmean_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/hmean_latitude.cml b/lib/iris/tests/results/analysis/hmean_latitude.cml
index 70e3fcb540..86af8f99bc 100644
--- a/lib/iris/tests/results/analysis/hmean_latitude.cml
+++ b/lib/iris/tests/results/analysis/hmean_latitude.cml
diff --git a/lib/iris/tests/results/analysis/hmean_latitude_longitude.cml b/lib/iris/tests/results/analysis/hmean_latitude_longitude.cml
index f762fd643b..3b469be379 100644
--- a/lib/iris/tests/results/analysis/hmean_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/hmean_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/hmean_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/hmean_latitude_longitude_1call.cml
index 369dca3203..759e94104a 100644
--- a/lib/iris/tests/results/analysis/hmean_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/hmean_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/last_quartile_foo_3d_masked.cml b/lib/iris/tests/results/analysis/last_quartile_foo_3d_masked.cml
index 059541e208..cd3d7ac69a 100644
--- a/lib/iris/tests/results/analysis/last_quartile_foo_3d_masked.cml
+++ b/lib/iris/tests/results/analysis/last_quartile_foo_3d_masked.cml
diff --git a/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked.cml b/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked.cml
index 059541e208..cd3d7ac69a 100644
--- a/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked.cml
+++ b/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked.cml
diff --git a/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked_fast_percentile.cml b/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked_fast_percentile.cml
index 059541e208..cd3d7ac69a 100644
--- a/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked_fast_percentile.cml
+++ b/lib/iris/tests/results/analysis/last_quartile_foo_3d_notmasked_fast_percentile.cml
diff --git a/lib/iris/tests/results/analysis/log.cml b/lib/iris/tests/results/analysis/log.cml
index c24e071dc5..9a90864c58 100644
--- a/lib/iris/tests/results/analysis/log.cml
+++ b/lib/iris/tests/results/analysis/log.cml
diff --git a/lib/iris/tests/results/analysis/log10.cml b/lib/iris/tests/results/analysis/log10.cml
index abd4065526..226322cb61 100644
--- a/lib/iris/tests/results/analysis/log10.cml
+++ b/lib/iris/tests/results/analysis/log10.cml
diff --git a/lib/iris/tests/results/analysis/log2.cml b/lib/iris/tests/results/analysis/log2.cml
index d121ad9a9d..0c26538dd4 100644
--- a/lib/iris/tests/results/analysis/log2.cml
+++ b/lib/iris/tests/results/analysis/log2.cml
diff --git a/lib/iris/tests/results/analysis/maths_original.cml b/lib/iris/tests/results/analysis/maths_original.cml
index 15fbb5210f..f3f838f1b8 100644
--- a/lib/iris/tests/results/analysis/maths_original.cml
+++ b/lib/iris/tests/results/analysis/maths_original.cml
diff --git a/lib/iris/tests/results/analysis/max_latitude.cml b/lib/iris/tests/results/analysis/max_latitude.cml
index 89542d27d3..fa00aacec5 100644
--- a/lib/iris/tests/results/analysis/max_latitude.cml
+++ b/lib/iris/tests/results/analysis/max_latitude.cml
diff --git a/lib/iris/tests/results/analysis/max_latitude_longitude.cml b/lib/iris/tests/results/analysis/max_latitude_longitude.cml
index 7d24ca7f14..801d4302fa 100644
--- a/lib/iris/tests/results/analysis/max_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/max_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/max_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/max_latitude_longitude_1call.cml
index b4d1e0349c..2dc352e208 100644
--- a/lib/iris/tests/results/analysis/max_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/max_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/max_run_bar_2d.cml b/lib/iris/tests/results/analysis/max_run_bar_2d.cml
index 32a8a377be..6d56c2220b 100644
--- a/lib/iris/tests/results/analysis/max_run_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/max_run_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/max_run_bar_2d_masked.cml b/lib/iris/tests/results/analysis/max_run_bar_2d_masked.cml
index 32a8a377be..6d56c2220b 100644
--- a/lib/iris/tests/results/analysis/max_run_bar_2d_masked.cml
+++ b/lib/iris/tests/results/analysis/max_run_bar_2d_masked.cml
diff --git a/lib/iris/tests/results/analysis/max_run_foo_1d.cml b/lib/iris/tests/results/analysis/max_run_foo_1d.cml
index b2a3bcef56..a5f53306db 100644
--- a/lib/iris/tests/results/analysis/max_run_foo_1d.cml
+++ b/lib/iris/tests/results/analysis/max_run_foo_1d.cml
diff --git a/lib/iris/tests/results/analysis/max_run_foo_2d.cml b/lib/iris/tests/results/analysis/max_run_foo_2d.cml
index fb8448136f..45e9836823 100644
--- a/lib/iris/tests/results/analysis/max_run_foo_2d.cml
+++ b/lib/iris/tests/results/analysis/max_run_foo_2d.cml
diff --git a/lib/iris/tests/results/analysis/mean_latitude.cml b/lib/iris/tests/results/analysis/mean_latitude.cml
index 80921e762d..44b26db3fb 100644
--- a/lib/iris/tests/results/analysis/mean_latitude.cml
+++ b/lib/iris/tests/results/analysis/mean_latitude.cml
diff --git a/lib/iris/tests/results/analysis/mean_latitude_longitude.cml b/lib/iris/tests/results/analysis/mean_latitude_longitude.cml
index 6ac9400a3a..0991425a9a 100644
--- a/lib/iris/tests/results/analysis/mean_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/mean_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/mean_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/mean_latitude_longitude_1call.cml
index affcf07c07..1b5ca1e3dc 100644
--- a/lib/iris/tests/results/analysis/mean_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/mean_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/median_latitude.cml b/lib/iris/tests/results/analysis/median_latitude.cml
index bbf3875688..b5439ed225 100644
--- a/lib/iris/tests/results/analysis/median_latitude.cml
+++ b/lib/iris/tests/results/analysis/median_latitude.cml
diff --git a/lib/iris/tests/results/analysis/median_latitude_longitude.cml b/lib/iris/tests/results/analysis/median_latitude_longitude.cml
index 5663f6d65f..f8116848a6 100644
--- a/lib/iris/tests/results/analysis/median_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/median_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/median_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/median_latitude_longitude_1call.cml
index c0c0d7c46b..53fd4ef29d 100644
--- a/lib/iris/tests/results/analysis/median_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/median_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/min_latitude.cml b/lib/iris/tests/results/analysis/min_latitude.cml
index bf20be30a9..13e52696f8 100644
--- a/lib/iris/tests/results/analysis/min_latitude.cml
+++ b/lib/iris/tests/results/analysis/min_latitude.cml
diff --git a/lib/iris/tests/results/analysis/min_latitude_longitude.cml b/lib/iris/tests/results/analysis/min_latitude_longitude.cml
index 3792645582..78cd58ca93 100644
--- a/lib/iris/tests/results/analysis/min_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/min_latitude_longitude.cml
diff --git a/lib/iris/tests/results/analysis/min_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/min_latitude_longitude_1call.cml
index b43231b7e6..672cef058a 100644
--- a/lib/iris/tests/results/analysis/min_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/min_latitude_longitude_1call.cml
diff --git a/lib/iris/tests/results/analysis/multiply.cml b/lib/iris/tests/results/analysis/multiply.cml
index 8fb8658f5d..0a3c2cfb03 100644
--- a/lib/iris/tests/results/analysis/multiply.cml
+++ b/lib/iris/tests/results/analysis/multiply.cml
diff --git a/lib/iris/tests/results/analysis/multiply_different_std_name.cml b/lib/iris/tests/results/analysis/multiply_different_std_name.cml
index 2d89e5882f..829bbcc582 100644
--- a/lib/iris/tests/results/analysis/multiply_different_std_name.cml
+++ b/lib/iris/tests/results/analysis/multiply_different_std_name.cml
diff --git a/lib/iris/tests/results/analysis/original.cml b/lib/iris/tests/results/analysis/original.cml
index 414de1b6b5..b958136bd1 100644
--- a/lib/iris/tests/results/analysis/original.cml
+++ b/lib/iris/tests/results/analysis/original.cml
diff --git a/lib/iris/tests/results/analysis/original_common.cml b/lib/iris/tests/results/analysis/original_common.cml
index bbfa48d7d8..258ca67c46 100644
--- a/lib/iris/tests/results/analysis/original_common.cml
+++ b/lib/iris/tests/results/analysis/original_common.cml
diff --git a/lib/iris/tests/results/analysis/original_hmean.cml b/lib/iris/tests/results/analysis/original_hmean.cml
index bdc145022c..28cea63268 100644
--- a/lib/iris/tests/results/analysis/original_hmean.cml
+++ b/lib/iris/tests/results/analysis/original_hmean.cml
diff --git a/lib/iris/tests/results/analysis/proportion_bar_2d.cml b/lib/iris/tests/results/analysis/proportion_bar_2d.cml
index 263fcaba9e..f28f4b1546 100644
--- a/lib/iris/tests/results/analysis/proportion_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/proportion_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/proportion_foo_1d.cml b/lib/iris/tests/results/analysis/proportion_foo_1d.cml
index a0bd3c982f..6ebd3e0f39 100644
--- a/lib/iris/tests/results/analysis/proportion_foo_1d.cml
+++ b/lib/iris/tests/results/analysis/proportion_foo_1d.cml
diff --git a/lib/iris/tests/results/analysis/proportion_foo_2d.cml b/lib/iris/tests/results/analysis/proportion_foo_2d.cml
index d715499e58..f2c803bb71 100644
--- a/lib/iris/tests/results/analysis/proportion_foo_2d.cml
+++ b/lib/iris/tests/results/analysis/proportion_foo_2d.cml
diff --git a/lib/iris/tests/results/analysis/proportion_foo_2d_masked.cml b/lib/iris/tests/results/analysis/proportion_foo_2d_masked.cml
index 263fcaba9e..f28f4b1546 100644
--- a/lib/iris/tests/results/analysis/proportion_foo_2d_masked.cml
+++ b/lib/iris/tests/results/analysis/proportion_foo_2d_masked.cml
diff --git a/lib/iris/tests/results/analysis/proportion_foo_bar_2d.cml b/lib/iris/tests/results/analysis/proportion_foo_bar_2d.cml
index 77123dd86e..9baab831e1 100644
--- a/lib/iris/tests/results/analysis/proportion_foo_bar_2d.cml
+++ b/lib/iris/tests/results/analysis/proportion_foo_bar_2d.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_both_circular.cml b/lib/iris/tests/results/analysis/regrid/linear_both_circular.cml
index 576ab4ace6..2ee0fc00d9 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_both_circular.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_both_circular.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_circular_grid.cml b/lib/iris/tests/results/analysis/regrid/linear_circular_grid.cml
index d8fd78a749..3544db9698 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_circular_grid.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_circular_grid.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_circular_src.cml b/lib/iris/tests/results/analysis/regrid/linear_circular_src.cml
index 1032b4fc6e..296de665da 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_circular_src.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_circular_src.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_circular_srcmissingmask.cml b/lib/iris/tests/results/analysis/regrid/linear_circular_srcmissingmask.cml
index 1032b4fc6e..296de665da 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_circular_srcmissingmask.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_circular_srcmissingmask.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_masked_altitude.cml b/lib/iris/tests/results/analysis/regrid/linear_masked_altitude.cml
index 1ac69490b4..b719738a62 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_masked_altitude.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_masked_altitude.cml
@@ -6,85 +6,94 @@
+ [[424.42307, 398.04324, nan, nan,
+ nan],
+ [368.6881 , 343.87836, nan, nan,
+ nan],
+ [375.09146, 347.86066, nan, nan,
+ nan],
+ [446.16125, 414.22037, nan, nan,
+ nan]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/regrid/linear_non_circular.cml b/lib/iris/tests/results/analysis/regrid/linear_non_circular.cml
index 064409dde5..bb678502c1 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_non_circular.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_non_circular.cml
diff --git a/lib/iris/tests/results/analysis/regrid/linear_partial_overlap.cml b/lib/iris/tests/results/analysis/regrid/linear_partial_overlap.cml
index eb9adb4aef..fc39fee0f5 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_partial_overlap.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_partial_overlap.cml
@@ -6,77 +6,78 @@
+ [[ nan, nan, 367.72552, 355.62955],
+ [ nan, nan, 340.44327, 407.57434],
+ [ nan, nan, 336.60175, 419.0933 ],
+ [ nan, nan, 376.38995, 341.02115]]]" shape="(2, 4, 4)" standard_name="altitude" units="Unit('m')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/regrid/linear_subset.cml b/lib/iris/tests/results/analysis/regrid/linear_subset.cml
index 9bd62287fe..0121d84ebf 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_subset.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_subset.cml
@@ -6,85 +6,94 @@
+ [[424.42307, 398.04324, 305.16385, 254.07837,
+ 340.82806],
+ [368.6881 , 343.87836, 348.51068, 368.9184 ,
+ 407.57434],
+ [375.09146, 347.86066, 370.53574, 395.5417 ,
+ 397.02896],
+ [446.16125, 414.22037, 365.36652, 322.28683,
+ 296.69153]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/regrid/linear_subset_anon.cml b/lib/iris/tests/results/analysis/regrid/linear_subset_anon.cml
index 1945b03a1a..ea3a804166 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_subset_anon.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_subset_anon.cml
@@ -6,85 +6,94 @@
+ [[424.42307, 398.04324, 305.16385, 254.07837,
+ 340.82806],
+ [368.6881 , 343.87836, 348.51068, 368.9184 ,
+ 407.57434],
+ [375.09146, 347.86066, 370.53574, 395.5417 ,
+ 397.02896],
+ [446.16125, 414.22037, 365.36652, 322.28683,
+ 296.69153]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
diff --git a/lib/iris/tests/results/analysis/regrid/linear_subset_masked_1.cml b/lib/iris/tests/results/analysis/regrid/linear_subset_masked_1.cml
index 9bd62287fe..0121d84ebf 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_subset_masked_1.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_subset_masked_1.cml
@@ -6,85 +6,94 @@
+ [[424.42307, 398.04324, 305.16385, 254.07837,
+ 340.82806],
+ [368.6881 , 343.87836, 348.51068, 368.9184 ,
+ 407.57434],
+ [375.09146, 347.86066, 370.53574, 395.5417 ,
+ 397.02896],
+ [446.16125, 414.22037, 365.36652, 322.28683,
+ 296.69153]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/linear_subset_masked_2.cml b/lib/iris/tests/results/analysis/regrid/linear_subset_masked_2.cml
index 9bd62287fe..0121d84ebf 100644
--- a/lib/iris/tests/results/analysis/regrid/linear_subset_masked_2.cml
+++ b/lib/iris/tests/results/analysis/regrid/linear_subset_masked_2.cml
@@ -6,85 +6,94 @@
+ [[424.42307, 398.04324, 305.16385, 254.07837,
+ 340.82806],
+ [368.6881 , 343.87836, 348.51068, 368.9184 ,
+ 407.57434],
+ [375.09146, 347.86066, 370.53574, 395.5417 ,
+ 397.02896],
+ [446.16125, 414.22037, 365.36652, 322.28683,
+ 296.69153]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_both_circular.cml b/lib/iris/tests/results/analysis/regrid/nearest_both_circular.cml
index d8f1a9d0f6..9352ae6076 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_both_circular.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_both_circular.cml
@@ -7,27 +7,27 @@
@@ -39,24 +39,24 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_circular_grid.cml b/lib/iris/tests/results/analysis/regrid/nearest_circular_grid.cml
index 16863839a1..c13e7872a2 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_circular_grid.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_circular_grid.cml
@@ -7,23 +7,27 @@
@@ -35,24 +39,24 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_circular_src.cml b/lib/iris/tests/results/analysis/regrid/nearest_circular_src.cml
index 5eb032cf2c..400efcd7fa 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_circular_src.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_circular_src.cml
@@ -7,27 +7,27 @@
@@ -39,24 +39,24 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_circular_srcmissingmask.cml b/lib/iris/tests/results/analysis/regrid/nearest_circular_srcmissingmask.cml
index 5eb032cf2c..400efcd7fa 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_circular_srcmissingmask.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_circular_srcmissingmask.cml
@@ -7,27 +7,27 @@
@@ -39,24 +39,24 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_masked_altitude.cml b/lib/iris/tests/results/analysis/regrid/nearest_masked_altitude.cml
index a1cff2363e..905109b6b7 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_masked_altitude.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_masked_altitude.cml
@@ -6,85 +6,94 @@
+ [[434.5705 , 395.5391 , 219.27228, 219.27228,
+ 349.64597],
+ [345.97134, 310.52786, nan, nan,
+ 444.776 ],
+ [345.97134, 310.52786, nan, nan,
+ 444.776 ],
+ [461.227 , 414.88275, 323.68027, 323.68027,
+ 280.81027]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_non_circular.cml b/lib/iris/tests/results/analysis/regrid/nearest_non_circular.cml
index da162648be..6978ec7200 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_non_circular.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_non_circular.cml
@@ -7,23 +7,27 @@
@@ -35,24 +39,24 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_partial_overlap.cml b/lib/iris/tests/results/analysis/regrid/nearest_partial_overlap.cml
index 98a0b6b805..a769ed4a38 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_partial_overlap.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_partial_overlap.cml
@@ -6,77 +6,78 @@
+ [[ nan, nan, 395.5391 , 349.64597],
+ [ nan, nan, 310.52786, 444.776 ],
+ [ nan, nan, 310.52786, 444.776 ],
+ [ nan, nan, 414.88275, 280.81027]]]" shape="(2, 4, 4)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -88,18 +89,19 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_subset.cml b/lib/iris/tests/results/analysis/regrid/nearest_subset.cml
index a704cbecbb..6d7ef1b453 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_subset.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_subset.cml
@@ -6,85 +6,94 @@
+ [[434.5705 , 395.5391 , 219.27228, 219.27228,
+ 349.64597],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [461.227 , 414.88275, 323.68027, 323.68027,
+ 280.81027]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_subset_anon.cml b/lib/iris/tests/results/analysis/regrid/nearest_subset_anon.cml
index 40390f387c..c40a3475a3 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_subset_anon.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_subset_anon.cml
@@ -6,85 +6,94 @@
+ [[434.5705 , 395.5391 , 219.27228, 219.27228,
+ 349.64597],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [461.227 , 414.88275, 323.68027, 323.68027,
+ 280.81027]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,14 +105,18 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_1.cml b/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_1.cml
index a704cbecbb..6d7ef1b453 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_1.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_1.cml
@@ -6,85 +6,94 @@
+ [[434.5705 , 395.5391 , 219.27228, 219.27228,
+ 349.64597],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [461.227 , 414.88275, 323.68027, 323.68027,
+ 280.81027]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_2.cml b/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_2.cml
index a704cbecbb..6d7ef1b453 100644
--- a/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_2.cml
+++ b/lib/iris/tests/results/analysis/regrid/nearest_subset_masked_2.cml
@@ -6,85 +6,94 @@
+ [[434.5705 , 395.5391 , 219.27228, 219.27228,
+ 349.64597],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [345.97134, 310.52786, 425.15723, 425.15723,
+ 444.776 ],
+ [461.227 , 414.88275, 323.68027, 323.68027,
+ 280.81027]]]" shape="(2, 4, 5)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -96,18 +105,23 @@
diff --git a/lib/iris/tests/results/analysis/regrid/no_overlap.cml b/lib/iris/tests/results/analysis/regrid/no_overlap.cml
index da2f03f1ee..19033c255d 100644
--- a/lib/iris/tests/results/analysis/regrid/no_overlap.cml
+++ b/lib/iris/tests/results/analysis/regrid/no_overlap.cml
@@ -7,76 +7,78 @@
+ [[nan, nan, nan, nan],
+ [nan, nan, nan, nan],
+ [nan, nan, nan, nan],
+ [nan, nan, nan, nan]]]" shape="(2, 4, 4)" standard_name="altitude" units="Unit('m')" value_type="float32">
@@ -88,18 +90,19 @@
+ [nan, nan, nan, nan],
+ [nan, nan, nan, nan],
+ [nan, nan, nan, nan]]" shape="(4, 4)" standard_name="surface_altitude" units="Unit('m')" value_type="float32"/>
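The regrid fixtures above all pin down the same behaviour: target cells that fall outside the source grid (partial overlap, or no overlap at all) come back as NaN. A minimal sketch of that behaviour, not taken from this patch (the grids and values are illustrative, and it is untested against any particular Iris version):

    import numpy as np

    import iris.analysis
    from iris.coords import DimCoord
    from iris.cube import Cube

    def latlon_cube(lats, lons):
        """Build a simple 2D cube on a rectilinear lat/lon grid."""
        data = np.arange(len(lats) * len(lons), dtype=np.float32)
        cube = Cube(data.reshape(len(lats), len(lons)))
        cube.add_dim_coord(
            DimCoord(lats, standard_name="latitude", units="degrees"), 0
        )
        cube.add_dim_coord(
            DimCoord(lons, standard_name="longitude", units="degrees"), 1
        )
        return cube

    src = latlon_cube(np.linspace(0.0, 10.0, 6), np.linspace(0.0, 10.0, 6))
    # Target grid only partially overlaps the source domain.
    tgt = latlon_cube(np.linspace(5.0, 20.0, 4), np.linspace(5.0, 20.0, 4))

    # "nan" extrapolation fills every target point outside the source extent
    # with NaN: the pattern the *_partial_overlap / no_overlap results encode.
    result = src.regrid(tgt, iris.analysis.Linear(extrapolation_mode="nan"))
    print(result.data)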
diff --git a/lib/iris/tests/results/analysis/rms_latitude.cml b/lib/iris/tests/results/analysis/rms_latitude.cml
index d4b1428fb2..e3b82802ca 100644
--- a/lib/iris/tests/results/analysis/rms_latitude.cml
+++ b/lib/iris/tests/results/analysis/rms_latitude.cml
@@ -8,26 +8,25 @@
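The rms_*, std_dev_*, sum_* and variance_* fixtures in this section are all generated by the same collapse pattern. A sketch, assuming `cube` is any cube with bounded latitude and longitude dim coords:

    import iris.analysis
    import iris.analysis.cartography

    # Collapse one coordinate with a statistic...
    rms_lat = cube.collapsed("latitude", iris.analysis.RMS)
    # ...or both at once (the "_1call" fixtures) rather than in sequence.
    var_both = cube.collapsed(["latitude", "longitude"], iris.analysis.VARIANCE)
    # Weighted variants (the *_weighted_* fixtures) pass explicit weights.
    weights = iris.analysis.cartography.area_weights(cube)
    mean_wtd = cube.collapsed(
        ["latitude", "longitude"], iris.analysis.MEAN, weights=weights
    )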
diff --git a/lib/iris/tests/results/analysis/rms_latitude_longitude.cml b/lib/iris/tests/results/analysis/rms_latitude_longitude.cml
index 4293087847..d0c7c95535 100644
--- a/lib/iris/tests/results/analysis/rms_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/rms_latitude_longitude.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/rms_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/rms_latitude_longitude_1call.cml
index 9ca1d23b42..887b8b6ebb 100644
--- a/lib/iris/tests/results/analysis/rms_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/rms_latitude_longitude_1call.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/rms_weighted_2d.cml b/lib/iris/tests/results/analysis/rms_weighted_2d.cml
index 433e27d359..b315bd0983 100644
--- a/lib/iris/tests/results/analysis/rms_weighted_2d.cml
+++ b/lib/iris/tests/results/analysis/rms_weighted_2d.cml
@@ -3,12 +3,12 @@
diff --git a/lib/iris/tests/results/analysis/rolling_window/simple_latitude.cml b/lib/iris/tests/results/analysis/rolling_window/simple_latitude.cml
index ff64076f83..2eb8d59561 100644
--- a/lib/iris/tests/results/analysis/rolling_window/simple_latitude.cml
+++ b/lib/iris/tests/results/analysis/rolling_window/simple_latitude.cml
@@ -3,11 +3,11 @@
diff --git a/lib/iris/tests/results/analysis/rolling_window/simple_longitude.cml b/lib/iris/tests/results/analysis/rolling_window/simple_longitude.cml
index b2c422057e..7979ae25b6 100644
--- a/lib/iris/tests/results/analysis/rolling_window/simple_longitude.cml
+++ b/lib/iris/tests/results/analysis/rolling_window/simple_longitude.cml
@@ -3,12 +3,12 @@
diff --git a/lib/iris/tests/results/analysis/rolling_window/size_4_longitude.cml b/lib/iris/tests/results/analysis/rolling_window/size_4_longitude.cml
index 0e4330ce82..6c19e04f6f 100644
--- a/lib/iris/tests/results/analysis/rolling_window/size_4_longitude.cml
+++ b/lib/iris/tests/results/analysis/rolling_window/size_4_longitude.cml
@@ -3,10 +3,10 @@
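The rolling_window fixtures exercise `Cube.rolling_window`. A sketch, with window sizes following the fixture names; the aggregator chosen here is illustrative:

    import iris.analysis

    # A running mean over 3 points of longitude, and the size-4 variant:
    smoothed_3 = cube.rolling_window("longitude", iris.analysis.MEAN, 3)
    smoothed_4 = cube.rolling_window("longitude", iris.analysis.MEAN, 4)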
diff --git a/lib/iris/tests/results/analysis/sqrt.cml b/lib/iris/tests/results/analysis/sqrt.cml
index f8a1c48fc3..6bdeaee3e9 100644
--- a/lib/iris/tests/results/analysis/sqrt.cml
+++ b/lib/iris/tests/results/analysis/sqrt.cml
@@ -6,36 +6,51 @@
diff --git a/lib/iris/tests/results/analysis/std_dev_latitude.cml b/lib/iris/tests/results/analysis/std_dev_latitude.cml
index a45aefeff4..fec9f9d09c 100644
--- a/lib/iris/tests/results/analysis/std_dev_latitude.cml
+++ b/lib/iris/tests/results/analysis/std_dev_latitude.cml
@@ -8,26 +8,25 @@
diff --git a/lib/iris/tests/results/analysis/std_dev_latitude_longitude.cml b/lib/iris/tests/results/analysis/std_dev_latitude_longitude.cml
index 95e8e3694d..86d60a29ad 100644
--- a/lib/iris/tests/results/analysis/std_dev_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/std_dev_latitude_longitude.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/std_dev_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/std_dev_latitude_longitude_1call.cml
index f91f6005b7..26baf44a65 100644
--- a/lib/iris/tests/results/analysis/std_dev_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/std_dev_latitude_longitude_1call.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/subtract.cml b/lib/iris/tests/results/analysis/subtract.cml
index 3466578756..d8a56d895b 100644
--- a/lib/iris/tests/results/analysis/subtract.cml
+++ b/lib/iris/tests/results/analysis/subtract.cml
@@ -6,36 +6,51 @@
diff --git a/lib/iris/tests/results/analysis/subtract_array.cml b/lib/iris/tests/results/analysis/subtract_array.cml
index 14b0b42dd8..b137858af8 100644
--- a/lib/iris/tests/results/analysis/subtract_array.cml
+++ b/lib/iris/tests/results/analysis/subtract_array.cml
@@ -6,36 +6,51 @@
diff --git a/lib/iris/tests/results/analysis/subtract_coord_x.cml b/lib/iris/tests/results/analysis/subtract_coord_x.cml
index 060814c6ba..ae951e328f 100644
--- a/lib/iris/tests/results/analysis/subtract_coord_x.cml
+++ b/lib/iris/tests/results/analysis/subtract_coord_x.cml
@@ -6,36 +6,51 @@
diff --git a/lib/iris/tests/results/analysis/subtract_coord_y.cml b/lib/iris/tests/results/analysis/subtract_coord_y.cml
index 4a9351cf6f..0aaf05808c 100644
--- a/lib/iris/tests/results/analysis/subtract_coord_y.cml
+++ b/lib/iris/tests/results/analysis/subtract_coord_y.cml
@@ -6,36 +6,51 @@
diff --git a/lib/iris/tests/results/analysis/subtract_scalar.cml b/lib/iris/tests/results/analysis/subtract_scalar.cml
index f458364143..889cde24bd 100644
--- a/lib/iris/tests/results/analysis/subtract_scalar.cml
+++ b/lib/iris/tests/results/analysis/subtract_scalar.cml
@@ -6,36 +6,51 @@
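The sqrt and subtract_* fixtures cover cube maths. A sketch of the pattern, assuming `cube` is any loaded cube (operands illustrative):

    import numpy as np

    import iris.analysis.maths

    # Element-wise ufuncs return a new cube (the sqrt fixture):
    rooted = iris.analysis.maths.apply_ufunc(np.sqrt, cube)

    # Subtraction broadcasts a scalar, an array, or a coordinate's points
    # over the cube data (subtract_scalar / subtract_array / subtract_coord_*):
    minus_scalar = cube - 280.0
    minus_array = cube - np.ones(cube.shape, dtype=cube.dtype)
    minus_coord = cube - cube.coord(axis="x")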
diff --git a/lib/iris/tests/results/analysis/sum_latitude.cml b/lib/iris/tests/results/analysis/sum_latitude.cml
index fbb8460fd8..bef5f48f72 100644
--- a/lib/iris/tests/results/analysis/sum_latitude.cml
+++ b/lib/iris/tests/results/analysis/sum_latitude.cml
@@ -8,26 +8,25 @@
diff --git a/lib/iris/tests/results/analysis/sum_latitude_longitude.cml b/lib/iris/tests/results/analysis/sum_latitude_longitude.cml
index cb992f3b9d..f5ce9b622c 100644
--- a/lib/iris/tests/results/analysis/sum_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/sum_latitude_longitude.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/sum_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/sum_latitude_longitude_1call.cml
index 6171dc516b..3dca019667 100644
--- a/lib/iris/tests/results/analysis/sum_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/sum_latitude_longitude_1call.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/sum_weighted_1d.cml b/lib/iris/tests/results/analysis/sum_weighted_1d.cml
index 09958e4eb0..3579d60d1a 100644
--- a/lib/iris/tests/results/analysis/sum_weighted_1d.cml
+++ b/lib/iris/tests/results/analysis/sum_weighted_1d.cml
@@ -3,7 +3,7 @@
diff --git a/lib/iris/tests/results/analysis/sum_weighted_2d.cml b/lib/iris/tests/results/analysis/sum_weighted_2d.cml
index 57cf7d3d1f..4b8b04b1aa 100644
--- a/lib/iris/tests/results/analysis/sum_weighted_2d.cml
+++ b/lib/iris/tests/results/analysis/sum_weighted_2d.cml
@@ -3,13 +3,13 @@
diff --git a/lib/iris/tests/results/analysis/third_quartile_foo_1d.cml b/lib/iris/tests/results/analysis/third_quartile_foo_1d.cml
index 038e7c8668..78d56bc4ec 100644
--- a/lib/iris/tests/results/analysis/third_quartile_foo_1d.cml
+++ b/lib/iris/tests/results/analysis/third_quartile_foo_1d.cml
@@ -3,7 +3,7 @@
diff --git a/lib/iris/tests/results/analysis/third_quartile_foo_1d_fast_percentile.cml b/lib/iris/tests/results/analysis/third_quartile_foo_1d_fast_percentile.cml
index 038e7c8668..78d56bc4ec 100644
--- a/lib/iris/tests/results/analysis/third_quartile_foo_1d_fast_percentile.cml
+++ b/lib/iris/tests/results/analysis/third_quartile_foo_1d_fast_percentile.cml
@@ -3,7 +3,7 @@
diff --git a/lib/iris/tests/results/analysis/variance_latitude.cml b/lib/iris/tests/results/analysis/variance_latitude.cml
index 5b55731396..1efa3dc26c 100644
--- a/lib/iris/tests/results/analysis/variance_latitude.cml
+++ b/lib/iris/tests/results/analysis/variance_latitude.cml
@@ -8,26 +8,25 @@
diff --git a/lib/iris/tests/results/analysis/variance_latitude_longitude.cml b/lib/iris/tests/results/analysis/variance_latitude_longitude.cml
index 359e40ef8a..9fbd2bac53 100644
--- a/lib/iris/tests/results/analysis/variance_latitude_longitude.cml
+++ b/lib/iris/tests/results/analysis/variance_latitude_longitude.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/variance_latitude_longitude_1call.cml b/lib/iris/tests/results/analysis/variance_latitude_longitude_1call.cml
index 0345eac77b..53484137ca 100644
--- a/lib/iris/tests/results/analysis/variance_latitude_longitude_1call.cml
+++ b/lib/iris/tests/results/analysis/variance_latitude_longitude_1call.cml
@@ -8,13 +8,13 @@
@@ -24,9 +24,8 @@
diff --git a/lib/iris/tests/results/analysis/weighted_mean_lat.cml b/lib/iris/tests/results/analysis/weighted_mean_lat.cml
index d2bb6f0df4..7786112b9c 100644
--- a/lib/iris/tests/results/analysis/weighted_mean_lat.cml
+++ b/lib/iris/tests/results/analysis/weighted_mean_lat.cml
@@ -3,15 +3,15 @@
diff --git a/lib/iris/tests/results/analysis/weighted_mean_latlon.cml b/lib/iris/tests/results/analysis/weighted_mean_latlon.cml
index e25e74c021..c7addc162a 100644
--- a/lib/iris/tests/results/analysis/weighted_mean_latlon.cml
+++ b/lib/iris/tests/results/analysis/weighted_mean_latlon.cml
@@ -3,15 +3,15 @@
diff --git a/lib/iris/tests/results/analysis/weighted_mean_lon.cml b/lib/iris/tests/results/analysis/weighted_mean_lon.cml
index 6ce89976b6..2fc50bf6d4 100644
--- a/lib/iris/tests/results/analysis/weighted_mean_lon.cml
+++ b/lib/iris/tests/results/analysis/weighted_mean_lon.cml
@@ -3,15 +3,15 @@
diff --git a/lib/iris/tests/results/analysis/weighted_mean_original.cml b/lib/iris/tests/results/analysis/weighted_mean_original.cml
index a69e633e26..a013add0cb 100644
--- a/lib/iris/tests/results/analysis/weighted_mean_original.cml
+++ b/lib/iris/tests/results/analysis/weighted_mean_original.cml
@@ -7,37 +7,63 @@
diff --git a/lib/iris/tests/results/analysis/weighted_mean_source.cml b/lib/iris/tests/results/analysis/weighted_mean_source.cml
index eb72035a4f..9ea20dfe46 100644
--- a/lib/iris/tests/results/analysis/weighted_mean_source.cml
+++ b/lib/iris/tests/results/analysis/weighted_mean_source.cml
@@ -3,15 +3,15 @@
diff --git a/lib/iris/tests/results/categorisation/customcheck.cml b/lib/iris/tests/results/categorisation/customcheck.cml
index 476a1c56ef..7fd65b6965 100644
--- a/lib/iris/tests/results/categorisation/customcheck.cml
+++ b/lib/iris/tests/results/categorisation/customcheck.cml
@@ -4,22 +4,23 @@
+ 0, 1, 1, 1, 1, 1, 2]" shape="(23,)" units="Unit('1')" value_type="int64"/>
+ 1970, 1970, 1970, 1970, 1970, 1971, 1971, 1971,
+ 1971, 1971, 1971, 1971, 1971, 1971, 1971]" shape="(23,)" units="Unit('1')" value_type="int64"/>
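The categorisation fixtures (customcheck, quickcheck) derive integer category coordinates from a time coordinate. A sketch of the standard helpers, plus a custom rule of the kind "customcheck" implies (the category function and coord names here are illustrative):

    import iris.coord_categorisation as cat

    # Standard helpers add new integer-valued aux coords alongside "time":
    cat.add_year(cube, "time", name="my_year")
    cat.add_month_number(cube, "time", name="my_month")
    cat.add_season_number(cube, "time", name="my_season")

    # A custom categorisation is just a function of (coord, point):
    cat.add_categorised_coord(
        cube, "my_category", cube.coord("time"),
        lambda coord, value: int(value) % 3,
    )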
diff --git a/lib/iris/tests/results/categorisation/quickcheck.cml b/lib/iris/tests/results/categorisation/quickcheck.cml
index b8f3904ad1..58a8fafa5c 100644
--- a/lib/iris/tests/results/categorisation/quickcheck.cml
+++ b/lib/iris/tests/results/categorisation/quickcheck.cml
@@ -3,72 +3,77 @@
+ 2, 0, 1, 1, 2, 0, 1]" shape="(23,)" units="Unit('1')" value_type="int64"/>
+ 1, 1, 1, 1, 2, 2, 2]" shape="(23,)" units="Unit('1')" value_type="int64"/>
+ 1970, 1970, 1970, 1970, 1970, 1971, 1971, 1971,
+ 1971, 1971, 1971, 1971, 1971, 1971, 1971]" shape="(23,)" units="Unit('1')" value_type="int64"/>
+ 1, 0, 6, 5, 4, 3, 2]" shape="(23,)" units="Unit('1')" value_type="int64"/>
+ 1970, 1970, 1970, 1970, 1970, 1970, 1971, 1971,
+ 1971, 1971, 1971, 1971, 1971, 1971, 1971]" shape="(23,)" units="Unit('1')" value_type="int64"/>
diff --git a/lib/iris/tests/results/cdm/extract/lat_eq_10.cml b/lib/iris/tests/results/cdm/extract/lat_eq_10.cml
index e7213fc7bd..f6052ccb93 100644
--- a/lib/iris/tests/results/cdm/extract/lat_eq_10.cml
+++ b/lib/iris/tests/results/cdm/extract/lat_eq_10.cml
@@ -8,129 +8,139 @@
diff --git a/lib/iris/tests/results/cdm/extract/lat_gt_10.cml b/lib/iris/tests/results/cdm/extract/lat_gt_10.cml
index 3ffbbf89e5..c06345ab33 100644
--- a/lib/iris/tests/results/cdm/extract/lat_gt_10.cml
+++ b/lib/iris/tests/results/cdm/extract/lat_gt_10.cml
@@ -8,138 +8,148 @@
diff --git a/lib/iris/tests/results/cdm/extract/lat_gt_10_and_lon_ge_10.cml b/lib/iris/tests/results/cdm/extract/lat_gt_10_and_lon_ge_10.cml
index 7091aee748..b9f2a4b496 100644
--- a/lib/iris/tests/results/cdm/extract/lat_gt_10_and_lon_ge_10.cml
+++ b/lib/iris/tests/results/cdm/extract/lat_gt_10_and_lon_ge_10.cml
@@ -8,139 +8,148 @@
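The cdm/extract fixtures pin down `Cube.extract` with coordinate-value constraints; threshold values follow the fixture names:

    import iris

    lat_eq_10 = cube.extract(iris.Constraint(latitude=10))
    lat_gt_10 = cube.extract(iris.Constraint(latitude=lambda cell: cell > 10))
    both = cube.extract(
        iris.Constraint(latitude=lambda cell: cell > 10)
        & iris.Constraint(longitude=lambda cell: cell >= 10)
    )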
diff --git a/lib/iris/tests/results/cdm/masked_cube.cml b/lib/iris/tests/results/cdm/masked_cube.cml
index dcfa8c062f..64663a55fe 100644
--- a/lib/iris/tests/results/cdm/masked_cube.cml
+++ b/lib/iris/tests/results/cdm/masked_cube.cml
@@ -7,32 +7,30 @@
diff --git a/lib/iris/tests/results/cdm/str_repr/cell_methods.__str__.txt b/lib/iris/tests/results/cdm/str_repr/cell_methods.__str__.txt
index ffb6a62daf..fa1cd1c04c 100644
--- a/lib/iris/tests/results/cdm/str_repr/cell_methods.__str__.txt
+++ b/lib/iris/tests/results/cdm/str_repr/cell_methods.__str__.txt
@@ -8,10 +8,10 @@ air_temperature / (K) (latitude: 73; longitude: 96)
pressure 1000.0 hPa
time 1998-12-01 00:00:00
Cell methods:
- mean longitude (6 minutes, This is a test comment), latitude (12 minutes)
- average longitude (6 minutes, This is another test comment), latitude (15 minutes, This is another comment)
- average longitude, latitude
- percentile longitude (6 minutes, This is another test comment)
+ 0 longitude: latitude: mean (interval: 6 minutes interval: 12 minutes comment: This is a test comment)
+ 1 longitude: latitude: average (interval: 6 minutes interval: 15 minutes comment: This is another test comment comment: This is another comment)
+ 2 longitude: latitude: average
+ 3 longitude: percentile (interval: 6 minutes comment: This is another test comment)
Attributes:
STASH m01s16i203
source 'Data from Met Office Unified Model'
\ No newline at end of file
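The change above swaps the legacy cell-method rendering for the CF-style one: an index per method, "coord:" prefixes, and "interval:"/"comment:" keywords. For reference, a sketch of how the first entry is built; `CellMethod` is real Iris API, with the argument values read off the expected output:

    from iris.coords import CellMethod

    cm = CellMethod(
        method="mean",
        coords=("longitude", "latitude"),
        intervals=("6 minutes", "12 minutes"),
        comments=("This is a test comment",),
    )
    # str(cm) renders in the new style, roughly:
    #   longitude: latitude: mean (interval: 6 minutes interval: 12 minutes
    #   comment: This is a test comment)
    print(cm)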
diff --git a/lib/iris/tests/results/cdm/test_simple_cube_intersection.cml b/lib/iris/tests/results/cdm/test_simple_cube_intersection.cml
index 8d1b986397..c4d2c6dd81 100644
--- a/lib/iris/tests/results/cdm/test_simple_cube_intersection.cml
+++ b/lib/iris/tests/results/cdm/test_simple_cube_intersection.cml
@@ -3,12 +3,12 @@
@@ -22,12 +22,12 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d.cml b/lib/iris/tests/results/concatenate/concat_2x2d.cml
index feeb553642..cd4fd537ff 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d.cml
@@ -3,16 +3,16 @@
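All the concat_* fixtures exercise `CubeList.concatenate`: cubes that abut along an existing dimension coordinate are joined into one. A minimal sketch of the 2x2d case (coord names and values illustrative):

    import numpy as np

    from iris.coords import DimCoord
    from iris.cube import Cube, CubeList

    def tile(x_points):
        """One 2x2 tile on shared y points and the given x points."""
        cube = Cube(np.zeros((2, 2), dtype=np.float32), long_name="payload")
        cube.add_dim_coord(DimCoord([0.0, 1.0], long_name="y", units="1"), 0)
        cube.add_dim_coord(DimCoord(x_points, long_name="x", units="1"), 1)
        return cube

    # Two tiles side by side along x concatenate into a single (2, 4) cube.
    joined = CubeList([tile([0.0, 1.0]), tile([2.0, 3.0])]).concatenate_cube()
    print(joined.shape)  # (2, 4)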
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x.cml
index 9076ae2538..894f4df52c 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x.cml
@@ -3,19 +3,19 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_bounds.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_bounds.cml
index 5597a876b2..07e66e82b6 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_bounds.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_bounds.cml
@@ -3,24 +3,24 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_xy.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_xy.cml
index 4c5c993b9e..37330ba58b 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_xy.cml
@@ -3,23 +3,23 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y.cml
index 2ace2a8024..51326ca74b 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y.cml
@@ -3,22 +3,22 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y_xy.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y_xy.cml
index e0f1fd2775..fa5b41299a 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_x_y_xy.cml
@@ -3,26 +3,26 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy.cml
index 5bc3c707f7..fac46bb54d 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy.cml
@@ -3,20 +3,20 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy_bounds.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy_bounds.cml
index 4f279cef01..d947bb394d 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy_bounds.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_xy_bounds.cml
@@ -3,26 +3,26 @@
+ [[201, 202, 203, 204],
+ [301, 302, 303, 304],
+ [202, 203, 204, 205],
+ [302, 303, 304, 305]]]" id="af0ab254" long_name="xy-aux" points="[[ 1., 101., 2., 102.],
+ [201., 301., 202., 302.]]" shape="(2, 4)" units="Unit('1')" value_type="float32"/>
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_y.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_y.cml
index 95575d1b65..4d66e2e2d5 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_y.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_y.cml
@@ -3,19 +3,19 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2x2d_aux_y_xy.cml b/lib/iris/tests/results/concatenate/concat_2x2d_aux_y_xy.cml
index dbe28f6a65..e55016f80a 100644
--- a/lib/iris/tests/results/concatenate/concat_2x2d_aux_y_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2x2d_aux_y_xy.cml
@@ -3,23 +3,23 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d.cml b/lib/iris/tests/results/concatenate/concat_2y2d.cml
index 55a896c12a..bdf2c04c91 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d.cml
@@ -3,16 +3,16 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x.cml
index 6e8e367501..55d9978911 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x.cml
@@ -3,19 +3,19 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_xy.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_xy.cml
index 20ce15e486..cdfed95f4e 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_xy.cml
@@ -3,27 +3,27 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y.cml
index f486652592..91bbdc381e 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y.cml
@@ -3,22 +3,22 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y_xy.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y_xy.cml
index cc1377cfd0..6e747200da 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_x_y_xy.cml
@@ -3,30 +3,30 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_xy.cml
index 4e4a8d8729..ab85674486 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_xy.cml
@@ -3,24 +3,24 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_y.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_y.cml
index 73a11c74a8..40b1d6bbe9 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_y.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_y.cml
@@ -3,19 +3,19 @@
diff --git a/lib/iris/tests/results/concatenate/concat_2y2d_aux_y_xy.cml b/lib/iris/tests/results/concatenate/concat_2y2d_aux_y_xy.cml
index 8add7084dc..17af2b0653 100644
--- a/lib/iris/tests/results/concatenate/concat_2y2d_aux_y_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_2y2d_aux_y_xy.cml
@@ -3,27 +3,27 @@
diff --git a/lib/iris/tests/results/concatenate/concat_3d_simple.cml b/lib/iris/tests/results/concatenate/concat_3d_simple.cml
index 67c7cc4376..9afb0fd9dd 100644
--- a/lib/iris/tests/results/concatenate/concat_3d_simple.cml
+++ b/lib/iris/tests/results/concatenate/concat_3d_simple.cml
@@ -3,70 +3,70 @@
+ [[4000., 5000., 4000., 5000.],
+ [6000., 7000., 6000., 7000.],
+ [4000., 5000., 4000., 5000.],
+ [6000., 7000., 6000., 7000.]]]" shape="(4, 4, 4)" units="Unit('1')" value_type="float32"/>
diff --git a/lib/iris/tests/results/concatenate/concat_4mix2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_4mix2d_aux_xy.cml
index b706f7b3cb..e53113e840 100644
--- a/lib/iris/tests/results/concatenate/concat_4mix2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_4mix2d_aux_xy.cml
@@ -3,22 +3,22 @@
diff --git a/lib/iris/tests/results/concatenate/concat_4x2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_4x2d_aux_xy.cml
index 229281f88c..67bba81710 100644
--- a/lib/iris/tests/results/concatenate/concat_4x2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_4x2d_aux_xy.cml
@@ -3,22 +3,22 @@
diff --git a/lib/iris/tests/results/concatenate/concat_4y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_4y2d_aux_xy.cml
index bf9ee0a610..9efc2ab088 100644
--- a/lib/iris/tests/results/concatenate/concat_4y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_4y2d_aux_xy.cml
@@ -3,22 +3,22 @@
diff --git a/lib/iris/tests/results/concatenate/concat_9mix2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_9mix2d_aux_xy.cml
index 636d7ad06d..c29783bca6 100644
--- a/lib/iris/tests/results/concatenate/concat_9mix2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_9mix2d_aux_xy.cml
@@ -3,28 +3,28 @@
diff --git a/lib/iris/tests/results/concatenate/concat_9x2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_9x2d_aux_xy.cml
index dea24c5518..1fd9d843c5 100644
--- a/lib/iris/tests/results/concatenate/concat_9x2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_9x2d_aux_xy.cml
@@ -3,28 +3,28 @@
diff --git a/lib/iris/tests/results/concatenate/concat_9y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_9y2d_aux_xy.cml
index ed4b23ce08..1d62fae473 100644
--- a/lib/iris/tests/results/concatenate/concat_9y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_9y2d_aux_xy.cml
@@ -3,28 +3,28 @@
diff --git a/lib/iris/tests/results/concatenate/concat_anonymous.cml b/lib/iris/tests/results/concatenate/concat_anonymous.cml
index 7eeccb2241..c5f986cdcd 100644
--- a/lib/iris/tests/results/concatenate/concat_anonymous.cml
+++ b/lib/iris/tests/results/concatenate/concat_anonymous.cml
@@ -3,14 +3,14 @@
@@ -19,8 +19,8 @@
diff --git a/lib/iris/tests/results/concatenate/concat_masked_2x2d.cml b/lib/iris/tests/results/concatenate/concat_masked_2x2d.cml
index f8b47f9627..6b25ac8259 100644
--- a/lib/iris/tests/results/concatenate/concat_masked_2x2d.cml
+++ b/lib/iris/tests/results/concatenate/concat_masked_2x2d.cml
@@ -3,14 +3,14 @@
diff --git a/lib/iris/tests/results/concatenate/concat_masked_2y2d.cml b/lib/iris/tests/results/concatenate/concat_masked_2y2d.cml
index d4f31c7e44..86460bc8a9 100644
--- a/lib/iris/tests/results/concatenate/concat_masked_2y2d.cml
+++ b/lib/iris/tests/results/concatenate/concat_masked_2y2d.cml
@@ -3,14 +3,14 @@
diff --git a/lib/iris/tests/results/concatenate/concat_merged_scalar_4mix2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_merged_scalar_4mix2d_aux_xy.cml
index 645d0aa95f..77143dc05a 100644
--- a/lib/iris/tests/results/concatenate/concat_merged_scalar_4mix2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_merged_scalar_4mix2d_aux_xy.cml
@@ -3,25 +3,25 @@
diff --git a/lib/iris/tests/results/concatenate/concat_merged_scalar_4x2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_merged_scalar_4x2d_aux_xy.cml
index 645d0aa95f..77143dc05a 100644
--- a/lib/iris/tests/results/concatenate/concat_merged_scalar_4x2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_merged_scalar_4x2d_aux_xy.cml
@@ -3,25 +3,25 @@
diff --git a/lib/iris/tests/results/concatenate/concat_merged_scalar_4y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_merged_scalar_4y2d_aux_xy.cml
index 94bcb31795..8f72b0339e 100644
--- a/lib/iris/tests/results/concatenate/concat_merged_scalar_4y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_merged_scalar_4y2d_aux_xy.cml
@@ -3,25 +3,25 @@
diff --git a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4mix2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4mix2d_aux_xy.cml
index 1b60930a09..3c078ffbcc 100644
--- a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4mix2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4mix2d_aux_xy.cml
@@ -3,19 +3,19 @@
@@ -24,19 +24,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -45,19 +45,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -66,19 +66,19 @@
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
diff --git a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4x2_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4x2_aux_xy.cml
index 1f87f5b3cf..db474a8d40 100644
--- a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4x2_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4x2_aux_xy.cml
@@ -3,19 +3,19 @@
@@ -24,19 +24,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -45,19 +45,19 @@
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -66,19 +66,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
diff --git a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4y2d_aux_xy.cml
index cca6094d9c..7cef64ff1e 100644
--- a/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_pre_merged_scalar_4y2d_aux_xy.cml
@@ -3,19 +3,19 @@
@@ -24,19 +24,19 @@
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -45,19 +45,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
@@ -66,19 +66,19 @@
+ [2.5, 3.5]]" id="78a0dfe8" long_name="x" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
+ [2.5, 3.5]]" id="6fdbcbab" long_name="y" points="[2., 3.]" shape="(2,)" units="Unit('1')" value_type="float32"/>
diff --git a/lib/iris/tests/results/concatenate/concat_scalar_4mix2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_scalar_4mix2d_aux_xy.cml
index 8709ebd03d..6f2924b86a 100644
--- a/lib/iris/tests/results/concatenate/concat_scalar_4mix2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_scalar_4mix2d_aux_xy.cml
@@ -3,25 +3,25 @@
@@ -30,25 +30,25 @@
diff --git a/lib/iris/tests/results/concatenate/concat_scalar_4x2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_scalar_4x2d_aux_xy.cml
index 8709ebd03d..6f2924b86a 100644
--- a/lib/iris/tests/results/concatenate/concat_scalar_4x2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_scalar_4x2d_aux_xy.cml
@@ -3,25 +3,25 @@
@@ -30,25 +30,25 @@
diff --git a/lib/iris/tests/results/concatenate/concat_scalar_4y2d_aux_xy.cml b/lib/iris/tests/results/concatenate/concat_scalar_4y2d_aux_xy.cml
index 864e476e97..d5b2573933 100644
--- a/lib/iris/tests/results/concatenate/concat_scalar_4y2d_aux_xy.cml
+++ b/lib/iris/tests/results/concatenate/concat_scalar_4y2d_aux_xy.cml
@@ -3,25 +3,25 @@
@@ -30,25 +30,25 @@
diff --git a/lib/iris/tests/results/constrained_load/all_10_load_match.cml b/lib/iris/tests/results/constrained_load/all_10_load_match.cml
index 0712af20fa..7be771967b 100644
--- a/lib/iris/tests/results/constrained_load/all_10_load_match.cml
+++ b/lib/iris/tests/results/constrained_load/all_10_load_match.cml
@@ -8,25 +8,27 @@
@@ -38,10 +40,10 @@
@@ -59,25 +61,27 @@
@@ -89,10 +93,10 @@
@@ -110,26 +114,27 @@
@@ -141,10 +146,10 @@
@@ -162,26 +167,27 @@
@@ -193,10 +199,10 @@
diff --git a/lib/iris/tests/results/constrained_load/all_ml_10_22_load_match.cml b/lib/iris/tests/results/constrained_load/all_ml_10_22_load_match.cml
index 20971021ac..44f796f630 100644
--- a/lib/iris/tests/results/constrained_load/all_ml_10_22_load_match.cml
+++ b/lib/iris/tests/results/constrained_load/all_ml_10_22_load_match.cml
@@ -8,26 +8,28 @@
@@ -39,11 +41,11 @@
@@ -61,26 +63,28 @@
@@ -92,11 +96,11 @@
@@ -114,27 +118,28 @@
@@ -146,11 +151,11 @@
@@ -168,27 +173,28 @@
@@ -200,11 +206,11 @@
diff --git a/lib/iris/tests/results/constrained_load/attribute_constraint.cml b/lib/iris/tests/results/constrained_load/attribute_constraint.cml
index 664dc943bc..53529dc684 100644
--- a/lib/iris/tests/results/constrained_load/attribute_constraint.cml
+++ b/lib/iris/tests/results/constrained_load/attribute_constraint.cml
@@ -9,129 +9,140 @@
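The constrained_load fixtures cover constraint-based loading. A sketch of the forms the fixture names refer to (the file path and phenomenon name are hypothetical):

    import iris

    # "theta_10": a named phenomenon at model_level_number == 10.
    theta_10 = iris.Constraint(
        "air_potential_temperature", model_level_number=10
    )

    # "theta_level_gt_30_le_3": level ranges expressed with a callable.
    levels = iris.Constraint(
        model_level_number=lambda cell: cell > 30 or cell <= 3
    )

    # "attribute_constraint": match a cube attribute such as the STASH code.
    stash = iris.AttributeConstraint(STASH="m01s16i203")

    cubes = iris.load("theta_and_friends.pp", [theta_10, levels & stash])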
diff --git a/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_match.cml b/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_match.cml
index 44e7d077df..2440c89883 100644
--- a/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_match.cml
+++ b/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_match.cml
@@ -8,25 +8,27 @@
@@ -38,10 +40,10 @@
@@ -59,63 +61,67 @@
diff --git a/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_strict.cml b/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_strict.cml
index 44e7d077df..2440c89883 100644
--- a/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_strict.cml
+++ b/lib/iris/tests/results/constrained_load/theta_10_and_theta_level_gt_30_le_3_load_strict.cml
@@ -8,25 +8,27 @@
@@ -38,10 +40,10 @@
@@ -59,63 +61,67 @@
diff --git a/lib/iris/tests/results/constrained_load/theta_10_load_match.cml b/lib/iris/tests/results/constrained_load/theta_10_load_match.cml
index e2852d0151..4aee6bb065 100644
--- a/lib/iris/tests/results/constrained_load/theta_10_load_match.cml
+++ b/lib/iris/tests/results/constrained_load/theta_10_load_match.cml
@@ -8,25 +8,27 @@
@@ -38,10 +40,10 @@
diff --git a/lib/iris/tests/results/constrained_load/theta_10_load_strict.cml b/lib/iris/tests/results/constrained_load/theta_10_load_strict.cml
index e2852d0151..4aee6bb065 100644
--- a/lib/iris/tests/results/constrained_load/theta_10_load_strict.cml
+++ b/lib/iris/tests/results/constrained_load/theta_10_load_strict.cml
@@ -8,25 +8,27 @@
@@ -38,10 +40,10 @@
diff --git a/lib/iris/tests/results/constrained_load/theta_and_all_10_load_match.cml b/lib/iris/tests/results/constrained_load/theta_and_all_10_load_match.cml
index 772929b0da..02bee172aa 100644
--- a/lib/iris/tests/results/constrained_load/theta_and_all_10_load_match.cml
+++ b/lib/iris/tests/results/constrained_load/theta_and_all_10_load_match.cml
@@ -8,129 +8,140 @@
@@ -148,25 +159,27 @@
@@ -178,10 +191,10 @@
@@ -199,25 +212,27 @@