Merged
Changes from all commits (90 commits)
1d6d979
Reinstate CubeList._repr_html_() (#4976)
pp-mo Sep 23, 2022
d38c504
Fix name loader problem (#4933)
pp-mo Sep 26, 2022
2a41be2
Port dependency fixes to `v3.3.x`. (#4992)
trexfeathers Sep 26, 2022
96c00b6
Added a glossary for Iris docs. (#4902)
ESadek-MO Sep 28, 2022
04e757f
Adaptions for Matplotlib 3.6 (#4998)
trexfeathers Sep 28, 2022
6829e61
Update whatsnew for 3.3.1 release (#5002)
stephenworsley Sep 29, 2022
da03d73
Fix benchmarks `netcdf` import (#5001)
trexfeathers Sep 29, 2022
c4781ec
Merge branch 'v3.3.x' into mergeback_v3.3.1
stephenworsley Sep 29, 2022
f69b93f
Merge pull request #5004 from stephenworsley/mergeback_v3.3.1
trexfeathers Sep 29, 2022
2742211
Speed up operations that use the `Coord.cells` method for time coordi…
bouweandela Sep 29, 2022
24ff3bd
Correct wrong type of equality check in benchmark GHA. (#5005)
trexfeathers Sep 29, 2022
04122c0
remove outline workaround (#4999)
rcomer Sep 29, 2022
7ba620e
Updated environment lockfiles (#5008)
scitools-ci[bot] Oct 3, 2022
536ff51
Updated "glossary" entry to link to iris glossary (#5012)
ESadek-MO Oct 3, 2022
b5245f6
DOC: add show option to Makefiles (#5000)
rcomer Oct 5, 2022
959b590
Sample code to convert ORCA data into a meshcube. (#5013)
pp-mo Oct 6, 2022
bae9953
New LBFC-CF mappings (#4859)
trexfeathers Oct 7, 2022
d1f3e45
Cell comparison: remove ancient netcdftime/cftime workarounds (#4729)
rcomer Oct 7, 2022
379c098
Updated environment lockfiles (#5018)
scitools-ci[bot] Oct 10, 2022
6117f60
Speed up Cube.subset/Coord.intersect (#4955)
bouweandela Oct 11, 2022
ea67311
[pre-commit.ci] pre-commit autoupdate (#5021)
pre-commit-ci[bot] Oct 12, 2022
fd241df
Bump peter-evans/create-pull-request from 4.1.3 to 4.2.0 (#5028)
dependabot[bot] Oct 19, 2022
d56824c
Meshcoord metadata follows face/edge coords where present. (#5020)
pp-mo Oct 21, 2022
3234489
Added pandas_ndim FUTURE flag (#4909)
trexfeathers Oct 24, 2022
b5178a9
Prioritise dim coord in `_get_lon_lat_coords()` (#5029)
ESadek-MO Oct 27, 2022
8499fe0
Improve error messages when comparing against objects vs strings (#4928)
stephenworsley Nov 1, 2022
5ab3dc2
test for 'coord exists' constraint (#5043)
rcomer Nov 1, 2022
75c7570
not nan functions (#5039)
rcomer Nov 2, 2022
a6795cf
Use reusable workflow for refresh lockfile workflow (#5036)
lbdreyer Nov 2, 2022
421e193
Accept new copy behaviour from dask/dask#9555. (#5041)
trexfeathers Nov 4, 2022
8744f67
Improve iris.pandas cube -> data.frame (#4669)
hsteptoe Nov 4, 2022
f58682d
fix aggregated_by for derived coords (#4947)
stephenworsley Nov 9, 2022
8f5c69e
Reference the RTD version switcher in the docs latest warning. (#5055)
trexfeathers Nov 10, 2022
10f517b
Links to Gallery in docs (#5009)
ESadek-MO Nov 10, 2022
c1cb14c
`iris.pandas.as_data_frame()` doctests full width (#5057)
trexfeathers Nov 10, 2022
5c9d9b3
Update the `iris.experimental` module (#5056)
trexfeathers Nov 11, 2022
3da5341
WSTARBAR is upward velocity (#5060)
rcomer Nov 11, 2022
f815437
(Regridder unification) improve curvilinear regridding, generalise _c…
stephenworsley Nov 11, 2022
e7a8f31
PP field extra data words fix (#5058)
bjlittle Nov 14, 2022
c0ab2e7
added link to the docs archive. (#5064)
tkknight Nov 14, 2022
b2da2d7
More tolerant NetCDF4 loading - skip un-addable dimensional metadata …
trexfeathers Nov 16, 2022
f9c158c
[CI Bot] environment lockfiles auto-update (#5051)
scitools-ci[bot] Nov 16, 2022
add1365
Fix handling of data in "nearest" trajectory interpolate (#5062)
lbdreyer Nov 16, 2022
6443ac5
Make `iris.pandas.as_data_frame()` n-dimensional behaviour opt-in (#5…
trexfeathers Nov 16, 2022
a8616a2
Clarify Laziness in Iris Functions (#5066)
ESadek-MO Nov 16, 2022
c8b99a7
Move test_pandas into unit tests. (#5071)
trexfeathers Nov 16, 2022
ca0d9fe
Merge remote-tracking branch 'upstream/main' into update_pandas_ndim
Nov 16, 2022
f423142
Merge pull request #5073 from trexfeathers/update_pandas_ndim
trexfeathers Nov 16, 2022
2ed9b7e
Implemented constraint equality. (#3749)
pp-mo Nov 17, 2022
8bb85a8
More accurate netcdf4 pin `<1.6.1`. (#5075)
trexfeathers Nov 17, 2022
badd119
Merge remote-tracking branch 'upstream/main' into update_pandas_ndim_2
trexfeathers Nov 17, 2022
56a7e33
Merge pull request #5076 from trexfeathers/update_pandas_ndim_2
stephenworsley Nov 17, 2022
bd5fa5f
What's New fixes. (#5077)
trexfeathers Nov 17, 2022
1c05e92
Merge pull request #5074 from SciTools/pandas_ndim
stephenworsley Nov 17, 2022
4176d15
Documentation updates for `v3.4.0rc0` release. (#5078)
trexfeathers Nov 17, 2022
d6ee976
Restore latest What's New files.
trexfeathers Nov 17, 2022
3b058ba
Merge pull request #5079 from trexfeathers/mergeback_34x
stephenworsley Nov 17, 2022
57647d2
Updated environment lockfiles (#5080)
scitools-ci[bot] Nov 21, 2022
60e293c
Updated environment lockfiles (#5085)
scitools-ci[bot] Nov 28, 2022
aa2bfa9
[pre-commit.ci] pre-commit autoupdate (#5086)
pre-commit-ci[bot] Nov 29, 2022
d9c0743
Update What's New for 3.4 release. (#5088)
trexfeathers Dec 1, 2022
7e15882
Merge pull request #5090 from SciTools/v3.4.x
stephenworsley Dec 1, 2022
dfbc5c2
Updated environment lockfiles (#5092)
scitools-ci[bot] Dec 5, 2022
6eb0401
DOC: improve gallery test instructions (#5100)
rcomer Dec 9, 2022
bf33b02
Updated environment lockfiles (#5104)
scitools-ci[bot] Dec 12, 2022
efd356d
[pre-commit.ci] pre-commit autoupdate (#5107)
pre-commit-ci[bot] Dec 13, 2022
4a65b9e
Switch order of options and parameter in `ncgen` command (#5105)
fnattino Dec 14, 2022
5555d45
Remove test timings (#5101)
rcomer Dec 14, 2022
d629dcc
Announce @ESadek-MO as a core Iris developer. (#5111)
trexfeathers Dec 16, 2022
8762fab
Correct heading for v3.4 release highlights. (#5110)
trexfeathers Dec 16, 2022
a0f7af7
[pre-commit.ci] pre-commit autoupdate (#5114)
pre-commit-ci[bot] Dec 20, 2022
290fe70
Bump actions/stale from 6 to 7 (#5117)
dependabot[bot] Dec 21, 2022
f6e4b81
[pre-commit.ci] pre-commit autoupdate (#5120)
pre-commit-ci[bot] Jan 3, 2023
fc302c9
pip pin for sphinx<5 (#5122)
bjlittle Jan 3, 2023
f190415
link percentile as lazy option (#5128)
rcomer Jan 4, 2023
e1fae5e
Unpin theme (#5129)
tkknight Jan 5, 2023
34560a5
Spelling remove (#5130)
tkknight Jan 6, 2023
0dd1200
Fix link checks (#5109)
trexfeathers Jan 9, 2023
e883799
Update lock files and accompanying fixes. (#5132)
trexfeathers Jan 10, 2023
769d7f0
Updated citation (#5116)
jonseddon Jan 13, 2023
a3b3560
[pre-commit.ci] pre-commit autoupdate (#5136)
pre-commit-ci[bot] Jan 19, 2023
4a945ec
Iris ❤ Xarray docs page. (#5025)
trexfeathers Jan 24, 2023
30bcc4c
[pre-commit.ci] pre-commit autoupdate (#5143)
pre-commit-ci[bot] Jan 31, 2023
7181bbc
add readme #showyourstripes (#5141)
bjlittle Jan 31, 2023
7da248c
Fixing typo's in Gitwash. (#5145)
HGWright Feb 3, 2023
b08cfa6
[pre-commit.ci] pre-commit autoupdate (#5150)
pre-commit-ci[bot] Feb 7, 2023
de7919a
Replace apparently retired UDUNITS documentation link. (#5153)
trexfeathers Feb 9, 2023
58cdd78
Expand scope of common contributor links (#5159)
rcomer Feb 15, 2023
504c188
Plugin support (#5144)
vsherratt Feb 15, 2023
ca42c30
Updated environment lockfiles (#5163)
scitools-ci[bot] Feb 20, 2023
15 changes: 13 additions & 2 deletions .github/workflows/benchmark.yml
@@ -6,6 +6,12 @@ on:
schedule:
# Runs every day at 23:00.
- cron: "0 23 * * *"
workflow_dispatch:
inputs:
first_commit:
description: "Argument to be passed to the overnight benchmark script."
required: false
type: string

jobs:
benchmark:
@@ -15,7 +21,7 @@ jobs:
env:
IRIS_TEST_DATA_LOC_PATH: benchmarks
IRIS_TEST_DATA_PATH: benchmarks/iris-test-data
IRIS_TEST_DATA_VERSION: "2.15"
IRIS_TEST_DATA_VERSION: "2.18"
# Lets us manually bump the cache to rebuild
ENV_CACHE_BUILD: "0"
TEST_DATA_CACHE_BUILD: "2"
@@ -64,7 +70,12 @@ jobs:

- name: Run overnight benchmarks
run: |
first_commit=$(git log --after="$(date -d "1 day ago" +"%Y-%m-%d") 23:00:00" --pretty=format:"%h" | tail -n 1)
first_commit=${{ inputs.first_commit }}
if [ "$first_commit" == "" ]
then
first_commit=$(git log --after="$(date -d "1 day ago" +"%Y-%m-%d") 23:00:00" --pretty=format:"%h" | tail -n 1)
fi

if [ "$first_commit" != "" ]
then
nox --session="benchmarks(overnight)" -- $first_commit
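The change above adds a `workflow_dispatch` trigger so the overnight benchmarks can be started manually with an explicit `first_commit`; when the input is empty, the script falls back to the oldest commit made since 23:00 the previous day. A rough Python rendering of that fallback, for illustration only (the workflow does this in shell, and `default_first_commit` is a hypothetical helper name):

```python
import subprocess
from datetime import date, timedelta


def default_first_commit() -> str:
    """Return the oldest commit hash since 23:00 yesterday, or "" if none.

    Mirrors the shell fallback in the benchmark workflow above; assumes it
    runs inside a git checkout.
    """
    since = f"{date.today() - timedelta(days=1)} 23:00:00"
    hashes = subprocess.run(
        ["git", "log", f"--after={since}", "--pretty=format:%h"],
        capture_output=True,
        text=True,
        check=True,
    ).stdout.splitlines()
    # git log lists newest first, so the last line is the earliest commit.
    return hashes[-1] if hashes else ""
```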
2 changes: 1 addition & 1 deletion .github/workflows/ci-tests.yml
@@ -46,7 +46,7 @@ jobs:
session: "tests"

env:
IRIS_TEST_DATA_VERSION: "2.16"
IRIS_TEST_DATA_VERSION: "2.18"
ENV_NAME: "ci-tests"

steps:
109 changes: 5 additions & 104 deletions .github/workflows/refresh-lockfiles.yml
@@ -1,13 +1,5 @@
# This workflow periodically creates new environment lock files based on the newest
# available packages and dependencies.
#
# Environment specifications are given as conda environment.yml files found in
# `requirements/ci/py**.yml`. These state the packages required, the conda channels
# that the packages will be pulled from, and any versions of packages that need to be
# pinned at specific versions.
#
# For environments that have changed, a pull request will be made and submitted
# to the main branch
# Updates the environment lock files. See the called workflow in the
# scitools/workflows repo for more details.

name: Refresh Lockfiles

@@ -20,98 +12,7 @@ on:
# https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#onschedule
- cron: "1 0 * * 6"


jobs:

get_python_matrix:
# Determines which Python versions should be included in the matrix used in
# the gen_lockfiles job.
if: "github.repository == 'SciTools/iris'"
runs-on: ubuntu-latest
outputs:
matrix: ${{ steps.get_py.outputs.matrix }}
steps:
- uses: actions/checkout@v3
- id: get_py
run: echo "::set-output name=matrix::$(ls -1 requirements/ci/py*.yml | xargs -n1 basename | sed 's/....$//' | jq -cnR '[inputs]')"

gen_lockfiles:
# this is a matrix job: it splits to create new lockfiles for each
# of the CI test python versions.
if: "github.repository == 'SciTools/iris'"
runs-on: ubuntu-latest
needs: get_python_matrix

strategy:
matrix:
python: ${{ fromJSON(needs.get_python_matrix.outputs.matrix) }}

steps:
- uses: actions/checkout@v3
- name: install requirements
run: |
source $CONDA/bin/activate base
conda install -y -c conda-forge conda-libmamba-solver conda-lock
- name: generate lockfile
env:
CONDA_EXPERIMENTAL_SOLVER: libmamba
run: |
$CONDA/bin/conda-lock lock -k explicit -p linux-64 -f requirements/ci/${{matrix.python}}.yml
mv conda-linux-64.lock ${{matrix.python}}-linux-64.lock
- name: output lockfile
uses: actions/upload-artifact@v3
with:
path: ${{matrix.python}}-linux-64.lock

create_pr:
# once the matrix job has completed all the lock files will have been uploaded as artifacts.
# Download the artifacts, add them to the repo, and create a PR.
if: "github.repository == 'SciTools/iris'"
runs-on: ubuntu-latest
needs: gen_lockfiles

steps:
- uses: actions/checkout@v3
- name: get artifacts
uses: actions/download-artifact@v3
with:
path: artifacts

- name: Update lock files in repo
run: |
cp artifacts/artifact/*.lock requirements/ci/nox.lock
rm -r artifacts

- name: "Generate token"
uses: tibdex/github-app-token@v1
id: generate-token
with:
app_id: ${{ secrets.AUTH_APP_ID }}
private_key: ${{ secrets.AUTH_APP_PRIVATE_KEY }}

- name: Create Pull Request
id: cpr
uses: peter-evans/create-pull-request@671dc9c9e0c2d73f07fa45a3eb0220e1622f0c5f
with:
token: ${{ steps.generate-token.outputs.token }}
commit-message: Updated environment lockfiles
committer: "Lockfile bot <[email protected]>"
author: "Lockfile bot <[email protected]>"
delete-branch: true
branch: auto-update-lockfiles
title: "[iris.ci] environment lockfiles auto-update"
body: |
Lockfiles updated to the latest resolvable environment.

### If the CI tasks fail, create a new branch based on this PR and add the required fixes to that branch.
labels: |
New: Pull Request
Bot

- name: Check Pull Request
if: steps.cpr.outputs.pull-request-number != ''
run: |
echo "pull-request #${{ steps.cpr.outputs.pull-request-number }}"
echo "pull-request URL ${{ steps.cpr.outputs.pull-request-url }}"
echo "pull-request operation [${{ steps.cpr.outputs.pull-request-operation }}]"
echo "pull-request head SHA ${{ steps.cpr.outputs.pull-request-head-sha }}"
refresh_lockfiles:
uses: scitools/workflows/.github/workflows/refresh-lockfiles.yml@main
secrets: inherit
2 changes: 1 addition & 1 deletion .github/workflows/stale.yml
@@ -14,7 +14,7 @@ jobs:
if: "github.repository == 'SciTools/iris'"
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v6
- uses: actions/stale@v7
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}

11 changes: 5 additions & 6 deletions .pre-commit-config.yaml
@@ -13,7 +13,7 @@ minimum_pre_commit_version: 1.21.0

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.3.0
rev: v4.4.0
hooks:
# Prevent giant files from being committed.
- id: check-added-large-files
@@ -29,32 +29,31 @@ repos:
- id: no-commit-to-branch

- repo: https://github.com/psf/black
rev: 22.8.0
rev: 23.1.0
hooks:
- id: black
pass_filenames: false
args: [--config=./pyproject.toml, .]

- repo: https://github.com/PyCQA/flake8
rev: 5.0.4
rev: 6.0.0
hooks:
- id: flake8
types: [file, python]
args: [--config=./setup.cfg]

- repo: https://github.com/pycqa/isort
rev: 5.10.1
rev: 5.12.0
hooks:
- id: isort
types: [file, python]
args: [--filter-files]

- repo: https://github.com/asottile/blacken-docs
rev: v1.12.1
rev: 1.13.0
hooks:
- id: blacken-docs
types: [file, rst]
additional_dependencies: [black==21.6b0]

- repo: https://github.com/aio-libs/sort-all
rev: v1.2.0
21 changes: 21 additions & 0 deletions README.md
@@ -54,3 +54,24 @@ For documentation see the
developer version or the most recent released
<a href="https://scitools-iris.readthedocs.io/en/stable/">stable</a> version.
</p>

## [#ShowYourStripes](https://showyourstripes.info/s/globe)

<h4 align="center">
<a href="https://showyourstripes.info/s/globe">
<img src="https://raw.githubusercontent.com/ed-hawkins/show-your-stripes/master/2021/GLOBE---1850-2021-MO.png"
height="50" width="800"
alt="#showyourstripes Global 1850-2021"></a>
</h4>

**Graphics and Lead Scientist**: [Ed Hawkins](http://www.met.reading.ac.uk/~ed/home/index.php), National Centre for Atmospheric Science, University of Reading.

**Data**: Berkeley Earth, NOAA, UK Met Office, MeteoSwiss, DWD, SMHI, UoR, Meteo France & ZAMG.

<p>
<a href="https://showyourstripes.info/s/globe">#ShowYourStripes</a> is distributed under a
<a href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</a>
<a href="https://creativecommons.org/licenses/by/4.0/">
<img src="https://i.creativecommons.org/l/by/4.0/80x15.png" alt="creative-commons-by" style="border-width:0"></a>
</p>

2 changes: 1 addition & 1 deletion benchmarks/benchmarks/experimental/ugrid/__init__.py
@@ -50,7 +50,7 @@ def time_create(self, *params):

class Connectivity(UGridCommon):
def setup(self, n_faces):
self.array = np.zeros([n_faces, 3], dtype=np.int)
self.array = np.zeros([n_faces, 3], dtype=int)
super().setup(n_faces)

def create(self):
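The one-line change above tracks NumPy's removal of the deprecated `np.int` alias (deprecated in NumPy 1.20, removed in 1.24); the built-in `int` gives the same platform-default integer dtype. A minimal illustration:

```python
import numpy as np

# dtype=int resolves to the platform-default integer type, which is what the
# removed np.int alias used to mean.
connectivity = np.zeros([6, 3], dtype=int)
assert connectivity.dtype == np.dtype(int)
```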
2 changes: 1 addition & 1 deletion benchmarks/benchmarks/generate_data/__init__.py
@@ -113,7 +113,7 @@ def load_realised():
file loading, but some benchmarks are only meaningful if starting with real
arrays.
"""
from iris.fileformats.netcdf import _get_cf_var_data as pre_patched
from iris.fileformats.netcdf.loader import _get_cf_var_data as pre_patched

def patched(cf_var, filename):
return as_concrete_data(pre_patched(cf_var, filename))
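The patch above follows the private helper `_get_cf_var_data` to its new location in `iris.fileformats.netcdf.loader`, reflecting the split of the netcdf file-format code into submodules. Code that must run against both layouts could use a tolerant import; this is only a sketch of that pattern, not something the benchmark itself needs:

```python
try:
    # Newer Iris: the netcdf file-format code is split into submodules.
    from iris.fileformats.netcdf.loader import _get_cf_var_data
except ImportError:
    # Older Iris: the helper lived directly on the netcdf module.
    from iris.fileformats.netcdf import _get_cf_var_data
```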
31 changes: 29 additions & 2 deletions benchmarks/benchmarks/import_iris.py
@@ -5,10 +5,30 @@
# licensing details.
from importlib import import_module, reload

################
# Prepare info for reset_colormaps:

# Import and capture colormaps.
from matplotlib import colormaps # isort:skip

_COLORMAPS_ORIG = set(colormaps)

# Import iris.palette, which modifies colormaps.
import iris.palette

# Derive which colormaps have been added by iris.palette.
_COLORMAPS_MOD = set(colormaps)
COLORMAPS_EXTRA = _COLORMAPS_MOD - _COLORMAPS_ORIG

# Touch iris.palette to prevent linters complaining.
_ = iris.palette

################


class Iris:
@staticmethod
def _import(module_name):
def _import(module_name, reset_colormaps=False):
"""
Have experimented with adding sleep() commands into the imported
modules. The results reveal:
@@ -25,6 +45,13 @@ def _import(module_name):
and the repetitions are therefore no faster than the first run.
"""
mod = import_module(module_name)

if reset_colormaps:
# Needed because reload() will attempt to register new colormaps a
# second time, which errors by default.
for cm_name in COLORMAPS_EXTRA:
colormaps.unregister(cm_name)

reload(mod)

def time_iris(self):
@@ -205,7 +232,7 @@ def time_iterate(self):
self._import("iris.iterate")

def time_palette(self):
self._import("iris.palette")
self._import("iris.palette", reset_colormaps=True)

def time_plot(self):
self._import("iris.plot")
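The `reset_colormaps` handling above is needed because `reload()` re-imports `iris.palette`, which re-registers Iris's custom colormaps, and Matplotlib's colormap registry rejects a name that is already registered. A minimal sketch of that behaviour, using a made-up colormap name:

```python
from matplotlib import colormaps
from matplotlib.colors import ListedColormap

cmap = ListedColormap(["black", "white"], name="example_cmap")
colormaps.register(cmap)
try:
    colormaps.register(cmap)  # registering the same name again raises
except ValueError as err:
    print(err)
colormaps.unregister("example_cmap")  # frees the name...
colormaps.register(cmap)  # ...so it can be registered once more
```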
50 changes: 49 additions & 1 deletion benchmarks/benchmarks/regridding.py
@@ -12,8 +12,11 @@
# importing anything else
from iris import tests # isort:skip

import numpy as np

import iris
from iris.analysis import AreaWeighted
from iris.analysis import AreaWeighted, PointInCell
from iris.coords import AuxCoord


class HorizontalChunkedRegridding:
@@ -53,3 +56,48 @@ def time_regrid_area_w_new_grid(self) -> None:
out = self.chunked_cube.regrid(self.template_cube, self.scheme_area_w)
# Realise data
out.data


class CurvilinearRegridding:
def setup(self) -> None:
# Prepare a cube and a template

cube_file_path = tests.get_data_path(
["NetCDF", "regrid", "regrid_xyt.nc"]
)
self.cube = iris.load_cube(cube_file_path)

# Make the source cube curvilinear
x_coord = self.cube.coord("longitude")
y_coord = self.cube.coord("latitude")
xx, yy = np.meshgrid(x_coord.points, y_coord.points)
self.cube.remove_coord(x_coord)
self.cube.remove_coord(y_coord)
x_coord_2d = AuxCoord(
xx,
standard_name=x_coord.standard_name,
units=x_coord.units,
coord_system=x_coord.coord_system,
)
y_coord_2d = AuxCoord(
yy,
standard_name=y_coord.standard_name,
units=y_coord.units,
coord_system=y_coord.coord_system,
)
self.cube.add_aux_coord(x_coord_2d, (1, 2))
self.cube.add_aux_coord(y_coord_2d, (1, 2))

template_file_path = tests.get_data_path(
["NetCDF", "regrid", "regrid_template_global_latlon.nc"]
)
self.template_cube = iris.load_cube(template_file_path)

# Prepare a regridding scheme
self.scheme_pic = PointInCell()

def time_regrid_pic(self) -> None:
# Regrid the cube onto the template.
out = self.cube.regrid(self.template_cube, self.scheme_pic)
# Realise the data
out.data
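The new benchmark's setup converts the source cube's 1-D longitude/latitude coordinates into 2-D auxiliary coordinates, which is what makes the grid curvilinear. A stripped-down sketch of just that step, with made-up coordinate values (the benchmark reads them from the test-data cube instead):

```python
import numpy as np

from iris.coords import AuxCoord

lons = np.linspace(100.0, 120.0, 5)
lats = np.linspace(-10.0, 10.0, 4)
# meshgrid broadcasts both 1-D arrays to the full (4, 5) grid shape.
xx, yy = np.meshgrid(lons, lats)
lon_2d = AuxCoord(xx, standard_name="longitude", units="degrees")
lat_2d = AuxCoord(yy, standard_name="latitude", units="degrees")
# These would then be attached over the cube's y/x dimensions,
# e.g. cube.add_aux_coord(lon_2d, (1, 2)), as in the benchmark above.
```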
10 changes: 5 additions & 5 deletions docs/Makefile
@@ -20,11 +20,6 @@ html-quick:
echo "make html-quick in $$i..."; \
(cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) html-quick); done

spelling:
@for i in $(SUBDIRS); do \
echo "make spelling in $$i..."; \
(cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) spelling); done

all:
@for i in $(SUBDIRS); do \
echo "make all in $$i..."; \
@@ -55,3 +50,8 @@ linkcheck:
echo "Running linkcheck in $$i..."; \
(cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) linkcheck); done

show:
@for i in $(SUBDIRS); do \
echo "Running show in $$i..."; \
(cd $$i; $(MAKE) $(MFLAGS) $(MYMAKEFLAGS) show); done
