Iris is free software: you can redistribute it and/or modify it under
the terms of the GNU Lesser General Public License as published by the
Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

Iris is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU Lesser General Public License for more details.

You should have received a copy of the GNU Lesser General Public License
along with Iris. If not, see http://www.gnu.org/licenses/.

You may use and re-use the information featured on this website (not including logos) free of
charge in any format or medium, under the terms of the
Open Government Licence.
We encourage users to establish hypertext links to this website.

Any email enquiries regarding the use and re-use of this information resource should be
sent to: psi@nationalarchives.gsi.gov.uk.
If you need to make a backwards-incompatible change to a public API
[1] that has been included in a release, then you should
only make that change after a deprecation period. This deprecation
period must last at least six months or two public releases, whichever
results in the longer period of time. Once the deprecation period has
expired, the deprecated API should be removed/updated in the next
major release.

The simplest form of deprecation occurs when you need to remove a public
API. The public API in question is deprecated for a period before it is
removed, to allow time for user code to be updated. Sometimes the
deprecation is accompanied by the introduction of a new public API.

Under these circumstances the following points apply:

Using the deprecated API must result in a concise deprecation
warning.

Where possible, your deprecation warning should include advice on
how to avoid using the deprecated API. For example, you might
reference a preferred API, or more detailed documentation elsewhere.

You must update the docstring for the deprecated API to include a
Sphinx deprecation directive:

    .. deprecated:: <VERSION>

where you should replace <VERSION> with the major and minor version
of Iris in which this API is first deprecated. For example: 1.8.

As with the deprecation warning, you should include advice on how to
avoid using the deprecated API within the content of this directive.
Feel free to include more detail in the updated docstring than in the
deprecation warning.

You should check the documentation for references to the deprecated
API and update them as appropriate.
When you need to change the default behaviour of a public API the
situation is slightly more complex. The recommended solution is to use
the iris.FUTURE object. The iris.FUTURE object provides
boolean attributes that allow user code to control at run-time the
default behaviour of corresponding public APIs. When a boolean attribute
is set to False it causes the corresponding public API to use its
deprecated default behaviour. When a boolean attribute is set to True
it causes the corresponding public API to use its new default behaviour.

The following points apply in addition to those for removing a public
API:

You should add a new boolean attribute to iris.FUTURE (by
modifying iris.Future) that controls the default behaviour
of the public API that needs updating. The initial state of the new
boolean attribute should be False. You should name the new boolean
attribute to indicate that setting it to True will select the new
default behaviour.

You should include a reference to this iris.FUTURE flag in your
deprecation warning and corresponding Sphinx deprecation directive.

When the time comes to make a new major release you should locate any
deprecated APIs within the code that satisfy the six month/two release
minimum period described previously. Locating deprecated APIs can easily
be done by searching for the Sphinx deprecation directives and/or
deprecation warnings.
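The mechanics described above can be sketched in a few lines. This is a minimal illustration, not the real iris.Future implementation: the flag name `example_new_behaviour` and the function `load_strict` are hypothetical.

```python
import warnings


class Future:
    """Run-time control over default behaviour changes (illustrative only)."""

    def __init__(self):
        # False selects the deprecated default behaviour; True the new one.
        self.example_new_behaviour = False


FUTURE = Future()


def load_strict(data):
    """Load a sequence of values.

    .. deprecated:: 1.8
        The lenient default behaviour is deprecated. Set
        ``FUTURE.example_new_behaviour = True`` to select the new strict
        default, which rejects missing values instead of dropping them.

    """
    if not FUTURE.example_new_behaviour:
        # Deprecated default behaviour: silently drop missing values,
        # but emit a concise deprecation warning with avoidance advice.
        warnings.warn(
            "Lenient loading is deprecated; set "
            "FUTURE.example_new_behaviour = True to use the new strict "
            "default.",
            DeprecationWarning,
        )
        return [value for value in data if value is not None]
    # New default behaviour: reject missing values outright.
    if any(value is None for value in data):
        raise ValueError("missing values are not allowed")
    return list(data)
```

Note how the deprecation warning, the Sphinx directive, and the FUTURE flag all point at each other, so a user hitting any one of them can find the other two.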
For consistency, always use """triple double quotes""" around docstrings. Use r"""raw triple double quotes""" if you use any backslashes in your docstrings. For Unicode docstrings, use u"""Unicode triple-quoted strings""".

All docstrings should be written in reST (reStructuredText) markup; a reST guide follows this page.

There are two forms of docstrings: single-line and multi-line docstrings.

The single-line docstring of an object must state the purpose of that object, known as the purpose section. This terse overview must be on one line and ideally no longer than 90 characters.

Multi-line docstrings must consist of at least a purpose section akin to the single-line docstring, followed by a blank line and then any other content, as described below. The entire docstring should be indented to the same level as the quotes at the docstring’s first line.

The multi-line docstring description section should expand on what was stated in the one-line purpose section. The description section should try not to document argument and keyword argument details. Such information should be documented in the following arguments and keywords section.

The class constructor should be documented in the docstring for its __init__ or __new__ method. Methods should be documented by their own docstring, not in the class header itself.

If a class subclasses another class and its behavior is mostly inherited from that class, its docstring should mention this and summarise the differences. Use the verb “override” to indicate that a subclass method replaces a superclass method and does not call the superclass method; use the verb “extend” to indicate that a subclass method calls the superclass method (in addition to its own behavior).
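As an illustration of these rules, here is a small example laid out as described above: a one-line purpose section, a blank line, a description, and separate argument/keyword sections, with the constructor documented on __init__. The names here are made up for illustration, not real Iris API.

```python
def cell_count(cube):
    """Return the total number of cells in the cube."""
    return len(cube)


class BoundedCoord:
    """A coordinate with optional bounds."""

    def __init__(self, points, bounds=None):
        """Create a coordinate from points and optional bounds.

        The description section expands on the purpose line above and
        avoids argument detail, which belongs in the sections below.

        Args:

        * points:
            Sequence of coordinate values.

        Kwargs:

        * bounds:
            Optional sequence of (lower, upper) cell bounds.

        """
        self.points = list(points)
        self.bounds = None if bounds is None else list(bounds)
```

`cell_count` shows a single-line docstring; `BoundedCoord.__init__` shows the multi-line form.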
reST (http://en.wikipedia.org/wiki/ReStructuredText) is a lightweight markup language intended to be highly readable in source format. This guide covers some of the more frequently used advanced reST markup syntaxes; for the basics of reST the following links may be useful:

Basic links can be created with `Text of the link <http://example.com>`_, which will be rendered as Text of the link.

Documents in the same project can be cross-referenced with the syntax :doc:`document_name`. For example, to reference the “docstrings” page, :doc:`docstrings` creates the link Docstrings.

References can be created between sections by first making a “label” where you would like the link to point to:

    .. _name_of_reference:

The appropriate link can now be created with :ref:`name_of_reference` (note the leading underscore on the label, which is dropped in the :ref: role).

Cross-referencing other reference documentation can be achieved with the syntax :py:class:`zipfile.ZipFile`, which will result in links such as zipfile.ZipFile and numpy.ndarray.
Your personal git configurations are saved in the .gitconfig file in
your home directory.

Here is an example .gitconfig file:

[user]
        name = Your Name
        email = you@yourdomain.example.com

[alias]
        ci = commit -a
        co = checkout
        st = status
        stat = status
        br = branch
        wdiff = diff --color-words

[core]
        editor = vim

[merge]
        summary = true

You can edit this file directly or you can use the git config --global
command.

You might well benefit from some aliases to common commands.

For example, you might well want to be able to shorten git checkout
to git co. Or you may want to alias git diff --color-words
(which gives a nicely formatted output of the diff) to git wdiff.
This way of working helps to keep work well organized, with readable history.
This in turn makes it easier for project maintainers (that might be you) to see
what you’ve done, and why you did it.

It may sound strange, but deleting your own master branch can help reduce
confusion about which branch you are on. See deleting master on github for
details.

From time to time you should fetch the upstream (trunk) changes from github:

git fetch upstream

This will pull down any commits you don’t have, and set the remote branches to
point to the right commit. For example, ‘trunk’ is the branch referred to by
(remote/branchname) upstream/master - and if there have been commits since
you last checked, upstream/master will change after you do the fetch.

When you are ready to make some changes to the code, you should start a new
branch. Branches that are for a collection of related edits are often called
‘feature branches’.

Making a new branch for each set of related changes will make it easier for
someone reviewing your branch to see what you are doing.

Choose an informative name for the branch to remind yourself and the rest of us
what the changes in the branch are for. For example add-ability-to-fly, or
bugfix-for-issue-42.
# Update the mirror of trunk
git fetch upstream
# Make new feature branch starting at current trunk
git branch my-new-feature upstream/master
git checkout my-new-feature

Generally, you will want to keep your feature branches on your public github
fork of iris. To do this, you git push this new branch up to your
github repo. Generally (if you followed the instructions in these pages, and by
default), git will have a link to your github repo, called origin. You push
up to your own repo on github with:

git push origin my-new-feature

In git >= 1.7 you can ensure that the link is correctly set by using the
--set-upstream option:

git push --set-upstream origin my-new-feature

From now on git will know that my-new-feature is related to the
my-new-feature branch in the github repo.
See which files have changed with git status (see git status).
You’ll see a listing like this one:

# On branch my-new-feature
# Changed but not updated:
#   (use "git add <file>..." to update what will be committed)
#   (use "git checkout -- <file>..." to discard changes in working directory)
#
#   modified:   README
#
# Untracked files:
#   (use "git add <file>..." to include in what will be committed)
#
#   INSTALL
no changes added to commit (use "git add" and/or "git commit -a")

Check what the actual changes are with git diff (see git diff).

Add any new files to version control with git add new_file_name (see
git add).

To commit all modified files into the local copy of your repo, do
git commit -am 'A commit message'. Note the -am options to
commit. The m flag just signals that you’re going to type a
message on the command line. The a flag — you can just take on
faith — or see why the -a flag? — and the helpful use-case
description in the tangled working copy problem. The git commit manual
page might also be useful.

To push the changes up to your forked repo on github, do a git
push (see git push).
When you are ready to ask for someone to review your code and consider a merge:

Go to the URL of your forked repo, say
http://github.com/your-user-name/iris.

Use the ‘Switch Branches’ dropdown menu near the top left of the page to
select the branch with your changes.

Click on the ‘Pull request’ button.

Enter a title for the set of changes, and some explanation of what you’ve
done. Say if there is anything you’d like particular attention for - like a
complicated change or some code you are not happy with.

If you don’t think your request is ready to be merged, just say so in your
pull request message. This is still a good way of getting some preliminary
code review.

If you want to work on some stuff with other people, where you are all
committing into the same repository, or even the same branch, then just
share it via github.
Let’s say you thought of some work you’d like to do. You
Update the mirror of trunk and Make a new feature branch called
cool-feature. At this stage trunk is at some commit, let’s call it E. Now
you make some new commits on your cool-feature branch, let’s call them A, B,
C. Maybe your changes take a while, or you come back to them after a while. In
the meantime, trunk has progressed from commit E to commit (say) G:

         A---B---C cool-feature
        /
D---E---F---G trunk

At this stage you consider merging trunk into your feature branch, and you
remember that this here page sternly advises you not to do that, because the
history will get messy. Most of the time you can just ask for a review, and not
worry that trunk has got a little ahead. But sometimes, the changes in trunk
might affect your changes, and you need to harmonize them. In this situation
you may prefer to do a rebase.

rebase takes your changes (A, B, C) and replays them as if they had been made to
the current state of trunk. In other words, in this case, it takes the
changes represented by A, B, C and replays them on top of G. After the rebase,
your history will look like this:

                 A'--B'--C' cool-feature
                /
D---E---F---G trunk

To do this:
# Update the mirror of trunk
git fetch upstream
# go to the feature branch
git checkout cool-feature
# make a backup in case you mess up
git branch tmp cool-feature
# rebase cool-feature onto trunk
git rebase --onto upstream/master upstream/master cool-feature

In this situation, where you are already on branch cool-feature, the last
command can be written more succinctly as:

git rebase upstream/master

When all looks good you can delete your backup branch:

git branch -D tmp

If you have made changes to files that have also changed in trunk, this may
generate merge conflicts that you need to resolve - see the git rebase man
page for some instructions at the end of the “Description” section. There is
some related help on merging in the git user manual - see resolving a merge.
Sometimes, you mess up merges or rebases. Luckily, in git it is
relatively straightforward to recover from such mistakes.

If you mess up during a rebase:

git rebase --abort

If you notice you messed up after the rebase:

# reset branch back to the saved point
git reset --hard tmp

If you forgot to make a backup branch:

# look at the reflog of the branch
git reflog show cool-feature

8630830 cool-feature@{0}: commit: BUG: io: close file handles immediately
278dd2a cool-feature@{1}: rebase finished: refs/heads/my-feature-branch onto 11ee694744f2552d
26aa21a cool-feature@{2}: commit: BUG: lib: make seek_gzip_factory not leak gzip obj
...

# reset the branch to where it was before the botched rebase
git reset --hard cool-feature@{2}

There’s an embarrassing typo in a commit you made? Or perhaps you
made several false starts you would like posterity not to see.

This can be done via interactive rebasing.
Suppose that the commit history looks like this:

git log --oneline
eadc391 Fix some remaining bugs
a815645 Modify it so that it works
2dec1ac Fix a few bugs + disable
13d7934 First implementation
6ad92e5 * masked is now an instance of a new object, MaskedConstant
29001ed Add pre-nep for a copule of structured_array_extensions.
...

and 6ad92e5 is the last commit in the cool-feature branch. Suppose we
want to make the following changes:

Rewrite the commit message for 13d7934 to something more sensible.

Combine the commits 2dec1ac, a815645, eadc391 into a single one.

We do as follows:

# make a backup of the current state
git branch tmp HEAD
# interactive rebase
git rebase -i 6ad92e5

This will open an editor with the following text in it:

pick 13d7934 First implementation
pick 2dec1ac Fix a few bugs + disable
pick a815645 Modify it so that it works
pick eadc391 Fix some remaining bugs

# Rebase 6ad92e5..eadc391 onto 6ad92e5
#
# Commands:
#  p, pick = use commit
#  r, reword = use commit, but edit the commit message
#  e, edit = use commit, but stop for amending
#  s, squash = use commit, but meld into previous commit
#  f, fixup = like "squash", but discard this commit's log message
#
# If you remove a line here THAT COMMIT WILL BE LOST.
# However, if you remove everything, the rebase will be aborted.
#

To achieve what we want, we will make the following changes to it:

r 13d7934 First implementation
pick 2dec1ac Fix a few bugs + disable
f a815645 Modify it so that it works
f eadc391 Fix some remaining bugs

This means that (i) we want to edit the commit message for
13d7934, and (ii) collapse the last three commits into one. Now we
save and quit the editor.

Git will then immediately bring up an editor for editing the commit
message. After revising it, we get the output:

[detached HEAD 721fc64] FOO: First implementation
 2 files changed, 199 insertions(+), 66 deletions(-)
[detached HEAD 0f22701] Fix a few bugs + disable
 1 files changed, 79 insertions(+), 61 deletions(-)
Successfully rebased and updated refs/heads/my-feature-branch.

and the history now looks like this:

0f22701 Fix a few bugs + disable
721fc64 ENH: Sophisticated feature
6ad92e5 * masked is now an instance of a new object, MaskedConstant

If it went wrong, recovery is again possible as explained above.
You need to do this only once. The instructions here are very similar
to the instructions at http://help.github.com/forking/ — please see
that page for more detail. We’re repeating some of it here just to give the
specifics for the iris project, and to suggest some default names.

These pages describe a git and github workflow for the iris
project.

There are several different workflows here, for different ways of
working with iris.

This is not a comprehensive git reference, it’s just a workflow for our
own project. It’s tailored to the github hosting service. You may well
find better or quicker ways of getting stuff done with git, but these
should get you started.

For general resources for learning git, see git resources.

Linus Torvalds on the linux git workflow. Summary: use the git tools
to make the history of your edits as clean as possible; merge from
upstream edits as little as possible in branches where you are doing
active development.

You can get these on your own machine with (e.g.) git help push or
(same thing) git push --help, but, for convenience, here are the
online manual pages for some common commands:
This page is for maintainers — those of us who merge our own or other
people’s changes into the upstream repository.

As a maintainer, you are completely on top of the basic stuff
in Development workflow.

The instructions in Linking your repository to the upstream repo add a remote that has read-only
access to the upstream repo. As a maintainer, you’ve got read-write access.

It’s good to have your upstream remote have a scary name, to remind you that
it’s a read-write remote:

git remote add upstream-rw git@github.com:SciTools/iris.git
git fetch upstream-rw
If there are only a few commits, consider rebasing to upstream:

# Fetch upstream changes
git fetch upstream-rw
# rebase
git rebase upstream-rw/master

Remember that, if you do a rebase, and push that, you’ll have to close any
github pull requests manually, because github will not be able to detect that
the changes have already been merged.

If there is a longer series of related commits, consider a merge instead:

git fetch upstream-rw
git merge --no-ff upstream-rw/master

The merge will be detected by github, and should close any related pull requests
automatically.

Note the --no-ff above. This forces git to make a merge commit, rather than
doing a fast-forward, so that this set of commits branches off trunk then rejoins
the main history with a merge, rather than appearing to have been made directly
on top of trunk.

Check that the history is what you expect:

git log --oneline --graph
git log -p upstream-rw/master..

The first line above just shows the history in a compact way, with a text
representation of the history graph. The second line shows the log of commits
excluding those that can be reached from trunk (upstream-rw/master), and
including those that can be reached from current HEAD (implied with the ..
at the end). So, it shows the commits unique to this branch compared to trunk.
The -p option shows the diff for these commits in patch form.
You’ve discovered a bug or something else you want to change
in iris — excellent!

You’ve worked out a way to fix it — even better!

You want to tell us about it — best of all!

The easiest way is to make a patch or set of patches. Here
we explain how. Making a patch is the simplest and quickest,
but if you’re going to be doing anything more than simple
quick things, please consider following the
Git for development model instead.

# tell git who you are
git config --global user.email you@yourdomain.example.com
git config --global user.name "Your Name Comes Here"
# get the repository if you don't have it
git clone git://github.com/SciTools/iris.git
# make a branch for your patching
cd iris
git branch the-fix-im-thinking-of
git checkout the-fix-im-thinking-of
# hack, hack, hack
# Tell git about any new files you've made
git add somewhere/tests/test_my_bug.py
# commit work in progress as you go
git commit -am 'BF - added tests for Funny bug'
# hack hack, hack
git commit -am 'BF - added fix for Funny bug'
# make the patch files
git format-patch -M -C master

Then, send the generated patch files to the iris
mailing list — where we will thank you warmly.
Make a ‘feature branch’. This will be where you work on
your bug fix. It’s nice and safe and leaves you with
access to an unmodified copy of the code in the main
branch:

# hack, hack, hack
# Tell git about any new files you've made
git add somewhere/tests/test_my_bug.py
# commit work in progress as you go
git commit -am 'BF - added tests for Funny bug'
# hack hack, hack
git commit -am 'BF - added fix for Funny bug'

Note the -am options to commit. The m flag just
signals that you’re going to type a message on the command
line. The a flag — you can just take on faith —
or see why the -a flag?.

When you have finished, check you have committed all your
changes:

git status

Finally, make your commits into patches. You want all the
commits since you branched from the master branch:

git format-patch -M -C master

You will now have several files named for the commits:
If you find you have done some patches, and you have one or
more feature branches, you will probably want to switch to
development mode. You can do this with the repository you
have.

# checkout and refresh master branch from main repo
git checkout master
git pull origin master
# rename pointer to main repository to 'upstream'
git remote rename origin upstream
# point your repo to default read / write to your fork on github
git remote add origin git@github.com:your-user-name/iris.git
# push up any branches you've made and want to keep
git push origin the-fix-im-thinking-of
Clone your fork to the local computer with git clone
git@github.com:your-user-name/iris.git

Investigate. Change directory to your new repo: cd iris. Then
git branch -a to show you all branches. You’ll get something
like:

* master
remotes/origin/master

This tells you that you are currently on the master branch, and
that you also have a remote connection to origin/master.
What remote repository is remote/origin? Try git remote -v to
see the URLs for the remote. They will point to your github fork.

Now you want to connect to the upstream iris github repository, so
you can merge in changes from trunk.

cd iris
git remote add upstream git://github.com/SciTools/iris.git

upstream here is just the arbitrary name we’re using to refer to the
main iris repository at iris github.

Note that we’ve used git:// for the URL rather than git@. The
git:// URL is read only. This means that we can’t accidentally
(or deliberately) write to the upstream repo, and we are only going to
use it to merge into our own code.

Just for your own satisfaction, show yourself that you now have a new
‘remote’, with git remote -v show, giving you something like:
A pull request to a SciTools project master should be ready to merge into the
master branch.

All pull requests will be reviewed by a core developer who will manage the
process of merging. It is the responsibility of a developer submitting a
pull request to do their best to deliver a pull request which meets the
requirements of the project it is submitted to.

The check list summarises criteria which will be checked before a pull request
is merged. Before submitting a pull request please consider this list.
There is a python module for checking PEP8 compliance: python-pep8

Do all the tests pass locally?

The Iris tests may be run with python setup.py test, which has a command
line utility included.

Coding standards, including PEP8 compliance and the copyright message (including
the correct year of the latest change), are tested.

Has a new test been provided?

Has iris-test-data been updated?

iris-test-data is a github project containing all the data to support the
tests.

If this has been updated a reference to the relevant pull request should be
provided.

Has the documentation been updated to explain the new feature or bug fix,
with reference to the developer guide on docstrings?

Have code examples been provided inside the relevant docstrings?

Has iris-sample-data been updated?

iris-sample-data is a github project containing all the data to support
the gallery and examples.

Does the documentation build without errors?

The documentation is built using make html in ./docs/iris.

Do the documentation tests pass?

make doctest and make extest in ./docs/iris.

Is there an associated iris-code-generators pull request?

iris-code-generators is a github project which provides processes for
generating a small subset of the Iris source code files from other
information sources.

Has the travis file been updated to reflect any dependency updates?

./.travis.yml is used to manage the continuous integration testing.
Ideally, all code changes should be accompanied by one or more unit
tests, and by zero or more integration tests. And where possible, new
tests should not be added to the legacy tests.

But if in any doubt about what tests to add or how to write them please
feel free to submit a pull-request in any state and ask for assistance.

Code changes should be accompanied by enough unit tests to give a
high degree of confidence that the change works as expected. In
addition, the unit tests can help describe the intent behind a change.

The docstring for each test module must state the unit under test.
For example:

    """Unit tests for the `iris.cube.Cube` class."""

Within this test module each tested method must have one or more
corresponding test classes:

- Either: Test_name_of_public_method
- Or: Test_name_of_public_method__aspect_of_method

And within those test classes, the test methods must be named according
to the aspect of the tested method which they address.
Within that file the tests might look something like:

# Tests for the Cube.xml() method.
class Test_xml(tests.IrisTest):
    def test_some_general_stuff(self):
        ...


# Tests for the Cube.xml() method, focussing on the behaviour of
# the checksums.
class Test_xml__checksum(tests.IrisTest):
    def test_checksum_ignores_masked_values(self):
        ...


# Tests for the Cube.add_dim_coord() method.
class Test_add_dim_coord(tests.IrisTest):
    def test_normal_usage(self):
        ...

    def test_coord_already_present(self):
        ...
Within that file the tests might look something like:

# Tests focussing on the handling of different data types.
class TestDtypeAndValues(tests.IrisTest):
    def test_int16(self):
        ...

    def test_int16_big_endian(self):
        ...


# Tests focussing on the handling of different projections.
class TestProjection(tests.IrisTest):
    def test_no_ellipsoid(self):
        ...
Some code changes may require tests which exercise several units in
order to demonstrate an important consequence of their interaction which
may not be apparent when considering the units in isolation.

These tests must be placed in the lib/iris/tests/integration folder.
Unlike unit tests, there is no fixed naming scheme for integration
tests. But folders and files must be created as required to help
developers locate relevant tests. It is recommended they are named
according to the capabilities under test, e.g.
metadata/test_pp_preservation.py, and not named according to the
module(s) under test.
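To make the idea concrete, here is a sketch of what such a capability-named integration test might contain. It uses plain unittest and two made-up helper functions in place of real Iris units, so everything here is illustrative rather than actual Iris code.

```python
import io
import json
import unittest


# Two small stand-in "units"; in a real integration test these would be
# separate Iris components, e.g. a loader and a saver.
def save_metadata(stream, metadata):
    """Serialise a metadata mapping to a stream."""
    json.dump(metadata, stream)


def load_metadata(stream):
    """Deserialise a metadata mapping from a stream."""
    return json.load(stream)


class TestMetadataPreservation(unittest.TestCase):
    # Exercises saving and loading together: neither unit in isolation
    # demonstrates that attributes survive a round trip.
    def test_round_trip_preserves_units(self):
        stream = io.StringIO()
        save_metadata(stream, {'standard_name': 'air_temperature',
                               'units': 'kelvin'})
        stream.seek(0)
        result = load_metadata(stream)
        self.assertEqual(result['units'], 'kelvin')
```

Such a file might live at, say, lib/iris/tests/integration/metadata/test_round_trip.py — a hypothetical path following the capability-based naming advice above.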
This example demonstrates low pass filtering a time-series by applying a
weighted running mean over the time dimension.

The time-series used is the Darwin-only Southern Oscillation index (SOI),
which is filtered using two different Lanczos filters, one to filter out
time-scales of less than two years and one to filter out time-scales of
less than 7 years.

Duchon C. E. (1979) Lanczos Filtering in One and Two Dimensions.
Journal of Applied Meteorology, Vol 18, pp 1016-1022.

Trenberth K. E. (1984) Signal Versus Noise in the Southern Oscillation.
Monthly Weather Review, Vol 112, pp 326-332.
"""
-Applying a filter to a time-series
-==================================
-
-This example demonstrates low pass filtering a time-series by applying a
-weighted running mean over the time dimension.
-
-The time-series used is the Darwin-only Southern Oscillation index (SOI),
-which is filtered using two different Lanczos filters, one to filter out
-time-scales of less than two years and one to filter out time-scales of
-less than 7 years.
-
-References
-----------
-
- Duchon C. E. (1979) Lanczos Filtering in One and Two Dimensions.
- Journal of Applied Meteorology, Vol 18, pp 1016-1022.
-
- Trenberth K. E. (1984) Signal Versus Noise in the Southern Oscillation.
- Monthly Weather Review, Vol 112, pp 326-332
-
-"""
-importnumpyasnp
-importmatplotlib.pyplotasplt
-importiris
-importiris.plotasiplt
-
-
-deflow_pass_weights(window,cutoff):
- """Calculate weights for a low pass Lanczos filter.
-
- Args:
-
- window: int
- The length of the filter window.
-
- cutoff: float
- The cutoff frequency in inverse time steps.
-
- """
- order=((window-1)//2)+1
- nwts=2*order+1
- w=np.zeros([nwts])
- n=nwts//2
- w[n]=2*cutoff
- k=np.arange(1.,n)
- sigma=np.sin(np.pi*k/n)*n/(np.pi*k)
- firstfactor=np.sin(2.*np.pi*cutoff*k)/(np.pi*k)
- w[n-1:0:-1]=firstfactor*sigma
- w[n+1:-1]=firstfactor*sigma
- returnw[1:-1]
-
-
-defmain():
-
- # Load the monthly-valued Southern Oscillation Index (SOI) time-series.
- fname=iris.sample_data_path('SOI_Darwin.nc')
- soi=iris.load_cube(fname)
-
- # Window length for filters.
- window=121
-
- # Construct 2-year (24-month) and 7-year (84-month) low pass filters
- # for the SOI data which is monthly.
- wgts24=low_pass_weights(window,1./24.)
- wgts84=low_pass_weights(window,1./84.)
-
- # Apply each filter using the rolling_window method used with the weights
- # keyword argument. A weighted sum is required because the magnitude of
- # the weights are just as important as their relative sizes.
- soi24=soi.rolling_window('time',
- iris.analysis.SUM,
- len(wgts24),
- weights=wgts24)
- soi84=soi.rolling_window('time',
- iris.analysis.SUM,
- len(wgts84),
- weights=wgts84)
-
- # Plot the SOI time series and both filtered versions.
- plt.figure(figsize=(9,4))
- iplt.plot(soi,color='0.7',linewidth=1.,linestyle='-',
- alpha=1.,label='no filter')
- iplt.plot(soi24,color='b',linewidth=2.,linestyle='-',
- alpha=.7,label='2-year filter')
- iplt.plot(soi84,color='r',linewidth=2.,linestyle='-',
- alpha=.7,label='7-year filter')
- plt.ylim([-4,4])
- plt.title('Southern Oscillation Index (Darwin Only)')
- plt.xlabel('Time')
- plt.ylabel('SOI')
- plt.legend(fontsize=10)
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
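As a quick sanity check on the filter design above: the weights of a low pass filter should sum to approximately one (the filter's response to a constant signal), which is why the example applies them with a weighted SUM rather than a plain mean. The snippet below re-implements low_pass_weights in pure Python so it stands alone; the closeness of the sum to 1 is the one numerical assumption being checked.

```python
import math


def low_pass_weights(window, cutoff):
    """Pure-Python re-implementation of the Lanczos weights above."""
    order = ((window - 1) // 2) + 1
    nwts = 2 * order + 1
    w = [0.0] * nwts
    n = nwts // 2
    w[n] = 2 * cutoff
    for k in range(1, n):
        sigma = math.sin(math.pi * k / n) * n / (math.pi * k)
        firstfactor = math.sin(2.0 * math.pi * cutoff * k) / (math.pi * k)
        # The window is symmetric about its centre.
        w[n - k] = firstfactor * sigma
        w[n + k] = firstfactor * sigma
    return w[1:-1]


weights = low_pass_weights(121, 1.0 / 24.0)
print(len(weights))      # 121 weights for a 121-point window
print(sum(weights))      # approximately 1
```

If the sum drifted far from one, the filtered series would be systematically scaled relative to the input.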
In this example, we need to plot anomaly data where the values have a
“logarithmic” significance – i.e. we want to give approximately equal ranges
of colour between data values of, say, 1 and 10 as between 10 and 100.

As the data range also contains zero, that obviously does not suit a simple
logarithmic interpretation. However, values of less than a certain absolute
magnitude may be considered “not significant”, so we put these into a separate
“zero band” which is plotted in white.

To do this, we create a custom value mapping function (normalization) using
the matplotlib Norm class matplotlib.colors.SymLogNorm.
We use this to make a cell-filled pseudocolour plot with a colorbar.
"""
-Colouring anomaly data with logarithmic scaling
-===============================================
-
-In this example, we need to plot anomaly data where the values have a
-"logarithmic" significance -- i.e. we want to give approximately equal ranges
-of colour between data values of, say, 1 and 10 as between 10 and 100.
-
-As the data range also contains zero, that obviously does not suit a simple
-logarithmic interpretation. However, values of less than a certain absolute
-magnitude may be considered "not significant", so we put these into a separate
-"zero band" which is plotted in white.
-
-To do this, we create a custom value mapping function (normalization) using
-the matplotlib Norm class `matplotlib.colours.SymLogNorm
-<http://matplotlib.org/api/colors_api.html#matplotlib.colors.SymLogNorm>`_.
-We use this to make a cell-filled pseudocolour plot with a colorbar.
-
-NOTE: By "pseudocolour", we mean that each data point is drawn as a "cell"
-region on the plot, coloured according to its data value.
-This is provided in Iris by the functions :meth:`iris.plot.pcolor` and
-:meth:`iris.plot.pcolormesh`, which call the underlying matplotlib
-functions of the same names (i.e. `matplotlib.pyplot.pcolor
-<http://matplotlib.org/api/pyplot_api.html#matplotlib.pyplot.pcolor>`_
-and `matplotlib.pyplot.pcolormesh
-<http://matplotlib.org/api/pyplot_api.html#matplotlib.pyplot.pcolormesh>`_).
-See also: http://en.wikipedia.org/wiki/False_color#Pseudocolor.
-
-"""
-importcartopy.crsasccrs
-importiris
-importiris.coord_categorisation
-importiris.plotasiplt
-importmatplotlib.pyplotasplt
-importmatplotlib.colorsasmcols
-
-
-defmain():
- # Load a sample air temperatures sequence.
- file_path=iris.sample_data_path('E1_north_america.nc')
- temperatures=iris.load_cube(file_path)
-
- # Create a year-number coordinate from the time information.
- iris.coord_categorisation.add_year(temperatures,'time')
-
- # Create a sample anomaly field for one chosen year, by extracting that
- # year and subtracting the time mean.
- sample_year=1982
- year_temperature=temperatures.extract(iris.Constraint(year=sample_year))
- time_mean=temperatures.collapsed('time',iris.analysis.MEAN)
- anomaly=year_temperature-time_mean
-
- # Construct a plot title string explaining which years are involved.
- years=temperatures.coord('year').points
- plot_title='Temperature anomaly'
- plot_title+='\n{} differences from {}-{} average.'.format(
- sample_year,years[0],years[-1])
-
- # Define scaling levels for the logarithmic colouring.
- minimum_log_level=0.1
- maximum_scale_level=3.0
-
- # Use a standard colour map which varies blue-white-red.
- # For suitable options, see the 'Diverging colormaps' section in:
- # http://matplotlib.org/examples/color/colormaps_reference.html
- anom_cmap='bwr'
-
- # Create a 'logarithmic' data normalization.
- anom_norm=mcols.SymLogNorm(linthresh=minimum_log_level,
- linscale=0,
- vmin=-maximum_scale_level,
- vmax=maximum_scale_level)
- # Setting "linthresh=minimum_log_level" makes its non-logarithmic
- # data range equal to our 'zero band'.
- # Setting "linscale=0" maps the whole zero band to the middle colour value
- # (i.e. 0.5), which is the neutral point of a "diverging" style colormap.
-
- # Create an Axes, specifying the map projection.
- plt.axes(projection=ccrs.LambertConformal())
-
- # Make a pseudocolour plot using this colour scheme.
- mesh=iplt.pcolormesh(anomaly,cmap=anom_cmap,norm=anom_norm)
-
- # Add a colourbar, with extensions to show handling of out-of-range values.
- bar=plt.colorbar(mesh,orientation='horizontal',extend='both')
-
- # Set some suitable fixed "logarithmic" colourbar tick positions.
- tick_levels=[-3,-1,-0.3,0.0,0.3,1,3]
- bar.set_ticks(tick_levels)
-
- # Modify the tick labels so that the centre one shows "+/-<minumum-level>".
- tick_levels[3]=r'$\pm${:g}'.format(minimum_log_level)
- bar.set_ticklabels(tick_levels)
-
- # Label the colourbar to show the units.
- bar.set_label('[{}, log scale]'.format(anomaly.units))
-
- # Add coastlines and a title.
- plt.gca().coastlines()
- plt.title(plot_title)
-
- # Display the result.
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
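To see roughly what the SymLogNorm mapping does, here is a simplified pure-NumPy sketch of a symmetric-log normalization. This is not matplotlib's actual implementation; `symlog_norm` is a hypothetical helper written only for illustration. It shows the two properties the example relies on: values inside the zero band collapse onto the neutral colour value 0.5, and values outside it are spaced logarithmically.

```python
import numpy as np


def symlog_norm(values, linthresh=0.1, vmax=3.0):
    """Map values in [-vmax, vmax] to [0, 1] with symmetric log scaling.

    A simplified sketch of the idea behind SymLogNorm with linscale=0:
    everything in the "zero band" (|v| < linthresh) maps to 0.5.
    """
    values = np.asarray(values, dtype=float)
    sign = np.sign(values)
    mag = np.abs(values)
    # Log-scaled distance from the zero band; zero inside the band.
    scaled = np.where(mag > linthresh,
                      np.log10(np.maximum(mag, linthresh) / linthresh),
                      0.0)
    max_scaled = np.log10(vmax / linthresh)
    return 0.5 + 0.5 * sign * scaled / max_scaled


print(symlog_norm([0.0, 0.05, -0.05]))  # all inside the zero band -> 0.5
print(symlog_norm([3.0, -3.0]))         # the extremes -> 1.0 and 0.0
```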
"""
-Cross section plots
-===================
-
-This example demonstrates contour plots of a cross-sectioned multi-dimensional cube which features
-a hybrid height vertical coordinate system.
-
-"""
-
-importmatplotlib.pyplotasplt
-
-importiris
-importiris.plotasiplt
-importiris.quickplotasqplt
-
-
-defmain():
- fname=iris.sample_data_path('hybrid_height.nc')
- theta=iris.load_cube(fname)
-
- # Extract a single height vs longitude cross-section. N.B. This could easily be changed to
- # extract a specific slice, or even to loop over *all* cross section slices.
- cross_section=next(theta.slices(['grid_longitude','model_level_number']))
-
- qplt.contourf(cross_section,coords=['grid_longitude','altitude'])
- iplt.show()
-
- # Now do the equivalent plot, only against model level
- plt.figure()
-
- qplt.contourf(cross_section,coords=['grid_longitude','model_level_number'])
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
"""
-Calculating a custom statistic
-==============================
-
-This example shows how to define and use a custom
-:class:`iris.analysis.Aggregator`, that provides a new statistical operator for
-use with cube aggregation functions such as :meth:`~iris.cube.Cube.collapsed`,
-:meth:`~iris.cube.Cube.aggregated_by` or
-:meth:`~iris.cube.Cube.rolling_window`.
-
-In this case, we have a 240-year sequence of yearly average surface temperature
-over North America, and we want to calculate in how many years these exceed a
-certain temperature over a spell of 5 years or more.
-
-"""
-importmatplotlib.pyplotasplt
-importnumpyasnp
-
-importiris
-fromiris.analysisimportAggregator
-importiris.plotasiplt
-importiris.quickplotasqplt
-fromiris.utilimportrolling_window
-
-
-# Define a function to perform the custom statistical operation.
-# Note: in order to meet the requirements of iris.analysis.Aggregator, it must
-# do the calculation over an arbitrary (given) data axis.
-defcount_spells(data,threshold,axis,spell_length):
- """
- Function to calculate the number of points in a sequence where the value
- has exceeded a threshold value for at least a certain number of timepoints.
-
- Generalised to operate on multiple time sequences arranged on a specific
- axis of a multidimensional array.
-
- Args:
-
- * data (array):
- raw data to be compared with value threshold.
-
- * threshold (float):
- threshold point for 'significant' datapoints.
-
- * axis (int):
- number of the array dimension mapping the time sequences.
- (Can also be negative, e.g. '-1' means last dimension)
-
- * spell_length (int):
- number of consecutive times at which value > threshold to "count".
-
- """
- ifaxis<0:
- # just cope with negative axis numbers
- axis+=data.ndim
- # Threshold the data to find the 'significant' points.
- data_hits=data>threshold
- # Make an array with data values "windowed" along the time axis.
- hit_windows=rolling_window(data_hits,window=spell_length,axis=axis)
- # Find the windows "full of True-s" (along the added 'window axis').
- full_windows=np.all(hit_windows,axis=axis+1)
- # Count points fulfilling the condition (along the time axis).
- spell_point_counts=np.sum(full_windows,axis=axis,dtype=int)
- returnspell_point_counts
-
-
-defmain():
- # Load the whole time-sequence as a single cube.
- file_path=iris.sample_data_path('E1_north_america.nc')
- cube=iris.load_cube(file_path)
-
- # Make an aggregator from the user function.
- SPELL_COUNT=Aggregator('spell_count',
- count_spells,
- units_func=lambdaunits:1)
-
- # Define the parameters of the test.
- threshold_temperature=280.0
- spell_years=5
-
- # Calculate the statistic.
- warm_periods=cube.collapsed('time',SPELL_COUNT,
- threshold=threshold_temperature,
- spell_length=spell_years)
- warm_periods.rename('Number of 5-year warm spells in 240 years')
-
- # Plot the results.
- qplt.contourf(warm_periods,cmap='RdYlBu_r')
- plt.gca().coastlines()
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
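The thresholding logic inside `count_spells` can be checked on a plain 1-D NumPy array, with no Iris involved. The sketch below uses NumPy's `sliding_window_view` as a stand-in for `iris.util.rolling_window` so that it is self-contained; the temperature values are invented for illustration.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

# Hypothetical yearly temperatures (K) for a single grid point.
temps = np.array([283., 285., 286., 287., 288.,
                  282., 289., 289., 289., 289., 289.])
threshold, spell_length = 284.0, 3

# The same steps as count_spells, written against a 1-D series:
hits = temps > threshold                           # 'significant' years
windows = sliding_window_view(hits, spell_length)  # every length-3 run
full_windows = np.all(windows, axis=-1)            # runs entirely above
spell_points = int(np.sum(full_windows))           # count such runs

print(spell_points)  # -> 5
```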
"""
-Loading a cube from a custom file format
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-This example shows how a custom text file can be loaded using the standard Iris load mechanism.
-
-The first stage in the process is to define an Iris :class:`FormatSpecification <iris.io.format_picker.FormatSpecification>` for the file format.
-To create a format specification we need to define the following:
-
-* format_name - Some text that describes the format specification we are creating
-* file_element - FileElement object describing the element which identifies
- this FormatSpecification.
-
- Possible values are:
-
- ``iris.io.format_picker.MagicNumber(n, o)`` - The n bytes from the file \
- at offset o.
-
- ``iris.io.format_picker.FileExtension()`` - The file's extension.
-
- ``iris.io.format_picker.LeadingLine()`` - The first line of the file.
-
-* file_element_value - The value that the file_element should take if a file matches this FormatSpecification
-* handler (optional) - A generator function that will be called when the file specification has been identified. This function is
- provided by the user and provides the means to parse the whole file. If no handler function is provided, then identification
- is still possible without any handling.
-
- The handler function must define the following arguments:
-
- * list of filenames to process
- * callback function - An optional function to filter/alter the Iris cubes returned
-
- The handler function must be defined as generator which yields each cube as they are produced.
-
-* priority (optional) - Integer giving a priority for considering this specification where higher priority means sooner consideration
-
-In the following example, the function :func:`load_NAME_III` has been defined to handle the loading of the raw data from the custom file format.
-This function is called from :func:`NAME_to_cube` which uses this data to create and yield Iris cubes.
-
-In the ``main()`` function the filenames are loaded via the ``iris.load_cube`` function which automatically
-invokes the ``FormatSpecification`` we defined. The cube returned from the load function is then used to produce a plot.
-
-"""
import datetime

import matplotlib.pyplot as plt
import numpy as np

import iris
import iris.coords as icoords
import iris.coord_systems as icoord_systems
import iris.fileformats
import iris.io.format_picker as format_picker
import iris.plot as iplt


UTC_format = '%H%M%Z %d/%m/%Y'


def load_NAME_III(filename):
    """
    Loads the Met Office's NAME III grid output files returning headers,
    column definitions and data arrays as 3 separate lists.

    """

    # Opening a file gives an iterable of lines which can be progressed with
    # next(). This will come in handy as we wish to progress through the file
    # line by line.
    file_handle = open(filename)

    # define a dictionary which can hold the header metadata about this file
    headers = {}

    # skip the NAME header of the file which looks something like
    # 'NAME III (version X.X.X)'
    next(file_handle)

    # read the next 16 lines of header information, putting lines of the form
    # "header name: header value" into a dictionary
    for _ in range(16):
        header_name, header_value = next(file_handle).split(':')

        # strip off any spurious space characters in the header name and value
        header_name = header_name.strip()
        header_value = header_value.strip()

        # cast some headers into floats or integers if they match a given
        # header name
        if header_name in ['X grid origin', 'Y grid origin',
                           'X grid resolution', 'Y grid resolution']:
            header_value = float(header_value)
        elif header_name in ['X grid size', 'Y grid size',
                             'Number of fields']:
            header_value = int(header_value)
        elif header_name in ['Run time', 'Start of release',
                             'End of release']:
            # convert the time to python datetimes
            header_value = datetime.datetime.strptime(header_value,
                                                      UTC_format)

        headers[header_name] = header_value

    # skip the next blank line in the file.
    next(file_handle)

    # Read the next 7 lines of column definitions
    column_headings = {}
    for column_header_name in ['species_category', 'species', 'cell_measure',
                               'quantity', 'unit', 'z_level', 'time']:
        column_headings[column_header_name] = [
            col.strip() for col in next(file_handle).split(',')][:-1]

    # convert the time to python datetimes
    new_time_column_header = []
    for i, t in enumerate(column_headings['time']):
        # the first 4 columns aren't time at all, so don't convert them to
        # datetimes
        if i >= 4:
            new_time_column_header.append(
                datetime.datetime.strptime(t, UTC_format))
        else:
            new_time_column_header.append(t)
    column_headings['time'] = new_time_column_header

    # skip the blank line after the column headers
    next(file_handle)

    # make a list of data arrays to hold the data for each column
    data_shape = (headers['Y grid size'], headers['X grid size'])
    data_arrays = [np.zeros(data_shape, dtype=np.float32)
                   for i in range(headers['Number of fields'])]

    # iterate over the remaining lines which represent the data in column form
    for line in file_handle:

        # split the line by comma, removing the last empty column caused by
        # the trailing comma
        vals = line.split(',')[:-1]

        # cast the x and y grid positions to integer indices, converting from
        # the file's 1 based grid positions (where 0.5 represents half a grid
        # point) to zero based indices
        x = int(float(vals[0]) - 1.5)
        y = int(float(vals[1]) - 1.5)

        # populate the data arrays (i.e. all columns but the leading 4)
        for i, data_array in enumerate(data_arrays):
            data_array[y, x] = float(vals[i + 4])

    return headers, column_headings, data_arrays


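The header-parsing step of `load_NAME_III` can be exercised in isolation. This sketch feeds a few invented header lines (the values are hypothetical, not taken from a real NAME III file) through the same casting rules:

```python
import datetime

UTC_format = '%H%M%Z %d/%m/%Y'

# Hypothetical header lines in the "name: value" form the loader expects.
raw_lines = [
    'X grid origin:      -100.0',
    'X grid size:        300',
    'Run time:           0000UTC 01/03/2011',
]

headers = {}
for raw_line in raw_lines:
    name, value = raw_line.split(':', 1)
    name, value = name.strip(), value.strip()
    # Cast known headers to more useful types, as load_NAME_III does.
    if name in ['X grid origin', 'Y grid origin',
                'X grid resolution', 'Y grid resolution']:
        value = float(value)
    elif name in ['X grid size', 'Y grid size', 'Number of fields']:
        value = int(value)
    elif name in ['Run time', 'Start of release', 'End of release']:
        value = datetime.datetime.strptime(value, UTC_format)
    headers[name] = value

print(headers['X grid origin'], headers['X grid size'],
      headers['Run time'].date())
```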
def NAME_to_cube(filenames, callback):
    """Returns a generator of cubes given a list of filenames and a callback."""

    for filename in filenames:
        header, column_headings, data_arrays = load_NAME_III(filename)

        for i, data_array in enumerate(data_arrays):
            # turn the dictionary of column headers with a list of header
            # information for each field into a dictionary of headers for just
            # this field. Ignore the first 4 columns of grid position (data
            # was located with the data array).
            field_headings = dict([(k, v[i + 4])
                                   for k, v in column_headings.items()])

            # make a cube
            cube = iris.cube.Cube(data_array)

            # define the name and unit
            name = ('%s %s' % (field_headings['species'],
                               field_headings['quantity']))
            cube.rename(name.upper().replace(' ', '_'))
            # Some units are badly encoded in the file, fix this by putting a
            # space in between. (if gs is not found, then the string will be
            # returned unchanged)
            cube.units = field_headings['unit'].replace('gs', 'g s')

            # define and add the singular coordinates of the field (flight
            # level, time etc.)
            cube.add_aux_coord(icoords.AuxCoord(field_headings['z_level'],
                                                long_name='flight_level',
                                                units='1'))

            # define the time unit and use it to serialise the datetime for
            # the time coordinate
            time_unit = iris.unit.Unit(
                'hours since epoch', calendar=iris.unit.CALENDAR_GREGORIAN)
            time_coord = icoords.AuxCoord(
                time_unit.date2num(field_headings['time']),
                standard_name='time', units=time_unit)
            cube.add_aux_coord(time_coord)

            # build a coordinate system which can be referenced by latitude
            # and longitude coordinates
            lat_lon_coord_system = icoord_systems.GeogCS(6371229)

            # build regular latitude and longitude coordinates which have
            # bounds
            start = header['X grid origin'] + header['X grid resolution']
            step = header['X grid resolution']
            count = header['X grid size']
            pts = start + np.arange(count, dtype=np.float32) * step
            lon_coord = icoords.DimCoord(pts, standard_name='longitude',
                                         units='degrees',
                                         coord_system=lat_lon_coord_system)
            lon_coord.guess_bounds()

            start = header['Y grid origin'] + header['Y grid resolution']
            step = header['Y grid resolution']
            count = header['Y grid size']
            pts = start + np.arange(count, dtype=np.float32) * step
            lat_coord = icoords.DimCoord(pts, standard_name='latitude',
                                         units='degrees',
                                         coord_system=lat_lon_coord_system)
            lat_coord.guess_bounds()

            # add the latitude and longitude coordinates to the cube, with
            # mappings to data dimensions
            cube.add_dim_coord(lat_coord, 0)
            cube.add_dim_coord(lon_coord, 1)

            # implement standard iris callback capability. Although callbacks
            # are not used in this example, the standard mechanism for a
            # custom loader to implement a callback is shown:
            cube = iris.io.run_callback(callback, cube,
                                        [header, field_headings, data_array],
                                        filename)

            # yield the cube created (the loop will continue when the next()
            # element is requested)
            yield cube


# Create a format_picker specification of the NAME file format giving it a
# priority greater than the built in NAME loader.
_NAME_III_spec = format_picker.FormatSpecification(
    'Name III', format_picker.LeadingLine(),
    lambda line: line.startswith("NAME III"), NAME_to_cube,
    priority=6)

# Register the NAME loader with iris
iris.fileformats.FORMAT_AGENT.add_spec(_NAME_III_spec)


# ---------------------------------------------
# |          Using the new loader             |
# ---------------------------------------------

def main():
    fname = iris.sample_data_path('NAME_output.txt')

    boundary_volc_ash_constraint = iris.Constraint(
        'VOLCANIC_ASH_AIR_CONCENTRATION',
        flight_level='From FL000 - FL200')

    # Callback shown as None to illustrate where a cube-level callback
    # function would be used if required
    cube = iris.load_cube(fname, boundary_volc_ash_constraint, callback=None)

    # draw contour levels for the data (the top level is just a catch-all)
    levels = (0.0002, 0.002, 0.004, 1e10)
    cs = iplt.contourf(cube, levels=levels,
                       colors=('#80ffff', '#939598', '#e00404'),
                       )

    # draw a black outline at the lowest contour to highlight affected areas
    iplt.contour(cube, levels=(levels[0], 100),
                 colors='black')

    # set an extent and a background image for the map
    ax = plt.gca()
    ax.set_extent((-90, 20, 20, 75))
    ax.stock_img('ne_shaded')

    # make a legend, with custom labels, for the coloured contour set
    artists, _ = cs.legend_elements()
    labels = [
        r'$%s < x \leq %s$' % (levels[0], levels[1]),
        r'$%s < x \leq %s$' % (levels[1], levels[2]),
        r'$x > %s$' % levels[2]]
    ax.legend(artists, labels, title='Ash concentration / g m-3',
              loc='upper left')

    time = cube.coord('time')
    time_date = time.units.num2date(time.points[0]).strftime(UTC_format)
    plt.title('Volcanic ash concentration forecast\nvalid at %s' % time_date)

    iplt.show()


if __name__ == '__main__':
    main()
"""
-Quickplot of a 2d cube on a map
-===============================
-
-This example demonstrates a contour plot of global air temperature.
-The plot title and the labels for the axes are automatically derived from the metadata.
-
-"""
-importcartopy.crsasccrs
-importmatplotlib.pyplotasplt
-
-importiris
-importiris.plotasiplt
-importiris.quickplotasqplt
-
-
-defmain():
- fname=iris.sample_data_path('air_temp.pp')
- temperature=iris.load_cube(fname)
-
- # Plot #1: contourf with axes longitude from -180 to 180
- fig=plt.figure(figsize=(12,5))
- plt.subplot(121)
- qplt.contourf(temperature,15)
- plt.gca().coastlines()
-
- # Plot #2: contourf with axes longitude from 0 to 360
- proj=ccrs.PlateCarree(central_longitude=-180.0)
- ax=plt.subplot(122,projection=proj)
- qplt.contourf(temperature,15)
- plt.gca().coastlines()
- iplt.show()
-
-if__name__=='__main__':
- main()
-
"""
-Multi-line temperature profile plot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-"""
-importmatplotlib.pyplotasplt
-
-importiris
-importiris.plotasiplt
-importiris.quickplotasqplt
-
-
-defmain():
- fname=iris.sample_data_path('air_temp.pp')
-
- # Load exactly one cube from the given file.
- temperature=iris.load_cube(fname)
-
- # We only want a small number of latitudes, so filter some out
- # using "extract".
- temperature=temperature.extract(
- iris.Constraint(latitude=lambdacell:68<=cell<78))
-
- forcubeintemperature.slices('longitude'):
-
- # Create a string label to identify this cube (i.e. latitude: value).
- cube_label='latitude: %s'%cube.coord('latitude').points[0]
-
- # Plot the cube, and associate it with a label.
- qplt.plot(cube,label=cube_label)
-
- # Add the legend with 2 columns.
- plt.legend(ncol=2)
-
- # Put a grid on the plot.
- plt.grid(True)
-
- # Tell matplotlib not to extend the plot axes range to nicely
- # rounded numbers.
- plt.axis('tight')
-
- # Finally, show it.
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
"""
-Tri-Polar Grid Projected Plotting
-=================================
-
-This example demonstrates cell plots of data on the semi-structured ORCA2 model
-grid.
-
-First, the data is projected into the PlateCarree coordinate reference system.
-
-Second four pcolormesh plots are created from this projected dataset,
-using different projections for the output image.
-
-"""
-
-importmatplotlib.pyplotasplt
-
-importcartopy.crsasccrs
-importiris
-importiris.analysis.cartography
-importiris.plotasiplt
-importiris.quickplotasqplt
-
-
-defmain():
- # Load data
- filepath=iris.sample_data_path('orca2_votemper.nc')
- cube=iris.load_cube(filepath)
-
- # Choose plot projections
- projections={}
- projections['Mollweide']=ccrs.Mollweide()
- projections['PlateCarree']=ccrs.PlateCarree()
- projections['NorthPolarStereo']=ccrs.NorthPolarStereo()
- projections['Orthographic']=ccrs.Orthographic(central_longitude=-90,
- central_latitude=45)
-
- pcarree=projections['PlateCarree']
- # Transform cube to target projection
- new_cube,extent=iris.analysis.cartography.project(cube,pcarree,
- nx=400,ny=200)
-
- # Plot data in each projection
- fornameinsorted(projections):
- fig=plt.figure()
- fig.suptitle('ORCA2 Data Projected to {}'.format(name))
- # Set up axes and title
- ax=plt.subplot(projection=projections[name])
- # Set limits
- ax.set_global()
- # plot with Iris quickplot pcolormesh
- qplt.pcolormesh(new_cube)
- # Draw coastlines
- ax.coastlines()
-
- iplt.show()
-
-if__name__=='__main__':
- main()
-
"""
-Fitting a polynomial
-====================
-
-This example demonstrates computing a polynomial fit to 1D data from an Iris
-cube, adding the fit to the cube's metadata, and plotting both the 1D data and
-the fit.
-
-"""
-
-importmatplotlib.pyplotasplt
-importnumpyasnp
-
-importiris
-importiris.quickplotasqplt
-
-
-defmain():
- fname=iris.sample_data_path('A1B_north_america.nc')
- cube=iris.load_cube(fname)
-
- # Extract a single time series at a latitude and longitude point.
- location=next(cube.slices(['time']))
-
- # Calculate a polynomial fit to the data at this time series.
- x_points=location.coord('time').points
- y_points=location.data
- degree=2
-
- p=np.polyfit(x_points,y_points,degree)
- y_fitted=np.polyval(p,x_points)
-
- # Add the polynomial fit values to the time series to take
- # full advantage of Iris plotting functionality.
- long_name='degree_{}_polynomial_fit_of_{}'.format(degree,cube.name())
- fit=iris.coords.AuxCoord(y_fitted,long_name=long_name,
- units=location.units)
- location.add_aux_coord(fit,0)
-
- qplt.plot(location.coord('time'),location,label='data')
- qplt.plot(location.coord('time'),
- location.coord(long_name),
- 'g-',label='polynomial fit')
- plt.legend(loc='best')
- plt.title('Trend of US air temperature over time')
-
- qplt.show()
-
-
-if__name__=='__main__':
- main()
-
"""
-Plotting in different projections
-=================================
-
-This example shows how to overlay data and graphics in different projections,
-demonstrating various features of Iris, Cartopy and matplotlib.
-
-We wish to overlay two datasets, defined on different rotated-pole grids.
-To display both together, we make a pseudocoloured plot of the first, overlaid
-with contour lines from the second.
-We also add some lines and text annotations drawn in various projections.
-
-We plot these over a specified region, in two different map projections.
-
-"""
-importcartopy.crsasccrs
-importiris
-importiris.plotasiplt
-importnumpyasnp
-importmatplotlib.pyplotasplt
-
-
-# Define a Cartopy 'ordinary' lat-lon coordinate reference system.
-crs_latlon=ccrs.PlateCarree()
-
-
-defmake_plot(projection_name,projection_crs):
-
- # Create a matplotlib Figure.
- fig=plt.figure()
-
- # Add a matplotlib Axes, specifying the required display projection.
- # NOTE: specifying 'projection' (a "cartopy.crs.Projection") makes the
- # resulting Axes a "cartopy.mpl.geoaxes.GeoAxes", which supports plotting
- # in different coordinate systems.
- ax=plt.axes(projection=projection_crs)
-
- # Set display limits to include a set region of latitude * longitude.
- # (Note: Cartopy-specific).
- ax.set_extent((-80.0,20.0,10.0,80.0),crs=crs_latlon)
-
- # Add coastlines and meridians/parallels (Cartopy-specific).
- ax.coastlines(linewidth=0.75,color='navy')
- ax.gridlines(crs=crs_latlon,linestyle='-')
-
- # Plot the first dataset as a pseudocolour filled plot.
- maindata_filepath=iris.sample_data_path('rotated_pole.nc')
- main_data=iris.load_cube(maindata_filepath)
- # NOTE: iplt.pcolormesh calls "pyplot.pcolormesh", passing in a coordinate
- # system with the 'transform' keyword: This enables the Axes (a cartopy
- # GeoAxes) to reproject the plot into the display projection.
- iplt.pcolormesh(main_data,cmap='RdBu_r')
-
- # Overplot the other dataset (which has a different grid), as contours.
- overlay_filepath=iris.sample_data_path('space_weather.nc')
- overlay_data=iris.load_cube(overlay_filepath,'total electron content')
- # NOTE: as above, "iris.plot.contour" calls "pyplot.contour" with a
- # 'transform' keyword, enabling Cartopy reprojection.
- iplt.contour(overlay_data,20,
- linewidths=2.0,colors='darkgreen',linestyles='-')
-
- # Draw a margin line, some way in from the border of the 'main' data...
- # First calculate rectangle corners, 7% in from each corner of the data.
- x_coord,y_coord=main_data.coord(axis='x'),main_data.coord(axis='y')
- x_start,x_end=np.min(x_coord.points),np.max(x_coord.points)
- y_start,y_end=np.min(y_coord.points),np.max(y_coord.points)
- margin=0.07
- margin_fractions=np.array([margin,1.0-margin])
- x_lower,x_upper=x_start+(x_end-x_start)*margin_fractions
- y_lower,y_upper=y_start+(y_end-y_start)*margin_fractions
- box_x_points=x_lower+(x_upper-x_lower)*np.array([0,1,1,0,0])
- box_y_points=y_lower+(y_upper-y_lower)*np.array([0,0,1,1,0])
- # Get the Iris coordinate sytem of the X coordinate (Y should be the same).
- cs_data1=x_coord.coord_system
- # Construct an equivalent Cartopy coordinate reference system ("crs").
- crs_data1=cs_data1.as_cartopy_crs()
- # Draw the rectangle in this crs, with matplotlib "pyplot.plot".
- # NOTE: the 'transform' keyword specifies a non-display coordinate system
- # for the plot points (as used by the "iris.plot" functions).
- plt.plot(box_x_points,box_y_points,transform=crs_data1,
- linewidth=2.0,color='white',linestyle='--')
-
- # Mark some particular places with a small circle and a name label...
- # Define some test points with latitude and longitude coordinates.
- city_data=[('London',51.5072,0.1275),
- ('Halifax, NS',44.67,-63.61),
- ('Reykjavik',64.1333,-21.9333)]
- # Place a single marker point and a text annotation at each place.
- forname,lat,lonincity_data:
- plt.plot(lon,lat,marker='o',markersize=7.0,markeredgewidth=2.5,
- markerfacecolor='black',markeredgecolor='white',
- transform=crs_latlon)
- # NOTE: the "plt.annotate call" does not have a "transform=" keyword,
- # so for this one we transform the coordinates with a Cartopy call.
- at_x,at_y=ax.projection.transform_point(lon,lat,
- src_crs=crs_latlon)
- plt.annotate(
- name,xy=(at_x,at_y),xytext=(30,20),textcoords='offset points',
- color='black',backgroundcolor='white',size='large',
- arrowprops=dict(arrowstyle='->',color='white',linewidth=2.5))
-
- # Add a title, and display.
- plt.title('A pseudocolour plot on the {} projection,\n'
- 'with overlaid contours.'.format(projection_name))
- iplt.show()
-
-
-defmain():
- # Demonstrate with two different display projections.
- make_plot('Equidistant Cylindrical',ccrs.PlateCarree())
- make_plot('North Polar Stereographic',ccrs.NorthPolarStereo())
-
-
-if__name__=='__main__':
- main()
-
"""
Rotated pole mapping
=====================

This example uses several visualisation methods to achieve an array of
differing images, including:

 * Visualisation of point based data
 * Contouring of point based data
 * Block plot of contiguous bounded data
 * Non-native projection and a Natural Earth shaded relief image underlay

"""
import cartopy.crs as ccrs
import matplotlib.pyplot as plt

import iris
import iris.plot as iplt
import iris.quickplot as qplt
import iris.analysis.cartography


def main():
    fname = iris.sample_data_path('rotated_pole.nc')
    air_pressure = iris.load_cube(fname)

    # Plot #1: Point plot showing data values & a colorbar
    plt.figure()
    points = qplt.points(air_pressure, c=air_pressure.data)
    cb = plt.colorbar(points, orientation='horizontal')
    cb.set_label(air_pressure.units)
    plt.gca().coastlines()
    iplt.show()

    # Plot #2: Contourf of the point based data
    plt.figure()
    qplt.contourf(air_pressure, 15)
    plt.gca().coastlines()
    iplt.show()

    # Plot #3: Contourf overlaid by coloured point data
    plt.figure()
    qplt.contourf(air_pressure)
    iplt.points(air_pressure, c=air_pressure.data)
    plt.gca().coastlines()
    iplt.show()

    # For the purposes of this example, add some bounds to the latitude
    # and longitude
    air_pressure.coord('grid_latitude').guess_bounds()
    air_pressure.coord('grid_longitude').guess_bounds()

    # Plot #4: Block plot
    plt.figure()
    plt.axes(projection=ccrs.PlateCarree())
    iplt.pcolormesh(air_pressure)
    plt.gca().stock_img()
    plt.gca().coastlines()
    iplt.show()


if __name__ == '__main__':
    main()

Produces a time-series plot of North American temperature forecasts for 2 different emission scenarios.
Constraining data to a limited spatial area also features in this example.

The data used comes from the HadGEM2-AO model simulations for the A1B and E1 scenarios, both of which
were derived using the IMAGE Integrated Assessment Model (Johns et al. 2010; Lowe et al. 2009).

References

Johns T.C., et al. (2010) Climate change under aggressive mitigation: The ENSEMBLES multi-model
experiment. Climate Dynamics (submitted)

Lowe J.A., C.D. Hewitt, D.P. Van Vuuren, T.C. Johns, E. Stehfest, J-F. Royer, and P. van der Linden, 2009.
New Study For Climate Modeling, Analyses, and Scenarios. Eos Trans. AGU, Vol 90, No. 21.

See also

Further details on the aggregation functionality being used in this example can be found in
Cube statistics.

"""
Global average annual temperature plot
======================================

Produces a time-series plot of North American temperature forecasts for 2 different emission scenarios.
Constraining data to a limited spatial area also features in this example.

The data used comes from the HadGEM2-AO model simulations for the A1B and E1 scenarios, both of which
were derived using the IMAGE Integrated Assessment Model (Johns et al. 2010; Lowe et al. 2009).

References
----------

    Johns T.C., et al. (2010) Climate change under aggressive mitigation: The ENSEMBLES multi-model
    experiment. Climate Dynamics (submitted)

    Lowe J.A., C.D. Hewitt, D.P. Van Vuuren, T.C. Johns, E. Stehfest, J-F. Royer, and P. van der Linden, 2009.
    New Study For Climate Modeling, Analyses, and Scenarios. Eos Trans. AGU, Vol 90, No. 21.

.. seealso::

    Further details on the aggregation functionality being used in this example can be found in
    :ref:`cube-statistics`.

"""
import numpy as np
import matplotlib.pyplot as plt
import iris
import iris.plot as iplt
import iris.quickplot as qplt

import iris.analysis.cartography
import matplotlib.dates as mdates


def main():
    # Load data into three Cubes, one for each set of NetCDF files.
    e1 = iris.load_cube(iris.sample_data_path('E1_north_america.nc'))

    a1b = iris.load_cube(iris.sample_data_path('A1B_north_america.nc'))

    # Load in the global pre-industrial mean temperature, and limit the domain
    # to the same North American region that e1 and a1b cover.
    north_america = iris.Constraint(longitude=lambda v: 225 <= v <= 315,
                                    latitude=lambda v: 15 <= v <= 60)
    pre_industrial = iris.load_cube(iris.sample_data_path('pre-industrial.pp'),
                                    north_america)

    # Generate area-weights array. As e1 and a1b are on the same grid we can
    # do this just once and re-use. This method requires bounds on lat/lon
    # coords, so let's add some in sensible locations using the "guess_bounds"
    # method.
    e1.coord('latitude').guess_bounds()
    e1.coord('longitude').guess_bounds()
    e1_grid_areas = iris.analysis.cartography.area_weights(e1)
    pre_industrial.coord('latitude').guess_bounds()
    pre_industrial.coord('longitude').guess_bounds()
    pre_grid_areas = iris.analysis.cartography.area_weights(pre_industrial)

    # Perform the area-weighted mean for each of the datasets using the
    # computed grid-box areas.
    pre_industrial_mean = pre_industrial.collapsed(['latitude', 'longitude'],
                                                   iris.analysis.MEAN,
                                                   weights=pre_grid_areas)
    e1_mean = e1.collapsed(['latitude', 'longitude'],
                           iris.analysis.MEAN,
                           weights=e1_grid_areas)
    a1b_mean = a1b.collapsed(['latitude', 'longitude'],
                             iris.analysis.MEAN,
                             weights=e1_grid_areas)

    # Show ticks 30 years apart
    plt.gca().xaxis.set_major_locator(mdates.YearLocator(30))

    # Plot the datasets
    qplt.plot(e1_mean, label='E1 scenario', lw=1.5, color='blue')
    qplt.plot(a1b_mean, label='A1B-Image scenario', lw=1.5, color='red')

    # Draw a horizontal line showing the pre-industrial mean
    plt.axhline(y=pre_industrial_mean.data, color='gray', linestyle='dashed',
                label='pre-industrial', lw=1.5)

    # Establish where a1b_mean and e1_mean have the same data, i.e. the
    # observations
    common = np.where(a1b_mean.data == e1_mean.data)[0]
    observed = a1b_mean[common]

    # Plot the observed data
    qplt.plot(observed, label='observed', color='black', lw=1.5)

    # Add a legend and title
    plt.legend(loc="upper left")
    plt.title('North American mean air temperature', fontsize=18)

    plt.xlabel('Time / year')

    plt.grid()

    iplt.show()


if __name__ == '__main__':
    main()

Produces maps of global temperature forecasts from the A1B and E1 scenarios.

The data used comes from the HadGEM2-AO model simulations for the A1B and E1 scenarios, both of which were derived using the IMAGE
Integrated Assessment Model (Johns et al. 2010; Lowe et al. 2009).

References

Johns T.C., et al. (2010) Climate change under aggressive mitigation: The ENSEMBLES multi-model experiment. Climate
Dynamics (submitted)

Lowe J.A., C.D. Hewitt, D.P. Van Vuuren, T.C. Johns, E. Stehfest, J-F. Royer, and P. van der Linden, 2009. New
Study For Climate Modeling, Analyses, and Scenarios. Eos Trans. AGU, Vol 90, No. 21.

"""
Global average annual temperature maps
======================================

Produces maps of global temperature forecasts from the A1B and E1 scenarios.

The data used comes from the HadGEM2-AO model simulations for the A1B and E1 scenarios, both of which were derived using the IMAGE
Integrated Assessment Model (Johns et al. 2010; Lowe et al. 2009).

References
----------

    Johns T.C., et al. (2010) Climate change under aggressive mitigation: The ENSEMBLES multi-model experiment. Climate
    Dynamics (submitted)

    Lowe J.A., C.D. Hewitt, D.P. Van Vuuren, T.C. Johns, E. Stehfest, J-F. Royer, and P. van der Linden, 2009. New
    Study For Climate Modeling, Analyses, and Scenarios. Eos Trans. AGU, Vol 90, No. 21.


"""
import os.path
import matplotlib.pyplot as plt
import numpy as np

import iris
import iris.coords as coords
import iris.plot as iplt


def cop_metadata_callback(cube, field, filename):
    """A function which adds an "Experiment" coordinate which comes from the
    filename."""

    # Extract the experiment name (such as a1b or e1) from the filename (in
    # this case it is just the parent folder's name)
    containing_folder = os.path.dirname(filename)
    experiment_label = os.path.basename(containing_folder)

    # Create a coordinate with the experiment label in it
    exp_coord = coords.AuxCoord(experiment_label, long_name='Experiment',
                                units='no_unit')

    # and add it to the cube
    cube.add_aux_coord(exp_coord)


def main():
    # Load e1 and a1b using the callback to update the metadata
    e1 = iris.load_cube(iris.sample_data_path('E1.2098.pp'),
                        callback=cop_metadata_callback)
    a1b = iris.load_cube(iris.sample_data_path('A1B.2098.pp'),
                         callback=cop_metadata_callback)

    # Load the global average data
    global_avg = iris.load_cube(iris.sample_data_path('pre-industrial.pp'))

    # Define evenly spaced contour levels: -2.5, -1.5, ... 15.5, 16.5 with
    # the specific colours
    levels = np.arange(20) - 2.5
    red = np.array([0, 0, 221, 239, 229, 217, 239, 234, 228, 222, 205, 196,
                    161, 137, 116, 89, 77, 60, 51]) / 256.
    green = np.array([16, 217, 242, 243, 235, 225, 190, 160, 128, 87, 72, 59,
                      33, 21, 29, 30, 30, 29, 26]) / 256.
    blue = np.array([255, 255, 243, 169, 99, 51, 63, 37, 39, 21, 27, 23, 22,
                     26, 29, 28, 27, 25, 22]) / 256.

    # Put those colours into an array which can be passed to contourf as the
    # specific colours for each level
    colors = np.array([red, green, blue]).T

    # Iterate over each latitude longitude slice for both e1 and a1b
    # scenarios simultaneously, subtracting the global average from each to
    # give an anomaly
    for e1_slice, a1b_slice in zip(e1.slices(['latitude', 'longitude']),
                                   a1b.slices(['latitude', 'longitude'])):

        time_coord = a1b_slice.coord('time')

        # Calculate the difference from the mean
        delta_e1 = e1_slice - global_avg
        delta_a1b = a1b_slice - global_avg

        # Make a wider than normal figure to house two maps side-by-side
        fig = plt.figure(figsize=(12, 5))

        # Get the time datetime from the coordinate
        time = time_coord.units.num2date(time_coord.points[0])
        # Set a title for the entire figure, giving the time in a nice format
        # of "MonthName Year". Also, set the y value for the title so that it
        # is not tight to the top of the plot.
        fig.suptitle('Annual Temperature Predictions for ' +
                     time.strftime("%Y"), y=0.9, fontsize=18)

        # Add the first subplot showing the E1 scenario
        plt.subplot(121)
        plt.title('HadGEM2 E1 Scenario', fontsize=10)
        iplt.contourf(delta_e1, levels, colors=colors, linewidth=0,
                      extend='both')
        plt.gca().coastlines()
        # get the current axes' subplot for use later on
        plt1_ax = plt.gca()

        # Add the second subplot showing the A1B scenario
        plt.subplot(122)
        plt.title('HadGEM2 A1B-Image Scenario', fontsize=10)
        contour_result = iplt.contourf(delta_a1b, levels, colors=colors,
                                       linewidth=0, extend='both')
        plt.gca().coastlines()
        # get the current axes' subplot for use later on
        plt2_ax = plt.gca()

        # Now add a colourbar whose leftmost point is the same as the
        # leftmost point of the left hand plot and rightmost point is the
        # rightmost point of the right hand plot

        # Get the positions of the 2nd plot and the left position of the
        # 1st plot
        left, bottom, width, height = plt2_ax.get_position().bounds
        first_plot_left = plt1_ax.get_position().bounds[0]

        # the width of the colorbar should now be simple
        width = left - first_plot_left + width

        # Add axes to the figure, to place the colour bar
        colorbar_axes = fig.add_axes([first_plot_left, bottom + 0.07,
                                      width, 0.03])

        # Add the colour bar
        cbar = plt.colorbar(contour_result, colorbar_axes,
                            orientation='horizontal')

        # Label the colour bar and add ticks
        cbar.set_label(e1_slice.units)
        cbar.ax.tick_params(length=0)

        iplt.show()


if __name__ == '__main__':
    main()

This space weather example plots a filled contour of rotated pole point
data with a shaded relief image underlay. The plot shows aggregated
vertical electron content in the ionosphere.

The plot exhibits an interesting outline effect due to excluding data
values below a certain threshold.

"""
Ionosphere space weather
========================

This space weather example plots a filled contour of rotated pole point
data with a shaded relief image underlay. The plot shows aggregated
vertical electron content in the ionosphere.

The plot exhibits an interesting outline effect due to excluding data
values below a certain threshold.

"""

import matplotlib.pyplot as plt
import numpy.ma as ma

import iris
import iris.plot as iplt
import iris.quickplot as qplt


def main():
    # Load the "total electron content" cube.
    filename = iris.sample_data_path('space_weather.nc')
    cube = iris.load_cube(filename, 'total electron content')

    # Explicitly mask negative electron content.
    cube.data = ma.masked_less(cube.data, 0)

    # Plot the cube using one hundred colour levels.
    qplt.contourf(cube, 100)
    plt.title('Total Electron Content')
    plt.xlabel('longitude / degrees')
    plt.ylabel('latitude / degrees')
    plt.gca().stock_img()
    plt.gca().coastlines()
    iplt.show()


if __name__ == '__main__':
    main()

This example shows some processing of cubes in order to derive further related
cubes; in this case the derived cubes are Exner pressure and air temperature
which are calculated by combining air pressure, air potential temperature and
specific humidity. Finally, the two new cubes are presented side-by-side in a
plot.

"""
Deriving Exner Pressure and Air Temperature
===========================================

This example shows some processing of cubes in order to derive further related
cubes; in this case the derived cubes are Exner pressure and air temperature
which are calculated by combining air pressure, air potential temperature and
specific humidity. Finally, the two new cubes are presented side-by-side in a
plot.

"""
import matplotlib.pyplot as plt
import matplotlib.ticker

import iris
import iris.coords as coords
import iris.iterate
import iris.plot as iplt
import iris.quickplot as qplt


def limit_colorbar_ticks(contour_object):
    """
    Takes a contour object which has an associated colorbar and limits the
    number of ticks on the colorbar to 4.

    """
    # Under Matplotlib v1.2.x the colorbar attribute of a contour object is
    # a tuple containing the colorbar and an axes object, whereas under
    # Matplotlib v1.3.x it is simply the colorbar.
    try:
        colorbar = contour_object.colorbar[0]
    except AttributeError:
        colorbar = contour_object.colorbar

    colorbar.locator = matplotlib.ticker.MaxNLocator(4)
    colorbar.update_ticks()


def main():
    fname = iris.sample_data_path('colpex.pp')

    # The list of phenomena of interest
    phenomena = ['air_potential_temperature', 'air_pressure']

    # Define the constraint on standard name and model level
    constraints = [iris.Constraint(phenom, model_level_number=1) for
                   phenom in phenomena]

    air_potential_temperature, air_pressure = iris.load_cubes(fname,
                                                              constraints)

    # Define a coordinate which represents 1000 hPa
    p0 = coords.AuxCoord(1000, long_name='P0', units='hPa')
    # Convert reference pressure 'p0' into the same units as 'air_pressure'
    p0.convert_units(air_pressure.units)

    # Calculate Exner pressure
    exner_pressure = (air_pressure / p0) ** (287.05 / 1005.0)
    # Set the name (the unit is scalar)
    exner_pressure.rename('exner_pressure')

    # Calculate air_temp
    air_temperature = exner_pressure * air_potential_temperature
    # Set the name (the unit is K)
    air_temperature.rename('air_temperature')

    # Now create an iterator which will give us lat lon slices of
    # exner pressure and air temperature in the form
    # (exner_slice, air_temp_slice).
    lat_lon_slice_pairs = iris.iterate.izip(exner_pressure,
                                            air_temperature,
                                            coords=['grid_latitude',
                                                    'grid_longitude'])

    # For the purposes of this example, we only want to demonstrate the first
    # plot.
    lat_lon_slice_pairs = [next(lat_lon_slice_pairs)]

    plt.figure(figsize=(8, 4))
    for exner_slice, air_temp_slice in lat_lon_slice_pairs:
        plt.subplot(121)
        cont = qplt.contourf(exner_slice)

        # The default colorbar has a few too many ticks on it, causing text to
        # overlap. Therefore, limit the number of ticks.
        limit_colorbar_ticks(cont)

        plt.subplot(122)
        cont = qplt.contourf(air_temp_slice)
        limit_colorbar_ticks(cont)
        iplt.show()


if __name__ == '__main__':
    main()

This example demonstrates the creation of a Hovmoller diagram with fine control over plot ticks and labels.
The data comes from the Met Office OSTIA project and has been pre-processed to calculate the monthly mean sea
surface temperature.

"""
Hovmoller diagram of monthly surface temperature
================================================

This example demonstrates the creation of a Hovmoller diagram with fine control over plot ticks and labels.
The data comes from the Met Office OSTIA project and has been pre-processed to calculate the monthly mean sea
surface temperature.

"""
import matplotlib.pyplot as plt
import matplotlib.dates as mdates

import iris
import iris.plot as iplt
import iris.quickplot as qplt
import iris.unit


def main():
    fname = iris.sample_data_path('ostia_monthly.nc')

    # load a single cube of surface temperature between +/- 5 latitude
    cube = iris.load_cube(
        fname,
        iris.Constraint('surface_temperature',
                        latitude=lambda v: -5 < v < 5))

    # Take the mean over latitude
    cube = cube.collapsed('latitude', iris.analysis.MEAN)

    # Now that we have our data in a nice way, let's create the plot:
    # contour with 20 levels
    qplt.contourf(cube, 20)

    # Put a custom label on the y axis
    plt.ylabel('Time / years')

    # Stop matplotlib providing clever axes range padding
    plt.axis('tight')

    # As we are plotting annual variability, put years as the y ticks
    plt.gca().yaxis.set_major_locator(mdates.YearLocator())

    # And format the ticks to just show the year
    plt.gca().yaxis.set_major_formatter(mdates.DateFormatter('%Y'))

    iplt.show()


if __name__ == '__main__':
    main()

This example demonstrates the loading of a lagged ensemble dataset from the GloSea4 model, which is then used to
produce two types of plot:

 * The first shows the "postage stamp" style image with an array of 14 images, one for each ensemble member with
   a shared colorbar. (The missing image in this example represents ensemble member number 6 which was a failed run)

 * The second plot shows the data limited to a region of interest, in this case a region defined for forecasting
   ENSO (El Nino-Southern Oscillation), which, for the purposes of this example, has had the ensemble mean subtracted
   from each ensemble member to give an anomaly surface temperature. In practice a better approach would be to take the
   climatological mean, calibrated to the model, from each ensemble member.

"""
Seasonal ensemble model plots
=============================

This example demonstrates the loading of a lagged ensemble dataset from the GloSea4 model, which is then used to
produce two types of plot:

 * The first shows the "postage stamp" style image with an array of 14 images, one for each ensemble member with
   a shared colorbar. (The missing image in this example represents ensemble member number 6 which was a failed run)

 * The second plot shows the data limited to a region of interest, in this case a region defined for forecasting
   ENSO (El Nino-Southern Oscillation), which, for the purposes of this example, has had the ensemble mean subtracted
   from each ensemble member to give an anomaly surface temperature. In practice a better approach would be to take the
   climatological mean, calibrated to the model, from each ensemble member.

"""
import matplotlib.pyplot as plt
import numpy as np

import iris
import iris.plot as iplt


def realization_metadata(cube, field, fname):
    """
    A function which modifies the cube's metadata to add a "realization"
    (ensemble member) coordinate from the filename if one doesn't already
    exist in the cube.

    """
    # add an ensemble member coordinate if one doesn't already exist
    if not cube.coords('realization'):
        # the ensemble member is encoded in the filename as *_???.pp where
        # ??? is the ensemble member
        realization_number = fname[-6:-3]

        import iris.coords
        realization_coord = iris.coords.AuxCoord(
            np.int32(realization_number), 'realization')
        cube.add_aux_coord(realization_coord)


def main():
    # extract surface temperature cubes which have an ensemble member
    # coordinate, adding appropriate lagged ensemble metadata
    surface_temp = iris.load_cube(
        iris.sample_data_path('GloSea4', 'ensemble_???.pp'),
        iris.Constraint('surface_temperature',
                        realization=lambda value: True),
        callback=realization_metadata,
        )

    # ------------------------------------------------------------------------
    # Plot #1: Ensemble postage stamps
    # ------------------------------------------------------------------------

    # for the purposes of this example, take the last time element of the cube
    last_timestep = surface_temp[:, -1, :, :]

    # Make 50 evenly spaced levels which span the dataset
    contour_levels = np.linspace(np.min(last_timestep.data),
                                 np.max(last_timestep.data), 50)

    # Create a wider than normal figure to support our many plots
    plt.figure(figsize=(12, 6), dpi=100)

    # Also manually adjust the spacings which are used when creating subplots
    plt.gcf().subplots_adjust(hspace=0.05, wspace=0.05, top=0.95,
                              bottom=0.05, left=0.075, right=0.925)

    # iterate over all possible latitude longitude slices
    for cube in last_timestep.slices(['latitude', 'longitude']):

        # get the ensemble member number from the ensemble coordinate
        ens_member = cube.coord('realization').points[0]

        # plot the data in a 4x4 grid, with each plot's position in the grid
        # being determined by ensemble member number; the special case for
        # the 13th ensemble member is to have the plot at the bottom right
        if ens_member == 13:
            plt.subplot(4, 4, 16)
        else:
            plt.subplot(4, 4, ens_member + 1)

        cf = iplt.contourf(cube, contour_levels)

        # add coastlines
        plt.gca().coastlines()

    # make an axes to put the shared colorbar in
    colorbar_axes = plt.gcf().add_axes([0.35, 0.1, 0.3, 0.05])
    colorbar = plt.colorbar(cf, colorbar_axes, orientation='horizontal')
    colorbar.set_label('%s' % last_timestep.units)

    # limit the colorbar to 8 tick marks
    import matplotlib.ticker
    colorbar.locator = matplotlib.ticker.MaxNLocator(8)
    colorbar.update_ticks()

    # get the time for the entire plot
    time_coord = last_timestep.coord('time')
    time = time_coord.units.num2date(time_coord.bounds[0, 0])

    # set a global title for the postage stamps with the date formatted by
    # "monthname year"
    plt.suptitle('Surface temperature ensemble forecasts for %s' %
                 time.strftime('%B %Y'))

    iplt.show()

    # ------------------------------------------------------------------------
    # Plot #2: ENSO plumes
    # ------------------------------------------------------------------------

    # Nino 3.4 lies between: 170W and 120W, 5N and 5S, so define a constraint
    # which matches this
    nino_3_4_constraint = iris.Constraint(
        longitude=lambda v: -170 + 360 <= v <= -120 + 360,
        latitude=lambda v: -5 <= v <= 5)

    nino_cube = surface_temp.extract(nino_3_4_constraint)

    # Subsetting a circular longitude coordinate always results in a circular
    # coordinate, so set the coordinate to be non-circular
    nino_cube.coord('longitude').circular = False

    # Calculate the horizontal mean for the nino region
    mean = nino_cube.collapsed(['latitude', 'longitude'], iris.analysis.MEAN)

    # Calculate the ensemble mean of the horizontal mean. To do this, remove
    # the "forecast_period" and "forecast_reference_time" coordinates which
    # span both "realization" and "time".
    mean.remove_coord("forecast_reference_time")
    mean.remove_coord("forecast_period")
    ensemble_mean = mean.collapsed('realization', iris.analysis.MEAN)

    # take the ensemble mean from each ensemble member
    mean -= ensemble_mean.data

    plt.figure()

    for ensemble_member in mean.slices(['time']):
        # draw each ensemble member as a dashed line in black
        iplt.plot(ensemble_member, '--k')

    plt.title('Mean temperature anomaly for ENSO 3.4 region')
    plt.xlabel('Time')
    plt.ylabel('Temperature anomaly / K')

    iplt.show()


if __name__ == '__main__':
    main()

This example demonstrates how to plot vertical profiles of different
variables in the same axes, and how to make a scatter plot of two
variables. There is an oceanographic theme but the same techniques are
equally applicable to atmospheric or other kinds of data.

The data used are profiles of potential temperature and salinity in the
Equatorial and South Atlantic, output from an ocean model.

The y-axis of the first plot produced will be automatically inverted due to the
presence of the attribute positive=down on the depth coordinate. This means
depth values intuitively increase downward on the y-axis.

"""
Oceanographic profiles and T-S diagrams
=======================================

This example demonstrates how to plot vertical profiles of different
variables in the same axes, and how to make a scatter plot of two
variables. There is an oceanographic theme but the same techniques are
equally applicable to atmospheric or other kinds of data.

The data used are profiles of potential temperature and salinity in the
Equatorial and South Atlantic, output from an ocean model.

The y-axis of the first plot produced will be automatically inverted due to the
presence of the attribute positive=down on the depth coordinate. This means
depth values intuitively increase downward on the y-axis.

"""
import iris
import iris.iterate
import iris.plot as iplt
import matplotlib.pyplot as plt


def main():

    # Load the gridded temperature and salinity data.
    fname = iris.sample_data_path('atlantic_profiles.nc')
    cubes = iris.load(fname)
    theta, = cubes.extract('sea_water_potential_temperature')
    salinity, = cubes.extract('sea_water_practical_salinity')

    # Extract profiles of temperature and salinity from a particular point in
    # the southern portion of the domain, and limit the depth of the profile
    # to 1000m.
    lon_cons = iris.Constraint(longitude=330.5)
    lat_cons = iris.Constraint(latitude=lambda l: -10 < l < -9)
    depth_cons = iris.Constraint(depth=lambda d: d <= 1000)
    theta_1000m = theta.extract(depth_cons & lon_cons & lat_cons)
    salinity_1000m = salinity.extract(depth_cons & lon_cons & lat_cons)

    # Plot these profiles on the same set of axes. In each case we call plot
    # with two arguments, the cube followed by the depth coordinate. Putting
    # them in this order places the depth coordinate on the y-axis.
    # The first plot is in the default axes. We'll use the same color for the
    # curve and its axes/tick labels.
    fig = plt.figure(figsize=(5, 6))
    temperature_color = (.3, .4, .5)
    ax1 = plt.gca()
    iplt.plot(theta_1000m, theta_1000m.coord('depth'), linewidth=2,
              color=temperature_color, alpha=.75)
    ax1.set_xlabel('Potential Temperature / K', color=temperature_color)
    ax1.set_ylabel('Depth / m')
    for ticklabel in ax1.get_xticklabels():
        ticklabel.set_color(temperature_color)
    # To plot salinity in the same axes we use twiny(). We'll use a different
    # color to identify salinity.
    salinity_color = (.6, .1, .15)
    ax2 = plt.gca().twiny()
    iplt.plot(salinity_1000m, salinity_1000m.coord('depth'), linewidth=2,
              color=salinity_color, alpha=.75)
    ax2.set_xlabel('Salinity / PSU', color=salinity_color)
    for ticklabel in ax2.get_xticklabels():
        ticklabel.set_color(salinity_color)
    plt.tight_layout()
    iplt.show()

    # Now plot a T-S diagram using scatter. We'll use all the profiles here,
    # and each point will be coloured according to its depth.
    plt.figure(figsize=(6, 6))
    depth_values = theta.coord('depth').points
    for s, t in iris.iterate.izip(salinity, theta, coords='depth'):
        iplt.scatter(s, t, c=depth_values, marker='+', cmap='RdYlBu_r')
    ax = plt.gca()
    ax.set_xlabel('Salinity / PSU')
    ax.set_ylabel('Potential Temperature / K')
    cb = plt.colorbar(orientation='horizontal')
    cb.set_label('Depth / m')
    plt.tight_layout()
    iplt.show()


if __name__ == '__main__':
    main()

Iris seeks to provide a powerful, easy to use, and community-driven Python library for analysing and visualising meteorological and oceanographic data sets.

With Iris you can:

 * Use a single API to work on your data, irrespective of its original format.
 * Read and write (CF-)netCDF, GRIB, and PP files.
 * Easily produce graphs and maps via integration with matplotlib and cartopy.
Iris makes use of a range of other libraries and python modules. These
dependencies must be in place before you can successfully install
Iris. Once you have satisfied the requirements detailed below,
extract the iris source package, cd to the new directory, and enter:

    python setup.py install

We strongly encourage people to contribute to Iris and for this type of
development activity an in-place build can be useful. Once you've cloned
the Iris git repository you can perform an in-place build by entering:
These are external packages which you will need to have installed before
installing and running Iris.

Many of these packages are available in Linux package managers
such as aptitude and yum. For example, it may be possible to install
Numpy using:

    apt-get install python-numpy

If you are installing dependencies with a package manager on Linux,
you may need to install the development packages (look for a "-dev"
postfix) in addition to the core packages.
These are optional packages which you may want to install to enable
additional Iris functionality such as plotting and
loading/saving GRIB. These packages are required for the full Iris test
suite to run.

(https://software.ecmwf.int/wiki/display/GRIB/Releases)
An API for the encoding and decoding of WMO FM-92 GRIB edition 1 and
edition 2 messages. A compression library such as Jasper is required
to read JPEG2000-compressed GRIB2 files.
The libmo_unpack library can be used by Iris for decoding/unpacking
PP files or Fields files that use an lbpack value of 1 or 4. This
library is open source, licensed under the 2-clause BSD licence.
It can be obtained from http://puma.nerc.ac.uk/trac/UM_TOOLS/wiki/unpack.

Use of this library is not enabled by default. If this library is
available its use can be enabled by installing Iris with the following
command:

    python setup.py --with-unpack install

Note that if this library and/or its associated header files are installed
in a custom location then additional compiler arguments may need to be
passed in to ensure that the Python extension module linking against it
builds correctly.

The default site configuration values can be overridden by creating the file
iris/etc/site.cfg. For example, the following snippet can be used to
specify a non-standard location for your udunits library:
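
A minimal sketch of such a site.cfg is shown below. The section and option names are assumptions for illustration (check them against the site configuration documentation shipped with your Iris version), and the path is only a placeholder:

```ini
[System]
# Placeholder path -- point this at wherever your udunits shared
# library actually lives on your system.
udunits_path = /opt/local/lib/libudunits2.so
```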
A package for handling multi-dimensional data and associated metadata.

Note

The Iris documentation has further usage information, including
a user guide which should be the first port of
call for new users.

The functions in this module provide the main way to load and/or save
your data.

The load() function provides a simple way to explore data from
the interactive Python prompt. It will convert the source data into
Cubes, and combine those cubes into
higher-dimensional cubes where possible.

The load_cube() and load_cubes() functions are similar to
load(), but they raise an exception if the number of cubes is not
what was expected. They are more useful in scripts, where they can
provide an early sanity check on incoming data.

The load_raw() function is provided for those occasions where the
automatic combination of cubes into higher-dimensional cubes is
undesirable. However, it is intended as a tool of last resort! If you
experience a problem with the automatic combination process then please
raise an issue with the Iris developers.

To persist a cube to the file-system, use the save() function.

All the load functions share very similar arguments:

uris:

    Either a single filename/URI expressed as a string, or an
    iterable of filenames/URIs.

    Filenames can contain ~ or ~user abbreviations, and/or
    Unix shell-style wildcards (e.g. * and ?). See the
    standard library function os.path.expanduser() and
    module fnmatch for more details.
-
-
-
-
-
constraints:
-
Either a single constraint, or an iterable of constraints.
-Each constraint can be either a CF standard name, an instance of
-iris.Constraint, or an instance of
-iris.AttributeConstraint.
-
For example:
-
# Load air temperature data.
load_cube(uri, 'air_temperature')

# Load data with a specific model level number.
load_cube(uri, iris.Constraint(model_level_number=1))

# Load data with a specific STASH code.
load_cube(uri, iris.AttributeConstraint(STASH='m01s00i004'))
-
-
-
-
-
-
-
callback:
-
A function to add metadata from the originating field and/or URI which
-obeys the following rules:
-
-
Function signature must be: (cube, field, filename).

Modifies the given cube in place, unless a new cube is
returned by the function.
This function is provided for those occasions where the automatic
-combination of cubes into higher-dimensional cubes is undesirable.
-However, it is intended as a tool of last resort! If you experience
-a problem with the automatic combination process then please raise
-an issue with the Iris developers.
-
For a full description of the arguments, please see the module
-documentation for iris.
target - A filename, or a writable object (depending on the file format).

When given a filename or file, Iris can determine the
file format.
-
-
-
-
-
-
Kwargs:
-
-
-
-
saver - Optional. Specifies the save function to use.

If omitted, Iris will attempt to determine the format.

This keyword can be used to implement a custom save
format. The function form must be:
my_saver(cube, target) plus any custom keywords. It
is assumed that a saver will accept an append keyword
if its file format can handle multiple cubes. See also
iris.io.add_saver().
-
-
-
-
-
-
All other keywords are passed through to the saver function; see the
-relevant saver documentation for more information on keyword arguments.
-
Examples:
-
# Save a cube to PP.
iris.save(my_cube, "myfile.pp")

# Save a cube list to a PP file, appending to the contents of the file
# if it already exists.
iris.save(my_cube_list, "myfile.pp", append=True)

# Save a cube to netCDF; defaults to the NETCDF4 file format.
iris.save(my_cube, "myfile.nc")

# Save a cube list to netCDF, using the NETCDF3_CLASSIC storage option.
iris.save(my_cube_list, "myfile.nc", netcdf_format="NETCDF3_CLASSIC")
-
Creates a new instance of a Constraint which can be used for filtering
-cube loading or cube list extraction.
-
Args:
-
-
-
name: string or None
-
If a string, it is used as the name to match against Cube.name().
-
-
-
-
-
cube_func: callable or None
-
If a callable, it must accept a Cube as its first and only argument
-and return either True or False.
-
-
-
-
-
coord_values: dict or None
-
If a dict, it must map coordinate name to the condition on the
-associated coordinate.
-
-
-
-
-
**kwargs:
-
The remaining keyword arguments are converted to coordinate
-constraints. The name of the argument gives the name of a
-coordinate, and the value of the argument is the condition to meet
-on that coordinate:
-
Constraint(model_level_number=10)
-
-
-
Coordinate level constraints can be of several types:
-
-
string, int or float - the value of the coordinate to match.
e.g. model_level_number=10

list of values - the possible values that the coordinate may
have to match. e.g. model_level_number=[10, 12]

callable - a function which accepts an
iris.coords.Cell instance as its first and only argument,
returning True or False if the value of the Cell is desired.
e.g. model_level_number=lambda cell: 5 < cell < 10
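The callable form is simply a predicate evaluated per cell. A plain-Python sketch (the ints here are hypothetical level values standing in for iris.coords.Cell, which supports the same comparisons):

```python
# A predicate like the "5 < cell < 10" condition above.
condition = lambda cell: 5 < cell < 10

# Hypothetical model level values; only strictly-between values pass.
levels = [1, 5, 6, 9, 10, 12]
selected = [level for level in levels if condition(level)]
```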
-
-
-
-
-
-
The user guide covers much of cube constraining in detail;
however, an example which uses all of the
features of this class is given here for completeness:
To adjust the values simply update the relevant attribute from
-within your code. For example:
-
iris.FUTURE.cell_datetime_objects = True
-
-
-
If Iris code is executed with multiple threads, note the values of
-these options are thread-specific.
-
The option cell_datetime_objects controls whether the
-iris.coords.Coord.cell() method returns time coordinate
-values as simple numbers or as time objects with attributes for
-year, month, day, etc. In particular, this allows one to express
-certain time constraints using a simpler, more transparent
-syntax, such as:
-
# To select all data defined at midday.
Constraint(time=lambda cell: cell.point.hour == 12)

# To ignore the 29th of February.
Constraint(time=lambda cell: cell.point.day != 29 or
                             cell.point.month != 2)
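Since cell.point behaves like a datetime here, the same predicates can be exercised against plain datetime.datetime values, independently of Iris (note the disjunction needed to exclude only the 29th of February, rather than all of February and every 29th):

```python
from datetime import datetime

# Predicates equivalent to the time constraints above.
is_midday = lambda point: point.hour == 12
not_feb_29 = lambda point: point.day != 29 or point.month != 2

checks = [
    is_midday(datetime(2000, 6, 1, 12, 0)),   # midday -> selected
    is_midday(datetime(2000, 6, 1, 9, 0)),    # morning -> rejected
    not_feb_29(datetime(2000, 2, 29)),        # leap day -> rejected
    not_feb_29(datetime(2000, 2, 28)),        # ordinary day -> kept
]
```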
-
The option netcdf_promote controls whether the netCDF loader
-will expose variables which define reference surfaces for
-dimensionless vertical coordinates as independent Cubes.
-
The option strict_grib_load controls whether GRIB files are
-loaded as Cubes using a new template-based conversion process.
-This new conversion process will raise an exception when it
-encounters a GRIB message which uses a template not supported
-by the conversion.
-
The option netcdf_no_unlimited, when True, changes the
-behaviour of the netCDF saver, such that no dimensions are set to
-unlimited. The current default is that the leading dimension is
-unlimited unless otherwise specified.
Return a context manager which allows temporary modification of
-the option values for the active thread.
-
On entry to the with statement, all keyword arguments are
-applied to the Future object. On exit from the with
-statement, the previous state is restored.
-
For example:
-
with iris.FUTURE.context():
    iris.FUTURE.cell_datetime_objects = True
    # ... code which expects time objects



Or more concisely:

with iris.FUTURE.context(cell_datetime_objects=True):
    # ... code which expects time objects
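The save-and-restore mechanic of such a context manager can be sketched with stdlib tools; this Options class is a stand-in for illustration, not Iris code:

```python
import contextlib

class Options:
    """Stand-in for an options object with a restoring context()."""
    def __init__(self):
        self.cell_datetime_objects = False

    @contextlib.contextmanager
    def context(self, **kwargs):
        # Save the current state, apply the overrides, restore on exit.
        saved = dict(self.__dict__)
        self.__dict__.update(kwargs)
        try:
            yield self
        finally:
            self.__dict__.clear()
            self.__dict__.update(saved)

opts = Options()
with opts.context(cell_datetime_objects=True):
    inside = opts.cell_datetime_objects
after = opts.cell_datetime_objects
```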
-
This module defines a suite of Aggregator instances,
-which are used to specify the statistical measure to calculate over a
-Cube, using methods such as
-aggregated_by() and collapsed().
-
The Aggregator is a convenience class that allows
-specific statistical aggregation operators to be defined and instantiated.
-These operators can then be used to collapse, or partially collapse, one or
-more dimensions of a Cube, as discussed in
-Cube statistics.
Additional kwargs associated with the use of this aggregator:
-
-
-
mdtol (float):
-
Tolerance of missing data. The value returned in each element of the
-returned array will be masked if the fraction of masked data contributing
-to that element exceeds mdtol. This fraction is calculated based on the
-number of masked elements. mdtol=0 means no missing data is tolerated
-while mdtol=1 means the resulting element will be masked if and only if
-all the contributing elements are masked. Defaults to 1.
-
-
-
-
-
weights (float ndarray):
-
Weights matching the shape of the cube or the length of the window
for rolling window operations. Note that latitude/longitude area
weights can be calculated using
iris.analysis.cartography.area_weights().
-
-
-
-
-
returned (boolean):
-
Set this to True to indicate that the collapsed weights are to be
-returned along with the collapsed data. Defaults to False.
-
-
-
-
-
For example:
-
To compute zonal means over the longitude axis of a cube:
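The Iris call itself is not reproduced here, but the mdtol masking rule described above can be sketched independently of Iris:

```python
# An element of the result is masked when the fraction of its
# contributing values that are masked exceeds mdtol.
def element_is_masked(n_masked, n_total, mdtol):
    return (n_masked / n_total) > mdtol

results = [
    element_is_masked(2, 10, mdtol=0.0),   # any missing data -> masked
    element_is_masked(2, 10, mdtol=0.5),   # within tolerance -> kept
    element_is_masked(10, 10, mdtol=1.0),  # never masked at mdtol=1
]
```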
An Aggregator instance that calculates
-the peak value derived from a spline interpolation over a
-Cube.
-
The peak calculation takes into account nan values. Therefore, if the number
-of non-nan values is zero the result itself will be an array of nan values.
-
The peak calculation also takes into account masked values. Therefore, if the
-number of non-masked values is zero the result itself will be a masked array.
-
If multiple coordinates are specified, then the peak calculations are
-performed individually, in sequence, for each coordinate specified.
An Aggregator instance that calculates the
-proportion, as a fraction, of Cube data occurrences
-that satisfy a particular criterion, as defined by a user supplied
-function.
-
Required kwargs associated with the use of this aggregator:
-
-
-
function (callable):
-
A function which converts an array of data values into a corresponding
-array of True/False values.
-
-
-
-
-
For example:
-
To compute the probability of precipitation exceeding 10
-(in cube data units) across ensemble members could be calculated with:
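The Iris call is not shown here; the core of the PROPORTION statistic, the fraction of values satisfying the supplied function, can be sketched in plain Python with hypothetical ensemble values:

```python
# Fraction of values for which the criterion function returns True.
def proportion(values, function):
    flags = [bool(function(value)) for value in values]
    return sum(flags) / len(flags)

# Hypothetical precipitation values for four ensemble members.
fraction = proportion([4.0, 11.0, 15.0, 2.0], lambda v: v > 10)
```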
An Aggregator instance that calculates
-the root mean square over a Cube, as computed by
-((x0**2 + x1**2 + ... + xN-1**2) / N) ** 0.5.
-
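The formula above in plain Python, for illustration:

```python
import math

def rms(values):
    # ((x0**2 + x1**2 + ... + xN-1**2) / N) ** 0.5
    return math.sqrt(sum(v * v for v in values) / len(values))

value = rms([3.0, 4.0])  # sqrt((9 + 16) / 2)
```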
Additional kwargs associated with the use of this aggregator:
-
-
-
weights (float ndarray):
-
Weights matching the shape of the cube or the length of the window for
-rolling window operations. The weights are applied to the squares when
-taking the mean.
-
-
-
-
-
For example:
-
To compute the zonal root mean square over the longitude axis of a cube:
Cell method definition formatter. Used in the fashion
-“cell_method.format(**kwargs)”, to produce a cell-method string
-which can include keyword values.
-
-
-
-
-
call_func (callable):
-
-
Call signature: (data, axis=None, **kwargs)
-
-
Data aggregation function.
-Returns an aggregation result, collapsing the ‘axis’ dimension of
-the ‘data’ argument.
-
-
-
-
-
Kwargs:
-
-
-
units_func (callable):
-
-
Call signature: (units)
-
-
If provided, called to convert a cube’s units.
-Returns an iris.units.Unit, or a
-value that can be made into one.
-
-
-
-
-
lazy_func (callable or None):
-
An alternative to call_func implementing a lazy
aggregation. Note that it need not support all features of the
main operation, but should raise an error in unhandled cases.
-
-
-
-
-
-
Additional kwargs:
-
Passed through to call_func and lazy_func.
-
-
Aggregators are used by cube aggregation methods such as
-collapsed() and
-aggregated_by(). For example:
A variety of ready-made aggregators are provided in this module, such
-as MEAN and MAX. Custom
-aggregators can also be created for special purposes, see
-Calculating a custom statistic for a worked example.
Keyword arguments are passed through to the data aggregation function
-(for example, the “percent” keyword for a percentile aggregator).
-This function is usually used in conjunction with update_metadata(),
-which should be passed the same keyword arguments.
-
Args:
-
-
-
data (array):
-
Data array.
-
-
-
-
-
axis (int):
-
Axis to aggregate over.
-
-
-
-
-
Kwargs:
-
-
-
mdtol (float):
-
Tolerance of missing data. The value returned will be masked if
the fraction of contributing data that is masked exceeds mdtol.
mdtol=0 means no missing data is tolerated, while mdtol=1 means
the value from the aggregation function is returned regardless
of missing data. Defaults to 1.
-
-
-
-
-
kwargs:
-
All keyword arguments apart from those specified above, are
-passed through to the data aggregation function.
Perform aggregation over the data with a lazy operation, analogous to
-the ‘aggregate’ result.
-
Keyword arguments are passed through to the data aggregation function
-(for example, the “percent” keyword for a percentile aggregator).
-This function is usually used in conjunction with update_metadata(),
-which should be passed the same keyword arguments.
-
Args:
-
-
-
data (array):
-
A lazy array (biggus.Array).
-
-
-
-
-
axis (int or list of int):
-
The dimensions to aggregate over – note that this is defined
-differently to the ‘aggregate’ method ‘axis’ argument, which only
-accepts a single dimension index.
-
-
-
-
-
Kwargs:
-
-
-
kwargs:
-
All keyword arguments are passed through to the data aggregation
-function.
-
-
-
-
-
-
Returns:
-
A lazy array representing the aggregation operation
-(biggus.Array).
The one or more coordinates that were aggregated over.
-
-
-
-
-
Kwargs:
-
-
This function is intended to be used in conjunction with aggregate()
-and should be passed the same keywords (for example, the “ddof”
-keyword from a standard deviation aggregator).
-
-
-
Returns:
-
The collapsed cube with its aggregated data payload.
This function is intended to be used in conjunction with aggregate()
and should be passed the same keywords (for example, the “ddof”
keyword for a standard deviation aggregator).
Create a weighted aggregator for the given call_func.
-
Args:
-
-
-
cell_method (string):
-
Cell method string that supports string format substitution.
-
-
-
-
-
call_func (callable):
-
Data aggregation function. Call signature (data, axis, **kwargs).
-
-
-
-
-
Kwargs:
-
-
-
units_func (callable):
-
Units conversion function.
-
-
-
-
-
lazy_func (callable or None):
-
An alternative to call_func implementing a lazy
aggregation. Note that it need not support all features of the
main operation, but should raise an error in unhandled cases.
Keyword arguments are passed through to the data aggregation function
-(for example, the “percent” keyword for a percentile aggregator).
-This function is usually used in conjunction with update_metadata(),
-which should be passed the same keyword arguments.
-
Args:
-
-
-
data (array):
-
Data array.
-
-
-
-
-
axis (int):
-
Axis to aggregate over.
-
-
-
-
-
Kwargs:
-
-
-
mdtol (float):
-
Tolerance of missing data. The value returned will be masked if
the fraction of contributing data that is masked exceeds mdtol.
mdtol=0 means no missing data is tolerated, while mdtol=1 means
the value from the aggregation function is returned regardless
of missing data. Defaults to 1.
-
-
-
-
-
kwargs:
-
All keyword arguments apart from those specified above, are
-passed through to the data aggregation function.
Perform aggregation over the data with a lazy operation, analogous to
-the ‘aggregate’ result.
-
Keyword arguments are passed through to the data aggregation function
-(for example, the “percent” keyword for a percentile aggregator).
-This function is usually used in conjunction with update_metadata(),
-which should be passed the same keyword arguments.
-
Args:
-
-
-
data (array):
-
A lazy array (biggus.Array).
-
-
-
-
-
axis (int or list of int):
-
The dimensions to aggregate over – note that this is defined
-differently to the ‘aggregate’ method ‘axis’ argument, which only
-accepts a single dimension index.
-
-
-
-
-
Kwargs:
-
-
-
kwargs:
-
All keyword arguments are passed through to the data aggregation
-function.
-
-
-
-
-
-
Returns:
-
A lazy array representing the aggregation operation
-(biggus.Array).
The one or more coordinates that were aggregated over.
-
-
-
-
-
Kwargs:
-
-
This function is intended to be used in conjunction with aggregate()
and should be passed the same keywords (for example, the “weights”
keyword from a mean aggregator).
-
-
-
Returns:
-
The collapsed cube with its aggregated data payload, or a tuple
pair of (cube, weights) if the keyword “returned” is specified
and True.
This function is intended to be used in conjunction with aggregate()
and should be passed the same keywords (for example, the “ddof”
keyword for a standard deviation aggregator).
This class describes the linear interpolation and regridding scheme for
-interpolating or regridding over one or more orthogonal coordinates,
-typically for use with iris.cube.Cube.interpolate() or
-iris.cube.Cube.regrid().
Creates a linear interpolator to perform interpolation over the
-given Cube specified by the dimensions of
-the given coordinates.
-
Typically you should use iris.cube.Cube.interpolate() for
-interpolating a cube. There are, however, some situations when
-constructing your own interpolator is preferable. These are detailed
-in the user guide.
The names or coordinate instances that are to be
-interpolated over.
-
-
-
-
-
-
Returns:
-
A callable with the interface:
-
-
callable(sample_points, collapse_scalar=True)
-
where sample_points is a sequence containing an array of values
-for each of the coordinates passed to this method, and
-collapse_scalar determines whether to remove length one
-dimensions in the result cube caused by scalar values in
-sample_points.
-
The values for coordinates that correspond to date/times
-may optionally be supplied as datetime.datetime or
-netcdftime.datetime instances.
-
For example, for the callable returned by:
Linear().interpolator(cube, ['latitude', 'longitude']),
sample_points must have the form
[new_lat_values, new_lon_values].
Creates a linear regridder to perform regridding from the source
-grid to the target grid.
-
Typically you should use iris.cube.Cube.regrid() for
-regridding a cube. There are, however, some situations when
-constructing your own regridder is preferable. These are detailed in
-the user guide.
This class describes the area-weighted regridding scheme for regridding
-over one or more orthogonal coordinates, typically for use with
-iris.cube.Cube.regrid().
Area-weighted regridding scheme suitable for regridding one or more
-orthogonal coordinates.
-
Kwargs:
-
-
-
mdtol (float):
-
Tolerance of missing data. The value returned in each element of
-the returned array will be masked if the fraction of missing data
-exceeds mdtol. This fraction is calculated based on the area of
-masked cells within each target cell. mdtol=0 means no masked
-data is tolerated while mdtol=1 will mean the resulting element
-will be masked if and only if all the overlapping elements of the
-source grid are masked. Defaults to 1.
Creates an area-weighted regridder to perform regridding from the
-source grid to the target grid.
-
Typically you should use iris.cube.Cube.regrid() for
-regridding a cube. There are, however, some situations when
-constructing your own regridder is preferable. These are detailed in
-the user guide.
This class describes the nearest-neighbour interpolation and regridding
-scheme for interpolating or regridding over one or more orthogonal
-coordinates, typically for use with iris.cube.Cube.interpolate()
-or iris.cube.Cube.regrid().
Creates a nearest-neighbour interpolator to perform
-interpolation over the given Cube specified
-by the dimensions of the specified coordinates.
-
Typically you should use iris.cube.Cube.interpolate() for
-interpolating a cube. There are, however, some situations when
-constructing your own interpolator is preferable. These are detailed
-in the user guide.
The names or coordinate instances that are to be
-interpolated over.
-
-
-
-
-
-
Returns:
-
A callable with the interface:
-
-
callable(sample_points, collapse_scalar=True)
-
where sample_points is a sequence containing an array of values
-for each of the coordinates passed to this method, and
-collapse_scalar determines whether to remove length one
-dimensions in the result cube caused by scalar values in
-sample_points.
-
The values for coordinates that correspond to date/times
-may optionally be supplied as datetime.datetime or
-netcdftime.datetime instances.
-
For example, for the callable returned by:
Nearest().interpolator(cube, ['latitude', 'longitude']),
sample_points must have the form
[new_lat_values, new_lon_values].
Creates a nearest-neighbour regridder to perform regridding from the
-source grid to the target grid.
-
Typically you should use iris.cube.Cube.regrid() for
-regridding a cube. There are, however, some situations when
-constructing your own regridder is preferable. These are detailed in
-the user guide.
Given a cube, calculate the difference between each value in the given coord’s direction.
-
Args:
-
-
-
coord
-
either a Coord instance or the unique name of a coordinate in the cube.
-If a Coord instance is provided, it does not necessarily have to exist in the cube.
Calculate the differential of a given cube with respect to the coord_to_differentiate.
-
Args:
-
-
-
coord_to_differentiate:
-
Either a Coord instance or the unique name of a coordinate which exists in the cube.
-If a Coord instance is provided, it does not necessarily have to exist on the cube.
The differential is calculated as:

d[i] = (v[i+1] - v[i]) / (c[i+1] - c[i])

where d is the differential, v is the data value, c is the coordinate value
and i is the index in the differential direction. Hence, in a normal
situation, if a cube has a shape (x: n; y: m), differentiating with respect
to x will result in a cube of shape (x: n-1; y: m) and differentiating with
respect to y will result in (x: n; y: m-1). If the coordinate to
differentiate is circular then the resultant shape will be the same as the
input cube.
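A plain-Python sketch of the first difference described above, with hypothetical values:

```python
# d[i] = (v[i+1] - v[i]) / (c[i+1] - c[i])
v = [2.0, 4.0, 8.0]   # data values
c = [0.0, 1.0, 3.0]   # coordinate values
d = [(v[i + 1] - v[i]) / (c[i + 1] - c[i]) for i in range(len(v) - 1)]
```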
-
In the returned cube the coord_to_differentiate object is
-redefined such that the output coordinate values are set to the
-averages of the original coordinate values (i.e. the mid-points).
-Similarly, the output lower bounds values are set to the averages of
-the original lower bounds values and the output upper bounds values
-are set to the averages of the original upper bounds values. In more
-formal terms:
-
-
C[i] = (c[i] + c[i+1]) / 2
-
B[i, 0] = (b[i, 0] + b[i+1, 0]) / 2
-
B[i, 1] = (b[i, 1] + b[i+1, 1]) / 2
-
-
where c and b represent the input coordinate values and bounds,
-and C and B the output coordinate values and bounds.
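The mid-point redefinition of the coordinate values, sketched with hypothetical values:

```python
# C[i] = (c[i] + c[i+1]) / 2
c = [0.0, 10.0, 20.0, 30.0]
C = [(c[i] + c[i + 1]) / 2 for i in range(len(c) - 1)]
```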
-
-
Note
-
Difference method used is the same as cube_delta() and therefore has the same limitations.
-
-
-
Note
-
Spherical differentiation does not occur in this routine.
When spherical calculus is used, i_cube is the phi vector component (e.g. eastward), j_cube is the theta component
-(e.g. northward) and k_cube is the radial component.
If False, weights are grid cell areas. If True, weights are grid
-cell areas divided by the total grid area.
-
-
-
-
-
The cube must have coordinates ‘latitude’ and ‘longitude’ with bounds.
-
Area weights are calculated for each lat/lon cell as:
-
-
-
Currently, only supports a spherical datum.
-Uses earth radius from the cube, if present and spherical.
-Defaults to iris.analysis.cartography.DEFAULT_SPHERICAL_EARTH_RADIUS.
Returns an array of latitude weights, with the same dimensions as
-the cube. The weights are the cosine of latitude.
-
These are n-dimensional latitude weights repeated over the dimensions
-not covered by the latitude coordinate.
-
The cube must have a coordinate with ‘latitude’ in the name. Out of
-range values (greater than 90 degrees or less than -90 degrees) will
-be clipped to the valid range.
-
Weights are calculated for each latitude as:
-
-
-
Examples:
-
Compute weights suitable for averaging type operations:
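The Iris call is not reproduced here, but since the weights are simply the cosine of latitude, the arithmetic can be sketched directly:

```python
import math

# weights = cos(latitude), with latitudes in degrees.
latitudes = [0.0, 60.0, 90.0]
weights = [math.cos(math.radians(lat)) for lat in latitudes]
```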
Nearest neighbour regrid to a specified target projection.
-
Return a new cube that is the result of projecting a cube with 1 or 2
-dimensional latitude-longitude coordinates from its coordinate system into
-a specified projection e.g. Robinson or Polar Stereographic.
-This function is intended to be used in cases where the cube’s coordinates
-prevent one from directly visualising the data, e.g. when the longitude
-and latitude are two dimensional and do not make up a regular grid.
An instance of the Cartopy Projection class, or an instance of
-iris.coord_systems.CoordSystem from which a projection
-will be obtained.
-
-
-
-
-
-
Kwargs:
-
-
-
nx
-
Desired number of sample points in the x direction for a domain
-covering the globe.
-
-
-
-
-
ny
-
Desired number of sample points in the y direction for a domain
-covering the globe.
-
-
-
-
-
-
Returns:
-
An instance of iris.cube.Cube and a list describing the
-extent of the projection.
-
-
-
Note
-
This function assumes global data and will if necessary extrapolate
-beyond the geographical extent of the source cube using a nearest
-neighbour approach. nx and ny then include those points which are
-outside of the target projection.
-
-
-
Note
-
Masked arrays are handled by passing their masked status to the
-resulting nearest neighbour values. If masked, the value in the
-resulting cube is set to 0.
-
-
-
Warning
-
This function uses a nearest neighbour approach rather than any form
-of linear/non-linear interpolation to determine the data value of each
-cell in the resulting cube. Consequently it may have an adverse effect
-on the statistics of the data e.g. the mean and standard deviation
-will not be preserved.
Transform wind vectors to a different coordinate system.
-
The input cubes contain U and V components parallel to the local X and Y
-directions of the input grid at each point.
-
The output cubes contain the same winds, at the same locations, but
-relative to the grid directions of a different coordinate system.
-Thus in vector terms, the magnitudes will always be the same, but the
-angles can be different.
-
The outputs retain the original horizontal dimension coordinates, but
-also have two 2-dimensional auxiliary coordinates containing the X and
-Y locations in the target coordinate system.
-
Args:
-
-
-
u_cube
-
An instance of iris.cube.Cube that contains the x-component
-of the vector.
-
-
-
-
-
v_cube
-
An instance of iris.cube.Cube that contains the y-component
-of the vector.
A (u', v') tuple of iris.cube.Cube instances that are the u
and v components in the requested target coordinate system.
The units are the same as the inputs.
-
-
-
Note
-
The U and V values relate to distance, with units such as ‘m s-1’.
-These are not the same as coordinate vectors, which transform in a
-different manner.
-
-
-
Note
-
The names of the output cubes are those of the inputs, prefixed with
-‘transformed_’ (e.g. ‘transformed_x_wind’).
-
-
-
Warning
-
Conversion between rotated-pole and non-rotated systems can be
-expressed analytically. However, this function always uses a numerical
-approach. In locations where this numerical approach does not preserve
-magnitude to an accuracy of 0.1%, the corresponding elements of the
-returned cubes will be masked.
The cube must have bounded horizontal coordinates.
-
-
Note
-
This routine works in Euclidean space. Area calculations do not
-account for the curvature of the Earth. And care must be taken to
-ensure any longitude values are expressed over a suitable interval.
-
-
-
Note
-
This routine currently does not handle all out-of-bounds cases
correctly. In cases where both the coordinate bounds and the
geometry’s bounds lie outside the physically realistic range
(i.e., abs(latitude) > 90, as is commonly the case when
bounds are constructed via guess_bounds()), the weights
calculation might be wrong. In this case, a UserWarning will
be issued.
A Cube containing a bounded, horizontal grid definition.
-
-
-
-
-
geometry (a shapely geometry instance):
-
The geometry of interest. To produce meaningful results this geometry
-must have a non-zero area. Typically a Polygon or MultiPolygon.
-
-
-
-
-
Kwargs:
-
-
-
normalize:
-
Calculate each individual cell weight as the cell area overlap between
-the cell and the given shapely geometry divided by the total cell area.
-Default is False.
Returns a new cube using data value(s) closest to the given coordinate point values.
-
The sample_points mapping does not have to include coordinate values corresponding to all data
-dimensions. Any dimensions unspecified will default to a full slice.
Return a cube of the linearly interpolated points given the desired
-sample points.
-
Given a list of tuple pairs mapping coordinates (or coordinate names)
-to their desired values, return a cube with linearly interpolated values.
-If more than one coordinate is specified, the linear interpolation will be
-carried out in sequence, thus providing n-linear interpolation
-(bi-linear, tri-linear, etc.).
-
If the input cube’s data is masked, the result cube will have a data
mask interpolated to the new sample points.
By definition, linear interpolation requires all coordinates to
-be 1-dimensional.
-
-
-
Note
-
If a specified coordinate is single valued its value will be
-extrapolated to the desired sample points by assuming a gradient of
-zero.
-
-
Args:
-
-
-
cube
-
The cube to be interpolated.
-
-
-
-
-
sample_points
-
List of one or more tuple pairs mapping coordinate to desired
-points to interpolate. Points may be a scalar or a numpy array
-of values. Multi-dimensional coordinates are not supported.
-
-
-
-
-
Kwargs:
-
-
extrapolation_mode - string - one of ‘linear’, ‘nan’ or ‘error’

If ‘linear’, the point will be calculated by extending the
gradient of the closest two points.

If ‘nan’, the extrapolated points will be set to NaN.

If ‘error’, a ValueError will be raised notifying of the
attempted extrapolation.
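The ‘linear’ mode's rule, extending the gradient of the closest two points beyond the sampled range, can be sketched with hypothetical sample values:

```python
# (x0, y0) and (x1, y1) are the two samples nearest the requested x.
def extrapolate_linear(x0, y0, x1, y1, x):
    gradient = (y1 - y0) / (x1 - x0)
    return y1 + gradient * (x - x1)

y = extrapolate_linear(0.0, 0.0, 1.0, 2.0, 2.0)  # gradient 2, one step beyond
```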
-
-
-
-
-
-
Note
-
If the source cube’s data, or any of its resampled coordinates,
-have an integer data type they will be promoted to a floating
-point data type in the result.
Returns the data value closest to the given coordinate point values.
-
The sample_points mapping must include coordinate values corresponding to all data
-dimensions.
-
For example:
-
>>> cube = iris.load_cube(iris.sample_data_path('air_temp.pp'))
>>> iris.analysis.interpolate.nearest_neighbour_data_value(cube, [('latitude', 0), ('longitude', 10)])
299.21564
>>> iris.analysis.interpolate.nearest_neighbour_data_value(cube, [('latitude', 0)])
Traceback (most recent call last):
...
ValueError: The sample points [('latitude', 0)] was not specific enough to return a single value from the cube.
-
Returns the indices to select the data value(s) closest to the given coordinate point values.
-
The sample_points mapping does not have to include coordinate values corresponding to all data
-dimensions. Any dimensions unspecified will default to a full slice.
Returns all the cubes re-gridded to the highest horizontal resolution.
-
Horizontal resolution is defined by the number of grid points/cells covering the horizontal plane.
See iris.analysis.interpolate.regrid() regarding the mode of interpolation.
Create an ifunc from a data function and units function.
-
Args:
-
-
data_func:
-
-
Function to be applied to one or two data arrays, which
-are given as positional arguments. Should return another
-data array, with the same shape as the first array.
-
Can also have keyword arguments.
-
-
-
units_func:
-
-
Function to calculate the unit of the resulting cube.
-Should take the cube(s) as input and return
-an instance of iris.unit.Unit.
Calculates the n-D Pearson’s r correlation
-cube over the dimensions associated with the
-given coordinates.
-
Returns a cube of the correlation between the two
-cubes along the dimensions of the given
-coordinates, at each point in the remaining
-dimensions of the cubes.
-
For example providing two time/altitude/latitude/longitude
-cubes and corr_coords of ‘latitude’ and ‘longitude’ will result
-in a time/altitude cube describing the latitude/longitude
-(i.e. pattern) correlation at each time/altitude point.
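At each remaining point the statistic is the ordinary Pearson correlation coefficient; a plain-Python sketch over two hypothetical 1-D samples:

```python
import math

def pearson_r(xs, ys):
    # Covariance of the samples divided by the product of their
    # standard deviations (scale factors cancel, so sums suffice).
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    std_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    std_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (std_x * std_y)

r = pearson_r([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # perfectly linear
```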
-
Args:
-
-
-
cube_a, cube_b (cubes):
-
Between which the correlation field will be calculated.
-Cubes should be the same shape and have the
-same dimension coordinates.
-
-
-
-
-
corr_coords (list of str):
-
The cube coordinate names over which to calculate
-correlations. If no names are provided then
-correlation will be calculated over all cube
-dimensions.
Represents a “factory” which can manufacture an additional auxiliary
-coordinate on demand, by combining the values of other coordinates.
-
Each concrete subclass represents a specific formula for deriving
-values from other coordinates.
-
The standard_name, long_name, var_name, units, attributes and
-coord_system of the factory are used to set the corresponding
-properties of the resulting auxiliary coordinates.
Represents a “factory” which can manufacture an additional auxiliary
-coordinate on demand, by combining the values of other coordinates.
-
Each concrete subclass represents a specific formula for deriving
-values from other coordinates.
-
The standard_name, long_name, var_name, units, attributes and
-coord_system of the factory are used to set the corresponding
-properties of the resulting auxiliary coordinates.
Provides access to Iris-specific configuration values.
-
The default configuration values can be overridden by creating the file
-iris/etc/site.cfg. If it exists, this file must conform to the format
-defined by ConfigParser.
Local directory where sample data exists. Defaults to “sample_data”
-sub-directory of the Iris package install directory. The sample data
-directory supports the Iris gallery. Directory contents accessed via
-iris.sample_data_path().
Local directory where test data exists. Defaults to “test_data”
-sub-directory of the Iris package install directory. The test data
-directory supports the subset of Iris unit tests that require data.
-Directory contents accessed via iris.tests.get_data_path().
Returns the directory path from the given option and section, or
-returns the given default value if the section/option is not present
-or does not represent a valid directory.
The other functions all implement specific common cases
-(e.g. add_day_of_month()).
-Currently, these are all calendar functions, so they only apply to
-“Time coordinates”.
List of seasons defined by month abbreviations. Each month must
-appear once and only once. Defaults to standard meteorological
-seasons (‘djf’, ‘mam’, ‘jja’, ‘son’).
Return a cartopy projection representing our native map.
-
-This will be the same as as_cartopy_crs() for map projections,
-but for spherical coord systems (which are not map projections)
-we use a map projection, such as PlateCarree.
The values (or value in the case of a scalar coordinate) of the
-coordinate for each cell.
-
-
-
-
-
Kwargs:
-
-
-
standard_name:
-
CF standard name of coordinate
-
-
-
-
-
long_name:
-
Descriptive name of coordinate
-
-
-
-
-
var_name:
-
CF variable name of coordinate
-
-
-
-
-
units:
-
The Unit of the coordinate’s values.
-Can be a string, which will be converted to a Unit object.
-
-
-
-
-
bounds:
-
An array of values describing the bounds of each cell. Given n
-bounds for each cell, the shape of the bounds array should be
-points.shape + (n,). For example, a 1d coordinate with 100 points
-and two bounds per cell would have a bounds array of shape
-(100, 2)
-
-
-
-
-
attributes:
-
A dictionary containing other cf and user-defined attributes.
Return the single Cell instance which results from slicing the
-points/bounds with the given index.
-
-
Note
-
If iris.FUTURE.cell_datetime_objects is True, then this
-method will return Cell objects whose points and bounds
-attributes contain either datetime.datetime instances or
-netcdftime.datetime instances (depending on the calendar).
This may be a different shape to the points of the coordinate
-being copied.
-
-
-
-
-
bounds: A bounds array for the new coordinate.
-
Given n bounds for each cell, the shape of the bounds array
-should be points.shape + (n,). For example, a 1d coordinate
-with 100 points and two bounds per cell would have a bounds
-array of shape (100, 2).
-
-
-
-
-
-
Note
-
If the points argument is specified and bounds are not, the
-resulting coordinate will have no bounds.
Add contiguous bounds to a coordinate, calculated from its points.
-
Puts a cell boundary at the specified fraction between each point and
-the next, plus extrapolated lowermost and uppermost bound points, so
-that each point lies within a cell.
-
With regularly spaced points, the resulting bounds will also be
-regular, and all points lie at the same position within their cell.
-With irregular points, the first and last cells are given the same
-widths as the ones next to them.
-
Kwargs:
-
-
-
bound_position:
-
-The desired position of the bounds relative to the position
-of the points.
-
-
-
-
-
-
Note
-
An error is raised if the coordinate already has bounds, is not
-one-dimensional, or is not monotonic.
-
-
-
Note
-
Unevenly spaced values, such as from a wrapped longitude range, can
-produce unexpected results: in such cases you should assign
-suitable values directly to the bounds property instead.
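The bounds calculation described above can be sketched in plain numpy (a sketch of the documented behaviour with made-up points, not Iris's actual guess_bounds implementation):

```python
import numpy as np

def sketch_guess_bounds(points, bound_position=0.5):
    # Place a cell boundary at bound_position between successive points,
    # extrapolating the lowermost and uppermost bounds.
    points = np.asarray(points, dtype=float)
    diffs = np.diff(points)
    # First and last cells reuse the widths of their neighbours.
    diffs = np.concatenate([diffs[:1], diffs, diffs[-1:]])
    mins = points - diffs[:-1] * bound_position
    maxs = points + diffs[1:] * (1 - bound_position)
    return np.column_stack([mins, maxs])

print(sketch_guess_bounds([0.0, 10.0, 20.0]))
```

With regularly spaced points such as these, each point sits at the same position within its cell and adjacent cells share a bound, so the result is contiguous.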
A single attribute key or iterable of attribute keys to ignore when
-comparing the coordinates. Default is None. To ignore all
-attributes, set this to other.attributes.
An immutable representation of a single cell of a coordinate, including the
-sample point and/or boundary position.
-
Notes on cell comparison:
-
Cells are compared in two ways, depending on whether they are
-compared to another Cell, or to a number/string.
-
Cell-Cell comparison is defined to produce a strict ordering. If
-two cells are not exactly equal (i.e. including whether they both
-define bounds or not) then they will have a consistent relative
-order.
-
Cell-number and Cell-string comparison is defined to support
-Constraint matching. The number/string will equal the Cell if, and
-only if, it is within the Cell (including on the boundary). The
-relative comparisons (lt, le, ..) are defined to be consistent with
-this interpretation. So for a given value n and Cell cell, only
-one of the following can be true:
-
-
n < cell
-
n == cell
-
n > cell
-
-
Similarly, n <= cell implies either n < cell or n == cell.
-And n >= cell implies either n > cell or n == cell.
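The comparison rules can be sketched with a toy class (a deliberately minimal stand-in, not iris.coords.Cell): a bounded cell equals a number exactly when the number lies within its bounds, including the boundary, and the orderings are consistent with that.

```python
class ToyCell:
    """Minimal stand-in for a bounded coordinate cell."""
    def __init__(self, point, bound=None):
        self.point = point
        self.bound = bound  # (min, max), or None for a point-only cell

    def __eq__(self, n):
        # A number equals the cell iff it lies within it (boundary included).
        if self.bound is None:
            return self.point == n
        lo, hi = self.bound
        return lo <= n <= hi

    def __lt__(self, n):
        # The whole cell lies below n.
        hi = self.point if self.bound is None else self.bound[1]
        return hi < n

    def __gt__(self, n):
        # The whole cell lies above n.
        lo = self.point if self.bound is None else self.bound[0]
        return lo > n

cell = ToyCell(5, bound=(0, 10))
# For any value n, exactly one of n < cell, n == cell, n > cell holds.
print(cell == 10, cell == 11)  # True False
```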
First it tries self.standard_name, then it tries the ‘long_name’
-attribute, then the ‘var_name’ attribute, before falling back to
-the value of default (which itself defaults to ‘unknown’).
Create a DimCoord with regularly spaced points, and
-optionally bounds.
-
The majority of the arguments are defined as for
-Coord.__init__(), but those which differ are defined below.
-
Args:
-
-
-
zeroth:
-
The value prior to the first point value.
-
-
-
-
-
step:
-
The numeric difference between successive point values.
-
-
-
-
-
count:
-
The number of point values.
-
-
-
-
-
Kwargs:
-
-
-
with_bounds:
-
If True, the resulting DimCoord will possess bound values
-which are equally spaced around the points. Otherwise no
-bounds values will be defined. Defaults to False.
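The arithmetic can be sketched as follows (a numpy sketch under the definitions above, with made-up values, not the actual DimCoord.from_regular code):

```python
import numpy as np

def regular_points(zeroth, step, count, with_bounds=False):
    # Points start one step after the "zeroth" value.
    points = zeroth + step * np.arange(1, count + 1)
    if not with_bounds:
        return points, None
    # Bounds equally spaced around each point.
    bounds = np.column_stack([points - step / 2.0, points + step / 2.0])
    return points, bounds

pts, bds = regular_points(zeroth=0.0, step=10.0, count=3, with_bounds=True)
print(pts)       # [10. 20. 30.]
print(bds[0])    # [ 5. 15.]
```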
Perform aggregation over the cube given one or more “group
-coordinates”.
-
A “group coordinate” is a coordinate where repeating values represent a
-single group, such as a month coordinate on a daily time slice.
-
The group coordinates must all be over the same cube dimension. Each
-common value group identified over all the group-by coordinates is
-collapsed using the provided aggregator.
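The grouping idea can be sketched with plain numpy (made-up values; the real method operates on cube coordinates and Iris aggregators):

```python
import numpy as np

# Hypothetical daily data with a repeating "month" group coordinate.
month = np.array(['jan', 'jan', 'feb', 'feb', 'feb'])
data = np.array([1.0, 3.0, 2.0, 4.0, 6.0])

# Each distinct group-coordinate value identifies one group, which is
# collapsed with the aggregator (a mean here).
groups = sorted(set(month))
means = {g: float(data[month == g].mean()) for g in groups}
print(means)  # {'feb': 4.0, 'jan': 2.0}
```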
Weighted aggregations support an optional weights keyword argument.
-If set, this should be supplied as an array of weights whose shape
-matches the cube. Values for latitude-longitude area weights may be
-calculated using iris.analysis.cartography.area_weights().
-
Some Iris aggregators support “lazy” evaluation, meaning that
-cubes resulting from this method may represent data arrays which are
-not computed until the data is requested (e.g. via cube.data or
-iris.save). If lazy evaluation exists for the given aggregator
-it will be used wherever possible when this cube’s data is itself
-a deferred array.
-
Args:
-
-
-
coords (string, coord or a list of strings/coords):
-
Coordinate names/coordinates over which the cube should be
-collapsed.
Conversely, operations which operate on more than one coordinate
-at the same time are commutative, as they are combined internally
-into a single operation. Hence the order of the coordinates
-supplied in the list does not matter:
Returns a tuple of the data dimensions relevant to the given
-coordinate.
-
When searching for the given coordinate in the cube the comparison is
-made using coordinate metadata equality. Hence the given coordinate
-instance need not exist on the cube, and may contain different
-coordinate values.
Return a list of coordinates in this cube fitting the given criteria.
-
Kwargs:
-
-
-
name_or_coord
-
Either
-
(a) a standard_name, long_name, or
-var_name. Defaults to value of default
-(which itself defaults to unknown) as defined in
-iris._cube_coord_common.CFVariableMixin.
Deprecated since version 1.6: please use the name_or_coord kwarg.
-
-
-
-
-
-
standard_name
-
The CF standard name of the desired coordinate. If None, does not
-check for standard name.
-
-
-
-
-
long_name
-
An unconstrained description of the coordinate. If None, does not
-check for long_name.
-
-
-
-
-
var_name
-
The CF variable name of the desired coordinate. If None, does not
-check for var_name.
-
-
-
-
-
attributes
-
A dictionary of attributes desired on the coordinates. If None,
-does not check for attributes.
-
-
-
-
-
axis
-
The desired coordinate axis, see
-iris.util.guess_coord_axis(). If None, does not check for
-axis. Accepts the values ‘X’, ‘Y’, ‘Z’ and ‘T’ (case-insensitive).
-
-
-
-
-
contains_dimension
-
The desired coordinate contains the data dimension. If None, does
-not check for the dimension.
-
-
-
-
-
dimensions
-
The exact data dimensions of the desired coordinate. Coordinates
-with no data dimension can be found with an empty tuple or list
-(i.e. () or []). If None, does not check for dimensions.
-
-
-
-
-
coord
-
-
-Deprecated since version 1.6: please use the name_or_coord kwarg.
-
-
-
-
-
-
coord_system
-
Whether the desired coordinates have coordinate systems equal to
-the given coordinate system. If None, no check is done.
-
-
-
-
-
dim_coords
-
Set to True to only return coordinates that are the cube’s
-dimension coordinates. Set to False to only return coordinates
-that are the cube’s auxiliary and derived coordinates. If None,
-returns all coordinates.
Interpolate from this Cube to the given
-sample points using the given interpolation scheme.
-
Args:
-
-
-
sample_points:
-
A sequence of (coordinate, points) pairs over which to
-interpolate. The values for coordinates that correspond to
-dates or times may optionally be supplied as datetime.datetime or
-netcdftime.datetime instances.
-
-
-
-
-
scheme:
-
The type of interpolation to use to interpolate from this
-Cube to the given sample points. The
-interpolation schemes currently available in Iris are:
collapse_scalar:
-
-Whether to collapse the dimension of scalar sample points
-in the resulting cube. Default is True.
-
-
-
-
-
-
Returns:
-
A cube interpolated at the given sample points.
-If collapse_scalar is True then the dimensionality of the cube
-will be the number of original cube dimensions minus
-the number of scalar coordinates.
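Along a single coordinate, a linear interpolation scheme behaves like ordinary 1d interpolation, which can be sketched with numpy (made-up values, not the Iris interpolation machinery):

```python
import numpy as np

# Known values at three latitude points, sampled at an intermediate one.
lat_points = np.array([0.0, 10.0, 20.0])
values = np.array([5.0, 15.0, 10.0])

result = np.interp(12.5, lat_points, values)
print(result)  # 13.75
```

A scalar sample point like this is where the collapse_scalar behaviour applies: with collapse_scalar True, the interpolated dimension is dropped from the result.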
keyword arguments, where the keyword name specifies the name
-of the coordinate (as defined in iris.cube.Cube.coords())
-and the value defines the corresponding range of coordinate
-values as a tuple. The tuple must contain two, three, or four
-items corresponding to: (minimum, maximum, min_inclusive,
-max_inclusive). Where the items are defined as:
-
-
-
minimum
-
The minimum value of the range to select.
-
-
-
-
-
maximum
-
The maximum value of the range to select.
-
-
-
-
-
min_inclusive
-
If True, coordinate values equal to minimum will be included
-in the selection. Default is True.
-
-
-
-
-
max_inclusive
-
If True, coordinate values equal to maximum will be included
-in the selection. Default is True.
-
-
-
-
-
-
-
To perform an intersection that ignores any bounds on the coordinates,
-set the optional keyword argument ignore_bounds to True. Defaults to
-False.
-
-
Note
-
For ranges defined over “circular” coordinates (i.e. those
-where the units attribute has a modulus defined) the cube
-will be “rolled” to fit where necessary.
-
-
-
Warning
-
Currently this routine only works with “circular”
-coordinates (as defined in the previous note).
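The (minimum, maximum, min_inclusive, max_inclusive) tuple form can be sketched as a simple range test (`in_range` is a hypothetical helper for illustration; the real intersection method also rolls circular coordinates):

```python
import numpy as np

def in_range(values, minimum, maximum, min_inclusive=True, max_inclusive=True):
    # Select values between minimum and maximum, honouring the
    # inclusivity flags for each end of the range.
    values = np.asarray(values)
    lower = values >= minimum if min_inclusive else values > minimum
    upper = values <= maximum if max_inclusive else values < maximum
    return lower & upper

lons = np.array([0, 90, 180, 270])
print(lons[in_range(lons, 0, 180, max_inclusive=False)])  # [ 0 90]
```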
A single attribute key or iterable of attribute keys to ignore when
-comparing the cubes. Default is None. To ignore all attributes set
-this to other.attributes.
This function does not indicate whether the two cubes can be
-merged, instead it checks only the four items quoted above for
-equality. Determining whether two cubes will merge requires
-additional logic that is beyond the scope of this method.
Return a biggus.Array representing the
-multi-dimensional data of the Cube, and optionally provide a
-new array of values.
-
Accessing this method will never cause the data to be loaded.
-Similarly, calling methods on, or indexing, the returned Array
-will not cause the Cube to have loaded data.
-
If the data have already been loaded for the Cube, the returned
-Array will be a biggus.NumpyArrayAdapter which wraps
-the numpy array from self.data.
-
Kwargs:
-
-
-
array (biggus.Array or None):
-
When this is not None it sets the multi-dimensional data of
-the cube to the given value.
-
-
-
-
-
-
Returns:
-
A biggus.Array representing the multi-dimensional
-data of the Cube.
A cube defined with the horizontal dimensions of the target grid
-and the other dimensions from this cube. The data values of
-this cube will be converted to values on the new grid
-according to the given regridding scheme.
Aggregator and aggregation function keyword arguments. The weights
-argument to the aggregator, if any, should be a 1d array with the
-same length as the chosen window.
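A weighted rolling-window mean of this kind can be sketched with numpy (made-up values, not the Iris rolling_window implementation):

```python
import numpy as np

# Weights array the same length as the chosen window (here, length 3).
data = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
weights = np.array([1.0, 2.0, 1.0])

# Each output value is the weighted mean over one window position.
window = np.lib.stride_tricks.sliding_window_view(data, 3)
result = (window * weights).sum(axis=1) / weights.sum()
print(result)
```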
Return an iterator of all subcubes given the coordinates or dimension
-indices desired to be present in each subcube.
-
Args:
-
-
-
ref_to_slice (string, coord, dimension index or a list of these):
-
Determines which dimensions will be returned in the subcubes (i.e.
-the dimensions that are not iterated over).
-A mix of input types can also be provided. They must all be
-orthogonal (i.e. point to different dimensions).
-
-
-
-
-
Kwargs:
-
-
-
ordered: if True, the order in which the coords to slice or data_dims
-
are given will be the order in which they represent the data in
-the resulting cube slices. If False, the order will follow that of
-the source cube. Default is True.
-
-
-
-
-
-
Returns:
-
An iterator of subcubes.
-
-
For example, to get all 2d longitude/latitude subcubes from a
-multi-dimensional cube:
Return an iterator of all subcubes along a given coordinate or
-dimension index, or multiple of these.
-
Args:
-
-
-
ref_to_slice (string, coord, dimension index or a list of these):
-
Determines which dimensions will be iterated along (i.e. the
-dimensions that are not returned in the subcubes).
-A mix of input types can also be provided.
-
-
-
-
-
-
Returns:
-
An iterator of subcubes.
-
-
For example, to get all subcubes along the time dimension:
The order of dimension references to slice along does not affect
-the order of returned items in the iterator; instead the ordering
-is based on the fastest-changing dimension.
The numpy.ndarray representing the multi-dimensional data of
-the cube.
-
-
Note
-
Cubes obtained from netCDF, PP, and FieldsFile files will only
-populate this attribute on its first use.
-
To obtain the shape of the data without causing it to be loaded,
-use the Cube.shape attribute.
-
-
-
Example::
-
>>> fname = iris.sample_data_path('air_temp.pp')
->>> cube = iris.load_cube(fname, 'air_temperature')
->>> # cube.data does not yet have a value.
-...
->>> print(cube.shape)
-(73, 96)
->>> # cube.data still does not have a value.
-...
->>> cube = cube[:10, :20]
->>> # cube.data still does not have a value.
-...
->>> data = cube.data
->>> # Only now is the data loaded.
-...
->>> print(data.shape)
-(10, 20)
-
Return a tuple of all the dimension coordinates, ordered by dimension.
-
-
Note
-
The length of the returned tuple is not necessarily the same as
-Cube.ndim as there may be dimensions on the cube without
-dimension coordinates. It is therefore unreliable to use the
-resulting tuple to identify the dimension coordinates for a given
-dimension - instead use the Cube.coord() method with the
-dimensions and dim_coords keyword arguments.
This combines cubes with a common dimension coordinate, but occupying
-different regions of the coordinate value. The cubes are joined across
-that dimension.
Contrast this with iris.cube.CubeList.merge(), which makes a new
-dimension from values of an auxiliary scalar coordinate.
-
-
Note
-
If time coordinates in the list of cubes have differing epochs then
-the cubes will not be able to be concatenated. If this occurs, use
-iris.util.unify_time_units() to normalise the epochs of the
-time coordinates so that the cubes can be concatenated.
Filter each of the cubes which can be filtered by the given
-constraints.
-
This method iterates over each constraint given, and subsets each of
-the cubes in this CubeList where possible. Thus, a CubeList of length
-n when filtered with m constraints can generate a maximum of
-m * n cubes.
-
Keywords:
-
-
-
strict (boolean):
-
If strict is True, then there must be exactly one cube which is
-filtered per constraint.
If time coordinates in the list of cubes have differing epochs then
-the cubes will not be able to be merged. If this occurs, use
-iris.util.unify_time_units() to normalise the epochs of the
-time coordinates so that the cubes can be merged.
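The epoch problem can be sketched with the standard library (an illustration of why unify_time_units is needed, not its implementation): the same instant has different numeric values under "days since" different epochs, so numeric time coordinates cannot be compared until they share one epoch.

```python
from datetime import datetime

# Two hypothetical time coordinates with different reference epochs.
epoch_a = datetime(2000, 1, 1)
epoch_b = datetime(1970, 1, 1)
instant = datetime(2000, 1, 11)

days_since_a = (instant - epoch_a).days   # 10
days_since_b = (instant - epoch_b).days   # 10967
print(days_since_a, days_since_b)
```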
First it tries self.standard_name, then it tries the ‘long_name’
-attribute, then the ‘var_name’ attribute, before falling back to
-the value of default (which itself defaults to ‘unknown’).
By default, the current figure will be used or a new figure instance
-created if no figure is available. See matplotlib.pyplot.gcf().
-
-
-
-
-
coords (list of Coord objects or coordinate names):
-
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
-
-
-
-
-
interval (int, float or long):
-
Defines the time interval in milliseconds between successive frames.
-A default interval of 100ms is set.
-
-
-
-
-
vmin, vmax (int, float or long):
-
Color scaling values, see matplotlib.colors.Normalize for
-further details. Default values are determined by the min-max across
-the data set over the entire sequence.
Delete cube attributes that are not identical over all cubes in a group.
-
This function simply deletes any attributes which are not the same for
-all the given cubes. The cubes will then have identical attributes. The
-given cubes are modified in-place.
This is a streamlined load operation, to be used only on fieldsfiles whose
-fields repeat regularly over the same vertical levels and times.
-The results are equivalent to those from iris.load(), but operation
-is substantially faster for suitable input.
-
The input files should conform to the following requirements:
-
-
the file must contain fields for all possible combinations of the
-vertical levels and time points found in the file.
-
-
the fields must occur in a regular repeating order within the file.
-
(For example: a sequence of fields for NV vertical levels, repeated
-for NP different forecast periods, repeated for NT different forecast
-times).
-
-
all other metadata must be identical across all fields of the same
-phenomenon.
-
-
-
Each group of fields with the same values of LBUSER4, LBUSER7 and LBPROC
-is identified as a separate phenomenon: These groups are processed
-independently and returned as separate result cubes.
-
-
Note
-
Each input file is loaded independently. Thus a single result cube can
-not combine data from multiple input files.
-
-
-
Note
-
-The resulting time-related coordinates (‘time’, ‘forecast_time’ and
-‘forecast_period’) may be mapped to shared cube dimensions and in some
-cases can also be multidimensional. However, the vertical level
-information must have a simple one-dimensional structure, independent
-of the time points, otherwise an error will be raised.
-
-
-
Note
-
Where input data does not have a fully regular arrangement, the
-corresponding result cube will have a single anonymous extra dimension
-which indexes over all the input fields.
-
This can happen if, for example, some fields are missing; or have
-slightly different metadata; or appear out of order in the file.
-
-
-
Warning
-
Any non-regular metadata variation in the input should be strictly
-avoided, as not all irregularities are detected, which can cause
-erroneous results.
Return a new cube with data values calculated using the area weighted
-mean of data values from src_cube regridded onto the horizontal grid
-of grid_cube.
-
This function requires that the horizontal grids of both cubes are
-rectilinear (i.e. expressed in terms of two orthogonal 1D coordinates)
-and that these grids are in the same coordinate system. This function
-also requires that the coordinates describing the horizontal grids
-all have bounds.
-
-
Note
-
-Elements in the data array of the returned cube that lie either partially
-or entirely outside of the horizontal extent of the src_cube will
-be masked irrespective of the value of mdtol.
-
-
Args:
-
-
-
src_cube:
-
An instance of iris.cube.Cube that supplies the data,
-metadata and coordinates.
-
-
-
-
-
grid_cube:
-
An instance of iris.cube.Cube that supplies the desired
-horizontal grid definition.
-
-
-
-
-
Kwargs:
-
-
-
mdtol:
-
Tolerance of missing data. The value returned in each element of the
-returned cube’s data array will be masked if the fraction of masked
-data in the overlapping cells of the source cube exceeds mdtol. This
-fraction is calculated based on the area of masked cells within each
-target cell. mdtol=0 means no missing data is tolerated while mdtol=1
-will mean the resulting element will be masked if and only if all the
-overlapping cells of the source cube are masked. Defaults to 0.
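The mdtol rule can be sketched as a simple threshold test on the masked area fraction (an illustration of the rule as described, not the Iris regridding code; exact boundary behaviour is an implementation detail):

```python
def target_masked(masked_area, total_area, mdtol):
    # A target cell is masked when the masked fraction of the
    # contributing source area exceeds the tolerance mdtol.
    return (masked_area / total_area) > mdtol

print(target_masked(0.3, 1.0, mdtol=0.0))  # True: no missing data tolerated
print(target_masked(0.3, 1.0, mdtol=0.5))  # False: 30% masked is within tolerance
```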
Return a new cube with the data values calculated using the weighted
-mean of data values from src_cube and the weights from
-weights regridded onto the horizontal grid of grid_cube.
-
This function requires that the src_cube has a curvilinear
-horizontal grid and the target grid_cube is rectilinear
-i.e. expressed in terms of two orthogonal 1D horizontal coordinates.
-Both grids must be in the same coordinate system, and the grid_cube
-must have horizontal coordinates that are both bounded and contiguous.
-
Note that, for any given target grid_cube cell, only the points
-from the src_cube that are bound by that cell will contribute to
-the cell result. The bounded extent of the src_cube will not be
-considered here.
-
A target grid_cube cell result will be calculated as the weighted
-mean, sum(data * weights) / sum(weights), over
-all src_cube points that are bound by that cell.
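For a single target cell this weighted mean can be sketched in numpy (made-up values for the source points falling within one cell):

```python
import numpy as np

# Source data values and their weights for points bound by one target cell.
src_data = np.array([2.0, 4.0, 6.0])
weights = np.array([1.0, 1.0, 2.0])

# Weighted mean: sum(data * weights) / sum(weights).
cell_value = (src_data * weights).sum() / weights.sum()
print(cell_value)  # 4.5
```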
-
-
Warning
-
-
Only 2D cubes are supported.
-
All coordinates that span the src_cube that don’t define
-the horizontal curvilinear grid will be ignored.
-
The iris.unit.Unit of the horizontal grid coordinates
-must be either degrees or radians.
-
-
-
Args:
-
-
-
src_cube:
-
An iris.cube.Cube instance that defines the source
-variable grid to be regridded.
-
-
-
-
-
weights:
-
A numpy.ndarray instance that defines the weights
-for the source variable grid cells. Must have the same shape
-as the src_cube.data.
-
-
-
-
-
grid_cube:
-
An iris.cube.Cube instance that defines the target
-rectilinear grid.
Define the target horizontal grid: Only the horizontal dimension
-coordinates are actually used.
-
-
-
-
-
-
Returns:
-
A new cube derived from source_cube, regridded onto the specified
-horizontal grid.
-
-
Any additional coordinates which map onto the horizontal dimensions are
-removed, while all other metadata is retained.
-If there are coordinate factories with 2d horizontal reference surfaces,
-the reference surfaces are also regridded, using ordinary bilinear
-interpolation.
-
-
Note
-
Both source and destination cubes must have two dimension coordinates
-identified with axes ‘X’ and ‘Y’ which share a coord_system with a
-Cartopy CRS.
-The grids are defined by iris.coords.Coord.contiguous_bounds() of
-these.
-
-
-
Note
-
Initialises the ESMF Manager, if it was not already called.
-This implements default Manager operations (e.g. logging).
-
To alter this, make a prior call to ESMF.Manager().
A string whose value represents the path to a file or
-URL to an OpenDAP resource conforming to the
-Unstructured Grid Metadata Conventions for Scientific Datasets
-https://github.com/ugrid-conventions/ugrid-conventions
-
-
-
-
-
name:
-
A string whose value is used as a cube loading constraint: matched
-against the standard name if found, then the long name, then the
-variable name, before falling back to the value of default (which
-itself defaults to “unknown”)
-
-
-
-
-
-
Returns:
-
An instance of iris.cube.Cube decorated with
-an instance of pyugrid.ugrid.Ugrid
-bound to an attribute of the cube called “mesh”
Write out any pending changes, and close the underlying file.
-
If the file was opened for update or creation then the current
-state of the fixed length header, the constant components (e.g.
-integer_constants, level_dependent_constants), and the list of
-fields are written to the file before closing. The process of
-writing to the file also updates the values in the fixed length
-header and fields which relate to layout within the file. For
-example, integer_constants_start and integer_constants_shape
-within the fixed length header, and the lbegin and lbnrec
-elements within the fields.
-
If the file was opened in read mode then no changes will be
-made.
-
After calling close() any subsequent modifications to any of
-the attributes will have no effect on the underlying file.
-
Calling close() more than once is allowed, but only the first
-call will have any effect.
Represents the FIXED_LENGTH_HEADER component of a UM FieldsFile
-variant.
-
Access to simple header items is provided via named attributes,
-e.g. fixed_length_header.sub_model. Other header items can be
-accessed via the raw attribute which provides a simple array
-view of the header.
ABF and ABL files are satellite file formats defined by Boston University.
-Including this module adds ABF and ABL loading to the session’s capabilities.
-
The documentation for this file format can be found
-here.
A CF-netCDF auxiliary coordinate variable is any netCDF variable that contains
-coordinate data, but is not a CF-netCDF coordinate variable by definition.
-
There is no relationship between the name of a CF-netCDF auxiliary coordinate
-variable and the name(s) of its dimension(s).
A CF-netCDF boundary variable is associated with a CF-netCDF variable that contains
-coordinate data. When a data value provides information about conditions in a cell
-occupying a region of space/time or some other dimension, the boundary variable
-provides a description of cell extent.
-
A CF-netCDF boundary variable will have one more dimension than its associated
-CF-netCDF coordinate variable or CF-netCDF auxiliary coordinate variable.
-
Identified by the CF-netCDF variable attribute ‘bounds’.
A CF-netCDF climatology variable is associated with a CF-netCDF variable that contains
-coordinate data. When a data value provides information about conditions in a cell
-occupying a region of space/time or some other dimension, the climatology variable
-provides a climatological description of cell extent.
-
A CF-netCDF climatology variable will have one more dimension than its associated
-CF-netCDF coordinate variable.
-
Identified by the CF-netCDF variable attribute ‘climatology’.
A CF-netCDF coordinate variable is a one-dimensional variable with the same name
-as its dimension, and it is defined as a numeric data type with values that are
-ordered monotonically. Missing values are not allowed in CF-netCDF coordinate
-variables. Also see [NUG] Section 2.3.1.
-
Identified by the above criteria; there is no associated CF-netCDF variable
-attribute.
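The monotonicity requirement can be checked with a simple sketch (illustrative, not Iris's own validation code):

```python
def is_monotonic(values):
    # A CF-netCDF coordinate variable's values must be ordered
    # monotonically: strictly increasing or strictly decreasing.
    increasing = all(a < b for a, b in zip(values, values[1:]))
    decreasing = all(a > b for a, b in zip(values, values[1:]))
    return increasing or decreasing
```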
A CF-netCDF grid mapping variable contains a list of specific attributes that
-define a particular grid mapping. A CF-netCDF grid mapping variable must contain
-the attribute ‘grid_mapping_name’.
-
Based on the value of the ‘grid_mapping_name’ attribute, there are associated
-standard names of CF-netCDF coordinate variables that contain the mapping’s
-independent variables.
-
Identified by the CF-netCDF variable attribute ‘grid_mapping’.
Create a FieldsFile header instance by reading the
-FIXED_LENGTH_HEADER section of the FieldsFile, making the names
-defined in FF_HEADER available as attributes of a FFHeader instance.
Returns a generator of GribWrapper
-fields from the given filename.
-
Args:
-
-
-
filename (string):
-
Name of the file to generate fields from.
-
-
-
-
-
Kwargs:
-
-
-
auto_regularise (True | False):
-
If True, any field defined on a reduced grid will be interpolated
-to an equivalent regular grid. If False, any field defined on a
-reduced grid will be loaded on the raw reduced grid with no shape
-information. If iris.FUTURE.strict_grib_load is True then this
-keyword has no effect and raw grids are always used; otherwise the
-default behaviour is to interpolate fields on a reduced grid to an
-equivalent regular grid.
-
-
Deprecated since version 1.8: Please use strict_grib_load and regrid instead.
Specialised dictionary object for making lookup tables.
-
Returns None for unknown keys (instead of raising an exception).
-Raises an exception for any attempt to change an existing entry
-(though it is still possible to remove keys).
pop(k[, d]) → v, remove specified key and return the corresponding value.
-
If the key is not found, d is returned if given; otherwise KeyError is raised.
-
popitem() → (k, v), remove and return some (key, value) pair as a
-2-tuple; raise KeyError if D is empty.
-
setdefault(k[, d]) → D.get(k, d), also setting D[k] = d if k is not in D.
-
update([E, ]**F) → None. Update D from dict/iterable E and F.
-
If E is present and has a .keys() method, does: for k in E: D[k] = E[k].
-If E is present and lacks a .keys() method, does: for (k, v) in E: D[k] = v.
-In either case, this is followed by: for k in F: D[k] = F[k].
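The behaviour described above can be sketched as a small dict subclass. The class name and example entries here are illustrative, not Iris's actual lookup-table class:

```python
class LookupDict(dict):
    # Illustrative sketch: a lookup table that returns None for
    # unknown keys and refuses to overwrite existing entries,
    # while still allowing deletion.
    def __missing__(self, key):
        return None

    def __setitem__(self, key, value):
        if key in self:
            raise KeyError("existing entry cannot be changed: %r" % (key,))
        dict.__setitem__(self, key, value)

table = LookupDict()
table["stash"] = "m01s00i004"  # hypothetical example entry
```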
The time range is taken from the ‘time’ coordinate bounds.
-If the cell-method coordinate is not ‘time’ itself, the type of statistic
-will not be derived and the save process will be aborted.
Save cube(s) to a netCDF file, given the cube and the filename.
-
-
Iris will write CF 1.5 compliant NetCDF files.
-
The attributes dictionaries on each cube in the saved cube list
-will be compared and common attributes saved as NetCDF global
-attributes where appropriate.
-
Keyword arguments specifying how to save the data are applied
-to each cube. To use different settings for different cubes, use
-the NetCDF Context manager (Saver) directly.
-
The save process will stream the data payload to the file using biggus,
-enabling large data payloads to be saved and maintaining the ‘lazy’
-status of the cube’s data payload, unless the netcdf_format is explicitly
-specified to be ‘NETCDF3’ or ‘NETCDF3_CLASSIC’.
Underlying netCDF file format, one of ‘NETCDF4’, ‘NETCDF4_CLASSIC’,
-‘NETCDF3_CLASSIC’ or ‘NETCDF3_64BIT’. Default is ‘NETCDF4’ format.
-
-
-
-
-
local_keys (iterable of strings):
-
An iterable of cube attribute keys. Any cube attributes with
-matching keys will become attributes on the data variable rather
-than global attributes.
-
-
-
-
unlimited_dimensions (iterable of strings and/or
-iris.coords.Coord objects):
-
-
Explicit list of coordinate names (or coordinate objects) corresponding
-to coordinate dimensions of cube to save with the NetCDF dimension
-variable length ‘UNLIMITED’. By default, the outermost (first)
-dimension for each cube is used. Only the ‘NETCDF4’ format supports
-multiple ‘UNLIMITED’ dimensions. To save no unlimited dimensions, use
-unlimited_dimensions=[] (an empty list).
-
-
-
-
zlib (bool):
-
If True, the data will be compressed in the netCDF file using gzip
-compression (default False).
-
-
-
-
-
complevel (int):
-
An integer between 1 and 9 describing the level of compression desired
-(default 4). Ignored if zlib=False.
-
-
-
-
-
shuffle (bool):
-
If True, the HDF5 shuffle filter will be applied before compressing
-the data (default True). This significantly improves compression.
-Ignored if zlib=False.
-
-
-
-
-
fletcher32 (bool):
-
If True, the Fletcher32 HDF5 checksum algorithm is activated to
-detect errors. Default False.
-
-
-
-
-
contiguous (bool):
-
If True, the variable data is stored contiguously on disk. Default
-False. Setting to True for a variable with an unlimited dimension
-will trigger an error.
-
-
-
-
-
chunksizes (tuple of int):
-
Used to manually specify the HDF5 chunksizes for each dimension of the
-variable. A detailed discussion of HDF chunking and I/O performance is
-available here: http://www.hdfgroup.org/HDF5/doc/H5.user/Chunking.html.
-Basically, you want the chunk size for each dimension to match as
-closely as possible the size of the data block that users will read
-from the file. chunksizes cannot be set if contiguous=True.
-
-
-
-
-
endian (string):
-
Used to control whether the data is stored in little or big endian
-format on disk. Possible values are ‘little’, ‘big’ or ‘native’
-(default). The library will automatically handle endian conversions
-when the data is read, but if the data is always going to be read on a
-computer with the opposite format as the one used to create the file,
-there may be some performance advantage to be gained by setting the
-endian-ness.
-
-
-
-
-
least_significant_digit (int):
-
If least_significant_digit is specified, variable data will be
-truncated (quantized). In conjunction with zlib=True this produces
-‘lossy’, but significantly more efficient compression. For example, if
-least_significant_digit=1, data will be quantized using
-numpy.around(scale*data)/scale, where scale = 2**bits, and bits
-is determined so that a precision of 0.1 is retained (in this case
-bits=4). From
-http://www.cdc.noaa.gov/cdc/conventions/cdc_netcdf_standard.shtml:
-“least_significant_digit – power of ten of the smallest decimal place
-in unpacked data that is a reliable value”. Default is None, or no
-quantization, or ‘lossless’ compression.
-
-
-
-
-
-
Returns:
-
None.
-
-
-
Note
-
The zlib, complevel, shuffle, fletcher32, contiguous,
-chunksizes and endian keywords are silently ignored for netCDF 3
-files that do not use HDF5.
Deprecated since version 1.8.0: NetCDF default saving behaviour currently assigns the outermost
-dimension as unlimited. This behaviour is to be deprecated, in
-favour of no automatic assignment. To switch to the new behaviour,
-set iris.FUTURE.netcdf_no_unlimited to True.
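The least_significant_digit quantization described above can be sketched in plain Python; bits is derived so that a precision of 10**-digits survives, matching the convention quoted from the CDC page (this helper is illustrative, not the saver's actual code path):

```python
import math

def quantize(value, least_significant_digit):
    # scale = 2**bits, with bits chosen so that a precision of
    # 10**-least_significant_digit is retained (digit=1 -> bits=4).
    bits = int(math.ceil(math.log(10.0 ** least_significant_digit, 2)))
    scale = 2.0 ** bits
    return round(scale * value) / scale
```

With zlib compression enabled, the trailing zero bits produced by this truncation are what make the "lossy" compression so much more effective.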
Flag for whether the data should be read. If False, an empty data manager
-will be provided which can subsequently load the data on demand. Default False.
-
-
-
-
-
To iterate through all of the fields in a pp file:
append - Whether to start a new file afresh or add the cube(s) to the end of the file.
-
Only applicable when target is a filename, not a file handle.
-Default is False.
-
-
-
-
-
field_coords - list of 2 coords or coord names which are to be used for
-
reducing the given cube into 2d slices, which will ultimately
-determine the x and y coordinates of the resulting fields.
-If None, the final two dimensions are chosen for slicing.
Save the PPField to the given file object (typically created with open()).
-
# to append the field to a file
-a_pp_field.save(open(filename,'ab'))
-
-# to overwrite/create a file
-a_pp_field.save(open(filename,'wb'))
-
-
-
-
Note
-
The fields which are automatically calculated are: ‘lbext’,
-‘lblrec’ and ‘lbuser[0]’. Some fields are not currently
-populated, these are: ‘lbegin’, ‘lbnrec’, ‘lbuser[1]’.
Uncompress PP field data that has been compressed using Run Length Encoding.
-
Provides access to the libmo_unpack library function runlenDecode.
-Decodes the field by expanding out the missing data points represented
-by a single missing data value followed by a value indicating the length
-of the run of missing data values.
-
Args:
-
-
-
data (numpy.ndarray):
-
The raw field byte array to be uncompressed.
-
-
-
-
-
lbrow (int):
-
The number of rows in the grid.
-
-
-
-
-
lbnpt (int):
-
The number of points (columns) per row in the grid.
-
-
-
-
-
bmdi (float):
-
The value used in the field to indicate missing data points.
-
-
-
-
-
-
Returns:
-
numpy.ndarray, 2d array containing normal uncompressed field data.
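The decoding scheme can be sketched in plain Python. The real implementation calls libmo_unpack's runlenDecode; this helper is illustrative only:

```python
def runlen_decode(compressed, bmdi):
    # Each occurrence of the missing-data indicator (bmdi) is followed
    # by a count giving the length of the run of missing points.
    expanded = []
    values = iter(compressed)
    for value in values:
        if value == bmdi:
            run_length = int(next(values))
            expanded.extend([bmdi] * run_length)
        else:
            expanded.append(value)
    return expanded
```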
Return the number of non-overlapping occurrences of substring sub in
-string S[start:end]. Optional arguments start and end are interpreted
-as in slice notation.
Decodes S using the codec registered for encoding. encoding defaults
-to the default encoding. errors may be given to set a different error
-handling scheme. Default is ‘strict’ meaning that encoding errors raise
-a UnicodeDecodeError. Other possible values are ‘ignore’ and ‘replace’
-as well as any other name registered with codecs.register_error that is
-able to handle UnicodeDecodeErrors.
Encodes S using the codec registered for encoding. encoding defaults
-to the default encoding. errors may be given to set a different error
-handling scheme. Default is ‘strict’ meaning that encoding errors raise
-a UnicodeEncodeError. Other possible values are ‘ignore’, ‘replace’ and
-‘xmlcharrefreplace’ as well as any other name registered with
-codecs.register_error that is able to handle UnicodeEncodeErrors.
Return True if S ends with the specified suffix, False otherwise.
-With optional start, test S beginning at that position.
-With optional end, stop comparing S at that position.
-suffix can also be a tuple of strings to try.
Return the lowest index in S where substring sub is found,
-such that sub is contained within S[start:end]. Optional
-arguments start and end are interpreted as in slice notation.
Return True if S is a titlecased string and there is at least one
-character in S, i.e. uppercase characters may only follow uncased
-characters and lowercase characters only cased ones. Return False
-otherwise.
Return a copy of the string S with leading whitespace removed.
-If chars is given and not None, remove characters in chars instead.
-If chars is unicode, S will be converted to unicode before stripping.
Search for the separator sep in S, and return the part before it,
-the separator itself, and the part after it. If the separator is not
-found, return S and two empty strings.
Return a copy of string S with all occurrences of substring
-old replaced by new. If the optional argument count is
-given, only the first count occurrences are replaced.
Return the highest index in S where substring sub is found,
-such that sub is contained within S[start:end]. Optional
-arguments start and end are interpreted as in slice notation.
Search for the separator sep in S, starting at the end of S, and return
-the part before it, the separator itself, and the part after it. If the
-separator is not found, return two empty strings and S.
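For example, partition and rpartition differ only in which occurrence of the separator splits the string:

```python
s = "name=value=extra"
# partition splits at the first separator occurrence:
assert s.partition("=") == ("name", "=", "value=extra")
# rpartition splits at the last occurrence:
assert s.rpartition("=") == ("name=value", "=", "extra")
# If the separator is absent, partition returns S plus two empty strings:
assert "abc".partition("=") == ("abc", "", "")
```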
Return a list of the words in the string S, using sep as the
-delimiter string, starting at the end of the string and working
-to the front. If maxsplit is given, at most maxsplit splits are
-done. If sep is not specified or is None, any whitespace string
-is a separator.
Return a copy of the string S with trailing whitespace removed.
-If chars is given and not None, remove characters in chars instead.
-If chars is unicode, S will be converted to unicode before stripping.
Return a list of the words in the string S, using sep as the
-delimiter string. If maxsplit is given, at most maxsplit
-splits are done. If sep is not specified or is None, any
-whitespace string is a separator and empty strings are removed
-from the result.
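The difference between split and rsplit shows up when maxsplit is given; split consumes separators from the left, rsplit from the right:

```python
s = "a,b,c,d"
# At most two splits from the left:
assert s.split(",", 2) == ["a", "b", "c,d"]
# At most two splits from the right:
assert s.rsplit(",", 2) == ["a,b", "c", "d"]
```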
Return True if S starts with the specified prefix, False otherwise.
-With optional start, test S beginning at that position.
-With optional end, stop comparing S at that position.
-prefix can also be a tuple of strings to try.
Return a copy of the string S with leading and trailing
-whitespace removed.
-If chars is given and not None, remove characters in chars instead.
-If chars is unicode, S will be converted to unicode before stripping.
Return a copy of the string S, where all characters occurring
-in the optional argument deletechars are removed, and the
-remaining characters have been mapped through the given
-translation table, which must be a string of length 256 or None.
-If the table argument is None, no translation is applied and
-the operation simply removes the characters in deletechars.
Takes a list of filenames which may also be globs, and optionally a
-constraint set and a callback function, and returns a
-generator of Cubes from the given files.
-
-
Note
-
Typically, this function should not be called directly; instead, the
-intended interface for loading is iris.load().
target - A filename (or writable, depending on file format).
-
When given a filename or file, Iris can determine the
-file format.
-
-
-
-
-
-
Kwargs:
-
-
-
-
saver - Optional. Specifies the save function to use.
-
If omitted, Iris will attempt to determine the format.
-
This keyword can be used to implement a custom save
-format. Function form must be:
-my_saver(cube,target) plus any custom keywords. It
-is assumed that a saver will accept an append keyword
-if its file format can handle multiple cubes. See also
-iris.io.add_saver().
-
-
-
-
-
-
All other keywords are passed through to the saver function; see the
-relevant saver documentation for more information on keyword arguments.
-
Examples:
-
# Save a cube to PP
-iris.save(my_cube,"myfile.pp")
-
-# Save a cube list to a PP file, appending to the contents of the file
-# if it already exists
-iris.save(my_cube_list,"myfile.pp",append=True)
-
-# Save a cube to netCDF, defaults to NETCDF4 file format
-iris.save(my_cube,"myfile.nc")
-
-# Save a cube list to netCDF, using the NETCDF4_CLASSIC storage option
-iris.save(my_cube_list,"myfile.nc",netcdf_format="NETCDF3_CLASSIC")
-
The FormatAgent class is the containing object which is responsible for identifying the format of a given file
-by interrogating its child FormatSpecification instances.
-
Typically a FormatAgent will be created empty and then extended with the FormatAgent.add_spec() method:
Pick the first FormatSpecification which can handle the given
-filename and file/buffer object.
-
-
Note
-
buffer_obj may be None when a seekable file handle is not
-feasible (such as over the http protocol). In these cases only the
-format specifications which do not require a file handle are
-tested.
Every FormatSpecification instance has a name which can be accessed with the FormatSpecification.name property and
-a FileElement, such as filename extension or 32-bit magic number, with an associated value for format identification.
Return an iterator for iterating over a collection of cubes in step.
-
If the input cubes have dimensions for which there are no common
-coordinates, those dimensions will be treated as orthogonal. The
-resulting iterator will step through combinations of the associated
-coordinates.
One or more iris.cube.Cube instances over which to iterate in
-step. Each cube should be provided as a separate argument e.g.
-iris.iterate.izip(cube_a,cube_b,cube_c,...).
-
-
-
-
-
Kwargs:
-
-
-
coords (string, coord or a list of strings/coords):
-
Coordinate names/coordinates of the desired subcubes (i.e. those
-that are not iterated over). They must all be orthogonal (i.e. point
-to different dimensions).
-
-
-
-
-
ordered (Boolean):
-
If True (default), the order of the coordinates in the resulting
-subcubes will match the order of the coordinates in the coords
-keyword argument. If False, the order of the coordinates will
-be preserved and will match that of the input cubes.
-
-
-
-
-
-
Returns:
-
An iterator over a collection of tuples that contain the resulting
-subcubes.
Homogenize the input value for easy and efficient normalization.
-
value can be a scalar or sequence.
-
Returns result, is_scalar, where result is a
-masked array matching value. Float dtypes are preserved;
-integer types with two bytes or smaller are converted to
-np.float32, and larger types are converted to np.float.
-Preserving float32 when possible, and using in-place operations,
-can greatly improve speed for large arrays.
-
Experimental; we may want to add an option to force the
-use of float32.
Defaults to True. Must be True for masked data
-and some data types (see notes below).
-
-
-
-
-
-
-
Note
-
This function will copy your data by default.
-If you have a large array that cannot be copied,
-make sure it is not masked and use copy=False.
-
-
-
Note
-
Pandas will sometimes make a copy of the array,
-for example when creating from an int32 array.
-Iris will detect this and raise an exception if copy=False.
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Draws a line plot based on the given cube(s) or coordinate(s).
-
The first one or two arguments may be cubes or coordinates to plot.
-Each of the following is valid:
-
# plot a 1d cube against its dimension coordinate
-plot(cube)
-
-# plot a 1d coordinate
-plot(coord)
-
-# plot a 1d cube against a given 1d coordinate, with the cube
-# values on the y-axis and the coordinate on the x-axis
-plot(coord,cube)
-
-# plot a 1d cube against a given 1d coordinate, with the cube
-# values on the x-axis and the coordinate on the y-axis
-plot(cube,coord)
-
-# plot two 1d coordinates against one-another
-plot(coord1,coord2)
-
-# plot two 1d cubes against one-another
-plot(cube1,cube2)
-
Use the given coordinates as the axes for the plot. The order of the
-given coordinates indicates which axis to use for each, where the first
-element is the horizontal axis of the plot and the second element is
-the vertical axis of the plot.
Provision of a service to handle missing packages at runtime.
-Currently just a very thin layer, but this gives the option to extend
-the handling as much as needed.
Attempt the import else use the proxy module.
-It is important to note that ‘__import__()’ must be used
-instead of the higher-level ‘import’ as we need to
-ensure the scope of the import can be propagated out of this package.
-Also, note the splitting of name - this is because ‘__import__()’
-requires full package path, unlike ‘import’ (this issue is
-explicitly seen in lib/iris/fileformats/pp.py importing pp_packing)
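The approach described can be sketched as follows. The function name is illustrative, not the package's actual API; note how the dotted name must be walked because ‘__import__()’ returns only the top-level package:

```python
def try_import(name, proxy):
    # __import__ returns the top-level package, so walk down the
    # dotted path to reach the requested submodule; fall back to
    # the given proxy object if the real package is unavailable.
    try:
        module = __import__(name)
    except ImportError:
        return proxy
    for part in name.split(".")[1:]:
        module = getattr(module, part)
    return module
```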
This file contains a dictionary of standard value names that are mapped
-to another dictionary of other standard name attributes. Currently only
-the canonical_unit exists in these attribute dictionaries.
-
This file is automatically generated. Do not edit this file by hand.
-
-
The file will be generated during a standard build/installation:
-
python setup.py build
-python setup.py install
-
Also, the file can be re-generated in the source distribution via:
-
python setup.py std_names
-
Or for more control (e.g. to use an alternative XML file) via:
A PartialDateTime object specifies values for some subset of
-the calendar/time fields (year, month, hour, etc.) for comparing
-with datetime.datetime-like instances.
-
Comparisons are defined against any other class with all of the
-attributes: year, month, day, hour, minute, and second.
-Notably, this includes datetime.datetime and
-netcdftime.datetime. Comparison also extends to the
-microsecond attribute for classes, such as
-datetime.datetime, which define it.
-
A PartialDateTime object is not limited to any particular
-calendar, so no restriction is placed on the range of values
-allowed in its component fields. Thus, it is perfectly legitimate to
-create an instance as: PartialDateTime(month=2, day=30).
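The comparison behaviour can be sketched with a minimal stand-in class (illustrative only, not Iris's implementation, which also supports ordering comparisons):

```python
import datetime

class PartialDate(object):
    # Only the fields supplied at construction are compared;
    # unset fields match any value on the other object.
    def __init__(self, **fields):
        self.fields = fields

    def __eq__(self, other):
        return all(getattr(other, name) == value
                   for name, value in self.fields.items())

# Matches any datetime in July, regardless of year, day or time:
assert PartialDate(month=7) == datetime.datetime(2014, 7, 1, 12, 0)
```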
A class to represent S.I. units and support common operations to
-manipulate such units in a consistent manner as per UDUNITS-2.
-
These operations include scaling the unit, offsetting the unit by a
-constant or time, inverting the unit, raising the unit by a power,
-taking a root of the unit, taking a log of the unit, multiplying the
-unit by a constant or another unit, dividing the unit by a constant
-or another unit, comparing units, copying units and converting unit
-data to single precision or double precision floating point numbers.
-
-This class also supports time and calendar definition and manipulation.
An optional calendar may be provided for a unit which defines a
-time reference of the form ‘<time-unit> since <time-origin>’
-i.e. unit=’days since 1970-01-01 00:00:00’. For a unit that is a
-time reference, the default calendar is ‘standard’.
-
Accepted calendars are as follows,
-
-
‘standard’ or ‘gregorian’ - Mixed Gregorian/Julian calendar as
-defined by udunits.
-
-
‘proleptic_gregorian’ - A Gregorian calendar extended to dates
-before 1582-10-15. A year is a leap year if either,
-
-
-
It is divisible by 4 but not by 100, or
-
It is divisible by 400.
-
-
-
-
‘noleap’ or ‘365_day’ - A Gregorian calendar without leap
-years i.e. all years are 365 days long.
-
-
‘all_leap’ or ‘366_day’ - A Gregorian calendar with every year
-being a leap year i.e. all years are 366 days long.
-
-
‘360_day’ - All years are 360 days divided into 30 day months.
-
-
‘julian’ - Proleptic Julian calendar, extended to dates after
-1582-10-5. A year is a leap year if it is divisible by 4.
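The leap-year rules above can be written out directly (a sketch, not udunits code):

```python
def is_leap_proleptic_gregorian(year):
    # Leap iff divisible by 4 but not by 100, or divisible by 400.
    return (year % 4 == 0 and year % 100 != 0) or year % 400 == 0

def is_leap_julian(year):
    # In the Julian calendar every fourth year is a leap year.
    return year % 4 == 0
```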
-
-
-
Args:
-
-
-
unit:
-
Specify the unit as defined by UDUNITS-2.
-
-
-
-
-
calendar (string):
-
Describes the calendar used in time calculations. The
-default is ‘standard’ or ‘gregorian’ for a time reference
-unit.
-
-
-
-
-
Returns:
-
-
Unit object.
-
Units should be set to “no_unit” for values which are strings.
-Units can also be set to “unknown” (or None).
Converts a single value or numpy array of values from the current unit
-to the other target unit.
-
If the units are not convertible, then no conversion will take place.
-
Args:
-
-
-
value (int/float/long/numpy.ndarray):
-
Value/s to be converted.
-
-
-
-
-
other (string/Unit):
-
Target unit to convert to.
-
-
-
-
-
ctype (ctypes.c_float/ctypes.c_double):
-
Floating point 32-bit single-precision (iris.unit.FLOAT32) or
-64-bit double-precision (iris.unit.FLOAT64) of conversion. The
-default is 64-bit double-precision conversion.
Returns a datetime-like object calculated from the numeric time
-value using the current calendar and the unit time reference.
-
The current unit time reference must be of the form:
-‘<time-unit> since <time-origin>’
-i.e. ‘hours since 1970-01-01 00:00:00’
-
The datetime objects returned are ‘real’ Python datetime objects
-if the date falls in the Gregorian calendar (i.e. the calendar
-is ‘standard’, ‘gregorian’, or ‘proleptic_gregorian’ and the
-date is after 1582-10-15). Otherwise a ‘phoney’ datetime-like
-object (netcdftime.datetime) is returned which can handle dates
-that don’t exist in the Proleptic Gregorian calendar.
-
Works for scalars, sequences and numpy arrays. Returns a scalar
-if input is a scalar, else returns a numpy array.
-
Args:
-
-
time_value (float): Numeric time value/s. Maximum resolution
-is 1 second.
Returns a netcdftime.utime object which performs conversions of
-numeric time values to/from datetime objects given the current
-calendar and unit time reference.
-
The current unit time reference must be of the form:
-‘<time-unit> since <time-origin>’
-i.e. ‘hours since 1970-01-01 00:00:00’
Return numeric time value (resolution of 1 second) encoding of
-datetime object.
-
The units of the numeric time values are described by the unit and
-calendar arguments. The datetime objects must be in UTC with no
-time-zone offset. If there is a time-zone offset in unit, it will be
-applied to the returned numeric values.
-
Like the matplotlib.dates.date2num() function, except that it allows
-for different units and calendars. Behaves the same as if
-unit = ‘days since 0001-01-01 00:00:00’ and
-calendar = ‘proleptic_gregorian’.
-
Args:
-
-
-
date (datetime):
-
A datetime object or a sequence of datetime objects.
-The datetime objects should not include a time-zone offset.
-
-
-
-
-
unit (string):
-
A string of the form ‘<time-unit> since <time-origin>’ describing
-the time units. The <time-unit> can be days, hours, minutes or seconds.
-The <time-origin> is a date/time reference point. A valid choice
-would be unit=’hours since 1800-01-01 00:00:00 -6:00’.
Decode a double precision date/clock time value into its component
-parts and return as tuple.
-
Decode time into its year, month, day, hour, minute, second, and
-resolution component parts, where resolution is the uncertainty of
-the time in seconds.
-
Args:
-
-
time (float): Date/clock time encoded as a double precision value.
-
-
-
Returns:
-
tuple of (year, month, day, hour, minute, second, resolution).
Encoding performed using UDUNITS-2 hybrid Gregorian/Julian calendar.
-Dates on or after 1582-10-15 are assumed to be Gregorian dates;
-dates before that are assumed to be Julian dates. In particular, the
-year 1 BCE is immediately followed by the year 1 CE.
Return date/clock time encoded as a double precision value.
-
Encoding performed using UDUNITS-2 hybrid Gregorian/Julian calendar.
-Dates on or after 1582-10-15 are assumed to be Gregorian dates;
-dates before that are assumed to be Julian dates. In particular, the
-year 1 BCE is immediately followed by the year 1 CE.
Return datetime encoding of numeric time value (resolution of 1 second).
-
The units of the numeric time value are described by the unit and
-calendar arguments. The returned datetime object represent UTC with
-no time-zone offset, even if the specified unit contain a time-zone
-offset.
-
Like the matplotlib.dates.num2date() function, except that it allows
-for different units and calendars. Behaves the same as if
-unit = ‘days since 0001-01-01 00:00:00’ and
-calendar = ‘proleptic_gregorian’.
-
The datetime instances returned are 'real' python datetime
-objects if the date falls in the Gregorian calendar (i.e.
-calendar='proleptic_gregorian', or calendar='standard' or 'gregorian'
-and the date is after 1582-10-15). Otherwise, they are 'phony' datetime
-objects which support some but not all of the methods of 'real' python
-datetime objects. This is because the python datetime module cannot
-use the 'proleptic_gregorian' calendar before the switch from the
-Julian calendar occurred in 1582. The datetime instances
-do not contain a time-zone offset, even if the specified unit
-contains one.
-
Args:
-
-
-
time_value (float):
-
Numeric time value(s). Maximum resolution is 1 second.
-
-
-
-
-
unit (string):
-
A string of the form '<time-unit> since <time-origin>'
-describing the time units. The <time-unit> can be days, hours,
-minutes or seconds. The <time-origin> is the date/time reference
-point. A valid choice would be
-unit='hours since 1800-01-01 00:00:00 -6:00'.
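To illustrate the decoding direction, a pure-Python sketch for the plain Gregorian case (the names `origin`, `time_value` and `decoded` are illustrative; the real function also handles non-Gregorian calendars, returning 'phony' datetime objects where needed):

```python
from datetime import datetime, timedelta

# Hypothetical sketch: decode 24.0 'hours since 1800-01-01 00:00:00'.
origin = datetime(1800, 1, 1)
time_value = 24.0
decoded = origin + timedelta(hours=time_value)
```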
Return a cube with added length one dimensions to match the dimensionality
-and dimension ordering of target_cube.
-
This function can be used to add the dimensions that have been collapsed,
-aggregated or sliced out, promoting scalar coordinates to length one
-dimension coordinates where necessary. It operates by matching coordinate
-metadata to infer the dimensions that need modifying, so the provided
-cubes must have coordinates with the same metadata
-(see iris.coords.CoordDefn).
-
-
Note
-
This function will load and copy the data payload of src_cube.
Each dimension of the array must correspond to a dimension in the
-given shape. Striding is used to repeat the array until it matches
-the desired shape, returning repeated views on the original array.
-If you need to write to the resulting array, make a copy first.
A mapping of the dimensions of array to their corresponding
-element in shape. dim_map must be the same length as the
-number of dimensions in array. Each element of dim_map
-corresponds to a dimension of array and its value provides
-the index in shape which the dimension of array corresponds
-to, so the first element of dim_map gives the index of shape
-that corresponds to the first dimension of array etc.
-
-
-
-
-
Examples:
-
Broadcasting an array of shape (2, 3) to the shape (5, 2, 6, 3)
-where the first dimension of the array corresponds to the second
-element of the desired shape and the second dimension of the array
-corresponds to the fourth element of the desired shape:
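That example can be sketched with NumPy stride tricks (a hypothetical re-implementation of the behaviour described, not the Iris function itself; zero strides make the repeated dimensions views rather than copies):

```python
import numpy as np

# Broadcast an array of shape (2, 3) to (5, 2, 6, 3) with dim_map = (1, 3):
# array dimension 0 maps to shape index 1, array dimension 1 to index 3.
array = np.arange(6).reshape(2, 3)
shape = (5, 2, 6, 3)
dim_map = (1, 3)

# New dimensions get stride 0, so the data is repeated without copying.
strides = [0] * len(shape)
for array_dim, shape_dim in enumerate(dim_map):
    strides[shape_dim] = array.strides[array_dim]
result = np.lib.stride_tricks.as_strided(array, shape=shape, strides=strides)
```

Because the result is a strided view on the original data, it should be copied before being written to, as the note above says.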
A sequence of dimension indices, specifying which dimensions of
-array are represented in weights. The order the dimensions
-are given in is not important, but the order of the dimensions
-in weights should be the same as the relative ordering of the
-corresponding dimensions in array. For example, if array is
-4d with dimensions (ntime, nlev, nlat, nlon) and weights
-provides latitude-longitude grid weightings then dims could be
-set to [2, 3] or [3, 2] but weights must have shape
-(nlat, nlon) since the latitude dimension comes before the
-longitude dimension in array.
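The weights/dims relationship described above can be sketched as follows (the shapes and the `wshape`/`full_weights` names are illustrative, not part of the Iris API):

```python
import numpy as np

# A 4-d array (ntime, nlev, nlat, nlon) and latitude-longitude weights.
array = np.zeros((4, 3, 5, 6))           # (ntime, nlev, nlat, nlon)
weights = np.arange(30.0).reshape(5, 6)  # (nlat, nlon), matching dims 2 and 3
dims = [2, 3]

# Give weights length-1 axes everywhere except the mapped dimensions,
# then let NumPy broadcasting repeat it to the full shape of array.
wshape = [1] * array.ndim
for dim, size in zip(sorted(dims), weights.shape):
    wshape[dim] = size
full_weights = np.broadcast_to(weights.reshape(wshape), array.shape)
```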
Returns a clipped version of the string based on the specified clip
-length and whether or not any graceful clip points can be found.
-
If the string to be clipped is shorter than the specified clip
-length, the original string is returned.
-
If the string is longer than the clip length, a graceful point (a
-space character) after the clip length is searched for. If a
-graceful point is found the string is clipped at this point and the
-rider is added. If no graceful point can be found, then the string
-is clipped exactly where the user requested and the rider is added.
-
Args:
-
-
-
the_str
-
The string to be clipped
-
-
-
-
-
clip_length
-
The length in characters that the input string should be clipped
-to. Defaults to a preconfigured value if not specified.
-
-
-
-
-
rider
-
A series of characters appended at the end of the returned
-string to show it has been clipped. Defaults to a preconfigured
-value if not specified.
-
-
-
-
-
-
Returns:
-
The string clipped to the required length with the rider appended.
-If the clip length was greater than the length of the original
-string, the original string is returned unaltered.
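A minimal sketch of the clipping behaviour described (a hypothetical helper, not the actual Iris implementation; the real default clip length and rider come from the Iris configuration, so the defaults here are illustrative):

```python
def clip_string(the_str, clip_length=70, rider="..."):
    # Strings already within the clip length are returned unaltered.
    if len(the_str) <= clip_length:
        return the_str
    # Look for a graceful point (a space) at or after the clip length.
    space = the_str.find(" ", clip_length)
    if space != -1:
        return the_str[:space] + rider
    # No graceful point: clip exactly where requested.
    return the_str[:clip_length] + rider
```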
Given a full slice full of tuples, return a dictionary mapping old
-data dimensions to new and a generator which gives the successive
-slices needed to index correctly (across columns).
-
This routine deals with the special functionality for tuple-based
-indexing, e.g. [0, (3, 5), :, (1, 6, 8)], by first providing a slice
-that takes out the non-tuple slices, i.e. [0, :, :, :], and then
-iterating through each of the tuples, taking out the
-appropriate slices, i.e. [(3, 5), :, :] followed by [:, :, (1, 6, 8)].
-
This method was developed because numpy does not support the direct
-approach of [(3, 5), :, (1, 6, 8)] for column-based indexing.
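The two-step approach above can be demonstrated directly in NumPy (lists are used here in place of the docstring's tuples, since plain NumPy fancy indexing expects array-like, non-tuple indices; the `step1`..`step3` names are illustrative):

```python
import numpy as np

# Apply the equivalent of [0, (3, 5), :, (1, 6, 8)] in successive passes.
data = np.arange(2 * 7 * 4 * 9).reshape(2, 7, 4, 9)

step1 = data[0, :, :, :]         # non-tuple keys applied first
step2 = step1[[3, 5], :, :]      # then each tuple key in turn...
step3 = step2[:, :, [1, 6, 8]]   # ...across the remaining columns
```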
Calculates the difference between values along a given dimension.
-
Args:
-
-
-
ndarray:
-
The array over which to do the difference.
-
-
-
-
-
dimension:
-
The dimension over which to do the difference on ndarray.
-
-
-
-
-
circular:
-
If not False, return n results in the requested dimension,
-with the delta between the last and first elements included in
-the result; otherwise the result will be of length n-1 (where n
-is the length of ndarray in the given dimension's direction).
-
If circular is numeric, the value of circular will be added
-to the last element of the given dimension if the last element
-is negative; otherwise the value of circular will be subtracted
-from the last element.
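The circular behaviour can be sketched for a 1-d longitude-like coordinate (the wrap value of 360 is an illustrative choice of circular, and `plain`/`wrapped`/`circ` are illustrative names, not the Iris implementation):

```python
import numpy as np

lons = np.array([0.0, 90.0, 180.0, 270.0])

plain = np.diff(lons)                        # length n-1
wrapped = np.append(lons, lons[0] + 360.0)   # wrap the first element past the end
circ = np.diff(wrapped)                      # length n, includes the 270 -> 360 step
```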
Compatibility does not guarantee that two cubes can be merged.
-Instead, this function is designed to provide a verbose description
-of the differences in metadata between two cubes. Determining whether
-two cubes will merge requires additional logic that is beyond the
-scope of this function.
Return whether the ‘result’ file has a later modification time than all of
-the ‘source’ files.
-
If a stored result depends entirely on known ‘sources’, it need only be
-re-built when one of them changes. This function can be used to test that
-by comparing file timestamps.
-
Args:
-
-
-
result_path (string):
-
The filepath of a file containing some derived result data.
-
-
-
-
-
source_paths (string or iterable of strings):
-
The path(s) to the original datafiles used to make the result. May
-include wildcards and ‘~’ expansions (like Iris load paths), but not
-URIs.
-
-
-
-
-
-
Returns:
-
True if all the sources are older than the result, else False.
-
If any of the file paths describes no existing files, an exception will
-be raised.
-
-
-
-
Note
-
There are obvious caveats to using file timestamps for this, as correct
-usage depends on how the sources might change. For example, a file
-could be replaced by one of the same name but with an older timestamp.
-
If wildcards and ‘~’ expansions are used, this introduces even more
-uncertainty, as then you cannot even be sure that the resulting list of
-file names is the same as the originals. For example, some files may
-have been deleted or others added.
-
-
-
Note
-
The result file may often be a pickle file. In that case, it
-also depends on the relevant module sources, so extra caution is
-required. Ideally, an additional check on iris.__version__ is advised.
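The core timestamp comparison can be sketched as follows (a hypothetical re-implementation named `is_newer`; the real function additionally expands wildcards and '~' in the source paths and raises an exception for paths matching no files):

```python
import os
import tempfile

def is_newer(result_path, source_paths):
    # The result is up to date if no source was modified after it.
    result_time = os.path.getmtime(result_path)
    return all(os.path.getmtime(path) <= result_time
               for path in source_paths)

# Demonstrate with two temporary files whose timestamps we control.
workdir = tempfile.mkdtemp()
source = os.path.join(workdir, "source.dat")
result = os.path.join(workdir, "result.pkl")
for path in (source, result):
    open(path, "w").close()
os.utime(source, (1000, 1000))   # source modified "earlier"
os.utime(result, (2000, 2000))   # result built "later"
up_to_date = is_newer(result, [source])
```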
Note that the array must not contain missing data.
-
Kwargs:
-
-
-
strict (boolean)
-
Flag to enable strict monotonic checking
-
-
-
-
-
return_direction (boolean)
-
Flag to change return behaviour to return
-(monotonic_status, direction). Direction will be 1 for positive
-or -1 for negative. The direction is meaningless if the array is
-not monotonic.
-
-
-
-
-
Returns:
-
-
-
monotonic_status (boolean)
-
Whether the array was monotonic.
-
-If the return_direction flag was given then the returned value
-will be the tuple (monotonic_status, direction).
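A minimal sketch of the check described (a hypothetical helper, not the Iris implementation, which also rejects masked data):

```python
import numpy as np

def monotonic(array, strict=False, return_direction=False):
    d = np.diff(array)
    if strict:
        up, down = np.all(d > 0), np.all(d < 0)
    else:
        up, down = np.all(d >= 0), np.all(d <= 0)
    status = bool(up or down)
    if return_direction:
        # Direction is meaningless when status is False.
        return status, 1 if up else -1
    return status
```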
Promotes an AuxCoord on the cube to a DimCoord. This AuxCoord must be
-associated with a single cube dimension. If the AuxCoord is associated
-with a dimension that already has a DimCoord, that DimCoord gets
-demoted to an AuxCoord.
Performs an in-place conversion of the time units of all time coords in the
-cubes in a given iterable. One common epoch is defined for each calendar
-found in the cubes to prevent units being defined with inconsistencies
-between epoch and calendar.
-
Each epoch is defined from the first suitable time coordinate found in the
-input cubes.
[111,11],harmon:[54,100],global_attribut:1,practic:[36,82,94],array_lik:[16,87],ecmwf:[92,71,63],buxfix:100,clip_str:87,inform:[103,1,100,33,34,47,8,37,64,9,40,76,77,110,112,113,16,116,88,71,24,97,27,96,61],"switch":[6,100,2,107,55],tailor:19,combin:[23,39,51,9,100,44,2,16,88],callabl:[23,71,107,9,118,54,102,11,75],all_leap:2,ssh:[35,100],mdtol:[56,57,119,5,54],j_cmpt_curl_cub:7,lbrvc:93,globe:[37,28],cutoff:52,target_proj:[28,63],test_pph:50,creset:36,graphviz:92,graduat:[41,123],set_ext:[113,124,63],entitl:76,still:[113,43,16,51,76,100,55,56,119,58,122],pointer:[6,121],compressed_field_index1_shap:93,dynam:[105,121,114],chunksiz:107,snippet:[79,92],conjunct:[54,107],group:[16,1,54,48,88],thumb:[76,79],concis:[9,90],uk_hir:[111,16,27,96],platform:[27,92,63],window:[112,16,52,92,18,76,56,54,33,87],leadinglin:[51,113],djf:[111,11],mail:[6,100],main:[30,9,101,68,34,126,105,70,6,47,39,80,115,76,78,113,114,81,18,20,46,87,22,124,52,94,54,95,122,60,61,62,63],meridian:[124,50],non:[36,124,115,9,67,88,84,1,54,28,92,34,94,87,42],ignore_bound:16,vaild:2,add_month:[57,11],extract_by_trajectori:57,contains_dimens:16,surface_temp:94,initi:[36,49,60,90,63],bound:[7,1,5,105,71,37,38,80,115,76,41,42,111,113,114,16,17,83,119,47,24,94,27,28,55,56,57,58,96,97,62,63],udunits2:92,half:113,now:[99,30,67,68,100,126,35,36,6,47,64,39,77,111,114,16,119,22,12,49,51,97,27,95,60,61,62,63],discuss:[36,107,27,54,80,76,77,122],introduct:[104,90],lbnrec:[67,93],term:[23,7,8,43,1,28,80,56,5],reference_pressur:77,name:90,lanczo:52,perspect:50,equalise_cub:123,revers:[16,58,96,87],focuss:109,linspac:[118,16,119,94],separ:[113,37,44,84,27,100,34,122,88],low_pass_weight:52,januari:[57,11],my_rotated_v_wind_cub:55,collaps:[105,24,26,16,87,18,94,54,100,77,34,22],ters:40,my_custom_coordin:97,compil:92,domain:[105,119,28,30],replai:100,citat:38,replac:[23,24,90,16,36,84,40,55,76,57,87,122,97,113,63],arg2:40,compon:[7,90,67,93,2,28,80,53,63],continu:[76,113,103],ensur:[100,92,27,77,76,57,89,42,63],redistribu
t:8,yield:[77,113,26],year:[111,105,103,47,114,53,16,52,9,18,94,27,2,11,69,87,119,34,22,62],happen:88,cf_profil:60,the_str:87,shown:[113,47,61],jackson:69,duplicatedataerror:[16,47,26],"3rd":53,space:[113,114],add_sav:[33,9],profil:4,broadcast_to_shap:[57,87],formula:[23,56,1,50],near_zero:96,factori:[23,37,16,91,84,55,56,58,63],cfauxiliarycoordinatevari:1,integr:66,lag:[97,94],sample_count:10,migrat:56,brett:36,netcdf4_class:[33,9,107],fontsiz:[105,52,114],yuri:36,org:[29,8,107,86,92,80,69,76,34,41,53,12],"byte":[51,113,75,21,93],output_fil:87,care:[42,14],raster:[123,109],alphap:54,setdefault:[43,1],befor:[23,24,47,90,16,17,103,92,29,93,82,84,2,107,80,55,56,87,60,97,100],frequenc:52,remotesens:41,register_error:84,british:[8,63],turn:[37,76,113,100],"int":[113,0,107,16,52,9,92,18,53,84,93,2,21,54,40,87,13,97],yum:[112,92],fletcher32:107,"_orderedhash":[84,24,2],principl:40,think:[64,6,100],frequent:[92,47,80,12],lambda:[111,105,29,30,51,9,113,18,94,27,20,57,76,54,58,96,22],oper:[29,7,2,65,37,75,80,77,42,111,16,48,18,84,119,87,88,47,24,26,91,54,28,55,56,57,61],redhat:92,subplots_adjust:94,carri:[118,24],onc:[35,105,90,9,92,93,27,11,76,57,96],arrai:[29,2,5,105,36,47,37,86,75,115,76,42,113,114,16,18,118,119,21,87,23,24,96,124,92,93,94,54,28,55,56,57,58,122,97],"65af65":36,configpars:108,yourself:[35,37,95,100],param_categori:43,legaci:[109,83],lambertconform:[56,34,50],happi:100,open:[71,8,51,125,67,93,92,100,57,14],size:[111,113,52,124,107,16,38,84,57,77,87],given:[29,74,65,67,119,2,102,33,5,7,107,37,108,38,9,10,75,11,76,77,13,42,111,113,15,80,43,16,48,18,117,118,84,20,83,87,121,71,23,24,50,51,125,93,27,54,28,55,56,57,58,97,63],sheet:79,silent:107,workaround:53,seek_gzip_factori:100,associ:[65,9,67,1,2,103,47,107,37,39,75,80,76,77,44,84,20,87,121,51,92,54,55,56,57,58,60],ensemble_001:97,max_length:93,rotated_pol:[115,119,97,83,124],averag:[34,18],circl:124,old_coord:23,daunt:64,white:[34,124],conveni:[51,87,79,27,2,80,54,58,97,63],cite:104,pocoo:12,subplot:[78,70,39
,114,94],"26aa21a":100,knowledg:[37,77,92],decode_tim:2,season:98,lazyaggregatorerror:26,copi:[36,8,100],specifi:[1,2,33,34,107,9,10,11,43,16,118,84,119,87,121,24,124,50,91,92,27,54,28,56,57,53],broadcast:[77,87],released:69,yearloc:[105,22],mostli:[56,40,80],date:[105,36,94,16,119,61,27,2,54,58,122,60,22,62,63],new_axi:[57,87],netcdf_no_unlimit:[107,9,55],than:[29,67,1,34,126,6,47,107,37,40,76,113,114,16,118,119,87,123,90,52,93,94,27,54,28,95,55,56,57,60,97,63],png:[30,101,68,34,105,70,39,81,76,78,113,114,115,18,119,46,22,124,51,52,94,20],serv:82,false_color:34,cf_label_dimens:1,instanc:[29,7,67,1,2,3,102,100,5,47,38,9,75,76,13,42,14,111,16,17,83,44,118,84,119,87,121,23,24,51,53,27,54,28,55,56,57,110,97],deletechar:84,aux_factori:9,sample_data_dir:108,were:[111,105,47,114,67,27,54,80,56,58,96,62],posit:[111,24,61,114,30,38,87,29,10,67,84,54,74,34,94,58,113],plot_titl:34,geometry_area_weight:[56,42],import_rul:84,seri:79,pre:[105,24,114,10,27,80,100,22,62],file_is_new:57,proleptic_gregorian:2,"_local":9,"carr\u00e9":119,pro:79,e1_grid_area:105,ani:[100,67,1,103,33,5,36,6,47,8,107,37,9,40,80,76,77,13,42,113,43,16,48,118,84,119,126,87,88,71,23,24,49,90,109,91,92,93,53,27,28,55,57,58,97,63],ticklabel:30,geoport:122,"13th":94,pyplot_api:34,deliv:[57,103,63],compressed_field_index3_start:93,saw:[111,96,119,47],ax2:30,shtml:107,ax1:30,engin:92,techniqu:30,advic:[47,90],callback:[125,113,117,71,26,107,114,121,9,67,88,102,33,57,94,97],src_coord:63,site_configur:[60,9],note:[1,2,100,34,5,36,6,47,86,9,80,76,89,111,16,18,119,87,12,24,124,92,126,54,95,57,96],"360_dai":2,forecast_period:[111,96,47,37,16,83,88,27,119,80,77,57,94,97],"18th":57,take:[29,101,2,100,33,94,6,47,37,39,76,77,14,111,113,119,87,22,51,97,54,55,56,57,58,96,61,63],advis:[87,64,100],poster:100,grid_map:1,noth:84,begin:84,printer:76,pandas_arrai:86,normal:[60,7,114,16,94,76,75,21,80,56,34,13,42,28],multipli:[58,24,2,29],compress:[76,57,107,21,92],k_cmpt_curl_cub:7,rotated_c:50,set_glob:78,detect:[107,16,86,126,
60,88],time_unit:[113,67,71],hybridpressurefactori:[23,63],pair:[29,107,43,16,10,118,1,54,119],thredd:122,america:18,soi84:52,add_year:[16,34,11],splittableint:56,renam:[23,24,6,16,39,113,18,77],auxcoord:[23,24,39,94,114,37,16,9,113,101,61,84,11,87,56,77,58,122,97,63],whitepap:[56,104],later:[47,8,37,114,82,92,27,80,55,57,87],drive:47,quantiti:[113,119,24,2,92],ignorecubeexcept:[33,9,26],wind:[57,29,28],product_templ:83,link:[106,8],effect:[71,47,93,28,46,63],axi:[29,74,124,30,16,38,87,36,18,50,118,20,80,57,76,54,58,13,22],sigma:[23,111,37,52,27,80,55,56,96],width:[84,24,114],atol:24,show:[30,101,68,100,34,126,105,70,39,80,115,76,77,78,111,113,114,81,18,20,46,87,22,124,13,52,94,27,54,95,119,58,96,63],joshu:36,grid_templ:83,scitool:[49,6,103,92,126,95,69,122],delta:[23,87,54],serialis:113,miniconda:92,permiss:[76,57,58],hack:[6,100],threshold:[46,18,57],corner:[38,124],forecast:[105,113,94,71,0,107,37,114,88,84,1,80,57,122,61],centre_numb:43,"0f22701":100,xml:[76,16,87,109,72],onli:[29,39,67,68,2,100,5,35,47,107,37,72,9,10,40,11,76,75,111,80,43,16,83,84,20,77,87,88,71,23,24,90,51,52,91,92,93,126,27,54,28,95,55,56,119,122],explicitli:[29,8,107,37,74,27,76,46,63,89],ut_latin1:2,analyi:56,iriserror:26,plume:94,activ:[87,107,9,79,92],enough:[118,109,97],import_logg:108,black:[105,113,124,94],lbproj:93,uppermost:24,wspace:94,analyt:28,lbproc:[93,67,80,88],m01s16i203:[16,119,67,80],grib2:[71,43,81,9,92,56,55,33,57,58,122,62,63],startswith:[84,113],x_start:124,north_pole_grid_longitud:50,get:[36,100],from_coord:[24,58,11],trenberth:52,repr:[111,96,97],repo:[126,6,100],add_custom_season_membership:[57,62],cannot:[47,26,107,51,86,2,87,97],uriprotocol:51,theta:[68,121,7,30],requir:[105,113,124,52,18,103,109],multipolygon:42,prime:50,test_f:50,vmin:[75,34,13],grib_messag:71,artist:113,perspective_point_height:50,median:54,aptitud:92,irrespect:5,non_hybrid_surfac:83,bugfix:100,"376adbd":36,in_plac:29,where:[29,7,2,100,34,105,6,47,107,37,108,38,9,75,11,76,77,13,79,42,113,80,16,18
,84,119,87,88,12,23,50,51,109,90,94,54,28,55,56,57,58,96,62,63],summari:[36,40],wiki:[34,92,12],cubeml:16,gergorian:2,precip_cub:54,surface_altitud:[37,111,27,96],is_udunit:2,index_bi:36,set_ylabel:30,calendar:[111,24,80,16,86,67,93,27,2,11,56,57,87,60,53,113],y_lower_bound:80,test_fn:50,maskedarrai:[16,119,54],yaxi:22,puma:92,is_vert:2,cumsum:29,enumer:[113,96,2],"278dd2a":100,behind:[109,79],proj:[70,28],myfil:[33,9],grib1:[43,56,62,55],between:[113,24,7,65,107,87,29,94,84,1,2,28,34,13,22,42,12],"import":[30,31,68,2,53,69,33,34,105,70,47,37,108,38,39,80,81,76,77,41,78,111,113,114,16,18,115,20,101,46,87,22,89,96,124,51,109,52,94,27,28,55,57,58,122,60,97,62,63],stehfest:[105,114],ff2pp:121,across:[111,47,16,27,54,119,87,13,88],units_func:[29,54,18],rotated_psl:119,is_convert:[57,2,60],august:27,parent:114,e1_slic:114,t_first:77,screen:[37,76],pypi:92,collapse_scalar:[16,54],gregorian:[16,27,2,80,56,60],set_data:93,come:[105,113,6,114,90,79,100,87,22],experiment_coord:9,copul:100,sketchi:36,markeredgecolor:124,fit:[8,4],region:[111,105,24,124,16,94,1,55,34],cfgroup:1,inconsist:[77,58,87,47],mani:[29,47,37,64,39,92,18,94,27,80,55,56,79],depth_con:30,cubemetadata:[16,87],as_compatible_shap:[57,87],color:[105,36,124,114,30,52,113,75,76,34,13],overview:40,period:[111,71,90,84,28,80,57,122,88],pop:[43,16,1,63],yearli:18,exploit:27,colon:[57,100],lbpack:[67,93,92],cfname:3,ultim:67,clarif:56,pythonimaginglibrari:92,coupl:76,sample_point:[16,10,118,119,55,54],invert:[2,30],ddof:54,cf_patch:60,invers:[52,75],mark:[76,124,94],return_indic:24,individu:[111,47,121,97,1,54,88,42],fl000:113,calendar_standard:2,valueerror:[24,26,16,38,29,67,118,84,54,3,51,119,110],curvatur:42,gribwrapp:[85,71],add_season_numb:[57,11],bmk:93,resolut:[111,29,113,118,119,2],squash:100,zero_cub:56,doesn:[47,26,94,100,56,77,97,63],rememb:[126,119,47,100],"__eq__":67,former:55,those:[24,47,88,90,114,16,9,92,44,27,54,28,80,100,119,126,109,79],airy1830:50,"case":[30,9,100,33,126,47,107,37,39,11,76,77,42,1
11,80,114,43,16,18,84,119,87,88,24,51,94,27,54,28,55,57,58,96,97,63],interoper:122,postscript:76,amend:[93,100],create_temp_filenam:87,add_season_:122,alter:[29,91,113,55],colorbar:[114,30,115,39,94,76,56,34],cast:[113,79],invok:[57,113],zorder:58,calendar_360_dai:86,add_dim_coord:[16,113,109,97],wgts84:52,ncube:87,margin:[56,124],planar:[92,50],advantag:[101,107],stdout:87,henc:[96,16,27,97,7],bathi:122,worri:100,destin:91,bhulev:80,bias:54,mod:[29,83],zonal:[111,54],axhlin:105,lbnpt:[93,21,80],"__init__":[24,40],ff_filenam:121,northward:7,develop:104,cpickl:57,author:69,alphabet:[84,97],same:[65,7,30,29,1,2,100,34,5,105,47,37,39,10,40,76,77,79,111,114,16,48,117,118,119,87,88,12,24,124,50,125,90,27,54,28,57,96],soi_darwin:52,time_d:113,regrid:[123,16,118,54,28,104],epoch:[113,71,16,67,2,57,87],html:[112,29,65,107,92,1,103,34,41,12],pad:[84,22],"_name_iii_spec":113,pai:1,data_dim:16,issue8005:53,type_of_statistical_process:83,week:[111,36,27],finish:[76,6,100],closest:[118,119,54],kelvin:[37,16,9,47,61],assist:[48,109],someon:[126,100],timepoint:[80,18],driven:64,capabl:[113,37,125,67,117,1,102,77,56,57,58,96,109,121],min_inclus:[16,24],extern:[60,92],load_nameiii_field:110,downloadd:69,air_temp:[70,16,39,118,119,27,20,28,80,76,77,58,63],appropri:[94,71,96,90,37,74,119,93,76,56,27,2,107,80,55,33,40,87,12,97,63],choos:[78,100],pennsylvania:76,markup:[40,12],know:[97,100],ccr:[70,124,115,34,78,58,63],pep8:[92,103],lblrec:[67,93],without:[113,47,8,90,16,100,87,76,2,80,103,56,57,58,97],as_data_fram:86,relief:[46,115],forecast_month:47,model:[105,98,6,114,39,68,78],nbound:24,dimension:[111,24,88,47,26,37,16,9,92,10,118,84,1,27,28,77,87,68,97],uses_weight:54,summer:62,execut:[57,119,9],notyetimplementederror:26,excel:[6,79],depth_c:23,rest:106,halfwai:24,coord_nam:[16,24,97,80,84],aspect:[76,51,27,109,47],touch:64,lon_con:30,speed:[29,75,80,55,56,88],is_contigu:24,compressed_field_index2_shap:93,irregular:[24,80,26,88],versu:[16,52],altitude_cub:27,uncas:84,clip_length:87
,except:[36,90,16,39,54,9],littl:[107,79,27,119,100,87,97],identif:[51,113,83],fieldfil:55,masked_less:[46,119],load_strict:[57,63],coord2:38,buffer_obj:51,calculate_forecast_period:84,ufunc:[29,36],add_coord:84,real:[23,37,100,2,84],psi:8,xaxi:105,around:[75,24,40,107],read:[67,2,100,6,107,37,64,40,79,110,113,82,84,121,24,51,125,92,93,126,95,55,58],rejoin:126,"_aggreg":54,avali:119,postag:94,grib:[15,9,92],temperatur:[34,70,18,4],grid:[113,20,4],mon:11,pearsonr:65,gitk:100,load_nameii_field:110,psu:30,sternli:100,gridlin:[76,81,58,124],read_data:[67,93,121],darkgreen:124,exampleclass:40,whitespac:84,ordnanc:63,d304a73:36,pole:[124,4],integ:[113,24,107,16,121,38,29,93,67,118,84,54,3,51,75,96,53,110],roll:[16,87,54],rolling_window:[16,52,18,54,56,57,87,61],benefit:[57,36,64,77],dim_coords_and_dim:[16,84],either:[23,111,24,7,8,43,16,38,9,92,93,126,119,2,76,57,87,5,109,97,63],downward:30,inter:107,manag:[91,107,51,9,67,92,103,33,57,96,79],supplementari:82,udunits2_path:92,what:[47,100,9,119,27,40,95,103,76,77,109,97,104],a64:2,nlat:87,matur:63,default_spherical_earth_radiu:28,tick_level:34,constitut:27,compressed_field_index2_start:93,basic:[29,107,74,126,104,12],slice:[67,101,68,39,76,77,41,13,114,16,118,84,20,87,24,50,94,57,58,96,60,61],confirm:119,realization_metadata:94,definit:[23,111,24,96,0,50,37,51,113,118,84,1,2,57,54,58,5,121,42,63],src_grid:[5,54],exit:9,cylindr:[124,50],complic:100,lbegin_offset:93,master_data_repositori:122,refer:[90,100,19,95,103,12],power:[29,107,119,92,2,76,77],new_coord:[23,16,97],um_vers:[37,16,27,111,96],inspect:[37,47],coord_system:[23,113,7,124,9,28],debug:84,found:[111,105,113,47,8,43,16,17,64,92,117,84,27,80,51,76,77,87,88,12],"__name__":[105,70,124,114,30,81,52,39,113,18,68,20,101,115,76,46,34,94,78,22],docguid:12,"throw":36,total_prognostic_field:93,platecarre:[115,70,78,124,50],comparison:[24,16,9,53,27,119,56,57,58,87,62],central:50,oceansigmafactori:23,formula_term:1,assert_valid:16,viewkei:43,splitlin:84,degre:[111,24,96,50
,37,16,109,119,113,101,2,28,80,55,46,5,54],intens:[57,119],target_grid:[54,55],cbar:114,attributeerror:39,industri:[105,114,80],backup:100,attributeconstraint:[27,9],encode_tim:2,routin:[7,26,16,74,10,118,27,55,58,60,87,42],effici:[27,75,80,107],contiguous_bound:[24,91],lastli:80,terminolog:[1,47],jasper:92,strip:[84,113],pivot:75,your:[36,40,8,90],path_to_join:9,get_cmap:76,loc:[105,113,101],fast:[57,126],format_pick:[33,113],complianc:[76,103],get_spec:51,area:[105,113,50,16,1,54,28,5,42],aren:113,lon:[105,124,37,39,119,28,63],overwrit:[93,67,62],box_y_point:124,start:[111,24,64,71,16,38,67,19,84,3,100,87,96,97,113,110],compliant:107,interfac:[38,92,76,56,54,33,122,63],low:[105,52,114,93,55],strictli:[24,88],svn:79,verb:40,hard:100,table2_vers:43,tupl:[23,24,107,75,43,16,121,38,39,29,44,118,84,1,2,3,54,87,63,110,28],regard:[118,61,8],amongst:76,oscil:[52,94],trajectori:54,realli:119,categor:[111,11],congratul:57,faster:[56,88],iterkei:[43,1],notat:84,tripl:40,immedi:[2,100],central_lat:50,possibl:[29,67,100,33,47,107,37,9,75,80,77,109,79,111,113,43,16,118,84,88,90,92,94,27,55,56,63],global_avg:114,data_start:93,preconfigur:87,expandtab:84,scalar_coord:[84,87],filter_funct:84,dec11:111,extest:103,concretereferencetarget:84,functionrul:84,certain:[47,37,9,18,27,80,55,33,46,34,122,63],dosctr:40,deep:[16,67],strongli:[92,97,82],intro:92,file:[36,6,8,103,100,109],grid_staggering_class:121,nerc:92,proport:[119,54],set_tick:34,fill:[38,47,124,74,84,46,34],incorrect:55,again:[36,96,100],grid_latitud:[111,115,37,16,39,44,27,80,55,58,96,13,97,63],get_data:93,hybrid:[23,119,2,68],shoyer:57,orient:[115,34,94,114,30],tight:[76,20,22,114],rotatedpol:63,wai:[24,6,47,124,90,37,9,36,79,19,126,80,100,87,96,22,63],eg_field:80,spatial:[37,56,119,105],writabl:[33,9],geometr:[54,92,42],you:[36,90,40,8,12],param_disciplin:43,intermedi:79,fork:100,polar:4,experiment_numb:93,"365_dai":2,sequenc:[2,33,34,47,74,9,10,75,80,13,16,18,118,87,88,24,51,93,27,54,57,97],"120w":94,pre_grid_area:105,
vertex:56,docstr:90,multidimension:[47,67,18,118,27,80,96,88],testproject:109,debugstr:84,recent:[112,47,16,118,27,56],peak:54,to_cub:117,nino_3_4_constraint:94,agu:[105,114],reduc:[71,50,67,54,100,77],neighbour:[24,10,118,54,28,119,122,63],lookupt:43,untrack:100,directori:[36,49,108,92,95,100,57,122],mask:[122,75,86,119,118,54,28,100,46,5],kilogram:29,regrid_conserv:123,mimic:62,first_plot_left:114,horiz_grid_typ:[121,93],grib2_phenom_to_cf_info:43,level_pressur:80,is_valid:67,scm:112,represent:[24,0,16,82,126,2,100,87,14],all:[29,65,30,100,67,68,1,2,53,103,33,5,36,6,47,8,107,64,9,10,40,11,76,77,109,42,111,113,94,80,43,16,48,18,44,118,84,119,87,88,24,96,50,91,61,27,54,95,55,56,57,58,122,60,97,62,63],consider:[51,113],cube_text:14,illustr:[113,87,47],lack:43,month:[111,47,90,16,52,9,53,80,27,2,11,57,97,62],aggregate_shap:54,atlant:30,"2_installing_git":112,equator_height_9km_slic:96,higher:[113,47,37,51,9,80,89],timetupl:53,follow:[36,8,90,40,100,12],bound_posit:24,children:51,target_grid_cub:[56,122,54],wish:[76,113,119,97,124],offset_by_tim:2,strptime:113,equator_constraint:96,articl:[76,63],init:87,rtol:24,wmo:[33,9,92,0],rstrip:84,ylabel:[76,46,52,22,94],norm:[34,75],satellit:[56,117,50],sound:100,linearsegmentedcolormap:75,normalis:[16,27,61],straightforward:[27,100],getattr:29,fals:[29,67,1,2,100,47,107,86,9,11,76,42,14,43,16,44,118,84,87,121,71,24,90,51,94,27,54,28,57,53,62],mpl:124,cf_group:1,util:[24,16,9,92,18,84,2,28,103,42],print:[111,16,119,67,2,28,77,33,54,87,121],integer_constants_start:93,filespec:[33,117],mechan:[113,9,56,80,33,57,58,122],equalise_attribut:[57,48,47],fall:[23,24,16,17,119,2],veri:[35,36,64,9,76,27,119,80,56,77,96,89],ticker:[39,94],strang:100,add_custom_season:[57,62],unalt:87,fields_it:80,bounded_cel:62,data_arrai:113,occurr:[24,16,38,67,84,54,3,110],list:[59,6,100],ff_header:121,"414m":119,adjust:[48,9,94],cosin:28,strict_grib_load:[9,71],small:[76,20,124,103],"_that":90,superced:83,dimens:[29,86,99,67,1,65,107,38,9,121,77,110,113
,16,18,44,118,84,87,88,23,24,52,91,54,28],quicker:19,ten:107,margin_fract:124,x_coord:[124,83],add_aux_factori:16,coorddefn:[16,24,87],zero:[113,50,16,52,93,118,84,27,54,80,34,122,109,42],b605216:36,design:[56,76,87,47,63],weekday_numb:11,further:[105,47,37,39,9,92,60,27,56,57,96,13,63],bilinear:[118,16,91],aux_coord:16,libmo_unpack:[21,92],default_projection_ext:[38,63],least_significant_digit:107,abf:15,clock:2,sun:11,sum:[29,16,52,18,54,119,58],abl:[36,16,126,84,27,117],other_cub:29,dimcoord:[24,37,16,87,113,27,55,56,57,58,122,61],cfreader:1,with_bound:24,m01s16i222:97,intersect:[24,50,16,29,119,55,56,61],consecut:[76,16,96,62,18],"public":8,contrast:[16,47],cfnamecoordmap:107,compact:126,quantiz:107,full:[111,123,16,108,9,92,18,118,2,101,58,96,87,89],real_head:93,xml_element:[23,24,50],variat:88,m01s16i004:9,sophist:100,behaviour:[24,47,50,90,9,77,76,119,75,107,55,56,57,87,109,71],trunk:12,add_season:[111,57,11],dx_dy:83,modifi:[48,8,90,16,100,9,88,87,33,57,34,94,60,97],ukmo__um_stash_sourc:56,valu:[29,74,7,30,83,67,101,1,119,2,3,53,34,5,64,47,107,37,72,108,38,9,10,121,75,11,115,76,13,42,110,111,113,80,114,43,16,17,18,118,84,20,21,46,87,88,23,24,51,52,92,93,94,27,54,28,55,56,57,58,122,60,97,62,63],ne_shad:113,search:[26,90,16,83,84,87],a815645:100,divisor:54,ahead:100,another_c:50,observ:105,checkoutd:69,figsiz:[70,114,30,52,39,94],doctest:103,level_10:[27,96],pick:[51,100],codebas:123,magnitud:[34,52,28],spans_three_month:111,via:[113,16,108,100,91,67,93,61,54,72,60,97,42,63],shorthand:96,data_repositori:122,src_cr:124,transit:63,brsvd4:93,filenam:[67,1,102,33,71,107,9,121,76,77,14,111,113,114,117,84,110,46,87,88,122,51,125,93,94,27,57,96,97],nvalid:113,thought:100,test_data_dir:[108,122],heurist:87,resource_dir:122,sman:107,select:[24,90,16,100,9,118,27,55,76,53,63],reproject:[56,124],cf_terms_by_root:1,aggress:[105,114],ash:[113,119],distinct:[37,111,27,119,80],full_season_mean:111,tackl:63,two:[29,65,30,67,2,94,5,38,39,75,40,114,16,118,84,87,24,124,26,90,52,
91,50,54,28],dewpoint:58,taken:[47,37,83,2,80,42],calendar_gregorian:[113,86],clabel:76,minor:[56,90,50],more:[90,8,12],diagnost:47,diamond:119,desir:[24,107,16,9,44,118,76,119,28,56,87,5,63],hundr:46,no_unit:[114,2,11,57,60,97],function_nam:109,flag:[6,71,90,67,80,100,56,87,61],sample_year:34,particular:[111,124,8,30,51,9,67,1,27,2,100,54,53],known:[47,27,40,80,55,87],noleap:2,crs_latlon:124,none:[29,74,65,67,1,2,33,47,107,108,38,9,7,10,40,75,13,113,43,16,121,117,118,84,86,87,88,89,23,24,71,26,50,51,125,93,53,54,28,97],number_on:27,hour:[111,113,37,16,9,53,84,119,27,2,80,55,57,96,97,62],hous:114,latitude_of_projection_origin:50,der:[105,114],outlin:[74,46,113,38],endswith:[84,9],dev:92,from_msi:67,remain:[113,65,9,84,80,100],data_fram:86,vim:36,caveat:87,learn:[19,79],timestep:97,userguid:58,dec:[62,11],esmpi:[122,91],nino:94,arrowprop:124,prompt:9,little_func:87,projection_y_coordin:63,accept:[16,9,27,2,55,33,54,58],sphere:50,air_temp_t:77,minimum:[111,24,90,16,54,80,58],unreli:16,robinson:28,cours:[36,79],goal:[64,63],divid:[58,29,2,28,42],rather:[29,107,126,28,95,55,56,63],anoth:[24,72,36,16,38,119,29,1,2,55,40,87,61,63],surface_altitid:27,hugo:36,perhap:[57,100],formalis:63,divis:[76,77,29,2],unrotate_pol:28,cmcustomattribut:84,reject:[33,9],hslice:44,simpl:[24,6,114,107,9,29,93,84,40,34,88],oceansigmazfactori:23,isn:119,specific_humid:27,resourc:[19,8],referenc:[56,1,113,12],variant:[63,93,55],first_slic:96,reflect:103,plane:118,buffer:[76,51],set_ticklabel:34,varianc:[16,54],call_func:54,map_setup:63,png_filenam:51,circumst:[23,96,90],"short":[35,27],postfix:92,confus:[47,100],add_weekdai:[57,11],stash:[111,37,16,9,67,27,119,80,55,56,57,58,96,97,62],ambigu:122,mislead:47,add_month_numb:[122,11],imageri:56,netcdf_specif:51,spheric:[56,28,7,50],shade:[46,115],ensembl:[105,98,114],spell_year:18,point_mod:38,pot_temperatur:77,extremely_large_cub:56,orca2:78,hypertext:8,equatori:30,set_label:[115,34,94,114,30],namecoord:110,paper:[56,121],through:[113,123,50,51,10
7,9,67,84,44,56,76,27,54,55,33,119,87],sqrt:[29,28],non_equal_shap:54,paramet:[24,43,18,54,61,62],containing_fold:114,style:[9,94,76,103,33,34,63],binari:[33,9,2,80,14],itervalu:[43,1],equirectangular:119,resort:[9,63],pend:93,bypass:63,translationerror:26,feb12:111,might:[36,90],st_swithuns_daterang:27,good:[112,36,126,100,76,79,63],"return":[29,65,83,67,1,2,3,102,33,5,108,6,47,107,73,38,9,7,121,127,40,11,75,13,42,14,113,80,43,16,17,18,44,118,84,21,87,117,88,71,23,24,85,50,51,52,91,93,110,27,54,28,55,56,57,58,96,60,97,63],netcdf3_64bit:107,timestamp:87,var_nam:[23,16,24,87,61],framework:[37,27],"_cl":[24,16,38,84,3,110],hybrid_surfac:83,arakawa:[57,121],detach:100,rsplit:84,harrow:76,bigger:[56,96],productnam:69,handling_spec:51,eventu:123,hybrid_height:[76,96,119,68],cross_sect:68,instruct:[35,112,49,126,95,100],refresh:6,conic:50,legend:[105,113,52,101,20,76],radian:[24,5,2,55],radial:7,fulli:[37,16,82,97,119,57,109,88],unicod:[16,40,60,84],truncat:[84,58,107],embarass:100,memoryerror:56,inplac:[33,57,9,92],sea_water_practical_salin:30,injest:60,weight:[111,105,16,52,54,28,87,5,42],polar_stereo:81,unequ:47,monoton:[24,47,37,1,119,80,56,57,87],brewer_rdbu_11:63,lblrec_offset:93,varibl:56,connect:95,time_typ:93,beyond:[16,119,28,55,56,77,87,63],todo:[16,41],coord_dim:[23,16,63],standard:[9,67,1,2,103,34,47,37,72,39,40,11,76,77,113,80,16,17,88,71,23,24,50,92,27,54,28,55,97,62,63],my_slice_ov:55,item:[85,43,16,67,93,1,127,80],regrid_to_max_resolut:118,unknown:[23,24,43,16,17,2,77,57,60],publish:[8,69],primari:[111,38,119],footnot:90,delta_a1b:114,month_fullnam:[122,11],z_level:113,asarrai:16,variable_nam:107,my_fil:[33,76],matthew:36,difficulti:47,qualifi:109,set_height:43,proxi:9,advanc:12,geogc:[113,24,50,16,67,80,55,58,63],ws_units_func:29,is_scalar:75,differ:[109,52,19,40,4,34,78],kwilliam:57,ness:107,cf_ident:1,reason:[16,100,47,97],base:[29,74,67,1,2,3,100,5,71,107,37,38,9,10,40,80,115,76,75,110,113,43,16,83,117,118,84,119,87,121,89,23,24,26,50,51,125,92,93,27,
54,28,55,56,57,53,62,63],set_color:30,put:[24,114,30,16,113,94,118,27,20,76,34,22],find_sav:33,add_formula_term:1,gdal:[41,92],basi:[76,87,47,97],sytem:124,nomin:57,thread:9,launch:14,cfgridmappingvari:1,omit:[33,58,9],american:105,field3:93,assign:[23,24,107,16,27,55],datamodel:80,major:[24,47,90,37,50,31],notifi:[23,118,108,119,54],obviou:87,data_result:54,transform_point:124,target_cub:87,number:[29,74,53,9,67,2,3,34,47,38,39,10,11,76,110,111,113,80,43,16,18,118,84,20,21,87,24,26,51,93,94,27,54,28,57,96,97,63],placehold:36,feet:37,run_identifi:93,summaris:[40,80,103],done:[6,47,90,43,16,100,19,84,27,2,51,76,63],cf_data:[43,1],blank:[113,40],stabl:[16,63],ljust:84,miss:[5,7,26,50,87,88,94,1,54,21,119,58,122,97,89],build_ext:92,stage:[113,80,100],"4aff2a8":36,guess:[76,111,87,119],exponenti:[29,60],linear_extrapolation_mod:54,interact:[109,9,100],word_depth:121,construct:[24,47,124,50,16,52,55,54,80,51,56,119,34,96,42],lstrip:84,isalpha:84,um_stash_sourc:56,statement:[76,54,9],natur:[57,29,27,115],dt2:2,dt1:2,scheme:[16,84,54,51,33,119,34,109],lbmon:[93,80],store:[47,71,107,37,118,80,87,97,63],unicodeencodeerror:84,luckili:100,bplon:[93,80],option:[113,6,8,126,100,34],relationship:[1,122],extract_strict:16,part:[0,8,51,84,27,2,80,69,33,119,58,122,97],warmli:6,ensure_arrai:[56,87],grace:87,update_tick:[39,94],std:54,approx_equ:87,kind:[76,30],at_i:124,time_processing_period:83,remot:[36,6,126,95,100,79],horizont:[105,114,30,16,38,91,94,118,1,28,115,34,5,13,42],firstli:27,at_x:124,cost:57,orthograph:[56,78,50],"1rc1":55,str:[57,87,97,65,84],arrang:[88,18],oceansg1factori:23,cleaner:63,comput:[23,105,36,71,107,16,92,101,54,28,95,119],atmospher:[56,30],cf_attrs_unus:1,std_name:[9,92],verticalperspect:[56,50],packag:[90,40,12],expir:90,lbyrd:[93,80],checksum:[16,109,107],read_mod:93,lbdai:[93,80],lie:[24,5,119,42],entireti:96,nationalarch:8,mpl_cm:76,real_constants_start:93,ancillari:[1,58,62],commentari:57,self:[37,16,24,109,40],salinity_1000m:30,click:[35,100],cf_phen
om_to_grib2_info:43,build:[113,103],projection_nam:124,command:[36,49,6,100,92,79,103,76,97],brace:84,cube:[109,104,18,68],distribut:8,evenli:[114,94],setuptool:92,exner_slic:39,previou:[111,96,16,9,82,27,80,100,76,122,97],reach:[126,63],colpex:[77,39,55],exet:69,most:[111,112,47,37,16,118,84,27,119,80,100,56,97],plai:69,jsp:92,plan:63,watt:2,alpha:[52,30],custom:34,grid_north_pole_longitud:50,keepend:84,bug:[36,100,6,53,103],libudunits2:92,clear:[43,16,1,54,76],lbvc:[93,80],cover:[111,47,0,9,118,28,80,12,63],bzy:[93,80,121],bzx:[93,80,121],"2nd":114,getter:51,data_hit:18,clean:79,thank:[57,36,6,77],pars:[57,113],bulletin:58,grid_longitud:[111,115,96,37,16,39,44,27,80,55,58,68,13,97,63],humid:39,phenomenon:[47,37,16,88,43,3,57,97,71],cdc:107,cdl:63,reset_save_rul:67,alphanumer:84,wgdo:[21,93],generating_process_typ:83,lbdat:[93,80],session:[33,117],particularli:[57,62],central_latitud:78,gmean:54,fine:[57,119,22],test_add_dim_coord:109,ensemble_010:16,impact:61,polyfit:101,randn:63,three_months_bound:111,pretti:36,solut:[122,90],factor:[56,58,80],make_plot:124,add_season_year:[111,57,11],darwin:52,yml:103,i005:9,"6d8e1e":36,"__version__":87,caus:[23,113,47,88,90,16,39,61,54,56,87,122,97,63],param_cod:83,colorbrew:76,express:[24,9,84,27,28,80,76,57,5,42,63],nativ:[115,63,107,50],absolute_import:76,noaa:107,restart:87,air_temp_t_slic:77,limit_colorbar_tick:39,city_data:124,establish:[105,8,63],air_pot_temp:27,whenev:[76,57],coord:[23,105,29,114,7,30,39,50,9,113,101,68,124,20,11,115,54,34,94],rainfal:37,level_lt_16:27,cach:23,maxnloc:[39,94],earth_radiu:[56,67,50],new_file_nam:100,conourf:114,orca:56,temperature_1d:76,arr:87,set:[36,100,90],dump:57,sep:84,between_3_and_6:87,printout:80,atlantic_profil:30,see:[74,0,67,1,2,100,33,34,35,6,7,8,107,37,38,9,80,76,77,41,13,14,111,112,16,19,119,86,87,88,71,23,118,96,92,53,27,54,95,56,57,58,122,60,97,62,63],signifi:[23,37],sea:[56,57,22],arg:[29,74,65,83,67,1,2,102,33,5,7,107,73,38,9,10,121,40,11,75,41,13,42,14,43,16,17,48,18,4
4,118,84,21,86,87,117,88,71,23,24,85,26,50,51,125,52,91,93,127,54,28,110,53],phenomenon_point:71,cube1_ext:63,true_scale_lat:50,bdatum:93,contour:[70,38,124,114,115,74,39,113,68,46,22],analog:54,grid_north_pole_latitud:50,cube1:[38,54,63],cube2:[38,54,63],vector_coord:84,someth:[36,6,90,113,95,100,96,109],iteritem:[43,1,113],particip:1,new_sav:33,cartograph:92,smallest:107,various:80,extra_constants_shap:93,utc_format:113,experi:[105,114,37,64,9,82],phenomenon_bound:71,"15th":[56,27],cf_patch_convent:56,concentr:[113,119],numer:[24,96,37,84,1,27,2,28,119,87,122],complement:[62,63],isol:109,ostia_monthli:[111,16,118,22],lowercas:84,u_wind_acceler:7,matplotlib:[38,30,101,68,2,34,105,70,78,74,39,75,115,13,113,114,81,82,18,20,46,22,124,51,52,92,94],fixedlengthhead:93,ipython:100,x_point:101,miscellan:87,both:[29,101,33,34,5,105,47,80,76,42,111,114,16,118,119,24,124,52,91,94,27,56,57,96,63],last:[24,47,90,16,9,113,18,94,118,27,80,100,56,77,87,96,63],delimit:84,leq:113,pdt:53,rulescontain:84,season_year:[111,11],context:[16,27,9,107,57],collect:[111,48,47,37,51,100,44,84,1,27,55,60],pdf:[76,0],provis:[2,89],whole:[34,113,18],central_lon:50,load:[34,52,104,18],docstrings_sample_routin:40,simpli:[111,48,9,39,84,76,60,63],point:[29,74,65,30,67,101,119,2,100,34,5,6,47,37,38,9,10,75,11,115,76,12,111,113,80,114,16,121,18,44,118,20,21,46,87,88,71,7,24,96,124,90,93,94,27,54,28,95,55,56,57,58,122,97,62,63],instanti:54,address:[36,103,109,58,69],underlai:[46,115],header:[113,8,67,93,92,40,80,55,58,121,110],requires_fh:51,apply_proxi:89,linux:[100,79,92],cgreen:36,mistak:100,throughout:27,time_valu:2,vertic:[111,30,88,50,38,9,68,119,46,13,61,63],lbrel:93,stamp:94,help:[35,112,47,100,92,97,54,103,119,109,79],due:[46,58,55,47,30],empti:[113,51,107,43,16,64,67,93,84,55],coordanddim:84,plt1_ax:114,whoi:122,boston:[60,117],monthli:52,lat_lon_slice_pair:39,grib_fh:71,species_categori:113,waypoint:10,add_season_month_initi:57,effbot:[76,92],add_categorised_coord:[57,11],flight:113,exercis:
109,cartopi:[70,124,50,115,38,91,92,119,28,76,34,78],clariti:76,imag:[105,113,114,50,51,92,94,115,76,46,41,78,14],shuffl:107,brlev:[93,80],gap:76,legacy_custom_rul:84,long_t:27,zfill:84,func:[23,87,75,113],demand:[23,67],transversemerc:[58,63,50],tight_layout:30,convers:[25,71,107,16,86,9,67,2,28,80,55,54,87,122,61],coordinatemultidimerror:26,look:[112,113,47,16,92,126,119,100,109,12],unlimited_dimens:107,raw:[113,71,93,18,40,21,80,56,122],erron:88,batch:62,formatt:54,"while":[91,119,5,54,100],unifi:[111,37,16,27,119,80,96,97,63],behavior:40,unify_time_unit:[56,16,87,47],meteorolog:52,anonym:[87,47,88],cleanup:87,loop:[113,96,97,68],propag:89,vol:[105,52,114],centr:[34,83],concatenate_cub:16,coordinatenotfounderror:[16,26],"2dec1ac":100,pcolormesh:[38,124,115,74,76,34,78],unitari:50,std_dev:[111,54],demot:[87,55],seen:[47,27,119,80,97,89],air_temp_mean:77,quit:100,cf_attrs_ignor:1,decor:[17,75],fedora:112,rotated_latitude_longitud:50,add_month_shortnam:57,netcdftim:[24,16,119,2,54,55,57,53],fillchar:84,shorten:[16,36,119,55],get_xticklabel:30,field_gener:84,shorter:87,funni:6,get_xy_contiguous_bounded_grid:28,outermost:[63,107,55],conflict:100,rule_log_dir:108,hadgem2:[105,114],regrid_area_weight:122,imagin:47,wherea:39,domin:119,consumpt:56,covari:28,v_cube:[29,28,55],user:[104,8,90],ref_to_slic:16,"1000m":30,wherev:16,specialis:[43,80],implement:[29,80,7,37,91,100,9,113,83,76,54,11,55,33,119,58,60,87],built:[87,103,113,63],lower:[84,119,7],task:[100,63],new_time_column_head:113,equival:[24,88,71,7,16,68,76,27,54,80,56,124,97],older:[56,87,71],entri:[43,9,93],categoris:[87,11],temp_historyfile_start:93,pickl:[57,87,122],person:36,v_data:29,branchnam:100,add_weekday_numb:11,gsi:8,bottom_16_level:27,ourselv:77,param_numb:43,ubuntu:112,propos:36,explan:100,test_name_of_public_method__aspect_of_method:109,userdict:1,izip:[44,39,114,30],collabor:100,shape:[29,65,7,5,47,107,37,77,42,16,119,87,121,71,23,24,50,92,54,28,56,58,96,97],a_pp_field:67,central_longitud:[70,78],ti
ck:[105,114,30,39,94,34,22],botch:100,reap:57,max_absolute_error:87,pp_save_rul:80,pp_rule:15,appli:90,input:[29,47,7,16,119,44,118,2,28,80,77,75,87,122,88],subsequ:[23,67,93,76,80,56,87,97,63],float32:[16,113,2,75],test_coord_already_pres:109,saniti:9,lbexp:93,march:[60,61],lbext:[67,93],transpar:[56,9],big:[87,107],lazy_data:[56,16],intuit:[77,63,47,30],extrapolation_mod:[118,119,54],geotiff2:41,category_funct:11,insert:[16,100],equal_data_dimens:54,bit:[107,51,92,2,80,62],"2e991e8":36,formal:7,aggregated_bi:[111,16,122,54,18],regrid_weighted_curvilinear_to_rectilinear:[57,5],lost:100,hindcast:61,docutil:12,signal:[52,6,100],resolv:100,websit:8,procedurerul:84,"32bit":[122,121],number_two:27,eadc391:100,fixed_length_head:[121,93],felicityguest:57,encount:9,real_const:121,has_aux_factori:84,bacc:93,simplifi:[23,27,61,96],creation:[56,22,93,14],some:[36,12],back:[23,24,6,16,17,64,100],fernando:79,xmlcharrefreplac:84,unspecifi:118,sampl:8,a1b_north_america:[105,101],within_st_swithun:27,grib_gener:71,insepar:122,scale:52,load_cubes_32bit_iee:121,charg:8,though:80,pep:40,per:[16,24,27,2,21],cf_measur:1,recognit:47,substitut:[84,27,54],mathemat:[29,104],larg:[124,107,86,92,27,75,56,119,58,97,63],lat_con:30,recognis:[9,27,80,33,122,62],target_c:[28,55],phenomenon_nam:77,process_valu:75,update_metadata:54,nose:92,machin:79,run:[103,33,36,47,37,73,9,76,77,113,83,84,21,90,52,92,94,27,57,96,97,63],add_day_of_year:[56,62,11],vertical_perspect:50,array_equ:[57,87],step:[24,49,52,113,44,77,87,96,121],lbhrd:[93,80],prerequisit:119,meantim:[122,100],from:[36,40],tabsiz:84,subtract:[29,114,16,94,56,77,87,34],sub:[111,24,108,67,10,84,55,121],constraint:[30,9,67,33,34,105,39,121,80,76,111,113,16,17,20,22,24,122,26,94,27,56,57,58,96,62],transpos:[16,38,54,56,77,96,63],regridding_schem:56,m01s00i004:[37,111,27,9,96],uarr:[29,38,7,67,1,2,33,5,71,107,108,74,9,10,75,11,14,43,16,83,117,118,84,21,86,87,121,89,23,24,26,50,51,125,93,54,28,110],cmap:[124,30,18,75,76,34],newdynam:121,dim_coor
d:16,has_formula_term:1,block:[115,107],fulfil:[47,18],"__future__":76,is_no_unit:[57,2,60],concatenateerror:[16,47,26],within:[2,5,37,9,121,80,76,109,11,16,84,119,88,24,90,93,27,54,55,57,96,60,97],level_10_with_stash:27,lbsecd:[93,80],contributor:104,volcan:[113,119],document_nam:12,occupi:[16,1],span:[37,16,94,1,27,5],few:100,enabl:[71,124,107,92,27,55,57,87,60,61],question:[27,90],textual:[1,26],field_coord:67,adjac:111,doubt:109,arithmet:[56,57,77],includ:[100,2,103,47,8,37,64,9,80,115,76,111,16,117,118,84,87,24,124,90,92,126,27,54,28,55,58,60,53,63],suit:[34,54,92,62],forward:[87,126],entir:[114,94,40,54,87,5,13],etc:[113,24,16,108,9,92,83,53,118,80,58,87],properli:63,contour_object:39,geoax:[76,124],linu:79,decomposit:2,lat_coord:[16,113],non_equ:54,translat:[43,84,3,26],newer:57,scope:[16,87,89],linestyl:[105,52,124],cmap_norm:75,mitig:[105,114],dodsc:122,concaten:[16,123,104,26],consist:[24,47,37,29,84,40,2,62,63],ws_cube:29,caller:33,commut:16,x_lower_bound:80,scipi:[29,92,118,119,56,54],lbnrec_offset:93,highlight:[77,113,47],similar:[35,47,37,9,76,119,27,54,80,55,56,77],curv:30,sort:[16,78,97],constant:[122,2,93,80],rotated_lat:28,cop_metadata_callback:114,utilis:[76,27,63],soil_temperatur:58,repres:[7,67,1,2,100,94,47,37,108,38,39,11,76,77,111,113,80,43,16,17,118,84,21,87,121,23,24,50,51,93,97,27,54,57,58,61,63],"char":84,orographi:[23,38,67,121,80,56,122,61],clever:22,research:[60,69],home:[35,36],isdigit:84,curl:[122,7],improv:[107,76,75,55,56,62,63],regular_step:87,fixup:100,mean_anomali:56,polygon:42,dummi:53,commonest:80,titl:[113,105,70,52,124,114,74,77,101,94,84,2,69,76,46,34,78,100],sequenti:[47,82],nan:[56,118,119,54],invalid:[16,55],bgor:93,codec:84,pygrib:71,lbmin:[60,93,80],mock:92,nice:[36,6,114,79,20,76,22],yx_slice:96,draw:[105,113,38,124,74,94,76,78,63],exp:[29,60],u_wind_cub:7,update_mod:93,expandus:9,lbmond:[93,80],introduc:[56,87,123,47,62],lbft:[93,80],meaning:[119,42],cater:37,column_header_nam:113,c_doubl:2,param:[87,122],longitud:[6
5,7,30,101,68,105,70,47,37,119,10,80,115,76,77,42,111,113,114,16,118,20,46,87,24,124,50,94,27,54,28,55,56,57,58,96,60,97,63],ago:36,land:57,algorithm:[47,7,107,118,87,63],vice:84,svg:76,max_inclus:[16,24],set_major_formatt:22,depth:[23,64,79,30],summit:36,lbmind:[93,80],t_last:77,mstat:54,hit_window:18,extrapol:[24,118,54,28,119,122,63],partial:[5,54,53],aux_factory_class:84,cynthia:[76,75,63],lblev:[56,93,80],cfclimatologyvari:1,ellips:58,markers:124,exp_id:9,mondai:11,lagged_ensemble_callback:97,contourf:[70,74,114,81,38,39,113,18,68,115,76,46,58,94,13,22,63],netcdf_format:[33,9,107],privat:90,delta_e1:114,pydata:[86,92],load_nameiii_trajectori:110,effort:63,elsewher:90,friendli:61,send:6,carefulli:[63,55],palette_path:108,module_fil:72,level_10_or_16_fp_6:27,aris:[47,80],sent:8,datafil:87,whichev:90,result2:16,electron:[46,96,124,44],volum:1,relev:[23,24,90,16,103,9,121,84,28,80,55,33,58,109,87,62],tri:4,magic:[51,79],is_compat:[16,24,87,47,61],button:[35,100],geograph:[63,122,28,50],fewer:[96,54,47],"try":[47,64,39,82,84,40,95,97],brewer_purples_09:76,fileextens:[51,113],nlev:87,grouped_coord:54,nanmask:[119,54],arang:[113,114,52,2,80,87,96],dimes:23,pleas:[35,6,71,16,64,9,88,103,69,76,87,109,97,63],impli:[76,24,58,8,126],smaller:75,fortun:97,cfg:[108,122,92],stereograph:4,product_common:83,y_bound:67,blanklin:87,video:79,orog:23,cf_attrs_reset:1,earth:[50,37,115,28,58,42],download:[112,92],header_valu:113,hspace:94,percentil:54,append:[113,71,107,16,9,67,33,57,87],compat:[24,16,92,118,27,87,63],index:[111,24,52,7,16,38,67,93,82,84,27,54,3,92,76,77,87,88,110],prolept:2,compar:[111,24,107,16,119,48,18,53,84,1,27,2,80,77,54,58,126,87,62],run_act:84,name_pad:16,ens_memb:94,twini:30,find:[35,6,26,8,84,16,64,18,19,56,27,2,100,33,63],cell:[38,1,119,34,5,47,37,74,9,80,76,78,42,111,16,83,118,84,20,24,27,54,28,55,56,57,58,96,97,62,63],test_no_ellipsoid:109,squeez:[87,55],experiment:[109,9,90],secant:50,bdy:[93,80,121],bdx:[93,80,121],deduc:37,led:56,equalis:47,climatolog
:[111,1,122,94],endian:107,ensemble_memb:[54,9,94],closur:75,plt2_ax:114,cube_delta:7,let:[111,105,47,97,119,100,77,126,22],sinh:23,linear1dextrapol:118,indexerror:16,becom:[107,119,80,76,77,87],implicit:63,facecolor:38,convert:[39,67,2,36,73,86,9,127,75,80,77,113,15,16,84,121,24,85,125,97,27,54,28,55,56,57,58,60,61],survei:63,weekdai:11,larger:[56,75,55],fetch:[126,95,100],test_pp_preserv:109,select_data_path:122,earli:9,typic:[80,16,57,67,83,2,11,51,33,54,42],pole_longitud:63,add_load_rul:58,control:[90,72,100,9,76,119,75,107,80,77,56,57,58,22],encode_d:2,forecast_tim:[88,7],column_dependent_constants_shap:93,revok:57,spell_point_count:18,approxim:[34,119],foundat:[79,8,63],inequ:87,apt:[112,92],expect:[123,47,26,9,76,27,119,77,56,57,96,109,97],volt:2,"boolean":[24,75,16,90,67,44,1,40,54,2,87,60,121],is_:60,rotated_air_temp:119,load_nameiii_timeseri:110,add_cell_method:16,cloud:0,encapsul:76,feb:69,zip:[111,119],commun:[79,63],doubl:[40,2,97],crown:8,middai:9,next:[113,24,90,125,39,82,101,68,80,55,96],my_stash:67,ungroupable_and_dimens:54,ut_ascii:2,cube_iter:13,jonathan:36,crai:62,remaind:76,linewidth:[52,124,114,30],celsiu:[16,84,97,77,61],src:63,prog:0,xytext:124,mismatch:47,reference_tim:83,name_load:15,guess_coord_axi:[16,87],insensit:16,c_float:2,iii:[113,110],greatli:75,account:100,alik:61,retriev:97,scalabl:76,sine_ifunc:29,alia:[24,36,16,38,67,84,3,100,110],maxsplit:84,when:[29,86,7,67,1,100,33,94,36,6,47,107,37,108,74,9,10,75,80,76,109,42,113,16,84,119,87,121,71,24,96,26,90,51,61,27,54,28,55,56,57,58,122,97,63],annot:124,lbfc:[57,93,80],anom_norm:34,obvious:34,thin:[76,89],cube_al:27,model_level:80,meet:[9,18,103],scatter:[74,58,38,30],transverse_merc:50,quickstart:106,tar:69,plate:119,process:[29,9,67,103,33,47,107,39,22,113,16,121,83,84,119,87,88,71,24,92,93,54,56,57,60,97,63],guess_bound:[111,105,24,115,113,76,119,56,42],iristest:109,pole_lat:28,load_name_iii:113,high:[74,92,119,76,109,88],tab:84,gca:[105,70,114,30,81,113,18,22,115,76,46,34,94,58,63],
print_funct:76,surfac:18,palett:[108,9],sit:118,six:90,gcm:63,file_el:[51,113],copyright:104,png_spec:51,get_data_path:[108,90],"0x89504e470d0a1a0a":51,instead:[23,111,24,6,71,43,16,126,84,119,80,77,33,57,87,63,60,61,89],sin:[57,29,52,63],stock:63,overridden:[108,92],singular:113,regulargridinterpol:56,arakawac:121,"956fbab":36,has_bound:24,attent:100,nose2:92,brewer_cit:[76,63],"10m":55,physic:[37,119,92,42],winter:62,cloud_cov:0,essenti:[77,80],not_equ:54,overplot:124,coord_comparison:54,filepath:[84,87,78],correspond:[29,0,2,107,37,80,109,42,43,16,118,119,87,88,23,85,90,13,93,127,54,28,58,61,63],element:[23,24,96,47,93,50,16,38,29,84,94,56,54,28,80,51,33,87,5,13,113],issu:[100,9,82,1,103,42,89],osgb:[58,63,50],x_slice:96,allow:[29,1,2,35,47,37,9,119,87,50,90,93,61,27,54,55,56,57,58,60,53,63],gom15:122,lbrsvd1:93,pole_lon:28,fastest:16,elif:113,least:[23,90,18,84,54,40],markeredgewidth:124,spell_length:18,popitem:[43,1],meter:[37,58,29,2,47],whilst:[37,56,57],notabl:[53,62],comma:113,degree_:101,load_http:33,earlier:[47,82],perfect:[63,50],outer:36,chosen:[16,34,74,119,67],whether:[29,38,67,1,2,47,107,86,9,75,77,16,40,87,121,71,24,51,27,54,55,57,61],abffield:117,outcom:55,total:[124,44,119,28,46,61,42],criterion:[1,54,24],depth_valu:30,therefor:[7,37,16,39,54,119,97],ut_iso_8859_1:2,eastward:7,is_monoton:24,crash:79,pure:27,handl:[67,2,100,33,34,71,107,9,109,42,14,113,116,84,89,122,51,97,54,28,55,56,57,58,96,53,62],auto:55,spell:18,handi:113,pseudocolor:[74,34,38],mention:[37,40],my_rotated_u_wind_cub:55,dap:107,slices_ov:[16,55],cs_data1:124,byteord:16,warranti:[76,8],somewher:6,cs_cube:29,trac:92,anyth:[6,84,27,119,100,97],edit:36,wdiff:36,sub_cub:[16,58],justifi:84,vertical_mean:111,februari:[57,9,47,107],"_cube_coord_common":[23,16,24],mode:[6,16,38,93,118,54,76,119],runlendecod:21,disregard:119,conjuct:54,i_cmpt_curl_cub:7,integer_constants_shap:93,subset:[111,16,108,9,82,94,1,27,103,76,77,53,104],netcdf3_class:[33,9,107],chunk:107,load_nameii_timeseri:110,co
nsum:57,meta:[56,77,75],"static":[43,24,75,67],our:[47,50,37,97,19,84,119,95,126,77,34,94,22,71],meth:[34,18],promote_aux_coord_to_dim_coord:[87,55],special:[94,27,54,80,76,87],out:[24,6,88,16,52,93,28,118,20,21,34,87,42,89],variabl:[23,96,24,122,47,61,30,16,17,107,9,97,76,1,27,55,56,57,58,5,22],x_val:16,inverse_flatten:50,basemap:63,contigu:[115,24,5,41,107],influenc:40,collapsed_cub:[57,54],spline:54,categori:[43,109,2,11],reflog:100,rel:[24,37,52,36,28,100,87,61,63],auxcoordfactori:[23,16,61,84,55],merg:104,ref:[105,1,100,12],air_temp_and_fp_6:96,math:54,scalar_cell_method:84,geostationari:[56,50],bwr:34,insid:[118,103],brewer_orrd_09:76,subgrid:121,manipul:[16,92,2,76,77,97],is_dimensionless:[57,2,60],dictionari:[23,24,47,0,107,37,16,9,113,10,43,1,54,72,87,60,110],releas:[113,90,67,93,92,55,56,57,58,122,60,61,62,63],sine_cub:29,queri:[56,58],orign:87,nimrodfield:[73,125],indent:40,wgts24:52,set_xlabel:30,unwant:[76,100],could:[37,119,68,27,54,55,77,87,96,60,97,63],membership:[62,11],timer:87,my_sav:[33,9],counterpart:74,stride:87,unhandl:54,roll_i:118,outsid:[80,5,119,28,42],y_upper:124,geometri:54,retain:[57,107,91,28,55],south:30,lbrow:[93,21,80],ndim:[24,107,16,18,87,97],softwar:[1,92,8],similar_cub:56,suffix:[84,87,27],parabl:79,lbuser5:93,regular_point:58,max_relative_error:87,"20correl":65,qualiti:76,stashcod:[58,80],cfvariablemixin:[23,16,24],"68f6752":36,lbsec:[93,80],dim_map:87,viewitem:43,lib:[100,109,80,89],paramid:63,facil:[27,63],redund:63,utc:2,prioriti:[51,113],src_cube:[87,5],strict:[16,24,87,82,84],canonical_unit:72,licens:104,perfectli:53,symlognorm:34,wrapper:[107,76,1,2,56,75,13],attach:[37,80],istitl:84,lbdatd:[93,80],orography_at_point:38,licenc:[76,92,8],man:100,cosine_latitude_weight:[58,28],export_geotiff:[122,109,41],shell:9,cmattribut:84,k_cube:7,travi:103,juli:27,shallow:43,some_paramet:16,geotiff:41,accompani:[109,90],rst:[40,12],exactli:[24,47,37,16,9,27,20,80,76,87,96],succinctli:100,m01s00i024:[111,16],tangl:100,aka:80,docstrings_
attribut:40,mappabl:56,column_head:113,structur:[78,86,92,82,109,88],charact:[84,87,27,40,113],dai:[111,47,37,80,9,53,27,2,11,56,57,97,62],test_data:108,xlabel:[76,46,52,105,94],sens:119,sensit:9,bare:33,amount:[77,96,47,97],exhibit:46,colourbar:[34,114],cache_fil:57,nino_cub:94,clip:[87,28],julian:2,corran:36,exner_pressur:39,reserv:[56,60],close:[107,93,126,119,100,76,63],biggu:[16,54,92,107],border:124,ylim:52,optimis:[56,122],background_process_id:83,automat:[113,70,99,30,72,74,9,67,126,76,27,40,107,77,56,57,60,97,63],regridded_cub:[56,122,55],min:[23,111,24,124,94,54,75,13],diverg:34,bhrlev:[93,80],read_head:110,mid:7,longitude_of_central_meridian:[58,50],curvilinear:[57,5],mix:[16,58,2],especi:[58,60],discret:[37,63],which:[29,64,65,53,9,100,67,68,1,2,102,103,33,34,105,36,47,107,37,38,39,7,121,40,80,76,77,12,13,14,111,125,113,90,114,43,16,17,48,18,44,118,84,119,83,87,88,71,23,24,96,124,26,50,51,109,52,91,92,93,94,110,27,54,28,55,56,57,58,122,97,62,63],constraintmismatcherror:[27,26],uppercas:84,analysi:[105,115,52,9,92,18,94,34,78,22],monotonic_statu:87,a7ff2e5:36,unless:[8,84,107,9,76,80,33,97],add_spec:[51,113,15],clash:56,preliminari:100,mimick:38,my_cube_list:[33,9],relal:94,versa:84,"1e11":96,"1e10":113,discov:6,rh_inclus:87,"_fillvalu":55,ugrid:[123,92],intersection_of_cub:29,new_column:119,why:[6,47,64,103,100,56],enthought:92,homogen:75,afresh:[67,71],ensemble_coord:97,url:[107,17,69,95,100,33],x_upper:124,request:[126,36,59,100],uri:[33,51,87,15,9],inde:76,realiz:[16,94,47,80,97],"400m":119,determin:[111,24,107,75,16,57,9,67,94,76,1,2,28,55,33,54,87,13,61,62,63],constrain:[105,9,82],factory_class:84,fact:[27,80],tear:100,flight_level:[113,119],esmf:[122,91],verbos:[87,27,97],box_x_point:124,bring:100,fakemodul:89,add_season_membership:[57,11],varnam:56,exp_coord:114,maskedconst:100,locat:[105,113,47,90,37,17,39,92,101,94,27,28,69,109],nois:52,oceansfactori:23,correl:65,should:[29,65,100,67,2,103,33,35,36,8,38,9,40,77,41,109,113,114,16,44,19,119,87,88,
23,24,124,26,90,51,126,27,54,56,58,122,60,61,63],jan:[80,11],suppos:[111,37,27,119,100,97],disciplin:43,is_unknown:[57,2,60],local:100,hope:[64,8],"721fc64":100,ancillary_vari:1,move:32,contribut:[57,5,54,92,63],pull:[36,49,6,126,100,59,79],column_dependent_const:121,regularli:[24,41,58,88],piecewis:87,cube_a:[87,65,44],increas:[56,47,30],sample_routin:40,column_slices_gener:87,lossless:[76,107],percentileaggreg:54,"485_vol_i_en_colour":0,organ:100,upper:[84,113,7,105],name_or_coord:[16,24,87],dpf:0,standard_nam:[23,24,43,16,87,113,84,27,54,3,80,58,122,97,62],stuff:[19,126,100],grib_phenom_transl:[15,71],partit:84,contain:[0,9,1,2,103,33,34,36,47,37,72,39,10,40,11,42,110,80,16,121,44,84,119,21,87,88,71,24,51,93,54,28,56,57,122,60,97,62,63],step_nam:87,dpi:94,view:[23,43,87,93,55,56,57,58,122,60,61,62,63],conform:[50,37,108,88,58,17],grid_lon:28,reykjavik:124,signatur:[74,9,56,54,33,57,61],smooth:57,frame:13,cf_variabl:1,equator_slic:96,name_of_refer:12,timer_nam:87,imread:51,statist:104,stack:47,modulu:[16,24,2],closer:47,lon_coord:[16,113],statu:[36,6,107,118,28,100,123,79],error:[24,47,26,107,16,103,118,84,54,55,56,119,58,122,88],ident:[48,47,26,16,80,56,77,88],correctli:[100,92,62,55,56,57,58,60,87,42],add_custom_season_year:[57,62],ieee:[58,122,121],boundari:[24,38,84,1,55,57],below:[24,47,86,119,92,40,69,76,46,87,122],tend:92,state:[8,90,9,93,40,100,76,109],quickest:[37,6],tslice:44,post_process:54,progress:[113,6,100],as_cartopy_cr:[124,50],email:8,anom_cmap:34,load_raw:[9,47,80,63],verifi:84,terhorst:36,kei:[35,24,47,107,43,16,83,1,54,56,119,60,71],formatag:51,onelin:[126,100],gribbability_check:83,hpa:[16,119,39,80,77],itertool:114,generate_std_nam:72,evaluates_tru:84,overlay_filepath:124,addit:[23,111,24,90,16,91,92,118,56,27,40,77,33,54,87,109,61,62,63],float64:2,admin:100,equal:[24,30,43,16,87,36,119,54,57,34,96,58],len:[52,54],april:[56,111,53],eta:23,equat:[77,96],freeli:76,minumum:34,regional_ash:119,strftime:[113,114,94],unrot:[28,80,55],comment:[24,8
0],unimpl:37,backgroundcolor:124,add_save_rul:67,global_psl:119,lbdayd:[93,80],numpyarrayadapt:16,solv:97,respect:[111,7,37,27,54,80,76,2,97],gribapi:83,update_global_attribut:107,rjust:84,creat:106,addition:[37,40,97],purpl:76,libdir:92,compos:87,defint:2,crs_data1:124,tutor:76,treat:[80,44],ungroup:54,magicnumb:[51,113],popul:[16,113,61,67],ifunc:[29,55],presenc:[47,30],infil:125,royer:[105,114],deliber:95,decim:[58,80,107],togeth:[111,47,124,27,54,97],coastlin:[70,124,114,81,18,94,115,76,46,34,78,58,63],funcanim:13,present:[24,43,16,108,38,39,67,84,3,80,57,110,28],testdtypeandvalu:109,level_height:[37,96,27,111,80],level_10_or_12_fp_6:27,align:[57,50],rectangular:119,year_temperatur:34,defin:[29,74,7,53,39,1,2,34,5,71,108,38,9,10,11,81,76,77,13,113,80,114,43,16,18,117,118,84,119,87,121,23,24,124,50,91,93,94,27,54,56,58,122,60,97,62,63],generating_process_id:83,new_lon_valu:54,matching_rul:84,howev:[9,119,27,54,28,80,100,56,77,34,96,88,63],layer:[96,89],datazoo:33,row_dependent_constants_start:93,helper:[54,47],almost:[87,47,80,55],"__import__":89,customis:[51,87],full_slic:87,archiv:[21,93,69],rfind:84,substanti:88,lightweight:12,opendap:17,incom:9,revis:100,tempor:37,greater:[113,87,54,28,80],uniti:57,cubelist:[71,99,107,16,9,67,27,33,88],welcom:60,contour_level:94,began:60,x_end:124,cross:12,access:[23,35,6,47,16,108,69,67,93,82,1,21,51,126],member:[94,54,47,97],python:[113,107,72,103,9,92,53,27,2,69,76,87,96,97,12],apply_ufunc:[29,55],inch:38,failur:[56,16,47,84],netcdf_promot:9,infer:[37,87,92],coord_categoris:[34,9],builtin:87,phi:7,http:[29,0,69,1,100,33,34,35,65,8,107,86,80,76,41,112,17,12,51,92,122,53],lbegin:[67,93],denot:80,lbrsvd4:93,expans:[87,27],interrog:[51,97],temperature_color:30,upon:[119,63],extract_nearest_neighbour:[118,58],lbrsvd2:93,lbrsvd3:93,sooner:[51,113],symmetricnorm:75,replace_coord:16,arrayprint:87,php:92,expand:[79,40,21,84,63],audit:57,fromkei:43,off:[76,113,126],center:84,nevertheless:47,logorithm:2,well:[36,84,100,19,76,27,119,8
0,55,33,57,58,97],polyv:101,cell_method:[16,54,97,80,84],regular_bound:58,titlecas:84,level_dependent_constants_start:93,exampl:104,y_val:16,achiev:[115,100,27,119,77,76,57,96,12],coord_valu:9,interpol:[54,28,104],netcdfdataproxi:107,usual:[29,27,54,80,100],distanc:[2,28],paus:35,librari:[122,107,9,92,21,69,41],less:[51,52,119,67,54,28,77,57,34,122,97],y_coord:[124,83],percent:54,additon:92,text:[36,124,38,39,113,126,40,12,80,100,76,14,63],obtain:[76,16,54,28,92],lazy_func:54,front:84,semi_major_axi:[56,50],integer_const:[93,55],as_cub:[84,86],simultan:114,superset:54,air_temperatur:[39,47,37,16,9,118,27,119,80,77,87,60,63],new_coord_map:23,field:[65,53,67,3,33,34,47,73,38,9,121,127,80,110,113,114,16,117,84,21,88,71,24,125,92,93,94,27,55,56,57,58,122,97,62],common:[111,105,36,99,107,16,9,82,44,119,27,2,11,33,54,87,79],data_set_format_vers:93,um_tool:92,script:[76,9,63],add:[29,74,80,67,101,69,33,34,126,100,105,36,6,47,64,9,75,11,115,76,77,109,79,113,38,114,16,117,84,20,87,71,24,124,90,51,94,95,55,56,97,62,63],valid:[23,24,74,26,37,16,108,38,119,2,28,80,54,96,13],circular:[24,7,16,94,55,56,87,122],densiti:[96,44],bound_mod:38,lookup:[43,93,80],logger:108,rectilinear:5,match:[113,24,96,75,16,55,9,29,84,44,56,1,27,54,107,51,33,57,87,94],int32:[86,97,94],candid:[57,122],auto_palett:75,"11ee694744f2552d":100,obs_file_typ:93,v_prime:55,phenom:39,assert:36,pixelisarea:41,name_output:[113,119],lbhem:93,dateformatt:22,lbtim:[93,80,67,62,55],height:[23,96,114,43,16,77,68,37,119,80,55,56,57,58,122,62],nimrod:15,python2:76,loss:122,resid:109,num2dat:[113,2,114,94],success:[56,24,13,87],pseudocolour:[34,74,124],realist:42,ostia:22,test_:109,unevenli:[24,119,80],stagger:[57,122],lbcode:[93,67,80],necessari:[37,119,28,76,87,96,63],have:[29,65,100,1,2,103,96,34,5,35,105,6,47,8,37,86,9,88,40,80,76,77,41,109,79,42,111,112,113,94,16,48,18,44,118,84,119,126,87,22,7,24,49,124,26,91,92,93,61,27,54,28,95,55,56,57,58,122,60,97,62,63],field_head:113,edu:[1,122,92],doiorurl:69,page:[35,112,1
9,40,100,12],pp_pack:15,exceed:[54,18],tran:[105,114],y_lower:124,captur:58,suppli:[23,29,16,74,118,54,119,5],phenomena:39,pariti:63,row_dependent_const:121,"export":[41,60,62],superclass:[16,24,40],oceansg2factori:23,guarante:[87,47],halifax:124,cfconvent:80,model_vers:93,tmp:[16,27,100],cube1_project:63,lead:[113,47,90,9,84,57,87],leak:100,avoid:[47,90,119,100,56,96,88],grib_message_id:83,overlap:[16,39,84,54,55,119,5,42],disk:[107,55],leap:[2,11],leav:[87,6],add_cub:84,overlai:[115,124],lbtyp:93,format_arrai:87,symbol:[38,9],encode_clock:2,encourag:[92,8,63],investig:[97,95],journal:[76,52,63],level_dependent_constants_shap:93,usag:[29,7,16,86,9,27,54,87,63],pot_temp_cub:27,facilit:[76,27],demote_dim_coord_to_aux_coord:[87,55],host:19,degener:57,obei:[33,9],although:[113,119],offset:[113,71,124,50,51,2],add_custom_season_numb:[57,62],microsecond:[57,53],simpler:9,icoord_system:113,about:[113,6,37,1,100,109,97],add_month_fullnam:[122,11],actual:[36,91,80,100,56,97],polynomi:4,column:[113,86,76,27,20,21,56,119,87],freedom:54,label:[113,105,36,74,6,124,114,30,58,52,87,70,101,1,20,76,34,22,12],annual_seasonal_mean:111,my_big_func:87,u_cub:[29,28,55],bridg:76,rule:[15,71,73,9,67,127,85,79,108],subcub:[16,44],constructor:[23,119,40],discard:100,original_cub:56,new_lat_valu:54,referencetarget:84,metr:[96,50],own:[40,100],"final":[111,6,16,39,67,119,20,100,76,77,122,88],air_pressur:[77,39,80,115],nearest_neighbour_index:[56,24],explicit:[107,63],hdfgroup:107,sea_water_potential_temperatur:[84,30],dataset:[105,124,37,17,94,27,56,58,78,97],assess:[105,114],weather:52,calculu:54,unit_convert:[57,61],cellmethod:[16,24,80],u_data:29,leverag:77,the_11th_hour:27,"6c11":92,van:[105,114],val:113,stock_img:[46,113,115],rotat:[124,4],lh_inclus:87,lowermost:24,spread:97,trigger:[16,87,38,107],fields_of_constants_shap:93,arg1:40,"var":54,log10:29,merge_cub:16,"function":59,expand_filespec:33,north:[105,58,124,18,50],unexpect:[56,24,47],advers:[28,63],keyerror:[43,26],gaussian:[57,58]
,lossi:107,neutral:34,rle_decod:21,gain:[58,47,107],spuriou:113,tick_param:114,sphinx:[90,12],eas:27,highest:[118,84],aux_coords_and_dim:[16,84],sudo:112,count:[113,24,16,38,67,18,84,54,3,110],also:[29,74,7,53,67,1,2,100,33,34,5,105,36,71,107,37,72,38,9,80,76,41,14,94,114,43,16,18,118,84,119,86,87,88,89,24,124,51,91,93,61,27,54,28,95,57,58,96,97,62,63],made:[23,36,6,37,16,93,126,54,80,100,56,57,63],cfdatavari:1,veget:60,fieldsfilevari:[93,55],seealso:105,scientif:[17,92],writeabl:56,displai:[92,34,61,124,14],longitude_of_projection_origin:[58,50],record:[80,63],point_cel:62,limit:[111,105,124,7,30,38,39,94,76,27,119,56,58,78,53],single_cub:16,otherwis:[23,24,43,16,9,84,119,27,2,11,54,87,88],problem:[47,100,9,103,56,77,97,63],trail:[84,113,12],significantli:107,get_dir_opt:108,reciproc:2,data_func:29,evalu:[56,16,27,92,84],gcf:[13,94],dure:[36,72,67,80,100,77],my_cub:[33,9],meaningless:87,cooordin:80,set_cmap:63,air_press:16,cube_list:[56,16],global_air_temp:119,surface_temperatur:[111,47,97,16,94,118,22],guidanc:[76,77,60,63],source_fil:57,air_pressure_at_sea_level:97,detail:[40,8,90],virtual:[23,56,92,63],other:[40,12],bool:[84,58,107],futur:[24,71,90,107,9,82,27,60,63],branch:36,varieti:[54,80],shape_of_the_earth:83,is_tim:2,field_generator_kwarg:84,stat:[36,54],repeat:[35,16,28,76,57,87,88],star:27,potenti:[23,30,37,39,27,119,77,57,122],season_numb:11,new_cub:[111,16,78,60,63],june:122,"100m":13,ndarrai:[23,29,16,67,2,21,40,87,5,54,121,12],calul:50,maximum_scale_level:34,last_fieldop_typ:93,cfmeasurevari:1,time_refer:[57,60],colors_api:34,hybridheightfactori:[23,80],debian:112,return_direct:87,an_example_pypi_project:12,eof:28,coordin:[29,74,65,30,9,67,68,1,34,5,7,107,38,39,10,11,77,13,42,111,113,114,16,82,83,44,118,84,87,121,23,24,124,26,50,78,91,94,54,28],bce:2,reliabl:107,endgam:121,global_pp:63,portion:30,auxiliari:[23,111,24,47,37,16,1,27,28,80,56,87,96],hindcast_workaround:[61,71],interpolation_schem:56,decemb:11,uninterpol:118,model_level_numb:[111,39,37,1
6,9,68,27,119,80,55,76,96,63]},objtypes:{"0":"py:module","1":"py:function","2":"py:attribute","3":"py:method","4":"py:class","5":"py:data","6":"py:classmethod","7":"py:staticmethod"},objnames:{"0":["py","module","Python module"],"1":["py","function","Python function"],"2":["py","attribute","Python attribute"],"3":["py","method","Python method"],"4":["py","class","Python class"],"5":["py","data","Python data"],"6":["py","classmethod","Python class method"],"7":["py","staticmethod","Python static method"]},filenames:["iris/iris/symbols","iris/iris/fileformats/cf","iris/iris/unit","iris/iris/fileformats/um_cf_map","examples/General/index","iris/iris/experimental/regrid","developers_guide/gitwash/patching","iris/iris/analysis/calculus","copyright","iris/iris","iris/iris/analysis/trajectory","iris/iris/coord_categorisation","developers_guide/documenting/rest_guide","iris/iris/experimental/animate","iris/iris/fileformats/dot","iris/iris/fileformats","iris/iris/cube","iris/iris/experimental/ugrid","examples/General/custom_aggregation","developers_guide/gitwash/git_intro","examples/General/lineplot_with_legend","iris/iris/fileformats/pp_packing","examples/Meteorology/hovmoller","iris/iris/aux_factory","iris/iris/coords","iris/iris/fileformats/um","iris/iris/exceptions","userguide/loading_iris_cubes","iris/iris/analysis/cartography","iris/iris/analysis/maths","examples/Oceanography/atlantic_profiles","whatsnew/index","developers_guide/gitwash/index","iris/iris/io","examples/General/anomaly_log_colouring","developers_guide/gitwash/forking_hell","developers_guide/gitwash/configure_git","userguide/iris_cubes","iris/iris/plot","examples/Meteorology/deriving_phenomena","developers_guide/documenting/docstrings","iris/iris/experimental/raster","iris/iris/analysis/geometry","iris/iris/fileformats/grib/grib_phenom_translation","iris/iris/iterate","examples/index","examples/Meteorology/TEC","userguide/merge_and_concat","iris/iris/experimental/equalise_cubes","developers_guide/gitwash/
following_latest","iris/iris/coord_systems","iris/iris/io/format_picker","examples/General/SOI_filtering","iris/iris/time","iris/iris/analysis","whatsnew/1.8","whatsnew/1.7","whatsnew/1.6","whatsnew/1.5","developers_guide/index","whatsnew/1.3","whatsnew/1.2","whatsnew/1.1","whatsnew/1.0","userguide/end_of_userguide","iris/iris/analysis/stats","developers_guide/gitwash/git_development","iris/iris/fileformats/pp","examples/General/cross_section","userguide/citation","examples/General/global_map","iris/iris/fileformats/grib","iris/iris/std_names","iris/iris/fileformats/nimrod_load_rules","iris/iris/quickplot","iris/iris/palette","userguide/plotting_a_cube","userguide/cube_maths","examples/General/orca_projection","developers_guide/gitwash/git_resources","whitepapers/um_files_loading","examples/General/polar_stereo","userguide/index","iris/iris/fileformats/grib/grib_save_rules","iris/iris/fileformats/rules","iris/iris/fileformats/grib/load_rules","iris/iris/pandas","iris/iris/util","iris/iris/experimental/fieldsfile","iris/iris/proxy","developers_guide/deprecations","iris/iris/experimental/regrid_conservative","installing","iris/iris/experimental/um","examples/Meteorology/lagged_ensemble","developers_guide/gitwash/set_up_fork","userguide/subsetting_a_cube","userguide/navigating_a_cube","examples/Meteorology/index","iris/iris/experimental/concatenate","developers_guide/gitwash/development_workflow","examples/General/polynomial_fit","iris/iris/fileformats/name","developers_guide/pulls","contents","examples/Meteorology/COP_1d_plot","developers_guide/documenting/index","iris/iris/fileformats/netcdf","iris/iris/config","developers_guide/tests","iris/iris/fileformats/name_loaders","userguide/cube_statistics","developers_guide/gitwash/git_install","examples/General/custom_file_loading","examples/Meteorology/COP_maps","examples/General/rotated_pole_mapping","whitepapers/index","iris/iris/fileformats/abf","iris/iris/analysis/interpolate","userguide/interpolation_and_regridding",
"examples/Oceanography/index","iris/iris/fileformats/ff","whatsnew/1.4","iris/iris/experimental","examples/General/projections_and_annotations","iris/iris/fileformats/nimrod","developers_guide/gitwash/maintainer_workflow","iris/iris/fileformats/pp_rules"],titles:["iris.symbols","iris.fileformats.cf","iris.unit","iris.fileformats.um_cf_map","General visualisation examples","iris.experimental.regrid","Making a patch","iris.analysis.calculus","Iris copyright, licensing and contributors","Iris reference documentation","iris.analysis.trajectory","iris.coord_categorisation","reST quickstart","iris.experimental.animate","iris.fileformats.dot","iris.fileformats","iris.cube","iris.experimental.ugrid","Calculating a custom statistic","Introduction","Multi-line temperature profile plot","iris.fileformats.pp_packing","Hovmoller diagram of monthly surface temperature","iris.aux_factory","iris.coords","iris.fileformats.um","iris.exceptions","2. Loading Iris cubes","iris.analysis.cartography","iris.analysis.maths","Oceanographic profiles and T-S diagrams","What’s new in Iris","Working with iris source code","iris.io","Colouring anomaly data with logarithmic scaling","Making your own copy (fork) of iris","Configure git","1. Introduction","iris.plot","Deriving Exner Pressure and Air Temperature","Docstrings","iris.experimental.raster","iris.analysis.geometry","iris.fileformats.grib.grib_phenom_translation","iris.iterate","Iris examples","Ionosphere space weather","7. Merge and Concatenate","iris.experimental.equalise_cubes","Following the latest source","iris.coord_systems","iris.io.format_picker","Applying a filter to a time-series","iris.time","iris.analysis","What’s new in Iris 1.8","What’s new in Iris 1.7","What’s new in Iris 1.6","What’s new in Iris 1.5","Iris developer guide","What’s new in Iris 1.3","What’s new in Iris 1.2","What’s new in Iris 1.1","What’s new in Iris 1.0","11. 
End of the user guide","iris.analysis.stats","Git for development","iris.fileformats.pp","Cross section plots","10. Citing Iris","Quickplot of a 2d cube on a map","iris.fileformats.grib","iris.std_names","iris.fileformats.nimrod_load_rules","iris.quickplot","iris.palette","5. Plotting a cube","9. Basic cube mathematics","Tri-Polar Grid Projected Plotting","git resources","1. Iris handling of PP and Fieldsfiles","Example of a polar stereographic plot","Iris user guide","iris.fileformats.grib.grib_save_rules","iris.fileformats.rules","iris.fileformats.grib.load_rules","iris.pandas","iris.util","iris.experimental.fieldsfile","iris.proxy","Deprecations","iris.experimental.regrid_conservative","Installing Iris","iris.experimental.um","Seasonal ensemble model plots","Set up your fork","4. Subsetting a Cube","3. Navigating a cube","Meteorology visualisation examples","iris.experimental.concatenate","Development workflow","Fitting a polynomial","iris.fileformats.name","Pull Request Check List","Iris documentation table of contents","Global average annual temperature plot","Documentation in Iris","iris.fileformats.netcdf","iris.config","Testing","iris.fileformats.name_loaders","8. Cube statistics","Install git","Loading a cube from a custom file format","Global average annual temperature maps","Rotated pole mapping","Iris technical ‘Whitepapers’","iris.fileformats.abf","iris.analysis.interpolate","6. 
Cube interpolation and regridding","Oceanography visualisation examples","iris.fileformats.ff","What’s new in Iris 1.4","iris.experimental","Plotting in different projections","iris.fileformats.nimrod","Maintainer workflow","iris.fileformats.pp_rules"],objects:{"":{iris:[9,0,0,"-"]},"iris.fileformats.rules.RuleResult":{count:[84,3,1,""],index:[84,3,1,""],cube:[84,2,1,""],factories:[84,2,1,""],matching_rules:[84,2,1,""]},"iris.fileformats.cf.CFCoordinateVariable":{spans:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],has_formula_terms:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.exceptions.DuplicateDataError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.fileformats.rules.ProcedureRule":{evaluates_true:[84,3,1,""],run_actions:[84,3,1,""],conditional_warning:[84,3,1,""]},"iris.cube.CubeMetadata":{count:[16,3,1,""],index:[16,3,1,""],name:[16,3,1,""],var_name:[16,2,1,""],long_name:[16,2,1,""],standard_name:[16,2,1,""],cell_methods:[16,2,1,""],units:[16,2,1,""],attributes:[16,2,1,""]},"iris.exceptions.InvalidCubeError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.analysis.geometry":{geometry_area_weights:[42,1,1,""]},"iris.coord_systems.Stereographic":{true_scale_lat:[50,2,1,""],xml_element:[50,3,1,""],ellipsoid:[50,2,1,""],central_lon:[50,2,1,""],as_cartopy_crs:[50,3,1,""],false_northing:[50,2,1,""],grid_mapping_name:[50,2,1,""],as_cartopy_projection:[50,3,1,""],false_easting:[50,2,1,""],central_lat:[50,2,1,""]},"iris.Constraint":{extract:[9,3,1,""]},"iris.fileformats":{pp_rules:[127,0,0,"-"],name_loaders:[110,0,0,"-"],pp:[67,0,0,"-"],FORMAT_AGENT:[15,5,1,""],name:[102,0,0,"-"],abf:[117,0,0,"-"],rules:[84,0,0,"-"],pp_packing:[21,0,0,"-"],cf:[1,0,0,"-"],um:[25,0,0,"-"],grib:[71,0,0,"-"],netcdf:[107,0,0,"-"],ff:[121,0,0,"-"],nimrod_load_rules:[73,0,0,"-"],um_cf_map:[3,0,0,"-"],nimrod:[125,0,0,"-"],dot:[14,0,0,"-"]},"iris.fileformats.dot":{sav
e:[14,1,1,""],cube_text:[14,1,1,""],save_png:[14,1,1,""]},"iris.fileformats.rules.RulesContainer":{verify:[84,3,1,""],import_rules:[84,3,1,""]},"iris.experimental":{concatenate:[99,0,0,"-"],raster:[41,0,0,"-"],regrid_conservative:[91,0,0,"-"],um:[93,0,0,"-"],regrid:[5,0,0,"-"],fieldsfile:[88,0,0,"-"],ugrid:[17,0,0,"-"],equalise_cubes:[48,0,0,"-"],animate:[13,0,0,"-"]},"iris.exceptions.CoordinateMultiDimError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.aux_factory.AuxCoordFactory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],xml_element:[23,3,1,""],coord_system:[23,2,1,""],update:[23,3,1,""],long_name:[23,2,1,""],standard_name:[23,2,1,""],var_name:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],dependencies:[23,2,1,""],name:[23,3,1,""]},"iris.fileformats.netcdf":{NetCDFDataProxy:[107,4,1,""],Saver:[107,4,1,""],save:[107,1,1,""],load_cubes:[107,1,1,""],CFNameCoordMap:[107,4,1,""]},"iris.fileformats.ff.ENDGame":{regular_x:[121,3,1,""],regular_y:[121,3,1,""],vectors:[121,3,1,""]},"iris.fileformats.pp_packing":{wgdos_unpack:[21,5,1,""],rle_decode:[21,5,1,""]},"iris.time":{PartialDateTime:[53,4,1,""]},"iris.coord_systems.LambertConformal":{as_cartopy_projection:[50,3,1,""],xml_element:[50,3,1,""],ellipsoid:[50,2,1,""],secant_latitudes:[50,2,1,""],central_lon:[50,2,1,""],as_cartopy_crs:[50,3,1,""],false_northing:[50,2,1,""],grid_mapping_name:[50,2,1,""],false_easting:[50,2,1,""],central_lat:[50,2,1,""]},"iris.config.iris.config":{PALETTE_PATH:[108,5,1,""],IMPORT_LOGGER:[108,5,1,""],RULE_LOG_IGNORE:[108,5,1,""],SAMPLE_DATA_DIR:[108,5,1,""],RULE_LOG_DIR:[108,5,1,""],TEST_DATA_DIR:[108,5,1,""]},"iris.fileformats.nimrod_load_rules":{run:[73,1,1,""]},"iris.analysis.trajectory":{Trajectory:[10,4,1,""],interpolate:[10,1,1,""]},"iris.analysis.WeightedAggregator":{name:[54,3,1,""],aggregate_shape:[54,3,1,""],uses_weighting:[54,3,1,""],lazy_aggregate:[54,3,1,""],post_process:[54,3,1,""],aggregate:[54,3,1,""],update_metadata:[54,3,1,"
"]},"iris.fileformats.grib.GribWrapper":{phenomenon_points:[71,3,1,""],phenomenon_bounds:[71,3,1,""]},"iris.io":{find_saver:[33,1,1,""],load_files:[33,1,1,""],expand_filespecs:[33,1,1,""],format_picker:[51,0,0,"-"],run_callback:[33,1,1,""],load_http:[33,1,1,""],save:[33,1,1,""],add_saver:[33,1,1,""],decode_uri:[33,1,1,""]},"iris.analysis.calculus":{cube_delta:[7,1,1,""],curl:[7,1,1,""],differentiate:[7,1,1,""]},"iris.unit":{encode_time:[2,1,1,""],date2num:[2,1,1,""],decode_time:[2,1,1,""],num2date:[2,1,1,""],encode_date:[2,1,1,""],encode_clock:[2,1,1,""],Unit:[2,4,1,""]},"iris.exceptions.CoordinateNotFoundError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.experimental.ugrid":{ugrid:[17,1,1,""]},"iris.proxy":{FakeModule:[89,4,1,""],apply_proxy:[89,1,1,""]},"iris.aux_factory.OceanSigmaZFactory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],name:[23,3,1,""],update:[23,3,1,""],standard_name:[23,2,1,""],var_name:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],dependencies:[23,2,1,""],xml_element:[23,3,1,""]},"iris.fileformats.cf.CFGroup":{auxiliary_coordinates:[1,2,1,""],formula_terms:[1,2,1,""],ancillary_variables:[1,2,1,""],pop:[1,3,1,""],has_key:[1,3,1,""],data_variables:[1,2,1,""],coordinates:[1,2,1,""],grid_mappings:[1,2,1,""],itervalues:[1,3,1,""],labels:[1,2,1,""],get:[1,3,1,""],keys:[1,3,1,""],global_attributes:[1,2,1,""],update:[1,3,1,""],iteritems:[1,3,1,""],popitem:[1,3,1,""],iterkeys:[1,3,1,""],setdefault:[1,3,1,""],items:[1,3,1,""],clear:[1,3,1,""],bounds:[1,2,1,""],values:[1,3,1,""],cell_measures:[1,2,1,""],promoted:[1,2,1,""],climatology:[1,2,1,""]},"iris.coords.CoordDefn":{count:[24,3,1,""],index:[24,3,1,""],name:[24,3,1,""],coord_system:[24,2,1,""],long_name:[24,2,1,""],standard_name:[24,2,1,""],var_name:[24,2,1,""],units:[24,2,1,""],attributes:[24,2,1,""]},"iris.symbols":{CLOUD_COVER:[0,5,1,""]},"iris.experimental.um.FieldsFileVariant":{UPDATE_MODE:[93,2,1,""],filename:[93,2,1,""],CREATE_MODE:[93,2,1,""],R
EAD_MODE:[93,2,1,""],mode:[93,2,1,""],close:[93,3,1,""]},"iris.analysis.stats":{pearsonr:[65,1,1,""]},"documenting.docstrings_attribute":{ExampleClass:[40,4,1,""]},"iris.experimental.um.Field2":{blev:[93,2,1,""],lbrvc:[93,2,1,""],bacc:[93,2,1,""],lbrel:[93,2,1,""],bdatum:[93,2,1,""],LBNREC_OFFSET:[93,2,1,""],lblev:[93,2,1,""],LBLREC_OFFSET:[93,2,1,""],lbuser2:[93,2,1,""],lbuser3:[93,2,1,""],lbdayd:[93,2,1,""],lbuser1:[93,2,1,""],lbuser6:[93,2,1,""],lbuser7:[93,2,1,""],lbuser4:[93,2,1,""],lbyr:[93,2,1,""],lbyrd:[93,2,1,""],lbcfc:[93,2,1,""],lbpack:[93,2,1,""],lbnpt:[93,2,1,""],brsvd1:[93,2,1,""],bmks:[93,2,1,""],brsvd3:[93,2,1,""],get_data:[93,3,1,""],brsvd4:[93,2,1,""],lbmon:[93,2,1,""],lbvc:[93,2,1,""],bplat:[93,2,1,""],bgor:[93,2,1,""],lbsrce:[93,2,1,""],lbproj:[93,2,1,""],lbmin:[93,2,1,""],lbhr:[93,2,1,""],lbmind:[93,2,1,""],lbproc:[93,2,1,""],LBEGIN_OFFSET:[93,2,1,""],lbdatd:[93,2,1,""],lbexp:[93,2,1,""],bplon:[93,2,1,""],lbegin:[93,2,1,""],lbext:[93,2,1,""],lbrsvd4:[93,2,1,""],lbft:[93,2,1,""],lbmond:[93,2,1,""],brlev:[93,2,1,""],lbrsvd1:[93,2,1,""],lbrsvd2:[93,2,1,""],lbrsvd3:[93,2,1,""],brsvd2:[93,2,1,""],lbhrd:[93,2,1,""],bhlev:[93,2,1,""],set_data:[93,3,1,""],lbdat:[93,2,1,""],lbtim:[93,2,1,""],lbfc:[93,2,1,""],lbday:[93,2,1,""],lbnrec:[93,2,1,""],bmdi:[93,2,1,""],num_values:[93,3,1,""],bdy:[93,2,1,""],LBREL_OFFSET:[93,2,1,""],bhrlev:[93,2,1,""],bzy:[93,2,1,""],bzx:[93,2,1,""],bdx:[93,2,1,""],lblrec:[93,2,1,""],lbcode:[93,2,1,""],lbtyp:[93,2,1,""],lbhem:[93,2,1,""],lbuser5:[93,2,1,""],lbrow:[93,2,1,""]},"iris.experimental.um.Field3":{brsvd2:[93,2,1,""],lbrvc:[93,2,1,""],bacc:[93,2,1,""],lbrel:[93,2,1,""],bdatum:[93,2,1,""],lbuser5:[93,2,1,""],LBREL_OFFSET:[93,2,1,""],get_data:[93,3,1,""],lblev:[93,2,1,""],LBLREC_OFFSET:[93,2,1,""],lbuser2:[93,2,1,""],lbsecd:[93,2,1,""],lbuser1:[93,2,1,""],lbuser6:[93,2,1,""],lbuser7:[93,2,1,""],lbuser4:[93,2,1,""],lbyr:[93,2,1,""],lbyrd:[93,2,1,""],lbcfc:[93,2,1,""],bzx:[93,2,1,""],lbnpt:[93,2,1,""],lbproc:[93,2,1,""],brsvd
1:[93,2,1,""],bmks:[93,2,1,""],brsvd3:[93,2,1,""],num_values:[93,3,1,""],brsvd4:[93,2,1,""],lbpack:[93,2,1,""],bplat:[93,2,1,""],LBEGIN_OFFSET:[93,2,1,""],bgor:[93,2,1,""],lbcode:[93,2,1,""],lbproj:[93,2,1,""],blev:[93,2,1,""],lbmin:[93,2,1,""],lbhr:[93,2,1,""],lbmind:[93,2,1,""],bdy:[93,2,1,""],lbmon:[93,2,1,""],lbdatd:[93,2,1,""],lbexp:[93,2,1,""],bplon:[93,2,1,""],lbegin:[93,2,1,""],lbext:[93,2,1,""],lbrsvd4:[93,2,1,""],lbft:[93,2,1,""],lbmond:[93,2,1,""],brlev:[93,2,1,""],lbrsvd1:[93,2,1,""],lbrsvd2:[93,2,1,""],lbrsvd3:[93,2,1,""],lbhrd:[93,2,1,""],bhlev:[93,2,1,""],set_data:[93,3,1,""],lbdat:[93,2,1,""],lbtim:[93,2,1,""],lbfc:[93,2,1,""],lbnrec:[93,2,1,""],bmdi:[93,2,1,""],lbuser3:[93,2,1,""],LBNREC_OFFSET:[93,2,1,""],lbvc:[93,2,1,""],bhrlev:[93,2,1,""],bzy:[93,2,1,""],bdx:[93,2,1,""],lblrec:[93,2,1,""],lbsrce:[93,2,1,""],lbtyp:[93,2,1,""],lbhem:[93,2,1,""],lbsec:[93,2,1,""],lbrow:[93,2,1,""]},"iris.coords.AuxCoord":{ndim:[24,2,1,""],rename:[24,3,1,""],nearest_neighbour_index:[24,3,1,""],dtype:[24,2,1,""],contiguous_bounds:[24,3,1,""],shape:[24,2,1,""],is_contiguous:[24,3,1,""],standard_name:[24,2,1,""],cell:[24,3,1,""],has_bounds:[24,3,1,""],var_name:[24,2,1,""],intersect:[24,3,1,""],units:[24,2,1,""],xml_element:[24,3,1,""],from_coord:[24,7,1,""],collapsed:[24,3,1,""],attributes:[24,2,1,""],copy:[24,3,1,""],convert_units:[24,3,1,""],name:[24,3,1,""],is_monotonic:[24,3,1,""],cells:[24,3,1,""],is_compatible:[24,3,1,""],bounds:[24,2,1,""],nbounds:[24,2,1,""],points:[24,2,1,""],guess_bounds:[24,3,1,""]},"iris.time.PartialDateTime":{hour:[53,2,1,""],month:[53,2,1,""],second:[53,2,1,""],microsecond:[53,2,1,""],timetuple:[53,2,1,""],year:[53,2,1,""],day:[53,2,1,""],minute:[53,2,1,""]},"iris.fileformats.grib":{grib_phenom_translation:[43,0,0,"-"],load_rules:[85,0,0,"-"],GribWrapper:[71,4,1,""],reset_load_rules:[71,1,1,""],load_cubes:[71,1,1,""],grib_generator:[71,1,1,""],grib_save_rules:[83,0,0,"-"],hindcast_workaround:[71,5,1,""],save_grib2:[71,1,1,""]},"iris.filefo
rmats.cf.CFMeasureVariable":{cf_measure:[1,2,1,""],spans:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],has_formula_terms:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.coords.CoordExtent":{count:[24,3,1,""],index:[24,3,1,""],maximum:[24,2,1,""],name_or_coord:[24,2,1,""],minimum:[24,2,1,""],min_inclusive:[24,2,1,""],max_inclusive:[24,2,1,""]},"iris.analysis":{AreaWeighted:[54,4,1,""],COUNT:[54,5,1,""],cartography:[28,0,0,"-"],MIN:[54,5,1,""],Aggregator:[54,4,1,""],SUM:[54,5,1,""],PERCENTILE:[54,5,1,""],PEAK:[54,5,1,""],RMS:[54,5,1,""],stats:[65,0,0,"-"],Linear:[54,4,1,""],trajectory:[10,0,0,"-"],MEDIAN:[54,5,1,""],clear_phenomenon_identity:[54,1,1,""],MEAN:[54,5,1,""],calculus:[7,0,0,"-"],HMEAN:[54,5,1,""],interpolate:[118,0,0,"-"],STD_DEV:[54,5,1,""],VARIANCE:[54,5,1,""],maths:[29,0,0,"-"],Nearest:[54,4,1,""],geometry:[42,0,0,"-"],MAX:[54,5,1,""],WeightedAggregator:[54,4,1,""],PROPORTION:[54,5,1,""],coord_comparison:[54,1,1,""],GMEAN:[54,5,1,""]},"iris.AttributeConstraint":{extract:[9,3,1,""]},"iris.fileformats.rules.CMAttribute":{name:[84,2,1,""],value:[84,2,1,""]},"iris.analysis.Aggregator":{name:[54,3,1,""],lazy_aggregate:[54,3,1,""],aggregate_shape:[54,3,1,""],post_process:[54,3,1,""],aggregate:[54,3,1,""],update_metadata:[54,3,1,""]},"iris.palette.SymmetricNormalize":{process_value:[75,7,1,""],autoscale_None:[75,3,1,""],inverse:[75,3,1,""],autoscale:[75,3,1,""],vmin:[75,2,1,""],scaled:[75,3,1,""],vmax:[75,2,1,""]},"iris.util":{is_regular:[87,1,1,""],unify_time_units:[87,1,1,""],format_array:[87,1,1,""],clip_string:[87,1,1,""],new_axis:[87,1,1,""],broadcast_to_shape:[87,1,1,""],promote_aux_coord_to_dim_coord:[87,1,1,""],regular_step:[87,1,1,""],create_temp_filename:[87,1,1,""],guess_coord_axis:[87,1,1,""],approx_equal:[87,1,1,""],between:[87,1,1,""],file_is_newer_than:[87,1,1,""],ensure_array:[87,1,1,""],monotonic:[87,1,1,""],descr
ibe_diff:[87,1,1,""],demote_dim_coord_to_aux_coord:[87,1,1,""],column_slices_generator:[87,1,1,""],delta:[87,1,1,""],broadcast_weights:[87,1,1,""],squeeze:[87,1,1,""],array_equal:[87,1,1,""],reverse:[87,1,1,""],as_compatible_shape:[87,1,1,""],rolling_window:[87,1,1,""],timers:[87,5,1,""]},"iris.fileformats.ff.ArakawaC":{regular_x:[121,3,1,""],regular_y:[121,3,1,""],vectors:[121,3,1,""]},"iris.exceptions.IrisError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.experimental.um":{Field2:[93,4,1,""],Field3:[93,4,1,""],FixedLengthHeader:[93,4,1,""],FieldsFileVariant:[93,4,1,""],Field:[93,4,1,""]},"iris.fileformats.cf":{CFBoundaryVariable:[1,4,1,""],CFDataVariable:[1,4,1,""],CFClimatologyVariable:[1,4,1,""],CFGroup:[1,4,1,""],CFLabelVariable:[1,4,1,""],CFCoordinateVariable:[1,4,1,""],CFReader:[1,4,1,""],CFVariable:[1,4,1,""],CFMeasureVariable:[1,4,1,""],CFAuxiliaryCoordinateVariable:[1,4,1,""],CFGridMappingVariable:[1,4,1,""],CFAncillaryDataVariable:[1,4,1,""]},"iris.exceptions.IgnoreCubeException":{message:[26,2,1,""],args:[26,2,1,""]},"iris.coord_categorisation":{add_day_of_year:[11,1,1,""],add_month:[11,1,1,""],add_month_fullname:[11,1,1,""],add_month_number:[11,1,1,""],add_categorised_coord:[11,1,1,""],add_weekday_number:[11,1,1,""],add_season_number:[11,1,1,""],add_day_of_month:[11,1,1,""],add_weekday_fullname:[11,1,1,""],add_season_membership:[11,1,1,""],add_season:[11,1,1,""],add_season_year:[11,1,1,""],add_weekday:[11,1,1,""],add_year:[11,1,1,""]},"iris.fileformats.name_loaders":{load_NAMEIII_timeseries:[110,1,1,""],read_header:[110,1,1,""],load_NAMEII_timeseries:[110,1,1,""],NAMECoord:[110,4,1,""],load_NAMEIII_trajectory:[110,1,1,""],load_NAMEIII_field:[110,1,1,""],load_NAMEII_field:[110,1,1,""]},"iris.aux_factory.HybridPressureFactory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],name:[23,3,1,""],update:[23,3,1,""],standard_name:[23,2,1,""],var_name:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],dependencies:[23,2,
1,""],xml_element:[23,3,1,""]},"iris.exceptions.LazyAggregatorError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.fileformats.cf.CFGridMappingVariable":{spans:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],has_formula_terms:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.fileformats.rules.ReferenceTarget":{count:[84,3,1,""],index:[84,3,1,""],name:[84,2,1,""],transform:[84,2,1,""]},"iris.io.format_picker.FileElement":{get_element:[51,3,1,""]},"iris.fileformats.ff.FFHeader":{GRID_STAGGERING_CLASS:[121,2,1,""],shape:[121,3,1,""],grid:[121,3,1,""],ff_filename:[121,2,1,""]},"iris.analysis.maths":{intersection_of_cubes:[29,1,1,""],log2:[29,1,1,""],divide:[29,1,1,""],subtract:[29,1,1,""],add:[29,1,1,""],abs:[29,1,1,""],apply_ufunc:[29,1,1,""],exponentiate:[29,1,1,""],multiply:[29,1,1,""],exp:[29,1,1,""],log10:[29,1,1,""],IFunc:[29,4,1,""],log:[29,1,1,""]},"iris.fileformats.ff.NewDynamics":{regular_x:[121,3,1,""],regular_y:[121,3,1,""],vectors:[121,3,1,""]},"iris.fileformats.grib.load_rules":{convert:[85,1,1,""]},"iris.fileformats.rules.FunctionRule":{evaluates_true:[84,3,1,""],run_actions:[84,3,1,""]},"iris.io.format_picker.FormatSpecification":{file_element:[51,2,1,""],handler:[51,2,1,""],name:[51,2,1,""],file_element_value:[51,2,1,""]},"iris.experimental.animate":{animate:[13,1,1,""]},"iris.fileformats.grib.grib_phenom_translation.LookupTable":{viewitems:[43,3,1,""],fromkeys:[43,7,1,""],setdefault:[43,3,1,""],viewvalues:[43,3,1,""],keys:[43,3,1,""],items:[43,3,1,""],clear:[43,3,1,""],get:[43,3,1,""],update:[43,3,1,""],pop:[43,3,1,""],has_key:[43,3,1,""],viewkeys:[43,3,1,""],values:[43,3,1,""],itervalues:[43,3,1,""],popitem:[43,3,1,""],iteritems:[43,3,1,""],copy:[43,3,1,""],iterkeys:[43,3,1,""]},"iris.aux_factory.OceanSg1Factory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],name:[23,3,1,""],var_name:[23,2,1,""],update:[23,3
,1,""],standard_name:[23,2,1,""],dependencies:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],xml_element:[23,3,1,""]},"iris.analysis.Linear":{LINEAR_EXTRAPOLATION_MODES:[54,2,1,""],regridder:[54,3,1,""],interpolator:[54,3,1,""]},"documenting.docstrings_attribute.ExampleClass":{a:[40,2,1,""],square:[40,2,1,""],b:[40,2,1,""]},"iris.coords.Coord":{rename:[24,3,1,""],ndim:[24,2,1,""],nearest_neighbour_index:[24,3,1,""],dtype:[24,2,1,""],long_name:[24,2,1,""],standard_name:[24,2,1,""],is_contiguous:[24,3,1,""],cell:[24,3,1,""],has_bounds:[24,3,1,""],var_name:[24,2,1,""],intersect:[24,3,1,""],units:[24,2,1,""],xml_element:[24,3,1,""],collapsed:[24,3,1,""],coord_system:[24,2,1,""],shape:[24,2,1,""],guess_bounds:[24,3,1,""],copy:[24,3,1,""],convert_units:[24,3,1,""],name:[24,3,1,""],is_monotonic:[24,3,1,""],cells:[24,3,1,""],is_compatible:[24,3,1,""],bounds:[24,2,1,""],nbounds:[24,2,1,""],contiguous_bounds:[24,3,1,""],points:[24,2,1,""],attributes:[24,2,1,""]},"iris.fileformats.rules":{calculate_forecast_period:[84,1,1,""],Factory:[84,4,1,""],FunctionRule:[84,4,1,""],CMAttribute:[84,4,1,""],has_aux_factory:[84,1,1,""],RulesContainer:[84,4,1,""],CMCustomAttribute:[84,4,1,""],log:[84,1,1,""],vector_coord:[84,1,1,""],CoordAndDims:[84,4,1,""],DebugString:[84,4,1,""],ReferenceTarget:[84,4,1,""],Rule:[84,4,1,""],ProcedureRule:[84,4,1,""],ConcreteReferenceTarget:[84,4,1,""],Reference:[84,4,1,""],load_cubes:[84,1,1,""],Loader:[84,4,1,""],RuleResult:[84,4,1,""],scalar_coord:[84,1,1,""],scalar_cell_method:[84,1,1,""],ConversionMetadata:[84,4,1,""],aux_factory:[84,1,1,""]},"iris.io.format_picker":{FileElement:[51,4,1,""],FormatAgent:[51,4,1,""],MagicNumber:[51,4,1,""],LeadingLine:[51,4,1,""],UriProtocol:[51,4,1,""],FormatSpecification:[51,4,1,""],FileExtension:[51,4,1,""]},"iris.fileformats.cf.CFAncillaryDataVariable":{has_formula_terms:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,
1,""],spans:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.fileformats.rules.ConversionMetadata":{count:[84,3,1,""],index:[84,3,1,""],aux_coords_and_dims:[84,2,1,""],dim_coords_and_dims:[84,2,1,""],long_name:[84,2,1,""],standard_name:[84,2,1,""],cell_methods:[84,2,1,""],references:[84,2,1,""],units:[84,2,1,""],attributes:[84,2,1,""],factories:[84,2,1,""]},"iris.aux_factory.HybridHeightFactory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],name:[23,3,1,""],var_name:[23,2,1,""],update:[23,3,1,""],standard_name:[23,2,1,""],dependencies:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],xml_element:[23,3,1,""]},"iris.fileformats.um_cf_map":{CFName:[3,4,1,""]},"iris.fileformats.rules.Rule":{evaluates_true:[84,3,1,""],run_actions:[84,3,1,""]},"iris.fileformats.abf":{ABFField:[117,4,1,""],load_cubes:[117,1,1,""]},"iris.fileformats.name_loaders.NAMECoord":{count:[110,3,1,""],index:[110,3,1,""],values:[110,2,1,""],name:[110,2,1,""],dimension:[110,2,1,""]},"iris.fileformats.rules.ConcreteReferenceTarget":{add_cube:[84,3,1,""],as_cube:[84,3,1,""]},"iris.aux_factory.OceanSigmaFactory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],xml_element:[23,3,1,""],var_name:[23,2,1,""],update:[23,3,1,""],standard_name:[23,2,1,""],dependencies:[23,2,1,""],make_coord:[23,3,1,""],units:[23,2,1,""],attributes:[23,2,1,""],name:[23,3,1,""]},"iris.fileformats.ff.Grid":{regular_x:[121,3,1,""],regular_y:[121,3,1,""],vectors:[121,3,1,""]},"iris.Future":{context:[9,3,1,""]},"iris.fileformats.cf.CFClimatologyVariable":{has_formula_terms:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],spans:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.plot":{contourf:[38,1,1,""],plot:[38,1,1,""],outline:[38,1,1,""],citation:[38,1,1,""],orography_at_bounds:[38,1,1,""],default_
projection_extent:[38,1,1,""],contour:[38,1,1,""],symbols:[38,1,1,""],orography_at_points:[38,1,1,""],default_projection:[38,1,1,""],pcolormesh:[38,1,1,""],PlotDefn:[38,4,1,""],pcolor:[38,1,1,""],scatter:[38,1,1,""],points:[38,1,1,""]},"iris.fileformats.rules.CMCustomAttribute":{name:[84,2,1,""],value:[84,2,1,""]},"iris.pandas":{as_data_frame:[86,1,1,""],as_series:[86,1,1,""],as_cube:[86,1,1,""]},"iris.exceptions.ConstraintMismatchError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.fileformats.pp.PPField":{save:[67,3,1,""],y_bounds:[67,2,1,""],data:[67,2,1,""],t2:[67,2,1,""],coord_system:[67,3,1,""],stash:[67,2,1,""],t1:[67,2,1,""],x_bounds:[67,2,1,""],lbpack:[67,2,1,""],lbcode:[67,2,1,""],time_unit:[67,3,1,""],lbtim:[67,2,1,""],calendar:[67,2,1,""],copy:[67,3,1,""],lbproc:[67,2,1,""]},"iris.fileformats.pp":{load:[67,1,1,""],reset_save_rules:[67,1,1,""],EARTH_RADIUS:[67,5,1,""],reset_load_rules:[67,1,1,""],load_cubes:[67,1,1,""],STASH:[67,4,1,""],add_save_rules:[67,1,1,""],PPField:[67,4,1,""],save:[67,1,1,""]},"iris.analysis.cartography":{rotate_winds:[28,1,1,""],unrotate_pole:[28,1,1,""],cosine_latitude_weights:[28,1,1,""],rotate_pole:[28,1,1,""],project:[28,1,1,""],get_xy_grids:[28,1,1,""],get_xy_contiguous_bounded_grids:[28,1,1,""],area_weights:[28,1,1,""],wrap_lons:[28,1,1,""]},"iris.coord_systems.OSGB":{as_cartopy_crs:[50,3,1,""],as_cartopy_projection:[50,3,1,""],xml_element:[50,3,1,""],grid_mapping_name:[50,2,1,""]},"iris.fileformats.rules.Factory":{count:[84,3,1,""],index:[84,3,1,""],factory_class:[84,2,1,""],args:[84,2,1,""]},"iris.exceptions":{CoordinateMultiDimError:[26,4,1,""],DuplicateDataError:[26,4,1,""],ConcatenateError:[26,4,1,""],InvalidCubeError:[26,4,1,""],LazyAggregatorError:[26,4,1,""],NotYetImplementedError:[26,4,1,""],CoordinateNotFoundError:[26,4,1,""],CoordinateCollapseError:[26,4,1,""],CoordinateNotRegularError:[26,4,1,""],MergeError:[26,4,1,""],TranslationError:[26,4,1,""],IgnoreCubeException:[26,4,1,""],IrisError:[26,4,1,""],ConstraintMis
matchError:[26,4,1,""]},"iris.plot.PlotDefn":{count:[38,3,1,""],coords:[38,2,1,""],transpose:[38,2,1,""],index:[38,3,1,""]},"iris.fileformats.cf.CFDataVariable":{spans:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],has_formula_terms:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.coord_systems.CoordSystem":{as_cartopy_crs:[50,3,1,""],grid_mapping_name:[50,2,1,""],xml_element:[50,3,1,""],as_cartopy_projection:[50,3,1,""]},"iris.experimental.raster":{export_geotiff:[41,1,1,""]},"iris.analysis.Nearest":{regridder:[54,3,1,""],interpolator:[54,3,1,""]},"iris.exceptions.CoordinateNotRegularError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.fileformats.grib.grib_phenom_translation":{cf_phenom_to_grib2_info:[43,1,1,""],grib1_phenom_to_cf_info:[43,1,1,""],LookupTable:[43,4,1,""],grib2_phenom_to_cf_info:[43,1,1,""]},"iris.analysis.interpolate":{Linear1dExtrapolator:[118,4,1,""],nearest_neighbour_indices:[118,1,1,""],extract_nearest_neighbour:[118,1,1,""],regrid_to_max_resolution:[118,1,1,""],regrid:[118,1,1,""],nearest_neighbour_data_value:[118,1,1,""],linear:[118,1,1,""]},"iris.coord_systems.TransverseMercator":{ellipsoid:[50,2,1,""],as_cartopy_projection:[50,3,1,""],latitude_of_projection_origin:[50,2,1,""],scale_factor_at_central_meridian:[50,2,1,""],longitude_of_central_meridian:[50,2,1,""],false_easting:[50,2,1,""],false_northing:[50,2,1,""],grid_mapping_name:[50,2,1,""],as_cartopy_crs:[50,3,1,""],xml_element:[50,3,1,""]},"iris.fileformats.abf.ABFField":{to_cube:[117,3,1,""]},"iris.fileformats.um_cf_map.CFName":{count:[3,3,1,""],long_name:[3,2,1,""],standard_name:[3,2,1,""],units:[3,2,1,""],index:[3,3,1,""]},"iris.fileformats.netcdf.Saver":{write:[107,3,1,""],update_global_attributes:[107,3,1,""]},"iris.fileformats.netcdf.NetCDFDataProxy":{ndim:[107,2,1,""],dtype:[107,2,1,""],fill_value:[107,2,1,""],shape:[107,2,1,""],variable_name:[107,2,1,
""],path:[107,2,1,""]},"iris.coords.DimCoord":{rename:[24,3,1,""],ndim:[24,2,1,""],nearest_neighbour_index:[24,3,1,""],dtype:[24,2,1,""],contiguous_bounds:[24,3,1,""],standard_name:[24,2,1,""],is_contiguous:[24,3,1,""],cell:[24,3,1,""],has_bounds:[24,3,1,""],var_name:[24,2,1,""],intersect:[24,3,1,""],units:[24,2,1,""],circular:[24,2,1,""],xml_element:[24,3,1,""],from_coord:[24,7,1,""],from_regular:[24,6,1,""],collapsed:[24,3,1,""],shape:[24,2,1,""],guess_bounds:[24,3,1,""],copy:[24,3,1,""],convert_units:[24,3,1,""],name:[24,3,1,""],is_monotonic:[24,3,1,""],cells:[24,3,1,""],is_compatible:[24,3,1,""],bounds:[24,2,1,""],nbounds:[24,2,1,""],points:[24,2,1,""],attributes:[24,2,1,""]},iris:{load:[9,1,1,""],symbols:[0,0,0,"-"],io:[33,0,0,"-"],unit:[2,0,0,"-"],site_configuration:[9,5,1,""],plot:[38,0,0,"-"],palette:[75,0,0,"-"],quickplot:[74,0,0,"-"],std_names:[72,0,0,"-"],save:[9,1,1,""],config:[108,0,0,"-"],pandas:[86,0,0,"-"],load_raw:[9,1,1,""],cube:[16,0,0,"-"],Constraint:[9,4,1,""],sample_data_path:[9,1,1,""],util:[87,0,0,"-"],Future:[9,4,1,""],proxy:[89,0,0,"-"],iterate:[44,0,0,"-"],fileformats:[15,0,0,"-"],AttributeConstraint:[9,4,1,""],coord_categorisation:[11,0,0,"-"],load_cubes:[9,1,1,""],analysis:[54,0,0,"-"],FUTURE:[9,5,1,""],coords:[24,0,0,"-"],experimental:[123,0,0,"-"],time:[53,0,0,"-"],exceptions:[26,0,0,"-"],coord_systems:[50,0,0,"-"],aux_factory:[23,0,0,"-"],load_cube:[9,1,1,""]},"iris.coord_systems.RotatedGeogCS":{as_cartopy_projection:[50,3,1,""],north_pole_grid_longitude:[50,2,1,""],grid_north_pole_longitude:[50,2,1,""],as_cartopy_crs:[50,3,1,""],grid_mapping_name:[50,2,1,""],grid_north_pole_latitude:[50,2,1,""],ellipsoid:[50,2,1,""],xml_element:[50,3,1,""]},"iris.fileformats.pp_rules":{convert:[127,1,1,""]},"iris.experimental.regrid":{regrid_weighted_curvilinear_to_rectilinear:[5,1,1,""],regrid_area_weighted_rectilinear_src_and_grid:[5,1,1,""]},"iris.fileformats.grib.grib_save_rules":{non_hybrid_surfaces:[83,1,1,""],product_common:[83,1,1,""],time_ra
nge:[83,1,1,""],generating_process_id:[83,1,1,""],rotated_pole:[83,1,1,""],background_process_id:[83,1,1,""],surfaces:[83,1,1,""],reference_time:[83,1,1,""],scanning_mode_flags:[83,1,1,""],identification:[83,1,1,""],shape_of_the_earth:[83,1,1,""],latlon_first_last:[83,1,1,""],run:[83,1,1,""],obs_time_after_cutoff:[83,1,1,""],centre:[83,1,1,""],grid_template:[83,1,1,""],gribbability_check:[83,1,1,""],latlon_common:[83,1,1,""],param_code:[83,1,1,""],dx_dy:[83,1,1,""],data:[83,1,1,""],generating_process_type:[83,1,1,""],time_processing_period:[83,1,1,""],grid_dims:[83,1,1,""],type_of_statistical_processing:[83,1,1,""],hybrid_surfaces:[83,1,1,""],product_template:[83,1,1,""]},"iris.io.format_picker.FileExtension":{get_element:[51,3,1,""]},"iris.fileformats.name":{load_cubes:[102,1,1,""]},"iris.io.format_picker.LeadingLine":{get_element:[51,3,1,""]},"iris.experimental.regrid_conservative":{regrid_conservative_via_esmpy:[91,1,1,""]},"iris.coord_systems.VerticalPerspective":{xml_element:[50,3,1,""],as_cartopy_projection:[50,3,1,""],latitude_of_projection_origin:[50,2,1,""],test_pph:[50,2,1,""],longitude_of_projection_origin:[50,2,1,""],grid_mapping_name:[50,2,1,""],test_fe:[50,2,1,""],as_cartopy_crs:[50,3,1,""],ellipsoid:[50,2,1,""],test_fn:[50,2,1,""]},"iris.analysis.AreaWeighted":{regridder:[54,3,1,""]},"iris.experimental.um.Field":{int_headers:[93,2,1,""],num_values:[93,3,1,""],LBNREC_OFFSET:[93,2,1,""],LBEGIN_OFFSET:[93,2,1,""],LBREL_OFFSET:[93,2,1,""],get_data:[93,3,1,""],LBLREC_OFFSET:[93,2,1,""],real_headers:[93,2,1,""],set_data:[93,3,1,""]},"iris.analysis.interpolate.Linear1dExtrapolator":{all_points_in_range:[118,3,1,""],y:[118,2,1,""],roll_y:[118,2,1,""]},"iris.fileformats.rules.Loader":{count:[84,3,1,""],index:[84,3,1,""],converter:[84,2,1,""],legacy_custom_rules:[84,2,1,""],field_generator:[84,2,1,""],field_generator_kwargs:[84,2,1,""]},"iris.fileformats.ff":{load_cubes_32bit_ieee:[121,1,1,""],FFHeader:[121,4,1,""],load_cubes:[121,1,1,""],ArakawaC:[121,4,1,""],
NewDynamics:[121,4,1,""],Grid:[121,4,1,""],ENDGame:[121,4,1,""],FF2PP:[121,4,1,""]},"iris.fileformats.cf.CFBoundaryVariable":{spans:[1,3,1,""],cf_attrs_used:[1,3,1,""],cf_attrs_reset:[1,3,1,""],cf_attrs:[1,3,1,""],cf_attrs_unused:[1,3,1,""],identify:[1,6,1,""],has_formula_terms:[1,3,1,""],cf_identity:[1,2,1,""],cf_attrs_ignored:[1,3,1,""],add_formula_term:[1,3,1,""]},"iris.coords":{AuxCoord:[24,4,1,""],Coord:[24,4,1,""],CoordDefn:[24,4,1,""],Cell:[24,4,1,""],DimCoord:[24,4,1,""],CoordExtent:[24,4,1,""],CellMethod:[24,4,1,""]},"iris.cube.CubeList":{xml:[16,3,1,""],sort:[16,3,1,""],extend:[16,3,1,""],reverse:[16,3,1,""],merge_cube:[16,3,1,""],index:[16,3,1,""],insert:[16,3,1,""],concatenate:[16,3,1,""],count:[16,3,1,""],remove:[16,3,1,""],extract_overlapping:[16,3,1,""],merge:[16,3,1,""],pop:[16,3,1,""],extract_strict:[16,3,1,""],extract:[16,3,1,""],concatenate_cube:[16,3,1,""],append:[16,3,1,""]},"iris.experimental.concatenate":{concatenate:[99,1,1,""]},documenting:{docstrings_attribute:[40,0,0,"-"],docstrings_sample_routine:[40,0,0,"-"]},"iris.exceptions.ConcatenateError":{message:[26,2,1,""],args:[26,2,1,""]},"iris.io.format_picker.UriProtocol":{get_element:[51,3,1,""]},"iris.io.format_picker.FormatAgent":{add_spec:[51,3,1,""],get_spec:[51,3,1,""]},"iris.unit.Unit":{origin:[2,2,1,""],is_no_unit:[2,3,1,""],offset_by_time:[2,3,1,""],is_unknown:[2,3,1,""],num2date:[2,3,1,""],is_vertical:[2,3,1,""],is_convertible:[2,3,1,""],calendar:[2,2,1,""],category:[2,2,1,""],is_dimensionless:[2,3,1,""],log:[2,3,1,""],title:[2,3,1,""],invert:[2,3,1,""],modulus:[2,2,1,""],date2num:[2,3,1,""],format:[2,3,1,""],symbol:[2,2,1,""],utime:[2,3,1,""],definition:[2,2,1,""],convert:[2,3,1,""],is_time_reference:[2,3,1,""],name:[2,2,1,""],is_udunits:[2,3,1,""],ut_unit:[2,2,1,""],is_time:[2,3,1,""],root:[2,3,1,""]},"iris.aux_factory.OceanSg2Factory":{rename:[23,3,1,""],updated:[23,3,1,""],derived_dims:[23,3,1,""],name:[23,3,1,""],update:[23,3,1,""],standard_name:[23,2,1,""],var_name:[23,2,1,""]
diff --git a/iris/docs/dev/userguide/citation.html b/iris/docs/dev/userguide/citation.html
deleted file mode 100644
index 1a2e0f317..000000000
--- a/iris/docs/dev/userguide/citation.html
+++ /dev/null
@@ -1,195 +0,0 @@
- 10. Citing Iris — Iris 1.8.1rc1 documentation
The section Navigating a cube highlighted that
-every cube has a data attribute;
-this attribute can then be manipulated directly:
-
cube.data -= 273.15
-
-
-
The problem with manipulating the data directly is that other metadata may
-become inconsistent; in this case the units of the cube are no longer what was
-intended. This example could be rectified by changing the units attribute:
-
cube.units = 'celsius'
-
-
-
-
Note
-
iris.cube.Cube.convert_units() can be used to automatically convert a
-cube’s data and update its units attribute.
-So, the two steps above can be achieved by:
-
cube.convert_units('celsius')
-
-
-
-
In order to reduce the amount of metadata which becomes inconsistent,
-fundamental arithmetic operations such as addition, subtraction, division
-and multiplication can be applied directly to any cube.
-
-
9.1. Calculating the difference between two cubes¶
-
Let’s load some air temperature which runs from 1860 to 2100:
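A sketch of the loading and slicing steps, assuming the A1B_north_america.nc sample file (an assumption consistent with the latitude: 37; longitude: 49 shape shown in the result below), with the first and last time steps extracted by indexing:

```python
>>> import iris
>>> air_temp = iris.load_cube(iris.sample_data_path('A1B_north_america.nc'))
>>> t_first = air_temp[0, :, :]
>>> t_last = air_temp[-1, :, :]
```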
And finally we can subtract the two.
-The result is a cube of the same size as the original two time slices,
-but with the data representing their difference:
-
>>> print t_last - t_first
-unknown / (K) (latitude: 37; longitude: 49)
- Dimension coordinates:
- latitude x -
- longitude - x
- Scalar coordinates:
- forecast_reference_time: 1859-09-01 06:00:00
- height: 1.5 m
-
-
-
-
Note
-
Notice that the coordinates “time” and “forecast_period” have been removed
-from the resultant cube;
-this is because these coordinates differed between the two input cubes.
-For more control on whether or not coordinates should be automatically
-ignored iris.analysis.maths.subtract() can be used instead.
In section Cube statistics we discussed how the dimensionality of a cube
-can be reduced using the Cube.collapsed method
-to calculate a statistic over a dimension.
-
Let’s use that method to calculate a mean of our air temperature time-series,
-which we’ll then use to calculate a time mean anomaly and highlight the powerful
-benefits of cube broadcasting.
-
First, let’s remind ourselves of the shape of our air temperature time-series
-cube:
As expected the time dimension has been collapsed, reducing the
-dimensionality of the resultant air_temp_mean cube. This time-series mean can
-now be used to calculate the time mean anomaly against the original
-time-series:
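A sketch of these two steps, assuming the air_temp cube loaded earlier:

```python
>>> air_temp_mean = air_temp.collapsed('time', iris.analysis.MEAN)
>>> anomaly = air_temp - air_temp_mean
```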
Notice that the calculation of the anomaly involves subtracting a
-2d cube from a 3d cube to yield a 3d result. This is only possible
-because cube broadcasting is performed during cube arithmetic operations.
-
Cube broadcasting follows similar broadcasting rules as
-NumPy, but
-the additional richness of Iris coordinate meta-data provides an enhanced
-capability beyond the basic broadcasting behaviour of NumPy.
-
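The NumPy rule that cube broadcasting builds on can be demonstrated in isolation. This standalone sketch (plain NumPy, not Iris code) subtracts a 2d "time mean" from a 3d array, mirroring the anomaly calculation above:

```python
import numpy as np

# A 3d "time series" with shape (time, lat, lon), and its 2d time mean.
data = np.arange(2 * 3 * 4, dtype=float).reshape(2, 3, 4)
time_mean = data.mean(axis=0)   # shape (3, 4)

# NumPy broadcasts the 2d mean across the leading time dimension,
# so a (3d - 2d) operation yields a 3d result.
anomaly = data - time_mean

print(anomaly.shape)  # (2, 3, 4)
```

Iris layers coordinate metadata on top of this: because each cube dimension is uniquely described, matching dimensions can be aligned even when shapes (or dimension orders) differ.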
As the coordinate meta-data of a cube uniquely describes each dimension, it is
-possible to leverage this knowledge to identify the similar dimensions involved
-in a cube arithmetic operation. This essentially means that we are no longer
-restricted to performing arithmetic on cubes with identical shapes.
-
This extended broadcasting behaviour is highlighted in the following
-examples. The first of these shows that it is possible to involve the
-transpose of the air temperature time-series in an arithmetic operation with
-itself.
-
Let’s first create the transpose of the air temperature time-series:
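A sketch, assuming the air_temp cube from earlier; since Cube.transpose operates in place, we transpose a copy and then add it to the original:

```python
>>> air_temp_T = air_temp.copy()
>>> air_temp_T.transpose()
>>> result = air_temp + air_temp_T
```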
Notice that the result is the same dimensionality and shape as air_temp.
-Let’s check that the arithmetic operation has calculated a result that
-we would intuitively expect:
-
>>> result == 2 * air_temp
-True
-
-
-
Let’s extend this example slightly, by taking a slice from the middle
-latitude dimension of the transpose cube:
Compared to our original time-series, the air_temp_T_slice cube has one
-less dimension and its shape is different. However, this doesn’t prevent
-us from performing cube arithmetic with it, thanks to the extended cube
-broadcasting behaviour:
We must ensure that the units of pressure and p0 are the same,
-so convert the newly created coordinate using
-the iris.coords.Coord.convert_units() method:
-
p0.convert_units(pressure.units)
-
-
-
Now we can combine all of this information to calculate the air temperature
-using the equation above:
In the Subsetting a Cube section we saw how to extract a subset of a
-cube in order to reduce either its dimensionality or its resolution.
-Instead of simply extracting a sub-region of the data,
-we can produce statistical functions of the data values
-across a particular dimension,
-such as a ‘mean over time’ or ‘minimum over latitude’.
-
For instance, suppose we have a cube:
-
>>> import iris
->>> filename = iris.sample_data_path('uk_hires.pp')
->>> cube = iris.load_cube(filename, 'air_potential_temperature')
->>> print cube
-air_potential_temperature / (K) (time: 3; model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
- Dimension coordinates:
- time x - - -
- model_level_number - x - -
- grid_latitude - - x -
- grid_longitude - - - x
- Auxiliary coordinates:
- forecast_period x - - -
- level_height - x - -
- sigma - x - -
- surface_altitude - - x x
- Derived coordinates:
- altitude - x x x
- Scalar coordinates:
- forecast_reference_time: 2009-11-19 04:00:00
- Attributes:
- STASH: m01s00i004
- source: Data from Met Office Unified Model
- um_version: 7.3
-
-
-
In this case we have a 4 dimensional cube;
-to mean the vertical (z) dimension down to a single valued extent
-we can pass the coordinate name and the aggregation definition to the
-Cube.collapsed() method:
-
>>> import iris.analysis
->>> vertical_mean = cube.collapsed('model_level_number', iris.analysis.MEAN)
->>> print vertical_mean
-air_potential_temperature / (K) (time: 3; grid_latitude: 204; grid_longitude: 187)
- Dimension coordinates:
- time x - -
- grid_latitude - x -
- grid_longitude - - x
- Auxiliary coordinates:
- forecast_period x - -
- surface_altitude - x x
- Derived coordinates:
- altitude - x x
- Scalar coordinates:
- forecast_reference_time: 2009-11-19 04:00:00
- level_height: 696.667 m, bound=(0.0, 1393.33) m
- model_level_number: 10, bound=(1, 19)
- sigma: 0.92293, bound=(0.84586, 1.0)
- Attributes:
- STASH: m01s00i004
- source: Data from Met Office Unified Model
- um_version: 7.3
- Cell methods:
- mean: model_level_number
-
-
-
Similarly other analysis operators such as MAX, MIN and STD_DEV
-can be used instead of MEAN, see iris.analysis for a full list
-of currently supported operators.
-
For an example of using this functionality, the
-Hovmoller diagram example found
-in the gallery takes a zonal mean of an XYT cube by using the
-collapsed method with latitude and iris.analysis.MEAN as arguments.
Some operators support additional keywords to the cube.collapsed method.
-For example, iris.analysis.MEAN supports
-a weights keyword which can be combined with
-iris.analysis.cartography.area_weights() to calculate an area average.
-
Let’s use the same data as was loaded in the previous example.
-Since grid_latitude and grid_longitude were both point coordinates
-we must guess bound positions for them
-in order to calculate the area of the grid boxes:
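A sketch of the bound-guessing and area-weighted collapse, assuming the uk_hires cube from the previous example:

```python
>>> import iris.analysis.cartography
>>> cube.coord('grid_latitude').guess_bounds()
>>> cube.coord('grid_longitude').guess_bounds()
>>> grid_areas = iris.analysis.cartography.area_weights(cube)
>>> area_mean = cube.collapsed(['grid_latitude', 'grid_longitude'],
...                            iris.analysis.MEAN, weights=grid_areas)
```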
The Cube.aggregated_by operation
-combines data for all points with the same value of a given coordinate.
-To do this, you need a coordinate whose points take on only a limited set
-of different values – the number of these then determines the size of the
-reduced dimension.
-The iris.coord_categorisation module can be used to make such
-‘categorical’ coordinates out of ordinary ones: The most common use is
-to aggregate data over regular time intervals,
-such as by calendar month or day of the week.
-
For example, let’s create two new coordinates on the cube
-to represent the climatological seasons and the season year respectively:
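A sketch using iris.coord_categorisation, with the coordinate names (clim_season, season_year) chosen for illustration:

```python
>>> import iris.coord_categorisation
>>> iris.coord_categorisation.add_season(cube, 'time', name='clim_season')
>>> iris.coord_categorisation.add_season_year(cube, 'time', name='season_year')
>>> annual_seasonal_mean = cube.aggregated_by(
...     ['clim_season', 'season_year'], iris.analysis.MEAN)
```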
The primary change in the cube is that the cube’s data has been
-reduced in the ‘time’ dimension by aggregation (taking means, in this case).
-This has collected together all datapoints with the same values of season and
-season-year.
-The results are now indexed by the 19 different possible values of season and
-season-year in a new, reduced ‘time’ dimension.
-
We can see this by printing the first 10 values of season+year
-from the original cube: These points are individual months,
-so adjacent ones are often in the same season:
Because the original data started in April 2006 we have some incomplete seasons
-(e.g. there were only two months’ worth of data for ‘mam-2006’).
-In this case we can fix this by removing all of the resultant ‘times’ which
-do not cover a three month period (note: judged here as > 3*28 days):
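One way to express this filter, assuming the aggregated cube is called annual_seasonal_mean and that its time coordinate carries bounds, is a constraint on the width of each time bound:

```python
>>> import datetime
>>> tdelta_3mth = datetime.timedelta(hours=3 * 28 * 24.0)
>>> spans_three_months = lambda t: (t.bound[1] - t.bound[0]) > tdelta_3mth
>>> three_months_bound = iris.Constraint(time=spans_three_months)
>>> full_season_means = annual_seasonal_mean.extract(three_months_bound)
```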
If this was your first time reading the user guide, we hope you found it enjoyable and informative.
-It is advised that you now go back to the start and try experimenting with your own data.
It can be very daunting to start coding a project from an empty file, which is why you will find many in-depth
-examples in the Iris gallery which can be used as a goal driven reference to producing your own visualisations.
-
If you produce a visualisation which you think would add value to the gallery, please get in touch with us and
-we will consider including it as an example for all to benefit from.
If you are reading this user guide for the first time it is strongly recommended that you read the user guide
-fully before experimenting with your own data files.
-
Much of the content has supplementary links to the reference documentation; you will not need to follow these
-links in order to understand the guide but they may serve as a useful reference for future exploration.
-Since later pages depend on earlier ones, try reading this user guide sequentially using the next and previous links.
Iris provides powerful cube-aware interpolation and regridding functionality,
-exposed through Iris cube methods. This functionality is provided by building
-upon existing interpolation schemes implemented by SciPy.
-
-In Iris we refer to the available types of interpolation and regridding as
-schemes. The following are the interpolation schemes that are currently
-available in Iris:
Interpolating a cube is achieved with the interpolate()
-method. This method expects two arguments:
-
-
-
the sample points to interpolate, and
-
the interpolation scheme to use.
-
-
-
The result is a new cube, interpolated at the sample points.
-
Sample points must be defined as an iterable of (coord,value(s)) pairs.
-The coord argument can be either a coordinate name or coordinate instance.
-The specified coordinate must exist on the cube being interpolated! For example:
-
-
-
coordinate names and scalar sample points: [('latitude', 51.48), ('longitude', 0)],
-
a coordinate instance and a scalar sample point: [(cube.coord('latitude'), 51.48)], and
-
a coordinate name and a NumPy array of sample points: [('longitude', np.linspace(-11, 2, 14))]
-
-
-
are all examples of valid sample points.
-
The values for coordinates that correspond to date/times can be supplied as
-datetime.datetime or netcdftime.datetime instances,
-e.g. [('time', datetime.datetime(2009, 11, 19, 10, 30))].
-
Let’s take the air temperature cube we’ve seen previously:
-
>>> air_temp = iris.load_cube(iris.sample_data_path('air_temp.pp'))
->>> print air_temp
-air_temperature / (K) (latitude: 73; longitude: 96)
- Dimension coordinates:
- latitude x -
- longitude - x
- Scalar coordinates:
- forecast_period: 6477 hours, bound=(-28083.0, 6477.0) hours
- forecast_reference_time: 1998-03-01 03:00:00
- pressure: 1000.0 hPa
- time: 1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
- Attributes:
- STASH: m01s16i203
- source: Data from Met Office Unified Model
- Cell methods:
- mean within years: time
- mean over years: time
-
-
-
We can interpolate specific values from the coordinates of the cube:
-
>>> sample_points = [('latitude', 51.48), ('longitude', 0)]
->>> print air_temp.interpolate(sample_points, iris.analysis.Linear())
-air_temperature / (K) (scalar cube)
- Scalar coordinates:
- forecast_period: 6477 hours, bound=(-28083.0, 6477.0) hours
- forecast_reference_time: 1998-03-01 03:00:00
- latitude: 51.48 degrees
- longitude: 0 degrees
- pressure: 1000.0 hPa
- time: 1998-12-01 00:00:00, bound=(1994-12-01 00:00:00, 1998-12-01 00:00:00)
- Attributes:
- STASH: m01s16i203
- source: Data from Met Office Unified Model
- Cell methods:
- mean within years: time
- mean over years: time
-
-
-
As we can see, the resulting cube is scalar and has longitude and latitude coordinates with
-the values defined in our sample points.
-
It isn’t necessary to specify sample points for every dimension, only those that you
-wish to interpolate over:
The sample points for a coordinate can be an array of values. When multiple coordinates are
-provided with arrays instead of scalar sample points, the coordinates on the resulting cube
-will be orthogonal:
Interpolation in Iris is not limited to horizontal-spatial coordinates - any
-coordinate satisfying the prerequisites of the chosen scheme may be interpolated
-over.
-
For instance, the iris.analysis.Linear scheme requires 1D numeric,
-monotonic, coordinates. Supposing we have a single column cube such as
-the one defined below:
We could regularise the vertical coordinate by defining 10 equally spaced altitude
-sample points between 400 and 1250 and interpolating our vertical coordinate onto
-these sample points:
The red diamonds on the extremes of the altitude values show that we have
-extrapolated data beyond the range of the original data. In some cases this is
-desirable but in other cases it is not. For example, this column defines
-a surface altitude value of 414m, so extrapolating an “air potential temperature”
-at 400m makes little physical sense in this case.
-
We can control the extrapolation mode when defining the interpolation scheme.
-Controlling the extrapolation mode allows us to avoid situations like the above where
-extrapolating values makes little physical sense.
-
The extrapolation mode is controlled by the extrapolation_mode keyword.
-For the available interpolation schemes available in Iris, the extrapolation_mode
-keyword must be one of:
-
-
-
extrapolate – the extrapolation points will be calculated by extending the gradient of the closest two points,
-
error – a ValueError exception will be raised, notifying an attempt to extrapolate,
-
nan – the extrapolation points will be set to NaN,
-
mask – the extrapolation points will always be masked, even if the source data is not a MaskedArray, or
-
nanmask – if the source data is a MaskedArray the extrapolation points will be masked. Otherwise they will be set to NaN.
-
-
-
Using an extrapolation mode is achieved by constructing an interpolation scheme
-with the extrapolation mode keyword set as required. The constructed scheme
-is then passed to the interpolate() method.
-For example, to mask values that lie beyond the range of the original data:
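A sketch of constructing such a scheme and passing it to interpolate; the cube name (column) and the altitude sample points are illustrative:

```python
>>> import numpy as np
>>> scheme = iris.analysis.Linear(extrapolation_mode='mask')
>>> sample_points = [('altitude', np.linspace(400, 1250, 10))]
>>> masked_result = column.interpolate(sample_points, scheme)
```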
If you need to interpolate a cube on multiple sets of sample points you can
-‘cache’ an interpolator to be used for each of these interpolations. This can
-shorten the execution time of your code as the most computationally
-intensive part of an interpolation is setting up the interpolator.
-
To cache an interpolator you must set up an interpolator scheme and call the
-scheme’s interpolator method. The interpolator method takes as arguments:
-
-
-
a cube to be interpolated, and
-
an iterable of coordinate names or coordinate instances of the coordinates that are to be interpolated over.
When this cached interpolator is called you must pass it an iterable of sample points
-that have the same form as the iterable of coordinates passed to the constructor.
-So, to use the cached interpolator defined above:
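A sketch of setting up and reusing a cached interpolator; the choice of the Nearest scheme and the coordinate names here are illustrative:

```python
>>> interpolator = iris.analysis.Nearest().interpolator(
...     air_temp, ['latitude', 'longitude'])
>>> result1 = interpolator([51.48, 0])
>>> result2 = interpolator([51.48, 1])
```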
Regridding is conceptually a very similar process to interpolation in Iris.
-The primary difference is that interpolation is based on sample points, while
-regridding is based on the horizontal grid of another cube.
-
Regridding a cube is achieved with the cube.regrid() method.
-This method expects two arguments:
-
-
-
another cube that defines the target grid onto which the cube should be regridded, and
-
the regridding scheme to use.
-
-
-
-
Note
-
Regridding is a common operation needed to allow comparisons of data on different grids.
-The powerful mapping functionality provided by cartopy, however, means that regridding
-is often not necessary if performed just for visualisation purposes.
-
-
Let’s load two cubes that have different grids and coordinate systems:
Let’s regrid the global_air_temp cube onto a rotated pole grid
-using a linear regridding scheme. To achieve this we pass the rotated_psl
-cube to the regridder to supply the target grid to regrid the global_air_temp
-cube onto:
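A sketch of the call, assuming the global_air_temp and rotated_psl cubes named above:

```python
>>> rotated_air_temp = global_air_temp.regrid(rotated_psl, iris.analysis.Linear())
```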
We could regrid the pressure values onto the global grid, but this will involve
-some form of extrapolation. As with interpolation, we can control the extrapolation
-mode when defining the regridding scheme.
-
For the available regridding schemes in Iris, the extrapolation_mode keyword
-must be one of:
-
-
-
-
extrapolate –
-
-
for Linear the extrapolation points will be calculated by extending the gradient of the closest two points.
-
for Nearest the extrapolation points will take their value from the nearest source point.
-
-
-
-
-
nan – the extrapolation points will be set to NaN.
-
-
error – a ValueError exception will be raised, notifying an attempt to extrapolate.
-
-
mask – the extrapolation points will always be masked, even if the source data is not a MaskedArray.
-
-
nanmask – if the source data is a MaskedArray the extrapolation points will be masked. Otherwise they will be set to NaN.
-
-
-
-
The rotated_psl cube is defined on a limited area rotated pole grid. If we regridded
-the rotated_psl cube onto the global grid as defined by the global_air_temp cube
-any linearly extrapolated values would quickly become dominant and highly inaccurate.
-We can control this behaviour by defining the extrapolation_mode in the constructor
-of the regridding scheme to mask values that lie outside of the domain of the rotated
-pole grid:
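A sketch, again assuming the cubes named above:

```python
>>> scheme = iris.analysis.Linear(extrapolation_mode='mask')
>>> global_psl = rotated_psl.regrid(global_air_temp, scheme)
```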
Notice that although we can still see the approximate shape of the rotated pole grid, the
-cells have now become rectangular in a plate carrée (equirectangular) projection.
-The spatial grid of the resulting cube is really global, with a large proportion of the
-data being masked.
It is often the case that a point-based regridding scheme (such as
-iris.analysis.Linear or iris.analysis.Nearest) is not
-appropriate when you need to conserve quantities when regridding. The
-iris.analysis.AreaWeighted scheme is less general than
-Linear or Nearest, but is a
-conservative regridding scheme, meaning that the area-weighted total is
-approximately preserved across grids.
-
With the AreaWeighted regridding scheme, each target grid-box’s
-data is computed as a weighted mean of all grid-boxes from the source grid. The weighting
-for any given target grid-box is the area of the intersection with each of the
-source grid-boxes. This scheme performs well when regridding from a high
-resolution source grid to a lower resolution target grid, since all source data
-points will be accounted for in the target grid.
-
Let’s demonstrate this with the global air temperature cube we saw previously,
-along with a limited area cube containing total concentration of volcanic ash:
One of the key limitations of the AreaWeighted
-regridding scheme is that the two input grids must be defined in the same
-coordinate system as each other. Both input grids must also contain monotonic,
-bounded, 1D spatial coordinates.
-
-
Note
-
The AreaWeighted regridding scheme requires spatial
-areas, therefore the longitude and latitude coordinates must be bounded.
-If the longitude and latitude bounds are not defined in the cube we can
-guess the bounds based on the coordinates’ point values:
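For example, assuming a cube whose latitude and longitude coordinates carry only points:

```python
>>> cube.coord('latitude').guess_bounds()
>>> cube.coord('longitude').guess_bounds()
```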
Note that the AreaWeighted regridding scheme allows us
-to define a missing data tolerance (mdtol), which specifies the tolerated
-fraction of masked data in any given target grid-box. If the fraction of masked
-data within a target grid-box exceeds this value, the data in this target
-grid-box will be masked in the result.
-
The fraction of masked data is calculated based on the area of masked source
-grid-boxes that overlaps with each target grid-box. Defining an mdtol in the
-AreaWeighted regridding scheme allows fine control
-of masked data tolerance. It is worth remembering that defining an mdtol of
-anything other than 1 will prevent the scheme from being fully conservative, as
-some data will be disregarded if it lies close to masked data.
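The mdtol rule itself is simple, and can be sketched in plain Python (area fractions rather than real grid geometry):

```python
def target_is_masked(masked_overlap_area, total_overlap_area, mdtol):
    """Schematic of the mdtol rule: a target grid-box is masked when the
    fraction of masked source data overlapping it exceeds mdtol."""
    return (masked_overlap_area / total_overlap_area) > mdtol
```

Note that with mdtol=1 the fraction can never exceed the tolerance, so no target grid-box is masked; with mdtol=0 any masked overlap at all masks the target grid-box.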
-
To visualise the above regrid, let’s plot the original data, along with 3 distinct
-mdtol values to compare the result:
If you need to regrid multiple cubes with a common source grid onto a common
-target grid you can ‘cache’ a regridder to be used for each of these regrids.
-This can shorten the execution time of your code as the most computationally
-intensive part of a regrid is setting up the regridder.
-
To cache a regridder you must set up a regridder scheme and call the
-scheme’s regridder method. The regridder method takes as arguments:
-
-
-
a cube (that is to be regridded) defining the source grid, and
-
a cube defining the target grid to regrid the source cube to.
When this cached regridder is called you must pass it a cube on the same grid
-as the source grid cube (in this case global_air_temp) that is to be
-regridded to the target grid. For example:
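The caching pattern can be sketched with a pure-Python stand-in. In Iris the expensive setup happens inside the scheme’s regridder method; here plain strings stand in for the source and target grids:

```python
class CachedRegridder:
    """Stand-in for a cached Iris regridder: the expensive setup runs
    once, then the regridder is reused for every cube that lies on the
    same source grid."""

    def __init__(self, source_grid, target_grid):
        # In the real scheme this is where the regridding weights would
        # be computed: the costly step we only want to do once.
        self.source_grid = source_grid
        self.target_grid = target_grid

    def __call__(self, cube_grid):
        if cube_grid != self.source_grid:
            raise ValueError('cube must lie on the cached source grid')
        return self.target_grid  # the result lies on the target grid

regrid = CachedRegridder(source_grid='global', target_grid='rotated')
result_grid = regrid('global')  # cheap: reuses the cached setup
```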
In each case result will be the input cube regridded to the grid defined by
-the target grid cube (in this case rotated_psl) that we used to define the
-cached regridder.
The top level object in Iris is called a cube. A cube contains data and metadata about a phenomenon.
-
In Iris, a cube is an interpretation of the Climate and Forecast (CF) Metadata Conventions whose purpose is to:
-
-
require conforming datasets to contain sufficient metadata that they are self-describing... including physical
-units if appropriate, and that each value can be located in space (relative to earth-based coordinates) and time.
-
Whilst the CF conventions are often mentioned alongside NetCDF, Iris implements several major format importers which can take
-files of specific formats and turn them into Iris cubes. Additionally, a framework is provided which allows users
-to extend Iris’ import capability to cater for specialist or unimplemented formats.
-
A single cube describes one and only one phenomenon; it always has a name, a unit and
-an n-dimensional data array to represent the cube’s phenomenon. In order to locate the
-data spatially, temporally, or in any other higher-dimensional space, a collection of coordinates
-exists on the cube.
A coordinate is a container to store metadata about some dimension(s) of a cube’s data array and therefore,
-by definition, its phenomenon.
-
-
-
Each coordinate has a name and a unit.
-
-
-
When a coordinate is added to a cube, the data dimensions that it represents are also provided.
-
-
The shape of a coordinate is always the same as the shape of the associated data dimension(s) on the cube.
-
A dimension not explicitly listed signifies that the coordinate is independent of that dimension.
-
Each dimension of a coordinate must be mapped to a data dimension. The only coordinates with no mapping are
-scalar coordinates.
-
-
-
-
-
Depending on the underlying data that the coordinate is representing, its values may be discrete points or be
-bounded to represent interval extents (e.g. temperature at point x vs. rainfall accumulation between 0000-1200 hours).
-
-
Coordinates have an attributes dictionary which can hold arbitrary extra metadata, excluding certain restricted CF names.
-
-
More complex coordinates may contain a coordinate system which is necessary to fully interpret the values
-contained within the coordinate.
-
-
-
-
There are two classes of coordinates:
-
-
DimCoord
-
-
-
Numeric
-
Monotonic
-
Representative of, at most, a single data dimension (1d)
-
-
-
AuxCoord
-
-
-
May be of any type, including strings
-
May represent multiple data dimensions (n-dimensional)
Suppose we have some gridded data of 24 air temperature readings (in Kelvin), located at
-4 different longitudes, 2 different latitudes and 3 different heights. Our data array can be represented pictorially:
-
-
Where dimensions 0, 1, and 2 have lengths 3, 2 and 4 respectively.
-
The Iris cube to represent this data would consist of:
-
-
-
a standard name of air_temperature and a unit of kelvin
-
-
a data array of shape (3,2,4)
-
-
a coordinate, mapping to dimension 0, consisting of:
-
-
-
a standard name of height and unit of meters
-
an array of length 3 representing the 3 height points
-
-
-
-
a coordinate, mapping to dimension 1, consisting of:
-
-
-
a standard name of latitude and unit of degrees
-
an array of length 2 representing the 2 latitude points
-
a coordinate system such that the latitude points could be fully located on the globe
-
-
-
-
a coordinate, mapping to dimension 2, consisting of:
-
-
-
a standard name of longitude and unit of degrees
-
an array of length 4 representing the 4 longitude points
-
a coordinate system such that the longitude points could be fully located on the globe
-
-
-
-
-
-
Pictorially the cube has taken on more information than a simple array:
-
-
Additionally further information may be optionally attached to the cube.
-For example, it is possible to attach any of the following:
-
-
-
a coordinate, not mapping to any data dimensions, consisting of:
-
-
-
-a standard name of time and unit of days since 2000-01-01 00:00
-
a data array of length 1 representing the time that the data array is valid for
-
-
-
-
an auxiliary coordinate, mapping to dimensions 1 and 2, consisting of:
-
-
-
a long name of placename and no unit
-
a 2d string array of shape (2,4) with the names of the 8 places that the lat/lons correspond to
-
-
-
-
an auxiliary coordinate “factory”, which can derive its own mapping, consisting of:
-
-
-
a standard name of height and a unit of feet
-
-knowledge of how data values for this coordinate can be calculated given the height in meters coordinate
-
-
-
-
a cell method of “mean” over “ensemble” to indicate that the data has been averaged over
-a collection of “ensembles” (i.e. multiple model runs).
Every Iris cube can be printed to screen as you will see later in the user guide. It is worth familiarising yourself with the
-output as this is the quickest way of inspecting the contents of a cube. Here is the result of printing a real life cube:
-
air_potential_temperature / (K) (time: 3; model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
- Dimension coordinates:
- time x - - -
- model_level_number - x - -
- grid_latitude - - x -
- grid_longitude - - - x
- Auxiliary coordinates:
- forecast_period x - - -
- level_height - x - -
- sigma - x - -
- surface_altitude - - x x
- Derived coordinates:
- altitude - x x x
- Scalar coordinates:
- forecast_reference_time: 2009-11-19 04:00:00
- Attributes:
- STASH: m01s00i004
- source: Data from Met Office Unified Model
- um_version: 7.3
-
-
-
Using this output we can deduce that:
-
-
-
The cube represents air potential temperature.
-
There are 4 data dimensions, and the data has a shape of (3,7,204,187)
-
The 4 data dimensions are mapped to the time, model_level_number,
-grid_latitude, grid_longitude coordinates respectively
-
There are three 1d auxiliary coordinates and one 2d auxiliary (surface_altitude)
-
There is a single altitude derived coordinate, which spans 3 data dimensions
-
There are 7 distinct values in the “model_level_number” coordinate. Similar inferences can
-be made for the other dimension coordinates.
-
There are 7, not necessarily distinct, values in the level_height coordinate.
-
There is a single forecast_reference_time scalar coordinate representing the entire cube.
-
The cube has one further attribute relating to the phenomenon.
-In this case the originating file format, PP, encodes information in a STASH code which in some cases can
-be useful for identifying advanced experiment information relating to the phenomenon.
Iris will attempt to return as few cubes as possible
-by collecting together multiple fields with a shared standard name
-into a single multidimensional cube.
-
The iris.load() function automatically recognises the format
-of the given files and attempts to produce Iris Cubes from their contents.
-
-
Note
-
Currently there is support for CF NetCDF, GRIB 1 & 2, PP and FieldsFiles
-file formats with a framework for this to be extended to custom formats.
-
-
In order to find out what has been loaded, the result can be printed:
This shows that loading the file resulted in 2 cubes; they were:
-air_potential_temperature and surface_altitude.
-
-
The surface_altitude cube was 2 dimensional with:
-
-
two dimensions of lengths 204 and 187 respectively, represented by
-the grid_latitude and grid_longitude coordinates.
-
-
-
The air_potential_temperature cubes were 4 dimensional with:
-
-
the same length grid_latitude and grid_longitude dimensions as
-surface_altitude
-
a time dimension of length 3
-
a model_level_number dimension of length 7
-
-
-
-
-
Note
-
The result of iris.load() is always a
-list of cubes.
-Anything that can be done with a Python list can be done
-with the resultant list of cubes.
-
-
-
Hint
-
Throughout this user guide you will see the function
-iris.sample_data_path being used to get the filename for the resources
-used in the examples. The result of this function is just a string.
-
Using this function allows us to provide examples which will work
-across platforms and with data installed in different locations,
-however in practice you will want to use your own strings:
To get the air potential temperature cube from the list of cubes
-returned by iris.load() in the previous example,
-list indexing can be used:
-
>>> import iris
->>> filename = iris.sample_data_path('uk_hires.pp')
->>> cubes = iris.load(filename)
->>> # get the first cube (list indexing is 0 based)
->>> air_potential_temperature = cubes[0]
->>> print air_potential_temperature
-air_potential_temperature / (K) (time: 3; model_level_number: 7; grid_latitude: 204; grid_longitude: 187)
- Dimension coordinates:
- time x - - -
- model_level_number - x - -
- grid_latitude - - x -
- grid_longitude - - - x
- Auxiliary coordinates:
- forecast_period x - - -
- level_height - x - -
- sigma - x - -
- surface_altitude - - x x
- Derived coordinates:
- altitude - x x x
- Scalar coordinates:
- forecast_reference_time: 2009-11-19 04:00:00
- Attributes:
- STASH: m01s00i004
- source: Data from Met Office Unified Model
- um_version: 7.3
-
-
-
Notice that the result of printing a cube is a little more verbose than
-it was when printing a list of cubes. In addition to the very short summary
-which is provided when printing a list of cubes, information is provided
-on the coordinates which constitute the cube in question.
-This was the output discussed at the end of the Introduction section.
-
-
Note
-
Dimensioned coordinates will have a dimension marker x in the
-appropriate column for each cube data dimension that they describe.
Given a large dataset, it is possible to restrict or constrain the load
-to match specific Iris cube metadata.
-Constrained loading provides the ability to generate a cube
-from a specific subset of data that is of particular interest.
-
As we have seen, loading the following file creates several Cubes:
To constrain the load to multiple distinct constraints, a list of constraints
-can be provided. This is equivalent to running load once for each constraint
-but is likely to be more efficient:
The iris.Constraint class can be used to restrict coordinate values
-on load. For example, to constrain the load to match
-a specific model_level_number:
As well as being able to combine constraints using &,
-the iris.Constraint class can accept multiple arguments,
-and a list of values can be given to constrain a coordinate to one of
-a collection of values:
A common requirement is to limit the value of a coordinate to a specific range;
-this can be achieved by passing the constraint a function:
-
def bottom_16_levels(cell):
-    # return True or False as to whether the cell in question should be kept
-    return cell <= 16
-
-filename = iris.sample_data_path('uk_hires.pp')
-level_lt_16 = iris.Constraint(model_level_number=bottom_16_levels)
-cubes = iris.load(filename, level_lt_16)
-
-
-
-
Note
-
As with many of the examples later in this documentation, the
-simple function above can be conveniently written as a lambda function
-on a single line:
-
bottom_16_levels = lambda cell: cell <= 16
-
-
-
-
Cube attributes can also be part of the constraint criteria. Supposing a
-cube attribute of STASH existed, as is the case when loading PP files,
-then specific STASH codes can be filtered:
Iris follows NetCDF-CF rules in representing time coordinate values as
-purely numeric values, normalised by the calendar specified in the coordinate’s
-units (e.g. “days since 1970-01-01”).
-However, when constraining by time we usually want to test calendar-related
-aspects such as hours of the day or months of the year, so Iris
-provides special features to facilitate this:
-
Firstly, Iris can be configured so that when it evaluates Constraint
-expressions, it will convert time-coordinate values (points and bounds) from
-numbers into datetime-like objects for ease of calendar-based
-testing. This feature is not backwards compatible so for now it must be
-explicitly enabled by setting the “cell_datetime_objects” option in iris.Future:
-
>>> filename = iris.sample_data_path('uk_hires.pp')
->>> cube_all = iris.load_cube(filename, 'air_potential_temperature')
->>> print 'All times :\n', cube_all.coord('time')
-All times :
-DimCoord([2009-11-19 10:00:00, 2009-11-19 11:00:00, 2009-11-19 12:00:00], standard_name='time', calendar='gregorian')
->>> # Define a function which accepts a datetime as its argument (this is simplified in later examples).
->>> hour_11 = iris.Constraint(time=lambda cell: cell.point.hour == 11)
->>> with iris.FUTURE.context(cell_datetime_objects=True):
-...     cube_11 = cube_all.extract(hour_11)
-...
->>> print 'Selected times :\n', cube_11.coord('time')
-Selected times :
-DimCoord([2009-11-19 11:00:00], standard_name='time', calendar='gregorian')
-
-
-
Secondly, the iris.time module provides flexible time comparison
-facilities. An iris.time.PartialDateTime object can be compared to
-objects such as datetime.datetime instances, and this comparison will
-then test only those ‘aspects’ which the PartialDateTime instance defines:
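The semantics of this partial comparison can be sketched in plain Python against standard datetime objects (a schematic of the behaviour, not the PartialDateTime implementation):

```python
from datetime import datetime

def partial_match(dt, **aspects):
    """Sketch of PartialDateTime-style comparison: only the aspects that
    are defined (e.g. month, hour) take part in the test."""
    return all(getattr(dt, name) == value for name, value in aspects.items())

partial_match(datetime(2009, 11, 19, 11, 0), month=11)          # matches
partial_match(datetime(2009, 11, 19, 11, 0), month=11, hour=9)  # does not
```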
We can select points within a certain part of the year, in this case between
-the 15th of July through to the 25th of August, by combining the datetime cell
-functionality with PartialDateTime:
The iris.load_cube() and iris.load_cubes() functions are
-similar to iris.load() except they can only return
-one cube per constraint.
-The iris.load_cube() function accepts a single constraint and
-returns a single cube. The iris.load_cubes() function accepts any
-number of constraints and returns a list of cubes (as an iris.cube.CubeList).
-Providing no constraints to iris.load_cube() or iris.load_cubes()
-is equivalent to requesting exactly one cube of any type.
-
A single cube is loaded in the following example:
-
>>> filename = iris.sample_data_path('air_temp.pp')
->>> cube = iris.load_cube(filename)
->>> print cube
-air_temperature / (K) (latitude: 73; longitude: 96)
- Dimension coordinates:
- latitude x -
- longitude - x
-...
- Cell methods:
- mean: time
-
-
-
However, when attempting to load data which would result in anything other than
-one cube, an exception is raised:
-
>>> filename = iris.sample_data_path('uk_hires.pp')
->>> cube = iris.load_cube(filename)
-Traceback (most recent call last):
-...
-iris.exceptions.ConstraintMismatchError: Expected exactly one cube, found 2.
-
-
-
-
Note
-
All the load functions share many of the same features, hence
-multiple files could be loaded with wildcard filenames
-or by providing a list of filenames.
-
-
The strict nature of iris.load_cube() and iris.load_cubes()
-means that, when combined with constrained loading, it is possible to
-ensure that precisely what was asked for on load is given
-- otherwise an exception is raised.
-This fact can be utilised to make code run successfully only if
-the data provided meets the expected criteria.
-
For example, suppose that code needed air_potential_temperature
-in order to run:
The result of iris.load_cubes() in this case will be a list of 2 cubes
-ordered by the constraints provided. Multiple assignment has been used to put
-these two cubes into separate variables.
-
-
Note
-
In Python, lists of a pre-known length and order can be exploited
-using multiple assignment:
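For instance, with a list known to hold exactly two items (plain strings stand in for the two cubes here):

```python
# A two-element list unpacked into two variables in one statement.
# Unpacking fails with a ValueError if the lengths do not match.
cubes = ['air_potential_temperature cube', 'surface_altitude cube']
air_pot_temp, surface_alt = cubes
```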
We saw in the Loading Iris cubes chapter that Iris tries to load as few cubes as
-possible. This is done by collecting together multiple fields with a shared standard
-name (and other key metadata) into a single multidimensional cube. The processes that
-perform this behaviour in Iris are known as merge and concatenate.
-
This chapter describes the merge and concatenate processes; it explains
-why common issues occur when using them and gives advice on how to prevent these
-issues from occurring.
-
Both merge and concatenate take multiple cubes as input and
-result in fewer cubes as output. The following diagram illustrates the two processes:
-
-
There is one major difference between the merge and concatenate processes.
-
-
-
The merge process combines multiple input cubes into a
-single resultant cube with new dimensions created from the
-scalar coordinate values of the input cubes.
-
The concatenate process combines multiple input cubes into a
-single resultant cube with the same number of dimensions as the input cubes,
-but with the length of one or more dimensions extended by joining together
-sequential dimension coordinates.
-
-
-
Let’s imagine 28 individual cubes representing the
-temperature at a location (y,x); one cube for each day of February. We can use
-merge() to combine the 28 (y,x) cubes into
-a single (t,y,x) cube, where the length of the t dimension is 28.
-
Now imagine 12 individual cubes representing daily temperature at a time and
-location (t,y,x); one cube for each month in the year. We can use
-concatenate() to combine the 12
-(t,y,x) cubes into a single (t,y,x) cube, where the length
-of the t dimension is now 365.
We’ve seen that the merge process combines multiple input cubes into a
-single resultant cube with new dimensions created from the
-scalar coordinate values of the input cubes.
-
In order to construct new coordinates for the new dimensions, the merge process requires input cubes
-with scalar coordinates that can be combined together into monotonic sequences.
-The order of the input cubes does not affect the merge process.
-
The merge process can produce a cube that has more than one new dimension,
-if the scalar coordinate sequences form an orthogonal basis.
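The orthogonality requirement can be illustrated in plain Python: the scalar coordinate values of the input cubes must cover every combination of the new dimensions’ points (tuples of values stand in for the input cubes here):

```python
from itertools import product

# Six input cubes, each carrying one (z, t) pair of scalar coordinate values.
scalar_pairs = {(1, 10), (1, 20), (1, 30), (2, 10), (2, 20), (2, 30)}

# The pairs cover every combination of z in {1, 2} and t in {10, 20, 30},
# i.e. they form an orthogonal basis, so merge can build two new
# dimensions: z of length 2 and t of length 3.
zs = sorted({z for z, _ in scalar_pairs})
ts = sorted({t for _, t in scalar_pairs})
is_orthogonal = scalar_pairs == set(product(zs, ts))
```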
-
-
Important
-
The shape, metadata, attributes, coordinates, coordinates metadata, fill value and
-other aspects of the input cubes must be consistent across all of the input cubes.
The CubeList.merge method operates on a list
-of cubes and returns a new CubeList containing the cubes
-that have been merged.
-
Let’s have a look at the merge() method in operation.
-In this example we have a list of three lateral (x, y) cubes in a
-variable called cubes, each with a scalar z coordinate of
-differing value. We can merge these cubes by stacking the scalar z coordinates to
-make a new z dimension coordinate:
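The stacking can be illustrated with a pure-Python stand-in, using plain nested lists in place of cubes (the real merge builds an Iris cube with a new z dimension coordinate):

```python
# Three (y, x) data layers, each tagged with a scalar z value, are
# stacked into one (z, y, x) result. Input order does not matter:
# the new z coordinate is built as a monotonic sequence.
input_cubes = {2: [[3.0, 4.0]], 1: [[1.0, 2.0]], 3: [[5.0, 6.0]]}

z_points = sorted(input_cubes)                     # new z dimension coordinate
merged_data = [input_cubes[z] for z in z_points]   # shape (z, y, x)
```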
The following diagram illustrates what has taken place in this example:
-
-
The diagram illustrates that we have three input cubes of identical shape
-that stack on the z dimension.
-After merging our three input cubes we get a new CubeList containing
-one cube with a new z coordinate.
The merge_cube() method guarantees that exactly one cube will be returned
-as a result of merging the input cubes.
-If merge_cube() cannot fulfil this guarantee, a descriptive error
-will be raised providing details to help diagnose the differences between the input cubes.
-In contrast, the merge() method makes no check on the number of cubes returned.
-
To demonstrate the differences between merge()
-and merge_cube(), let’s return to our three cubes
-from the earlier merge example.
-
For the purposes of this example a Conventions attribute has been added to the first
-cube’s attributes dictionary.
-Remember that the attributes must be consistent across all cubes in order to merge
-into a single cube:
Note that merge() returns two cubes here.
-All the cubes that can be merged have been merged. Any cubes that can’t be merged are
-included unchanged in the returned CubeList.
-When merge_cube() is called on cubes it raises a
-descriptive error that highlights the difference in the attributes dictionaries.
-It is this difference that is preventing cubes being merged into a
-single cube. An example of fixing an issue like this can be found in the
-Common issues with merge and concatenate section.
The CubeList’s merge() method is used internally
-by the three main Iris load functions introduced in Loading Iris cubes.
-For file formats such as GRIB and PP, which store fields as many
-individual 2D arrays, Iris loading uses the merge process to produce a
-more intuitive higher dimensional cube of each phenomenon where possible.
-
Sometimes the merge process doesn’t behave as expected. In almost all
-cases this is due to the input cubes containing unexpected or inconsistent metadata.
-For this reason, a fourth Iris file loading function, iris.load_raw(), exists.
-The load_raw() function is intended as a diagnostic tool that can be used to
-load cubes from files without the merge process taking place. The return value of
-iris.load_raw() is always a CubeList instance.
-You can then call the merge_cube() method on this returned
-CubeList to help identify merge related load issues.
We’ve seen that the concatenate process combines multiple input cubes into a
-single resultant cube with the same number of dimensions as the input cubes,
-but with the length of one or more dimensions extended by joining together
-sequential dimension coordinates.
-
In order to extend the dimension lengths, the concatenate process requires input cubes
-with dimension coordinates that can be combined together into monotonic sequences.
-The order of the input cubes does not affect the concatenate process.
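The monotonicity requirement can be sketched in plain Python, with lists of coordinate points standing in for the input cubes’ dimension coordinates:

```python
def can_concatenate(coord_ranges):
    """Sketch of the concatenate requirement: the dimension coordinate
    points of the input cubes, taken together, must form a strictly
    monotonic sequence. The order the inputs arrive in does not matter."""
    points = sorted(p for r in coord_ranges for p in r)
    return all(a < b for a, b in zip(points, points[1:]))

can_concatenate([[3, 4], [0, 1, 2], [5, 6]])   # ranges join cleanly
can_concatenate([[0, 1, 2], [2, 3]])           # the point 2 overlaps
```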
-
-
Important
-
The shape, metadata, attributes, coordinates, coordinates metadata, fill value and
-other aspects of the input cubes must be consistent across all of the input cubes.
The CubeList.concatenate method operates on a list
-of cubes and returns a new CubeList containing the cubes
-that have been concatenated.
-
Let’s have a look at the concatenate() method in operation.
-In the example below we have three 3D (t,y,x) cubes whose t coordinates
-have sequentially increasing ranges.
-These cubes can be concatenated by combining the t coordinates of the input
-cubes to form a new cube with an extended t coordinate:
The following diagram illustrates what has taken place in this example:
-
-
The diagram illustrates that we have three 3D input cubes
-that line up on the t dimension.
-After concatenating our three input cubes we get a new CubeList
-containing one cube with an extended t coordinate.
The concatenate_cube() method guarantees that exactly one
-cube will be returned as a result of concatenating the input cubes.
-If concatenate_cube() cannot fulfil this guarantee, a descriptive error
-will be raised providing details to help diagnose the differences between the input cubes.
-In contrast, the concatenate() method makes no check on the number
-of cubes returned.
-
To demonstrate the differences between concatenate()
-and concatenate_cube(), let’s return to our three cubes
-from the earlier concatenate example.
-
For the purposes of this example we’ll add a History attribute to the first
-cube’s attributes dictionary.
-Remember that the attributes must be consistent across all cubes in order to
-concatenate into a single cube:
Note that concatenate() returns two cubes here.
-All the cubes that can be concatenated have been concatenated. Any cubes that can’t be concatenated are
-included unchanged in the returned CubeList.
-When concatenate_cube() is called on cubes it raises a
-descriptive error that highlights the difference in the attributes dictionaries.
-It is this difference that is preventing cubes being concatenated into a
-single cube. An example of fixing an issue like this can be found in the
-Common issues with merge and concatenate section.
The Iris algorithms that drive merge() and
-concatenate() are complex and depend
-on a number of different elements of the input cubes being consistent across
-all input cubes.
-If this consistency is not maintained then the
-merge() or
-concatenate() process can fail in a
-seemingly arbitrary manner.
-
The methods merge_cube() and
-concatenate_cube()
-were introduced to Iris to help you locate differences in input cubes
-that prevent the input cubes merging or concatenating.
-Nevertheless, certain difficulties with using
-merge() and
-concatenate() occur frequently.
-This section describes these common difficulties, why they arise and
-what you can do to avoid them.
Differences in the attributes of the input cubes probably
-cause the greatest amount of merge-related difficulties.
-In recognition of this, Iris has a helper function,
-equalise_attributes(), to equalise
-attribute differences in the input cubes.
To demonstrate using equalise_attributes(),
-let’s return to our non-merging list of input cubes from the merge_cube example
-from earlier.
-We’ll call equalise_attributes() on the
-input cubes before merging the input cubes using merge_cube():
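The effect of the helper can be sketched in plain Python, with attribute dictionaries standing in for the cubes (the real iris.util.equalise_attributes operates on each cube’s attributes):

```python
def equalise_attributes(attribute_dicts):
    """Sketch of equalise_attributes: any attribute that is not identical
    across every input is removed from them all."""
    common = dict(attribute_dicts[0])
    for attrs in attribute_dicts[1:]:
        for key in list(common):
            if attrs.get(key) != common[key]:
                del common[key]
    for attrs in attribute_dicts:
        attrs.clear()
        attrs.update(common)

a = {'source': 'model', 'Conventions': 'CF-1.5'}
b = {'source': 'model'}
equalise_attributes([a, b])   # the differing 'Conventions' entry is removed
```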
Merging input cubes with inconsistent dimension lengths can cause misleading results.
-This is a common problem when merging cubes generated by different ensemble members in a model run.
-
When this occurs, the merged cube gains an anonymous leading dimension.
-All the merged coordinates appear as auxiliary coordinates on the anonymous leading dimension.
-This is shown in the example below:
-
>>> print cube
-surface_temperature / (K) (-- : 5494; latitude: 325; longitude: 432)
- Dimension coordinates:
- latitude - x -
- longitude - - x
- Auxiliary coordinates:
- forecast_month x - -
- forecast_period x - -
- forecast_reference_time x - -
- realization x - -
- time x - -
-
-
-
Merging Duplicate Cubes
-
The Iris load process does not merge duplicate cubes (two or more identical cubes in
-the input cubes) by default.
-This behaviour can be changed by setting the unique keyword argument
-to merge() to False.
-
Merging duplicate cubes can cause misleading results. Let’s demonstrate these
-behaviours and misleading results with the following example.
-In this example we have three input cubes.
-The first has a scalar z coordinate with value 1, the second has a
-scalar z coordinate with value 2 and the third has a scalar z
-coordinate with value 1.
-The first and third cubes are thus identical.
-We will demonstrate the effect of merging the input cubes with unique=False
-(duplicate cubes allowed) and unique=True (duplicate cubes not allowed, which
-is the default behaviour):
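The behaviour can be sketched in plain Python, with each input cube represented only by its scalar z value. The grouping rule below is a schematic sized to reproduce the result described in the text, not the real merge algorithm:

```python
from collections import Counter

def merge_z_points(z_values, unique=True):
    """Sketch of merging scalar z values. Duplicate values mean duplicate
    input cubes: with unique=True (the default) this raises an error;
    with unique=False the result splits into several cubes, and values
    without a duplicate are reused in each (the misleading part)."""
    counts = Counter(z_values)
    if unique and max(counts.values()) > 1:
        raise ValueError('failed to merge into a single cube: duplicate cubes')
    distinct = sorted(counts)
    return [list(distinct) for _ in range(max(counts.values()))]

merge_z_points([1, 2, 1], unique=False)   # two (z: 2) results, 2 in both
```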
Notice how merging the input cubes with duplicate cubes allowed produces a result
-with four z coordinate values.
-Closer inspection of these two resultant cubes demonstrates that the
-scalar z coordinate with value 2 is found in both cubes.
-
Trying to merge the input cubes with duplicate cubes not allowed raises an
-error highlighting the presence of the duplicate cube.
-
Single value coordinates
-
Coordinates containing only a single value can cause confusion when
-combining input cubes. Remember:
-
-
The merge process combines multiple input cubes into a
-single resultant cube with new dimensions created from the
-scalar coordinate values of the input cubes.
-
The concatenate process combines multiple input cubes into a
-single resultant cube with the same number of dimensions as the input cubes,
-but with the length of one or more dimensions extended by joining together
-sequential dimension coordinates.
-
-
In Iris terminology a scalar coordinate is a
-coordinate of length 1 which does not describe a data dimension.
-
Let’s look at two example cubes to demonstrate this.
-
If your cubes are similar to those below (the single value z coordinate
-is not on a dimension) then use merge() to
-combine your cubes:
-
>>> print cubes[0]
-air_temperature / (kelvin) (y: 4; x: 5)
- Dimension coordinates:
- y x -
- x - x
- Scalar coordinates:
- z: 1
->>> print cubes[1]
-air_temperature / (kelvin) (y: 4; x: 5)
- Dimension coordinates:
- y x -
- x - x
- Scalar coordinates:
- z: 2
-
-
-
If your cubes are similar to those below (the single value z coordinate is
-associated with a dimension) then use concatenate() to
-combine your cubes:
Differences in the units of the time coordinates of the input cubes probably cause
-the greatest amount of concatenate-related difficulties.
-In recognition of this, Iris has a helper function,
-unify_time_units(), to apply a common time unit to all the input cubes.
-
To demonstrate using unify_time_units(),
-let’s adapt our list of input cubes from the concatenate_cube example from earlier.
-We’ll give the input cubes unequal time coordinate units and call
-unify_time_units() on the input cubes before concatenating
-the input cubes using concatenate_cube():
-
>>> from iris.util import unify_time_units
->>> print cubes
-0: air_temperature / (kelvin) (t: 31; y: 3; x: 4)
-1: air_temperature / (kelvin) (t: 28; y: 3; x: 4)
-2: air_temperature / (kelvin) (t: 31; y: 3; x: 4)
-
->>> print cubes[0].coord('t').units
-days since 1990-02-15
->>> print cubes[1].coord('t').units
-days since 1970-01-01
-
->>> print cubes.concatenate_cube()
-Traceback (most recent call last):
- ...
-ConcatenateError: failed to concatenate into a single cube.
- Dimension coordinates metadata differ: t != t
-
->>> unify_time_units(cubes)
-
->>> print cubes[1].coord('t').units
-days since 1990-02-15
-
->>> print cubes.concatenate_cube()
-air_temperature / (kelvin) (t: 90; y: 3; x: 4)
- Dimension coordinates:
- t x - -
- y - x -
- x - - x
-
-
-
Attributes Mismatch
-
The concatenate process is affected by attributes mismatch on input cubes
-in the same way that the merge process is.
-The Attributes Mismatch section earlier in this
-chapter gives further information on attributes mismatch.
After loading any cube, you will want to investigate precisely what it contains. This section is all about accessing
-and manipulating the metadata contained within a cube.
We have already seen a basic string representation of a cube when printing:
-
>>> import iris
->>> filename = iris.sample_data_path('rotated_pole.nc')
->>> cube = iris.load_cube(filename)
->>> print cube
-air_pressure_at_sea_level / (Pa) (grid_latitude: 22; grid_longitude: 36)
- Dimension coordinates:
- grid_latitude x -
- grid_longitude - x
- Scalar coordinates:
- forecast_period: 0.0 hours
- forecast_reference_time: 2006-06-15 00:00:00
- time: 2006-06-15 00:00:00
- Attributes:
- Conventions: CF-1.5
- STASH: m01s16i222
- source: Data from Met Office Unified Model 6.01
-
-
-
This representation is equivalent to passing the cube to the str() function. This function can be used on
-any Python variable to get a string representation of that variable.
-Similarly there exist other standard functions for interrogating your variable: repr(), type() for example:
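For example, applied to a simple float rather than a cube:

```python
x = 3.14
print(str(x))    # the plain string form, '3.14'
print(repr(x))   # a representation you could paste back into Python
print(type(x))   # the variable's type (float here)
```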
Other, more verbose, functions also exist which give information on what you can do with any given
-variable. In most cases it is reasonable to ignore anything starting with a “_” (underscore) or a “__” (double underscore):
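For example, dir() lists every attribute name of a variable, and the underscore-prefixed internals can be filtered out to leave just the public interface:

```python
x = 3.14
# Keep only the names that do not start with '_' or '__'.
public_names = [name for name in dir(x) if not name.startswith('_')]
# For a float this includes methods such as 'is_integer'.
```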
Interrogating these with the standard type() function will tell you that standard_name and long_name
-are either a string or None, and units is an instance of iris.unit.Unit.
-
You can access a string representing the “name” of a cube with the Cube.name() method:
-
print cube.name()
-
-
-
The result of which is always a string.
-
Each cube also has a numpy array which represents the phenomenon of the cube and which can be accessed with the
-Cube.data attribute. As you can see, the type is a numpy n-dimensional array:
-
print type(cube.data)
-
-
-
-
Note
-
When loading some file formats in Iris, the data itself is not loaded until the first time that the data is requested.
-Hence you may have noticed that running the previous command for the first time takes a little longer than it does for
-subsequent calls.
-
For this reason, when you have a large cube it is strongly recommended that you do not access the cube's data unless
you need to.
For convenience, shape and ndim attributes exist on a cube, which
can tell you the shape of the cube's data without loading it:
-
print cube.shape
print cube.ndim
-
-
-
-
You can change the units of a cube using the convert_units() method. For example:
-
cube.convert_units('celsius')
-
-
-
As well as changing the value of the units attribute this will also convert the values in
-data. To replace the units without modifying the data values one can change the
-units attribute directly.
-
Some cubes represent a processed phenomenon, recorded as cell methods; these can be accessed on a
cube with the Cube.cell_methods attribute:
A callback is a user-defined function with the calling sequence function(cube, field, filename).
It can make any modifications to the cube in-place, or alternatively return a completely new cube instance.
-
Suppose we wish to load a lagged ensemble dataset from the Met Office’s GloSea4 model.
-The data for this example represents 13 ensemble members of 6 one month timesteps; the logistics of the
-model mean that the run is spread over several days.
-
If we try to load the data directly for surface_temperature:
We get multiple cubes, some with more dimensions than expected and some without a realization (i.e. ensemble member) dimension.
In this case, two of the PP files have been encoded without the appropriate realization number attribute, so
the appropriate coordinate cannot be added to the resultant cube. Fortunately, the missing attribute has been encoded in the filename,
from which we can extract it:
We can solve this problem by adding the appropriate metadata, on load, by using a callback function, which runs on a field
-by field basis before they are automatically merged together:
-
import numpy as np
import iris
import iris.coords as icoords

def lagged_ensemble_callback(cube, field, filename):
    # Add our own realization coordinate if it doesn't already exist.
    if not cube.coords('realization'):
        realization = np.int32(filename[-6:-3])
        ensemble_coord = icoords.AuxCoord(realization, standard_name='realization')
        cube.add_aux_coord(ensemble_coord)

filename = iris.sample_data_path('GloSea4', '*.pp')

print iris.load(filename, 'surface_temperature', callback=lagged_ensemble_callback)
-
-
-
The result is a single cube which represents the data in a form that was expected:
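The slice filename[-6:-3] in the callback above assumes the realization number sits just before the '.pp' extension; a minimal iris-free sketch with hypothetical filenames:

```python
# Hypothetical GloSea4-style filenames: the ensemble member (realization)
# number sits just before the ".pp" extension, so filename[-6:-3] picks
# out e.g. "008".
filenames = ['ensemble_008.pp', 'ensemble_009.pp', 'ensemble_010.pp']

realizations = [int(name[-6:-3]) for name in filenames]
print(realizations)  # [8, 9, 10]
```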
Iris utilises the power of Python’s
-Matplotlib package in order to generate
-high quality, production ready 1D and 2D plots.
-The functionality of the Matplotlib
-pyplot module has
-been extended within Iris to facilitate easy visualisation of a cube’s data.
This code will automatically create a figure with appropriate axes for the plot
-and show it on screen.
-The call to plt.plot([1, 2, 2.5]) will create a line plot with
-appropriate axes for the data (x=0, y=1; x=1, y=2; x=2, y=2.5).
-The call to plt.show() tells Matplotlib that you have finished with
-this plot and that you would like to visualise it in a window.
-This is an example of using matplotlib in non-interactive mode.
-
There are two modes of rendering within Matplotlib; interactive and
-non-interactive.
The previous example was non-interactive as the figure is only rendered
-after the call to plt.show().
-Rendering plots interactively can be achieved by changing the interactive
-mode:
In this case the plot is rendered automatically with no need to explicitly call
-matplotlib.pyplot.show() after plt.plot.
-Subsequent changes to your figure will be automatically rendered in the window.
-
The current rendering mode can be determined as follows:
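A minimal sketch using the standard pyplot calls (plt.isinteractive(), plt.ion(), plt.ioff()); the Agg backend is chosen here only so the snippet runs headlessly:

```python
import matplotlib
matplotlib.use('Agg')        # non-GUI backend so the snippet runs anywhere
import matplotlib.pyplot as plt

plt.ioff()                   # ensure we start in non-interactive mode
print(plt.isinteractive())   # False

plt.ion()                    # switch interactive rendering on
print(plt.isinteractive())   # True
```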
For clarity, each example includes all of the imports required to run on its
-own; when combining examples such as the two above, it would not be necessary
-to repeat the import statement more than once:
The matplotlib.pyplot.savefig() function is similar to plt.show()
in that both are used in non-interactive visualisation.
As you might expect, plt.savefig saves your figure as an image:
The filename extension passed to the matplotlib.pyplot.savefig()
-function can be used to control the output file format of the plot
-(keywords can also be used to control this and other aspects,
-see matplotlib.pyplot.savefig()).
-
Some of the formats which are supported by plt.savefig:

Format  Type    Description
------  ------  -----------
EPS     Vector  Encapsulated PostScript
PDF     Vector  Portable Document Format
PNG     Raster  Portable Network Graphics, a format with a lossless compression method
The Iris modules iris.quickplot and iris.plot extend the
-Matplotlib pyplot interface by implementing thin wrapper functions.
-These wrapper functions simply bridge the gap between an Iris cube and
-the data expected by standard Matplotlib pyplot functions.
This means that all Matplotlib pyplot functionality,
including keyword options, is still available through the Iris plotting
wrapper functions.
-
As a rule of thumb:
-
-
-
if you wish to do a visualisation with a cube, use iris.plot or
-iris.quickplot.
-
if you wish to show, save or manipulate any visualisation,
-including ones created with Iris, use matplotlib.pyplot.
-
if you wish to create a non-cube visualisation, also use
matplotlib.pyplot.
-
-
-
The iris.quickplot module is exactly the same as the iris.plot module,
-except that quickplot will add a title, x and y labels and a colorbar
-where appropriate.
-
-
Note
-
In all subsequent examples the matplotlib.pyplot, iris.plot and
iris.quickplot modules are imported as plt, iplt and qplt
respectively in order to make the code more readable.
This is equivalent to:

import matplotlib.pyplot as plt
import iris.plot as iplt
import iris.quickplot as qplt
The simplest 1D plot is achieved with the iris.plot.plot() function.
-The syntax is very similar to that which you would provide to Matplotlib’s
-equivalent matplotlib.pyplot.plot() and indeed all of the
-keyword arguments are equivalent:
-
from __future__ import (absolute_import, division, print_function)

import matplotlib.pyplot as plt

import iris
import iris.plot as iplt


fname = iris.sample_data_path('air_temp.pp')
temperature = iris.load_cube(fname)

# Take a 1d slice using array style indexing.
temperature_1d = temperature[5, :]

iplt.plot(temperature_1d)
plt.show()
As well as providing simple Matplotlib wrappers, Iris also has an
iris.quickplot module, which adds extra cube-based metadata
to a plot.
-For example, the previous plot can be improved quickly by replacing
-iris.plot with iris.quickplot:
-
from __future__ import (absolute_import, division, print_function)

import matplotlib.pyplot as plt

import iris
import iris.quickplot as qplt


fname = iris.sample_data_path('air_temp.pp')
temperature = iris.load_cube(fname)

# Take a 1d slice using array style indexing.
temperature_1d = temperature[5, :]

qplt.plot(temperature_1d)
plt.show()
-
A multi-lined (or over-plotted) plot, with a legend, can be achieved easily by
-calling iris.plot.plot() or iris.quickplot.plot() consecutively
-and providing the label keyword to identify it.
-Once all of the lines have been added the matplotlib.pyplot.legend()
-function can be called to indicate that a legend is desired:
-
"""
-Multi-line temperature profile plot
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-"""
-importmatplotlib.pyplotasplt
-
-importiris
-importiris.plotasiplt
-importiris.quickplotasqplt
-
-
-defmain():
- fname=iris.sample_data_path('air_temp.pp')
-
- # Load exactly one cube from the given file.
- temperature=iris.load_cube(fname)
-
- # We only want a small number of latitudes, so filter some out
- # using "extract".
- temperature=temperature.extract(
- iris.Constraint(latitude=lambdacell:68<=cell<78))
-
- forcubeintemperature.slices('longitude'):
-
- # Create a string label to identify this cube (i.e. latitude: value).
- cube_label='latitude: %s'%cube.coord('latitude').points[0]
-
- # Plot the cube, and associate it with a label.
- qplt.plot(cube,label=cube_label)
-
- # Add the legend with 2 columns.
- plt.legend(ncol=2)
-
- # Put a grid on the plot.
- plt.grid(True)
-
- # Tell matplotlib not to extend the plot axes range to nicely
- # rounded numbers.
- plt.axis('tight')
-
- # Finally, show it.
- iplt.show()
-
-
-if__name__=='__main__':
- main()
-
This example of consecutive qplt.plot calls coupled with the
-Cube.slices() method on a cube shows
-the temperature at some latitude cross-sections.
-
-
Note
-
The previous example uses the if __name__ == "__main__" style to run
the desired code if and only if the script is run from the command line.
-
This is a good habit to get into when writing scripts in Python as it means
-that any useful functions or variables defined within the script can be
-imported into other scripts without running all of the code and thus
-creating an unwanted plot. This is discussed in more detail at
-http://effbot.org/pyfaq/tutor-what-is-if-name-main-for.htm.
-
In order to run this example, you will need to copy the code into a file
and run it using python2.7 my_file.py.
from __future__ import (absolute_import, division, print_function)

import matplotlib.pyplot as plt

import iris
import iris.quickplot as qplt


fname = iris.sample_data_path('air_temp.pp')
temperature_cube = iris.load_cube(fname)

# Add a contour, and put the result in a variable called contour.
contour = qplt.contour(temperature_cube)

# Add coastlines to the map created by contour.
plt.gca().coastlines()

# Add contour labels based on the contour we have just created.
plt.clabel(contour)

plt.show()
-
In some situations the underlying coordinates are better represented with a
-continuous bounded coordinate, in which case a “block” plot may be more
-appropriate.
-Continuous block plots can be achieved with either iris.plot.pcolormesh()
-or iris.quickplot.pcolormesh().
from __future__ import (absolute_import, division, print_function)

import matplotlib.pyplot as plt

import iris
import iris.quickplot as qplt


# Load the data for a single value of model level number.
fname = iris.sample_data_path('hybrid_height.nc')
temperature_cube = iris.load_cube(
    fname, iris.Constraint(model_level_number=1))

# Draw the block plot.
qplt.pcolormesh(temperature_cube)

plt.show()
-
Iris includes colour specifications and designs developed by
-Cynthia Brewer.
-These colour schemes are freely available under the following licence:
-
Apache-Style Software License for ColorBrewer software and ColorBrewer Color Schemes
-
-Copyright (c) 2002 Cynthia Brewer, Mark Harrower, and The Pennsylvania State University.
-
-Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License.
-You may obtain a copy of the License at
-
-http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing, software distributed
-under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
-CONDITIONS OF ANY KIND, either express or implied. See the License for the
-specific language governing permissions and limitations under the License.
-
-
-
To include a reference in a journal article or report please refer to
-section 5
-in the citation guidance provided by Cynthia Brewer.
To plot a cube using a Brewer colour palette, simply select one of the Iris
-registered Brewer colour palettes and plot the cube as normal. The Brewer palettes
-become available once iris.plot or iris.quickplot are imported.
-
from __future__ import (absolute_import, division, print_function)

import matplotlib.cm as mpl_cm
import matplotlib.pyplot as plt

import iris
import iris.quickplot as qplt

fname = iris.sample_data_path('air_temp.pp')
temperature_cube = iris.load_cube(fname)

# Load a Cynthia Brewer palette.
brewer_cmap = mpl_cm.get_cmap('brewer_OrRd_09')

# Draw the contours, with n-levels set for the map colours (9).
# NOTE: needed as the map is non-interpolated, but matplotlib does not provide
# any special behaviour for these.
qplt.contourf(temperature_cube, brewer_cmap.N, cmap=brewer_cmap)

# Add coastlines to the map created by contourf.
plt.gca().coastlines()

plt.show()
-
Citations can be easily added to a plot using the
-iris.plot.citation() function.
-The recommended text for the Cynthia Brewer citation is provided by
-iris.plot.BREWER_CITE.
-
from __future__ import (absolute_import, division, print_function)

import matplotlib.pyplot as plt

import iris
import iris.quickplot as qplt
import iris.plot as iplt


fname = iris.sample_data_path('air_temp.pp')
temperature_cube = iris.load_cube(fname)

# Get the Purples "Brewer" palette.
brewer_cmap = plt.get_cmap('brewer_Purples_09')

# Draw the contours, with n-levels set for the map colours (9).
# NOTE: needed as the map is non-interpolated, but matplotlib does not provide
# any special behaviour for these.
qplt.contourf(temperature_cube, brewer_cmap.N, cmap=brewer_cmap)

# Add a citation to the plot.
iplt.citation(iris.plot.BREWER_CITE)

# Add coastlines to the map created by contourf.
plt.gca().coastlines()

plt.show()
-
The Loading Iris cubes section of the user guide showed how to load data into multidimensional Iris cubes.
-However it is often necessary to reduce the dimensionality of a cube down to something more appropriate and/or manageable.
-
Iris provides several ways of reducing the amount of data and/or the number of dimensions in your cube, depending on the circumstance.
In all cases the subset of a valid cube is itself a valid cube.
A subset of a cube can be “extracted” from a multi-dimensional cube in order to reduce its dimensionality:
-
>>> import iris
>>> filename = iris.sample_data_path('space_weather.nc')
>>> cube = iris.load_cube(filename, 'electron density')
>>> equator_slice = cube.extract(iris.Constraint(grid_latitude=0))
>>> print equator_slice
-electron density / (1E11 e/m^3) (height: 29; grid_longitude: 31)
- Dimension coordinates:
- height x -
- grid_longitude - x
- Auxiliary coordinates:
- latitude - x
- longitude - x
- Scalar coordinates:
- grid_latitude: 0.0 degrees
- Attributes:
- Conventions: CF-1.5
-
-
-
In this example we start with a 3 dimensional cube, with dimensions of height, grid_latitude and grid_longitude,
-and extract every point where the latitude is 0, resulting in a 2d cube with axes of height and grid_longitude.
-
-
Warning
-
Caution is required when using equality constraints with floating point coordinates such as grid_latitude.
Printing the points of a coordinate does not necessarily show the full precision of the underlying number, and it
is very easy to return no matches to a constraint when one was expected.
This can be avoided by using a function as the argument to the constraint:
-
def near_zero(cell):
    """Returns true if the cell is between -0.1 and 0.1."""
    return -0.1 < cell < 0.1

equator_constraint = iris.Constraint(grid_latitude=near_zero)
-
-
-
Often you will see this construct in shorthand using a lambda function definition:

equator_constraint = iris.Constraint(grid_latitude=lambda cell: -0.1 < cell < 0.1)
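The floating-point pitfall behind this warning is easy to reproduce without Iris at all; near() below is a hypothetical helper, not part of the Iris API:

```python
# A coordinate point computed in floating point rarely equals the "round"
# value printed for it: 0.1 + 0.2 prints as 0.3 but is not exactly 0.3.
point = 0.1 + 0.2

print(point == 0.3)   # False: exact equality fails
print(repr(point))    # 0.30000000000000004

# A tolerance-based predicate, in the spirit of near_zero above, matches
# the value as intended.
def near(target, tol=1e-9):
    return lambda cell: abs(cell - target) < tol

print(near(0.3)(point))   # True
```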
As we saw in Loading Iris cubes the result of iris.load() is a CubeList.
-The extract method also exists on a CubeList and behaves in exactly the
-same way as loading with constraints:
-
>>> import iris
>>> air_temp_and_fp_6 = iris.Constraint('air_potential_temperature', forecast_period=6)
>>> level_10 = iris.Constraint(model_level_number=10)
>>> filename = iris.sample_data_path('uk_hires.pp')
>>> cubes = iris.load(filename).extract(air_temp_and_fp_6 & level_10)
>>> print cubes
0: air_potential_temperature / (K) (grid_latitude: 204; grid_longitude: 187)
>>> print cubes[0]
-air_potential_temperature / (K) (grid_latitude: 204; grid_longitude: 187)
- Dimension coordinates:
- grid_latitude x -
- grid_longitude - x
- Auxiliary coordinates:
- surface_altitude x x
- Derived coordinates:
- altitude x x
- Scalar coordinates:
- forecast_period: 6.0 hours
- forecast_reference_time: 2009-11-19 04:00:00
- level_height: 395.0 m, bound=(360.0, 433.333) m
- model_level_number: 10
- sigma: 0.954993, bound=(0.958939, 0.95068)
- time: 2009-11-19 10:00:00
- Attributes:
- STASH: m01s00i004
- source: Data from Met Office Unified Model
- um_version: 7.3
-
A useful way of dealing with a Cube in its entirety is to iterate over its layers or slices.
For example, to deal with a 3 dimensional cube (z, y, x), you could iterate over all 2 dimensional slices in y and x
which make up the full 3d cube:
The Python function enumerate() is used in this example to provide an incrementing variable i which is
-printed with the summary of each cube slice. Note that there were 1500 1d longitude cubes as a result of
-slicing the 3 dimensional cube (15, 100, 100) by longitude (i starts at 0 and 1500 = 15 * 100).
-
-
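The number of slices is simply the product of the sizes of the dimensions not kept in each slice; a quick sanity check in plain Python:

```python
from functools import reduce
import operator

shape = (15, 100, 100)   # the (z, y, x) cube from the example above
kept = 2                 # index of the 'longitude' dimension kept in each slice

# Each 1d slice fixes every dimension except 'longitude', so the slice
# count is the product of the remaining dimension sizes: 15 * 100.
remaining = [size for i, size in enumerate(shape) if i != kept]
n_slices = reduce(operator.mul, remaining, 1)
print(n_slices)  # 1500
```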
Hint
-
It is often useful to get a single 2d slice from a multidimensional cube in order to develop a 2d plot function, for example.
-This can be achieved by using the next() method on the result of slices:
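slices() returns an ordinary Python iterator, so this is the usual next() pattern on a generator; fake_slices below is a stand-in, not an Iris function:

```python
# Stand-in for cube.slices('longitude'): a generator yielding mock slices.
def fake_slices():
    for i in range(3):
        yield 'slice-%d' % i

# Grab just the first slice, without iterating over the rest.
first_slice = next(fake_slices())
print(first_slice)  # slice-0
```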
In the same way that you would expect a numeric multidimensional array to be indexed to take a subset of your
-original array, you can index a Cube for the same purpose.
-
Here are some examples of array indexing in numpy:
-
import numpy as np
# create an array of 12 consecutive integers starting from 0
a = np.arange(12)
print a

print a[0]     # first element of the array

print a[-1]    # last element of the array

print a[0:4]   # first four elements of the array (this is the same as a[:4])

print a[-4:]   # last four elements of the array

print a[::-1]  # gives all of the array, but backwards

# Make a 2d array by reshaping a
b = a.reshape(3, 4)
print b

print b[0, 0]  # first element of the first and second dimensions

print b[0]     # first element of the first dimension (+ every other dimension)

# get the second element of the first dimension and all of the second dimension
# in reverse, by steps of two.
print b[1, ::-2]
-
-
-
Similarly, Iris cubes have indexing capability:
-
import iris
filename = iris.sample_data_path('hybrid_height.nc')
cube = iris.load_cube(filename)

print cube

# get the first element of the first dimension (+ every other dimension)
print cube[0]

# get the last element of the first dimension (+ every other dimension)
print cube[-1]

# get the first 4 elements of the first dimension (+ every other dimension)
print cube[0:4]

# Get the first element of the first and third dimension (+ every other dimension)
print cube[0, :, 0]

# Get the second element of the first dimension and all of the second dimension
# in reverse, by steps of two.
print cube[1, ::-2]
-
This document explains the new/changed features of Iris in version 1.0.
-(View all changes.)
-
With the release of Iris 1.0, we have broadly completed the transition
-to the CF data model, and established a stable foundation for future
-work. Following this release we plan to deliver significant performance
-improvements and additional features.
The 1.x series of releases is intended to provide a relatively stable,
-backwards-compatible platform based on the CF-netCDF data model, upon
-which long-lived services can be built.
-
Iris 1.0 targets the data model implicit in CF-netCDF 1.5. This will be
-extended to cover the new features of CF-netCDF 1.6 (e.g. discrete
-sampling geometries) and any subsequent versions which maintain
-backwards compatibility. Similarly, as the efforts of the CF community
-to formalise their data model reach maturity, they will be included
-in Iris where significant backwards-compatibility can be maintained.
The coordinate systems in Iris are now defined by the CF-netCDF
-grid mappings.
-As of Iris 1.0 a subset of the CF-netCDF coordinate systems are
-supported, but this will be expanded in subsequent versions. Adding
-this code is a relatively simple, incremental process - it would make a
-good task to tackle for users interested in getting involved in
-contributing to the project.
-
The coordinate systems available in Iris 1.0 and their corresponding
-Iris classes are:
For convenience, Iris also includes the OSGB
-class which provides a simple way to create the transverse Mercator
-coordinate system used by the British
-Ordnance Survey.
The underlying map drawing package has now been updated to use
-Cartopy. Cartopy provides a
-highly flexible set of mapping tools, with a consistent, intuitive
-interface. As yet it doesn’t have feature-parity with basemap, but its
-goal is to make maps “just work”, making it the perfect complement to Iris.
-
The iris.plot.map_setup function has now been replaced with a cleaner
-interface:
-
-
-
To draw a cube on its native map projection, one can simply draw the cube directly:
The iris.plot.gcm function to get the current map is now
redundant; the current map is simply the current matplotlib axes,
so matplotlib.pyplot.gca() should be used instead.
-
-
For more examples of what can be done with Cartopy, see the Iris gallery and
-Cartopy’s documentation.
With the introduction of the HybridPressureFactory
-class, it is now possible to represent data expressed on a
-hybrid-pressure vertical coordinate, as defined by the second variant in
-Appendix D.
-A hybrid-pressure factory is created with references to the coordinates
-which provide the components of the hybrid coordinate (“ap” and “b”) and
-the surface pressure. In return, it provides a virtual “pressure”
-coordinate whose values are derived from the given components.
-
This facility is utilised by the GRIB2 loader to automatically provide
-the derived “pressure” coordinate for certain data [1] from the
-ECMWF.
When saving a Cube to a netCDF file, Iris will now define the outermost
-dimension as an unlimited/record dimension. In combination with the
-iris.cube.Cube.transpose() method, this allows any dimension to
-take the role of the unlimited/record dimension.
Also, Iris will now ensure that netCDF files are properly closed when
-they are no longer in use. Previously this could cause problems when
-dealing with large numbers of netCDF files, or in long running
-processes.
Iris includes a selection of carefully designed colour palettes produced
-by Cynthia Brewer. The iris.palette module registers the Brewer
-colour palettes with matplotlib, so they are explicitly selectable via
-the matplotlib.pyplot.set_cmap() function. For example:
Citations can easily be added to a plot using the iris.plot.citation()
-function. The recommended text for the Cynthia Brewer citation is provided
-by iris.plot.BREWER_CITE.
-
To include a reference in a journal article or report please refer to
-section 5
-in the citation guidance provided by Cynthia Brewer.
Iris now stores “source” and “history” metadata in Cube attributes.
-For example:
-
>>> print iris.tests.stock.global_pp()
-air_temperature (latitude: 73; longitude: 96)
- ...
- Attributes:
- ...
- source: Data from Met Office Unified Model
- ...
-
-
-
Where previously it would have appeared as:
-
air_temperature (latitude: 73; longitude: 96)
- ...
- Scalar coordinates:
- ...
- source: Data from Met Office Unified Model
- ...
-
-
-
-
Note
-
This change breaks backwards compatibility with Iris 0.9. But
-if it is desirable to have the “source” metadata expressed as a
-coordinate then it can be done with the following pattern:
In addition, iris.load_raw() has been provided as a last resort
-for situations where the automatic cube merging is not appropriate.
-However, if you find you need to use this function we would encourage
-you to contact the Iris developers so we can see if a fix can be made
-to the cube merge algorithm.
-
The iris.load_strict() function has been deprecated. Code should
-now use the iris.load_cube() and iris.load_cubes()
-functions instead.
Iris now has the ability to project a cube into a number of map projections.
-This functionality is provided by iris.analysis.cartography.project().
-For example:
This function is intended to be used in cases where the cube’s coordinates
-prevent one from directly visualising the data, e.g. when the longitude
-and latitude are two dimensional and do not make up a regular grid. The
-function uses a nearest neighbour approach rather than any form of
-linear/non-linear interpolation to determine the data value of each cell
-in the resulting cube. Consequently it may have an adverse effect on the
-statistics of the data e.g. the mean and standard deviation will not be
-preserved. This function currently assumes global data and will if
-necessary extrapolate beyond the geographical extent of the source cube.
This document explains the new/changed features of Iris in version 1.1.
-(View all changes.)
-
With the release of Iris 1.1, we are introducing support for Mac OS X.
-Version 1.1 also sees the first batch of performance enhancements, with
-some notable improvements to netCDF/PP import.
PP export no longer attempts to set/overwrite the STASH code based on
-the standard_name.
-
Cell comparisons now work consistently, which fixes a bug where
bounded_cell > point_cell compared the point to the bounds, but
point_cell < bounded_cell compared only the points.
-
Fieldsfile import now correctly recognises pre v3.1 and post v5.2
-versions, which fixes a bug where the two were interchanged.
-
iris.analysis.trajectory.interpolate now handles hybrid-height.
These functions mimic their non-custom versions, but with the addition
-of a seasons parameter which is used to define the custom seasons.
-These seasons are defined by concatenating the single letter
-abbreviations of the relevant, consecutive months.
-
For example, to categorise a Cube based on “winter” and “summer” months,
-one might do:
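The concatenation rule for season names can be sanity-checked in plain Python; winter and summer below are hypothetical custom seasons covering November-April and May-October:

```python
# Single-letter month abbreviations, January..December, as used when
# building the "seasons" parameter.
months = 'jfmamjjasond'

# Two hypothetical six-month seasons: "winter" = Nov-Apr, "summer" = May-Oct.
winter = 'ndjfma'
summer = 'mjjaso'

# Together the custom seasons must cover every month exactly once.
print(sorted(winter + summer) == sorted(months))  # True
```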
A summary of the main features added with version 1.2:
-
-
iris.cube.Cube.convert_units() and
-iris.coords.Coord.convert_units() have been added. This is
-aimed at simplifying the conversion of a cube or coordinate from one unit to
another. For example, to convert a cube in kelvin to celsius, one can now
call cube.convert_units('celsius'). The operation is in-place and if the
units are not convertible an exception will be raised.
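As a sketch of what such a conversion does numerically (plain arithmetic, not the Iris/udunits machinery):

```python
# Kelvin to Celsius is a fixed-offset conversion: degC = K - 273.15.
kelvin = [273.15, 293.15, 373.15]

# Round away the tiny floating-point residue of the subtraction.
celsius = [round(k - 273.15, 6) for k in kelvin]
print(celsius)  # [0.0, 20.0, 100.0]
```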
Printing a Cube is now more user friendly with regards
-to dates and time. All time and forecast_reference_time scalar coordinates
-now display human readable date/time information.
-
The units of a Cube are now shown when it is printed.
With the addition of the var_name attribute the signatures of DimCoord and
-AuxCoord have changed. This should have no impact if you are providing
-parameters as keyword arguments, but it may cause issues if you are relying
-on the position/order of the arguments.
Support for the ABF and ABL file formats (as
-defined by the
-climate and vegetation research group of Boston University), is
-currently provided under the “experimental” system. As such, ABF/ABL
-file detection is not automatically enabled.
-
To enable ABF/ABL file detection, simply import the
-iris.experimental.fileformats.abf module before attempting to
-load an ABF/ABL file.
Iris now provides initial support for concatenating Cubes along one or
-more existing dimensions. Currently this will force the data to be
-loaded for all the source Cubes, but future work will remove this
-restriction.
-
For example, if one began with a collection of Cubes, each containing
-data for a different range of times:
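The idea can be sketched without Iris: concatenation lines up the pieces along an existing dimension and joins coordinate and data values end to end (the "cubes" below are plain dicts with made-up times):

```python
# Three mock "cubes", each covering a different, contiguous range of times.
cube_a = {'time': [0, 1, 2], 'data': [10, 11, 12]}
cube_b = {'time': [3, 4, 5], 'data': [13, 14, 15]}
cube_c = {'time': [6, 7, 8], 'data': [16, 17, 18]}

# Order the pieces by their starting time, then join along "time".
pieces = sorted([cube_b, cube_c, cube_a], key=lambda c: c['time'][0])
combined = {'time': sum((c['time'] for c in pieces), []),
            'data': sum((c['data'] for c in pieces), [])}

print(combined['time'])  # [0, 1, 2, 3, 4, 5, 6, 7, 8]
```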
As part of simplifying the mechanism for accessing test data,
-iris.io.select_data_path(), iris.config.DATA_REPOSITORY,
-iris.config.MASTER_DATA_REPOSITORY and
-iris.config.RESOURCE_DIR have been removed.
Bilinear, area-weighted and area-conservative regridding functions are now available in
-iris.experimental. These functions support masked data and handle
-derived coordinates such as hybrid height. The final API is still in development.
-In the meantime:
regrid_bilinear_rectilinear_src_and_grid()
-can be used to regrid a cube onto a horizontal grid defined in a different coordinate system.
-The data values are calculated using bilinear interpolation.
regrid_conservative_via_esmpy()
-can be used for area-conservative regridding between geographical coordinate systems.
-This uses the ESMF library functions, via the ESMPy interface.
Cube merging now favours numerical coordinates over string coordinates
-to describe a dimension, and DimCoord over
-AuxCoord. These modifications prevent the error:
-“No functional relationship between separable and inseparable candidate dimensions”.
The default names of categorisation coordinates are now less ambiguous.
-For example, add_month_number() and
-add_month_fullname() now create
-“month_number” and “month_fullname” coordinates.
-
-
-
Cubes with no vertical coord can now be exported to GRIB
A new configuration variable called iris.config.TEST_DATA_DIR
-has been added, replacing the previous combination of
-iris.config.MASTER_DATA_REPOSITORY and
-iris.config.DATA_REPOSITORY. This constant should be the path
-to a directory containing the test data required by the unit tests. It can
-be set by adding a test_data_dir entry to the Resources section of
-site.cfg. See iris.config for more details.
linear() can now extrapolate from a single point
-assuming a gradient of zero. This prevents an issue when loading cross sections
-with a hybrid height coordinate, on a staggered grid and only a single orography field.
A bug in differentiate() that had the potential to cause
-the loss of coordinate metadata when calculating the curl or the derivative of a cube has been fixed.
The functions iris.plot.plot() and iris.quickplot.plot() now take
-up to two arguments, which may be cubes or coordinates, allowing the user to
-have full control over what is plotted on each axis. The coords keyword
-argument is now deprecated for these functions. This now also gives extended
-1D plotting capability.
-
# plot a 1d cube against a given 1d coordinate, with the cube
# values on the x-axis and the coordinate on the y-axis
iris.plot.plot(cube, coord)
-
-
-
-
iris.analysis.SUM is now a weighted aggregator, allowing it to take a
-weights keyword argument.
-
-
GRIB2 translations added for standard_name ‘soil_temperature’.
-
-
iris.cube.Cube.slices() can now accept a dimension index as well
as the currently supported types (string, coordinate), in order to slice in
cases where there is no coordinate associated with a dimension (a mix of
types is also supported).
-
# Get cube slices corresponding to the dimension associated with longitude
# and the first dimension from a multi-dimensional cube.
for sub_cube in cube.slices(['longitude', 0]):
    print sub_cube
-
Iris can now cope with a missing bounds variable from NetCDF files.
-
-
Added support for bool array indexing on a cube.
-
fname = iris.sample_data_path('air_temp.pp')
temperature = iris.load_cube(fname)
temperature[temperature.coord('latitude').points > 0]

# The constraints mechanism is still the preferred means to do such a query.
temperature.extract(iris.Constraint(latitude=lambda v: v > 0))
-
-
-
-
Added support for loading fields defined on regular Gaussian grids from GRIB
-files.
When using plotting routines from iris.plot or iris.quickplot,
-the direction of vertical axes will be reversed if the corresponding
-coordinate has a “positive” attribute set to “down”.
NetCDF error handling on save has been extended to capture file path and
-permission errors.
-
Shape of the Earth scale factors are now correctly interpreted by the GRIB
-loader. They were previously used as a multiplier for the given value but
-should have been used as a decimal shift.
-
OSGB definition corrected.
-
Transverse Mercator on load now accepts the following interchangeably, due to
inconsistencies in the CF documentation (+ denotes the recommended encoding):
* +scale_factor_at_central_meridian <-> scale_factor_at_projection_origin
* +longitude_of_central_meridian <-> longitude_of_projection_origin
-
Ellipse description now maintained when converting GeogCS to cartopy.
-
GeoTIFF export bug fixes.
-
Polar axis now set to the North Pole, when a cube with no coordinate system
-is saved to the PP file-format.
The coords keyword argument for iris.plot.plot() and
iris.quickplot.plot() has been deprecated due to the new API, which
accepts multiple cubes or coordinates.

iris.fileformats.pp.PPField.regular_points() and
iris.fileformats.pp.PPField.regular_bounds() have now been deprecated
in favour of a new factory method
iris.coords.DimCoord.from_regular().

iris.fileformats.pp.add_load_rules() and
iris.fileformats.grib.add_load_rules() are now deprecated.

Note that either a datetime.datetime or a netcdftime.datetime
object instance will be returned, depending on the calendar of the time
reference coordinate.

This capability makes it possible to express time constraints more
naturally when the cell represents a datetime-like object.

# Ignore the 1st of January.
iris.Constraint(time=lambda cell: cell.point.month != 1 or cell.point.day != 1)

Note that iris.Future also supports a context manager,
which allows multiple sections of code to execute with different run-time
behaviour.

>>> print(iris.FUTURE)
Future(cell_datetime_objects=False)
>>> with iris.FUTURE.context(cell_datetime_objects=True):
...     # Code that expects to deal with datetime-like objects.
...     print(iris.FUTURE)
...
Future(cell_datetime_objects=True)
>>> print(iris.FUTURE)
Future(cell_datetime_objects=False)

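The run-time-flag behaviour shown above can be sketched in plain Python. This is a hypothetical stand-in, not Iris's actual iris.Future implementation: a small object holding named flags, plus a context manager that temporarily overrides them and restores the previous values on exit.

```python
from contextlib import contextmanager


class Future(object):
    """A minimal sketch of a run-time flags object with a context
    manager, mimicking the iris.FUTURE behaviour shown above."""

    def __init__(self, **flags):
        self.__dict__.update(flags)

    @contextmanager
    def context(self, **flags):
        # Apply temporary flag values, restoring the originals on exit.
        saved = dict(self.__dict__)
        self.__dict__.update(flags)
        try:
            yield self
        finally:
            self.__dict__ = saved

    def __repr__(self):
        items = ', '.join('{}={}'.format(k, v)
                          for k, v in sorted(self.__dict__.items()))
        return 'Future({})'.format(items)


FUTURE = Future(cell_datetime_objects=False)
with FUTURE.context(cell_datetime_objects=True):
    inside = repr(FUTURE)   # flag temporarily True inside the block
after = repr(FUTURE)        # original value restored afterwards
```

The try/finally in the context manager guarantees that the original flags are restored even if the body raises an exception.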
The year, month, day, hour, minute, second and microsecond attributes of
an iris.time.PartialDateTime object may be fully or partially specified
for any given comparison.

This is particularly useful for time-based constraints when
iris.FUTURE.cell_datetime_objects is enabled; see here for further
details on this new release feature.

from iris.time import PartialDateTime

# Ignore the 1st of January.
iris.Constraint(time=lambda cell: cell != PartialDateTime(month=1, day=1))

# Constrain by a specific year.
iris.Constraint(time=PartialDateTime(year=2013))

Extended the capability of the NetCDF saver
iris.fileformats.netcdf.Saver.write() for fine-tuned control of a
netCDF4.Variable. This also allows multiple dimensions to be nominated as
unlimited.

Iris tests can now be run on systems where directory write permissions
previously did not allow it. This is achieved by writing to the current
working directory in such cases.

Support for 365-day calendar PP fields.

Added phenomenon translation between CF and GRIB2 for wind (from) direction.

PP files now retain the LBFC value on save, derived from the STASH attribute.

When a cube is loaded from PP or GRIB and it has both time and forecast
period coordinates, and the time coordinate has bounds, the forecast period
coordinate will now also have bounds. These bounds will be aligned with the
bounds of the time coordinate, taking into account the forecast reference
time. Also, the forecast period point will now be aligned with the time
point.

iris.cube.Cube.add_history() has been deprecated in favour
of users modifying/creating the history metadata directly. This is
because the automatic behaviour did not deliver a sufficiently complete,
auditable history and often prevented the merging of cubes.

The callback mechanism iris.run_callback has had its deprecation of return
values revoked. The callback can now return cube instances as well as making
in-place changes to the cube.

To assist with management of caching results to file, the new utility
function iris.util.file_is_newer_than() may be used to easily determine
whether the modification time of a specified cache file is newer than that
of one or more other files.

Typically, caching is a means to avoid the cost of repeating time-consuming
processing, or to reap the benefit of fast-loading a pickled cube.

# Determine whether to load from the cache or source.
if iris.util.file_is_newer_than(cache_file, source_file):
    with open(cache_file, 'rb') as fh:
        cube = cPickle.load(fh)
else:
    cube = iris.load_cube(source_file)

    # Perhaps perform some intensive processing ...

    # Create the cube cache.
    with open(cache_file, 'wb') as fh:
        cPickle.dump(cube, fh)

This attempts to smooth the merging process by ensuring that all candidate
cubes have the same attributes.

Masking a collapsed result by missing-data tolerance

The result from collapsing masked cube data may now be completely
masked by providing an mdtol missing-data tolerance keyword
to iris.cube.Cube.collapsed().

This tolerance provides a threshold that will completely mask the
collapsed result whenever the fraction of missing data exceeds the
provided tolerance.
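
The missing-data-tolerance idea can be illustrated with plain NumPy masked arrays. This is a conceptual sketch of the mdtol concept, not the Cube.collapsed() implementation; the function name collapse_mean_with_mdtol is invented for illustration.

```python
import numpy.ma as ma


def collapse_mean_with_mdtol(data, axis=0, mdtol=0.5):
    # Collapse by a mean, then mask any result cell where the fraction
    # of missing input values along the collapsed axis exceeds mdtol.
    data = ma.asarray(data)
    result = data.mean(axis=axis)
    missing = ma.getmaskarray(data).sum(axis=axis) / float(data.shape[axis])
    return ma.masked_where(missing > mdtol, result)


data = ma.array([[1.0, 2.0], [3.0, 4.0]],
                mask=[[True, False], [True, False]])
res = collapse_mean_with_mdtol(data, axis=0, mdtol=0.5)
# The first column is 100% missing (> 0.5), so its result is masked;
# the second column is the mean of 2.0 and 4.0.
```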
The new utility function iris.util.new_axis() creates a new cube with
a new leading dimension of size unity. If a scalar coordinate is provided,
then the scalar coordinate is promoted to be the dimension coordinate for
the new leading dimension.

Note that this function will load the data payload of the cube.
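
At the array level, the effect on the data payload can be sketched with NumPy's np.newaxis; the coordinate bookkeeping is, of course, the part new_axis adds on top of this.

```python
import numpy as np

# Promote an array to have a new leading dimension of size one,
# analogous to what iris.util.new_axis does for a cube's data.
data = np.arange(12.0).reshape(3, 4)
promoted = data[np.newaxis]   # shape becomes (1, 3, 4)
```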
A new PEAK aggregator providing spline interpolation

The new iris.analysis.PEAK aggregator calculates the global peak
value from a spline interpolation of the iris.cube.Cube data payload
along a nominated coordinate axis.

Iris is now making extensive use of Biggus
for virtual arrays and lazy array evaluation. In practice this means that
analyses of cubes with data bigger than the available system memory are now
possible.

Other than the improved functionality, the changes are mostly
transparent; for example, before the introduction of Biggus, MemoryErrors
were likely for very large datasets:

Now, for supported operations, the evaluation is lazy (i.e. it doesn’t take
place until the actual data is subsequently requested) and can handle data
larger than available system memory:

Memory is still a limiting factor if ever the data is desired as a NumPy
array (e.g. via cube.data), but additional methods have been added to the
Cube to support querying and subsequently accessing the “lazy” data form
(see has_lazy_data() and lazy_data()).

Showcase: New interpolation and regridding API

New interpolation and regridding interfaces have been added which simplify
and extend the existing functionality.

The interfaces are exposed on the cube in the form of the
interpolate() and regrid() methods.
Conceptually the signatures of the methods are:

Whilst not all schemes have been migrated to the new interface,
iris.analysis.Linear defines both linear interpolation and regridding,
and iris.analysis.AreaWeighted defines an area-weighted regridding
scheme.

Showcase: Merge and concatenate reporting

Merge reporting is designed as an aid to the merge process. Should merging
a CubeList fail, merge reporting means that a descriptive
error will be raised that details the differences between the cubes in the
CubeList that prevented the merge from being successful.

A new CubeList method, called
merge_cube(), has been introduced. Calling it on a
CubeList will result in a single merged
Cube being returned, or an error being raised
that describes why the merge process failed.

The following example demonstrates the error message that describes a merge
failure caused by cubes having differing attributes:

>>> cube_list = iris.cube.CubeList((c1, c2))
>>> cube_list.merge_cube()
Traceback (most recent call last):
  ...
  raise iris.exceptions.MergeError(msgs)
iris.exceptions.MergeError: failed to merge into a single cube.
  cube.attributes keys differ: 'foo'

The naming of this new method mirrors that of the Iris load functions, where
one would always expect a CubeList from iris.load()
and a Cube from iris.load_cube().

Concatenate reporting is the equivalent process for concatenating a
CubeList. It is accessed through the method
concatenate_cube(), which will return a single
concatenated cube or produce an error message that describes why the
concatenate process failed.

Showcase: Cube broadcasting

When performing cube arithmetic, cubes now follow broadcasting rules
similar to those of NumPy arrays.

However, the additional richness of Iris coordinate metadata provides an
enhanced capability beyond the basic broadcasting behaviour of NumPy.

This means that when performing cube arithmetic, the dimensionality and
shape of cubes no longer need to match. For example, if the dimensionality
of a cube is reduced by collapsing, then the result can be subtracted from
the original cube to calculate an anomaly:
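
A plain-NumPy sketch of the collapse-then-subtract anomaly pattern described above (Iris does the equivalent at the cube level, while also reconciling coordinate metadata):

```python
import numpy as np

# Collapse (average) over the leading 'time' axis, then subtract the
# reduced result from the original array; broadcasting restores the
# missing leading dimension automatically.
data = np.arange(24.0).reshape(4, 3, 2)   # e.g. (time, lat, lon)
time_mean = data.mean(axis=0)             # shape (3, 2)
anomaly = data - time_mean                # broadcast back to (4, 3, 2)
```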
Merge reporting that raises a descriptive error if the merge process fails.

Linear interpolation and regridding now make use of SciPy’s
RegularGridInterpolator for much faster linear interpolation.

NAME file loading now handles the “no time averaging” column and translates
height/altitude above ground/sea-level columns into appropriate coordinate
metadata.

The NetCDF saver has been extended to allow saving of cubes with hybrid
pressure auxiliary factories.

PP/FF loading supports LBLEV of 9999.

Extended GRIB1 loading to support data on hybrid pressure levels.

The mdtol keyword was added to area-weighted regridding to allow control of
the tolerance for missing data. For a further description of this concept,
see iris.analysis.AreaWeighted.

Handling for patching of the CF conventions global attribute via a defined
cf_patch_conventions function.

Deferred GRIB data loading has been introduced for reduced memory
consumption when loading GRIB files.

Concatenate reporting that raises a descriptive error if the concatenation
process fails.

A speed improvement when loading PP or FF data and constraining on STASH
code.

Data containing more than one reference cube for constructing hybrid height
coordinates can now be loaded.

Removed a cause of an increased margin of error when interpolating.

Changed the floating-point precision used when wrapping points for
interpolation.

Mappables that can be used to generate colorbars are now returned by Iris
plotting wrappers.

NetCDF load ignores over-specified formula terms on bounded dimensionless
vertical coordinates.

Auxiliary coordinate factory loading now correctly interprets formula term
variables for “atmosphere hybrid sigma pressure” coordinate data.

Corrected comparison of NumPy NaN values in the cube merge process.

Fixes for iris.cube.Cube.intersection() to correct calculating the
intersection of a cube with split bounds, handling of circular coordinates,
handling of monotonically descending bounded coordinates, and finding a
wrapped two-point result and longitude tolerances.

Coord.guess_bounds() can now deal with circular coordinates.

Coord.nearest_neighbour_index() can now work with descending bounds.

Passing weights to Cube.rolling_window() no longer prevents other
keyword arguments from being passed to the aggregator.

Several minor fixes to allow use of Iris on Windows.

Made use of the new standard_parallels keyword in Cartopy’s LambertConformal
projection (Cartopy v0.12). Older versions of Iris will not be able to
create LambertConformal coordinate systems with Cartopy >= 0.12.

Saving a cube with a STASH attribute to NetCDF now produces a variable
with an attribute of “um_stash_source” rather than “ukmo__um_stash_source”.

Cubes saved to NetCDF with a coordinate system referencing a spherical
ellipsoid now result in the grid mapping variable containing only the
“earth_radius” attribute, rather than the “semi_major_axis” and
“semi_minor_axis”.

Collapsing a cube over all of its dimensions now results in a scalar cube
rather than a 1D cube.

An example of reprojecting data from 2D auxiliary spatial coordinates
(such as that from the ORCA grid) has been added. See Tri-Polar Grid
Projected Plotting.

A nearest-neighbour scheme for interpolation and regridding has been added
to Iris. This joins the existing Linear and
AreaWeighted interpolation and regridding schemes.

You can slice over one or more dimensions of a cube using
iris.cube.Cube.slices_over(). This provides similar functionality to
slices() but with almost the opposite outcome.

Using slices() to slice a cube on a selected dimension returns
all possible slices of the cube with the selected dimension retaining its
dimensionality. Using slices_over() to slice a cube on a selected
dimension returns all possible slices of the cube over the selected
dimension.
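
The distinction can be sketched with plain NumPy arrays (no Iris needed); the variable names are invented for illustration.

```python
import numpy as np

# For an array of shape (time, lat, lon): "slicing over" time yields
# one (lat, lon) slab per time value, whereas "slices" keeping time
# yields 1D arrays along the time dimension, one per (lat, lon) point.
data = np.arange(24.0).reshape(4, 3, 2)   # (time, lat, lon)

# slices_over('time') analogue: one (lat, lon) array per time index.
over_time = [data[i] for i in range(data.shape[0])]

# slices('time') analogue: time retains its dimensionality, and we
# iterate every combination of the remaining dimensions.
keep_time = [data[:, j, k]
             for j in range(data.shape[1])
             for k in range(data.shape[2])]
```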
The iris.analysis.Linear scheme now supports regridding as well as
interpolation. This enables iris.cube.Cube.regrid() to perform bilinear
regridding, which replaces the experimental routine
“iris.experimental.regrid.regrid_bilinear_rectilinear_src_and_grid”.

Lateral Boundary Condition (LBC) type FieldsFiles are now handled correctly
by the FF loader.

Making a copy of a scalar cube with no data now correctly copies the data
array.

Height coordinates in NAME trajectory output files have been changed to
match other NAME output file formats.

Fixed the datatype when loading an integer_constants array from a
FieldsFile.

The FF/PP loader adds appropriate cell methods for lbtim.ib=3 intervals.

An exception is raised if the units of the latitude and longitude
coordinates of the cube passed into iris.analysis.cartography.area_weights()
are not convertible to radians.

The GRIB1 loader now creates a time coordinate for a time range indicator
of 2.

The NetCDF loader now loads units that are empty strings as dimensionless.

The PP loader now carefully handles floating-point errors in date-time
conversions to hours.

The handling of fill values for lazy data loaded from NetCDF files has been
altered, such that the _FillValue set in the file is preserved through lazy
operations.

The risk that cube intersections could return incorrect results due to
floating-point tolerances is reduced.

The new GRIB2 loading code is altered to enable the loading of various data
representation templates; the data value unpacking is handled by the GRIB
API.

Saving cube collections to NetCDF, where multiple similar aux-factories
exist within the cubes, is now carefully handled such that extra file
variables are created where required in some cases.

The original GRIB loader has been deprecated and replaced with a new
template-based GRIB loader.

Deprecated the default NetCDF save behaviour of assigning the outermost
dimension to be unlimited. Switch to the new behaviour, with no automatic
assignment, by setting iris.FUTURE.netcdf_no_unlimited to True.

The former experimental method
“iris.experimental.regrid.regrid_bilinear_rectilinear_src_and_grid” has been
removed, as iris.analysis.Linear now includes this functionality.

This document provides a basic account of how PP and FieldsFile data are
represented within Iris. It describes how Iris represents data from the Met
Office Unified Model (UM), in terms of the metadata elements found in PP and
FieldsFile formats.

For simplicity, we shall describe this mostly in terms of loading PP data
into Iris (i.e. into cubes). However, most of the details are identical for
FieldsFiles, and are relevant to saving in these formats as well as loading.

Notes:

Iris treats FieldsFile data almost exactly as if it were PP – i.e. it
treats each field’s lookup table entry like a PP header.

The Iris data model is based on
NetCDF CF conventions, so most of this can
also be seen as a metadata translation between PP and CF terms, but it is
easier to discuss in terms of Iris elements.

For details of Iris terms (cubes, coordinates, attributes), refer to
Iris data structures.

The basics of Iris loading are explained at Loading Iris cubes.
Loading as it specifically applies to PP and FieldsFile data can be
summarised as follows:

Input fields are first loaded from the given sources, using
iris.fileformats.pp.load(). This returns an iterator, which provides
a ‘stream’ of PPField input field objects.
Each PPField object represents a single source field:

PP header elements are provided as named object attributes (e.g.
lbproc).

Some extra, calculated “convenience” properties are also provided (e.g.
t1 and t2 time values).

There is an iris.fileformats.pp.PPField.data attribute, but the
field data is not actually loaded unless/until this is accessed, for
greater speed and space efficiency.

Each input field is translated into a two-dimensional Iris cube (with
dimensions of latitude and longitude). These are the ‘raw’ cubes, as
returned by iris.load_raw().
Within these:

There are 2 horizontal dimension coordinates containing the latitude
and longitude values for the field.

Certain other header elements are interpreted as ‘coordinate’-type
values applying to the input fields, and stored as auxiliary ‘scalar’
(i.e. length-1) coordinates. These include all header elements defining
vertical and time coordinate values, and also more specialised factors
such as ensemble number and pseudo-level.

Other metadata is encoded on the cube in a variety of other forms, such
as the cube ‘name’ and ‘units’ properties, attribute values and cell
methods.

Lastly, Iris attempts to merge the raw cubes into higher-dimensional ones
(using merge()): this combines raw cubes with
different values of a scalar coordinate to produce a higher-dimensional
cube with the values contained in a new vector coordinate. Where possible,
the new vector coordinate is also a dimension coordinate, describing the
new dimension.
Apart from the original 2 horizontal dimensions, all cube dimensions and
dimension coordinates arise in this way – for example, ‘time’, ‘height’,
‘forecast_period’, ‘realization’.

Note

This document covers the essential features of the UM data loading process.
The complete details are implemented as follows:

The corresponding save functionality for PP output is implemented by
the iris.fileformats.pp.save() function. The relevant
‘save rules’ are defined in a text file
(“lib/iris/etc/pp_save_rules.txt”), in a form defined by the
iris.fileformats.rules module.

The rest of this document describes various independent sections of related
metadata items.

At present, only latitude-longitude projections are supported (both normal
and rotated). In these cases, LBCODE is typically 1 or 101 (though, in fact,
cross-sections with latitude and longitude axes are also supported).

For an ordinary latitude-longitude grid, the cubes have coordinates called
‘longitude’ and ‘latitude’:

These are mapped to the appropriate data dimensions.

The coordinate points are normally set to the regular sequence
BZX/Y + BDX/Y * (1 .. LBNPT/LBROW) (except, if BDX/BDY is zero, the
values are taken from the extra data vector X/Y, if present).

If X/Y_LOWER_BOUNDS extra data is available, this appears as the bounds
values of the horizontal coordinates.
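
The regular point calculation above can be sketched with NumPy; the header values used here are invented for illustration.

```python
import numpy as np

# Sketch of the regular-grid point calculation for the X axis:
# points = BZX + BDX * (1 .. LBNPT); the Y axis uses BZY/BDY/LBROW
# in the same way. Illustrative values only.
bzx, bdx, lbnpt = -0.5, 0.5, 4
x_points = bzx + bdx * np.arange(1, lbnpt + 1)
# → array([0.0, 0.5, 1.0, 1.5])
```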
For rotated latitude-longitude coordinates (as for LBCODE=101), the
horizontal coordinates differ only slightly –

The names are ‘grid_latitude’ and ‘grid_longitude’.

Note that in Iris (as in CF) there is no special distinction between
“regular” and “irregular” coordinates. Thus on saving, X and Y extra data
sections are written only if the actual values are unevenly spaced.

This information is normally encoded in the cube standard_name property.
Iris identifies the STASH section and item codes from LBUSER4 and the model
code in LBUSER7, and compares these against a list of phenomenon types with
known CF translations. If the stashcode is recognised, it then defines the
appropriate standard_name and units properties of the cube
(i.e. iris.cube.Cube.standard_name and iris.cube.Cube.units).

Where any parts of the STASH information are outside the valid range, Iris
will instead attempt to interpret LBFC, for which a set of known
translations is also stored. This is often the case for FieldsFiles, where
LBUSER4 is frequently left as 0.

In all cases, Iris also constructs a STASH item
to identify the phenomenon, which is stored as a cube attribute named
STASH. This preserves the original STASH coding (as standard name
translation is not always one-to-one), and can be used when no standard_name
translation is identified (for example, to load only certain stashcodes with
a constraint – see example at Load constraint examples).

For example:

>>> # Show PPField phenomenon details.
... print(field.lbuser[3], field.lbuser[6])
16203 1
>>>
>>> # Show the Iris equivalents.
... print(cube.standard_name, cube.units, cube.attributes['STASH'])
air_temperature K m01s16i203
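
As a sketch of how the STASH attribute string seen above relates to the header values: LBUSER4 packs the section and item codes as section*1000 + item, and LBUSER7 holds the model code. The formatting helper below is invented for illustration, not Iris code.

```python
# Values from the example field above: LBUSER4 = 16203, LBUSER7 = 1.
lbuser4, lbuser7 = 16203, 1
section, item = divmod(lbuser4, 1000)   # → section 16, item 203
msi = 'm{:02d}s{:02d}i{:03d}'.format(lbuser7, section, item)
# → 'm01s16i203'
```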
Note

On saving data, no attempt is made to translate a cube standard_name into a
STASH code, but any attached ‘STASH’ attribute will be stored into the
LBUSER4 and LBUSER7 elements.
Several vertical coordinate forms are supported, according to different
values of LBVC. The commonest ones are:

lbvc=1 : height levels

lbvc=8 : pressure levels

lbvc=65 : hybrid height

In all these cases, vertical coordinates are created, with points and bounds
values taken from the appropriate header elements. In the raw cubes, each
vertical coordinate is just a single value, but multiple values will usually
occur. The subsequent merge operation will then convert these into
multiple-valued coordinates, and create a new vertical data dimension (i.e.
a “Z” axis) which they map onto.

For height levels (LBVC=1):

A height coordinate is created. This has units ‘m’, points from
BLEV, and no bounds. When there are multiple vertical levels, this will
become a dimension coordinate mapping to the vertical dimension.

For pressure levels (LBVC=8):

A pressure coordinate is created. This has units ‘hPa’, points from
BLEV, and no bounds. When there are multiple vertical levels, this will
become a dimension coordinate mapping to the vertical dimension.

For hybrid height levels (LBVC=65):

Three basic vertical coordinates are created:

model_level_number is dimensionless, with points from LBLEV and no bounds.

sigma is dimensionless, with points from BHLEV and bounds from
BHRLEV and BHULEV.

level_height has units of ‘m’, points from BLEV and bounds from
BRLEV and BULEV.

Also in this case, a HybridHeightFactory is
created, which references the ‘level_height’ and ‘sigma’ coordinates.
Following raw cube merging, an extra load stage occurs where the
attached HybridHeightFactory is called to
manufacture a new altitude coordinate:

The altitude coordinate is 3D, mapping to the 2 horizontal
dimensions and the new vertical dimension.

Its units are ‘m’.

Its points are calculated from those of the ‘level_height’ and
‘sigma’ coordinates, and an orography field. If ‘sigma’ and
‘level_height’ possess bounds, then bounds are also created for
‘altitude’.

To make the altitude coordinate, there must be an orography field present
in the load sources. This is a surface altitude reference field,
identified (by stashcode) during the main loading operation, and recorded
for later use in the hybrid height calculation. If it is absent, a warning
message is printed, and no altitude coordinate is produced.

Note that on merging hybrid height data into a cube, only the
‘model_level_number’ coordinate becomes a dimension coordinate: the other
vertical coordinates remain as auxiliary coordinates, because they may be
(variously) multidimensional or non-monotonic.
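
The derived altitude calculation can be sketched as follows. This is a conceptual sketch with invented values showing the broadcasting involved (altitude = level_height + sigma * orography); the real factory also handles bounds and masked orography.

```python
import numpy as np

# altitude(z, y, x) = level_height(z) + sigma(z) * orography(y, x)
level_height = np.array([10.0, 50.0])    # (z,), metres
sigma = np.array([0.99, 0.95])           # (z,), dimensionless
orography = np.array([[0.0, 100.0]])     # (y, x), metres

altitude = (level_height[:, None, None]
            + sigma[:, None, None] * orography[None, :, :])
# altitude has shape (z, y, x) == (2, 1, 2): the 3D coordinate maps the
# new vertical dimension plus the 2 horizontal dimensions.
```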
See an example printout of a hybrid height cube, here:

Notice that this contains all of the above coordinates –
‘model_level_number’, ‘sigma’, ‘level_height’ and the derived ‘altitude’.

Note

Hybrid pressure levels can also be handled (for LBVC=9). Without going
into details, the mechanism is very similar to that for hybrid height:
it produces basic coordinates ‘model_level_number’, ‘sigma’ and
‘level_pressure’, and a manufactured 3D ‘air_pressure’ coordinate.
coordinates: time, forecast_reference_time, forecast_period

Details

In Iris (as in CF), times and time intervals are both expressed as simple
numbers, following the approach of the UDUNITS project.
These values are stored as cube coordinates, where the scaling and calendar
information is contained in the units property.

The units of a time interval (e.g. ‘forecast_period’) can be ‘seconds’ or
a simple derived unit such as ‘hours’ or ‘days’ – but it does not contain
a calendar, so ‘months’ or ‘years’ are not valid.

The units of calendar-based times (including ‘time’ and
‘forecast_reference_time’) are of the general form
“<time-unit> since <base-date>”, interpreted according to the unit’s
calendar property. The base date for this is
always 1st Jan 1970 (times before this are represented as negative values).

The units.calendar property of time coordinates is set from the lowest
decimal digit of LBTIM, known as LBTIM.IC. Note that the non-Gregorian
calendars (e.g. the 360-day ‘model’ calendar) are defined in CF, not
UDUNITS.

There are a number of different time encoding methods used in UM data, but
the important distinctions are controlled by the next-to-lowest decimal
digit of LBTIM, known as LBTIM.IB.
The most common cases are as follows:

-
Data at a single measurement timepoint (LBTIM.IB=0):
-
A single time coordinate is created, with points taken from T1 values.
-It has no bounds, units of ‘hours since 1970-01-01 00:00:00’ and a calendar
-defined according to LBTIM.IC.
-
Values forecast from T2, valid at T1 (LBTIM.IB=1):
-
Coordinates time`and``forecast_reference_time are created from the T1
-and T2 values, respectively. These have no bounds, and units of
-‘hours since 1970-01-01 00:00:00’, with the appropriate calendar.
-A forecast_period coordinate is also created, with values T1-T2, no
-bounds and units of ‘hours’.
-
Time mean values between T1 and T2 (LBTIM.IB=2):
-
The time coordinates time, forecast_reference_times and
-forecast_reference_time, are all present, as in the previous case.
-In this case, however, the ‘time’ and ‘forecast_period’ coordinates also
-have associated bounds: The ‘time’ bounds are from T1 to T2, and the
-‘forecast_period’ bounds are from “LBFT - (T2-T1)” to “LBFT”.
-
-
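
The LBTIM digit layout used above (IC in the lowest digit, IB in the next, and the leading digit(s) forming IA, covered later under cell methods) can be decoded with simple integer arithmetic. The example value is illustrative.

```python
# Decode the decimal digits of an LBTIM value:
#   IC = lowest digit (calendar), IB = next digit (time encoding),
#   IA = remaining leading digit(s) (aggregation interval, in hours).
lbtim = 622
ia, ib, ic = lbtim // 100, (lbtim // 10) % 10, lbtim % 10
# → ia == 6, ib == 2 (time mean between T1 and T2), ic == 2
```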
Note that, in those more complex cases where the input defines all three of
the ‘time’, ‘forecast_reference_time’ and ‘forecast_period’ values, any or
all of these may become dimensions of the resulting data cube. This will
depend on the values actually present in the source fields for each of the
elements.

See an example printout of a forecast data cube, here:

Notice that this example contains all of the above coordinates – ‘time’,
‘forecast_period’ and ‘forecast_reference_time’. In this case the data are
forecasts, so ‘time’ is a dimension, ‘forecast_period’ varies with time and
‘forecast_reference_time’ is a constant.

Where a field contains statistically processed data, Iris will add an
appropriate iris.coords.CellMethod to the cube, representing the
aggregation operation which was performed.

This is implemented for certain binary flag bits within the LBPROC element
value. For example:

time mean, when (LBPROC & 128):

The cube has a cell_method of the form “CellMethod(‘mean’, ‘time’)”.

time period minimum value, when (LBPROC & 4096):

The cube has a cell_method of the form “CellMethod(‘minimum’, ‘time’)”.

time period maximum value, when (LBPROC & 8192):

The cube has a cell_method of the form “CellMethod(‘maximum’, ‘time’)”.

In all these cases, if the field LBTIM is also set to denote a
time-aggregate field (i.e. LBTIM.IB=2, see above Time information), then the
leading digit(s) of LBTIM, a.k.a. LBTIM.IA, may also be non-zero, in which
case this indicates the aggregation time-interval. In that case, the
cell-method intervals attribute is also set to this many hours.

For example:

>>> # Show stats metadata in a test PP field.
... fname = iris.sample_data_path('pre-industrial.pp')
>>> eg_field = next(iris.fileformats.pp.load(fname))
>>> print(eg_field.lbtim, eg_field.lbproc)
622 128
>>>
>>> # Print out the Iris equivalent information.
... print(iris.load_cube(fname).cell_methods)
(CellMethod(method='mean', coord_names=('time',), intervals=('6 hour',), comments=()),)
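
The flag-bit tests listed above can be sketched as a simple lookup. The bit values are those given in the text; the helper name is invented, and the real loader handles many more LBPROC bits than shown here.

```python
# Map the LBPROC flag bits described above to aggregation names.
LBPROC_METHODS = {128: 'mean', 4096: 'minimum', 8192: 'maximum'}


def cell_method_names(lbproc):
    """Return the aggregation names encoded in an LBPROC value,
    in ascending bit order."""
    return [name for bit, name in sorted(LBPROC_METHODS.items())
            if lbproc & bit]


methods = cell_method_names(128)
# → ['mean'], matching the example field above (lbproc == 128)
```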
If non-zero, this is interpreted as an ensemble number. This produces a cube
scalar coordinate named ‘realization’ (as defined in the CF conventions).