Commit 2a2ab82

dcherian committed
Merge branch 'master' into 2d-coord

* master: (21 commits)
  xarray.backends refactor (pydata#2261)
  Fix indexing error for data loaded with open_rasterio (pydata#2456)
  Properly support user-provided norm. (pydata#2443)
  pep8speaks (pydata#2462)
  isort (pydata#2469)
  tests shoudn't need to pass for a PR (pydata#2471)
  Replace the last of unittest with pytest (pydata#2467)
  Add python_requires to setup.py (pydata#2465)
  Update whats-new.rst (pydata#2466)
  Clean up _parse_array_of_cftime_strings (pydata#2464)
  plot.contour: Don't make cmap if colors is a single color. (pydata#2453)
  np.AxisError was added in numpy 1.13 (pydata#2455)
  Add CFTimeIndex.shift (pydata#2431)
  Fix FutureWarning in CFTimeIndex.date_type (pydata#2448)
  fix:2445 (pydata#2446)
  Enable use of cftime.datetime coordinates with differentiate and interp (pydata#2434)
  restore ddof support in std (pydata#2447)
  Future warning for default reduction dimension of groupby (pydata#2366)
  Remove incorrect statement about "drop" in the text docs (pydata#2439)
  Use profile mechanism, not no-op mutation (pydata#2442)
  ...
2 parents: 50d139d + 289b377

86 files changed: +2634 -1578 lines

.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 0 additions & 1 deletion
@@ -1,4 +1,3 @@
 - [ ] Closes #xxxx (remove if there is no corresponding issue, which should only be the case for minor changes)
 - [ ] Tests added (for all bug fixes or enhancements)
-- [ ] Tests passed (for all non-documentation changes)
 - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

.pep8speaks.yml

Lines changed: 11 additions & 0 deletions
@@ -0,0 +1,11 @@
+# File : .pep8speaks.yml
+
+scanner:
+    diff_only: True  # If True, errors caused by only the patch are shown
+
+pycodestyle:
+    max-line-length: 79
+    ignore:  # Errors and warnings to ignore
+        - E402,  # module level import not at top of file
+        - E731,  # do not assign a lambda expression, use a def
+        - W503  # line break before binary operator

.stickler.yml

Lines changed: 0 additions & 11 deletions
This file was deleted.

.travis.yml

Lines changed: 28 additions & 52 deletions
@@ -1,5 +1,5 @@
 # Based on http://conda.pydata.org/docs/travis.html
-language: python
+language: minimal
 sudo: false # use container based build
 notifications:
   email: false
@@ -10,74 +10,48 @@ branches:
 matrix:
   fast_finish: true
   include:
-  - python: 2.7
-    env: CONDA_ENV=py27-min
-  - python: 2.7
-    env: CONDA_ENV=py27-cdat+iris+pynio
-  - python: 3.5
-    env: CONDA_ENV=py35
-  - python: 3.6
-    env: CONDA_ENV=py36
-  - python: 3.6 # TODO: change this to 3.7 once https://github.com/travis-ci/travis-ci/issues/9815 is fixed
-    env: CONDA_ENV=py37
-  - python: 3.6
-    env:
+  - env: CONDA_ENV=py27-min
+  - env: CONDA_ENV=py27-cdat+iris+pynio
+  - env: CONDA_ENV=py35
+  - env: CONDA_ENV=py36
+  - env: CONDA_ENV=py37
+  - env:
     - CONDA_ENV=py36
     - EXTRA_FLAGS="--run-flaky --run-network-tests"
-  - python: 3.6
-    env: CONDA_ENV=py36-netcdf4-dev
+  - env: CONDA_ENV=py36-netcdf4-dev
     addons:
       apt_packages:
       - libhdf5-serial-dev
       - netcdf-bin
       - libnetcdf-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-dask-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-pandas-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-bottleneck-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-condaforge-rc
-  - python: 3.6
-    env: CONDA_ENV=py36-pynio-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-rasterio-0.36
-  - python: 3.6
-    env: CONDA_ENV=py36-zarr-dev
-  - python: 3.5
-    env: CONDA_ENV=docs
-  - python: 3.6
-    env: CONDA_ENV=py36-hypothesis
+  - env: CONDA_ENV=py36-dask-dev
+  - env: CONDA_ENV=py36-pandas-dev
+  - env: CONDA_ENV=py36-bottleneck-dev
+  - env: CONDA_ENV=py36-condaforge-rc
+  - env: CONDA_ENV=py36-pynio-dev
+  - env: CONDA_ENV=py36-rasterio-0.36
+  - env: CONDA_ENV=py36-zarr-dev
+  - env: CONDA_ENV=docs
+  - env: CONDA_ENV=py36-hypothesis
+
   allow_failures:
-  - python: 3.6
-    env:
+  - env:
     - CONDA_ENV=py36
     - EXTRA_FLAGS="--run-flaky --run-network-tests"
-  - python: 3.6
-    env: CONDA_ENV=py36-netcdf4-dev
+  - env: CONDA_ENV=py36-netcdf4-dev
     addons:
       apt_packages:
       - libhdf5-serial-dev
       - netcdf-bin
      - libnetcdf-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-pandas-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-bottleneck-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-condaforge-rc
-  - python: 3.6
-    env: CONDA_ENV=py36-pynio-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-zarr-dev
+  - env: CONDA_ENV=py36-pandas-dev
+  - env: CONDA_ENV=py36-bottleneck-dev
+  - env: CONDA_ENV=py36-condaforge-rc
+  - env: CONDA_ENV=py36-pynio-dev
+  - env: CONDA_ENV=py36-zarr-dev
 
 before_install:
-  - if [[ "$TRAVIS_PYTHON_VERSION" == "2.7" ]]; then
-      wget http://repo.continuum.io/miniconda/Miniconda-3.16.0-Linux-x86_64.sh -O miniconda.sh;
-    else
-      wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
-    fi
+  - wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
   - bash miniconda.sh -b -p $HOME/miniconda
   - export PATH="$HOME/miniconda/bin:$PATH"
   - hash -r
@@ -97,6 +71,8 @@ install:
   - python xarray/util/print_versions.py
 
 script:
+  - which python
+  - python --version
   - python -OO -c "import xarray"
   - if [[ "$CONDA_ENV" == "docs" ]]; then
       conda install -c conda-forge sphinx sphinx_rtd_theme sphinx-gallery numpydoc;

asv_bench/asv.conf.json

Lines changed: 1 addition & 0 deletions
@@ -64,6 +64,7 @@
         "scipy": [""],
         "bottleneck": ["", null],
         "dask": [""],
+        "distributed": [""],
     },
asv_bench/benchmarks/dataset_io.py

Lines changed: 42 additions & 1 deletion
@@ -1,11 +1,13 @@
 from __future__ import absolute_import, division, print_function
 
+import os
+
 import numpy as np
 import pandas as pd
 
 import xarray as xr
 
-from . import randn, randint, requires_dask
+from . import randint, randn, requires_dask
 
 try:
     import dask
@@ -14,6 +16,9 @@
     pass
 
 
+os.environ['HDF5_USE_FILE_LOCKING'] = 'FALSE'
+
+
 class IOSingleNetCDF(object):
     """
     A few examples that benchmark reading/writing a single netCDF file with
@@ -405,3 +410,39 @@ def time_open_dataset_scipy_with_time_chunks(self):
         with dask.set_options(get=dask.multiprocessing.get):
             xr.open_mfdataset(self.filenames_list, engine='scipy',
                               chunks=self.time_chunks)
+
+
+def create_delayed_write():
+    import dask.array as da
+    vals = da.random.random(300, chunks=(1,))
+    ds = xr.Dataset({'vals': (['a'], vals)})
+    return ds.to_netcdf('file.nc', engine='netcdf4', compute=False)
+
+
+class IOWriteNetCDFDask(object):
+    timeout = 60
+    repeat = 1
+    number = 5
+
+    def setup(self):
+        requires_dask()
+        self.write = create_delayed_write()
+
+    def time_write(self):
+        self.write.compute()
+
+
+class IOWriteNetCDFDaskDistributed(object):
+    def setup(self):
+        try:
+            import distributed
+        except ImportError:
+            raise NotImplementedError
+        self.client = distributed.Client()
+        self.write = create_delayed_write()
+
+    def cleanup(self):
+        self.client.shutdown()
+
+    def time_write(self):
+        self.write.compute()
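
The new benchmarks above exercise xarray's delayed-write path: passing `compute=False` to `to_netcdf` returns a dask delayed object, and calling `.compute()` on it performs the actual write. A minimal sketch of that pattern, assuming dask and a netCDF backend are installed (the file name and sizes here are illustrative, not taken from the benchmark):

    import numpy as np
    import xarray as xr

    # Chunking the dataset backs its variables with dask arrays.
    ds = xr.Dataset({'vals': (['a'], np.arange(100))}).chunk({'a': 10})

    # compute=False defers the write; to_netcdf returns a dask delayed object.
    delayed_write = ds.to_netcdf('example.nc', compute=False)

    # Trigger the actual write later, e.g. alongside other dask work.
    delayed_write.compute()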

asv_bench/benchmarks/unstacking.py

Lines changed: 1 addition & 0 deletions
@@ -1,6 +1,7 @@
 from __future__ import absolute_import, division, print_function
 
 import numpy as np
+
 import xarray as xr
 
 from . import requires_dask

doc/api-hidden.rst

Lines changed: 2 additions & 0 deletions
@@ -151,3 +151,5 @@
    plot.FacetGrid.set_titles
    plot.FacetGrid.set_ticks
    plot.FacetGrid.map
+
+   CFTimeIndex.shift
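
`CFTimeIndex.shift` is the method added by "Add CFTimeIndex.shift (pydata#2431)" in the merge above. A rough usage sketch, assuming `cftime` is installed (the dates and frequency are illustrative):

    import xarray as xr

    # Build a small calendar-aware index covering three days.
    index = xr.cftime_range('2000-01-01', periods=3, freq='D')

    # Shift every timestamp forward by one day.
    print(index.shift(1, 'D'))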

doc/api.rst

Lines changed: 3 additions & 0 deletions
@@ -624,3 +624,6 @@ arguments for the ``from_store`` and ``dump_to_store`` Dataset methods:
    backends.H5NetCDFStore
    backends.PydapDataStore
    backends.ScipyDataStore
+   backends.FileManager
+   backends.CachingFileManager
+   backends.DummyFileManager
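
These three entries document the file-manager layer introduced by the "xarray.backends refactor (pydata#2261)" listed in the merge. A rough sketch of how a `CachingFileManager` is intended to be used, hedged because the exact API at this commit may differ from later releases (the opener and file name below are arbitrary):

    from xarray.backends import CachingFileManager

    # Wrap an opener (here the builtin open) and its arguments; the manager
    # opens the file lazily and caches the resulting file object.
    manager = CachingFileManager(open, 'example.txt', mode='w')

    f = manager.acquire()   # opens the file, or returns the cached handle
    f.write('hello\n')
    manager.close()         # closes the underlying file object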

doc/data-structures.rst

Lines changed: 0 additions & 7 deletions
@@ -408,13 +408,6 @@ operations keep around coordinates:
     list(ds[['x']])
     list(ds.drop('temperature'))
 
-If a dimension name is given as an argument to ``drop``, it also drops all
-variables that use that dimension:
-
-.. ipython:: python
-
-    list(ds.drop('time'))
-
 As an alternate to dictionary-like modifications, you can use
 :py:meth:`~xarray.Dataset.assign` and :py:meth:`~xarray.Dataset.assign_coords`.
 These methods return a new dataset with additional (or replaced) or values:
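
The deleted passage was removed because it overstated what `drop` does (see pydata#2439 in the merge list): dropping a dimension name removes only the variable or coordinate with that name, not every variable that uses that dimension. A small illustrative sketch of the corrected behaviour, using `drop` as spelled in these docs at the time:

    import numpy as np
    import xarray as xr

    # 'temperature' is defined along the 'time' dimension, and 'time' is
    # also an indexing coordinate.
    ds = xr.Dataset(
        {'temperature': (('time',), np.zeros(3))},
        coords={'time': [0, 1, 2]},
    )

    # Dropping 'time' removes only the 'time' coordinate; 'temperature' keeps
    # its 'time' dimension and simply loses the index along it.
    dropped = ds.drop('time')
    print(list(dropped.data_vars))  # ['temperature']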
