Add separate non-TensorFlow test run in Travis CI #2075
Mostly looks good; thanks!
How did you determine which tests can be run without TensorFlow? If I
create a new virtualenv with only the dependencies specified in the new
no-TF section of the Travis build script, then try to run all the tests,
I see that a number of additional tests pass:
//tensorboard:lazy_test PASSED in 0.2s
//tensorboard:lib_test PASSED in 0.8s
//tensorboard/compat/tensorflow_stub:gfile_test PASSED in 0.8s
//tensorboard/components/tf_backend/test:test_chromium PASSED in 6.8s
//tensorboard/components/tf_categorization_utils/test:test_chromium PASSED in 6.7s
//tensorboard/components/tf_color_scale/test:test_chromium PASSED in 5.9s
//tensorboard/components/tf_data_selector/test:test_chromium PASSED in 7.0s
//tensorboard/components/tf_paginated_view/test:test_chromium PASSED in 6.1s
//tensorboard/components/tf_storage/test:test_chromium PASSED in 5.4s
//tensorboard/components/vz_line_chart2/test:test_chromium PASSED in 6.7s
//tensorboard/components/vz_sorting/test:test_chromium PASSED in 5.2s
//tensorboard/plugins/graph/tf_graph_common/test:test_chromium PASSED in 7.5s
//tensorboard/plugins/histogram/tf_histogram_dashboard/test:test_chromium PASSED in 6.6s
//tensorboard/plugins/hparams/tf_hparams_google_analytics_tracker:test_chromium PASSED in 6.9s
//tensorboard/plugins/hparams/tf_hparams_parallel_coords_plot:test_chromium PASSED in 7.6s
//tensorboard/plugins/hparams/tf_hparams_query_pane:test_chromium PASSED in 8.1s
//tensorboard/plugins/hparams/tf_hparams_scale_and_color_controls:test_chromium PASSED in 6.5s
//tensorboard/plugins/hparams/tf_hparams_utils:test_chromium PASSED in 5.7s
//tensorboard/summary:summary_test PASSED in 0.8s
Also: when running
bazel run //tensorboard/pip_package:build_pip_package -- --tf-version ''
in such a virtualenv as of this PR (d9c66b5a), I see:
+ python -c '
import tensorboard as tb
assert tb.__version__ == tb.version.VERSION
tb.summary.scalar_pb('\''test'\'', 42)
from tensorboard.plugins.projector import visualize_embeddings
from tensorboard.plugins.beholder import Beholder, BeholderHook
tb.notebook.start # don'\''t invoke; just check existence
'
Traceback (most recent call last):
File "<string>", line 4, in <module>
File "/tmp/tensorboard-pip.bjgl3EqxDD/smoke-venv2/local/lib/python2.7/site-packages/tensorboard/plugins/scalar/summary.py", line 89, in pb
import tensorflow.compat.v1 as tf
ImportError: No module named tensorflow.compat.v1
in the smoke test (which then fails). This happens regardless of whether I include TENSORBOARD_NO_TF=1. My virtualenv contents are as follows (pip freeze output):
$ pip freeze
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.
absl-py==0.7.1
asn1crypto==0.24.0
aws-xray-sdk==0.95
backports.ssl-match-hostname==3.7.0.1
backports.tempfile==1.0
backports.weakref==1.0.post1
boto==2.49.0
boto3==1.9.86
botocore==1.12.126
certifi==2019.3.9
cffi==1.12.2
chardet==3.0.4
configparser==3.7.4
cookies==2.2.1
cryptography==2.6.1
docker==3.7.2
docker-pycreds==0.4.0
docutils==0.14
ecdsa==0.13
enum34==1.1.6
flake8==3.5.0
funcsigs==1.0.2
future==0.17.1
futures==3.1.1
grpcio==1.6.3
idna==2.8
ipaddress==1.0.22
Jinja2==2.10
jmespath==0.9.4
jsondiff==1.1.1
jsonpickle==1.1
MarkupSafe==1.1.1
mccabe==0.6.1
mock==2.0.0
moto==1.3.7
numpy==1.16.2
pbr==5.1.3
pkg-resources==0.0.0
protobuf==3.7.1
pyaml==18.11.0
pycodestyle==2.3.1
pycparser==2.19
pycryptodome==3.8.0
pyflakes==1.6.0
python-dateutil==2.8.0
python-jose==2.0.2
pytz==2018.9
PyYAML==5.1
requests==2.21.0
responses==0.10.6
s3transfer==0.1.13
six==1.12.0
urllib3==1.24.1
websocket-client==0.56.0
Werkzeug==0.15.1
wrapt==1.11.1
xmltodict==0.12.0
yamllint==1.5.0
Can you reproduce this, or does the script work for you?
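In case it helps with reproducing: the failing call, isolated as a minimal probe. This is a sketch assuming the same no-TF virtualenv as above; the try/except wrapper is added for illustration and is not part of the smoke test. Per the traceback, scalar_pb eventually executes import tensorflow.compat.v1 inside scalar/summary.py, which fails when TensorFlow is absent.

import tensorboard as tb

try:
    # Same call as in the smoke test; in a virtualenv without TensorFlow
    # this reaches `import tensorflow.compat.v1` and raises ImportError.
    tb.summary.scalar_pb('test', 42)
except ImportError as e:
    print('reproduced smoke-test failure:', e)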
tensorboard/main.py (outdated)
# TENSORBOARD_NO_TF environment variable, we check for TensorFlow here so that if it's
# missing we generate a clear and immediate error rather than partial functionality.
# TODO(#2027): Remove environment check once we have placeholder UI
if os.getenv('TENSORBOARD_NO_TF') is None:
If this removal is included in this PR, then we can’t merge this PR until #2032 is merged. If you don’t want to block on that PR, then it looks to me like you should be able to add TENSORBOARD_NO_TF=1 to the Travis CI script and/or Pip package smoke test as appropriate. Or, feel free to keep this as is if you don’t mind waiting for #2032.
For now I'll keep it, but we can remove it if this is ready to land before the UI one. Thanks!
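For context, a standalone sketch of how the quoted guard behaves. This is illustrative only, not the actual tensorboard/main.py code; the function name and error message are invented:

import os

def check_tf_unless_opted_out():
    # Skip the TensorFlow check only when TENSORBOARD_NO_TF is set,
    # mirroring the guard quoted above: fail fast with a clear error
    # rather than degrading into partial functionality.
    if os.getenv('TENSORBOARD_NO_TF') is None:
        try:
            import tensorflow  # noqa: F401
        except ImportError:
            raise ImportError(
                'TensorFlow is not installed; set TENSORBOARD_NO_TF=1 '
                'to run without it.')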
The script works for me, but I'm not sure if I'm running it correctly. From a conda environment: […]
It looks like it’s failing on Travis with the same error, in case that helps:
Build: https://travis-ci.com/tensorflow/tensorboard/jobs/189495702
Your Bazel invocation is correct, so maybe something’s the matter in your environment. Can you look into this when you get a chance? I’m not sure what’s up with the assertItemsEqual Python 3 failures in your Travis build, though. I can’t reproduce them, and they don’t make much sense to me.
> I’m not sure what’s up with the assertItemsEqual Python 3 failures in your Travis build, though. I can’t reproduce them, and they don’t make much sense to me.
Ah, I was at an older commit: I can reproduce them at 45ba2589, but not at d9c66b5a. This PR’s modifications to change these tests to use unittest.TestCase instead of tf.test.TestCase are not sound, because the unittest method has a different name in Python 3 vs. Python 2. We can fix this by using six.assertCountEqual, but could you please revert the changes to these tests regardless? The PR that adds testing support for notf in our CI shouldn’t also have a functional change to make more tests work in notf. (That is: enabling tests is fine; changing them is less so.) I’d gladly welcome a working version of this change as a separate PR, which wouldn’t need to depend on anything.
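For reference, a minimal sketch of the six-based fix mentioned above. The test class and data are hypothetical; six.assertCountEqual dispatches to assertItemsEqual on Python 2 and to assertCountEqual on Python 3:

import unittest

import six


class DemoTest(unittest.TestCase):
    """Hypothetical test class, for illustration only."""

    def test_items_match_regardless_of_order(self):
        # Runs on both Python 2 and Python 3: six picks the right
        # unittest method name under the hood.
        six.assertCountEqual(self, ['a', 'b', 'b'], ['b', 'a', 'b'])


if __name__ == '__main__':
    unittest.main()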
Force-pushed from 3046da3 to c851333.
Rebased and removed test changes, etc. However, I still can't repro the […]
@wchargin CI passes now! Let me know your thoughts and thanks.
Looks good! Verified that smoke tests work properly on all three configs
(nightly, preview, and notf), and that a failure in summary.scalar_pb
is still caught by the smoke tests.
Please address two small comments, then we can merge.
Thanks!
Continuation of work for #2027, ultimately toward removing the TENSORBOARD_NO_TF=1 environment variable. Plugin tests aren't run yet, but this confirms that the build and a limited set of tests run without TensorFlow installed.