
Conversation

@marqh (Member) commented Oct 5, 2016

new PR to replace #2161 with a cleaner history (perhaps)

@marqh marqh mentioned this pull request Oct 5, 2016
@bjlittle bjlittle self-assigned this Oct 6, 2016
bits[0] = os.path.splitext(file_name)[0]
folder, location = os.path.split(path)
bits = [location] + bits
while location != 'iris':
Member

@marqh 💣 This won't work for the documentation example_tests; iris isn't the only root namespace module in the testing context.

You'll need to account for example_tests also being a terminal module.
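The fix being asked for can be sketched as follows: when walking up the directory tree to build the dotted test id, stop at any recognised root package rather than only at `iris`. This is a hypothetical helper (`dotted_test_id` and `ROOT_MODULES` are illustrative names, not the PR's actual code):

```python
import os

# Recognised terminal modules: 'iris' is not the only root namespace
# in the testing context; documentation example_tests is one too.
ROOT_MODULES = {'iris', 'example_tests'}

def dotted_test_id(path):
    """Build a dotted module id for a test file, stopping at any root."""
    file_name = os.path.basename(path)
    bits = [os.path.splitext(file_name)[0]]
    folder = os.path.dirname(path)
    folder, location = os.path.split(folder)
    # Walk up until we hit a terminal module, collecting package names.
    while location not in ROOT_MODULES:
        bits = [location] + bits
        folder, location = os.path.split(folder)
    return '.'.join([location] + bits)
```

For example, a file under `lib/iris/tests/` resolves against the `iris` root, while a file directly under `example_tests/` terminates there instead of walking past it.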

Member Author

indeed

sha1 = hashlib.sha1(res_file.read())
if unique_id not in repo:
msg = 'Image comparison failed: Created image {} for test {}.'
raise ValueError(msg.format(result_fname, unique_id))
@bjlittle (Member) Oct 6, 2016

@marqh You've changed the logic here from the original, which I kinda disagree with.

As it now stands, any developer who adds a new graphics test has to manually craft the changes to the imagerepo.json and manage the hash renaming of the png ... this doesn't scale very well.

I'll propose an alternative approach on your rebase_hash branch as a PR that (I believe) improves this work-flow. We can discuss that further there ... so hang tight for that.

Member

@marqh Okay, I'm working on that rebase_hash branch PR ... in-bound soon, thanks

Member Author

I understand your concerns, and I am keen to see a workflow for updating lots of images.

However, in your previous implementation, the effect was that tests with no checksum to check would pass, for developers and for Travis. This is the more serious problem. I simplified the logic to allow me to fix it, which I deemed critical.
If you are working with respect to this branch, please make sure that any implementation does not allow tests to pass where there is no entry in the imagerepo or where the image checksum comparison fails.

Any tooling to help with updating must (imho) treat these as test failures, then help the developer fix them, not enable them to pass.

I hope that point of view is clear and coherent.
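The fail-closed policy argued for above can be sketched as a single check that raises in both failure modes: no imagerepo entry for the test, or a hash that is not registered. Names here (`assert_image_known`, a repo mapping of test ids to known hashes) are hypothetical, not the PR's actual code:

```python
import hashlib

def assert_image_known(unique_id, image_bytes, repo):
    """Fail closed: a missing repo entry or an unknown hash is a failure.

    `repo` is assumed to map test ids to a collection of accepted
    SHA-1 hex digests (an illustrative simplification of imagerepo.json).
    """
    sha1 = hashlib.sha1(image_bytes).hexdigest()
    if unique_id not in repo:
        # A test with no checksum to check must NOT silently pass.
        raise AssertionError(
            'No imagerepo entry for test {!r}'.format(unique_id))
    if sha1 not in repo[unique_id]:
        raise AssertionError(
            'Image hash {} not registered for test {!r}'.format(
                sha1, unique_id))
```

Any update tooling would then surface these failures and help the developer register the new hash, rather than letting the test pass.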

Member

@marqh I completely understand your concerns, and I think that there is a suitable compromise to be had here to meet all our needs.

I don't think it would be wise of me to mix this into this PR - let's do that in a separate PR 👍

import logging
import os
from six.moves.queue import Queue
import requests
Member

@marqh Move this import above six.moves.queue, thanks.

repo = json.load(codecs.getreader('utf-8')(fi))
uris = []
for k, v in six.iteritems(repo):
uris = uris + v
Member

@marqh Drop the usage of k ... you don't need it, then use six.itervalues

Another neat pattern from itertools to flatten a list of lists is ...

from itertools import chain
uris = list(chain.from_iterable(six.itervalues(repo)))

@marqh (Member Author) commented Oct 6, 2016

#2163 concern about testing stability

I suggest we move forward, not treat this as a showstopper.

@marqh (Member Author) commented Oct 6, 2016

@bjlittle it's gone green

nothing to see here ;) :/

@bjlittle (Member) commented Oct 6, 2016

@marqh Not yet ...

@bjlittle (Member) commented Oct 6, 2016

@marqh Now that you're happy, LGTM

@bjlittle bjlittle merged commit 77304ab into SciTools:master Oct 6, 2016
@marqh marqh mentioned this pull request Oct 6, 2016
@QuLogic QuLogic added this to the v2.0 milestone Oct 6, 2016
@QuLogic (Member) left a comment

I have some qualms with this, though it's been merged already.

try:
result = requests.head(resource)
if (result.status_code == 200 and
resource.startswith('https://scitools.github.io')):
Member

This is not aligned correctly.

@QuLogic (Member) Oct 6, 2016

Also, why not check the startswith beforehand and save a request?
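The suggested reordering is straightforward: test the URL prefix first, so off-host resources never cost an HTTP round trip. In this sketch the `head` callable stands in for `requests.head` and is injected so the short-circuit can be exercised without a network; the function name is illustrative:

```python
def resource_ok(resource, head):
    """Return True if `resource` is on the expected host and reachable.

    Checking the prefix first saves a HEAD request for any URL that
    could never qualify anyway. `head` is a stand-in for requests.head.
    """
    if not resource.startswith('https://scitools.github.io'):
        return False
    return head(resource).status_code == 200
```

In production use one would pass `requests.head` (or bind it as a default) and likely also set a timeout on the request.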

from threading import Thread

# maximum number of threads for multi-threading code
MAXTHREADS = 128
@QuLogic (Member) Oct 6, 2016

Do you really need to run 128 parallel HEAD requests? Browsers limit themselves to something like 2-3 connections per host; this seems like overkill.

When I run it with 128, I get a bunch of ConnectionError(MaxRetryError... in the exception deque and it takes 22.277s. Running with 1 thread takes 33.248s, 2 threads is 16.276s, 4 threads is 8.292s and 8 threads is 5.938s. The delta is already diminishing there, so I doubt there is any benefit to bumping it all the way to 128.
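A bounded worker pool with a modest thread count, as those timings suggest, might look like the following sketch. It uses the Python 3 `queue` module rather than `six.moves.queue`, eight workers to match the point of diminishing returns quoted above, and illustrative names throughout (this is not the PR's actual code):

```python
from queue import Queue, Empty
from threading import Thread

# Eight workers: beyond this the timing delta was already diminishing,
# and 128 produced MaxRetryError connection failures.
MAXTHREADS = 8

def check_all(uris, check):
    """Apply `check` to every uri using a small fixed pool of threads."""
    queue = Queue()
    for uri in uris:
        queue.put(uri)
    results = []  # list.append is atomic under the GIL

    def worker():
        while True:
            try:
                uri = queue.get_nowait()
            except Empty:
                return  # queue drained; this worker is done
            results.append((uri, check(uri)))

    threads = [Thread(target=worker) for _ in range(MAXTHREADS)]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()
    return results
```

Here `check` would be the HEAD-request probe; with a pool this size, connection errors from overwhelming the host should no longer accumulate in the exception deque.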

@bjlittle bjlittle mentioned this pull request Oct 7, 2016
2 tasks
@QuLogic QuLogic removed this from the v2.0 milestone Oct 7, 2016
@QuLogic QuLogic modified the milestones: v1.11, v2.0 Oct 7, 2016
@QuLogic QuLogic mentioned this pull request Oct 10, 2016