Fix Py 2 to 3 encoding errors #410

Open · wants to merge 2 commits into master
@@ -181,11 +181,9 @@ def _setup_network(self):
# - ``caffe.TEST`` indicates phase of either TRAIN or TEST
self._log.debug("Initializing network")
self._log.debug("Loading Caffe network from network/model configs")
-self.network = caffe.Net(self.network_prototxt.write_temp(),
+self.network = caffe.Net(self.network_prototxt,
Member

self.network_prototxt here, as well as self.network_model below, should be DataElement instances at this stage. I believe Caffe 1 wanted file paths here, hence the previous write_temp() calls. Is this no longer the case? I realize that the Caffe 1 generator might be getting a little stale at this point.

Collaborator Author

It still seems to accept a string as input. I can change the documentation at https://github.com/Kitware/SMQTK/blob/654f69e0db89ce7b0a2fd13b27ef503ed4523e6b/python/smqtk/algorithms/descriptor_generator/caffe_descriptor.py#L58 to say string, and update the setstate function to match. Please let me know if you think that sounds right.

caffe.TEST,
-weights=self.network_model.write_temp())
-self.network_prototxt.clean_temp()
-self.network_model.clean_temp()
+weights=self.network_model)
# Assuming the network has a 'data' layer and notion of data shape
self.net_data_shape = self.network.blobs[self.data_layer].data.shape
self._log.debug("Network data shape: %s", self.net_data_shape)
@@ -201,7 +199,7 @@ def _setup_network(self):
if self.image_mean is not None:
self._log.debug("Loading image mean (reducing to single pixel "
"mean)")
-image_mean_bytes = self.image_mean.get_bytes()
+image_mean_bytes = open(self.image_mean, "rb").read()
try:
a = numpy.load(io.BytesIO(image_mean_bytes))
self._log.info("Loaded image mean from numpy bytes")
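The surrounding context loads the image mean with numpy from an in-memory byte buffer. A hedged sketch of that step, assuming the mean is stored as a .npy file on disk (the function name is illustrative); using a context manager also avoids the dangling file handle left by the bare open(...).read() in the new line above.

```python
import io

import numpy


def load_pixel_mean(mean_npy_path):
    # Read the mean file as raw bytes and parse it with numpy.load via
    # BytesIO, mirroring the code above.
    with open(mean_npy_path, "rb") as f:
        image_mean_bytes = f.read()
    a = numpy.load(io.BytesIO(image_mean_bytes))
    # Reduce to a single per-channel pixel mean, assuming a (C, H, W) layout,
    # as the "reducing to single pixel mean" log message above describes.
    return a.mean(axis=1).mean(axis=1)
```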
@@ -4,7 +4,7 @@
import os
import tempfile

-from six.moves import cStringIO as StringIO
+from six import BytesIO

from smqtk.utils import SmqtkObject
from smqtk.utils.file import safe_create_dir
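The import swap matters because upload chunks arrive as raw bytes under Python 3, where six.moves.cStringIO resolves to io.StringIO and rejects bytes. A minimal sketch demonstrating the difference (the sample bytes are made up):

```python
from six import BytesIO
from six.moves import cStringIO as StringIO

chunk_bytes = b"\x89PNG\r\n"  # e.g. the start of a binary upload chunk

buf = BytesIO(chunk_bytes)    # accepts bytes on both Python 2 and 3
assert buf.read() == chunk_bytes

try:
    StringIO(chunk_bytes)     # io.StringIO on Python 3: text only
except TypeError:
    print("cStringIO/StringIO cannot buffer bytes under Python 3")
```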
@@ -45,7 +45,7 @@ def __init__(self, name, parent_app, working_directory, url_prefix=None):
# Top level key is the file ID of the upload. The dictionary
# underneath that is the index ID'd chunks. When all chunks are
# present, the file is written and the entry in this map is removed.
-#: :type: dict of (str, dict of (int, StringIO))
+#: :type: dict of (str, dict of (int, BytesIO))
self._file_chunks = {}
# Lock per file ID so as to not collide when uploading multiple chunks
#: :type: dict of (str, RLock)
@@ -82,7 +82,7 @@ def upload_file():
# - Need to explicitly copy the buffered data as the file object
# closes between chunk messages.
self._file_chunks.setdefault(fid, {})[current_chunk] \
-= StringIO(chunk_data.read())
+= BytesIO(chunk_data.read())
message = "Uploaded chunk #%d of %d for file '%s'" \
% (current_chunk, total_chunks, filename)
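For context, the chunk bookkeeping above boils down to the following pattern: because the incoming file object closes when the request ends, each chunk is copied into its own BytesIO, keyed by file ID and chunk index. A simplified, hypothetical stand-alone version (names chosen for illustration):

```python
from io import BytesIO
from threading import RLock

file_chunks = {}   # dict of (str, dict of (int, BytesIO))
chunk_locks = {}   # dict of (str, RLock), one lock per file ID


def store_chunk(fid, chunk_index, chunk_file):
    # Copy the chunk's bytes out of the (soon to be closed) upload file
    # object into an independent in-memory buffer.
    lock = chunk_locks.setdefault(fid, RLock())
    with lock:
        file_chunks.setdefault(fid, {})[chunk_index] = BytesIO(chunk_file.read())
```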

@@ -178,7 +178,7 @@ def _write_file_chunks(self, chunk_map, file_extension=''):
Returned file path should be manually removed by the user.

:param chunk_map: Mapping of integer index to file-like chunk
-:type chunk_map: dict of (int, StringIO)
+:type chunk_map: dict of (int, BytesIO)
:param file_extension: String extension to suffix the temporary file
with
:type file_extension: str
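The _write_file_chunks docstring above describes assembling the buffered chunks into a temporary file that the caller later removes. A hedged sketch of that behaviour, assuming the chunk map is the dict of (int, BytesIO) built above (the function name and signature are simplifications of the real method):

```python
import os
import tempfile


def write_file_chunks(chunk_map, file_extension=""):
    # Concatenate the chunks in index order into a temporary file and return
    # its path; the caller is responsible for deleting the file afterwards.
    fd, fpath = tempfile.mkstemp(suffix=file_extension)
    with os.fdopen(fd, "wb") as out:
        for idx in sorted(chunk_map):
            out.write(chunk_map[idx].getvalue())
    return fpath
```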