Adds OME-TIFF export #6838
Conversation
<div
  className="clearfix"
I only got rid of this div in this file
Great work & clean up 👍 I didn't test it, but left some comments.
};
}

function useRunningJobs(): [
Maybe you can have a look at the useStartAndPollJob hook in the job_hooks.ts module to check whether there is some intersection which could be extracted/reused.
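For what it's worth, here is a minimal sketch of the kind of shared helper that could be extracted; the Job shape and function names are assumptions for illustration, not taken from job_hooks.ts:

```typescript
// Hypothetical job shape; the real APIJob type in webKnossos differs.
type Job = { id: string; state: "SUCCESS" | "FAILURE" | "STARTED" | "PENDING" };

// Hypothetical shared helper that both useRunningJobs and useStartAndPollJob
// could call on each polling tick: fetch all jobs once and partition the
// tracked ones into finished and still-running.
async function fetchAndPartitionJobs(
  getJobs: () => Promise<Job[]>,
  trackedJobIds: string[],
): Promise<{ finished: Job[]; stillRunning: Job[] }> {
  const jobs = await getJobs();
  const tracked = jobs.filter((job) => trackedJobIds.includes(job.id));
  const isDone = (job: Job) => job.state === "SUCCESS" || job.state === "FAILURE";
  return {
    finished: tracked.filter(isDone),
    stillRunning: tracked.filter((job) => !isDone(job)),
  };
}
```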
const { lowestResolutionIndex, highestResolutionIndex } = selectedLayerInfos;
const [rawResolutionIndex, setResolutionIndex] = useState<number>(lowestResolutionIndex);
const resolutionIndex = clamp(lowestResolutionIndex, rawResolutionIndex, highestResolutionIndex);
const [exportFormat, setExportFormat] = useState<ExportFormat>(ExportFormat.OME_TIFF);
Is OME-TIFF widely supported and thus a good default?
Yes, I think so. We should probably track usage over time.
Backend LGTM :)
I can do a testing round once the other feedback is addressed.
setJobs(await getJobs());
}, interval);
}): [
Array<[string, string]>,
Suggested change:
- Array<[string, string]>,
+ Array<[jobKey: string, jobId: string]>,
This would make it more readable.
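For context, a small standalone example of the labeled tuple syntax; the values are made up:

```typescript
// Without labels, callers have to guess what each string means.
type UnlabeledJobs = Array<[string, string]>;

// With labeled tuple elements, signatures and hover tooltips become
// self-documenting; the emitted type is structurally identical.
type LabeledJobs = Array<[jobKey: string, jobId: string]>;

// Both types are interchangeable, so adopting the labels is a no-op at runtime.
const jobs: LabeledJobs = [["exportTiff", "someJobId"]];
const sameJobs: UnlabeledJobs = jobs;
```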
mostRecentSuccessfulJob,
};
usePolling(checkForJobs, runningJobs.length > 0 ? interval : null, [
runningJobs.map(([key]) => key).join("-"),
Why are these dependencies necessary? The delay (which depends on runningJobs) is already passed as a dependency within the hook; I would have thought that should suffice.
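For reference, this is the kind of behavior the question hinges on, sketched as a hypothetical polling hook (the name and signature are assumed, not the actual usePolling implementation):

```typescript
import { DependencyList, useEffect, useRef } from "react";

// Sketch of a usePolling-style hook (not the actual webKnossos implementation).
// The interval is torn down and re-created whenever the delay or any of the
// extra dependencies change; with only `delay` in the list, the timer would
// keep running untouched as long as the delay value itself stays the same.
function usePollingSketch(
  callback: () => Promise<void>,
  delay: number | null,
  extraDependencies: DependencyList = [],
): void {
  const savedCallback = useRef(callback);
  savedCallback.current = callback;

  useEffect(() => {
    if (delay == null) return;
    const id = setInterval(() => {
      savedCallback.current();
    }, delay);
    return () => clearInterval(id);
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, [delay, ...extraDependencies]);
}
```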
"The computation of the largest segment id for this dataset has finished. Please reload the page to see it.", | ||
); | ||
} | ||
setMostRecentSuccessfulJob(job); |
How does this ensure that this is the most recent job? Its result is rendered in line 600 and could be confusing if it shows outdated data.
Also, if there have been multiple jobs already, the callbacks will be called multiple times as far as I can see, so multiple toasts will be shown. I think adapting the interface of the hook so that the mostRecentSuccessfulJob is emitted somehow would be nicer.
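One possible shape for such an interface, sketched with assumed names and types (not the hook in this PR):

```typescript
import { useState } from "react";

// Hypothetical job shape; the real APIJob type differs.
type Job = { id: string; state: string; resultLink: string | null };

// Sketch: instead of invoking an onSuccess callback for every successful job
// seen while polling, the hook keeps only the newest one as state, so the
// caller can render it declaratively and show at most one toast per new job.
function useMostRecentSuccessfulJob(): [Job | null, (finishedJobs: Job[]) => void] {
  const [mostRecent, setMostRecent] = useState<Job | null>(null);

  const reportFinishedJobs = (finishedJobs: Job[]) => {
    const successful = finishedJobs.filter((job) => job.state === "SUCCESS");
    if (successful.length > 0) {
      // Assumes the jobs are ordered oldest-to-newest; keep only the last one.
      setMostRecent(successful[successful.length - 1]);
    }
  };

  return [mostRecent, reportFinishedJobs];
}
```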
async onSuccess(job) {
  if (job.resultLink != null) {
    const token = await doWithToken(async (t) => t);
    saveAs(`${job.resultLink}?token=${token}`);
  }
},
onFailure() {
Shouldn't these handle the isInitial parameter so that they don't fire for old jobs? Maybe it would be better to have dedicated callbacks for "past jobs".
Ah, this won't be a problem since the initial key function is not passed to the hook.
Very cool stuff, the UI works nicely for me!
I had an error in the export job, related to this now being available for coarser mags:
File "/home/f/scm/code/voxelytics/voxelytics/worker/jobs/export_tiff.py", line 209, in write_tiff_stack
with input_mag_view.get_buffered_slice_reader(
File "/home/f/scm/code/voxelytics/environments/miniconda/envs/vx-py39/lib/python3.9/site-packages/webknossos/dataset/view.py", line 683, in get_buffered_slice_reader
return BufferedSliceReader(
File "/home/f/scm/code/voxelytics/environments/miniconda/envs/vx-py39/lib/python3.9/site-packages/webknossos/dataset/_utils/buffered_slice_reader.py", line 61, in __init__
self.bbox_current_mag = absolute_bounding_box.in_mag(view.mag)
File "/home/f/scm/code/voxelytics/environments/miniconda/envs/vx-py39/lib/python3.9/site-packages/webknossos/geometry/bounding_box.py", line 306, in in_mag
assert (
AssertionError: topleft Vec3Int(0,31,0) is not aligned with the mag 2. Use BoundingBox.align_with_mag().
Since the user cannot see the error message, maybe this should be caught earlier? (Or just pass a mag-aligned bbox to the tiff export code?) This happened only in the tiff stack export, not for OME-TIFF.
With a mag-aligned bbox, both exports work fine :)
Another thing: the bbox tool by default places the bbox depending on the current viewport dimensions. This often leads to negative start values when zoomed out. Maybe it would make sense to limit the bboxes (or at least their initial params when hitting the plus sign) to the dataset bbox? Currently, the backend rejects export jobs with such bboxes, saying “The selected bounding box could not be parsed, must be x,y,z,width,height,depth”. So I don’t know what the worker would do with them.
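Purely as an illustration of the frontend-side clamping idea discussed above (the types and the helper are hypothetical, not the actual webKnossos utilities):

```typescript
type Vector3 = [number, number, number];
type BoundingBoxObject = { topLeft: Vector3; size: Vector3 };

// Sketch: clamp a user-drawn bbox to the dataset bounds and align its topleft
// with the selected mag, so the export job never receives negative or
// misaligned coordinates. Assumes the dataset topleft itself is mag-aligned.
function clampAndAlignBBox(
  bbox: BoundingBoxObject,
  dataset: BoundingBoxObject,
  mag: Vector3,
): BoundingBoxObject {
  const topLeft = bbox.topLeft.map((value, i) => {
    const clamped = Math.max(dataset.topLeft[i], value);
    // Round down to the next multiple of the mag.
    return Math.floor(clamped / mag[i]) * mag[i];
  }) as Vector3;
  const bottomRight = bbox.topLeft.map((value, i) =>
    Math.min(dataset.topLeft[i] + dataset.size[i], value + bbox.size[i]),
  ) as Vector3;
  const size = bottomRight.map((value, i) => Math.max(0, value - topLeft[i])) as Vector3;
  return { topLeft, size };
}
```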
The last commits look good 👍
Is this fixed now? Since this seems like an annoying problem with a potentially easy solution, I'd rather fix it now. The frontend has a
I fixed it in the vx worker code. Do you think it should rather be fixed in the frontend?
Ah, I didn't know that. That's even better 👍
The new worker should probably be released first before merging this feature, though.
The backend still rejects bboxes with negative topleft, though. Do you think we should adapt the bbox literal parsing in the backend to support negative toplefts? Or rather clamp it in the frontend?
I fixed the regex in the backend, so that negative bboxes are accepted. The worker then intersects the bboxes with the layer-bbox.
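For illustration of the kind of pattern change (shown here as a JavaScript/TypeScript regex; the actual backend parser is in Scala and may differ):

```typescript
// Illustration only: a bbox literal of the form "x,y,z,width,height,depth"
// where x, y, z may be negative but the dimensions must stay non-negative.
const bboxLiteralRegex = /^(-?\d+),(-?\d+),(-?\d+),(\d+),(\d+),(\d+)$/;

console.log(bboxLiteralRegex.test("0,-31,0,512,512,64")); // true
console.log(bboxLiteralRegex.test("0,31,0,512,-512,64")); // false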
Steps to test: use the ome-tiff-export branch of vx.