Drag'n'drop import for hybrid annotations #4347

Closed
normanrz opened this issue Nov 28, 2019 · 7 comments · Fixed by #4837

Comments

@normanrz
Member

Currently, hybrid annotations can only be imported through the dashboard and not through drag&drop.

@philippotto
Member

@youri-k Hit me up when you have some time to talk about this :)

@fm3 removed their assignment Aug 31, 2020
@youri-k
Contributor

youri-k commented Sep 24, 2020

Our approach looks like the following (a rough front-end sketch follows after the list):

  • frontend unzips the outer zip file
  • frontend applies the NML like in skeleton tracings
  • frontend sends the inner data zip to the backend
  • backend merges the zip data with the existing volume tracing
  • frontend reloads the segmentation layer after the backend finishes
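A minimal sketch of this flow on the front-end side, assuming JSZip for unzipping; `parseAndApplyNml`, `uploadVolumeDataZip`, and `reloadSegmentationLayer` are hypothetical helpers, and the file-name patterns for the NML and the inner data zip are assumptions, not taken from the actual webKnossos code:

```typescript
import JSZip from "jszip";

// Hypothetical helpers; names and signatures are assumptions, not the real webKnossos API.
declare function parseAndApplyNml(nmlContent: string): Promise<void>;
declare function uploadVolumeDataZip(annotationId: string, dataZip: Blob): Promise<void>;
declare function reloadSegmentationLayer(): void;

// Rough sketch of the drag'n'drop import flow for a hybrid annotation zip.
async function importHybridAnnotation(annotationId: string, droppedFile: File): Promise<void> {
  // 1. Front-end unzips the outer zip.
  const outerZip = await JSZip.loadAsync(droppedFile);

  // 2. Apply the NML like for skeleton tracings.
  const nmlFile = outerZip.file(/\.nml$/)[0];
  if (nmlFile != null) {
    await parseAndApplyNml(await nmlFile.async("string"));
  }

  // 3. Send the inner data zip (the volume data) to the back-end,
  //    which merges it with the existing volume tracing.
  const dataFile = outerZip.file(/data.*\.zip$/)[0];
  if (dataFile != null) {
    await uploadVolumeDataZip(annotationId, await dataFile.async("blob"));
  }

  // 4. Reload the segmentation layer once the back-end has finished merging.
  reloadSegmentationLayer();
}
```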

@normanrz
Member Author

One question that I have: would it be feasible (or a good idea) to load the full volume tracing into the client, just like the skeleton tracings?
I would assume the compressed data is not that large, but I don't have numbers.

@philippotto
Member

philippotto commented Sep 24, 2020

Would it be feasible (or a good idea) to load the full volume tracing to the client; just like the skeleton tracings?

I think it could* work out regarding the data sizes (however, the limits will probably be tighter than when doing it via the back-end). However, I don't really see the benefit in doing it that way (apart from being able to say that we do both skeleton and volume import in the front-end). The front-end would need to unzip all the data, apply it, build update actions, and then send them back to the server, which also needs to re-apply them. Just sending the whole bundled zip to the back-end and then reloading the segmentation layer feels way lighter (both implementation- and performance-wise) to me.

* @fm3 mentioned that the zip itself is quite small, but he wasn't sure whether the actual WKW files were compressed, so this could be a problem.
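To illustrate the comparison, here is a rough sketch of the front-end-heavy variant; the update-action shape and all helper names are assumptions, not the actual webKnossos update-action format:

```typescript
// Purely illustrative; action shapes and helper names are assumptions,
// not the real webKnossos data model.
type Vector3 = [number, number, number];

type UpdateBucketAction = {
  name: "updateBucket";
  value: { position: Vector3; zoomStep: number; base64Data: string };
};

// Hypothetical helpers standing in for real decompression/encoding/transport code.
declare function decompressBuckets(
  dataZip: Blob,
): Promise<Array<{ position: Vector3; zoomStep: number; data: Uint8Array }>>;
declare function toBase64(data: Uint8Array): string;
declare function sendUpdateActions(actions: UpdateBucketAction[]): Promise<void>;

// Front-end-heavy variant: unzip all volume data in the browser, turn every
// bucket into an update action, and send those to the server, which then has
// to re-apply them (versus just posting the bundled zip to the back-end).
async function importViaUpdateActions(dataZip: Blob): Promise<void> {
  const buckets = await decompressBuckets(dataZip);
  const actions: UpdateBucketAction[] = buckets.map((bucket) => ({
    name: "updateBucket",
    value: {
      position: bucket.position,
      zoomStep: bucket.zoomStep,
      base64Data: toBase64(bucket.data),
    },
  }));
  await sendUpdateActions(actions);
}
```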

@normanrz
Member Author

Maybe it is not worth it for this one use case alone, but it could make a bunch of features less complex. I am thinking of multi-res tracing, merger mode, export, and viewing.

@philippotto
Member

Ok, I think I misunderstood your initial question. I thought you were talking about loading the full volume data from the zip directly into the client, but you meant always having all the volume data locally available? This might be feasible, but we'd then need to distinguish between segmentation layers and volume tracings (since segmentation layers cannot be loaded fully). And for volume tracings with a fallback segmentation, the features probably wouldn't get noticeably less complex.

However, one could think about having two layers for the segmentation: the actual segmentation layer on disk and the volume tracing. Then one could also toggle the original segmentation layer without having to permanently unlink it, for example. Always loading all the volume data might be easier then.
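A rough sketch of what such a two-layer model could look like as types; all names are hypothetical and not the actual webKnossos data model:

```typescript
// Hypothetical layer model; names are illustrative, not the real webKnossos types.
type OnDiskSegmentationLayer = {
  kind: "segmentation";
  name: string;
  isVisible: boolean; // could be toggled without permanently unlinking it
};

type VolumeTracingLayer = {
  kind: "volumeTracing";
  tracingId: string;
  fallbackLayerName: string | null; // points at the on-disk layer, if any
  isVisible: boolean;
  isFullyLoaded: boolean; // the volume tracing could be held fully in memory
};

// A hybrid annotation would then carry both layers side by side.
type HybridAnnotationLayers = {
  segmentationLayer: OnDiskSegmentationLayer | null;
  volumeTracing: VolumeTracingLayer;
};
```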

This probably deserves to be a separate issue, though.

@normanrz
Member Author

This probably deserves to be a separate issue, though.

Probably right 😅
