
Versatile distance transforms (Euclidean and Geodesic) #1332

Open
tvercaut opened this issue Dec 8, 2020 · 8 comments
Labels
- Feature request
- Module: networks (network, layers, blocks definitions in PyTorch)
- Module: transform (data transforms for preprocessing and postprocessing)
- WG: Research (for the research working group)
- WG: Transforms (for the transforms working group)

Comments

@tvercaut
Member

tvercaut commented Dec 8, 2020

Is your feature request related to a problem? Please describe.
Several segmentation pipelines, and in particular interactive segmentation ones, rely on some form of distance transform (e.g. Euclidean, Chamfer, or Geodesic distance transforms).

At the moment, there is no off-the-shelf PyTorch function to compute these.

Describe the solution you'd like
A MONAI-based implementation of distance transforms (ideally differentiable) would be fantastic. It would accelerate uptake of such methods.

Describe alternatives you've considered

Additional context
Example use cases of distance transforms in segmentation:

@wyli
Contributor

wyli commented Dec 9, 2020

Also used in MONAI metrics computation:

distance_transform_cdt, _ = optional_import("scipy.ndimage.morphology", name="distance_transform_cdt")
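
For reference, a minimal sketch of how a chamfer distance transform can feed a surface-distance style computation. This is illustrative only and is not MONAI's actual metric code; the array shapes and the erosion-based border extraction are my own choices:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_cdt

pred = np.zeros((64, 64), dtype=bool); pred[20:40, 20:40] = True
gt = np.zeros((64, 64), dtype=bool);   gt[22:42, 18:38] = True

# chamfer distance from every voxel to the nearest ground-truth foreground voxel
dist_to_gt = distance_transform_cdt(~gt)

# average chamfer distance of the predicted surface to the ground truth
pred_surface = pred & ~binary_erosion(pred)
avg_surface_dist = dist_to_gt[pred_surface].mean()
```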

@ddrobny

ddrobny commented Apr 26, 2021

There is a recent paper using a convolution-based and differentiable approximation of the distance transform:
https://link.springer.com/chapter/10.1007%2F978-3-030-71278-5_31
There doesn't seem to be related code published yet, but it might be worth looking into.
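
Without that code, here is a rough idea of what a differentiable distance-transform approximation can look like: an iterative chamfer update where the hard minimum is replaced by a soft minimum (logsumexp). This is a generic sketch for intuition, not the method of the linked paper; the function name, kernel weights, and hyperparameters (`iters`, `tau`) are arbitrary choices.

```python
import torch
import torch.nn.functional as F

def soft_chamfer_2d(mask: torch.Tensor, iters: int = 64, tau: float = 0.05) -> torch.Tensor:
    """Differentiable chamfer-style distance map for a (B, 1, H, W) float mask
    (1 = foreground seeds). Soft-min relaxation; NOT the cited paper's method."""
    big = float(mask.shape[-2] + mask.shape[-1])
    dist = big * (1.0 - mask)                                 # 0 at seeds, large elsewhere
    # local step costs to the 8 neighbours and the centre (1 / 1.41 chamfer weights)
    step = torch.tensor([1.41, 1.0, 1.41,
                         1.0,  0.0, 1.0,
                         1.41, 1.0, 1.41], device=mask.device).view(1, 9, 1)
    for _ in range(iters):                                    # each pass propagates roughly one pixel
        padded = F.pad(dist, (1, 1, 1, 1), value=big)         # avoid fake zero-cost border seeds
        cand = F.unfold(padded, kernel_size=3) + step         # (B, 9, H*W) candidate distances
        new = -tau * torch.logsumexp(-cand / tau, dim=1)      # soft minimum over the neighbourhood
        dist = new.view_as(mask[:, 0]).unsqueeze(1) * (1.0 - mask)  # pin seeds back to 0
    return dist
```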

@tvercaut
Member Author

Thanks @ddrobny.

On a related note, TensorFlow has a GPU implementation of the Felzenszwalb & Huttenlocher distance transform:

For Euclidean distance, we can compare the performance (and read the interesting comments) with an established (but GPL, so don't look at the code) implementation:
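
For context, the Felzenszwalb & Huttenlocher algorithm reduces to a 1D lower-envelope pass applied along each axis in turn; a small NumPy sketch of that 1D step (for the squared distance) is below. Background samples should be set to a large finite value rather than infinity to keep the arithmetic well defined.

```python
import numpy as np

def edt_1d_squared(f: np.ndarray) -> np.ndarray:
    """1D squared Euclidean distance transform of sampled costs f
    (Felzenszwalb & Huttenlocher lower-envelope pass). For a binary input,
    use f = 0 at foreground samples and f = <large finite value> elsewhere."""
    n = len(f)
    d = np.empty(n)
    v = np.zeros(n, dtype=int)            # indices of parabolas in the lower envelope
    z = np.empty(n + 1)                   # boundaries between adjacent parabolas
    k = 0
    z[0], z[1] = -np.inf, np.inf
    for q in range(1, n):
        s = ((f[q] + q * q) - (f[v[k]] + v[k] * v[k])) / (2 * q - 2 * v[k])
        while s <= z[k]:
            k -= 1
            s = ((f[q] + q * q) - (f[v[k]] + v[k] * v[k])) / (2 * q - 2 * v[k])
        k += 1
        v[k] = q
        z[k] = s
        z[k + 1] = np.inf
    k = 0
    for q in range(n):
        while z[k + 1] < q:
            k += 1
        d[q] = (q - v[k]) ** 2 + f[v[k]]
    return d
```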

@wyli wyli added Module: networks network, layers, blocks definitions in PyTorch Module: transform data transforms for preprocessing and postprocessing. WG: Transforms For the transforms working group WG: Research For the research working group labels May 13, 2021
@tvercaut
Member Author

tvercaut commented Jun 28, 2022

As mentioned in #4603, @masadcv just released FastGeodis, a package to compute Geodesic and Euclidean distance maps in PyTorch.
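
As I read the FastGeodis README, usage is roughly as follows (parameter names and ordering may differ between versions, so treat this as a sketch and check the package documentation):

```python
import torch
import FastGeodis

device = "cuda" if torch.cuda.is_available() else "cpu"
image = torch.rand(1, 1, 256, 256, device=device)     # intensity image, (B, C, H, W)
seeds = torch.ones(1, 1, 256, 256, device=device)     # 0 at seed points, 1 elsewhere
seeds[..., 128, 128] = 0.0

v, iterations = 1e10, 2
geodesic = FastGeodis.generalised_geodesic2d(image, seeds, v, 1.0, iterations)   # lamb=1.0 -> geodesic
euclidean = FastGeodis.generalised_geodesic2d(image, seeds, v, 0.0, iterations)  # lamb=0.0 -> Euclidean
```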

@grlee77
Contributor

grlee77 commented Jul 20, 2022

We plan to have an equivalent of SciPy's distance_transform_edt (Euclidean distance transform) for CuPy arrays in the upcoming cuCIM 22.08 release. The GPU implementation currently supports 2D and 3D only, as compared to the general n-D one in SciPy.

see: rapidsai/cucim#318
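
A minimal sketch of the cuCIM call, assuming the SciPy-compatible signature described in rapidsai/cucim#318 (import path as of cuCIM 22.08; check the release notes for your version):

```python
import cupy as cp
from cucim.core.operations.morphology import distance_transform_edt

mask = cp.ones((256, 256, 128), dtype=bool)
mask[128, 128, 64] = False                                      # distances are measured to the zero voxels
dist = distance_transform_edt(mask, sampling=(1.0, 1.0, 2.0))   # anisotropic spacing, as in SciPy
```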

@matt3o
Contributor

matt3o commented Apr 27, 2023

@grlee77 Thanks a lot! I just tried it on the DeepEdit code and it appears to be 100% compatible with what SciPy does, really awesome.
For everyone else trying this out: the FastGeodis code yields different results than the SciPy one; see also masadcv/FastGeodis#11 about that. I did not verify what impact this has, so it might still work as usual.
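
A quick way to reproduce that comparison (sketch; module paths as above):

```python
import numpy as np
import cupy as cp
from scipy.ndimage import distance_transform_edt as edt_cpu
from cucim.core.operations.morphology import distance_transform_edt as edt_gpu

mask = np.random.rand(64, 64, 64) > 0.3
ref = edt_cpu(mask)
gpu = cp.asnumpy(edt_gpu(cp.asarray(mask)))
print(np.abs(ref - gpu).max())                  # ~0 if the two implementations agree
```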

@matt3o
Contributor

matt3o commented Apr 27, 2023

Also, for anyone wondering what the impact of this would be, I did some profiling on a full 3D volume. Here is the corresponding snippet: https://gist.github.com/matt3o/0de8a24a10135ed24de6f77ab89e15b1

And the results were:

INFO:root:torch.Size([400, 400, 284])
INFO:root:distance_transform_cdt_scipy: 20 runs took 6.403534 seconds, which means 0.320177 seconds per run
INFO:root:distance_transform_edt_scipy: 20 runs took 99.965921 seconds, which means 4.998296 seconds per run
INFO:root:distance_transform_edt_cupy: 20 runs took 0.162181 seconds, which means 0.008109 seconds per run.

Edit: Added the cdt transform too, since that is the one used in DeepEdit. CDT runs a lot faster on CPU; still, it's a huge difference for big volumes.
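
For completeness, a minimal re-creation of such a benchmark (this is not the linked gist; the synthetic volume and run counts are arbitrary):

```python
import time
import numpy as np
import cupy as cp
from scipy.ndimage import distance_transform_cdt, distance_transform_edt
from cucim.core.operations.morphology import distance_transform_edt as edt_gpu

vol = np.zeros((400, 400, 284), dtype=bool)
vol[50:350, 50:350, 30:250] = True              # synthetic foreground block
vol_gpu = cp.asarray(vol)

def bench(fn, arg, runs=20, sync=False):
    start = time.perf_counter()
    for _ in range(runs):
        fn(arg)
        if sync:
            cp.cuda.Stream.null.synchronize()   # wait for the GPU kernels to finish
    total = time.perf_counter() - start
    return total, total / runs

print("distance_transform_cdt (SciPy):", bench(distance_transform_cdt, vol))
print("distance_transform_edt (SciPy):", bench(distance_transform_edt, vol))
print("distance_transform_edt (cuCIM):", bench(edt_gpu, vol_gpu, sync=True))
```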

wyli added a commit to wyli/MONAI that referenced this issue Jul 5, 2023
Fixes Project-MONAI#4103

### Description
- considers spacing and subvoxel borders when computing surface Dice
- reimplemented
http://medicaldecathlon.com/files/Surface_distance_based_measures.ipynb
using pytorch API
- there's a possibility of speed-up once
Project-MONAI#1332 is addressed


### Types of changes
- [x] Non-breaking change (fix or new feature that would not break
existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing
functionality to change).
- [x] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u
--net --coverage`.
- [x] Quick tests passed locally by running `./runtests.sh --quick
--unittests --disttests`.
- [x] In-line docstrings updated.
- [x] Documentation updated, tested `make html` command in the `docs/`
folder.

---------

Signed-off-by: Wenqi Li <[email protected]>
@matt3o
Contributor

matt3o commented Sep 29, 2023

As of #6981, the Euclidean distance transform will be part of MONAI from version 1.3 onward. It has GPU support via cuCIM and runs a lot faster; it falls back to SciPy if necessary (see the docs for details).

I will also soon publish an overhauled version of DeepEdit with much faster transforms based on torch and this GPU-based distance transform, which will hopefully be integrated into MONAI as well.
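
Usage should look roughly like the sketch below; the class name and dispatch behaviour are as I understand them from #6981, so please double-check against the released MONAI 1.3 documentation:

```python
import torch
from monai.transforms import DistanceTransformEDT

mask = torch.zeros(1, 64, 64, 64)               # channel-first binary mask
mask[:, 20:40, 20:40, 20:40] = 1.0

edt = DistanceTransformEDT()                    # uses cuCIM for GPU tensors, SciPy on CPU
dist = edt(mask.cuda() if torch.cuda.is_available() else mask)
```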
