
Knowledge Distillation for UNet

An implementation of knowledge distillation for semantic segmentation: a small (student) UNet is trained from a larger (teacher) UNet, reducing the size of the network while achieving performance close to that of the heavier model.
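In this setting the student is usually trained on two signals: the ground-truth segmentation masks and the teacher's softened per-pixel predictions. The sketch below illustrates such a combined loss; it is an assumption of a typical PyTorch setup, not the repository's exact code, and the hyperparameter names `alpha` and `temperature` are illustrative.

```python
# Minimal sketch of a distillation loss for segmentation (not the repo's exact code).
# The student is supervised by the ground-truth masks (cross-entropy) and by the
# teacher's softened pixel-wise logits (KL divergence at a temperature).

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, target_masks,
                      alpha=0.5, temperature=4.0):
    """Blend hard-label segmentation loss with soft-label distillation.

    student_logits, teacher_logits: (N, C, H, W) raw per-pixel class scores
    target_masks: (N, H, W) integer class labels per pixel
    alpha, temperature: illustrative hyperparameters
    """
    # Standard supervised loss against the ground-truth masks.
    hard_loss = F.cross_entropy(student_logits, target_masks)

    # Soft targets: match the student's per-pixel distribution to the teacher's,
    # with a temperature that softens both distributions. The T^2 factor keeps
    # gradient magnitudes comparable across temperatures (Hinton et al.).
    soft_student = F.log_softmax(student_logits / temperature, dim=1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * (temperature ** 2)

    return alpha * soft_loss + (1 - alpha) * hard_loss
```

During training, the teacher runs in evaluation mode with gradients disabled, and only the student's parameters are updated with this combined loss.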

Results:

Dataset: Carvana Image Masking Challenge

[Figure: results of models trained without knowledge distillation]

[Figure: results of models trained with knowledge distillation]

References
