# kolmogorov-arnold-representation

Here are 8 public repositories matching this topic...


An implementation of the KAN (Kolmogorov-Arnold Network) architecture, which uses learnable activation functions, applied to knowledge distillation on the MNIST handwritten-digits dataset. The project distills a three-layer teacher KAN into a more compact two-layer student model and compares the distilled student's performance against a non-distilled baseline.

  • Updated May 11, 2024
  • Python
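The description above combines two techniques: KAN layers, where each input-output edge carries its own learnable activation function, and soft-target knowledge distillation. Below is a minimal PyTorch sketch of how such a setup could look; it is illustrative only, not the repository's actual code. The RBF-based edge activations stand in for the B-splines typically used in KANs, and all names (`KANLayer`, `distillation_loss`, the layer widths) are hypothetical.

```python
# Illustrative sketch only -- not the repository's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class KANLayer(nn.Module):
    """A simplified KAN-style layer: each input-output edge applies a
    learnable activation, parameterized here as a weighted sum of
    Gaussian radial basis functions (a stand-in for B-splines)."""
    def __init__(self, in_dim, out_dim, num_basis=8):
        super().__init__()
        # Fixed RBF centers spread over a typical activation range.
        self.register_buffer("centers", torch.linspace(-2.0, 2.0, num_basis))
        # Learnable coefficients: one set per (output, input) edge.
        self.coeffs = nn.Parameter(torch.randn(out_dim, in_dim, num_basis) * 0.1)

    def forward(self, x):
        # x: (batch, in_dim) -> rbf: (batch, in_dim, num_basis)
        rbf = torch.exp(-((x.unsqueeze(-1) - self.centers) ** 2))
        # Apply each edge's learnable activation, then sum over inputs.
        return torch.einsum("bik,oik->bo", rbf, self.coeffs)

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.5):
    """Standard soft-target distillation: KL divergence to the teacher's
    temperature-softened distribution, blended with cross-entropy on labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard

# Teacher: three KAN layers; student: a more compact two-layer model,
# mirroring the teacher/student split in the project description.
teacher = nn.Sequential(KANLayer(784, 64), KANLayer(64, 32), KANLayer(32, 10))
student = nn.Sequential(KANLayer(784, 16), KANLayer(16, 10))
```

During training, the teacher's logits would be computed with gradients disabled and the student optimized against `distillation_loss`; the non-distilled baseline would train the same student architecture with plain cross-entropy.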
