Rust newtype with guarantees 🇺🇦 🦀
Updated Dec 20, 2024 - Rust
Geom3D: Geometric Modeling on 3D Structures, NeurIPS 2023
Equivariant Transformer (ET) layers are image-to-image mappings that incorporate prior knowledge of invariances with respect to continuous transformation groups (ICML 2019). Paper: https://arxiv.org/abs/1901.11399
Equivariant Subgraph Aggregation Networks (ICLR 2022 Spotlight)
Relative representations can be leveraged to solve "latent communication" tasks: from zero-shot model stitching to latent-space comparison across diverse settings.
Official PyTorch and JAX Implementation of "Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks"
A List of Papers on Theoretical Foundations of Graph Neural Networks
Size-Invariant Graph Representations for Graph Classification Extrapolations (ICML 2021 Long Talk)
Official PyTorch implementation of Bispectral Neural Networks, ICLR 2023
Continuous regular group convolutions for PyTorch
Code for "Improving Stain Invariance of CNNs for Segmentation by Fusing Channel Attention and Domain-Adversarial Training"
On the forward invariance of Neural ODEs: performance guarantees for policy learning
Measurement invariance explorer - R package to explore measurement invariance
MMD-B-Fair: Learning Fair Representations with Statistical Testing (AISTATS 2023)
Non-parametric hypothesis tests for identifying distributional group symmetries from data
Official code for NeurIPS 2024 paper "A Canonicalization Perspective on Invariant and Equivariant Learning".
Official PyTorch Implementation of "A General Framework for Robust G-Invariance in G-Equivariant Networks," NeurIPS 2023
MATLAB implementation of trajectory invariants.
Calculate invariant trajectory representations from trajectory data and generate new trajectories from the invariants.
This is a TensorFlow implementation of VICReg - a self-supervised learning architecture that prevents collapse in an intuitive manner using a loss function that (1) keeps the variance of each embedding dimension over a batch above a threshold and (2) decorrelates pairs of embedding dimensions over a batch by driving their covariances toward 0. Training was done on a TPU in Colab.
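The two anti-collapse terms described above can be sketched in a few lines. This is a minimal NumPy illustration, not the repository's actual TensorFlow code; the function names and the threshold value are illustrative assumptions.

```python
import numpy as np

def variance_term(z, threshold=1.0, eps=1e-4):
    # Hinge loss that keeps the std of each embedding dimension
    # above `threshold` over the batch, penalizing collapse.
    std = np.sqrt(z.var(axis=0) + eps)
    return np.mean(np.maximum(0.0, threshold - std))

def covariance_term(z):
    # Drive off-diagonal covariance entries toward 0 so that
    # pairs of embedding dimensions are decorrelated.
    n, d = z.shape
    zc = z - z.mean(axis=0)
    cov = (zc.T @ zc) / (n - 1)
    off_diag = cov - np.diag(np.diag(cov))
    return np.sum(off_diag ** 2) / d

rng = np.random.default_rng(0)
z = rng.standard_normal((256, 8))  # batch of 256 embeddings, dim 8
print(variance_term(z), covariance_term(z))
```

A fully collapsed batch (all embeddings identical) drives the variance term toward the threshold, while well-spread, decorrelated embeddings keep both terms near zero.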