Training two separate expert neural networks and a gating network that switches between them.
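The first entry describes a small mixture-of-experts setup: two independently trained expert networks plus a gater that weighs or selects between their outputs. A minimal NumPy sketch of that idea (all dimensions, weights, and function names here are illustrative assumptions, not the repository's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not taken from the repo.
d_in, d_hidden, d_out = 4, 8, 3

def init_expert():
    # One small two-layer MLP expert.
    return (rng.normal(size=(d_in, d_hidden)) * 0.1,
            rng.normal(size=(d_hidden, d_out)) * 0.1)

def expert_forward(params, x):
    w1, w2 = params
    return np.maximum(x @ w1, 0.0) @ w2  # ReLU MLP

experts = [init_expert(), init_expert()]        # two separate experts
w_gate = rng.normal(size=(d_in, 2)) * 0.1       # gater over the 2 experts

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def moe_forward(x):
    # The gater maps each input to a distribution over the two experts,
    # then the final output is the gate-weighted mixture of expert outputs.
    g = softmax(x @ w_gate)                                            # (batch, 2)
    outs = np.stack([expert_forward(p, x) for p in experts], axis=1)   # (batch, 2, d_out)
    return (g[..., None] * outs).sum(axis=1)                           # (batch, d_out)

x = rng.normal(size=(5, d_in))
y = moe_forward(x)
print(y.shape)  # → (5, 3)
```

A hard-switching gater would instead pick `np.argmax` over the gate logits and route each input to a single expert; the soft mixture above is the differentiable variant commonly used for joint training.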
A simple, easy-to-understand library for diffusion models using Flax and JAX. Includes detailed notebooks on DDPM, DDIM, and EDM with simplified mathematical explanations. Made as part of my journey of learning and experimenting with state-of-the-art generative AI.
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
A compilation of the best multi-agent papers
Training-free, post-training attention with sub-quadratic complexity, implemented in OpenAI Triton.
Official Implementation of SEA: Sparse Linear Attention with Estimated Attention Mask (ICLR 2024)
🚀🚀🚀 A collection of some awesome public YOLO object detection series projects.
Generative Pre-trained Transformer in PyTorch
Multi-Corpus Emotion Recognition Method based on Cross-Modal Gated Attention Fusion
🔥🔥🔥 A collection of some awesome public SNN (Spiking Neural Network) projects.
[CVPR 2024] Official implementation of the paper "Salience DETR: Enhancing Detection Transformer with Hierarchical Salience Filtering Refinement"
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
Scenic: A Jax Library for Computer Vision Research and Beyond
Implementation of MambaFormer in PyTorch and Zeta, from the paper "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learning Tasks"
The message-passing GAN (https://arxiv.org/abs/2106.11535) and generative adversarial particle transformer (https://arxiv.org/abs/2211.10295) architectures for generating particle clouds
Explainable Neural Subgraph Matching with Graph Learnable Multi-hop Attention Networks
PyTorch Implementation for the paper "Let Me Help You! Neuro-Symbolic Short-Context Action Anticipation" accepted to RA-L'24.
Julia Implementation of Transformer models
Official implementation for "GLASS: Global to Local Attention for Scene-Text Spotting" (ECCV'22)