NLP study-notes notebooks, covering the understanding of classic models together with related hands-on practice.
Updated Apr 6, 2020 · Jupyter Notebook
Deep learning research implemented on notebooks using PyTorch.
Assignments and lab notebooks from the NLP Specialization by DeepLearning.AI.
This notebook explains the steps to reproduce the results of the publication "Is Space-Time Attention All You Need for Video Understanding?"
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
This collection of notebooks is based on the Dive into Deep Learning book. All of the notes are written in PyTorch and the d2l/torch library.
Notebooks for running and visualizing results using trained models for linguistic complexity.
Snippets of NLP frameworks.
A basic notebook for implementing attention.
Notes on ML and DL with Jupyter notebooks (Python).
Exercise notebooks for CVND, the Udacity Computer Vision Nanodegree.
QuillGPT is a PyTorch implementation of the GPT decoder block based on the architecture from the "Attention Is All You Need" paper by Vaswani et al. Additionally, this repository contains two pre-trained models — Shakespearean GPT and Harpoon GPT — a Streamlit playground, a containerized FastAPI microservice, and training/inference scripts and notebooks.
In this notebook, we look at how attention is implemented. We will focus on implementing attention in isolation from a larger model.
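The "attention in isolation" idea above usually boils down to scaled dot-product attention: scores between queries and keys, a softmax, then a weighted average of the values. A minimal sketch, written in NumPy for illustration (the notebooks listed here mostly use PyTorch; the function name is my own):

```python
import numpy as np

def scaled_dot_product_attention(query, key, value):
    # Similarity scores between each query and each key, scaled by sqrt(d_k)
    d_k = query.shape[-1]
    scores = query @ key.T / np.sqrt(d_k)
    # Numerically stable softmax over the key axis: rows of weights sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # The output is the attention-weighted average of the value vectors
    return weights @ value, weights

# Toy example: sequence of 3 tokens with embedding dimension 4
q = np.random.randn(3, 4)
k = np.random.randn(3, 4)
v = np.random.randn(3, 4)
out, w = scaled_dot_product_attention(q, k, v)
```

Each output row is a convex combination of the value rows, which is why the attention weights are interpretable as "how much each token looks at each other token."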
A simple, easy-to-understand library for diffusion models using Flax and Jax. Includes detailed notebooks on DDPM, DDIM, and EDM with simplified mathematical explanations. Made as part of my journey for learning and experimenting with state-of-the-art generative AI.