Deep Learning - The Straight Dope

Abstract

This repo contains an incremental sequence of notebooks designed to teach deep learning, MXNet, and the gluon interface. Our goal is to leverage the strengths of Jupyter notebooks to present prose, graphics, equations, and code together in one place. If we're successful, the result will be a resource that is simultaneously a book, course material, a prop for live tutorials, and a source of useful code to plagiarise (with our blessing). To our knowledge, there's no resource that either (1) teaches the full breadth of concepts in modern deep learning or (2) interleaves an engaging textbook with runnable code. We'll find out by the end of this venture whether or not that void exists for a good reason.

Another unique aspect of this book is its authorship process. We are developing this resource entirely in public view and are making it available for free in its entirety. While the book has a few primary authors to set the tone and shape the content, we welcome contributions from the community and hope to coauthor chapters and entire sections with experts and community members. Already we've received contributions ranging from typo corrections to full working examples.

Implementation with Apache MXNet

Throughout this book, we rely upon MXNet to teach core concepts, advanced topics, and a full complement of applications. MXNet is widely used in production environments owing to its strong reputation for speed. With gluon, MXNet's new imperative interface (currently in alpha), doing research in MXNet is now easy as well.
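
To give a flavor of that imperative style, here is a minimal sketch of our own (not taken from the notebooks; the layer sizes, dummy batch, and hyperparameters are arbitrary) of defining and updating a small gluon network:

    import mxnet as mx
    from mxnet import nd, autograd, gluon

    # Define a small multilayer perceptron imperatively.
    net = gluon.nn.Sequential()
    with net.name_scope():
        net.add(gluon.nn.Dense(64, activation='relu'))
        net.add(gluon.nn.Dense(10))
    net.collect_params().initialize(mx.init.Xavier())

    loss_fn = gluon.loss.SoftmaxCrossEntropyLoss()
    trainer = gluon.Trainer(net.collect_params(), 'sgd', {'learning_rate': 0.1})

    data = nd.random.normal(shape=(32, 784))   # placeholder batch
    label = nd.zeros((32,))                    # placeholder labels
    with autograd.record():                    # record the forward pass
        loss = loss_fn(net(data), label)
    loss.backward()                            # gradients computed eagerly
    trainer.step(batch_size=32)                # one SGD update

Because execution is eager, you can inspect any intermediate value with ordinary print statements and the usual Python debugger.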

Dependencies

To run these notebooks, you'll want to build MXNet from source. Fortunately, this is easy (especially on Linux) if you follow these instructions. You'll also want to install Jupyter and use Python 3 (because it's 2017).
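
As a quick sanity check (our suggestion, not part of any official instructions), you can confirm from Python that the build and the gluon interface are importable:

    import mxnet as mx
    from mxnet import nd, gluon

    print(mx.__version__)        # the version you just built
    x = nd.ones((2, 3))          # a small ndarray on the CPU
    print((x * 2).asnumpy())     # forces computation; should print all 2s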

Slides

The authors (& others) are increasingly giving talks based on the content in this book. Some of these slide decks (like the 6-hour KDD 2017 deck) are gigantic, so we're collecting them separately in this repo. Contribute there if you'd like to share tutorials or course material based on this book.

Table of contents

Part 1: Crashcourse

Part 2: Introduction to Supervised Learning

Part 3: Deep neural networks (DNNs)

Part 3.5: gluon Plumbing

Part 4: Convolutional neural networks (CNNs)

Part 5: Recurrent neural networks (RNNs)

Part 6: Computer vision (CV)

  • Roadmap Network of networks (inception & co)
  • Roadmap Residual networks
  • Object detection
  • Roadmap Fully-convolutional networks
  • Roadmap Siamese (conjoined?) networks
  • Roadmap Embeddings (pairwise and triplet losses)
  • Roadmap Inceptionism / visualizing feature detectors
  • Roadmap Style transfer
  • Fine-tuning

Part 7: Natural language processing (NLP)

  • Roadmap Word embeddings (Word2Vec)
  • Roadmap Sentence embeddings (SkipThought)
  • Roadmap Sentiment analysis
  • Roadmap Sequence-to-sequence learning (machine translation)
  • Roadmap Sequence transduction with attention (machine translation)
  • Roadmap Named entity recognition
  • Roadmap Image captioning
  • Tree-LSTM for semantic relatedness

Part 8: Unsupervised Learning

  • Roadmap Introduction to autoencoders
  • Roadmap Convolutional autoencoders (introduce upconvolution)
  • Roadmap Denoising autoencoders
  • Roadmap Variational autoencoders
  • Roadmap Clustering

Part 9: Adversarial learning

  • Roadmap Two Sample Tests
  • Roadmap Finding adversarial examples
  • Roadmap Adversarial training

Part 10: Generative adversarial networks (GANs)

  • 1 - Introduction to GANs
  • Roadmap DCGAN
  • Roadmap Wasserstein-GANs
  • Roadmap Energy-based GANs
  • Roadmap Conditional GANs
  • Roadmap Image transduction GANs (Pix2Pix)
  • Roadmap Learning from Synthetic and Unsupervised Images

Part 11: Deep reinforcement learning (DRL)

  • Roadmap Introduction to reinforcement learning
  • Roadmap Deep contextual bandits
  • Roadmap Deep Q-networks
  • Roadmap Policy gradient
  • Roadmap Actor-critic gradient

Part 12: Variational methods and uncertainty

  • Roadmap Dropout-based uncertainty estimation (BALD)
  • Roadmap Weight uncertainty (Bayes-by-backprop)
  • Roadmap Variational autoencoders

Part 13: Optimization

Part 14: Distributed and high-performance learning

Part 15: Hacking MXNet

  • Custom Operators
  • ...

Part 16: Audio Processing

  • Roadmap Intro to automatic speech recognition
  • Roadmap Connectionist temporal classification (CTC) for unaligned sequences
  • Roadmap Combining static and sequential data

Part 17: Recommender systems

  • Roadmap Latent factor models
  • Roadmap Deep latent factor models
  • Roadmap Bilinear models
  • Roadmap Learning from implicit feedback

Part 18: Time series

  • Roadmap Forecasting
  • Roadmap Modeling missing data
  • Roadmap Combining static and sequential data

Part 19: Tensor Methods

  • Roadmap Introduction to tensor algebra
  • Roadmap Tensor decomposition
  • Roadmap Tensorized neural networks

Appendix 1: Cheatsheets

  • Roadmap gluon
  • Roadmap PyTorch to MXNet
  • Roadmap Tensorflow to MXNet
  • Roadmap Keras to MXNet
  • Roadmap Math to MXNet

Choose your own adventure

We've designed these tutorials so that you can traverse the curriculum in more than one way.

  • Anarchist - Choose whatever you want to read, whenever you want to read it.
  • Imperialist - Proceed through all tutorials in order. In this fashion you will be exposed to each model first from scratch, writing all of the code yourself except for the basic linear algebra primitives and automatic differentiation (a minimal sketch of this style appears after this list).
  • Capitalist - If you don't care how things work (or already know) and just want to see working code in gluon, you can skip the from-scratch tutorials and go straight to the production-like code using the high-level gluon front end.
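
For a taste of the Imperialist path, here is a minimal from-scratch sketch of our own (the linear model, shapes, and learning rate are placeholders, not the book's): only ndarray primitives plus automatic differentiation, with no gluon layers or Trainer.

    import mxnet as mx
    from mxnet import nd, autograd

    # Parameters of a bare linear model, tracked by autograd.
    w = nd.random.normal(shape=(784, 10))
    b = nd.zeros((10,))
    for param in (w, b):
        param.attach_grad()

    X = nd.random.normal(shape=(32, 784))    # dummy inputs
    y = nd.one_hot(nd.zeros((32,)), 10)      # dummy one-hot labels

    with autograd.record():
        yhat = nd.dot(X, w) + b              # forward pass written by hand
        loss = nd.sum((yhat - y) ** 2)       # squared error written by hand
    loss.backward()

    lr = 0.01
    for param in (w, b):
        param[:] = param - lr * param.grad   # manual SGD update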

Authors

This evolving creature is a collaborative effort. The lead writers, assimilators, and coders are listed on the contributors tab.

Inspiration

In creating these tutorials, we've drawn inspiration from some of the resources that allowed us to learn deep learning and machine learning with other libraries in the past.

Contribute

Already, in the short time this project has been off the ground, we've gotten some helpful PRs from the community with pedagogical suggestions, typo corrections, and other useful fixes. If you're inclined, please contribute!
