Simple tutorials to get familiar with the Trax library for building, training, and running machine learning models.
Trax is a library developed and maintained by Google Brain. It builds on TensorFlow and JAX (an enhanced, accelerator-aware version of NumPy). Trax is reputed to be faster and simpler to code than its predecessors because it was developed from the ground up, without the backward-compatibility constraints that add weight and complexity to older frameworks like PyTorch or TensorFlow/Keras. Trax is therefore a deep-learning library focused on clear code and speed.
According to its authors (who were also involved in the development of TensorFlow several years ago), Trax is especially recommended for sequence models, such as the Transformer, and for models used in natural language processing.
The Trax manual can be found here.
These tutorials are extracted from Course 3 of the DeepLearning.AI NLP Specialization on Coursera.
Trax can use TPUs on Colab seamlessly.
A sample Trax application is illustrated in a sentiment classifier for tweets.
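A sentiment classifier of this kind is commonly built as a stack of an embedding layer, a mean-pooling layer, and a dense output layer. As a rough illustration of what such a model computes, here is a hypothetical NumPy sketch of the forward pass; it is not actual Trax code (the real model would be assembled from `trax.layers`, e.g. `tl.Serial`), and all names, shapes, and the toy input are assumptions for illustration only.

```python
import numpy as np

# Hypothetical NumPy sketch of a tweet sentiment classifier's forward pass:
# embedding lookup -> mean pooling over tokens -> dense layer -> log-softmax.
# The real Trax model would be built from trax.layers instead.

rng = np.random.default_rng(0)

vocab_size, d_model, n_classes = 100, 8, 2          # toy sizes
embedding = rng.normal(size=(vocab_size, d_model))  # embedding table
w = rng.normal(size=(d_model, n_classes))           # dense-layer weights
b = np.zeros(n_classes)                             # dense-layer bias

def forward(token_ids):
    """Map a sequence of token ids to log-probabilities over classes."""
    vectors = embedding[token_ids]        # (seq_len, d_model)
    pooled = vectors.mean(axis=0)         # average over the sequence
    logits = pooled @ w + b               # (n_classes,)
    return logits - np.log(np.exp(logits).sum())  # log-softmax

log_probs = forward(np.array([3, 17, 42]))  # a toy "tweet" of 3 token ids
print(log_probs.shape)
```

In Trax the same architecture is expressed declaratively as a serial combination of layers, and the training loop, gradients, and accelerator placement are handled by the library on top of JAX.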