tf_multimodal: Imbalanced Multi-class Classification for Multi-modal Data

tf_multimodal is a deep learning classifier for Imbalanced Multi-class & Multi-modal Data.

Why tf_multimodal

tf_multimodal aims to be:

  • easy to use: no boilerplate code required and a simple API to follow
  • easy to customize: Lego-like modules, ready to be mixed and matched
  • end-to-end: raw data in, predictions out
  • stable and scalable for production

Essential features

  • imbalanced multiclass classification: tf_multimodal provides several options for handling imbalanced learning, including output-bias initialization, class weights, resampling, and focal loss (a minimal sketch follows this list).

  • feature encoding: tf_multimodal automates the encoding of all selected features, making it a breeze for large-scale experimentation.

  • feature dtypes: tf_multimodal provides helpers that automatically separate features of different data types and apply the corresponding encoder for each type: cnt (continuous), cat (categorical), txt (text), img (image), and dt (datetime).

  • feature selection: tf_multimodal provides tools to select relevant features; this release includes L1 regularization and variable selection networks (VSN).

  • feature crossing: tf_multimodal provides tools for feature crossing (i.e. feature interactions); this release includes DeepCrossLayer and vsn_DeepCrossLayer.
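
To make the imbalanced-learning options concrete, here is a minimal tf.keras sketch of two of them, output-bias initialization and class weights, on hypothetical toy data (the data, model, and names are illustrative, not tf_multimodal's API; resampling and focal loss are omitted for brevity):

```python
import numpy as np
import tensorflow as tf

# Hypothetical 3-class data with an 80/15/5 class split.
X = np.random.randn(1000, 8).astype("float32")
y = np.random.choice(3, size=1000, p=[0.80, 0.15, 0.05])

# Option 1: initialize the output bias to the log class priors, so the
# model starts out predicting the observed class distribution.
priors = np.bincount(y) / len(y)
bias_init = tf.keras.initializers.Constant(np.log(priors))

model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax", bias_initializer=bias_init),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Option 2: inverse-frequency class weights re-balance the loss so that
# rare classes contribute as much as frequent ones.
class_weight = {c: len(y) / (3 * n) for c, n in enumerate(np.bincount(y))}
model.fit(X, y, epochs=2, class_weight=class_weight, verbose=0)
```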

tf_multimodal is under active development and welcomes your contributions. For recent updates, please visit https://github.com/wjlgatech/tf_multimodal

Dev Notes

  • this package is developed using nbdev_colab
  • tf_multimodal is a companion project to fast_tfrs, an easy-to-use, easy-to-build recommendation engine based on tfrs

Features Built

  • [5/5] preprocess and encode cnt_cols: normalization and bucketization (see the encoding sketch after this list)

  • [5/5] preprocess and encode cat_cols, covering both int_cat_cols and str_cat_cols (see the encoding sketch after this list)

  • [5/5] preprocess and encode txt_cols: two methods implemented, 1. LSTM and 2. BERT (the LSTM path is sketched after this list)

  • [5/5] preprocess and encode img_cols

  • [5/5] Situation: how do we combine information from columns of different dtypes, given that each dtype has a different number of columns (e.g. 5 cnt_cols, 10 cat_cols, 2 txt_cols, 2 img_cols) and each dtype's columns can have very different embedding widths (e.g. cnt_cols have emb_width 1 while a cat_col has emb_width 16)? The information from a column with a narrow emb_width can be overwhelmed by a column with a wide emb_width. Solution: instead of a simple concat, add a deep-wide module in which the deep branch squeezes the columns with dense representations (e.g. embeddings 16 wide) while the wide branch processes the sparse columns; the outputs of the deep branch and the wide branch are then combined. This way, features of various embedding widths are normalized so that their contributions can be backtracked and compared (see the deep-wide sketch after this list).

  • [5/5] automate the separation of cnt, cat, and txt columns in a DataFrame (a heuristic sketch follows this list)

  • [5/5] OneTower structure: concat the embeddings of txt_cols, cnt_cols, and cat_cols for a DataFrame

  • [5/5] allow txt_, cnt_, and cat_cols to interact via the deep cross module (sketched after this list)

  • [5/5] accelerate hyperparameter tuning with the tf.keras-based KerasTuner (https://keras.io/keras_tuner/) by building 3 tuners: RandomSearch, BayesianOptimization, and Hyperband (sketched after this list)

  • [5/5] to experiment with various downstream learning architectures (e.g. simple, deep-wide-cross, DCN, two-tower), simplify and modularize the preprocessing and encoding of cnt, cat, txt, and img columns

  • [5/5] to switch automatically between binary and multiclass classification, build unified deep learning utilities, including label-column preprocessing (multi-hot encoding), a loss function (tf.keras.losses.CategoricalCrossentropy), and custom metrics (accuracy, F1, ROC-AUC, PR-AUC, confusion matrix, avg_precision, avg_recall); a metrics sketch follows this list

  • [5/5] wide, deep, and cross networks on concat_emb (https://keras.io/examples/structured_data/wide_deep_cross_networks/); the crossing step is sketched after this list

    • cat dense embeddings vs. cat sparse embeddings
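
A rough sketch of how the automated cnt/cat/txt column separation could work; the heuristic and the txt_min_unique threshold are illustrative assumptions, not tf_multimodal's exact rule:

```python
import pandas as pd

def separate_columns(df: pd.DataFrame, txt_min_unique: int = 100):
    """Split df columns into cnt / cat / txt by dtype and cardinality."""
    cnt_cols, cat_cols, txt_cols = [], [], []
    for col in df.columns:
        if pd.api.types.is_numeric_dtype(df[col]):
            cnt_cols.append(col)       # numeric -> continuous
        elif df[col].nunique() >= txt_min_unique:
            txt_cols.append(col)       # high-cardinality strings -> free text
        else:
            cat_cols.append(col)       # low-cardinality strings -> categorical
    return cnt_cols, cat_cols, txt_cols
```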
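
A minimal sketch of the cnt_cols and cat_cols encoding steps using standard Keras preprocessing layers; the toy columns and embedding width are made up, and int_cat_cols would use IntegerLookup in place of StringLookup:

```python
import numpy as np
import tensorflow as tf

# Toy stand-ins for one cnt_col and one str_cat_col.
age = np.array([[23.0], [45.0], [31.0], [60.0]], dtype="float32")
city = np.array([["sf"], ["nyc"], ["sf"], ["la"]])

# cnt_col: normalization plus bucketization.
norm = tf.keras.layers.Normalization()
norm.adapt(age)
buckets = tf.keras.layers.Discretization(bin_boundaries=[30.0, 50.0])

# str_cat_col: vocabulary lookup followed by a trainable embedding.
lookup = tf.keras.layers.StringLookup()
lookup.adapt(city)
embed = tf.keras.layers.Embedding(input_dim=lookup.vocabulary_size(),
                                  output_dim=16)

age_norm = norm(age)              # (4, 1) standardized values
age_bucket = buckets(age)         # (4, 1) integer bucket ids
city_emb = embed(lookup(city))    # (4, 1, 16) embedding vectors
```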
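
The LSTM method for txt_cols can be sketched with built-in Keras layers as below; the vocabulary size and sequence length are illustrative, and the BERT method would swap in a pretrained encoder:

```python
import tensorflow as tf

texts = tf.constant([["good product"], ["terrible service"], ["okay overall"]])

# Tokenize raw strings into fixed-length integer sequences.
vectorize = tf.keras.layers.TextVectorization(max_tokens=10_000,
                                              output_sequence_length=32)
vectorize.adapt(texts)

inp = tf.keras.Input(shape=(1,), dtype=tf.string)
x = vectorize(inp)
x = tf.keras.layers.Embedding(input_dim=10_000, output_dim=32)(x)
txt_emb = tf.keras.layers.LSTM(32)(x)   # LSTM encoder -> 32-d text embedding
encoder = tf.keras.Model(inp, txt_emb)
```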
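
A minimal functional-API sketch of the deep-wide combination described in the Situation/Solution item; deep_wide_combine, the input widths, and unit_width are illustrative assumptions, not tf_multimodal's actual module:

```python
import tensorflow as tf

def deep_wide_combine(dense_embs, sparse_feats, unit_width=8):
    # Deep branch: squeeze each wide dense embedding (e.g. width 16)
    # down to a common unit_width so no single modality dominates.
    squeezed = [tf.keras.layers.Dense(unit_width, activation="relu")(e)
                for e in dense_embs]
    deep = tf.keras.layers.Concatenate()(squeezed)

    # Wide branch: pass the sparse/narrow features (e.g. width-1
    # cnt_cols) through one linear layer.
    wide = tf.keras.layers.Dense(unit_width)(
        tf.keras.layers.Concatenate()(sparse_feats))

    # Combine both branches; contributions now sit on comparable scales.
    return tf.keras.layers.Concatenate()([deep, wide])

cat_emb = tf.keras.Input(shape=(16,))   # e.g. a cat_col embedding
txt_emb = tf.keras.Input(shape=(32,))   # e.g. a txt_col embedding
cnt_a = tf.keras.Input(shape=(1,))      # narrow continuous features
cnt_b = tf.keras.Input(shape=(1,))

merged = deep_wide_combine([cat_emb, txt_emb], [cnt_a, cnt_b])
out = tf.keras.layers.Dense(3, activation="softmax")(merged)
model = tf.keras.Model([cat_emb, txt_emb, cnt_a, cnt_b], out)
```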
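
The deep cross module follows the DCN-style crossing step from the wide-deep-cross example linked above, x_{l+1} = x0 * (W x_l + b) + x_l; a compact sketch, with CrossLayer as an illustrative stand-in for DeepCrossLayer:

```python
import tensorflow as tf

class CrossLayer(tf.keras.layers.Layer):
    """One DCN-style crossing step: x_{l+1} = x0 * (W x_l + b) + x_l."""
    def build(self, input_shape):
        dim = input_shape[0][-1]          # feature width of concat_emb
        self.dense = tf.keras.layers.Dense(dim)

    def call(self, inputs):
        x0, xl = inputs                   # original input and current level
        return x0 * self.dense(xl) + xl

x0 = tf.keras.Input(shape=(24,))          # e.g. concat_emb of txt/cnt/cat
x = CrossLayer()([x0, x0])                # first-order feature crosses
x = CrossLayer()([x0, x])                 # second-order feature crosses
out = tf.keras.layers.Dense(3, activation="softmax")(x)
model = tf.keras.Model(x0, out)
```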
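
The three tuners share the same KerasTuner interface, so swapping them is a one-line change; a minimal sketch with an illustrative search space:

```python
import keras_tuner as kt
import tensorflow as tf

def build_model(hp):
    # Search over layer width and learning rate (illustrative choices).
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(8,)),
        tf.keras.layers.Dense(hp.Int("units", 16, 128, step=16),
                              activation="relu"),
        tf.keras.layers.Dense(3, activation="softmax"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(hp.Choice("lr", [1e-2, 1e-3, 1e-4])),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"])
    return model

# Any of the three tuners plugs into the same search interface.
tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=5)
# tuner = kt.BayesianOptimization(build_model, objective="val_accuracy", max_trials=5)
# tuner = kt.Hyperband(build_model, objective="val_accuracy", max_epochs=10)
# tuner.search(X_train, y_train, validation_data=(X_val, y_val))
```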
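
The unified loss/metric setup can be sketched with built-in tf.keras pieces as below; F1 and the confusion matrix require custom metrics, which are omitted here:

```python
import tensorflow as tf

num_classes = 3  # the same setup covers binary classification (num_classes=2)

loss = tf.keras.losses.CategoricalCrossentropy()
metrics = [
    tf.keras.metrics.CategoricalAccuracy(name="accuracy"),
    tf.keras.metrics.AUC(name="roc_auc", multi_label=True,
                         num_labels=num_classes),
    tf.keras.metrics.AUC(name="pr_auc", curve="PR", multi_label=True,
                         num_labels=num_classes),
    tf.keras.metrics.Precision(name="avg_precision"),
    tf.keras.metrics.Recall(name="avg_recall"),
]
# model.compile(optimizer="adam", loss=loss, metrics=metrics)
```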

Features to Build

References & Credits:
