Important
📢 KerasNLP is now KerasHub! 📢 Read the announcement.
We have renamed the repo to KerasHub in preparation for the release, but have not yet released the new package. Follow the announcement for news.
KerasHub is a library that supports natural language processing, computer vision, audio, and multimodal backbones and task models, working natively with TensorFlow, JAX, or PyTorch. KerasHub provides a repository of pre-trained models and a collection of lower-level building blocks for these tasks. Built on Keras 3, models can be trained and serialized in any framework and re-used in another without costly migrations.
This library is an extension of the core Keras API; all high-level modules are Layers and Models that receive the same level of polish as core Keras. If you are familiar with Keras, congratulations! You already understand most of KerasHub.
All models support JAX, TensorFlow, and PyTorch from a single model definition and can be fine-tuned on GPUs and TPUs out of the box. Models can be trained on individual accelerators with built-in PEFT techniques, or fine-tuned at scale with model and data parallel training. See our Getting Started guide to start learning our API. Browse our models on Kaggle. We welcome contributions.
Fine-tune a BERT classifier on IMDb movie reviews:
import os
os.environ["KERAS_BACKEND"] = "jax" # Or "tensorflow" or "torch"!
import keras_hub
import tensorflow_datasets as tfds
imdb_train, imdb_test = tfds.load(
    "imdb_reviews",
    split=["train", "test"],
    as_supervised=True,
    batch_size=16,
)
# Load a BERT model.
classifier = keras_hub.models.Classifier.from_preset(
    "bert_base_en",
    num_classes=2,
    activation="softmax",
)
# Fine-tune on IMDb movie reviews.
classifier.fit(imdb_train, validation_data=imdb_test)
# Predict two new examples.
classifier.predict(["What an amazing movie!", "A total waste of my time."])
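The built-in PEFT techniques mentioned above can be sketched with LoRA on the same classifier. This is a minimal, hedged sketch: whether a given backbone exposes enable_lora is an assumption to verify for the specific model you load, and the rank is illustrative.
# A hedged sketch of parameter-efficient fine-tuning (LoRA).
# Assumes the loaded backbone supports enable_lora; verify for your model.
classifier.backbone.enable_lora(rank=4)  # Freeze base weights; train small rank-4 adapters.
classifier.fit(imdb_train, validation_data=imdb_test)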
Try it out in a Colab. For more in-depth guides and examples, visit keras.io/keras_hub.
To try out the latest version of KerasHub, you can use our nightly package:
pip install keras-hub-nightly
KerasHub currently requires TensorFlow to be installed for use of the tf.data API for preprocessing. Even when preprocessing with tf.data, training can still happen on any backend.
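As a minimal sketch of that split (the toy dataset and batch size below are illustrative), the input pipeline can stay in tf.data while training runs on the JAX backend:
# A minimal sketch: tf.data handles the input pipeline, while training runs on
# the backend selected via KERAS_BACKEND. The toy dataset is illustrative.
import os
os.environ["KERAS_BACKEND"] = "jax"
import tensorflow as tf
import keras_hub

ds = tf.data.Dataset.from_tensor_slices(
    (["What an amazing movie!", "A total waste of my time."], [1, 0])
).batch(2)
classifier = keras_hub.models.Classifier.from_preset("bert_base_en", num_classes=2)
classifier.fit(ds)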
Read Getting started with Keras for more information on installing Keras 3 and compatibility with different frameworks.
Important
We recommend using KerasHub with TensorFlow 2.16 or later, as TF 2.16 packages Keras 3 by default.
If you have Keras 3 installed in your environment (see installation above), you can use KerasHub with any of JAX, TensorFlow, and PyTorch. To do so, set the KERAS_BACKEND environment variable. For example:
export KERAS_BACKEND=jax
Or in Colab, with:
import os
os.environ["KERAS_BACKEND"] = "jax"
import keras_hub
Important
Make sure to set the KERAS_BACKEND environment variable before importing any Keras libraries; it will be used to set up Keras when it is first imported.
We follow Semantic Versioning, and plan to provide backwards compatibility guarantees both for code and saved models built with our components. While we continue with pre-release 0.y.z development, we may break compatibility at any time and APIs should not be considered stable.
KerasHub provides access to pre-trained models via the keras_hub.models API. These pre-trained models are provided on an "as is" basis, without warranties or conditions of any kind. The following underlying models are provided by third parties and subject to separate licenses: BART, BLOOM, DeBERTa, DistilBERT, GPT-2, Llama, Mistral, OPT, RoBERTa, Whisper, and XLM-RoBERTa.
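For example, a third-party generative model such as GPT-2 can be loaded through the same preset API; this is a hedged sketch, and the preset name below is a commonly published checkpoint identifier rather than a guarantee:
# A hedged sketch of loading a third-party pre-trained model through
# keras_hub.models; the preset name is illustrative.
import keras_hub

causal_lm = keras_hub.models.CausalLM.from_preset("gpt2_base_en")
causal_lm.generate("KerasHub makes it easy to", max_length=30)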
If KerasHub helps your research, we appreciate your citations. Here is the BibTeX entry:
@misc{kerashub2024,
  title={KerasHub},
  author={Watson, Matthew and Chollet, Fran\c{c}ois and Sreepathihalli,
  Divyashree and Saadat, Samaneh and Sampath, Ramesh and Rasskin, Gabriel
  and Zhu, Scott and Singh, Varun and Wood, Luke and Tan, Zhenyu and Stenbit,
  Ian and Qian, Chen and Bischof, Jonathan and others},
  year={2024},
  howpublished={\url{https://github.com/keras-team/keras-hub}},
}
Thank you to all of our wonderful contributors!