Tutorials

Welcome to the UbiOps tutorials page!

The UbiOps tutorials page provides (new) users with inspiration on how to work with UbiOps. Use it to find example applications or to discover new ways of working with the UbiOps platform.

With a (free) UbiOps account you can use the tutorials to have example applications running in your own environment in minutes.*

How does it work?

We have three tutorial categories. All of our tutorials contain full walkthroughs that can be run in a Jupyter Notebook or in RStudio, except for the UI tutorials, which contain ready-to-go deployment packages that illustrate how to use a deployment package for typical cases via the WebApp.

Requirements

To be able to use the UbiOps tutorials you need the following:

  • You need to have the UbiOps client library installed. For Python this can be done via pip or via Setuptools; for more information, see our GitHub Python page. For R this can be done by installing the devtools package and then using the install_github function; for more information, see our GitHub R page. A minimal installation and connection sketch is shown below this list.

  • If you want to run Python tutorials, you need to be able to run Jupyter Notebook. See the installation guide for more information.

  • If you want to run R script tutorials, you need to be able to run RStudio. See the installation guide for more information.

  • You need to have a UbiOps account. You can create a free account here.
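For example, installing the Python client library and checking that it can reach your UbiOps account typically looks something like the sketch below. The API token and project name are placeholders you need to replace with your own values:

```python
# Install the client library first, e.g.:  pip install ubiops
import ubiops

# Placeholders: replace with your own API token and project name
API_TOKEN = "Token <YOUR_API_TOKEN>"
PROJECT_NAME = "<YOUR_PROJECT_NAME>"

configuration = ubiops.Configuration()
configuration.api_key["Authorization"] = API_TOKEN
client = ubiops.ApiClient(configuration)
api = ubiops.CoreApi(client)

# A successful call here confirms the library is installed and the token is valid
print(api.service_status())
```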

UI tutorials

The UI tutorials show how to set up your deployment package for typical use cases. You can download the deployment package, fill in the deployment creation form in the UI, and upload the deployment package. Afterwards you can make a request to the deployment to test it out.
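At the core of every deployment package is a deployment.py file with a Deployment class. A minimal sketch of such a file, using a hypothetical doubling example as the request logic and hypothetical field names, looks roughly like this:

```python
# deployment.py - the entry point inside a UbiOps deployment package
class Deployment:

    def __init__(self, base_directory, context):
        # Runs once when the deployment instance starts,
        # e.g. to load a model from the deployment package directory
        print("Initialising deployment")

    def request(self, data):
        # Runs for every request; `data` contains the input fields
        # defined in the deployment creation form
        return {"output": data["input"] * 2}
```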

Tutorials


Every tutorial is a standalone example with all the material you need to run it, centered around a Jupyter Notebook (or R script). If you download the tutorial folder and run the notebook/script, it builds the example in your own UbiOps account.
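Once a notebook has built its example, it usually finishes by sending a test request to the newly created deployment. A rough sketch of what that can look like with the Python client is shown below; the deployment name and input field are hypothetical and depend on the tutorial you run:

```python
# Assumes the `api` client and PROJECT_NAME from the connection sketch above
result = api.deployment_requests_create(
    project_name=PROJECT_NAME,
    deployment_name="example-deployment",  # hypothetical name created by a tutorial
    data={"input": 3},                     # hypothetical input field
)
print(result.result)
```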

Our current Python Tutorials

| Topic and link to tutorial | Functionalities of UbiOps addressed |
|---|---|
| Multiplying a number | UI, deployment |
| Load & run a pre-trained model | UI, training |
| Image recognition | UI, deployment |
| GPU deployment | UI, GPU instance types |
| Creating a training and production pipeline with Scikit Learn in UbiOps | Deployments, pipelines |
| Deploying a TensorFlow model in UbiOps | Deployments |
| Deploying an XGBoost model in UbiOps | Deployments |
| Inference speed of ONNX vs TensorFlow | |
| Training a Tensorflow model | Training |
| Checkpointing TensorFlow model training in UbiOps | Training |
| Retraining a PyTorch model in UbiOps | Training, Logs |
| Training an XGBoost model | Training |
| Azure Data Factory and UbiOps pipeline interaction tutorial | Integration, pipelines |
| Using Azure ML services to train a model and deploy on UbiOps | Integration, deployments |
| Triggering a deployment/pipeline request from Azure Functions | Different forms of requests, integration |
| Triggering a deployment/pipeline request from Google Cloud Functions | Different forms of requests, integration |
| Convert your MLFlow model to UbiOps deployment | Deployments |
| How to turn your deployment into a live web app using Streamlit | Deployments, Integration |
| Weights and Biases integration using FinBERT | |
| Using TensorRT in UbiOps | Deployments, Integration, Requests |
| Accelerate workflows with NVIDIA RAPIDS | Local testing, Environments, Training |
| Huggingface & BERT | Deployments, Integration, Requests, GenAI |
| Deploy Gemma 2B with streaming (CPU) | Deployment, streaming |
| Huggingface & Stable Diffusion | Deployments, Integration, Requests, GenAI |
| Fine-tuning Falcon 1B | Training, Integration, GenAI |
| Implement RAG with Langchain on UbiOps | Deployments, Integration, Requests, Pipelines, GenAI |
| Deploy vLLM server (GPU) | Deployment, OpenAI-compatibility, Request Concurrency, Streaming |
| Deploy Ollama (CPU) | Deployment, OpenAI-compatibility, Bring your own Docker, Concurrency, Streaming |
| Implement Input Guardrails | Deployment, Pipelines, OpenAI-compatibility, Streaming |
| Pipeline that matches, orders and visualises a list of Pokemon | Pipelines |
| Pipeline Tutorial | Intro tutorial |
| DSPy Pipeline Tutorial | Pipeline |
| Deploy Multi-Model Ollama | Deployment, OpenAI-compatibility, Bring your own Docker, Concurrency, Streaming, Environment Variables |
