Update readme.md (opendatahub-io#890)
* Update readme.md

* update links to the respective schemas

* use crd-ref-doc for api doc generation

* update readme docs

* update pr based on comments

* change k8s version

* remove dev preview

* update doc to get kserve value

* add @build back in makefile

* add make api-docs under ##@ Development
AjayJagan authored Mar 11, 2024
1 parent 410ea57 commit 1b86e42
Showing 19 changed files with 701 additions and 174 deletions.
15 changes: 14 additions & 1 deletion Makefile
@@ -61,14 +61,15 @@ CONTROLLER_GEN ?= $(LOCALBIN)/controller-gen
ENVTEST ?= $(LOCALBIN)/setup-envtest
OPERATOR_SDK ?= $(LOCALBIN)/operator-sdk
GOLANGCI_LINT ?= $(LOCALBIN)/golangci-lint

CRD_REF_DOCS ?= $(LOCALBIN)/crd-ref-docs
## Tool Versions
KUSTOMIZE_VERSION ?= v3.8.7
CONTROLLER_GEN_VERSION ?= v0.9.2
OPERATOR_SDK_VERSION ?= v1.24.1
GOLANGCI_LINT_VERSION ?= v1.54.0
# ENVTEST_K8S_VERSION refers to the version of kubebuilder assets to be downloaded by envtest binary.
ENVTEST_K8S_VERSION = 1.24.2
CRD_REF_DOCS_VERSION = 0.0.10

# Get the currently used golang install path (in GOPATH/bin, unless GOBIN is set)
ifeq (,$(shell go env GOBIN))
@@ -160,6 +161,11 @@ get-manifests: ## Fetch components manifests from remote git repo
./get_all_manifests.sh
CLEANFILES += odh-manifests/*

.PHONY: api-docs
api-docs: crd-ref-docs ## Creates API docs using https://github.com/elastic/crd-ref-docs
$(CRD_REF_DOCS) --source-path ./ --output-path ./docs/api-overview.md --renderer markdown --config ./crd-ref-docs.config.yaml && \
egrep -v '\.io/[^v][^1].*)$$' ./docs/api-overview.md > temp.md && mv ./temp.md ./docs/api-overview.md
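The `egrep` step above prunes generated links for API groups that lack a `v1`-style version segment. A minimal sketch of that filter outside the Makefile — the sample lines are illustrative rather than real `crd-ref-docs` output, `$$` becomes `$` outside make, and the closing parenthesis is escaped for portability across grep implementations:

```shell
#!/bin/sh
# Sample crd-ref-docs-style markdown link lines (illustrative only).
cat > /tmp/api-overview-sample.md <<'EOF'
- [DSCInitialization](#dscinitialization.opendatahub.io/v1)
- [FeatureTracker](#features.opendatahub.io/featuretracker)
EOF

# Drop lines where the ".io/" group segment is not followed by a v1-style version.
grep -Ev '\.io/[^v][^1].*\)$' /tmp/api-overview-sample.md
# The v1 link survives; the versionless FeatureTracker link is removed.
```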

##@ Build

.PHONY: build
@@ -245,6 +251,13 @@ golangci-lint: $(GOLANGCI_LINT) ## Download golangci-lint locally if necessary.
$(GOLANGCI_LINT): $(LOCALBIN)
test -s $(LOCALBIN)/golangci-lint || { curl -sSfL $(GOLANGCI_LINT_INSTALL_SCRIPT) | bash -s $(GOLANGCI_LINT_VERSION); }

CRD_REF_DOCS_DL_URL ?= 'https://github.com/elastic/crd-ref-docs/releases/download/v$(CRD_REF_DOCS_VERSION)/crd-ref-docs'
.PHONY: crd-ref-docs
crd-ref-docs: $(CRD_REF_DOCS) ## Download crd-ref-docs locally if necessary.
$(CRD_REF_DOCS): $(LOCALBIN)
test -s $(CRD_REF_DOCS) || curl -sSLo $(CRD_REF_DOCS) $(CRD_REF_DOCS_DL_URL) && \
chmod +x $(CRD_REF_DOCS) ;\

BUNDLE_DIR ?= "bundle"
.PHONY: bundle
bundle: prepare operator-sdk ## Generate bundle manifests and metadata, then validate generated files.
111 changes: 74 additions & 37 deletions README.md
@@ -5,9 +5,9 @@ and configure these applications.

### Table of contents
- [Usage](#usage)
- [Prerequisites](#prerequisites)
- [Installation](#installation)
- [Dev Preview](#dev-preview)
- [Developer Guide](#developer-guide)
- [Developer Guide](#developer-guide)
- [Pre-requisites](#pre-requisites)
- [Download manifests](#download-manifests)
- [Structure of `COMPONENT_MANIFESTS`](#structure-of-component_manifests)
@@ -20,55 +20,69 @@ and configure these applications.
- [Build Image](#build-image)
- [Deployment](#deployment)
- [Test with customized manifests](#test-with-customized-manifests)
- [Update API docs](#update-api-docs)
- [Example DSCInitialization](#example-dscinitialization)
- [Example DataScienceCluster](#example-datasciencecluster)
- [Run functional Tests](#run-functional-tests)
- [Run e2e Tests](#run-e2e-tests)
- [API Overview](#api-overview)
- [Component Integration](#component-integration)
- [Troubleshooting](#troubleshooting)
- [Upgrade testing](#upgrade-testing)

## Usage

### Prerequisites
If the `single model serving configuration` or the `Kserve` component is used, make sure to install the following operators before creating DSCI and DSC instances.
- [Authorino operator](https://github.com/Kuadrant/authorino)
- [Service Mesh operator](https://github.com/Maistra/istio-operator)
- [Serverless operator](https://github.com/openshift-knative/serverless-operator)

Additionally, installing the `Authorino operator` & `Service Mesh operator` enhances the user experience by providing single sign-on.

### Installation

The latest version of operator can be installed from the `community-operators` catalog on `OperatorHub`. It can also be build
and installed from source manually, see the Developer guide for further instructions.
- The latest version of the operator can be installed from the `community-operators` catalog on `OperatorHub`.

1. Subscribe to operator by creating following subscription
![ODH operator in OperatorHub](docs/images/OperatorHub%20ODH%20Operator.png)

```console
cat <<EOF | oc create -f -
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
name: opendatahub-operator
namespace: openshift-operators
spec:
channel: fast
name: opendatahub-operator
source: community-operators
sourceNamespace: openshift-marketplace
EOF
```
Please note that the latest releases are made in the `Fast` channel.

2. Create [DSCInitializationc](#example-dscinitialization) CR manually.
You can also use operator to create default DSCI CR by removing env variable DISABLE_DSC_CONFIG from CSV following restart operator pod.
- It can also be built and installed from source manually; see the Developer guide for further instructions.

3. Create [DataScienceCluster](#example-datasciencecluster) CR to enable components
1. Subscribe to the operator by creating the following subscription:

## Dev Preview
```console
cat <<EOF | oc create -f -
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
name: opendatahub-operator
namespace: openshift-operators
spec:
channel: fast
name: opendatahub-operator
source: community-operators
sourceNamespace: openshift-marketplace
EOF
```

Developer Preview of the new Open Data Hub operator codebase is now available.
Refer [Dev-Preview.md](./docs/Dev-Preview.md) for testing preview features.
2. Create [DSCInitialization](#example-dscinitialization) CR manually.
You can also have the operator create the default DSCI CR by removing the env variable DISABLE_DSC_CONFIG from the CSV, or by changing its value to "false", followed by restarting the operator pod.

### Developer Guide
3. Create a [DataScienceCluster](#example-datasciencecluster) CR to enable components

## Developer Guide

#### Pre-requisites

- Go version **go1.18.9**
- Go version **go1.19**
- operator-sdk version can be updated to **v1.24.1**

#### Download manifests

The `get_all_manifests.sh` script facilitates the process of fetching manifests from remote git repositories. It is configured to work with a predefined map of components and their corresponding manifest locations.
The [get_all_manifests.sh](/get_all_manifests.sh) script facilitates the process of fetching manifests from remote git repositories. It is configured to work with a predefined map of components and their corresponding manifest locations.

#### Structure of `COMPONENT_MANIFESTS`

@@ -113,8 +127,7 @@ Ensure you back up any local manifest changes you want to keep before running this command

##### To build the operator image

```
```commandline
make image-build
```

@@ -191,22 +204,29 @@ e.g. `make image-build -e IMAGE_BUILD_FLAGS="--build-arg USE_LOCAL=true"`
```commandline
operator-sdk run bundle quay.io/<username>/opendatahub-operator-bundle:<VERSION> --namespace $OPERATOR_NAMESPACE
```

### Test with customized manifests

There are two ways to test your changes with modifications:

1. set `devFlags.ManifestsUri` field of DSCI instance during runtime: this will pull down manifests from remote git repo
by using this method, it overwrites manifests and component images if images are set in the params.env file
1. Each component in the `DataScienceCluster` CR has a `devFlags.manifests` field, which can be used to pull down manifests from the remote git repos of the respective components. Using this method overwrites the manifests and creates customized resources for the respective components.

2. [Under implementation] build operator image with local manifests.
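Method 1 above can be sketched as a hypothetical override for the dashboard component — the `manifests` entry fields (`uri`, `contextDir`, `sourcePath`) follow the operator's `DevFlags` API as understood here and may differ in your version, and the tarball URL is a placeholder:

```yaml
# Illustrative sketch only - field names and the URL shape are assumptions.
apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc
spec:
  components:
    dashboard:
      managementState: Managed
      devFlags:
        manifests:
          - uri: https://github.com/<username>/odh-dashboard/tarball/<branch>
            contextDir: manifests
            sourcePath: ""
```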

### Update API docs

Whenever a new API or a new field is added to the CRD, make sure to run:
```commandline
make api-docs
```
This ensures that the API docs are updated accordingly.

### Example DSCInitialization

Below is the default DSCI CR config

```yaml
apiVersion: dscinitialization.opendatahub.io/v1
kind: DSCInitialization
apiVersion: dscinitialization.opendatahub.io/v1
metadata:
name: default-dsci
spec:
@@ -220,6 +240,10 @@ spec:
name: data-science-smcp
namespace: istio-system
managementState: Managed
trustedCABundle:
customCABundle: ''
managementState: Managed

```

Apply this example with modification for your usage.
@@ -246,17 +270,23 @@ spec:
managementState: Managed
kserve:
managementState: Managed
serving:
ingressGateway:
certificate:
type: SelfSigned
managementState: Managed
name: knative-serving
kueue:
managementState: Managed
modelmeshserving:
managementState: Managed
ray:
modelregistry:
managementState: Managed
workbenches:
ray:
managementState: Managed
trustyai:
managementState: Managed
modelregistry:
workbenches:
managementState: Managed
```

@@ -325,6 +355,13 @@ for DataScienceCluster deletion.
```shell
make e2e-test -e OPERATOR_NAMESPACE=<namespace> -e E2E_TEST_FLAGS="--skip-deletion=true"
```
### API Overview

Please refer to [api documentation](docs/api-overview.md)

### Component Integration

Please refer to [components docs](components/README.md)

### Troubleshooting

2 changes: 2 additions & 0 deletions apis/infrastructure/v1/groupversion_info.go
@@ -0,0 +1,2 @@
// +groupName=datasciencecluster.opendatahub.io
package v1
2 changes: 1 addition & 1 deletion components/README.md
Original file line number Diff line number Diff line change
@@ -10,7 +10,7 @@ To ensure a component is integrated seamlessly in the operator, follow the steps
### Add Component to DataScienceCluster API spec

DataScienceCluster CRD is responsible for defining the component fields and exposing them to end users.
Add your component to it's [api spec](https://github.com/opendatahub-io/opendatahub-operator/blob/main/apis/datasciencecluster/v1/datasciencecluster_types.go#L40):
Add your component to its [api spec](../docs/api-overview.md#datascienceclusterspec):

```go
type Components struct {
1 change: 1 addition & 0 deletions components/codeflare/codeflare.go
@@ -1,5 +1,6 @@
// Package codeflare provides utility functions to config CodeFlare as part of the stack
// which makes managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists
// +groupName=datasciencecluster.opendatahub.io
package codeflare

import (
1 change: 1 addition & 0 deletions components/component.go
@@ -1,3 +1,4 @@
// +groupName=datasciencecluster.opendatahub.io
package components

import (
1 change: 1 addition & 0 deletions components/dashboard/dashboard.go
@@ -1,5 +1,6 @@
// Package dashboard provides utility functions to config Open Data Hub Dashboard: A web dashboard that displays
// installed Open Data Hub components with easy access to component UIs and documentation
// +groupName=datasciencecluster.opendatahub.io
package dashboard

import (
1 change: 1 addition & 0 deletions components/datasciencepipelines/datasciencepipelines.go
@@ -1,5 +1,6 @@
// Package datasciencepipelines provides utility functions to config Data Science Pipelines:
// Pipeline solution for end to end MLOps workflows that support the Kubeflow Pipelines SDK and Tekton
// +groupName=datasciencecluster.opendatahub.io
package datasciencepipelines

import (
1 change: 1 addition & 0 deletions components/kserve/kserve.go
@@ -1,4 +1,5 @@
// Package kserve provides utility functions to config Kserve as the Controller for serving ML models on arbitrary frameworks
// +groupName=datasciencecluster.opendatahub.io
package kserve

import (
1 change: 1 addition & 0 deletions components/kueue/kueue.go
@@ -1,3 +1,4 @@
// +groupName=datasciencecluster.opendatahub.io
package kueue

import (
1 change: 1 addition & 0 deletions components/modelmeshserving/modelmeshserving.go
@@ -1,4 +1,5 @@
// Package modelmeshserving provides utility functions to config ModelMesh, a general-purpose model serving management/routing layer
// +groupName=datasciencecluster.opendatahub.io
package modelmeshserving

import (
1 change: 1 addition & 0 deletions components/modelregistry/modelregistry.go
@@ -1,4 +1,5 @@
// Package modelregistry provides utility functions to config ModelRegistry, an ML Model metadata repository service
// +groupName=datasciencecluster.opendatahub.io
package modelregistry

import (
1 change: 1 addition & 0 deletions components/ray/ray.go
@@ -1,5 +1,6 @@
// Package ray provides utility functions to config Ray as part of the stack
// which makes managing distributed compute infrastructure in the cloud easy and intuitive for Data Scientists
// +groupName=datasciencecluster.opendatahub.io
package ray

import (
1 change: 1 addition & 0 deletions components/trustyai/trustyai.go
@@ -1,4 +1,5 @@
// Package trustyai provides utility functions to config TrustyAI, a bias/fairness and explainability toolkit
// +groupName=datasciencecluster.opendatahub.io
package trustyai

import (
1 change: 1 addition & 0 deletions components/workbenches/workbenches.go
@@ -1,4 +1,5 @@
// Package workbenches provides utility functions to config Workbenches to secure Jupyter Notebook in Kubernetes environments with support for OAuth
// +groupName=datasciencecluster.opendatahub.io
package workbenches

import (
10 changes: 10 additions & 0 deletions crd-ref-docs.config.yaml
@@ -0,0 +1,10 @@
processor:
# RE2 regular expressions describing Group.Version that should be excluded from the generated documentation.
ignoreGroupVersions:
- "features.opendatahub.io/v1"
# RE2 regular expressions describing types that should be excluded from the generated documentation.
ignoreTypes:
- "(DataScienceCluster|DSCInitialization)List$"
render:
# Version of Kubernetes to use when generating links to Kubernetes API documentation.
kubernetesVersion: 1.24
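The `ignoreTypes` patterns above exclude the auto-generated `List` wrapper types from the rendered docs; a quick illustration of what the regex drops, using sample type names only:

```shell
#!/bin/sh
# Type names that survive the ignoreTypes filter (sample names are illustrative).
printf '%s\n' DataScienceCluster DataScienceClusterList DSCInitialization DSCInitializationList \
  | grep -Ev '(DataScienceCluster|DSCInitialization)List$'
# Prints only DataScienceCluster and DSCInitialization.
```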