Cloud Experts Documentation

GPU

Deploying and Running Ollama and Open WebUI in a ROSA Cluster with GPUs

Red Hat OpenShift Service on AWS (ROSA) provides a managed OpenShift environment that can leverage AWS GPU instances. This guide walks you through deploying Ollama and Open WebUI on ROSA using GPU instances for inference. Prerequisites: a Red Hat OpenShift Service on AWS (ROSA Classic or HCP) 4.14+ cluster; the oc CLI (with cluster-admin access); the rosa CLI. Set up a GPU-enabled machine pool: first, we need to check the availability of the instance type used here (g4dn…
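The machine-pool setup described above can be sketched with the rosa CLI; the cluster name, pool name, and replica count below are illustrative assumptions, and the full instance type (e.g. g4dn.xlarge) should be checked for availability in your region first:

```bash
# List instance types available for machine pools in your account/region.
rosa list instance-types

# Create a GPU-enabled machine pool (cluster name, pool name, and sizing are assumptions).
rosa create machinepool --cluster my-rosa-cluster \
  --name gpu-pool \
  --instance-type g4dn.xlarge \
  --replicas 1
```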

Creating Images using Stable Diffusion on Red Hat OpenShift AI on ROSA cluster with GPU enabled

1. Introduction Stable Diffusion is an AI model that generates images from text descriptions. It uses a diffusion process to iteratively denoise random Gaussian noise into coherent images. This is a simple tutorial on creating images with the Stable Diffusion model using Red Hat OpenShift AI (RHOAI), formerly called Red Hat OpenShift Data Science (RHODS), our OpenShift platform for managing the AI/ML project lifecycle, running on a Red Hat OpenShift Service on AWS (ROSA) cluster, our managed OpenShift service on AWS, with NVIDIA GPUs enabled.

ROSA with Nvidia GPU Workloads

A ROSA guide to running Nvidia GPU workloads. Prerequisites: a ROSA cluster (4.14+); the rosa CLI (logged in); the oc CLI (logged in as cluster-admin); jq. If you need to install a ROSA cluster, please read our ROSA Quickstart Guide, or better yet, use Terraform to create an HCP cluster. Enter the oc login command, username, and password from the output of the previous command. Example login:

oc login https://api.cluster_name.t6k4.i1.organization.org:6443 \
  --username cluster-admin \
  --password mypa55w0rd

Login successful.
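Before following the guide, you can confirm the logged-in prerequisites from the shell; the jq filter at the end is an illustrative use of the guide's jq dependency, not a step from the guide itself:

```bash
# Verify the oc session and cluster-admin rights.
oc whoami
oc auth can-i '*' '*' --all-namespaces   # "yes" indicates cluster-admin

# Example use of jq: list the instance type of each node.
oc get nodes -o json \
  | jq -r '.items[].metadata.labels["node.kubernetes.io/instance-type"]'
```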

ROSA with Nvidia GPU Workloads - Manual

This is a guide to installing GPU support on a ROSA cluster manually, an alternative to our Helm chart guide. Prerequisites: a ROSA cluster (4.14+); the rosa CLI; the oc CLI. You can install a Classic cluster using the CLI or an HCP cluster using Terraform. Please be sure you are logged in to the cluster with cluster-admin access. 1. Setting up GPU machine pools: in this tutorial, I'm using g5…
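For the manual route, the GPU machine pool is often created with a label and taint so that only GPU workloads schedule onto the expensive nodes. A sketch under assumed names (the taint key and value shown are a common convention, not something mandated by the guide):

```bash
# Create a tainted, labeled GPU machine pool (all names and sizing are assumptions).
rosa create machinepool --cluster my-rosa-cluster \
  --name gpu-pool \
  --instance-type g5.xlarge \
  --replicas 1 \
  --labels nvidia.com/gpu.present=true \
  --taints nvidia.com/gpu=present:NoSchedule
```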

How to deploy Jupyter Notebook

Retrieve the login command: if you are not logged in via the CLI, access your cluster via the web console, then click the dropdown arrow next to your name in the top right and select Copy Login Command. A new tab will open; select the authentication method you are using (in our case, GitHub), click Display Token, and copy the command under "Log in with this token".
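The copied command generally has the following shape; the token and server values here are placeholders, not real credentials:

```bash
oc login --token=sha256~<your-token> \
  --server=https://api.<cluster-domain>:6443
```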

Installing the Open Data Hub Operator

The Open Data Hub operator is available for deployment from the OpenShift OperatorHub as a Community Operator. You can install it from the OpenShift web console: log in as a user with cluster-admin privileges. For a developer installation from try.openshift.com, including AWS and CRC, the kubeadmin user will work. Create a new project named 'jph-demo' for your installation of Open Data Hub, then find Open Data Hub in the OperatorHub catalog.
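The project-creation step can also be done from the CLI; the package name in the grep below is an assumption, used only to confirm the community operator is visible in the catalog:

```bash
# Create the project for the Open Data Hub installation.
oc new-project jph-demo

# Optionally confirm the operator appears in the OperatorHub catalog.
oc get packagemanifests -n openshift-marketplace | grep -i opendatahub
```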

Jupyter Notebooks

You will need the following prerequisites in order to run a basic Jupyter notebook with a GPU on OpenShift. 1. An OpenShift cluster: this guide assumes you have already provisioned an OpenShift cluster successfully and are able to use it. You will need to log in as cluster-admin to deploy the GPU Operator. 2. The OpenShift command-line interface: please see the OpenShift Command Line section for more information on installing it.
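Once the cluster and CLI are in place, a quick sanity check that GPUs are exposed to the scheduler (nvidia.com/gpu is the standard resource name advertised by the NVIDIA device plugin once the GPU Operator is deployed):

```bash
# Show how many GPUs each node advertises as allocatable (null means none).
oc get nodes -o json \
  | jq '.items[].status.allocatable["nvidia.com/gpu"]'
```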

ARO with Nvidia GPU Workloads

An ARO guide to running Nvidia GPU workloads. Prerequisites: the oc CLI; Helm; the jq, moreutils, and gettext packages; an ARO 4.14 cluster. Note: if you need to install an ARO cluster, please read our ARO Terraform Install Guide. Whether you are installing a new ARO cluster or using an existing one, please be sure it is 4.14.x or higher. Note: please ensure your ARO cluster was created with a valid pull secret (to verify, make sure you can see the OperatorHub in the cluster's console).
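The listed client-side tooling can be verified up front; sponge comes from the moreutils package and envsubst from gettext:

```bash
# Report any missing prerequisite tools.
for tool in oc helm jq sponge envsubst; do
  command -v "$tool" >/dev/null 2>&1 || echo "missing: $tool"
done
```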

Interested in contributing to these docs?

Collaboration drives progress. Help improve our documentation The Red Hat Way.
