Cloud Experts Documentation

AI / ML

Deploying and Running Ollama and Open WebUI in a ROSA Cluster with GPUs

Red Hat OpenShift Service on AWS (ROSA) provides a managed OpenShift environment that can leverage AWS GPU instances. This guide walks you through deploying Ollama and Open WebUI on ROSA using GPU instances for inference.

Prerequisites

A Red Hat OpenShift Service on AWS (ROSA Classic or HCP) 4.14+ cluster
OC CLI (admin access to the cluster)
ROSA CLI

Set up a GPU-enabled Machine Pool

First, we need to check the availability of the instance type used here (g4dn.
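The machine pool setup described above might look like the following sketch using the ROSA CLI. The cluster name, machine pool name, replica count, and the g4dn.xlarge instance size are illustrative assumptions, not values taken from the original guide; adjust them for your environment.

```shell
# Check which instance types are available to your account and region,
# filtering for the g4dn GPU family (assumption: g4dn is the target family).
rosa list instance-types | grep g4dn

# Create a GPU-enabled machine pool on the cluster.
# <cluster-name>, gpu-pool, and the replica count are placeholders.
rosa create machinepool \
  --cluster=<cluster-name> \
  --name=gpu-pool \
  --replicas=1 \
  --instance-type=g4dn.xlarge
```

Once the machine pool is provisioned, the new GPU nodes should appear via `oc get nodes`, after which GPU-dependent operators and workloads can be scheduled onto them.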

