Ollama

Ollama is an open-source inference server that supports a variety of generative AI models. This module also includes Open-WebUI, which provides an easy-to-use web interface.

Ollama is in an early user-testing phase, and not all functionality is guaranteed to work. Contact oschelp@osc.edu with any questions.
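
As a rough sketch of interactive use (the module name and the model name here are assumptions; check module spider ollama on the cluster for the exact module name):

  module load ollama      # load the Ollama module (name assumed)
  ollama serve &          # start the inference server in the background
  ollama run llama3       # pull the llama3 model and start an interactive chat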

Availability and Restrictions

Versions

Ollama is available on OSC clusters. The versions currently available are:

osc-seff

Introduction

osc-seff is a command developed at OSC for use on OSC's systems. It combines the CPU resource data of the seff command with the GPU resource data of gpu-seff.
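
For illustration, a typical invocation might look like the following (the module name and the job ID are hypothetical):

  module load osc-seff    # load the tool (module name assumed)
  osc-seff 12345678       # report CPU and GPU efficiency for job 12345678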

gpu-seff

Introduction

gpu-seff is a command developed at OSC for use on OSC's systems. It provides GPU resource data, similar to the CPU resource data reported by the seff command.
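
A hypothetical invocation, assuming the same job-ID interface as seff:

  module load gpu-seff    # load the tool (module name assumed)
  gpu-seff 12345678       # report GPU usage and efficiency for job 12345678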
