Ollama
Ollama is an open-source inference server supporting a number of generative AI models. This module also includes Open-WebUI, which provides an easy-to-use web interface.
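As a minimal sketch of a typical interactive session (the module name, version, and model tag below are assumptions; verify what is installed with module spider ollama):

# Load the Ollama module (name assumed; check "module spider ollama")
module load ollama
# Start the inference server in the background, then pull and query a model
ollama serve &
ollama run llama3.2 "Summarize the key points of this paragraph."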
Availability and Restrictions
Versions
Ollama is available on OSC clusters. The versions currently available at OSC are:
starccm/19.06.009 is now available on Cardinal
STAR-CCM+ can be loaded with the following modules on Cardinal:
module load starccm/19.06.009
module load starccm/19.06.009-mixed
module load starccm/19.06.009-hbm
module load starccm/19.06.009-mixed-hbm
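To confirm which of these builds are installed before loading one (a quick sketch; output varies by cluster), Lmod's spider command lists every available STAR-CCM+ module:

# List all STAR-CCM+ modules installed on the cluster
module spider starccm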
Ascend is open to all
bedtools2 2.31.0 is now available on Cardinal
It can be loaded via module load bedtools2/2.31.0
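As an illustrative sketch of a typical run after loading the module (the input file names are placeholders):

module load bedtools2/2.31.0
# Report features in regions.bed that overlap features in targets.bed
bedtools intersect -a regions.bed -b targets.bed > overlaps.bed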
Intel oneAPI Toolkit 2025 Update on Cardinal
The following components and versions are available on Cardinal:
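To discover the installed oneAPI components on your own (a sketch; the search terms are assumptions and module names vary by cluster), Lmod's spider command can search the module tree:

# Search the module tree for Intel oneAPI components
module spider intel
module spider mkl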
osc-seff
Introduction
osc-seff is a command developed at OSC for use on OSC systems. It combines the CPU resource data of the seff command with the GPU resource data of gpu-seff.
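A sketch of typical use, assuming osc-seff takes a job ID the same way seff does (the job ID below is a placeholder):

# Report combined CPU and GPU efficiency for a completed job
osc-seff 1234567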
gpu-seff
Introduction
gpu-seff is a command developed at OSC for use on OSC systems. It reports GPU resource data, similar to the CPU resource data reported by the seff command.
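A sketch of typical use, again assuming a seff-style job-ID argument (the job ID is a placeholder):

# Report GPU efficiency for a completed job
gpu-seff 1234567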