Ollama
Ollama is an open-source inference server that supports a number of generative AI models. This module also includes Open WebUI, which provides an easy-to-use web interface for interacting with those models.
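Once an Ollama server is running, it can also be queried programmatically over its REST API. The sketch below is a minimal illustration, assuming the server is listening on the default port 11434 on the same node and that a model named "llama3" has already been pulled; the hostname, port, and model name are assumptions and should be adjusted for your session.

```python
# Minimal sketch: sending a single prompt to a running Ollama server
# via its /api/generate REST endpoint (default port 11434).
import json
import urllib.request

payload = json.dumps({
    "model": "llama3",                 # assumed model name; use one available on your system
    "prompt": "Say hello in one sentence.",
    "stream": False,                   # return one JSON object instead of a token stream
}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:11434/api/generate",   # assumed host/port for the local server
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

print(result["response"])              # the generated text
```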
Availability and Restrictions
Versions
Ollama is available on OSC clusters. The versions currently available at OSC are: