osc-seff
Introduction
osc-seff is a command developed at OSC for use on OSC's systems and combines the CPU resource data of the seff command with the GPU resource data of gpu-seff.
gpu-seff is a command developed at OSC for use on OSC's systems and provides GPU resource data, similar to the CPU resource data reported by the seff command.
It is strongly suggested that users compare their jobs' memory use to the available per-core memory when requesting OSC resources.
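As a hedged illustration, assuming osc-seff and gpu-seff follow the same invocation pattern as seff (taking a completed Slurm job ID as their argument), a job's CPU, memory, and GPU utilization could be checked as follows (the job ID shown is a placeholder):

```
# CPU and memory efficiency report for a completed job (standard Slurm seff)
seff 12345678

# GPU utilization report (OSC tool; assumed to take a job ID like seff)
gpu-seff 12345678

# Combined CPU and GPU report (OSC tool; assumed to take a job ID like seff)
osc-seff 12345678
```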
The following are technical specifications for Cardinal.
- Number of nodes: 378
- Number of CPU sockets: 756 (2 sockets/node for all nodes)
- Number of CPU cores: 39,312
- Cores per node: 104 cores/node for all nodes (96 usable)
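For example, a minimal Slurm batch sketch requesting all 96 usable cores on a single Cardinal node might look like the following (the account name and executable are placeholders, not OSC-specific values):

```
#!/bin/bash
#SBATCH --job-name=cardinal_full_node
#SBATCH --account=PAS1234        # placeholder project account
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=96     # 96 usable cores per Cardinal node
#SBATCH --time=01:00:00

srun ./my_application            # placeholder executable
```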
This page includes a summary of differences to keep in mind when migrating jobs from other clusters to Ascend.
|  | Ascend (PER NODE) | Pitzer (PER NODE) |
|---|---|---|
| Regular compute node | n/a | 40 cores and 192GB of RAM, or 48 cores and 192GB of RAM |
In October 2022, OSC retired the Data Direct Networks (DDN) GRIDScaler system deployed in 2016 and expanded the IBM Elastic Storage System (ESS) for both Project and global Scratch services. This expanded the total capacity of Project and Scratch storage at OSC to ~16 petabytes with better performance.
In December 2021 OSC updated its firewall to enhance security. As a result, SSH sessions are being closed more quickly than they used to be. It is very easy to modify your SSH options in the client you use to connect to OSC to keep your connection open.
In ~/.ssh/config (use the command touch ~/.ssh/config to create it if there is no existing one), you can set 3 options:
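A hedged example of such a configuration, assuming the three standard OpenSSH client keep-alive options (the interval and count values shown are illustrative, not OSC-mandated settings):

```
# ~/.ssh/config
Host *
    # Send an application-level keep-alive message after 60 seconds of inactivity
    ServerAliveInterval 60
    # Allow up to 5 unanswered keep-alive messages before closing the connection
    ServerAliveCountMax 5
    # Also enable TCP-level keep-alive packets
    TCPKeepAlive yes
```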
This document is obsolete and is kept as a reference for the previous Owens programming environment. Please refer here for the latest version.