Two dedicated compute nodes host remote visualization capabilities for CCR users.
The remote visualization nodes are part of the CCR general computing cluster (rush). These nodes can be accessed from outside the cluster using the NICE Remote Visualization software suite. Access restrictions are the same as for the general CCR compute cluster: if you are not on campus, you must connect through the VPN.
Each node has 12 cores, 256 GB of memory, and two NVIDIA Tesla M2075 GPUs.
The visualization nodes are integrated with Slurm, so each allocation of these resources starts a Slurm batch job. These jobs are limited to 24 hours of wall time. When the limit is reached, your remote visualization session is terminated and the resources are released for other users.
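Because each visualization session runs as a Slurm job, you can check how much of the 24-hour limit remains with standard Slurm commands. Below is a minimal sketch: the helper function `slurm_time_left_minutes` (a hypothetical name, not part of Slurm) converts the `TIME_LEFT` string that `squeue` prints (e.g. `1-00:00:00` or `59:30`) into minutes, assuming the `D-HH:MM:SS` / `HH:MM:SS` / `MM:SS` forms Slurm uses.

```shell
#!/usr/bin/env bash
# Hypothetical helper: convert a Slurm TIME_LEFT value into total minutes.
# Slurm prints D-HH:MM:SS, HH:MM:SS, or MM:SS depending on how much time is left.
slurm_time_left_minutes() {
    local t="$1" days=0 h m s
    case "$t" in
        *-*) days="${t%%-*}"; t="${t#*-}" ;;   # split off the day count, if any
    esac
    IFS=: read -r h m s <<< "$t"
    if [ -z "$s" ]; then                        # only two fields: MM:SS form
        s="$m"; m="$h"; h=0
    fi
    # 10# forces base-10 so zero-padded fields like "08" are not read as octal
    echo $(( days * 1440 + 10#$h * 60 + 10#$m ))
}

# On the cluster you would feed this from squeue, for example:
#   squeue -u "$USER" -h -o "%L" | while read -r left; do
#       slurm_time_left_minutes "$left"
#   done
slurm_time_left_minutes "1-00:00:00"   # full 24-hour limit: 1440 minutes
slurm_time_left_minutes "0:30:00"
```

You can also end a session early with `scancel <jobid>` once you have the job ID from `squeue`, which frees the node (and the visualization license) for other users.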
Use these nodes when you need to run an OpenGL application GUI with access to CCR cluster resources.
The procedure described below allows OpenGL software to run in hardware-accelerated mode on the visualization nodes while displaying the application on your local workstation. In terms of performance, the application should behave as if it were running locally. A side benefit of this scheme is that the connection is VNC based, so you can disconnect from and reconnect to the display at any time without terminating your job.
The nodes run the NICE Remote Visualization software suite. Use the following procedure to install the client software and connect to the server.
IMPORTANT: Please close your session from the "Sessions" tab of the webpage when you are done. We have a limited number of licenses for the remote visualization software.