
Research Facilities

The Center for Computational Research, a leading academic supercomputing facility, maintains a high-performance computing environment, high-end visualization laboratories, and support staff with expertise in advanced computing, modeling and simulation, visualization, and networking. 

CCR's large production clusters currently provide more than 100 Tflops of peak compute capacity.

High Performance Computing

A leading academic supercomputing facility, CCR has more than 100 Tflops of peak compute capacity and 600 TB of high-performance storage. CCR's computing facilities, housed in a state-of-the-art 4000 sq ft machine room, include a Linux cluster with more than 8000 processor cores and a QDR InfiniBand interconnect. A subset (32) of the cluster nodes contains a total of 64 NVIDIA Tesla M2050 "Fermi" graphics processing units (GPUs). The Center also maintains several high-performance storage systems, including Isilon-based storage (320TB) and a parallel storage system from Panasas (215TB). The Center's clusters contain high-performance, low-latency InfiniBand networks, and the clusters are interconnected with a 10 gigabit Ethernet (10GigE) core network from Arista.

High End Visualization

The computer visualization laboratory features a tiled display wall and a VisDuo passive stereo system. The tiled display device was assembled to allow scientific investigation of high-resolution images by teams of scientists working in a comfortable setting. In addition, it has proven ideal for urban planning and design efforts. The tiled display wall is back-projected by nine EPSON HD projectors (edge-blended, with no seams) arranged in a 3-across by 3-high matrix, providing 15.2 megapixels of resolution. The VisDuo is a ceiling-mounted, two-projector passive stereo display used for viewing complex 3D environments, molecular structures, and medical simulations. The stereo effect is realized by each projector producing images for one eye. The output is polarized by special filters, and the resulting image is viewed on a custom polarization-preserving screen. Users can view the resulting 3D imagery by wearing lightweight polarizing glasses.


Data Storage
While users of the CCR clusters have varying data storage requirements, most agree they need large amounts of storage available from all the CCR resources. At the start of 2011, CCR put two new storage systems into production to help UB researchers. We've designed the systems to provide fast I/O for parallel applications as well as the security of high availability and reliability. We've also made sure these storage systems can be easily expanded as the storage needs of nearly all areas of research continue to increase exponentially. Please contact us to discuss your data storage needs and how we can best provide assistance for your research projects.

Isilon Network Attached Storage:

  • User disk space at CCR is contained in a high-performance Isilon Network Attached Storage (NAS) system.
  • CCR's Isilon NAS provides a total of 325TB of available disk storage to all the clusters in the Center. 
  • Directories are NFS mounted to the clusters and accessible from all compute nodes.
Isilon Network Attached Storage

  • Home directories: /user
    • Example: /user/UBIT_username/
    • Default user quota is 2GB.
    • Backed up by UB's Central Computing Enterprise Infrastructure Services department.

  • Project directories: /projects
    • Example: /projects/mygroup/
    • Additional disk space is available for research groups in the project directories.
    • Faculty interested in project disk space should contact the CCR staff.
    • The default PI (principal investigator) directory quota is 200GB.
    • Backed up by UB's Central Computing Enterprise Infrastructure Services department.
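The quota limits above can be sketched as a simple usage check. This is an illustrative snippet, not a CCR-provided tool: the directory defaults to the current one so it runs anywhere, but on the clusters you would point DIR at /user/$USER (2GB quota) or a /projects group directory (200GB quota); "mygroup" in the earlier example is a placeholder for your own group.

```shell
# Sketch: compare a directory's disk usage against its quota (in GB).
# DIR and QUOTA_GB are assumptions with safe defaults; on the CCR
# clusters you would set DIR=/user/$USER and QUOTA_GB=2, or a
# /projects directory with QUOTA_GB=200.
DIR=${DIR:-.}
QUOTA_GB=${QUOTA_GB:-2}
USED_KB=$(du -sk "$DIR" | cut -f1)          # total usage in kilobytes
USED_GB=$((USED_KB / 1024 / 1024))          # integer gigabytes
echo "$DIR uses ${USED_GB}GB of a ${QUOTA_GB}GB quota"
if [ "$USED_GB" -ge "$QUOTA_GB" ]; then
    # Scratch space is the usual overflow target; see the Panasas
    # and local scratch sections.
    echo "warning: $DIR is at or over quota" >&2
fi
```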

Panasas Parallel Network Attached Storage:

Panasas High-Speed Global Storage

The Panasas NAS provides 180TB of high-performance global scratch space. The Panasas NAS is accessible to users while running applications. This high-performance storage is recommended for applications that require significant (or parallel) I/O throughput. Please note that it is volatile storage and is not meant to serve as permanent storage.

  • /panasas/scratch
    • Accessible from all compute nodes.
    • Available to all users for temporary use.
    • Data that has not been accessed in more than a month may be subject to removal by a scrubber.
    • There is NO backup of data.
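A typical workflow on volatile scratch is to stage data into a per-run directory, compute there, and copy results home before the scrubber can remove them. The sketch below assumes the layout described above; the /tmp fallback and the input.dat file name are illustrative placeholders so the snippet also runs off-cluster.

```shell
# Sketch of a scratch-space workflow (illustrative, not a CCR script).
# On the clusters SCRATCH_ROOT would be /panasas/scratch; /tmp is an
# assumed fallback for running this sketch elsewhere.
SCRATCH_ROOT=/panasas/scratch
[ -d "$SCRATCH_ROOT" ] || SCRATCH_ROOT=/tmp
RUN_USER=${USER:-$(id -un)}
WORKDIR=$SCRATCH_ROOT/$RUN_USER/run_$$      # unique per-run directory
mkdir -p "$WORKDIR"

# Stage input data into scratch (input.dat is a hypothetical file).
cp input.dat "$WORKDIR"/ 2>/dev/null || :

# ... run the application with its heavy I/O directed at $WORKDIR ...

# Copy anything worth keeping back to permanent, backed-up storage,
# e.g. cp "$WORKDIR"/results.out /user/$RUN_USER/
# Scratch data untouched for over a month may be scrubbed, and there
# is NO backup.
```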

Local Scratch Space:

All servers and compute nodes in all the clusters have local disk space (/scratch).

  • This scratch space is available to batch jobs running on the clusters.
    • The variable $PBSTMPDIR is set to /scratch/$PBS_JOBID.
  • Data on local scratch may be subject to removal as soon as the job completes.
  • There is NO backup of data.
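A batch job can use $PBSTMPDIR (set by the scheduler to /scratch/$PBS_JOBID, as noted above) for node-local I/O. The following is a minimal sketch of such a job script, with an assumed /tmp fallback so it can be tried outside the batch system; the resource request and file names are illustrative.

```shell
#!/bin/sh
# Sketch of a PBS job script using node-local scratch (illustrative).
#PBS -l nodes=1:ppn=1,walltime=01:00:00

# Inside a job, $PBSTMPDIR points at /scratch/$PBS_JOBID; the /tmp
# fallback below is an assumption so the sketch runs interactively too.
TMPDIR=${PBSTMPDIR:-/tmp/scratch_demo_$$}
mkdir -p "$TMPDIR"
cd "$TMPDIR"

# ... run the application with its temporary files on the local disk ...

# Copy results back to a permanent location before the job ends, because
# local scratch may be cleaned as soon as the job completes and is NOT
# backed up, e.g. cp results.out /user/$USER/
```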

A breakdown of the local scratch space for the U2 (general compute) cluster is available on the CCR website.