The Center for Computational Research, a leading academic supercomputing facility, maintains a high-performance computing environment, high-end visualization laboratories, and support staff with expertise in advanced computing, modeling and simulation, visualization, and networking.
The Center’s extensive computing facilities, which are
housed in a state-of-the-art 4000 sq ft machine room, include a
Linux cluster, accessible to all UB researchers, with more than
8000 processor cores and QDR InfiniBand; 32 of its nodes contain
a total of 64 NVIDIA Tesla M2050 “Fermi” graphics processing
units (GPUs). Industrial partners of the University have access
to a cluster with more than 3400 processor
cores and FDR InfiniBand. The Center maintains a 3 PB IBM GPFS
high-performance parallel file system. The computer
visualization laboratory features a tiled display wall and a
VisDuo passive stereo system. CCR has more than 170 Tflops of
peak compute capacity. The Center additionally hosts a number
of clusters and specialized storage devices for specific
departments, projects, and collaborations; interested researchers
should contact CCR staff for details.
UB CCR's research cloud, nicknamed Lake Effect, is a
subscription-based Infrastructure as a Service (IaaS) cloud that
provides root-level access to virtual servers and storage on
demand. This allows CCR to provide tech-savvy researchers with
hardware outside the compute clusters, which can be used for
testing software and databases, running websites for research
projects, conducting proof-of-concept studies for grant funding,
and many other activities that benefit research. The CCR cloud
is fully compatible with the Amazon Web Services (AWS) APIs,
allowing users to move between the two services.
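Because the cloud exposes AWS-compatible APIs, standard AWS tooling can typically be pointed at it by overriding the default service endpoint. A minimal sketch using the AWS CLI follows; the endpoint URL, image ID, and instance type shown are illustrative placeholders, and actual values come from your CCR cloud account:

```shell
# Hypothetical credentials and endpoint; real values are issued by CCR.
export AWS_ACCESS_KEY_ID="<your-ccr-access-key>"
export AWS_SECRET_ACCESS_KEY="<your-ccr-secret-key>"

# List running instances by overriding the default AWS endpoint.
aws ec2 describe-instances \
    --endpoint-url https://cloud.example.edu:8773/services/compute

# Launch a virtual server from an image ID obtained from the cloud dashboard.
aws ec2 run-instances \
    --endpoint-url https://cloud.example.edu:8773/services/compute \
    --image-id <image-id> --instance-type m1.small
```

Because the same `aws` commands work against Amazon's public endpoints, workloads prototyped on the research cloud can later be moved to AWS with minimal changes.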
The computer visualization laboratory features a tiled display
wall and a VisDuo passive stereo system. The tiled display
device was assembled for the purpose of allowing scientific
investigations of high-resolution images by teams of scientists
working in a comfortable setting. In addition, it has proven to be
ideal for urban planning and design efforts. The tiled-display wall
is back-projected by nine EPSON HD projectors (edge-blended with
no seams) arranged in a 3×3 matrix, providing 15.2 megapixels of
resolution. The VisDuo is a ceiling-mounted, two-projector
passive stereo display used for viewing
complex 3D environments, molecular structures, and medical
simulations. The stereo effect is realized by each projector
producing images for one eye. The output is polarized by special
filters and the resulting image is viewed on a custom polarization
preserving screen. Users can view the resulting 3D imagery by
wearing lightweight polarizing glasses.
Remote Visualization: CCR offers dedicated compute nodes that host remote visualization capabilities for CCR users who require an OpenGL application GUI with access to the CCR cluster resources.
While users of the CCR clusters may have varying degrees of data
storage requirements, most agree they need large amounts of storage
and they want it available from all the CCR resources. At the
beginning of 2015, CCR put a new 3 PB (petabyte) IBM GPFS storage
system into production to help UB researchers. This storage
is CCR's high-performance parallel file system. With
40 GigE connections to the core Arista network, it provides
sustained I/O performance in excess of 30 GB/s in tests.
Designed for high performance and concurrent access, CCR’s GPFS
is primarily intended for the generation and analysis of large
quantities of short-lived data (scratch usage).
In December 2015, CCR deployed an EMC Isilon NAS storage solution that serves as the high-reliability core storage for user home and group project directories. The system consists of 1 PB of usable storage in a hierarchical storage pool connected to the CCR core network with two 10 GigE links per server. The storage is designed to tolerate simultaneous failures, helping ensure the 24x7x365 availability of the Center's primary storage.
Please contact us to discuss your data storage needs and how we can best provide assistance for your research projects.
CCR provides 500 TB of high-performance global scratch space.
In addition, all servers and compute nodes in all the clusters have local scratch disk space (/scratch).
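As a sketch of how the two scratch tiers are typically used together, the batch script below stages data to fast node-local /scratch for I/O-intensive work, then copies results back to the globally visible parallel scratch space. The paths, scheduler directives, and directory layout are illustrative assumptions, not CCR-specific documentation; consult CCR staff for the actual conventions:

```shell
#!/bin/bash
#SBATCH --job-name=scratch-demo   # illustrative Slurm directives; check CCR docs
#SBATCH --time=01:00:00           # for actual partition names and limits

# Hypothetical paths: node-local scratch vs. global parallel scratch.
LOCAL=/scratch/${SLURM_JOB_ID:-$$}   # fast local disk, typically wiped after the job
GLOBAL=/gpfs/scratch/$USER/results   # shared space, visible from all cluster nodes

mkdir -p "$LOCAL" "$GLOBAL"

# Stage input locally, compute against the fast local copy, then copy results out.
cp "$GLOBAL/input.dat" "$LOCAL/" 2>/dev/null || true
( cd "$LOCAL" && echo "processed on $(hostname)" > output.dat )
cp "$LOCAL/output.dat" "$GLOBAL/"

# Clean up the node-local copy.
rm -rf "$LOCAL"
```

Staging to local disk avoids hammering the shared parallel file system with many small I/O operations, while the global scratch space makes the results reachable from any node for post-processing.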