Release Date: March 8, 2023
BUFFALO, N.Y. — With data centers expected to consume 8% of the world’s electricity by 2030, new University at Buffalo School of Management research has uncovered a strategy to conserve power in these IT behemoths while also optimizing their data processing performance.
Available online ahead of publication in Information Systems Research, the study found that using a mix of two different resource management strategies resulted in near-optimal energy efficiency under all workload conditions.
“Data centers consume a tremendous amount of energy because they generate heat, which requires cooling services, and when the servers communicate across racks, energy consumption goes through the roof,” says study co-author Ram Ramesh, PhD, professor of management science and systems in the UB School of Management. “The rise in energy costs coupled with the urgent need to reduce the carbon footprint together create an imperative for new approaches to achieve efficiency.”
The researchers collected nearly 3 terabytes of data from a supercomputing center over a one-month period, and analyzed the power consumption and heat load, and how those translated into total energy consumption on a second-by-second basis. They then developed a model that uses observations from previous time steps to predict how the total computing load and its distribution across servers would affect total energy consumption in a data center. Using this model, they discovered the optimal way to allocate computing resources for incoming jobs.
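As a rough illustration of the idea of predicting energy use from earlier observations, the sketch below fits a simple one-step lagged linear model (energy at time t as a function of computing load at t−1). This is an illustrative stand-in, not the study’s actual model, which accounts for heat load and the distribution of work across servers.

```python
# Illustrative sketch only: predict total energy use at time t from the
# computing load observed at t-1 with a least-squares fit. The study's
# model is richer (heat load, workload distribution, multiple lags).

def fit_lagged_model(loads, energies):
    """Fit energy[t] ~ a + b * load[t-1] by ordinary least squares."""
    xs = loads[:-1]   # lagged load observations
    ys = energies[1:]  # energy at the following time step
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

def predict(a, b, current_load):
    """Forecast next-step energy consumption from the current load."""
    return a + b * current_load
```

With second-by-second observations over a month, a model of this general shape can be fit once and then used to score candidate resource allocations before jobs are placed.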
Their findings show that data center managers can reduce energy consumption by 10-30% by consolidating jobs onto as few servers as possible when workloads are high, and distributing the workload evenly across all servers when loads are low. Their methods also make it simple to find the transition point between the two strategies.
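The two-regime policy can be sketched in a few lines. This is a minimal, hypothetical illustration: the 0.6 utilization threshold, the unit-capacity server model, and the first-fit consolidation rule are all assumptions for the example, not values or mechanisms from the study, which derives the transition point from its fitted energy model.

```python
# Hypothetical sketch of a two-regime allocation policy: consolidate
# jobs onto few servers under heavy load, spread them evenly under
# light load. Threshold and server model are illustrative assumptions.

def allocate(jobs, num_servers, load_threshold=0.6):
    """Assign each job (a CPU demand in [0, 1]) to a server index."""
    utilization = sum(jobs) / num_servers
    assignment = []
    if utilization >= load_threshold:
        # High load: consolidate via first-fit onto as few servers as possible.
        server_load = [0.0] * num_servers
        for job in jobs:
            for s in range(num_servers):
                if server_load[s] + job <= 1.0:
                    server_load[s] += job
                    assignment.append(s)
                    break
            else:
                # No server has room; fall back to the least-loaded server.
                s = min(range(num_servers), key=server_load.__getitem__)
                server_load[s] += job
                assignment.append(s)
    else:
        # Low load: spread jobs round-robin across all servers.
        for i, _job in enumerate(jobs):
            assignment.append(i % num_servers)
    return assignment
```

Under light load the policy touches every server evenly; under heavy load it packs work densely, leaving idle servers that can be cooled less or powered down.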
“When we focus on energy conservation while allocating computing resources, certain jobs can take longer to complete, so there is a tradeoff between minimizing energy consumption and optimizing job performance,” says Ramesh. “By attaining an efficient balance in this tradeoff, companies can reduce their environmental impact, complete jobs on time and save millions of dollars in energy costs each year.”
Ramesh collaborated on the study with Zhiling Guo, PhD, associate professor of information systems in the School of Computing and Information Systems at Singapore Management University, and Jin Li, PhD, associate professor of information systems and intelligent business in the School of Management at Xi’an Jiaotong University. The study was supported by the National Supercomputing Centre of Singapore and the National Natural Science Foundation of China.