Designers of data center services have long struggled with how best to meet the diverse quality-of-experience requirements of individual users while allowing optimized and fair sharing of computing and networking resources among users.
Hao Che, a professor in the Department of Computer Science and Engineering at The University of Texas at Arlington, and Hong Jiang, the Wendell H. Nedderman Professor and department chair of CSE, believe they have a mathematical solution that will allow outstanding user experiences while balancing computing and network resource allocation. They will use a three-year, $1.05 million National Science Foundation grant to prove that it works with system prototypes and implementations.
Data center services for the end-user include web searches, social networking and data mining. A single user request of a service may trigger the execution of a massive number of parallel tasks and the exchange of a large number of data flows among task servers across a data center network. As a result, to provide user quality-of-experience guarantees, high utilization and fair sharing of data center resources, a user-context-aware, joint computing-and-networking resource allocation solution is necessary.
Unfortunately, existing solutions are bottom-up, addressing either the computing or the networking aspect of resource allocation alone, and are generally unaware of user context.
Current systems typically supply more resources than are needed to provide good user quality of experience -- a practice known as resource over-provisioning. In this approach, designers start from the resources they have to offer and build upward, without aligning those resources with what users actually need.
"Hardware is designed to meet the demands of the maximum load, but the load rarely meets that level," Jiang said. "In fact, actual usage is often 30 to 40 percent of the maximum. If we can build an optimization framework to meet the demands of both computing and networking resources, we can optimize resource sharing and provide the highest quality of experience for individual users."
To this end, Che and Jiang will take the opposite approach -- a holistic, top-down process.
"We have developed a set of algorithms that will allow us to quantify user quality-of-experience requirements in the form of user utilities, and then work downward by mapping user utilities into computing and networking resource demands at individual computing and networking components. These resource demands then set the boundary conditions for an optimization framework that will allow optimal computing-and-networking resource sharing among all users," Che said. "If we can successfully develop a solution of this kind, we will automatically fulfill the goal of quality-of-experience guarantees for individual users while optimizing computing and network resource allocations."
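The kind of framework Che describes -- quantifying user requirements as utilities, then allocating shared resources to maximize total utility -- can be illustrated with a classic utility-maximization toy example. The sketch below is not the researchers' actual algorithms; it assumes hypothetical logarithmic per-user utilities w_i·log(x_i) and a single shared capacity, for which the optimal split (from the KKT conditions) is the weighted proportional-fair allocation.

```python
# Toy sketch of utility-driven resource sharing (illustrative only; not the
# researchers' actual algorithms). Each hypothetical user i has utility
# w_i * log(x_i); maximizing the sum of utilities subject to
# sum(x_i) <= capacity gives x_i = w_i * capacity / sum(w).

def proportional_fair_allocation(weights, capacity):
    """Allocate `capacity` across users to maximize sum(w_i * log(x_i))."""
    total_weight = sum(weights)
    return [w * capacity / total_weight for w in weights]

weights = [1.0, 2.0, 1.0]   # hypothetical per-user utility weights
capacity = 100.0            # hypothetical shared resource capacity
print(proportional_fair_allocation(weights, capacity))  # -> [25.0, 50.0, 25.0]
```

A user with twice the utility weight receives twice the resource share, while no user is starved -- the "fair sharing with quality-of-experience guarantees" behavior the article describes, in miniature.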
The new grant builds on a 2016 NSF grant in which Che, Jiang and computer science Professor Jeff Lei created a model that maps user service-level objectives into precise computing resource requirements at individual computing servers, independent of the servers used. This gives service providers the ability to develop service-level-objective-guaranteed packages that are portable to any data center platform.

Both grants are examples of UTA's emphasis on data-driven discovery contained within the Strategic Plan 2020: Bold Solutions | Global Impact, said Peter Crouch, dean of the College of Engineering.
"If they are successful, Dr. Che and Dr. Jiang will make it easier for users to complete their online tasks while easing the strain on providers," Crouch said. "There are many potential economic and social benefits of this solution, should it be widely implemented."
Jiang joined UTA in 2015 after a stint as a program director in the NSF's Computing and Communication Foundations division. He previously was the Willa Cather Professor of computer science and engineering at the University of Nebraska-Lincoln. He is a fellow of the Institute of Electrical and Electronics Engineers, and his research focuses on memory and storage architectures, cloud computing, scalable and distributed file systems, memory management, high performance computing and big data analytics.
Che joined UTA in 2002 following a brief tenure as a systems architect in private industry and two years as an assistant professor and adjunct professor at Pennsylvania State University. His research focuses on network architecture, network resource management, and performance analysis of large-scale distributed computing systems, including many-core processors, warehouse-scale computing and cloud computing.
Several other UTA College of Engineering faculty also are working on research related to data-driven discovery, including:
- Heng Huang in computer science and engineering has earned several grants totaling nearly $5 million for his work in big data analysis for applications related to Alzheimer's disease and depression, and precision medicine.
- Junzhou Huang in computer science and engineering is working to discover a process for combining image-omics data into files small enough for current computing technology, allowing scientists to better predict how long a patient will live and how best to treat that patient.
- Ioannis Schizas in electrical engineering is working to develop a framework for a network of simple sensors that could be as powerful as a supercomputer, but smaller and less costly.