Going Fast and Fair: Latency Optimization for Cloud-Based Service Chains

Yuchao Zhang, Ke Xu, Haiyang Wang, Qi Li, Tong Li, Xuan Cao

Research output: Contribution to journal › Article › peer-review


Abstract

State-of-the-art microservices have been attracting increasing attention in recent years. A broad spectrum of online interactive applications are now programmed as service chains on the cloud, seeking better system scalability and lower operating costs. Unlike conventional batch jobs, most of these applications consist of multiple stand-alone services that communicate with each other. These step-by-step operations unavoidably introduce higher latency into delay-sensitive chained services. In this article, we aim to design an optimization approach that reduces the latency of chained services. Specifically, we present a measurement and analysis of chained services on Baidu's cloud platform; our real-world trace indicates that these chained services suffer significantly high latency because each request is handled by different queues on cloud servers multiple times. This unique feature makes it challenging to optimize a microservice's overall queueing delay. To address this problem, we propose a delay-guaranteed approach that accelerates the overall queueing of chained services while maintaining fairness across all workloads. Our evaluations on Baidu servers show that the proposed design reduces the latency of chained services by 35 percent with minimal impact on other workloads.
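The abstract's key observation is that a chained request pays queueing delay at every hop, so end-to-end latency is the sum of per-hop waits plus per-hop service times. A minimal sketch of that accounting follows; the function name and all the numbers are illustrative assumptions, not the paper's model or measurements:

```python
# Illustrative sketch only: end-to-end latency of a request traversing a
# service chain, accumulated hop by hop. Not the paper's algorithm.

def chain_latency(service_times, queue_waits):
    """Total latency of one request through a service chain.

    service_times[i]: processing time at hop i (ms)
    queue_waits[i]:   time spent waiting in hop i's queue (ms)
    """
    assert len(service_times) == len(queue_waits)
    return sum(service_times) + sum(queue_waits)

# A hypothetical 4-hop chain: even modest per-hop queueing can dominate
# the end-to-end latency, which is why the paper targets queueing delay.
service = [2.0, 1.5, 2.5, 1.0]   # actual work per hop
waits   = [4.0, 6.0, 3.0, 5.0]   # queueing delay per hop

total = chain_latency(service, waits)
print(total)   # 25.0, of which 18.0 is pure queueing
```

Because the queueing terms enter the sum once per hop, a chained request accumulates them repeatedly, which is the effect the proposed delay-guaranteed approach targets.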

Original language: English (US)
Pages (from-to): 138-143
Number of pages: 6
Journal: IEEE Network
Volume: 32
Issue number: 2
DOIs
State: Published - Mar 1 2018
Externally published: Yes
