Ph.D. Oral Defense: "Distributed Quantile Regression Analysis"

Liqun Yu, Washington University in Saint Louis

Abstract: This dissertation develops novel methodologies for distributed quantile regression analysis of big data via two different approaches. The first approach is to derive distributed optimization algorithms for quantile regression estimation. Compared to traditional numerical methods for quantile regression, our proposed optimization algorithms possess the desirable properties of faster computation and flexible parallelization. Our second approach is to apply divide-and-combine (DC) strategies for statistical aggregation. The essential idea is to fit quantile regression separately on subsets of the data and then aggregate the subset results in a statistically efficient way. Existing DC-based statistical aggregation methods focus on smooth problems, while we extend them to the non-smooth quantile regression problem. Both approaches enjoy ease of parallel implementation and hence scalability to big data by utilizing modern distributed computation frameworks.
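
As a rough illustration of the divide-and-combine idea described in the abstract, the following Python sketch fits quantile regression separately on subsets of simulated data and then combines the subset estimates by simple averaging. The simulated data, the use of statsmodels' QuantReg, and the naive averaging step are assumptions for demonstration only; the dissertation's aggregation scheme is designed to be statistically efficient for the non-smooth quantile loss, which plain averaging need not be.

# Minimal divide-and-combine (DC) sketch for quantile regression.
# Assumptions: simulated data, statsmodels' QuantReg as the per-subset
# solver, and naive averaging as the combine step (for illustration only).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p, tau, n_subsets = 10_000, 5, 0.5, 10

# Simulated data: linear signal plus heavy-tailed noise.
X = rng.normal(size=(n, p))
beta = np.arange(1, p + 1, dtype=float)
y = X @ beta + rng.standard_t(df=3, size=n)

# Fit quantile regression at level tau on each subset (parallelizable step).
subset_estimates = []
for X_k, y_k in zip(np.array_split(X, n_subsets), np.array_split(y, n_subsets)):
    model = sm.QuantReg(y_k, sm.add_constant(X_k))
    subset_estimates.append(model.fit(q=tau).params)

# Combine step: here, a simple average of the subset coefficient vectors.
beta_dc = np.mean(subset_estimates, axis=0)
print("DC estimate (intercept first):", np.round(beta_dc, 3))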

Host: Nan Lin