Project

Asymptotic theory of debiased regularized M-estimators

Regression coefficient estimation by regularized estimators such as the Lasso introduces bias, and the variable selection these estimators perform makes classical inference methods inappropriate. The recent literature on debiasing regularized estimators focuses mostly on the least squares loss, leaving other loss functions, such as the quantile loss, underexplored. My research mainly investigates the general class of debiased regularized M-estimators. I develop a theoretical study and a computational approach for constructing componentwise confidence intervals for coefficients in high-dimensional linear models. A bootstrap method using debiased l_1-regularized M-estimators is studied and extended to l_1-regularized model averaged and composite M-estimators, with quantile regression as a leading example. A new choice of weights for model averaged and composite estimation is proposed by minimizing an analytical expression for the asymptotic variance of the debiased l_1-regularized M-estimator. A bootstrap procedure is to be developed to study the distribution of the estimated weights, and the efficiency of estimators using these random weights will be compared with that of similar estimators using deterministic weights. Finally, since debiased l_1-regularized M-estimators are only asymptotically unbiased, I consider a double-debiasing procedure to relax the sample size requirements under which these estimators possess good theoretical properties.
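
As a concrete illustration of the debiasing step, the following is a minimal sketch for the simplest case, the least squares loss (the debiased Lasso), where the inverse Gram matrix is approximated by nodewise Lasso regressions. The function name, the use of the scikit-learn solver, and the tuning parameters lam and lam_node are illustrative assumptions rather than the project's actual implementation; for a general M-estimator the residual y - X beta_hat in the correction term is replaced by the derivative of the loss, psi(y - X beta_hat), with a matching curvature estimate.

import numpy as np
from sklearn.linear_model import Lasso

def debiased_lasso(X, y, lam, lam_node):
    # One-step correction: beta_d = beta_hat + Theta @ X.T @ (y - X @ beta_hat) / n,
    # where Theta approximates the inverse Gram matrix (X.T @ X / n)^{-1}.
    n, p = X.shape
    beta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

    # Nodewise Lasso: regress each column X_j on the remaining columns
    # to build row j of Theta.
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = np.delete(np.arange(p), j)
        gamma = Lasso(alpha=lam_node, fit_intercept=False).fit(X[:, idx], X[:, j]).coef_
        tau2 = (X[:, j] - X[:, idx] @ gamma) @ X[:, j] / n
        Theta[j, j] = 1.0 / tau2
        Theta[j, idx] = -gamma / tau2

    # The correction removes, to first order, the shrinkage bias of the Lasso,
    # so beta_d is asymptotically normal componentwise.
    beta_d = beta_hat + Theta @ X.T @ (y - X @ beta_hat) / n
    return beta_hat, beta_d, Theta

A componentwise 95% confidence interval for the j-th coefficient then takes the usual form beta_d[j] +/- 1.96 * se_j, with se_j built from Theta, the empirical Gram matrix and a consistent estimate of the noise level; the bootstrap studied in the project replaces this normal approximation.

For the proposed choice of model averaging weights, if one assumes (purely for illustration) that the asymptotic variance of the weighted combination is a quadratic form w' V w, with V the asymptotic covariance matrix of the component debiased estimators, then the minimizer over weights summing to one has a familiar closed form:

def min_variance_weights(V):
    # Minimize w' V w subject to sum(w) = 1 (no sign constraint);
    # the Lagrangian solution is w = V^{-1} 1 / (1' V^{-1} 1).
    ones = np.ones(V.shape[0])
    w = np.linalg.solve(V, ones)
    return w / w.sum()

In practice V itself must be estimated, which is what makes the resulting weights random and motivates the proposed bootstrap study of their distribution.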
 

Date: 1 Nov 2020 → 31 Jan 2023
Keywords: model averaging, regularization, robust M-estimators, quantile regression
Disciplines: Statistics, Statistics and numerical methods not elsewhere classified