SCAD‐penalized quantile regression for high‐dimensional data analysis and variable selection

Author: Amin, Muhammad
Document type: Article
Series/Periodical: Statistica Neerlandica
Publisher: Oxford, Blackwell
Language: English
ISSN: 0039-0402
Further identifiers: doi: 10.1111/stan.12056
Permalink: https://search.fid-benelux.de/Record/olc-benelux-1964987539
Data source: Online Contents Benelux; original catalogue
Powered By: Verbundzentrale des GBV (VZG)
Link(s) : http://dx.doi.org/10.1111/stan.12056

Existing penalized quantile variable selection methods are either applicable only to a finite number of predictors or lack the oracle property for the resulting estimator. Quantile regression is regarded as an alternative to ordinary least squares regression when outliers or heavy-tailed errors are present in linear models. This paper investigates variable selection through quantile regression with a diverging number of parameters. The convergence rate of the estimator with the smoothly clipped absolute deviation (SCAD) penalty function is studied, and the oracle property is established for quantile regression under certain regularity conditions and a proper choice of the tuning parameter. In addition, a rank correlation screening method is used to accommodate ultra-high-dimensional data settings. Monte Carlo simulations demonstrate the finite-sample performance of the proposed estimator. The results for real data reveal that this approach provides substantially more information than ordinary least squares, conventional quantile regression, and the quantile lasso.
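For orientation (the notation below is a standard formulation and is not quoted from the article, whose exact scaling conventions may differ), a SCAD-penalized quantile regression estimator of this kind is usually defined as the minimizer of the quantile check loss plus a SCAD penalty on each coefficient. A minimal sketch, assuming the usual check loss and the SCAD penalty of Fan and Li with the conventional choice a = 3.7:

% SCAD-penalized quantile regression objective (sketch; notation assumed)
\hat{\beta} = \arg\min_{\beta \in \mathbb{R}^{p_n}}
  \sum_{i=1}^{n} \rho_\tau\!\bigl(y_i - x_i^{\top}\beta\bigr)
  + n \sum_{j=1}^{p_n} p_{\lambda_n}\!\bigl(|\beta_j|\bigr),
\qquad
\rho_\tau(u) = u\bigl(\tau - I(u < 0)\bigr).

% SCAD penalty, specified through its derivative (Fan--Li form), with a = 3.7
p_{\lambda}'(\theta) = \lambda \Bigl\{ I(\theta \le \lambda)
  + \frac{(a\lambda - \theta)_{+}}{(a-1)\lambda}\, I(\theta > \lambda) \Bigr\},
\qquad \theta > 0 .

Here \lambda_n is the tuning parameter whose proper choice drives the oracle property referred to in the abstract. The rank correlation screening step, if it follows the common Kendall-tau-type form (an assumption, not a statement of the article's exact procedure), ranks predictors by the marginal statistic
\omega_k = \frac{1}{n(n-1)} \sum_{i \ne j} I(x_{ik} < x_{jk})\, I(y_i < y_j) - \tfrac{1}{4}
and retains those with the largest |\omega_k| before the penalized fit is applied.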