The effects of different weight functions on partial robust M-regression performance: A simulation study


POLAT E.

COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, vol.49, no.4, pp.1089-1104, 2020 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 49 Issue: 4
  • Publication Date: 2020
  • DOI: 10.1080/03610918.2019.1586926
  • Journal Name: COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Applied Science & Technology Source, Business Source Elite, Business Source Premier, CAB Abstracts, Compendex, Computer & Applied Sciences, Veterinary Science Database, zbMATH, Civil Engineering Abstracts
  • Page Numbers: pp.1089-1104
  • Hacettepe University Affiliated: Yes

Abstract

Partial Robust M (PRM) is a robust partial regression estimator built on robust M-estimators. The aim of this study is to assess the effects of using alternative Iteratively Reweighted Least Squares (IRLS) weight functions in place of the Fair weight function used in PRM, the original robust Partial Least Squares Regression (PLSR) method, and, furthermore, to examine the effects of soft, semi-hard, and hard weighting on this algorithm in terms of efficiency, goodness of fit, and prediction. To this end, classical SIMPLS, the original PRM algorithm using the Fair weight function, and four alternative PRM algorithms named PRMBSQR, PRMCHY, PRMHBR, and PRMTLWRTH (obtained using the Bisquare, Cauchy, Huber, and Talworth weight functions, respectively) are compared. The simulation results and a real data application show that the original PRM and PRMCHY, both of which use soft weight functions, are the leading algorithms in terms of efficiency and prediction for both low- and high-dimensional data sets when moderate outliers are present. However, as the proportion of outliers increases, the semi-hard weighting PRMBSQR or the hard weighting PRMTLWRTH could be good alternatives. Moreover, the real data application showed that, in general, the original PRM, PRMCHY, PRMBSQR, and PRMTLWRTH algorithms perform better in terms of outlier detection than both the PRMHBR and classical SIMPLS algorithms.
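The weight functions named in the abstract are standard IRLS choices from the robust-regression literature. The sketch below shows one common parameterization of each, with widely used tuning constants; the paper's own constants and scaling of the residuals may differ, so this is an illustration of the soft/semi-hard/hard distinction, not a reproduction of the paper's implementation.

```python
import numpy as np

# Common forms of the five IRLS weight functions compared in the paper.
# Tuning constants are the usual defaults for ~95% efficiency under
# normal errors (an assumption; the paper's constants may differ).
# u denotes a standardized residual.

def fair(u, c=1.4):
    # Soft: weights decay slowly and never reach zero.
    return 1.0 / (1.0 + np.abs(u) / c)

def cauchy(u, c=2.385):
    # Soft: smooth, strictly positive weights.
    return 1.0 / (1.0 + (u / c) ** 2)

def huber(u, c=1.345):
    # Semi-hard: weight 1 near zero, then decays like c/|u|.
    return np.where(np.abs(u) <= c, 1.0, c / np.abs(u))

def bisquare(u, c=4.685):
    # Semi-hard (redescending): smooth decay, exactly zero beyond |u| = c.
    return np.where(np.abs(u) <= c, (1.0 - (u / c) ** 2) ** 2, 0.0)

def talworth(u, c=2.795):
    # Hard: 0/1 trimming — large residuals are discarded entirely.
    return np.where(np.abs(u) <= c, 1.0, 0.0)

# Compare how each function downweights an increasingly large residual.
residuals = np.array([0.0, 1.0, 3.0, 10.0])
for name, w in [("Fair", fair), ("Cauchy", cauchy), ("Huber", huber),
                ("Bisquare", bisquare), ("Talworth", talworth)]:
    print(f"{name:9s}", np.round(w(residuals), 3))
```

The soft functions (Fair, Cauchy) keep a positive weight even for an outlying residual of 10, whereas Talworth drops it to zero outright, which is consistent with the abstract's finding that hard weighting becomes competitive only as the proportion of outliers grows.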