[1] McLachlan G J, Lee S X, Rathnayake S I. Finite mixture models[J]. Annual Review of Statistics and Its Application, 2019, 6(1): 355-378. DOI: 10.1146/annurev-statistics-031017-100325.
[2] Tibshirani R, Saunders M, Rosset S, et al. Sparsity and smoothness via the fused lasso[J]. Journal of the Royal Statistical Society Series B: Statistical Methodology, 2005, 67(1): 91-108. DOI: 10.1111/j.1467-9868.2005.00490.x.
[3] Ma S J, Huang J. A concave pairwise fusion approach to subgroup analysis[J]. Journal of the American Statistical Association, 2017, 112(517): 410-423. DOI: 10.1080/01621459.2016.1148039.
[4] Tang X W, Xue F, Qu A N. Individualized multidirectional variable selection[J]. Journal of the American Statistical Association, 2021, 116(535): 1280-1296. DOI: 10.1080/01621459.2019.1705308.
[5] He Y, Zhou L, Xia Y C, et al. Center-augmented $\ell_2$-type regularization for subgroup learning[J]. Biometrics, 2023, 79(3): 2157-2170. DOI: 10.1111/biom.13725.
[6] Tibshirani R. Regression shrinkage and selection via the Lasso[J]. Journal of the Royal Statistical Society Series B: Statistical Methodology, 1996, 58(1): 267-288. DOI: 10.1111/j.2517-6161.1996.tb02080.x.
[7] Fan J Q, Li R Z. Variable selection via nonconcave penalized likelihood and its oracle properties[J]. Journal of the American Statistical Association, 2001, 96(456): 1348-1360. DOI: 10.1198/016214501753382273.
[8] Zhang C H. Nearly unbiased variable selection under minimax concave penalty[J]. The Annals of Statistics, 2010, 38(2): 894-942. DOI: 10.1214/09-aos729.
[9] He X M, Shao Q M. A general Bahadur representation of M-estimators and its application to linear regression with nonstochastic designs[J]. The Annals of Statistics, 1996, 24(6): 2608-2630. DOI: 10.1214/aos/1032181172.
[10] Wu W B. M-estimation of linear models with dependent errors[J]. The Annals of Statistics, 2007, 35(2): 495-521. DOI: 10.1214/009053606000001406.
[11] Koenker R, Bassett Jr G. Regression quantiles[J]. Econometrica: Journal of the Econometric Society, 1978, 46(1): 33-50. DOI: 10.2307/1913643.
[12] Zou H, Yuan M. Composite quantile regression and the oracle model selection theory[J]. The Annals of Statistics, 2008, 36(3): 1108-1126. DOI: 10.1214/07-aos507.
[13] Fan J Q, Fan Y Y, Barut E. Adaptive robust variable selection[J]. The Annals of Statistics, 2014, 42(1): 324-351. DOI: 10.1214/13-AOS1191.
[14] Li G R, Peng H, Zhu L X. Nonconcave penalized M-estimation with a diverging number of parameters[J]. Statistica Sinica, 2011, 21(1): 391-419.
[15] Cheng C, Feng X D, Li X G, et al. Robust analysis of cancer heterogeneity for high-dimensional data[J]. Statistics in Medicine, 2022, 41(27): 5448-5462. DOI: 10.1002/sim.9578.
[16] Zhang Y Y, Wang H J, Zhu Z Y. Robust subgroup identification[J]. Statistica Sinica, 2019, 29(4): 1873-1889. DOI: 10.5705/ss.202017.0179.
[17] Huber P J. Robust estimation of a location parameter[J]. The Annals of Mathematical Statistics, 1964, 35(1): 73-101. DOI: 10.1214/aoms/1177703732.
[18] Sun Q, Zhou W X, Fan J Q. Adaptive Huber regression[J]. Journal of the American Statistical Association, 2020, 115(529): 254-265. DOI: 10.1080/01621459.2018.1543124.
[19] Boyd S, Parikh N, Chu E, et al. Distributed optimization and statistical learning via the alternating direction method of multipliers[J]. Foundations and Trends in Machine Learning, 2011, 3(1): 1-122. DOI: 10.1561/2200000016.
[20] Liu W D, Mao X J, Zhang X F, et al. Robust personalized federated learning with sparse penalization[J]. Journal of the American Statistical Association, 2025, 120(549): 266-277. DOI: 10.1080/01621459.2024.2321652.
[21] Loh P L, Wainwright M J. Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima[J]. The Journal of Machine Learning Research, 2015, 16(1): 559-616. DOI: 10.48550/arXiv.1305.2436.
[22] Zhu X L, Qu A N. Cluster analysis of longitudinal profiles with subgroups[J]. Electronic Journal of Statistics, 2018, 12(1): 171-193. DOI: 10.1214/17-EJS1389.
[23] Wang H S, Li R Z, Tsai C L. Tuning parameter selectors for the smoothly clipped absolute deviation method[J]. Biometrika, 2007, 94(3): 553-568. DOI: 10.1093/biomet/asm053.
[24] Rand W M. Objective criteria for the evaluation of clustering methods[J]. Journal of the American Statistical Association, 1971, 66(336): 846-850. DOI: 10.1080/01621459.1971.10482356.
[25] Penrose K W, Nelson A G, Fisher A G. Generalized body composition prediction equation for men using simple measurement techniques[J]. Medicine & Science in Sports & Exercise, 1985, 17(2): 189. DOI: 10.1249/00005768-198504000-00037.
[26] Zhang Y Y, Wang H J, Zhu Z Y. Quantile-regression-based clustering for panel data[J]. Journal of Econometrics, 2019, 213(1): 54-67. DOI: 10.1016/j.jeconom.2019.04.005.
[27] Mohajan D, Mohajan H K. A study on body fat percentage for physical fitness and prevention of obesity: A two compartment model[J]. Journal of Innovations in Medical Research, 2023, 2(4): 1-10. DOI: 10.56397/jimr/2023.04.01.
[28] Lin H Z, Peng H. Smoothed rank correlation of the linear transformation regression model[J]. Computational Statistics & Data Analysis, 2013, 57(1): 615-630. DOI: 10.1016/j.csda.2012.07.012.
[29] Ranasinghe C, Gamage P, Katulanda P, et al. Relationship between Body Mass Index (BMI) and body fat percentage, estimated by bioelectrical impedance, in a group of Sri Lankan adults: A cross sectional study[J]. BMC Public Health, 2013, 13: 1-8. DOI: 10.1186/1471-2458-13-797.