[1]BLAGUS R, LUSA L. Class prediction for high-dimensional class-imbalanced data[J]. BMC Bioinform, 2010, 11:523.
[2]ALON U, BARKAI N, NOTTERMAN D A, et al. Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays[J]. Proc Natl Acad Sci USA, 1999, 96(12):6745-6750.
[3]MLADENIĆ D, GROBELNIK M. Feature selection for unbalanced class distribution and Naive Bayes[C]// Proceedings of the Sixteenth International Conference on Machine Learning, 1999.
[4]BOLÓN-CANEDO V, SÁNCHEZ-MAROÑO N, ALONSO-BETANZOS A, et al. A review of microarray datasets and applied feature selection methods[J]. Information Sciences, 2014, 282:111-135.
[5]GUO H X, LI Y J, SHANG J, et al. Learning from class-imbalanced data: Review of methods and applications[J]. Expert Systems with Applications, 2017, 73:220-239.
[6]HOLDER L B, HAQUE M M, SKINNER M K. Machine learning for epigenetics and future medical applications[J]. Epigenetics, 2017, 12(7):505-514.
[7]YUAN L X, SHE L L, LIN A H, et al. Comparison of the classification performance of commonly used classification algorithms on imbalanced data with different sample sizes and class distributions[J]. Chinese Journal of Hospital Statistics, 2015, 22(1):22-26. (in Chinese)
[8]LI L, DARDEN T A, WEINBERG C R, et al. Gene assessment and sample classification for gene expression data using a genetic algorithm/k-nearest neighbor method[J]. Comb Chem High Throughput Screen, 2001, 4(8):727-739.
[9]GOLUB T R, SLONIM D K, TAMAYO P, et al. Molecular classification of cancer: Class discovery and class prediction by gene expression monitoring[J]. Science, 1999, 286(5439):531-537.
[10]MOLLINEDA R, ALEJO R, SOTOCA J. The class imbalance problem in pattern classification and learning[C]// II Congreso Espanol de Informática, 2007.
[11]LI J. Research on learning from imbalanced data[D]. Changchun: Jilin University, 2011. (in Chinese)
[12]SHI L, CAMPBELL G, JONES W D, et al. The MicroArray Quality Control (MAQC)-II study of common practices for the development and validation of microarray-based predictive models[J]. Nat Biotechnol, 2010, 28(8):827-838.
[13]GALAR M, FERNANDEZ A, BARRENECHEA E, et al. A review on ensembles for the class imbalance problem: Bagging, boosting, and hybrid-based approaches[J]. IEEE Trans Syst, Man, Cybern C, 2012, 42(4):463-484.
[14]CHERKASSKY V. The nature of statistical learning theory[J]. IEEE Trans Neural Netw, 1997, 8(6):1564.
[15]QUINLAN J R. Induction of decision trees[J]. Machine Learning, 1986, 1(1):81-106.
[16]MA G. Improvement and application of the naive Bayes algorithm[D]. Hefei: Anhui University, 2018. (in Chinese)
[17]BREIMAN L. Random forests[J]. Mach Learn, 2001, 45(1):5-32.
[18]PARRY R M, JONES W, STOKES T H, et al. K-nearest neighbor models for microarray gene expression analysis and clinical outcome prediction[J]. Pharmacogenomics J, 2010, 10(4):292-309.
[19]FREUND Y, SCHAPIRE R E. A decision-theoretic generalization of on-line learning and an application to boosting[J]. J Comput Syst Sci, 1997, 55(1):119-139.
[20]BREIMAN L. Bagging predictors[J]. Mach Learn, 1996, 24(2):123-140.
[21]JONG V L, NOVIANTI P W, ROES K C, et al. Selecting a classification function for class prediction with gene expression data[J]. Bioinformatics, 2016, 32(12):1814-1822.