M. Aksela and J. Laaksonen, Using diversity of errors for selecting members of a committee classifier, Pattern Recognition, vol.39, issue.4, 2006.
DOI : 10.1016/j.patcog.2005.08.017

S. Arlot and A. Celisse, A survey of cross-validation procedures for model selection, Statistics Surveys, vol.4, issue.0, pp.40-79, 2010.
DOI : 10.1214/09-SS054

URL : https://hal.archives-ouvertes.fr/hal-00407906

D. Ballabio and V. Consonni, Classification tools in chemistry. Part 1: linear models. PLS-DA, Analytical Methods, vol.5, issue.16, pp.3790-3798, 2013.

Y. Bi, The impact of diversity on the accuracy of evidential classifier ensembles, International Journal of Approximate Reasoning, vol.53, issue.4, pp.584-607, 2012.
DOI : 10.1016/j.ijar.2011.12.011

C. M. Bishop, Neural networks for pattern recognition, 1995.

L. Breiman, Bagging predictors, Machine Learning, pp.123-140, 1996.
DOI : 10.1007/BF00058655

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees, 1984.

W. C. Chen, A. Lee, W. J. Deng, and K. Y. Liu, The implementation of neural network for semiconductor PECVD process, Expert Systems with Applications, vol.32, issue.4, pp.1148-1153, 2007.
DOI : 10.1016/j.eswa.2006.02.013

N. Cristianini and J. Shawe-Taylor, An introduction to support vector machines and other kernel-based learning methods, 2000.
DOI : 10.1017/CBO9780511801389

C. Cortes and V. Vapnik, Support-vector networks, Machine Learning, pp.273-297, 1995.
DOI : 10.1007/BF00994018

T. Cover and P. Hart, Nearest neighbor pattern classification, IEEE Transactions on Information Theory, vol.13, issue.1, pp.21-27, 1967.
DOI : 10.1109/TIT.1967.1053964

URL : http://ssg.mit.edu/cal/abs/2000_spring/np_dens/classification/cover67.pdf

Q. Dai, A competitive ensemble pruning approach based on cross-validation technique, Knowledge-Based Systems, pp.394-414, 2013.

N. De Abajo, A. Diez, V. Lobato, and S. R. Cuesta, ANN quality diagnostic models for packaging manufacturing: an industrial data mining case study, Proc. of the 10th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp.22-25, 2004.

T. G. Dietterich, An experimental comparison of three methods for constructing ensembles of decision trees: bagging, boosting, and randomization, Machine Learning, pp.139-157, 2000.

B. Efron and R. Tibshirani, Improvement on cross-validation: the .632+ bootstrap method, J. of the American Statistical Association, vol.92, pp.548-560, 1997.

B. Efron and R. Tibshirani, An introduction to the bootstrap, Monographs on Statistics and Applied Probability, 1998.

R. E. Fan, P. H. Chen, and C. J. Lin, Working set selection using second order information for training support vector machines, J. Machine Learn. Res, vol.6, pp.1889-1918, 2005.

Y. Freund and R. E. Schapire, Experiments with a new boosting algorithm, 13th International Conference on Machine Learning ICML'96, 1996.

G. Giacinto and F. Roli, Design of effective neural network ensembles for image classification purposes, Image and Vision Computing, vol.19, issue.9-10, pp.699-707, 2001.
DOI : 10.1016/S0262-8856(01)00045-2

Y. Grandvalet and S. Canu, SVM and Kernel Methods Matlab Toolbox, ASI-INSA de Rouen, 2008.

L. Guo and S. Boukir, Margin-based ordered aggregation for ensemble pruning, Pattern Recognition Letters, vol.34, issue.6, pp.603-609, 2013.
DOI : 10.1016/j.patrec.2013.01.003

M. S. Haghighi, A. Vahedian, and H. S. Yazdi, Creating and measuring diversity in multiple classifier systems using support vector data description, Applied Soft Computing, vol.11, pp.4931-4942, 2011.

L. Han, L. Han, and C. Liu, Neural network applied to prediction of the failure stress for a pressurized cylinder containing defects, International Journal of Pressure Vessels and Piping, vol.76, issue.4, pp.215-219, 1999.
DOI : 10.1016/S0308-0161(98)00129-X

H. G. Han and J. F. Qiao, A structure optimisation algorithm for feedforward neural network construction, Neurocomputing, vol.99, pp.347-357, 2013.
DOI : 10.1016/j.neucom.2012.07.023

X. He, Z. Wang, C. Jin, Y. Zheng, and X. Xue, A simplified multi-class support vector machine with reduced dual optimization, Pattern Recognition Letters, vol.33, issue.1, pp.71-82, 2012.
DOI : 10.1016/j.patrec.2011.09.035

D. Hernandez-Lobato, G. Martinez-Munoz, and A. Suarez, How large should ensembles of classifiers be?, Pattern Recognition, vol.46, pp.1323-1336, 2013.

T. Ho, The random subspace method for constructing decision forests, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.20, issue.8, pp.832-844, 1998.

K. L. Hsieh and L. I. Tong, Optimization of multiple quality responses involving qualitative and quantitative characteristics in IC manufacturing using neural networks, Computers in Industry, vol.46, issue.1, pp.1-12, 2001.
DOI : 10.1016/S0166-3615(01)00091-4

S. C. Hsu and C. F. Chien, Hybrid data mining approach for pattern extraction from wafer bin map to improve yield in semiconductor manufacturing, International Journal of Production Economics, vol.107, issue.1, pp.88-103, 2007.
DOI : 10.1016/j.ijpe.2006.05.015

C. W. Hsu, C. C. Chang, and C. J. Lin, A practical guide to support vector classification, 2003.

Y. H. Hung, Optimal process parameters design for a wire bonding of ultra-thin CSP package based on hybrid methods of artificial intelligence, Microelectronics International, vol.24, issue.3, pp.3-10, 2007.

K. Ishikawa, Guide to quality control, 1986.

A. K. Jain, M. N. Murty, and P. Flynn, Data clustering: a review, ACM Computing Surveys, vol.31, issue.3, pp.264-323, 1999.
DOI : 10.1145/331499.331504

G. James, D. Witten, T. Hastie, and R. Tibshirani, An Introduction to Statistical Learning, 2013.
DOI : 10.1007/978-1-4614-7138-7

Y. W. Kim and I. S. Oh, Classifier ensemble selection using hybrid genetic algorithms, Pattern Recognition Letters, vol.29, issue.6, pp.796-802, 2008.
DOI : 10.1016/j.patrec.2007.12.013

J. Kittler, M. Hatef, R. Duin, and J. Matas, On combining classifiers, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.20, issue.3, pp.226-239, 1998.
DOI : 10.1109/34.667881

A. H. Ko, R. Sabourin, and A. S. Britto, From dynamic classifier selection to dynamic ensemble selection, Pattern Recognition, vol.41, issue.5, pp.1718-1731, 2008.
DOI : 10.1016/j.patcog.2007.10.015

URL : http://dollar.biz.uiowa.edu/~street/ko07.pdf

R. Kohavi, A study of cross-validation and bootstrap for accuracy estimation and model selection, Proc. of the 14th International Joint Conference on Artificial Intelligence IJCAI'95, 1995.

G. Köksal, I. Batmaz, and M. C. Testik, A review of data mining applications for quality improvement in manufacturing industry, Expert Systems with Applications, vol.38, issue.10, pp.13448-13467, 2011.
DOI : 10.1016/j.eswa.2011.04.063

S. B. Kotsiantis, Supervised machine learning: a review of classification techniques, Informatica, vol.31, pp.249-268, 2007.

S. B. Kotsiantis, Combining bagging, boosting, rotation forest and random subspace methods, Artificial Intelligence Review, vol.26, issue.10, pp.225-240, 2011.

S. B. Kotsiantis, Decision trees: a recent overview, Artificial Intelligence Review, vol.39, issue.4, pp.261-283, 2013.

A. Krimpenis, P. G. Bernardos, G. C. Vosniakos, and A. Koukouvitaki, Simulation-based selection of optimum pressure die-casting process parameters using neural nets and genetic algorithms, The International Journal of Advanced Manufacturing Technology, vol.20, issue.5-6, pp.509-517, 2006.

L. I. Kuncheva, Switching between selection and fusion in combining classifiers: an experiment, IEEE Transactions on Systems, Man and Cybernetics, Part B (Cybernetics), vol.32, issue.2, pp.146-156, 2002.
DOI : 10.1109/3477.990871

L. I. Kuncheva, Combining pattern classifiers: methods and algorithms, 2004.
DOI : 10.1002/9781118914564

L. I. Kuncheva and C. J. Whitaker, Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy, Machine Learning, pp.181-207, 2003.

L. I. Kuncheva, C. J. Whitaker, and C. A. Shipp, Limits on the majority vote accuracy in classifier fusion, Pattern Analysis & Applications, vol.6, issue.1, pp.22-31, 2003.
DOI : 10.1007/s10044-002-0173-7

A. D. Li, Z. He, and Y. Zhang, Bi-objective variable selection for key quality characteristics selection based on a modified NSGA-II and the ideal point method, Computers in Industry, vol.82, pp.95-103, 2016.
DOI : 10.1016/j.compind.2016.05.008

K. Li, Z. Liu, and Y. Han, Study of Selective Ensemble Learning Methods Based on Support Vector Machine, Physics Procedia, vol.33, pp.1518-1525, 2012.
DOI : 10.1016/j.phpro.2012.05.247

X. Liang, Y. Ma, Y. He, L. Yu, R. C. Chen et al., Fast pruning superfluous support vectors in SVMs, Pattern Recognition Letters, vol.34, issue.10, pp.1203-1209, 2013.
DOI : 10.1016/j.patrec.2013.03.015

Mathworks, Statistics Toolbox User's Guide V 6.0, 2007.

Mathworks, Matlab Central File Exchange, https://fr.mathworks.com/matlabcentral, submission 58102, 2016.

M. Mehta, R. Agrawal, and J. Rissanen, SLIQ: A fast scalable classifier for data mining, Advances in Database Technology EDBT'96, Lecture Notes in Computer Science, vol.1057, pp.18-32, 1996.
DOI : 10.1007/BFb0014141

D. Meyer, F. Leisch, and K. Hornik, The support vector machine under test, Neurocomputing, vol.55, issue.1-2, pp.169-186, 2003.
DOI : 10.1016/S0925-2312(03)00431-4

D. Nguyen and B. Widrow, Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights, 1990 IJCNN International Joint Conference on Neural Networks, pp.21-26, 1990.
DOI : 10.1109/IJCNN.1990.137819

M. Noyel, P. Thomas, P. Charpentier, A. Thomas, and T. Brault, Implantation of an on-line quality process monitoring, 5th International Conference on Industrial Engineering and Systems Management IESM'13, pp.28-30, 2013.
URL : https://hal.archives-ouvertes.fr/hal-00879491

M. Noyel, P. Thomas, A. Thomas, and P. Charpentier, Reconfiguration process for neuronal classification models: Application to a quality monitoring problem, Computers in Industry, vol.83, pp.78-91, 2016.
DOI : 10.1016/j.compind.2016.09.004

URL : https://hal.archives-ouvertes.fr/hal-01374907

M. Paliwal and U. A. Kumar, Neural networks and statistical techniques: A review of applications, Expert Systems with Applications, vol.36, issue.1, pp.2-17, 2009.
DOI : 10.1016/j.eswa.2007.10.005

J. C. Platt, Sequential minimal optimization: a fast algorithm for training support vector machines, 1998.

J. R. Quinlan, Improved use of continuous attributes in C4.5, J. Artif. Intell. Res, vol.4, pp.77-90, 1996.

J. J. Rodriguez, L. I. Kuncheva, and C. J. Alonso, Rotation Forest: A New Classifier Ensemble Method, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.28, issue.10, pp.1619-1630, 2006.
DOI : 10.1109/TPAMI.2006.211

D. Ruta and B. Gabrys, Classifier selection for majority voting, Information Fusion, vol.6, issue.1, pp.63-81, 2005.
DOI : 10.1016/j.inffus.2004.04.008

E. Santucci, L. Didaci, G. Fumera, and F. Roli, A parameter randomization approach for constructing classifier ensembles, Pattern Recognition, vol.69, pp.1-13, 2017.
DOI : 10.1016/j.patcog.2017.03.031

J. Shafer, R. Agrawal, and M. Mehta, SPRINT: a scalable parallel classifier for data mining, 22nd International Conference on Very Large Data Bases VLDB'96, 1996.

V. Soto, G. Martinez-Munoz, D. Hernandez-Lobato, and A. Suarez, A Double Pruning Algorithm for Classification Ensembles, 11th International Workshop on Multiple Classifier Systems MCS'13, pp.15-17, 2013.
DOI : 10.1007/978-3-642-12127-2_11

G. Taguchi, Quality Engineering in Production Systems, 1989.

E. K. Tang, P. N. Suganthan, and X. Yao, An analysis of diversity measures, Machine Learning, pp.247-271, 2006.

P. Thomas, G. Bloch, F. Sirou, and V. Eustache, Neural modeling of an induction furnace using robust learning criteria, J. Integrated Computer Aided Engineering, vol.6, issue.1, pp.5-23, 1999.

P. Thomas and M. C. Suhner, A New Multilayer Perceptron Pruning Algorithm for Classification and Regression Applications, Neural Processing Letters, vol.42, pp.437-458, 2015.

URL : https://hal.archives-ouvertes.fr/hal-01086842

P. Thomas and A. Thomas, How deals with discrete data for the reduction of simulation models using neural network, 13th IFAC Symp. on Information Control Problems in Manufacturing INCOM'09, pp.3-5, 2009.
DOI : 10.3182/20090603-3-RU-2001.0490

URL : https://hal.archives-ouvertes.fr/hal-00393987

G. Tsoumakas, I. Partalas, and I. Vlahavas, An ensemble pruning primer, in Applications of Supervised and Unsupervised Ensemble Methods, Studies in Computational Intelligence, 2009.

T. Windeatt, Diversity measures for multiple classifier system analysis and design, Information Fusion, vol.6, issue.1, pp.21-36, 2005.
DOI : 10.1016/j.inffus.2004.04.002

M. Wozniak, Experiments with Trained and Untrained Fusers, Innovations in Hybrid Intelligent Systems, Advances in Soft Computing, pp.144-150, 2007.
DOI : 10.1007/978-3-540-74972-1_20

M. Wozniak, Evolutionary approach to produce classifier ensemble based on weighted voting, 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), pp.648-653, 2009.
DOI : 10.1109/NABIC.2009.5393446

M. Wozniak, M. Grana, and E. Corchado, A survey of multiple classifier systems as hybrid systems, Information Fusion, vol.16, pp.3-17, 2014.
DOI : 10.1016/j.inffus.2013.04.006

W. Xiaoqiao, L. Mingzhou, G. Maogen, L. Lin, and L. Conghu, Research on assembly quality adaptive control system for complex mechanical products assembly process under uncertainty, pp.43-57, 2015.

L. Yang, Classifiers selection for ensemble learning based on accuracy and diversity, Procedia Engineering, vol.15, pp.4266-4270, 2011.
DOI : 10.1016/j.proeng.2011.08.800

T. Yang, T. Tsai, and J. Yeh, A neural network-based prediction model for fine pitch stencil-printing quality in surface mount assembly, Engineering Applications of Artificial Intelligence, vol.18, issue.3, pp.335-341, 2005.
DOI : 10.1016/j.engappai.2004.09.004

X. Yang, J. Lu, and G. Zhang, Adaptive pruning algorithm for least squares support vector machine classifier, Soft Computing, vol.14, issue.6, pp.667-680, 2010.

J. Yu, L. Xi, and X. Zhou, Intelligent monitoring and diagnosis of manufacturing processes using an integrated approach of KBANN and GA, Computers in Industry, vol.59, issue.5, pp.489-501, 2008.
DOI : 10.1016/j.compind.2007.12.005

Z. H. Zhou, J. Wu, and W. Tang, Ensembling neural networks: Many could be better than all, Artificial Intelligence, vol.137, issue.1-2, pp.239-263, 2002.
DOI : 10.1016/S0004-3702(02)00190-X


X. Zhou, Y. Ma, Y. Tu, and Y. Feng, Ensemble of Surrogates for Dual Response Surface Modeling in Robust Parameter Design, Quality and Reliability Engineering International, vol.29, pp.173-197, 2013.