First, we fix one element of the pair as a nearest neighbor in the class of the item to be classified. Then, we fix the second one as the most remote element of that class: this leads to degraded accuracy. Finally, we take the second element to be a second nearest neighbor in the class. Our experiments on UCI benchmarks show that we remain competitive with state-of-the-art classifiers (k-NN, SVM) while drastically reducing the complexity. Indeed, as datasets become larger and larger, the scalability of oddness-based classifiers is paramount: this has to be further investigated in future work. While we use the classical notion of nearest neighbors, our handling of them is quite different from that of k-NN methods.
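To make the pair-selection step concrete, the following is a minimal Python sketch of the last strategy above (one nearest and one second-nearest neighbor per class). The function name nearest_pair_per_class, the use of Euclidean distance, and the NumPy data layout are our own illustrative assumptions, not the authors' implementation; the oddness computation over the selected pairs, which is the core of the classifier, is not reproduced here.

import numpy as np

def nearest_pair_per_class(X, y, x_query):
    # For each class label, return the pair (nearest, second-nearest)
    # training items with respect to the query, using Euclidean distance.
    # Illustrative sketch only: the oddness score computed from these
    # pairs is left out.
    pairs = {}
    for c in np.unique(y):
        Xc = X[y == c]
        dists = np.linalg.norm(Xc - x_query, axis=1)
        order = np.argsort(dists)
        if order.size >= 2:
            pairs[c] = (Xc[order[0]], Xc[order[1]])
    return pairs

# Toy usage on two 2-class clusters.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(nearest_pair_per_class(X, y, np.array([0.05, 0.1])))

Restricting each class to such a pair, rather than scanning all pairs, is what keeps the cost low compared with exhaustive oddness-based or analogy-based evaluation.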
