use meta-attributes characterizing individual attributes of the datasets, but various experiments [28, 5] have shown that the properties of an attribute in the context of the other attributes are at least as important. It would therefore be interesting to allow the dissimilarity to use such relational meta-attributes, such as covariance or mutual information. On the other hand, although diverse and drawn from very different approaches, the meta-attributes employed in our experiments do not fully cover the state of the art. Recent years have been rich in contributions introducing new meta-attributes [29, 15, 27, 35], whose use could reveal the value of a dissimilarity-based approach in new contexts. Finally, since the effectiveness of the dissimilarity-based approach appears highly context-dependent (as is often the case in learning and meta-learning), it could be worthwhile to design a meta-attribute evaluation method that accounts for their diverse natures (global,
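As a minimal sketch of what such relational meta-attributes could look like, the following hypothetical function (not part of the experiments reported here) computes, for every pair of attributes, their covariance and a histogram-based plug-in estimate of their mutual information, and summarizes them into two scalar meta-features; the function name, the binning scheme, and the choice of summary statistics are all illustrative assumptions.

```python
import numpy as np

def relational_meta_features(X, bins=10):
    """Summarize pairwise relational properties of the attributes of a
    dataset X (shape: n_samples x n_attributes) into two meta-features:
    the mean absolute pairwise covariance and the mean pairwise mutual
    information (plug-in histogram estimate, in nats)."""
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    cov = np.cov(X, rowvar=False)
    mi = np.zeros((d, d))
    for i in range(d):
        for j in range(i + 1, d):
            # Joint histogram of attributes i and j -> empirical joint distribution.
            joint, _, _ = np.histogram2d(X[:, i], X[:, j], bins=bins)
            pxy = joint / n
            px = pxy.sum(axis=1, keepdims=True)   # marginal of attribute i
            py = pxy.sum(axis=0, keepdims=True)   # marginal of attribute j
            nz = pxy > 0
            # KL divergence between the joint and the product of marginals.
            mi[i, j] = mi[j, i] = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    off = ~np.eye(d, dtype=bool)  # ignore the diagonal (self-relations)
    return {"mean_abs_cov": float(np.mean(np.abs(cov[off]))),
            "mean_mi": float(np.mean(mi[off]))}
```

Such scalar summaries could then be appended to the existing meta-attribute vector and fed to the dissimilarity unchanged, which is precisely what makes relational meta-attributes easy to integrate.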
data analysis, a particular asset of our approach is that it enables a unified characterization of data analysis experiments. Indeed, given any representation of the data analysis process and its results, it is possible to integrate it into the dissimilarity, thus allowing the direct comparison of complete experiments. This is a first step toward new approaches to intelligent assistance for data analysis.
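To make the idea of comparing complete experiments concrete, here is a hypothetical sketch (the function name, the experiment encoding, and the weighting are illustrative assumptions, not the method used in our experiments): each experiment is encoded as a dataset meta-feature vector plus a set of workflow operators, and a composite dissimilarity combines a distance on the former with a set-based dissimilarity on the latter.

```python
import numpy as np

def experiment_dissimilarity(exp_a, exp_b, w_data=1.0, w_process=1.0):
    """Composite dissimilarity between two data analysis experiments.
    Each experiment is a dict with a numeric meta-feature vector ('meta')
    describing the dataset and a collection of operator names ('process')
    describing the workflow; result descriptors could be appended to
    'meta' in the same way."""
    # Dataset part: Euclidean distance between meta-feature vectors.
    d_data = float(np.linalg.norm(
        np.asarray(exp_a["meta"], dtype=float) - np.asarray(exp_b["meta"], dtype=float)))
    # Process part: Jaccard dissimilarity between the sets of operators used.
    ops_a, ops_b = set(exp_a["process"]), set(exp_b["process"])
    union = ops_a | ops_b
    d_proc = 1.0 - len(ops_a & ops_b) / len(union) if union else 0.0
    return w_data * d_data + w_process * d_proc

# Usage: identical experiments are at dissimilarity zero.
exp1 = {"meta": [0.1, 0.2], "process": ["normalize", "j48"]}
exp2 = {"meta": [0.4, 0.2], "process": ["normalize", "svm"]}
```

The design choice worth noting is that each component of the experiment contributes an independent term, so richer representations (results, hyperparameters) can be folded in without changing the overall structure.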
A new look at the statistical model identification, IEEE Transactions on Automatic Control, vol.19, issue.16, pp.716-723, 1974. ,
How k-nearest neighbor parameters affect its performance, Argentine Symposium on Artificial Intelligence, pp.1-12, 2009. ,
Characterizing the applicability of classification algorithms using meta-level learning, European conference on machine learning, pp.83-102, 1994. ,
DOI : 10.1007/3-540-57868-4_52
Random forests, Machine Learning, vol.45, issue.1, pp.5-32, 2001. ,
DOI : 10.1023/A:1010933404324
Conditional likelihood maximisation: a unifying framework for information theoretic feature selection, The Journal of Machine Learning Research, vol.13, issue.1, pp.27-66, 2012. ,
An instance-based learner using an entropic distance measure, Proceedings of the 12th International Conference on Machine learning, pp.108-114, 1995. ,
Weighted kappa: Nominal scale agreement provision for scaled disagreement or partial credit., Psychological Bulletin, vol.70, issue.4, p.213, 1968. ,
DOI : 10.1037/h0026256
Ensembles of Balanced Nested Dichotomies for Multi-class Problems, Knowledge Discovery in Databases: PKDD 2005, pp.84-95, 2005. ,
DOI : 10.1007/11564126_13
On Kuhn's Hungarian Method – A tribute from Hungary, Naval Research Logistics, vol.5, issue.1, pp.2-5, 2005. ,
DOI : 10.1002/nav.20056
Fully supervised training of gaussian radial basis function networks in weka, 2014. ,
Using model trees for classification, Machine Learning, pp.63-76, 1998. ,
Extended data characteristics, 2002. ,
Introduction to the Special Issue on Meta-Learning, Machine Learning, vol.54, issue.3, pp.187-193, 2004. ,
DOI : 10.1023/B:MACH.0000015878.60765.42
The WEKA data mining software, ACM SIGKDD Explorations Newsletter, vol.11, issue.1 ,
DOI : 10.1145/1656274.1656278
Complexity measures of supervised classification problems, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.24, issue.3, pp.289-300, 2002. ,
Generating Rule Sets from Model Trees, Australasian Joint Conference on Artificial Intelligence, pp.1-12, 1999. ,
DOI : 10.1007/3-540-46695-9_1
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.148.9721
Algorithm selection via meta-learning, 2002. ,
On Data and Algorithms: Understanding Inductive Performance, Machine Learning, pp.275-312, 2004. ,
DOI : 10.1023/B:MACH.0000015882.38031.85
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.464.6813
Model selection via metalearning: a comparative study, International Journal on Artificial Intelligence Tools, vol.10, issue.04, pp.525-554, 2001. ,
DOI : 10.1109/tai.2000.889901
Feature selection for metalearning, Proceedings of the 5th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD '01, pp.222-233, 2001. ,
Information-based evaluation criterion for classifier's performance, Machine Learning, pp.67-80, 1991. ,
DOI : 10.1007/BF00153760
URL : https://repozitorij.uni-lj.si/Dokument.php?id=49317
The hungarian method for the assignment problem, Naval research logistics quarterly, vol.2, issue.12, pp.83-97, 1955. ,
Selecting Classification Algorithms with Active Testing, Machine Learning and Data Mining in Pattern Recognition, pp.117-131, 2012. ,
DOI : 10.1007/978-3-642-31537-4_10
URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.416.9998
A set of complexity measures designed for applying meta-learning to instance selection, IEEE Transactions on Knowledge and Data Engineering, vol.27, issue.2, pp.354-367, 2015. ,
Introduction to gaussian processes, NATO ASI Series F Computer and Systems Sciences, vol.168, pp.133-166, 1998. ,
A general framework for estimating similarity of datasets and decision trees: exploring semantic similarity of decision trees, SDM, pp.810-821, 2008. ,
DOI : 10.1137/1.9781611972788.73
Feature selection based on mutual information criteria of max-dependency, max-relevance, and min-redundancy, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.27, issue.8, pp.1226-1238, 2005. ,
Decision tree-based data characterization for meta-learning. IDDM-2002, p.111, 2002. ,
Tell me who can learn you and I can tell you who you are: Landmarking various learning algorithms, Proceedings of the 17th International Conference on Machine Learning, pp.743-750, 2000. ,
Toward effective support for data mining using intelligent discovery assistance, 2013. ,
Computational Intelligence Methods in Metalearning, 2016. ,
Table for Estimating the Goodness of Fit of Empirical Distributions, The Annals of Mathematical Statistics, vol.19, issue.2, pp.279-281, 1948. ,
DOI : 10.1214/aoms/1177730256
A tutorial on support vector regression, Statistics and computing, vol.14, issue.3, pp.199-222, 2004. ,
Pairwise meta-rules for better meta-learning-based algorithm ranking, Machine Learning, vol.1, issue.1, pp.141-161, 2013. ,
DOI : 10.1007/s10994-013-5387-y
URL : http://researchcommons.waikato.ac.nz/bitstream/10289/7823/1/qs-mlj13.pdf
Full model selection in the space of data mining operators, Proceedings of the fourteenth international conference on Genetic and evolutionary computation conference companion, GECCO Companion '12, pp.1503-1504, 2012. ,
DOI : 10.1145/2330784.2331014
Report on the experiments with feature selection in meta-level learning, Decision support, meta-learning and ILP: forum for practical problem presentation and prospective solutions, Proceedings of the PKDD-00 workshop on data mining, pp.27-39, 2000. ,
An Efficient Algorithm for Enumerating Closed Patterns in Transaction Databases, Discovery science, pp.16-31, 2004. ,
DOI : 10.1007/978-3-540-30214-8_2
Experiment databases, Machine Learning, pp.127-158, 2012. ,
DOI : 10.1007/978-1-4419-7738-0_14
A perspective view and survey of metalearning, Artificial Intelligence Review, vol.18, issue.2, pp.77-95, 2002. ,
DOI : 10.1023/A:1019956318069
Theory and Algorithm for Learning with Dissimilarity Functions, Neural Computation, vol.8, issue.5, pp.1459-1484, 2009. ,
DOI : 10.1145/954339.954342
Learning data set similarities for hyperparameter optimization initializations, MetaSel@ PKDD/ECML, pp.15-26, 2015. ,
DOI : 10.1109/dsaa.2015.7344817
The lack of a priori distinctions between learning algorithms, Neural computation, vol.8, issue.7, pp.1341-1390, 1996. ,
No free lunch theorems for optimization, IEEE Transactions on Evolutionary Computation, vol.1, issue.1, pp.67-82, 1997. ,
Slurm: Simple Linux Utility for Resource Management, Job Scheduling Strategies for Parallel Processing, pp.44-60, 2003. ,
Automating Knowledge Discovery Workflow Composition Through Ontology-Based Planning, IEEE Transactions on Automation Science and Engineering, vol.8, issue.2, pp.253-264, 2011. ,
DOI : 10.1109/TASE.2010.2070838