Conclusion

In this article, we presented a tripartite neural model for learning document representations jointly with the words and concepts drawn from an external semantic resource. In addition, the training is constrained by the relations established in this resource, in order to ensure the interpretability of the resulting representations and, beyond that, to reduce the semantic gap in IR. Our experiments showed the positive impact of using concepts and relations for word similarity, document similarity, and IR tasks. However, we also observed that the proposed model is sensitive to the quality of the conceptual annotation. An interesting direction for future work would be to integrate into the learning process an approximate alignment between each word and its candidate concepts (see Section 5.1) and, consequently, to increase the effectiveness of matching documents against the query.
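The exact objective function of the model is not restated in this conclusion. Purely as an illustrative sketch, and assuming the common pattern of combining prediction objectives with a constraint-based regularizer derived from the resource (the notation below is hypothetical, not the authors'), a joint objective of this kind can be pictured as a document-word prediction term, a word-concept prediction term, and a relational penalty:

\[
\mathcal{L} \;=\; \sum_{(d,w)} -\log p(w \mid d) \;+\; \sum_{(w,c)} -\log p(c \mid w) \;+\; \lambda \sum_{(c,c') \in \mathcal{R}} \lVert \mathbf{v}_c - \mathbf{v}_{c'} \rVert^2 ,
\]

where $d$, $w$ and $c$ range over documents, words and concepts, $\mathcal{R}$ is the set of concept pairs related in the external semantic resource, and $\lambda$ weights the relational constraint that keeps the learned concept vectors consistent with that resource.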
