TAG Parsing with Neural Networks and Vector Representations of Supertags
Abstract
We present supertagging-based models for Tree Adjoining Grammar parsing that use neural network architectures and dense vector representations of supertags (elementary trees) to achieve state-of-the-art performance in unlabeled and labeled attachment scores. The shift-reduce parsing model eschews lexical information entirely, using only the 1-best supertags to parse a sentence, providing further support for the claim that supertagging is "almost parsing." We demonstrate that the embedding vector representations the parser induces for supertags possess linguistically interpretable structure, supporting analogies between grammatical structures like those familiar from recent work in distributional semantics. This dense representation of supertags overcomes the drawbacks for statistical models of TAG as compared to CCG parsing, raising the possibility that TAG is a viable alternative for NLP tasks that require the assignment of richer structural descriptions to sentences.
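The analogy claim above can be made concrete with the standard vector-offset test from distributional semantics: given supertag embeddings, one asks whether the vector b − a + c lands nearest to the embedding of the tree d that completes the grammatical analogy a : b :: c : d. The sketch below is a minimal, hypothetical setup; the supertag names and toy 2-d vectors are illustrative assumptions, not the paper's actual embeddings.

```python
import numpy as np

def nearest_analogy(vocab, a, b, c):
    """Return the supertag t (excluding a, b, c) whose embedding has the
    highest cosine similarity to vocab[b] - vocab[a] + vocab[c].
    This is the vector-offset analogy test; vocab maps names to vectors."""
    target = vocab[b] - vocab[a] + vocab[c]
    target = target / np.linalg.norm(target)
    best, best_sim = None, -1.0
    for name, vec in vocab.items():
        if name in (a, b, c):
            continue
        sim = float(target @ (vec / np.linalg.norm(vec)))
        if sim > best_sim:
            best, best_sim = name, sim
    return best

# Toy embeddings (hypothetical supertag names) arranged so that the
# active/passive offset is shared across intransitive and transitive trees.
vocab = {
    "t_V_active":       np.array([1.0, 0.0]),
    "t_V_passive":      np.array([1.0, 1.0]),
    "t_Vtrans_active":  np.array([2.0, 0.0]),
    "t_Vtrans_passive": np.array([2.0, 1.0]),
}
result = nearest_analogy(vocab, "t_V_active", "t_V_passive", "t_Vtrans_active")
# result is "t_Vtrans_passive": active : passive :: trans_active : trans_passive
```

If the learned supertag embeddings encode grammatical transformations as consistent offsets, such analogy queries succeed well above chance; this is the sense in which the induced representations are "linguistically interpretable."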
Domains
Text and document processing
Origin: Publisher files authorized on an open archive