Conference papers

Extracting Context-Free Grammars from Recurrent Neural Networks using Tree-Automata Learning and A* Search

Abstract: This paper (i) presents an active learning algorithm for visibly pushdown grammars and (ii) shows its applicability to learning surrogate models of recurrent neural networks (RNNs) trained on context-free languages. Such surrogate models may be used for verification or explainability. Our learning algorithm exploits the proximity of visibly pushdown languages and regular tree languages and builds on an existing learning algorithm for regular tree languages. Equivalence tests between a given RNN and a hypothesis grammar rely on a mixture of A* search and random sampling. An evaluation of our approach on a set of RNNs from the literature shows good preliminary results.
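To make the equivalence-testing idea from the abstract concrete, the sketch below looks for a word on which an RNN and a hypothesis grammar disagree, first by random sampling and then by a best-first (A*-style) exploration of words. This is a minimal, hypothetical illustration only: rnn_accepts, grammar_accepts, the toy visibly pushdown alphabet, and the length-based priority are placeholders standing in for the paper's RNN queries, grammar membership test, and search heuristic, not the authors' implementation.

```python
# Hypothetical sketch of counterexample search between an RNN and a hypothesis
# grammar: random sampling plus a best-first (A*-style) search over words.
import heapq
import random

CALLS, RETURNS, INTERNALS = ["("], [")"], ["a"]   # toy visibly pushdown alphabet
ALPHABET = CALLS + RETURNS + INTERNALS

def grammar_accepts(word):
    # Placeholder hypothesis grammar: well-nested words over ( ) with optional 'a's.
    depth = 0
    for c in word:
        if c in CALLS:
            depth += 1
        elif c in RETURNS:
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

def rnn_accepts(word):
    # Placeholder for querying the trained RNN; here it deliberately disagrees
    # with the hypothesis on words containing "aa" so the search can succeed.
    return grammar_accepts(word) and "aa" not in "".join(word)

def random_counterexample(samples=10_000, max_len=12):
    # Phase 1: random sampling of words up to a bounded length.
    for _ in range(samples):
        word = [random.choice(ALPHABET) for _ in range(random.randint(0, max_len))]
        if rnn_accepts(word) != grammar_accepts(word):
            return word
    return None

def astar_counterexample(max_len=12):
    # Phase 2: best-first search over words; the priority is just the word
    # length here, standing in for a heuristic derived from the RNN's scores.
    frontier = [(0, [])]
    while frontier:
        cost, word = heapq.heappop(frontier)
        if rnn_accepts(word) != grammar_accepts(word):
            return word
        if len(word) < max_len:
            for c in ALPHABET:
                heapq.heappush(frontier, (cost + 1, word + [c]))
    return None

if __name__ == "__main__":
    cex = random_counterexample() or astar_counterexample()
    print("counterexample:", "".join(cex) if cex else None)
```

If the search returns a counterexample, the learner would use it to refine the hypothesis grammar; if no disagreement is found within the budget, the hypothesis is accepted as a surrogate model of the RNN.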
Metadata

https://hal.archives-ouvertes.fr/hal-03285433
Contributor: Benedikt Bollig
Submitted on: Tuesday, August 24, 2021 - 1:23:51 PM
Last modification on: Friday, August 5, 2022 - 2:58:08 PM
Long-term archiving on: Friday, November 26, 2021 - 9:24:13 AM

File

icgi-2020_21.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03285433, version 1

Citation

Benoît Barbot, Benedikt Bollig, Alain Finkel, Serge Haddad, Igor Khmelnitsky, et al. Extracting Context-Free Grammars from Recurrent Neural Networks using Tree-Automata Learning and A* Search. ICGI 2021 - 15th International Conference on Grammatical Inference, Aug 2021, New York City / Virtual, United States. ⟨hal-03285433⟩

Metrics

Record views: 115
File downloads: 59