
Predify: Augmenting deep neural networks with brain-inspired predictive coding dynamics

Abstract: Deep neural networks excel at image classification, but their performance is far less robust to input perturbations than human perception. In this work we explore whether this shortcoming may be partly addressed by incorporating brain-inspired recurrent dynamics in deep convolutional networks. We take inspiration from a popular framework in neuroscience: 'predictive coding'. At each layer of the hierarchical model, generative feedback 'predicts' (i.e., reconstructs) the pattern of activity in the previous layer. The reconstruction errors are used to iteratively update the network's representations across timesteps, and to optimize the network's feedback weights over a natural image dataset, a form of unsupervised training. We show that implementing this strategy in two popular networks, VGG16 and EfficientNetB0, improves their robustness against various corruptions. We hypothesize that other feedforward networks could similarly benefit from the proposed framework. To promote research in this direction, we provide an open-source PyTorch-based package called Predify, which can be used to implement and investigate the impact of predictive coding dynamics in any convolutional neural network.
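The update rule described in the abstract — feedback reconstructs the layer below, and the reconstruction error iteratively corrects the representation — can be illustrated with a minimal two-layer sketch. This is a hedged toy illustration, not the Predify implementation: the weight matrices, the gain parameters (`beta`, `alpha`), and the exact mixing of feedforward drive, memory, and error correction are simplifying assumptions, and plain NumPy stands in for PyTorch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for a two-layer hierarchy (hypothetical, for illustration only)
d_in, d_hid = 8, 4
W_ff = rng.standard_normal((d_hid, d_in)) * 0.1   # feedforward weights
W_fb = rng.standard_normal((d_in, d_hid)) * 0.1   # generative feedback weights

x = rng.standard_normal(d_in)   # activity pattern of the previous layer
r = W_ff @ x                    # initial feedforward representation

# Gains for feedforward drive, memory, and error correction (values assumed)
beta, alpha = 0.4, 0.05

for t in range(20):
    pred = W_fb @ r             # feedback 'predicts' (reconstructs) the layer below
    err = x - pred              # reconstruction error
    # Iteratively update the representation across timesteps: mix a fresh
    # feedforward drive with the current state, plus an error-correction term
    # that descends the reconstruction error.
    r = beta * (W_ff @ x) + (1 - beta) * r + alpha * (W_fb.T @ err)

recon_err = np.linalg.norm(x - W_fb @ r)
```

In the full model the same error signal would also drive unsupervised learning of the feedback weights `W_fb` (e.g., by gradient descent on the reconstruction loss over natural images), and every hidden layer would receive an additional feedback prediction from the layer above.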
Document type :
Conference papers
Contributor: Rufin Vanrullen
Submitted on: Saturday, November 27, 2021 - 10:46:27 AM
Last modification on: Tuesday, January 4, 2022 - 6:47:52 AM




  • HAL Id : hal-03452646, version 1
  • arXiv : 2106.02749


Bhavin Choksi, Milad Mozafari, Callum Biggs O'May, Benjamin Ador, Andrea Alamia, et al.. Predify: Augmenting deep neural networks with brain-inspired predictive coding dynamics. 35th Conference on Neural Information Processing Systems (NeurIPS 2021), 2021, online, Canada. ⟨hal-03452646⟩


