A probabilistic study of neural complexity
Abstract
G. Edelman, O. Sporns, and G. Tononi introduced, in theoretical biology, the neural complexity of a family of random variables, defining it as a specific average of mutual information over subsystems. We show that their choice of weights satisfies two natural properties: exchangeability and additivity. This paper classifies all functionals satisfying these two properties (which we call intricacies) in terms of probability laws on the unit interval, and studies the growth rate of maximal intricacies as the size of the system goes to infinity. For systems of fixed size, we show that the maximizers are non-unique and that the maximal value is not approached by exchangeable laws.
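To fix ideas, the Edelman–Sporns–Tononi functional is often written as the following weighted sum of mutual informations; the notation below is an illustrative sketch (the exact normalization may differ from the paper's conventions):

```latex
% Neural complexity of a system X = (X_1, ..., X_N); a hedged sketch.
% MI(X_S ; X_{S^c}) denotes the mutual information between the
% subsystem indexed by S and its complement. The weights average
% uniformly over subset sizes, then uniformly over subsets of each size.
\[
  \mathcal{C}(X)
  \;=\; \sum_{\emptyset \neq S \subsetneq \{1,\dots,N\}}
        \frac{1}{(N+1)\binom{N}{|S|}}
        \,\operatorname{MI}\!\bigl(X_S \,;\, X_{S^c}\bigr).
\]
% A general intricacy, as classified in the paper, replaces these
% particular weights by any family c_S that is exchangeable
% (invariant under permutations of the indices) and additive
% over independent subsystems.
```

The specific weight $1/\bigl((N+1)\binom{N}{|S|}\bigr)$ is one common way to express the uniform averaging over subsystem sizes described in the abstract.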