A probabilistic study of neural complexity

Abstract: G. Edelman, O. Sporns, and G. Tononi have introduced the neural complexity of a family of random variables, defining it as a specific average of mutual information over subfamilies. We show that their choice of weights satisfies two natural properties, namely exchangeability and additivity, and we call any functional satisfying these two properties an intricacy. We classify all intricacies in terms of probability laws on the unit interval and study the growth rate of maximal intricacies when the size of the system goes to infinity. For systems of a fixed size, we show that maximizers have small support and exchangeable systems have small intricacy. In particular, maximizing intricacy leads to spontaneous symmetry breaking and failure of uniqueness.
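The "average of mutual information over subfamilies" mentioned in the abstract can be made concrete with a small numerical sketch. The Python snippet below is not taken from the paper: the function names are ours, and the default weight choice c_S = 1 / ((n+1) * binom(n, |S|)) is an assumption meant to mimic the Edelman-Sporns-Tononi normalisation; any exchangeable system of weights would fit the same mould.

from itertools import combinations
from math import comb
import numpy as np

def entropy(p):
    # Shannon entropy (in bits) of a probability array; zero entries are ignored.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def marginal(joint, axes_to_keep, n):
    # Marginal law of the subfamily indexed by axes_to_keep.
    axes_to_sum = tuple(i for i in range(n) if i not in axes_to_keep)
    return joint.sum(axis=axes_to_sum)

def mutual_information(joint, S, n):
    # I(X_S ; X_{S^c}) = H(X_S) + H(X_{S^c}) - H(X).
    Sc = tuple(i for i in range(n) if i not in S)
    return (entropy(marginal(joint, S, n))
            + entropy(marginal(joint, Sc, n))
            - entropy(joint))

def intricacy(joint, weights=None):
    # Weighted sum of I(X_S ; X_{S^c}) over proper non-empty subfamilies S.
    # Default weights c_S = 1 / ((n + 1) * binom(n, |S|)) are an assumption,
    # intended to reflect the Edelman-Sporns-Tononi choice discussed in the abstract.
    n = joint.ndim
    if weights is None:
        weights = lambda S, n: 1.0 / ((n + 1) * comb(n, len(S)))
    total = 0.0
    for k in range(1, n):
        for S in combinations(range(n), k):
            total += weights(S, n) * mutual_information(joint, S, n)
    return total

# Toy system: X1, X2 independent fair coins and X3 = X1 XOR X2.
# Every pair is independent, yet the triple is strongly dependent,
# so the weighted average of mutual informations is strictly positive.
joint = np.zeros((2, 2, 2))
for x1 in (0, 1):
    for x2 in (0, 1):
        joint[x1, x2, x1 ^ x2] = 0.25
print(intricacy(joint))

Here the joint law is stored as an n-dimensional array indexed by the values of the variables; exchangeability of the weights corresponds to weights(S, n) depending only on |S|.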
Document type: Preprints, Working Papers, ... (minor edits, 2009)

https://hal.archives-ouvertes.fr/hal-00409143
Contributor: Jerome Buzzi
Submitted on: Friday, December 18, 2009 - 10:19:03 PM
Last modification on: Wednesday, January 4, 2017 - 4:19:48 PM
Archived on: Thursday, September 23, 2010 - 6:14:55 PM

Files

intricacy1-revised-arxiv-2009-...
Files produced by the author(s)

Identifiers

  • HAL Id: hal-00409143, version 3
  • arXiv: 0908.1006

Citation

Jerome Buzzi, Lorenzo Zambotti. A probabilistic study of neural complexity. Preprint, 2009. <hal-00409143v3>

Metrics

  • Record views: 309
  • Document downloads: 49