Journal article, Neural Networks, 2022

Multimodal neural networks better explain multivoxel patterns in the hippocampus

Abstract

The human hippocampus possesses “concept cells”, neurons that fire when presented with stimuli belonging to a specific concept, regardless of the modality. Recently, similar concept cells were discovered in a multimodal network called CLIP (Radford et al., 2021). Here, we ask whether CLIP can explain the fMRI activity of the human hippocampus better than a purely visual (or linguistic) model. We extend our analysis to a range of publicly available uni- and multi-modal models. We demonstrate that “multimodality” stands out as a key component when assessing the ability of a network to explain the multivoxel activity in the hippocampus.
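The comparison described above is typically operationalized as an encoding model: network embeddings of the stimuli are regressed onto multivoxel responses, and the cross-validated prediction accuracy measures how well the network "explains" the brain activity. The following sketch illustrates that general approach with a ridge encoding model on synthetic data; it is an assumption-laden illustration, not the paper's actual pipeline, and the feature matrix here is random rather than real CLIP embeddings or hippocampal fMRI data.

```python
# Hedged sketch of a cross-validated ridge encoding model, the standard
# way to ask how well a network's stimulus embeddings predict fMRI
# multivoxel patterns. All data below are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

rng = np.random.default_rng(0)

n_stimuli, n_features, n_voxels = 120, 64, 50
X = rng.standard_normal((n_stimuli, n_features))   # model embeddings, one row per stimulus
W = rng.standard_normal((n_features, n_voxels))    # hypothetical feature-to-voxel mapping
Y = X @ W + 0.5 * rng.standard_normal((n_stimuli, n_voxels))  # noisy voxel responses

def encoding_score(X, Y, alpha=10.0, n_splits=5):
    """Mean held-out correlation between predicted and observed voxel responses."""
    fold_scores = []
    for train, test in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(X):
        model = Ridge(alpha=alpha).fit(X[train], Y[train])
        pred = model.predict(X[test])
        # per-voxel Pearson r on held-out stimuli, averaged across voxels
        r = [np.corrcoef(pred[:, v], Y[test][:, v])[0, 1] for v in range(Y.shape[1])]
        fold_scores.append(np.mean(r))
    return float(np.mean(fold_scores))

score = encoding_score(X, Y)
```

In this framing, comparing a multimodal network with a purely visual or linguistic one amounts to fitting the same encoding model with each network's embeddings as `X` and comparing the resulting held-out scores.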

Dates and versions

hal-03859816, version 1 (18-11-2022)

License

Attribution

Identifiers

Cite

Bhavin Choksi, Milad Mozafari, Rufin VanRullen, Leila Reddy. Multimodal neural networks better explain multivoxel patterns in the hippocampus. Neural Networks, 2022, 154, pp.538-542. ⟨10.1016/j.neunet.2022.07.033⟩. ⟨hal-03859816⟩