Multimodal neural networks better explain multivoxel patterns in the hippocampus

Abstract: The human hippocampus possesses "concept cells", neurons that fire when presented with stimuli belonging to a specific concept, regardless of the modality. Recently, similar concept cells were discovered in a multimodal network called CLIP [1]. Here, we ask whether CLIP can explain the fMRI activity of the human hippocampus better than a purely visual (or linguistic) model. We extend our analysis to a range of publicly available unimodal and multimodal models. We demonstrate that "multimodality" stands out as a key component when assessing the ability of a network to explain multivoxel activity in the hippocampus.
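The abstract does not spell out how model representations were scored against the fMRI data. A common approach for this kind of question is representational similarity analysis (RSA): build a representational dissimilarity matrix (RDM) from a model's stimulus embeddings, build another from the hippocampal multivoxel patterns for the same stimuli, and correlate the two. The Python sketch below is a minimal, hypothetical illustration under that assumption; the function names, the placeholder random data, and the choice of correlation-distance RDMs compared with Spearman's rho are ours, not the authors' documented pipeline.

import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(features):
    # Condensed representational dissimilarity matrix:
    # features is (n_stimuli, n_dims); one correlation distance per stimulus pair.
    return pdist(features, metric="correlation")

def rsa_score(model_embeddings, voxel_patterns):
    # Spearman correlation between the model RDM and the fMRI RDM.
    # A higher rho means the model's representational geometry better
    # matches the hippocampal multivoxel geometry.
    rho, _ = spearmanr(rdm(model_embeddings), rdm(voxel_patterns))
    return rho

# Hypothetical usage: compare a multimodal model's embeddings against a
# purely visual baseline on the same stimulus set (random placeholders here).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    voxels = rng.normal(size=(50, 300))         # 50 stimuli x 300 voxels
    clip_feats = rng.normal(size=(50, 512))     # placeholder CLIP-like embeddings
    visual_feats = rng.normal(size=(50, 2048))  # placeholder visual-only embeddings
    print("multimodal rho:", rsa_score(clip_feats, voxels))
    print("visual rho:   ", rsa_score(visual_feats, voxels))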
Document type:
Conference papers

https://hal.archives-ouvertes.fr/hal-03428635
Contributor: Leila Reddy
Submitted on: Monday, November 15, 2021 - 11:15:31 AM
Last modification on: Thursday, September 1, 2022 - 4:02:34 AM
Long-term archiving on: Wednesday, February 16, 2022 - 8:02:06 PM

File

Bhavin_concept_cells_paper.pdf
Files produced by the author(s)

Identifiers

  • HAL Id: hal-03428635, version 1

Citation

Bhavin Choksi, Milad Mozafari, Rufin Vanrullen, Leila Reddy. Multimodal neural networks better explain multivoxel patterns in the hippocampus. Neural Information Processing Systems (NeurIPS) conference: 3rd Workshop on Shared Visual Representations in Human and Machine Intelligence (SVRHM 2021), Dec 2021, Virtual Conference, United States. ⟨hal-03428635⟩
