Ponzi, Adam; Suzuki, Keisuke; (2025) Multisensory integration in chaotic networks. Neural Networks, 191, Article 107766. 10.1016/j.neunet.2025.107766.
Text: Ponzi_papaug10nc_sept.pdf - Accepted Version. Access restricted to UCL open access staff until 13 January 2026. Download (4MB).
Abstract
Empirical studies of multisensory spatial perception have uncovered a puzzling array of findings. Illusions such as the rubber-hand and ventriloquism effects demonstrate that simultaneous but spatially separated multisensory stimuli are combined into a single unified percept, but only if they are not too far apart. Intriguingly, the perception of unity fluctuates strongly across apparently identical trials. Spatial localization belief also shows strong fluctuations across identical trials, which increase with true spatial disparity and are larger when beliefs are segregated. These fluctuations are much larger than can be accounted for by external sensory noise sources and also depend strongly on the sequence of preceding stimuli. Here we present a very general and minimal deterministic firing rate network model to explore how fluctuations in spatial localization belief - and the perception of whether these beliefs arise from a single cause - are influenced by the chaotic dynamics of a multisensory brain network. Our study examines the conditions under which these endogenous fluctuations emerge and how they contribute to the unified or segregated nature of perceptual experiences. Crucially, we find that multiple empirical effects observed in multisensory integration arise naturally when the network operates at the edge of chaos. We propose a new neuronal mechanism that estimates the probability of perceiving a unified cause, which reflects the extent of network chaos. Additionally, we investigate the effects of varying visual reliability through visual blur and demonstrate that increasing visual blur enhances network chaos, thereby influencing the stability of unified and segregated perceptual states. Ultimately, we calculate explicit proprioceptive and visual beliefs by integrating the emergent internal spatial belief, the unity report probability, and sensory inputs, consistent with Bayesian Causal Inference.
The model reproduces a large set of experimental findings, including negative bias in the less reliable sensory modality, increasing fluctuations at low disparity in segregated percepts, and the dependence of belief fluctuations on the sequence of previous stimuli. It makes several novel predictions and provides insights into the role of intrinsic network dynamics in shaping multisensory perception.
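The abstract's central ingredient, a minimal deterministic firing-rate network tuned to the edge of chaos, can be illustrated with a standard random recurrent rate model of the Sompolinsky-Crisanti-Sommers type, in which a single gain parameter moves the dynamics from decay to sustained chaotic activity. This is a generic sketch, not the authors' model: the network size, coupling statistics, gain values, and integration scheme are all illustrative assumptions.

```python
import numpy as np

# Generic random firing-rate network (not the paper's model):
#   tau dx/dt = -x + g * J @ tanh(x)
# For coupling variance 1/N, activity decays for gain g < 1 and becomes
# chaotic for g > 1; g ~ 1 is the "edge of chaos" regime the abstract invokes.
rng = np.random.default_rng(0)
N = 200                                              # network size (assumed)
J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random coupling matrix

def simulate(g, x0, steps=2000, dt=0.05, tau=1.0):
    """Euler-integrate the rate dynamics and return the final state."""
    x = x0.copy()
    for _ in range(steps):
        x += (dt / tau) * (-x + g * J @ np.tanh(x))
    return x

x0 = rng.normal(0.0, 0.5, size=N)

# Subcritical gain: activity relaxes toward the quiescent fixed point x = 0.
x_sub = simulate(g=0.5, x0=x0)

# Supercritical gain: activity is self-sustained; a tiny perturbation of the
# initial condition lets one probe sensitivity to initial conditions.
x_chaotic = simulate(g=1.5, x0=x0)
x_chaotic_pert = simulate(g=1.5, x0=x0 + 1e-6 * rng.normal(size=N))

print(np.linalg.norm(x_sub))                       # near zero: activity dies out
print(np.linalg.norm(x_chaotic))                   # order one per unit: sustained
print(np.linalg.norm(x_chaotic - x_chaotic_pert))  # nonzero if trajectories diverge
```

In the paper's setting the interesting regime is the boundary between these two behaviours, where deterministic dynamics generate the large trial-to-trial fluctuations that the abstract attributes to intrinsic network chaos rather than external noise.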