Noninformative vision improves the spatial resolution of touch in humans.
pp. 1188-1191.
Research on sensory perception now often considers more than one sense at a time. This approach reflects real-world situations, such as when a visible object touches us. Indeed, vision and touch show great interdependence: the sight of a body part can reduce tactile target detection times, visual and tactile attentional systems are spatially linked, and the texture of surfaces that are actively touched with the fingertips is perceived using both vision and touch. However, these previous findings might be mediated by spatial attention [1, 2] or by improved guidance of movement via visually enhanced body position sense [4-6]. Here, we investigate the direct effects of viewing the body on passive touch. We measured tactile two-point discrimination thresholds on the forearm while manipulating the visibility of the arm but holding gaze direction constant. The spatial resolution of touch was better when the arm was visible than when it was not. Tactile performance was further improved when the view of the arm was magnified. In contrast, performance was not improved by viewing a neutral object at the arm's location, ruling out improved spatial orienting as a possible account. Controls confirmed that no information about the tactile stimulation was provided by visibility of the arm. This visual enhancement of touch may point to online reorganization of tactile receptive fields.
Title: Noninformative vision improves the spatial resolution of touch in humans
Keywords: CROSS-MODAL LINKS, RECEPTIVE-FIELDS, TOOL USE, ATTENTION, PROPRIOCEPTION, MONKEYS, HAND, PERCEPTION, NEURONS, FINGER
UCL classification: UCL > School of Life and Medical Sciences > Faculty of Brain Sciences > Psychology and Language Sciences (Division of) > Institute of Cognitive Neuroscience; UCL > School of Life and Medical Sciences > Faculty of Life Sciences