Multisensory representation of self-motion in the human brain
Format: Doctoral Thesis
Language: German
Published: Philipps-Universität Marburg, 2011
Online Access: PDF Full Text
Abstract:
While moving through our environment we receive visual, auditory, proprioceptive,
vestibular and sometimes tactile information about the position, velocity and acceleration
of our body. Only a successful integration of these signals allows for a
coherent perception of self-motion. Information from all modalities together provides
the most reliable representation. However, previous studies demonstrated that
one can use purely visual, vestibular or proprioceptive signals for distance estimation.
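For illustration, the reliability advantage of combining modalities follows from the standard maximum-likelihood model of cue integration, in which independent Gaussian cues are averaged with weights proportional to their inverse variances. The following minimal sketch assumes this model and arbitrary example values; neither is taken from the thesis itself.

```python
import numpy as np

# Reliability-weighted fusion of independent Gaussian cues (standard
# maximum-likelihood cue integration; an illustrative assumption, not a
# model described in this abstract).

def integrate(estimates, variances):
    """Fuse independent cues, weighting each by its inverse variance."""
    w = 1.0 / np.asarray(variances, dtype=float)
    combined = np.sum(w * np.asarray(estimates, dtype=float)) / np.sum(w)
    combined_var = 1.0 / np.sum(w)  # never larger than any single-cue variance
    return combined, combined_var

# Hypothetical visual and vestibular distance estimates (arbitrary units).
est, var = integrate(estimates=[10.0, 12.0], variances=[1.0, 4.0])
print(f"fused estimate: {est:.2f}, fused variance: {var:.2f}")  # 10.40, 0.80
```

The fused variance (0.80) is below that of the best single cue (1.0), which is why the combined multimodal representation is the most reliable one.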
The aim of my thesis was to analyse the role of auditory signals for self-motion
perception and to determine which brain areas process audio-visual self-motion signals.
For this purpose I carried out psychophysical tests and recorded brain activity
using functional magnetic resonance imaging (fMRI).
In my first study I investigated whether auditory self-motion information can
be used to estimate and reproduce the distances of forward movements. Participants
were presented with a visually simulated forward motion across a ground
plane (passive displacement). The frequency of an associated auditory stimulus was
proportional to the simulated speed. Subjects had to reproduce the distance of the
displacement with a joystick (active displacement). During the active displacements
they received either audio-visual, purely visual or purely auditory motion signals. I
found that reproduction was most precise when the participants only heard the tone
while it was least precise when they only saw the ground plane. In a subsequent
experiment, the relationship between optical velocity and tone frequency was
re-scaled in some trials during the active displacements, i.e., the tone frequency
was either higher or lower than during the passive displacements (catch trials).
I found that the re-scaling affected the subjects' performance: when the frequency
was lower, subjects used higher speeds, resulting in a substantial overshoot of the
travelled distance, whereas a higher frequency resulted in an undershoot. I conclude
that during self-motion tone frequency can be used as a velocity cue and helps
to estimate and reproduce the travelled distance.
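The logic of the catch trials can be made concrete with a small numerical sketch. Only the proportional mapping between tone frequency and speed and the direction of the over- and undershoot come from the study; the parameter values and the assumed frequency-matching strategy below are illustrative assumptions.

```python
import numpy as np

DT = 0.01        # simulation time step in seconds (assumed value)
K_FREQ = 100.0   # tone frequency per unit speed, f = K_FREQ * v (assumed value)

def passive_displacement(speed=2.0, duration=4.0):
    """Passive forward motion with a speed-proportional tone."""
    t = np.arange(0.0, duration, DT)
    v = np.full_like(t, speed)
    distance = np.sum(v) * DT           # travelled distance
    tone_freq = K_FREQ * v              # frequency proportional to speed
    return distance, tone_freq

def reproduce_by_tone(target_freq, gain=1.0, duration=4.0):
    """Hypothetical subject who reproduces the remembered tone frequency.

    In catch trials the mapping is re-scaled: f_active = gain * K_FREQ * v.
    Matching the remembered frequency then yields v = f_target / (gain * K_FREQ).
    """
    v = target_freq.mean() / (gain * K_FREQ)
    return v * duration

passive_dist, freq = passive_displacement()
for gain in (1.0, 0.5, 2.0):  # normal trial, lower tone, higher tone
    active_dist = reproduce_by_tone(freq, gain=gain)
    print(f"gain={gain}: reproduced {active_dist:.1f} vs. target {passive_dist:.1f}")
# gain < 1 (lower tone) -> higher speed -> overshoot; gain > 1 -> undershoot,
# matching the behavioural pattern reported above.
```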
During self-motion the image of the environment shifts across the retina. This
image motion, called optic flow, provides us with information about the direction
and velocity of the displacement. It induces reflexive, compensatory eye movements
which stabilize part of the image on the retina. In my second study I observed that
a simulated forward motion across a ground plane (as used in Study I) induces such
reflexive eye movements. They are composed of slow (following) and fast (resetting)
phases. I found that subjects controlled the speed of the slow eye movements more
precisely when they themselves set the driving speed with a joystick. Presumably, the proprioceptive
feedback from the joystick facilitated eye movement control. Moreover,
I found that participants also moved their eyes in the direction of the ground plane
motion when they did not see the plane but only received auditory velocity cues.
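One common way to separate such slow and fast nystagmus phases in an eye-position trace is simple velocity thresholding. The sketch below illustrates this generic approach on a toy trace; the sampling rate, threshold and trace are assumptions, not the analysis actually used in the thesis.

```python
import numpy as np

FS = 500.0             # eye-tracker sampling rate in Hz (assumed value)
FAST_THRESHOLD = 40.0  # deg/s; samples above this count as fast phases (assumed)

def split_phases(eye_pos_deg):
    """Return slow-phase mask, fast-phase mask and eye velocity in deg/s."""
    velocity = np.gradient(eye_pos_deg) * FS
    fast = np.abs(velocity) > FAST_THRESHOLD
    return ~fast, fast, velocity

def slow_phase_gain(eye_pos_deg, stimulus_speed):
    """Mean slow-phase eye speed relative to stimulus speed (tracking gain)."""
    slow, _, velocity = split_phases(eye_pos_deg)
    return np.mean(velocity[slow]) / stimulus_speed

# Toy trace: tracking at 10 deg/s with a resetting fast phase every 0.25 s.
t = np.arange(0.0, 1.0, 1.0 / FS)
pos = 10.0 * t - 2.5 * np.floor(t / 0.25)   # sawtooth-like nystagmus pattern
slow, fast, _ = split_phases(pos)
print(f"tracking gain ~ {slow_phase_gain(pos, 10.0):.2f}, "
      f"fast-phase samples: {fast.sum()}")
```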
In a third study I investigated by means of fMRI which brain regions are involved
in the processing of audio-visual self-motion signals. Since only spatially and
temporally congruent signals are optimally integrated into a common percept, I
investigated to what extent the congruency of the signals influences brain activity.
The visual stimulus
consisted of an alternately expanding and contracting cloud of random dots
simulating a forward and backward motion. The auditory stimulus was a sinusoidal
tone which simulated a forward and backward motion in the congruent bimodal
condition. In the incongruent bimodal condition the tone simulated a frontoparallel
motion while the visual stimulus simulated a forward and backward motion. The
contrast of bimodal versus unimodal stimulation activated, amongst others, regions
around the precentral sulcus, the superior temporal sulcus as well as the intraparietal
sulcus. Compared to incongruent stimulation the congruent stimulus activated
a part of the precentral sulcus.
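The contrast logic of this study can be sketched as a toy general linear model (GLM) with one regressor per condition. Everything below (design matrix, data, regressor construction) is a placeholder; the abstract does not describe the actual fMRI analysis pipeline.

```python
import numpy as np

# Toy GLM: one regressor per stimulation condition plus a constant term.
conditions = ["visual", "auditory", "bimodal_congruent", "bimodal_incongruent"]

rng = np.random.default_rng(0)
n_scans = 200
X = rng.random((n_scans, len(conditions)))   # stand-in design matrix
X = np.column_stack([X, np.ones(n_scans)])   # constant (baseline) column
y = rng.standard_normal(n_scans)             # stand-in voxel time course

beta, *_ = np.linalg.lstsq(X, y, rcond=None) # per-condition effect estimates

# Bimodal vs. unimodal: mean of the bimodal betas against the mean of the
# unimodal betas (contrast weights sum to zero).
c_bimodal = np.array([-0.5, -0.5, 0.5, 0.5, 0.0])
# Congruent vs. incongruent bimodal stimulation.
c_congruent = np.array([0.0, 0.0, 1.0, -1.0, 0.0])

print("bimodal > unimodal effect:     ", c_bimodal @ beta)
print("congruent > incongruent effect:", c_congruent @ beta)
```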
Taken together, I showed in my thesis that auditory self-motion information plays
an important role in the estimation and reproduction of travelled distances. Audio-visual
self-motion information is processed in a parieto-frontal brain network. Spatially
congruent signals are processed in a brain area which might be a human equivalent
of the polysensory zone (PZ) in the macaque brain.