Neural processing of continuous temporal information: visual and visuomotor systems
Main Author: | |
---|---|
Contributors: | |
Format: | Doctoral Thesis |
Language: | English |
Published: | Philipps-Universität Marburg, 2021 |
Subjects: | |
Online Access: | PDF Full Text |
Summary:

Our visual environment is highly dynamic, comprising changes in different parameters at multiple timescales in parallel. These changes inform us about the identities and movements of external objects, and at the same time depend on our own behavior. For us to efficiently interact with the environment, we therefore rely on visual representations that are as accurate as possible while also distinguishing between different sources of information in the temporal domain.
In my thesis, I studied the processing of continuous temporal information in the primate visual system. To this end, I used behavioral and electrophysiological measurements from human subjects, as well as neural recordings in two different primate animal models. The thesis comprises five experimental studies. In these, I investigated how temporal information is extracted from visual input, and how its neural representation depends on and enables motor behavior. The order of studies broadly follows the visual processing hierarchy, from representations of low-level information at early (studies I & II) and intermediate visual areas (III), to the processing of high-level information in visuomotor control (IV & V).
In the first two studies, I investigated the response of primary visual cortex (area V1) to continuous random luminance sequences. The response properties of neurons in area V1 have long been described by an approximately linear impulse response that is stable in the temporal domain even during continuous stimulation. Yet, recent human EEG studies showed that, when stimulated with broadband luminance sequences, area V1 selects a distinct temporal frequency (alpha, at approx. 10 Hz) from the input and propagates its information across cortical space in the form of a traveling wave (the “perceptual echo”; VanRullen and MacDonald, 2012; Lozano-Soldevilla and VanRullen, 2019). Based on evidence from the human EEG, this non-linear response has been suggested to play a functional role in the active sampling of visual information. However, how it is generated at the neural level had not been investigated, and a description of intracortical responses to the same stimulation in an animal model has been lacking.
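To make the underlying analysis concrete, the sketch below illustrates how such an impulse response (the “perceptual echo”) can be estimated by cross-correlating a broadband random luminance sequence with the EEG, in the spirit of VanRullen and MacDonald (2012). All variable names and parameters (sampling rate, trial length) are illustrative assumptions, not the code used in the thesis.

```python
# Illustrative sketch (not the thesis code): estimating the stimulus-to-EEG
# impulse response by cross-correlation with a broadband luminance sequence.
import numpy as np

fs = 160.0                           # sampling rate in Hz (assumed)
n_samples = int(fs * 6.25)           # one stimulation trial of 6.25 s (assumed)
max_lag = int(fs * 1.0)              # estimate the response up to 1 s of lag

rng = np.random.default_rng(0)
luminance = rng.uniform(-1, 1, n_samples)   # white-noise luminance sequence
eeg = rng.standard_normal(n_samples)        # stand-in for one occipital EEG channel

# z-score both signals so the cross-correlation is scale-free
lum = (luminance - luminance.mean()) / luminance.std()
sig = (eeg - eeg.mean()) / eeg.std()

# cross-correlation at positive lags = estimate of the impulse response ("echo")
echo = np.array([np.dot(lum[:n_samples - k], sig[k:]) / (n_samples - k)
                 for k in range(max_lag)])
lags_ms = np.arange(max_lag) / fs * 1000.0

# On real data, 'echo' is averaged across trials and subjects; the perceptual
# echo then appears as an alpha-band (~10 Hz) ringing lasting several hundred ms.
```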
In the first study, I combined EEG recordings in human observers with two established behavioral paradigms to study whether and how the echo response is modulated by shifts in the cortical excitation/inhibition balance. I showed that the echo response is enhanced when the excitatory gain to area V1 is increased globally, but not when this modulation is limited to the input from one eye. This suggests that the echo is a cortically distributed response associated with excitatory neural states, dissociating it from spontaneous alpha oscillations, which instead depend on inhibitory feedback loops (Hindriks and van Putten, 2013).
In my second study, I investigated the neural responses in primary visual cortex to the same broadband luminance stimulation. Using the marmoset monkey as an animal model, I showed that, unexpectedly, the local representations in area V1 are already highly selective for distinct frequency bands, comprising an alpha-band component that possibly represents a functional homologue of the human echo response. A comparison between responses from the anesthetized and the awake preparation showed that this frequency selectivity is a signature of active visual processing, suggesting top-down modulatory signals as the driving mechanism. Together, my findings from the first two studies show that primary visual cortex actively extracts temporal information from continuous input at distinct frequencies, at the expense of information in other bands. The second study provides first tentative evidence in the search for a neural correlate of the perceptual echo response.
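One simple way to quantify the band-limited selectivity described here is to measure what fraction of the spectral energy of a stimulus-locked response kernel falls into the alpha band. The sketch below is a hypothetical illustration under the same assumptions as above and is not the analysis used in the thesis.

```python
# Illustrative sketch (not the thesis code): fraction of a response kernel's
# spectral power that falls into a given frequency band (e.g., alpha, 8-12 Hz).
import numpy as np

def band_fraction(kernel, fs, band=(8.0, 12.0)):
    """Return the fraction of the kernel's spectral power inside 'band' (Hz)."""
    spectrum = np.abs(np.fft.rfft(kernel)) ** 2
    freqs = np.fft.rfftfreq(len(kernel), d=1 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum()

# A broadband (flat) kernel yields a small fraction, whereas a kernel dominated
# by ~10 Hz ringing, as described for the awake recordings, yields a value near 1.
```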
One of the challenges the visual system faces in maintaining accurate temporal representations is that the incoming visual input changes drastically every time we move our eyes. Previous investigations have identified distinct neural mechanisms that modulate responses around the time of saccadic eye movements to counterbalance this instability. These modulations also affect the temporal domain of visual processing, as demonstrated by changes in neural response latencies (Reppas et al., 2002; Ibbotson et al., 2006) and illusory distortions of perceived time (Yarrow et al., 2001; Morrone et al., 2005; Knöll et al., 2013). Yet, substantial differences between areas as well as diverging perceptual effects have been difficult to reconcile thus far.
To broaden our understanding of the neural signature of perisaccadic modulations of temporal encoding, in the third study I compared temporal information processing between fixation and perisaccadic time windows in the ventral-stream area V4 of macaque monkeys. By analyzing multi-unit responses to random sequential luminance stimulation, I found that representations remained stable in the temporal and spatial domains across saccades. However, the ratio of response amplitudes within a sequence was modulated by a global perisaccadic activity increase, in a way that could account for known perceptual effects. This finding introduces amplitude modulations as a new neural correlate of perisaccadic distortions in the perception of time. The observation of stable representations in space and time contrasts with previously described perisaccadic spatial modulations in area V4 (Tolias et al., 2001). In light of evidence from other studies, we proposed that this disparity may partly reflect differences in processing between sequential and isolated visual stimulation. Taken together, the third study provided new insights into how temporal information processing is modulated by eye movements, adding first findings for area V4 to existing evidence from other areas.
The first three studies all considered temporal information contained in low-level visual features, such as luminance or contrast. Yet, in our natural visual input, changes in these features over time are often also linked to movements of objects or of the environment relative to the observer. Using retinal velocity patterns (optic flow) as a cue, our visual system infers information about the body’s own movement in space from this input. Accordingly, early behavioral studies demonstrated that humans, in stable stance, respond to visually simulated self-motion with compensatory postural adjustments (Lee, 1980). The final two studies in this thesis investigated how these responses are coupled to the temporal structure of self-motion information in the continuous visuomotor control of balance.
The fourth study used measurements from a force platform to characterize postural responses of human subjects to a visual environment, simulated in virtual reality, that oscillated along the anterior-posterior axis. By applying an analysis procedure originally established in human EEG studies, we were able to show that postural responses coupled to the visual stimulus with high consistency. This coupling remained stable even at a high temporal frequency of the motion stimulus (1.2 Hz), suggesting that previous studies reporting selectivity for lower frequencies were biased by biomechanical constraints of the body because they analyzed response amplitude rather than phase.
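In EEG work of this kind, consistency of coupling is typically quantified as inter-trial phase coherence at the stimulation frequency. The sketch below shows one plausible version of such an analysis applied to the anterior-posterior center-of-pressure signal; sampling rate, trial duration, and all names are assumptions for illustration, not the thesis code.

```python
# Illustrative sketch (not the thesis code): inter-trial phase coherence of
# postural sway (center of pressure, CoP) at the frequency of a sinusoidal
# visual motion stimulus, analogous to frequency-tagging analyses in EEG.
import numpy as np

fs = 100.0                  # force-platform sampling rate in Hz (assumed)
f_stim = 1.2                # stimulus oscillation frequency in Hz
trial_dur = 30.0            # trial duration in s (assumed); 1.2 Hz * 30 s = 36 full cycles
n = int(fs * trial_dur)
freqs = np.fft.rfftfreq(n, d=1 / fs)
bin_idx = int(np.argmin(np.abs(freqs - f_stim)))   # FFT bin closest to the stimulus frequency

def phase_at_stimulus(cop_trial):
    """Phase of the anterior-posterior CoP at the stimulus frequency."""
    spectrum = np.fft.rfft(cop_trial - cop_trial.mean())
    return np.angle(spectrum[bin_idx])

def inter_trial_phase_coherence(trials):
    """Resultant-vector length of per-trial phases (0 = random, 1 = perfectly consistent)."""
    phases = np.array([phase_at_stimulus(t) for t in trials])
    return np.abs(np.mean(np.exp(1j * phases)))

# Synthetic example: 20 trials of noisy sway weakly locked to the 1.2 Hz stimulus
rng = np.random.default_rng(1)
t = np.arange(n) / fs
trials = [0.3 * np.sin(2 * np.pi * f_stim * t + 0.2) + rng.standard_normal(n)
          for _ in range(20)]
print(inter_trial_phase_coherence(trials))
```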
The fifth study extended this paradigm with additional measurements from video-based full-body tracking in order to determine how the different body joints contribute to the observed postural responses. The findings revealed that the body’s balance control strategy depends greatly on the temporal frequency of the visual input. For higher frequencies, coupling was increasingly centered on the hip joint, matching model predictions from previous studies.
Lastly, following up on the results from these two studies, we established a novel stimulation paradigm designed to provide a generalized mapping of visually evoked postural responses in the temporal domain. Applying the same approach as in the first two studies of this thesis, we used responses to broadband random self-motion sequences to investigate how the coupling behavior depends on temporal frequency. The results revealed highly idiosyncratic response profiles for different subjects, possibly reflecting interindividual differences in visuomotor abilities as well as anthropometric factors. Surprisingly, the frequency distribution of coupling also extended toward very high frequencies (> 2 Hz). These components likely reflect a third, reflexive strategy that may be controlled by different motor pathways.
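A frequency-resolved mapping of this kind could, for instance, be obtained by computing the coherence spectrum between the broadband self-motion stimulus and the sway response. The sketch below uses synthetic data and assumed parameters and is not the analysis code of the thesis.

```python
# Illustrative sketch (not the thesis code): coherence spectrum between a
# broadband random self-motion sequence and the resulting postural sway.
import numpy as np
from scipy.signal import coherence

fs = 100.0                          # sampling rate in Hz (assumed)
duration = 120.0                    # seconds of continuous stimulation (assumed)
n = int(fs * duration)
rng = np.random.default_rng(2)

stimulus = rng.standard_normal(n)   # broadband random self-motion velocity
# stand-in response: low-pass-filtered copy of the stimulus plus independent noise
response = np.convolve(stimulus, np.ones(25) / 25, mode="same") + rng.standard_normal(n)

# magnitude-squared coherence, estimated with 10 s Welch segments
freqs, coh = coherence(stimulus, response, fs=fs, nperseg=int(fs * 10))

# Per-subject coherence profiles across 'freqs' would then reveal idiosyncratic
# frequency distributions of coupling, including components above 2 Hz.
print(freqs[np.argmax(coh)], coh.max())
```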
Taken together, these findings show that the visuomotor system uses the temporal structure of self-motion information for the control of balance during standing, selecting different frequency bands for distinct motor outputs.
In conclusion, with this thesis I have investigated how the visual system extracts and processes temporal information from continuous input. My findings from different subsystems of the visual hierarchy show that neural representations of temporal information are functionally specialized through their selectivity for distinct frequency bands. In the primate animal model, I identified possible neural correlates of this frequency selection and characterized their dynamics during active visual behavior. Future research could use these as measures to functionally distinguish between different sensory systems, as well as between individuals, e.g., for clinical applications.
Physical Description: | 165 Pages |
---|---|
DOI: | 10.17192/z2022.0051 |