BCN Annual Report 2002 2003
1 Neurocognition of visual and auditory processing
Our subjective experience suggests that our brain converts the sensory signals from the external
world into the perception of a coherent world almost effortlessly. Considering the overwhelming
complexity of the constantly changing sensory input, this is quite remarkable. The
papers in this chapter cover a broad spectrum of research on perception: sensory receptors,
psychophysics, the perception of emotional stimuli, individual differences in perception, the
development of perception, selective perception (attention), and pathologies of disturbed
(selective) perception.
The sense organs can be conceived as filters, limiting the range of signals that are available for neu-
ral processing. Each animal species lives in its own perceptual world and appears to have evolved fil-
ters that are fitted to its own ecological environment. Stavenga and Vanhoutte investigated the
building blocks (ommatidia) of the eyes of butterflies. They describe a special setup that allows in
vivo epi-illumination microspectrophotometry of the retinal photoreceptors. In this way they were
able to determine the color-processing properties of the nine photoreceptors in the ommatidia.
They found that three different types can be distinguished, each absorbing light maximally in a dif-
ferent wavelength region.
The mammalian hearing organ is remarkably sensitive to sound. Its receptors, the hair cells along
the basilar membrane in the cochlea, transduce the mechanical motion of the cochlear fluids into
an electrical signal, which is then relayed to the brain. From the brain’s point of view, the hair cell
transducer channels serve as “windows” onto the mechanical events in the cochlea, which carry
mainly the sound-evoked information. However, thermal activation impinges on this mechano-
electrical transduction process, imposing a little noise on what the brain sees through the hair cell
windows. Van Netten and Dinklo conclude, on the basis of experimental results and theoretical
modeling, that the physical construction of the transducer apparatus in hair cells of the inner ear is
pivotal in achieving the exquisite sensitivity of mammalian hearing.
Although we perceive visual objects as holistic unities, an important principle of the human visual
system is its feature-specialized organization. That is, distinct visual brain areas are optimized
to process particular attributes of the retinal image (e.g., color or orientation). Hannus et al.
investigated visual search performance (measured by eye-movement accuracy and speed) in tasks in
which targets were defined by a single feature (color or orientation, in different conditions) or by a
combination (conjunction) of color and orientation. Surprisingly, although the subjects performed
equally well in the two single-feature conditions, a different pattern of results emerged in the
conjunction condition: relative to the single-feature conditions, color detection improved while
orientation detection deteriorated. This suggests an interaction between the color- and
orientation-specialized visual systems during conjunction search. The authors propose an
explanation in terms of limited attentional resources.
An important general goal of our sensory systems is to extract those aspects of our environment that
remain stable despite the fact that the same environment may generate vastly varying signals to the
sensory organs. In some circumstances, however, the opposite occurs: our sensation is 'fooled'
into perceiving information differently even though its physical properties are unchanged. This is what
happens in brightness induction, in which the subjective brightness of an object changes when the
brightness of its surroundings changes. Boucard et al. demonstrated with fMRI that the responses of
some areas in early visual cortex correlate with the subjectively perceived brightness of an object
rather than with the actual physical brightness.
Evidence is accumulating that the perception of stimuli carrying emotional information involves
(at least partly) different brain systems than the perception of neutral stimuli. Groen et al. measured
event-related potentials (ERPs) in children who viewed series of pictures of faces in two
conditions. The pictures were identical across conditions, but in one condition the children had to
discriminate the expression of the faces, whereas in the other they had to discriminate the
personal identity of the faces. ERP-effects in the two conditions were dissociated both in time-
course and in scalp-distribution, supporting a modular account of facial processing, in which per-
ception of identity and expression call on different brain mechanisms.
According to LeDoux (1996), emotional perception involves both fast, pre-attentive (subconscious)
processes, depending on a subcortical route of processing, and a slower, attentive (conscious) cortical
route. Heutink et al. showed with ERPs that early cortical visual responses to subconsciously
presented faces are modulated by the emotional valence of the stimuli. Potentially threatening stimuli (i.e.,
angry faces) led to enhanced cortical processing. Stimuli known to be threatening (by classical aver-
sive noise conditioning), on the other hand, resulted in reduced cortical processing, possibly because
these stimuli could be adequately handled at the subcortical level. Interestingly, a subgroup of sub-
jects with high trait-anxiety did not show this subconscious down-regulation of cortical processing.
In albinism the visual pathways cross atypically, in that most fibers from one eye cross to the
contralateral visual cortex. Since albinism does not always manifest itself clearly phenotypically, an
objective tool to diagnose this visual dysfunction would be very desirable for the pediatric ophthal-
mologist. Pott et al. explored the possibility of using visual evoked potentials (VEPs) for diagnosis. If
one eye of an albino is stimulated, the VEP over the ipsilateral hemisphere is delayed and less
prominent than that over the contralateral hemisphere. Pott et al. investigated a quantitative
scoring method for this hemispheric asymmetry. In this method VEPs are recorded over the left and
right visual cortices, separately for stimulation of the left eye and stimulation of the right eye. First,
difference potentials are computed by subtracting the left hemisphere VEP response from the right
hemisphere response, separately for the left and right eye conditions. Secondly, the cross-correlation
coefficient between these two difference potentials is computed. Pott et al. found this to be a prom-
ising objective quantification method.
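The scoring procedure can be sketched in a few lines of Python (a minimal illustration only; the function name, the use of a zero-lag Pearson coefficient as the "cross-correlation coefficient", and the synthetic waveforms in the comments are our assumptions, not details from Pott et al.):

```python
import numpy as np

def vep_asymmetry_score(left_eye_lh, left_eye_rh, right_eye_lh, right_eye_rh):
    """Quantify hemispheric VEP asymmetry.

    Each argument is a 1-D NumPy array holding the averaged VEP recorded
    over the left (lh) or right (rh) visual cortex during stimulation of
    the left or right eye.
    """
    # Step 1: difference potentials -- right-hemisphere response minus
    # left-hemisphere response, separately per stimulated eye.
    diff_left_eye = left_eye_rh - left_eye_lh
    diff_right_eye = right_eye_rh - right_eye_lh
    # Step 2: correlation between the two difference potentials.  The
    # zero-lag Pearson coefficient is used here as a simple stand-in
    # for the cross-correlation coefficient (an assumption of ours).
    return float(np.corrcoef(diff_left_eye, diff_right_eye)[0, 1])
```

With mirror-imaged asymmetries in the two eye conditions, as expected in albinism, the score approaches -1; with similar asymmetries in both conditions it approaches +1.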
Although individual differences in perception remain a relatively unexplored research area, this is a
fascinating topic, and some intriguing studies have been conducted. For instance, plasticity in the
organization of cortical perceptual function has been found as the result of long-term learning expe-
riences. Obviously, major individual differences are related to the gender of subjects. Ruytjens et al.
report remarkable differences between males' and females' brain responses (as measured with fMRI)
while listening to music or to continuous noise. While listening to music, similar brain activations
were found in males and females. In contrast, while listening to noise, brain activations in males
were much reduced (and non-significant) as compared to females, who showed robust activations.
One way in which the brain handles the overwhelming complexity of the sensory array is by giving
processing priority to part of the information. Only information that is relevant given the
organism's current condition and goals is selected for full processing. Obviously, selective attention is an
essential cognitive function. One way to select visual information is by means of eye movements,
i.e., overt spatial attention. Hunnius et al. investigated the development of visual scanning behav-
ior during the first few months of the infant’s life. Visual scanning was measured while the infants
watched video recordings of their own mother and a matched abstract stimulus. As the infants grew
older, they exhibited more and shorter fixations, reaching stable scanning patterns after 14-18 weeks
of age. In contrast to previous research, which found that infants scanned the borders of
photographed faces, Hunnius et al. found that with their dynamic stimulus presentation the infants
predominantly looked at the mother's mouth and eyes. Thus, already at a very young age the
infants had learned to attend to the most meaningful, informative regions of the mother’s face.
In addition to overt spatial attention, we are also able to direct our attention to different parts of the
visual field independent of the direction of our gaze. There is a vast and consistent literature on this
phenomenon of covert spatial attention, and many have argued that spatial selection is the primary
attentional mode in vision. Others have argued that the Gestalt principles by which the visual
scene is segmented into distinct objects also importantly guide selective attention. Van der Helden
et al. used ERPs to study the time-course of selective processing based on space-
based and object-based representations. They found that both selection modes resulted in
attention-related ERP-effects, but with distinct time-courses: effects of space-based selection
occurred first and were followed by object-based effects only later in time.
When information falls within the region of space that we visually attend to, the processing of this
information is modulated (as described by Van der Helden et al.). In addition, separate processes are
needed to shift attention from one visual area to another. With ERPs, these attentional control
processes can be visualized by measuring the brain activity in the interval between a cue instructing
subjects to attend to the left or to the right, and the actual presentation of an imperative stimulus at
one of these positions. Wijers et al. investigated control and dyslexic subjects with such a set-up, since
there is evidence to suggest that dyslexics display ‘sluggish attentional shifting’. It was found that
the ERP-effects of attentional shifting over the frontal cortex showed a deviant hemispheric lateral-
ization for the dyslexics as compared with the controls. This is in accordance with the general
hypothesis that in dyslexia the development of interhemispheric asymmetry is dysregulated.