Faster than a speeding brain

20 April 1996



TEST pilots at NASA push human performance to its limits. They must climb into untried, high-performance aircraft
and fly them at high speed while taking them through complex test procedures. To stay alive they need the ultimate
mental edge. And to help gain that edge, NASA pilots have a trade secret.

Before taking a new plane for that first flight, NASA pilots jack up the speed of their simulators until they run faster
than "real-time". Being trained to cope even when events happen twice as fast as in reality makes the pilots feel they
can think faster and stay calmer once they are airborne and their lives are on the line.

Is such training just a curiosity for people who live on the edge? Simulator enthusiasts claim that it may have a much
wider impact. What works for the military could provide a tonic for all hard-working minds. A pep-up computer game—
or better still, a thorough perceptual work-out in virtual reality—could boost the speed at which ordinary people think,
they say.

Dutch Guckenberger, a research fellow at the simulator manufacturer ECC International in Orlando, Florida, believes
that humans are surprisingly plastic in their speed of thought. He says that most of us dawdle along at a lazy 10 or 20
"decision cycles" a second. But if retuned by the demands of an artificial reality, this could be boosted to 30 or even
40 cycles a second. "You could take a slow man and turn him into a fast man," says Guckenberger. "From our work
with above real-time training, the evidence is that a 30 per cent performance increase could be a normal gain."

On the strength of these claims the US military funded new studies with F-16 fighter pilots learning emergency drills
and tank gunners practising shooting down enemy helicopters. Guckenberger says training at an accelerated rate in a
simulator left personnel with more time to carry out complex tasks in real-life situations, giving green recruits the
battle-hardened reflexes of veterans.

But computer games to boost your IQ? Is it possible that training the brain to work faster could also make you
smarter? First it was subliminal learning tapes and biofeedback machines; then smart drugs were touted as the
shortcut to a sharper mind. How long before CD-ROM hypertime game trainers are the latest fad in the
consciousness-expanding industry? The dream of instant braininess—yours for just £99.99 plus postage—means that there will
always be a market for such products. But does science give us reason to believe that we could ever benefit from a
quick fix solution? Is IQ based on some simple brain property—such as the level of a vital neurotransmitter, a few key
genes, or the tuning of an inner perceptual clock—that would make it susceptible to some gross form of manipulation
like gene therapy or virtual reality training?

Opinion on this is divided. Many psychologists are convinced that IQ is a matter of high-level skills such as logic and
abstract thinking. They see the brain as such a complex system that any variation in its performance must be due to
some incredibly subtle details of its design, and so there can be no simple way of making it run better, or boosting
intelligence. But another group of psychologists takes the opposite view. They reason that the brain grows from a
relatively limited number of genes, so some very simple factors will have to dictate its level of performance. Something
about the basic speed or efficiency of a person's neurons will have to account for much of the observed variation in
intelligence.

Chris Brand, a psychologist at the University of Edinburgh, argues in his new book, The g Factor, that serious
scientists can no longer deny that intelligence has a biological component. He says that underlying most cognitive
activities is a single mental power called general intelligence or g. Brand says that comparisons of twins reared apart
and other such studies now strongly suggest that about 45 per cent of the variance found in IQ (or, at least, whatever
it is that IQ tests measure) is the result of inherited factors. Differences in the home environment, says Brand, account
for just 10 per cent of the variance in IQ scores, with just 5 per cent down to other environmental influences such as
schooling. If this view is correct then much of IQ has its roots in basic biological factors which may be open to
manipulation. But what is it about the working of brains that might account for differences in IQ?

The idea that intelligence might be linked to nerve conduction speed or perceptual efficiency dates back to the work of
Francis Galton, the Victorian gentleman-scientist. At his London testing centre, Galton compared the reaction times
and sensory acuity of labourers and middle-class subjects. When he failed to find any correlations, psychologists
moved on to develop the pencil and paper reasoning tests that are familiar as IQ tests today. Only a few isolated
figures, such as Arthur Jensen from the University of California, Berkeley, kept looking for evidence that sheer neural
speed might play a role in intelligence.

Trunk cabling

Neuroscience certainly gives reason to doubt that raw conduction speeds would count. After all, the brain does not
use just one kind of nerve. The conduction speed of a nerve fibre depends on its size and the thickness of its fatty
insulation. On the kind of trunk cabling used to connect the eye to the brain or send a muscle command down to the
hand, impulses zip along at just under 90 metres per second. But inside the brain, the wiring is generally slower and
more varied. Impulses tend to crawl along at between 1 and 9 metres per second.

That the brain uses such a mix of nerves would seem to kill off any simple theory of conduction speed. Yet in recent
years, a number of researchers have claimed to have found positive correlations between IQ scores and neural
speed. Perhaps the most radical are studies by Philip Vernon at the University of Western Ontario in Canada who has
matched IQ to the speed of impulses in the median nerve of the arm. Using electrodes at the elbow and armpit to time
the reaction to a small electric shock at the wrist, Vernon claims that variations in conduction velocity show a modest,
yet significant, correlation. Unfortunately, even those sympathetic to conduction-time theories of IQ, such as Jensen
and Hans Eysenck at the Institute of Psychiatry in London, have been unable to replicate these findings.
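The measurement itself is simple arithmetic: both latencies are timed from the same wrist shock, so subtracting them cancels the shared delays at the stimulus site and isolates the travel time between the two electrodes. A toy sketch, with entirely hypothetical numbers:

```python
# Sketch of the two-electrode latency method (all figures invented for
# illustration, not taken from Vernon's actual data).
def conduction_velocity(distance_m, latency_elbow_s, latency_armpit_s):
    """Velocity over the elbow-to-armpit segment of the median nerve.

    Subtracting the two latencies removes the fixed delays at the wrist
    (stimulus onset, receptor response), leaving pure travel time
    between the electrodes.
    """
    dt = latency_armpit_s - latency_elbow_s
    return distance_m / dt

# Hypothetical figures: electrodes 0.25 m apart, the impulse arriving
# 4 ms later at the armpit than at the elbow.
v = conduction_velocity(0.25, 0.006, 0.010)
print(v)  # about 62.5 m/s, in the range of fast peripheral fibres
```

The subtraction step is what makes the method workable: the absolute latency at either electrode is dominated by delays that have nothing to do with the nerve segment under test.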

Another approach has been to measure mental processing speeds using evoked response potential techniques. An
ERP is a scalp electrode recording of the snap of electrical activity produced in the brain by a simple stimulus, such as
a flash of light or the reversal of a chequerboard pattern. Nothing can be seen in a single trial because the brain
produces too much background noise. But by putting someone through hundreds of trials and averaging the results,
the background activity cancels itself out, leaving just the rises and falls in neural activity associated with processing
the stimulus.
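The averaging trick is easy to demonstrate with a toy simulation (the signal shape and noise levels below are invented, not physiological): a small time-locked bump that is invisible in any single noisy trial emerges cleanly once a few hundred trials are averaged.

```python
import math
import random

random.seed(0)

def trial(n_samples=100, peak_at=40):
    """One simulated trial: a fixed evoked 'bump' buried in noise."""
    return [math.exp(-((t - peak_at) ** 2) / 20.0)  # time-locked response
            + random.gauss(0, 2.0)                   # background activity
            for t in range(n_samples)]

def average(trials):
    """Point-by-point average across trials; noise cancels, signal stays."""
    n = len(trials)
    return [sum(tr[i] for tr in trials) / n for i in range(len(trials[0]))]

one = trial()
avg = average([trial() for _ in range(500)])

# In the single trial the bump is swamped by noise; in the average of
# 500 trials the residual noise shrinks and the peak stands out near t=40.
print(max(one), max(avg), avg.index(max(avg)))
```

With noise of standard deviation 2 per sample, averaging 500 trials leaves residual noise of roughly 2/√500 ≈ 0.09, small enough for the unit-height bump to dominate.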

With a stimulus such as a reversing chequerboard, the first big ERP peak comes after about a tenth of a second. This
rise in polarity is taken to be the building of a primary sensory map in the visual cortex. Other peaks follow in different
parts of the brain and these are believed to be related to thoughts about the meaning or significance of the sensory
information.

The conventional wisdom is that ERP timings vary little from person to person. Take a common measure, such as the
P100 spike, which occurs 100 milliseconds after the sharp visual jolt of a chequerboard reversal. Mike Rugg from the
University of St Andrews says that the variation in P100 times is so small that doctors use it as a benchmark for
diagnosing conditions that damage nerves, such as multiple sclerosis. What's more, only a few milliseconds are
actually taken up by the transmission of impulses down the optic nerve from eye to brain. The rest of the 100
milliseconds is used up in transduction of light by pigments in the retina and by the visual cortex as it organises its
reaction.

However, even though variation in P100 is low, there will still be a small percentage of people with ERPs beyond 110
milliseconds or below 90 milliseconds. Jensen and Ed Reed at the University of Toronto believe that this leaves
plenty of room for speed differences to count: the two have reported a small, but constant, correlation between P100
times and IQ scores. Jensen uses this to argue that if ERP differences over the first leg of the sensory processing
journey reflect a general efficiency of wiring in the brain, then raw conduction advantages could account for as much
as 25 per cent of the variance in IQ scores.

The ERP approach to measuring brain speed has been replicated by a number of laboratories over recent years.
However, many feel that neural speed alone is probably, after all, a little too simple to explain advantages in
intelligence. The belief that the quick movement of nerve traffic must be good is based on a rather old-fashioned,
computational view of the brain. In the classic serial processing computer, information is chopped up and passed
through a sequence of calculating steps. The faster each step is completed, the sooner a computer completes its
program.

But modern neuroscience sees the brain as a dynamic neural network. A "state of information" has to grow
organically, evolving under the pressures of positive and negative feedback until it reaches a state of balanced
tension. In such a network, it is not the speed of traffic along individual wires that counts but the performance of the
entire network as it settles into a "solution state". So ERP studies might really be measuring reliability rather than
speed.

Less speed, more synchrony

Pat Rabbitt at the University of Manchester used to be sceptical about simple biological explanations of IQ but has
been persuaded by his own work on perception timing tasks. He points out that a person's ERP score is the average
of hundreds of trials. Analysis of these averages suggests that high IQ scorers are not markedly quicker, but exhibit
less overall variation. Their averages are lower because fewer trials are skewed by the occasional lagging response.

"When you look at the development of reaction times in kids, what you get is fewer slow responses occurring as they
get older, rather than more fast ones," says Rabbitt. The same phenomenon shows up in other mental tasks such as
judging short spans of time. "If we ask people to estimate time periods of under a second, clever and less clever
subjects have similar averages, but the variance of one is smaller than the other," he says.
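Rabbitt's observation is simple statistics: a handful of lagging trials is enough to drag up a subject's mean reaction time and inflate its variance, even when the fast responses themselves are identical. A toy simulation (all numbers hypothetical):

```python
import random
import statistics

random.seed(1)

def reaction_times(n, lapse_rate):
    """Simulated reaction times: a common fast response of about 300 ms,
    plus an occasional slow lapse on a fraction of trials."""
    times = []
    for _ in range(n):
        rt = random.gauss(0.30, 0.02)       # the shared fast response
        if random.random() < lapse_rate:
            rt += random.uniform(0.2, 0.5)  # an occasional lagging trial
        times.append(rt)
    return times

steady = reaction_times(10_000, lapse_rate=0.01)
lapsing = reaction_times(10_000, lapse_rate=0.15)

# The lapsing subject's fast responses are no slower; the extra lapses
# alone raise both the mean and the spread.
print(statistics.mean(steady), statistics.stdev(steady))
print(statistics.mean(lapsing), statistics.stdev(lapsing))
```

This is why averaging over hundreds of trials can mislead: the difference between the two simulated subjects lives entirely in the slow tail of the distribution, not in how fast their fastest responses are.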

This picture fits with recent findings by Peter Caryl, Brand's colleague at the University of Edinburgh. Caryl says he
looked at the ERPs of people who had to judge which of two lines was longest when the pair was flashed up on a
screen. The ERPs of high IQ scorers tended to show a steeper, more pronounced rise, as if their brains were
responding with sharper synchrony. The bunching of ERP times would fit the idea that some brains form mental
representations that are more precise.

Yet both Caryl and Rabbitt warn against overinterpreting these findings. As Rabbitt points out, if it turns out to be
network performance rather than plain neural conduction speed which is basic to the biology of intelligence, then that
is really quite a complex "simple factor". At the moment, researchers can only guess what it might take to tune a brain
network to optimum pitch. All sorts of variables, from the number of branches on neurons to the replenishment rates of
neurotransmitters, could play a part in producing a crisp response.

Caryl adds that as soon as you take a dynamic network view of the brain, it also becomes difficult to separate
"upstream" from "downstream" processes. The old cognitive science model saw sensory information being mapped in
the visual cortex then being analysed for meaning and content by a succession of filters. But in a neural network,
feedback from high-level brain areas will flood down to affect the primary visual map even as it forms.

Indeed, there is now plenty of evidence that the brain runs ahead of itself, anticipating what will happen in the next few
moments so as to guide its sensory processing. When we reach for a door handle, our brains will already have
anticipated how it should feel. We only notice this ever-present habit of anticipation when something goes wrong—
when someone pulls the door open from the other side just as we are about to grab the handle. The unexpected
causes sensory confusion and we find ourselves briefly flummoxed.

Boost to the system

If feedback intimately connects high-level processes to basic sensory processes, then people with high IQ might have
their advantages elsewhere, says Caryl. The crispness of an ERP might reflect the efficiency of priming and attention-
directing processes further up the brain. In that case, even a strong correlation between some apparently basic neural
property and IQ would not mean that one was causing the other. A fast brain could be just one that anticipates better.

So intelligence researchers have no answers: just some firmer correlations and a few ideas about possible neural
network properties. As a result, most are reluctant to speculate about whether intelligence could be boosted by some
mind drug or perceptual training technique. Nevertheless, the main argument against these and other physical
interventions is that if peak mental performance depends on delicately tuned neural networks, then these methods
would be like jamming a screwdriver into the front of your hi-fi.

The accepted view on mental training has been that any benefit is normally limited to the skill being practised.
According to Rabbitt, playing a lot of chess, doing crosswords—or even filling in IQ questionnaires—makes you better
at those activities, but will not give you a general boost to the system.

So where does this leave Guckenberger and his dreams of using "hypertime" training to quicken a person's thinking?
Firstly, his attempts to explain a training effect in terms of an increase in decision cycles owe too much to the
outdated, serial computer model of the way the brain works. Perhaps rather than fast and slow man, Guckenberger
should be talking about the difference between precise and blurred man. But more importantly, a consultant who
looked into accelerated simulator training for NASA back in the 1970s came up with a relatively straightforward
explanation for the effect.

Staff at NASA's Dryden testing ground in Edwards, California, first stumbled upon above real-time training in the early
1970s while working on the F-15 jet fighter project. The prototype of the F-15 was actually a remotely piloted model
glider designed to test the controls and flight surfaces. The pilot sat in a mock cockpit on the ground and flew the
model via a radio link. After preparing for hours in a conventional flight simulator—flying a computer—pilots still
complained of feeling rushed when they began flying the model, so NASA engineers tinkered with the simulator to let
pilots adjust its speed to what seemed a psychologically realistic level. This turned out to be about 1.4 times as fast as
real-time. Ever since, NASA Dryden has routinely used speeded up simulators for the final stages of flight training.

Simulator consultant Robert Hoey investigated the basis of the effect. Hooking up pilots to a heart monitor, Hoey
found that they were much more relaxed in a simulator than when flying a plane. Even just knowing that they were
flying a radio-controlled glider was enough to produce an extra rush of adrenaline.

Such physical arousal is a normal response that prepares a person to take violent action in potentially dangerous
situations. Arousal also has a psychological effect, altering the balance of certain neurotransmitters and lowering
sensory thresholds, so making us feel more "jumpy". In neural network terms, the weights of connections are adjusted
so the network is more likely to respond to partial information, allowing the brain to trade off urgency against a greater
probability of errors.
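That trade-off can be sketched with a toy evidence-accumulation unit (the thresholds and noise levels are invented for illustration): lowering its response threshold, the network analogue of heightened arousal, produces faster responses to genuine signals but far more false alarms on noise alone.

```python
import random

random.seed(2)

def respond_time(threshold, signal_strength, n_steps=200):
    """Accumulate noisy evidence step by step; 'respond' when it
    crosses the threshold. Returns the step of response, or None."""
    evidence = 0.0
    for t in range(n_steps):
        evidence += signal_strength + random.gauss(0, 1.0)
        if evidence >= threshold:
            return t
    return None

def stats(threshold, signal_strength, trials=1000):
    """Response rate and mean response time over many trials."""
    times = [respond_time(threshold, signal_strength) for _ in range(trials)]
    hits = [t for t in times if t is not None]
    rate = len(hits) / trials
    return rate, (sum(hits) / len(hits) if hits else None)

# With a real signal present, the low threshold responds sooner...
_, speed_low = stats(threshold=10, signal_strength=0.5)
_, speed_high = stats(threshold=30, signal_strength=0.5)
# ...but on noise alone, it fires far more often: false alarms.
fa_low, _ = stats(threshold=10, signal_strength=0.0)
fa_high, _ = stats(threshold=30, signal_strength=0.0)
print(speed_low, speed_high, fa_low, fa_high)
```

The unit with the lower threshold is "jumpier" in exactly the sense described above: it commits on partial information, trading urgency against a greater probability of error.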

Hoey argued that under these conditions, the brains of test pilots would have more trouble dealing with the kind of
detached, intellectual skills needed to check out a new plane. Test pilots, unlike normal pilots, have to follow a strict
plan of manoeuvres while watching for problems. Training under accelerated conditions in a simulator appeared to be
a good way of mimicking the effects of added psychological stress, so that during the eventual flight the pilots felt less
rushed and better able to cope.

Guckenberger's own more recent studies support this explanation. In his work, F-16 pilots reaped the most benefit
during enemy avoidance drills which involved a complicated sequence of manoeuvres. The pilots had to deal with a
lengthy checklist while under severe stress. Guckenberger found much less evidence that above real-time training
gave an advantage when learning ordinary flight skills. Likewise, the tank gunnery exercise was peculiarly dependent
on making complex calculations under pressure.

This suggests that time-pressured training may only help in situations that mix high arousal with a need to preserve a
measure of analytical detachment. A speeded-up virtual reality environment might well prove valuable for training
traders in the City, where millions ride on split-second decisions; surgeons, whose actions can mean life or death; and
even people who suffer exam nerves. In fact, it could be useful for anyone who has to use skills in stressful conditions.
But it is unlikely to deliver the general IQ boost talked about by Guckenberger.

Still, Guckenberger is not downhearted. He admits that he has no hard evidence that above real-time training could be
harnessed as anything other than a way of simulating the effects of psychological stress. But he can still dream. At the moment
he is toying with a computer display that takes you down a road of columns, with the columns zipping by faster and
faster. "Staring at the screen might pick up the brain rate," says Guckenberger. "It's worth a try."

John McCrone is a freelance writer who specialises in technology and psychology.
From issue 2026 of New Scientist magazine, 20 April 1996, page 44

Copyright New Scientist