Cognitive scientists were, in the main, younger psychologists; men like Howard Gardner, who introduced the theory of Multiple Intelligences (1983), and John Bruer, with his influential work on “Schools for Thought” (1993). Their challenge was largely successful, and cognitive science became a major contributor to psychological theory. Such researchers were not, however, medically trained; they dealt in measurable statistical inputs and outputs, and drew valuable conclusions about the processes that they deduced must connect the two. But they did not actually “touch” brains. Neurons and synapses, myelin sheathing, dendrites and neurotransmitters were on a scale with which cognitive scientists could not deal. Nor did they look for explanations as to why such processes might exist – they dealt entirely in the “here and now.”
The second discipline is Neurobiology. Neurologists are medical practitioners who study, and operate on, the brain. Their antecedents go back to the early nineteenth-century assumption that “bumps” on the skull resulted from various kinds of brain activity (phrenology). It was the development of CAT (computerised axial tomography) scans in the 1970s, followed in the 1980s by PET (positron emission tomography) scans, that first made it possible to identify the several different areas of the brain that are activated by specific thoughts or actions. Functional MRI (magnetic resonance imaging) makes such studies even easier, faster and safer, and confirms what was earlier surmised: that even an apparently straightforward mental task requires the coordinated activity of several different parts of the brain.
“The brain is made up of anatomically distinct regions, but these regions are not autonomous mini brains; rather they constitute a cohesive and integrated system organised for the most part in a mysterious way”, wrote Professor Susan Greenfield in 1997. The welter of subsequent publications shows that the more that is discovered the more mysterious – awesome – the brain becomes. This was forcefully put by Schwartz and Begley (2002): “Human brains are only partially understandable when viewed as the product of medical processes… the mind can, through knowledge and effort, reshape the neurobiological processes… It’s a mental striving, not a deterministic physical process.”
In less than fifteen years contemporary informed opinion has moved from thinking of the brain as a mechanism to thinking of it as an organism – shaped by a blueprint dictated by events long past, but constantly remodelling itself throughout each individual life cycle through neuroplasticity. The brain is not a fixed entity, and nurture matters enormously. We design our houses, the colloquial expression has it, and they then shape the way we live. Put another way, we are enormously empowered by the evolved nature of our brains, but we are constrained as well. Like wood worked against its grain, mental processes forced against our natural way of doing things become fractured, rough and unserviceable. We can only bend our mental processes a certain amount from the normal – that is why the earlier analogy, of society having been diverted on to a side track that ends in a set of buffers, is so important.
While cognitive science tells us much about the process of learning, neurobiology explains many of the structures in the brain that actually handle it. What neither discipline does is explain how the brain came to be that way in the first place. This is where evolutionary studies, and evolutionary psychology in particular, are so helpful.
Evolutionary Psychology is the most speculative of the three new “disciplines” – so much so that it is frequently not even accepted as a discipline by its two older cousins. Essentially a hybrid of the evolutionary sciences and psychology, it draws extensively on biology, genetics, archaeology, anthropology and neurolinguistics as well. Since its inception in the late 1980s evolutionary psychology has caught the public’s attention for its ability to paint the “Big Picture” of human origins. All three disciplines are essential to understanding the proposition made in this paper that adolescence is a critical survival skill. A further analogy may help at this stage. Neurology is like taking very detailed static photographs, at enormous magnification, of highly specific parts of the brain. Cognitive science is more like an infra-red image showing the connections between different parts of the brain, and some of these images have movement – they are like short video clips. Evolutionary psychology is more like an epic film, a vast moving picture on an enormous screen which, while indistinct in many places, and so vast that the plot sometimes gets lost, shows how human brain processes have evolved over time.
To understand the proposition that adolescence is a critical evolutionary adaptation, essential to society’s survival, we need each kind of “picture.” Together these disciplines take us so far back into the origins of human learning that the explanations of men like Confucius, St Augustine and the writer of the Book of Ecclesiastes (as quoted earlier) seem like comments made in the last second of a film that started a full twenty-four hours before. Only in the last five or so years could the story set out below be told.
* * *
Section Three. The “Big Picture” of how we humans learn, as we can now describe it.
The human species apparently separated from the Great Apes some seven million years ago, leaving modern humans still sharing ninety-eight percent of their genes with the chimpanzees. Most of this two percent difference seems to relate to our brains, which appear to have grown exponentially in this time, making the human brain the most complex object in the known universe. The more effectively our ancestors used their brains, it seems, the larger their descendants’ brains became. Here is much of the reason for what earlier researchers saw as the dilemma about human learning: is it a product of genetics or of experience? If it were nature versus nurture, who won… or how many points in a hard-fought fight should be allocated to each? That we now understand this conundrum better will unfold in the next few paragraphs.
Virtually all mammals give birth to their young when their brains are almost fully formed. The major exception is us humans. As the brains of our ancestors started to grow (probably some two million or so years ago), this put pressure on their skulls to get bigger. That created a painful and potentially devastating problem, for there is an absolute limit to how large the female birth canal can get. Over time, it seems, evolution found a neat compromise – an adaptation (a chance ad hoc solution which eventually became encoded in the human genome). Human babies are born nine months after conception but – and here is the wonder of the adaptation – with their brains only forty percent formed. If pregnancy were to run to its natural term it would last twenty-seven months, and the baby’s head would never get down the birth canal. Being born so prematurely, human babies are extremely vulnerable, for it takes a further thirty months outside the womb for the brain to become structurally complete. The behaviour of most mammals is based on instincts (spontaneous, unreflective responses) successfully implanted in the young brain before birth. For humans, however, a full sixty percent of brain growth is dependent on the environmental and other stimuli to which the young child is exposed during the earliest months of life. In this respect the Behaviourists were right: human behaviour is far more dependent on learning than on instincts – but for reasons very different to those advanced by those early psychologists.
Born as obviously helpless as a human baby is, it is all the more remarkable that humans have evolved to become the dominant species on the planet. From such an apparently poor start, how did our distant ancestors grow such amazing brains?