And so, some sixty thousand years ago the evidence suggests, our ancestors simply started walking out of Africa. After nearly seven million years of adapting to life around the waterholes of the savannah, our ancestors used this experience to colonize the world. This “small-group” species, made up of extended family units of between fifteen and twenty people loosely organized into “clans” of probably no more than one hundred and fifty (the evidence is that, at about that scale, competition among too many alpha males led the clans to subdivide), adapted the skill sets they had developed on the savannah to fit a vast variety of very different environments, all within about fifty thousand years.
Studying the DNA of native peoples today, it is possible to plot the routes our distant ancestors followed as they spread out around the world. Their travels were largely facilitated by the lower sea levels of glacial times, and by the damper and milder climates of the period. Our ancestors appear to have reached India some fifty thousand years ago and Thailand ten thousand years later. They landed in the Andaman Islands around thirty thousand years ago, and possibly reached Australia very shortly after that. Migration into central and northern Europe was delayed by the last stages of the Ice Age until some twenty-five thousand years ago. There is some uncertainty as to when our ancestors reached North America by way of Siberia and the Bering Straits, with most estimates suggesting between twelve and fifteen thousand years ago; they eventually reached the far extremity of Tierra del Fuego some ten thousand years ago.
These were frontier times, when none but the fittest survived. This was not a migration led by powerful and charismatic leaders. This was the energy of a species in which every family, each clan, possessed amongst its tiny numbers all the multiple skills needed to investigate, respond to, and colonize new territories. Every one of their members had either to possess those skills themselves or, more likely, to know how to collaborate with a few others to achieve things they could not do alone. The diverse skills they needed would have depended on a further refinement of the multiple intelligences they brought with them out of the savannah. Inquisitiveness fired their every action: they climbed mountains to see what was on the other side; they followed rivers to the sea and, with time, constructed crude rafts to take them to offshore islands, and later to explore oceans in the hope of finding new lands.
Life was a constant struggle in landscapes never before colonized by man, and these explorers craved security in their travels. That ancient search for security is, to this day, recreated wherever people meet together for a meal, replace the electric light with candles, turn off the central heating, throw more logs onto the fire and, as the wine flows, listen again to the fisherman retell his tale of “the one that got away.” It’s not simply a return to the womb – we have an inherited sense of what is safe and secure. Even the most exclusive of modern hotels seeks to blend the security of being alone with your own family by constructing internal atriums where every room looks out onto a common, but safe, enclosed area where life goes on as others sleep. Just like the ancient caravanserais of the desert.
The origin of some of the words we still use may even go back to those distant days. The Persian word for “paradise” actually describes a small walled enclosure around a spring of fresh water. Beyond the wall is nothing but sandy desert; go through a tiny door in the wall and inside all is green, cool and moist – the vegetation exotic and the bird song enchanting. This, to our ancient ancestors, was “paradise.” But we are a species of complex motives; the search for security contrasts with our love of novelty; the thrill of risk contrasts with the comfort of the predictable; the challenge of being alone contrasts with the enjoyment of collaboration. These have probably become ever stronger features of human behaviour as human genes have steadily mutated over the diaspora (the spreading of a population). Foremost amongst those risk-takers – the scouts out in front of everyone else – would have been the adolescents. Those adolescents who survived would have been hard-tested by the experience, and become the leaders of the next generation. (In the medieval trading practices of Venice – the Colleganza – that balance of energy and risk was reflected in the young entrepreneur putting up only one quarter of the capital for a speculative venture, with the other three quarters coming from an older, more sedentary merchant. If the young entrepreneur returned to Venice successful, he retained half the profit.) Generation after generation, perhaps as many as three thousand generations of traveling peoples, made enough successful adjustments during that diaspora to survive and colonize the world. Simpletons simply died out.
Writing his much-acclaimed book “The Fifth Discipline” for business leaders in 1990, Peter Senge of MIT summarized the kinds of learning that enable people to flourish when faced with novel and problematic situations. Without consciously realizing it, Senge probably captured in the following words what our ancestors some fifty thousand years ago experienced as reality: “Real learning gets to the heart of what it means to be human. Through learning we recreate ourselves. Through learning we perceive the world and our relationship to it. Through learning we extend our capacity to create, to be part of the generative process of life.” Learning makes us feel good about ourselves, said Senge, and learners are reinvigorated by their inquisitiveness, being “committed to continuously seeing reality ever more and more accurately. There is within each of us a deep hunger for this type of learning.” Creative learning is “as fundamental to human beings as the sex drive”, says Senge, and as with the sex drive, we humans developed that learning instinct a long, long time ago.
It seems that our ancestors had perfected their language skills long before moving out of Africa. With such linguistic skills deeply established in the human brain, our ancestors developed numerous separate languages, of which some six thousand currently still exist. Cultural speciation (the formation of a new and distinctive species) proceeds far faster than physical speciation. Over that sixty-thousand-year period our ancestors developed a number of relatively minor physical adaptations – blonde hair and fair skin as opposed to dark skin and dark hair, or tall tribes on the open grasslands and pygmies in the forests – but in their physical form they all remained true to the genome. When in the early seventeenth century the first French or English fur trader met a willing Native American woman in the primeval wastes of northern Canada, together they had no difficulty in producing numerous “Métis” offspring. Such a mating probably represented the greatest potential biological divergence that could be observed on the planet. The genes of the European fur trader had possibly moved out of Africa a full sixty thousand years before, while the Native American genes had migrated to Canada through Asia and across the Bering Straits. After all that time those genes had no difficulty in recombining. No physical speciation had taken place. Three or four thousand generations were insufficient for any significant biological change; the brains of people from all over the world seem to work in exactly the same way, as do the rest of their bodies.
Studies in the early 1990s on what was called cognitive apprenticeship extended the study of learning beyond classroom practice to non-institutional settings. “Learning is not something which requires time out from productive activity; learning is the very heart of productivity”, wrote Shoshana Zuboff of Harvard in her study of how people in the computer industry learnt to improve their skills. Contemporary studies from the business world showed that mental structures for learning reflect social, collaborative, problem-solving techniques. A plethora of studies in the last twenty years has shown that learning is an intensely subjective, personal process that each individual constantly and actively modifies in the light of new experiences. The more varied a person’s experience, the more perspectives that person brings to each new opportunity or problem. Writing from the Santa Fe Institute in 1995, Schank and Cleary explained: “We make sense of personal experiences by comparing these to previous ones. Once we have found a match, we use our previous experience to decide what to do next. What this means is that we can really only understand – and hence remember – situations we have been in before. Our memories are actually little more than the sum of stories we can recall and apply.” “The thing which human beings need more than comfort, more than possessions, more than sex or a settled home, is a good supply of stories”, wrote Libby Purves in 1997. “Through stories we make sense of the world.”
At some stage the stories told by our ancient forebears started to account for the ultimate mysteries of life – humans started to envisage God. They created stories that helped them place themselves in the universe. Until only five thousand years ago God was defined as female, opening up fascinating questions as to why the gender subsequently changed. Anthropologists equate the orderly burial of the dead with the development of spiritual awareness. “Mystical, symbolic and religious thinking – all those ways of thinking that rationalists would condemn as ‘irrational’ – seem to characterize human thinking everywhere and at all times”, wrote John Barrow, Professor of Astronomy at the University of Sussex, in 1995, continuing, “It is as if there was some adaptive advantage to such modes of thinking that offers benefits that rationality could not provide.”
All these experiences have “imprinted (themselves) upon us in ways that constrain our sensibilities in striking and unexpected ways”, Barrow continues. We are, literally and figuratively, the children of travelers from antique lands. In twenty-first-century terms our children face the same challenges as did our ancestors’ children; our basic biology still needs to empower them both to master basic skills (those that can be readily taught) and to think creatively for themselves (experiential learning).
Section Five. So what is it that we now know in 2005?
We know that the human brain is essentially plastic: it constantly reshapes itself in response to environmental challenges, but does so within the blueprint of the species’ inherited experience. There are three phases during a normal life cycle when the brain goes through extraordinary periods of internal reorganization, a kind of mental housekeeping. At an involuntary, subconscious level the brain clears out those structures which its “evolutionary sense” tells it are redundant, so as to enable other parts of the brain to grow. Experience during each of these phases becomes critical to how the individual brain is reconfigured to deal with the next stage of life. This process is called synaptogenesis: a period during which many separate specific predispositions coalesce to produce a major evolutionary adaptation that becomes critical to human survival. Three phases have so far been identified – the earliest months of life, adolescence, and old age. This Paper proposes that it is the relationship of the first two phases of synaptogenesis to each other that has made the advance of the human race possible. Neither phase – be it the first three years or adolescence – can be seen in isolation; neither on its own can account for the human propensity to learn. It is only through the interaction of the two that we become “the learning species.”
The highly suggestible brain of the prepubescent child enables it to learn speedily and effectively by imitating its elders, while it is the changes in the adolescent brain that force young people to take control of their own future by making them discontented with being told what to do. The features of adolescence – the risk-taking, the exuberance and the outrageous questioning of the status quo – are there for a time, but they disappear as the young person grows older. If adolescents do not follow such instincts – as in a modern society so frightened by those instincts that it does its best to get adolescents to ignore them – such youngsters go into adulthood essentially nervous of risk-taking, fearful of ever becoming too enthusiastic, and unduly conforming in their behaviours. To over-school adolescents is to rob them of a once-in-a-lifetime opportunity to grow up properly. We all pay the price.
* * *
Scientists have found it easier to study the development of the young brain than that of the adolescent. Not only are younger children more malleable, their brains have had less time to be shaped by social and environmental influences. The variables are fewer, and each variable is more easily quantifiable. Such research has become the stuff of much popular science. It has an immediate appeal both to politicians looking to the supposed economic advantages of getting as many young mothers back into profitable employment as possible, and to women themselves seeking to balance their roles as mothers with the demands of their careers. Both approaches conspire to emphasize what popularly became known as “the issue of the first three years.” Such research, it was often expected, would provide criteria by which quality childcare could be institutionalized, thus releasing more women into the workplace with calmer consciences that all would be well with their children.