Inquisitive Man has so extended the range and depth of human knowledge that it is impossible for a single mind to comprehend its entirety; that being so, it would seem essential that some should attempt to create a synthesis of all the parts, not with the authority of disciplinary experts, but with a skill in showing the relationship of the parts to each other, at the risk of making fools of themselves[1].
It was the quantum theorist and Nobel Prize winner Erwin Schrödinger who first articulated the sense of this thesis, in 1944[2]. By adding "at the risk of making fools of ourselves" he acknowledged how vulnerable such people would be, in the eyes of the individual experts, to the charge of compromising themselves by working at too high a level of generalisation. It has been claimed that Francis Bacon (1561–1626)[3] was the last true polymath, able to know everything about every topic of importance. Three hundred years later, with quantum mechanics challenging Newtonian physics, Schrödinger acknowledged the confusion of lacking a methodology that could explore the emerging reality that everything is connected to everything else; Western intellectual thought was frustrated by its adherence to the Platonic belief that any problem could be resolved if reduced to its constituent parts[4]. But as schoolboys only a generation or so ago well knew, it was comparatively easy to take an old-fashioned clock to pieces; what was hard was putting it back together again.
In the sixty years since Schrödinger, scientists have unlocked the mysteries of the double helix[5], so enabling us to understand how genes shape inheritance. Archaeology has escaped from the technical limitations of spades, trowels and sieves and embraced pollen analysis, computer simulations and the extraction of mitochondrial DNA[6] from very dry bones. So much better organised are our museums that we can now compare teaching methods in the national curriculum with those of the textbooks used in a Romano-British household 1,600 years ago, as well as follow the thought processes of Babylonian mathematicians.
It was only in the mid-1970s, however, that psychology came to accept the reality of what Darwin had projected more than 100 years before, namely that the human brain is as much a product of evolution as are any of the other organs in the human body: it has a "grain" to it which, as any woodcarver knows, is the very devil to work with if you have to go against its natural structure[7].

To understand this confusion in contemporary thinking, turn to the Encyclopaedia Britannica and look up two words, analysis and synthesis. Instantly you will see the problem. A full couple of columns of text describes analysis as a mathematical and scientific methodology from which all kinds of powerful discoveries have flowed. Synthesis merits only a couple of inches, and starts by saying, simplistically, "In philosophy… to form a more complete view, or system"[8]. It took the artistic skills of Georges Seurat[9], the post-impressionist painter, to capitalise on how the brain relates to such perspectives. Consider his wonderful painting of the picnic, "A Sunday Afternoon on the Island of La Grande Jatte". Stand too close to the painting and all you will see is a mass of disconnected dots that seem to make no sense whatsoever; move back a bit, let your eyes relax and find a new focus, and the individual dots dissolve into a beautiful painting. The human brain has evolved to see and comprehend the same thing from different perspectives.
When the oldest of today's teachers were studying education, psychology was still dominated by the theory of Behaviourism. "Animals have instincts; humans have learned behaviours,"[10] lecturers told future teachers. The inference was clear: we humans were a superior species because we didn't have to depend on instinct. Such a misunderstanding went back to psychology's decision in the early 1860s to have nothing to do with evolution. We were exactly what we were born with; education therefore had to be everything to do with instruction.
It was not until the emergence of cognitive science in the 1960s (in conjunction with the first computers), the development of the non-invasive technologies of PET[11] and CAT[12] scanning in the 1970s and then functional MRI, followed by the growing confidence in the hybrid subject of evolutionary psychology[13] from the 1980s, that educationalists developed a far more rigorous approach to a science of learning. But this science[14] has been difficult to define. Each of these disciplines has a language of its own. While cognitive science tells us much about the process of learning, neurobiology[15] explains many of the structures in the brain that actually handle this. What neither discipline does is explain how the brain came to be that way in the first place; this is where evolutionary psychology is especially helpful.
An analogy may help. Neurobiology is like taking very detailed static photographs, at enormous magnification, of highly specific parts of the brain. Cognitive science is more like an infra-red image showing the connections between different parts of the brain, and some of these images have movement; they're like short video clips. Evolutionary psychology is more like an epic film, a vast moving picture on an enormous screen which, while indistinct in many places, and so vast that the plot sometimes gets lost, shows how human brain processes have evolved over time. He who seeks to synthesise has to be able to focus at all three of these scales.
Few of these ideas have made it into mainstream educational thinking. There is no synthesis yet on the shelves of bookstores, which are full of "how to pass exams" guides. This situation has to be rectified quickly, for so much of this research is showing that the learning species is so skilful that it is being trivialised by being made to fit into conventional schools.