The Problem With Creationism
For years I, like other Fundamentalists, held to a simplistic and literalist interpretation of the book of Genesis, as though it were a scientific and historical textbook explaining both how the universe came about and how humankind developed, a position I justified by extended special pleading. I felt increasingly uncomfortable doing so, but was trapped in a theological cul-de-sac. So much of my theological superstructure rested on these somewhat doubtful foundations that I could not bring myself to examine them carefully; I did not have the time or the energy, for I was on the treadmill of parish ministry. As a student I had felt a sense of embarrassment, almost shame, at being a creationist. Certainly the literature did not seem credible (https://answersingenesis.co.uk/store/product/it-trueevidence-creation/?sku=00-1-136); indeed, much of it was almost laughable. But it raised challenging questions: how could a fish, which breathes through its gills, develop into something that could breathe out of water without suffocating in the process? (Yes, OK, I know about mudskippers, I have even seen them, but back then the question seemed compelling.)

My problem lay with Adam: if Adam was not a special creation but simply a mythical first human being, where did that leave original sin? Paul, in 1 Corinthians 15:20-22, had written, “the truth is that Christ has been raised from death, as the guarantee that those who sleep in death will also be raised … for just as all people die because of their union with Adam, in the same way all will be raised to life because of their union with Christ.” Without Adam’s fall there was no need for the atonement, and Jesus’ death became pointless. And there could be no historical fall without an historical Adam. To safeguard the Gospel, I had to reject evolution. This view had been passed to me by my father and I was charged, like Atlas, to hold it firm, but all the while doubt stood at my shoulder, whispering.

Others, I know, have no such doubts at all; theirs is a blithely confident faith. To quote from a recent publication:

“… there is no reason to believe that the universe and the earth in particular are billions of years old … as many astronomers and geologists insist. A real creation would of necessity require that some aspects of the universe would have come from the hand of its Creator with an appearance of age. For example, Adam in the very hour he was created would have appeared to be a mature man of some years. Then the geological upheaval at the time of the Flood (see Gen. 7:11; 2 Pet. 3:6) could also account for much of the geologist’s “evidence” for an ancient earth … we simply cannot discover the age of the earth or of man on the basis of evidence … but the tendency of Scripture, limiting the known gaps … to tens and hundreds of years and not thousands and millions of years, seems to be toward a relatively young earth and a relatively short history of man to date.”

– Robert Reymond, “A New Systematic Theology of the Christian Faith”
But such a blithely confident faith is only possible if we ignore the genre of the literature on which this model is based, adopt a wilful obscurantism towards the findings of science (ignoring, in the process, the overwhelming consensus of the scientific community since Galileo in the 17th century), and credit the Bible with both supernatural authorship and its attendant infallibility. A recent study by biochemists at Brandeis University used a computer model to calculate the probability that all forms of life on Earth developed from a single common ancestor. It also tested the Fundamentalist model (that humans arose in their current form and have no evolutionary ancestors) and found that, statistically, the probability that humans were created separately from everything else is 1 in 10 to the 6,000th power. It is beyond improbable.

The Fundamentalist’s model simply does not stand up to any degree of scrutiny, nor does it take into account clearly observable facts (hence Reymond’s assertion about not being able to rely on evidence). Indeed, the picture which emerges from consideration of the evidence is that the universe began with the “Big Bang” as an unimaginably hot, dense point, which, when it was just one hundredth of a billionth of a trillionth of a trillionth of one second old, experienced an incredible burst of expansion in which space itself expanded faster than the speed of light. During this period the universe doubled in size at least 90 times, going from subatomic to the size of a golf ball almost instantly. As space expanded, the universe cooled and matter formed. One second after the Big Bang, the universe was filled with neutrons, protons, electrons, anti-electrons, photons and neutrinos.

From Big Bang to Bacteria

For 380,000 years or so, the universe was too hot for light to shine: the heat of creation smashed atoms together with enough force to break them up into a dense plasma that scattered light like fog. Roughly 380,000 years after the Big Bang, matter cooled enough for atoms to re-form, setting loose the initial flash of light created during the Big Bang, which is still detectable today as cosmic microwave background radiation. About 400 million years after the Big Bang, during a period which lasted more than half a billion years, clumps of gas collapsed enough to form the first stars and galaxies. A little over 9 billion years after the Big Bang, our own solar system was born. By dating the rocks in Earth’s crust, as well as the rocks of Earth’s neighbours, such as the moon, and visiting meteorites, scientists have calculated that Earth is 4.54 billion years old, with an error range of 50 million years.
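As an aside, the cosmological numbers quoted above are easy to sanity-check. The short Python sketch below does the arithmetic; the golf-ball diameter of about 4.3 cm is my own assumption for illustration, not a figure from the sources above:

```python
# Sanity checks of the Big Bang figures quoted above (a sketch only;
# the golf-ball diameter of ~4.3 cm is an assumption, not from the text).

# "one hundredth of a billionth of a trillionth of a trillionth of one second"
inflation_start_s = (1 / 100) * 1e-9 * 1e-12 * 1e-12
print(f"{inflation_start_s:.0e} s")  # 1e-35 s, the usual inflationary epoch

# "doubled in size at least 90 times, going from subatomic to ... a golf ball"
growth_factor = 2 ** 90
start_size_m = 0.043 / growth_factor
print(f"{growth_factor:.2e}")        # ~1.24e+27
print(f"{start_size_m:.2e} m")       # ~3.5e-29 m, far smaller than a proton
```

Ninety doublings do indeed carry something vastly smaller than a proton up to roughly golf-ball size, so the quoted figures are at least self-consistent.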
The environment on the early Earth was volatile and hostile to life, yet life somehow began here. We might ask how this could possibly be, yet microbial life forms have been discovered that can survive and thrive at extremes of both high and low temperature and pressure, in conditions of acidity, salinity and alkalinity, and in concentrations of heavy metals, that would have been regarded as lethal even a few years ago. These discoveries include the wide diversity of life near sea-floor hydrothermal vents, where some organisms live on chemical energy in the absence of sunlight. This demonstrates how life could have begun in that hostile primordial soup. Fossil evidence suggests life emerged some time prior to 3.7 billion years ago, and perhaps as early as 4.1 to 4.28 billion years ago. As mentioned above, the similarities among all known present-day species indicate that they have diverged, through the process of evolution, from a common ancestor. This common ancestor developed into the three domains which form the branches of the evolutionary tree: archaea, bacteria and eukaryotes. Animals are, essentially, multicellular eukaryotes, and it is from these eukaryotes that animal life developed, to jellyfish and thence to vertebrate fish, amphibians, reptiles, mammals, and on and on to the earliest hominins. The story is one of continual development and adaptation over the millennia. But it is with the rise of the hominins that we begin to approach “God”, for while consciousness of self and of the other is present in such animal life as octopuses, it is four very specific cognitive developments in the brain which set the hominins apart, and these developments were necessary for the prehension of “God”.
The Rise of Humankind
If we begin with Homo habilis, about 2 million years ago, we find hominins progressively experiencing a significant increase in brain size and general intelligence. Mammals (and, obviously, their brains) had been evolving for 200 million years before that time. For the first 140 million years of their existence, mammals were insignificant creatures living in the nooks and crannies of a dinosaur’s world. During that time, evolution was experimenting with the development of the three-part brain (the forebrain, midbrain, and hindbrain) that forms the core of the mammalian central nervous system. About 65 million years ago, an asteroid apparently struck Earth, producing a cataclysm that killed the dinosaurs. Mammals not only survived but thrived in a world now devoid of their Cretaceous predators. Consciousness would not have evolved if a cosmic catastrophe had not claimed the dinosaurs as victims; so, in a very literal sense, we owe our existence as reasoning mammals to our lucky stars. With the disappearance of the dinosaurs, mammals rapidly diversified, grew, and dominated Earth. The mammalian forebrain increased disproportionately in size compared to the midbrain and hindbrain and eventually occupied most of the skull. As it grew, the forebrain differentiated into the four lobes (frontal, temporal, parietal, and occipital) and developed a thin outer layer called the neocortex. This was the key innovation of the mammalian brain, because it comprised six layers of neurons, compared to the three layers in the cortex of earlier animals. Since neurons are connected three-dimensionally, both horizontally and vertically, to other neurons, the additional three layers increased the possible connections enormously, making possible the processing of much more complex information and thought.

The first primates appeared approximately 60 million years ago. They proliferated rapidly into hundreds of species, of which 235 still exist. About 30 million years ago, the New World monkeys (capuchin monkeys and marmosets) went their separate way, and 25 million years ago the Old World monkeys (baboons and macaques) did the same. The great apes, the group most closely related to us, began dividing about 18 million years ago, first the orangutan and then the gorilla following separate evolutionary paths. Finally, about six million years ago, the hominins separated from the chimpanzees, our closest living relatives. Insofar as the evolution of chimpanzees was subject to evolutionary pressures similar to those acting on hominins, it would not be surprising, given the principles of parallel evolution, to find that chimpanzees, and indeed the other great apes, developed some cognitive abilities similar to those developed by hominins. Awareness of self is an example of such parallel development; perhaps more surprisingly, it is also present in octopuses.
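As an aside on the neocortex point above, the way extra layers multiply the possible connections can be shown with a toy calculation. This is purely combinatorial, not a model of real cortex, and the neuron count per layer is an arbitrary assumption of mine:

```python
# Toy illustration: among n interconnected neurons there are n * (n - 1)
# possible directed connections, so doubling the neurons in a column
# (three layers -> six layers) roughly quadruples the possibilities.

NEURONS_PER_LAYER = 100  # arbitrary assumption, for illustration only

def possible_connections(layers: int) -> int:
    n = layers * NEURONS_PER_LAYER
    return n * (n - 1)

three = possible_connections(3)          # 89,700
six = possible_connections(6)            # 359,400
print(three, six, six / three)           # ratio ~4.0: growth is not linear
```

Doubling the layers thus roughly quadruples the potential wiring, which is the sense in which the six-layered neocortex opened up disproportionately richer processing.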
Homo habilis is generally regarded as being the first hominin to have diverged significantly from its primate ancestors, although its precise relationship to other early members of the genus Homo, such as Homo rudolfensis, Homo ergaster, and the recently discovered Homo naledi, is far from settled. Fossils of Homo habilis have been discovered in Ethiopia, Kenya and Tanzania. Homo habilis lived between 2.3 and 1.4 million years ago, although recent finds in Ethiopia suggest that it may have existed as early as 2.8 million years ago. Its average brain size is estimated to have been about one-third larger than that of Australopithecus, and its greater intelligence is demonstrated by its manufacture of crude stone tools, made by breaking rocks to produce sharp cutting edges. The use of tools is not, of course, unique to hominins: crows use sticks and carefully cut leaves to extract insects, sea otters use stones to break open the shells of crabs, monkeys use sticks to kill snakes, and chimpanzees use sticks, from which they strip the leaves, to forage in termite mounds, and stones to crack open nuts. Crude stone tools have been found dated to 3.3 million years ago, but those made by Homo habilis were more sophisticated and have been found in abundance in association with Homo habilis fossils. Although crude, such tools would have been effective for cutting the hides and tendons of dead animals, allowing the tool-user to strip meat. The stone tools could also have been used to break open animals’ long bones and extract the marrow, an especially rich source of protein. Animal bones found in association with the stone tools suggest that the tools were used in this way. The bones also suggest that Homo habilis was probably a meat eater, in contrast to earlier hominin species. There is no evidence that Homo habilis hunted animals, so they probably scavenged animals that had been killed by predators or had died of old age or disease. What makes the stone tools used by Homo habilis different is their complexity. According to the archaeologist Steven Mithen, of the University of Reading, to detach the type of flakes found at the sites of Olduvai Gorge it is necessary to recognise acute angles on the stone nodules, to select so-called striking platforms, and to employ good hand-eye coordination to strike the nodule in the correct place, in the right direction and with the appropriate amount of force. Homo habilis possessed advanced physical skills and the ability to plan. They were clearly smarter than their hominin ancestors. However, despite their greater intelligence, there is no evidence that they possessed self-awareness or any of the other higher cognitive functions that would distinguish later hominins and lead to the emergence of the gods. Theirs were clever brains, but blank minds.

With Homo erectus we see the development of an awareness of self. Homo erectus first appeared approximately 1.8 million years ago and survived until 300,000 years ago, thus existing for 1.5 million years. They had physical features, especially their arms and toes, that show they had more or less given up climbing trees. Their brains ranged from 750 to 1,250 cubic centimetres, averaging about 1,000 cubic centimetres (about 60 percent larger than the brain of Homo habilis) and only slightly smaller than that of modern Homo sapiens at about 1,350 cubic centimetres. It has been claimed, with some justification, that Homo erectus was the first hominid species whose anatomy and behaviour justify the label “human”.
The larger brain led to new behaviours. Stone tools, some of which have been dated to more than 1.7 million years ago, became elegantly flaked on two sides. This new tool is generally referred to as a handaxe, even though it was really just an elegantly worked rock weighing several pounds, significantly sharper than earlier tools. In addition, Homo erectus also made wooden spears up to six feet in length and sharpened at both ends. Eleven spears found in Germany were apparently used to hunt wild horses, while in southern England and in Spain Homo erectus hunted other large mammals, including bison, deer, bears, and elephants. Such hunts required the co-operation of a large number of people. Stone-tipped spears dated to 460,000 years ago have recently been found in South Africa. Homo erectus was also the first hominid to control and use fire, and there is good evidence for the controlled use of fire by 790,000 years ago. Co-operative hunting and co-operative living (suggested by archaeological remains in natural shelters, such as caves, and in the building of artificial shelters) would have required communication, though the degree of language development remains the subject of debate.

However, Homo erectus was not only smarter but was becoming self-aware. Self-awareness in children is, we know, a gradual process; it develops in stages. Its development is not dependent on achieving a specific age but on achieving a critical level of brain development, which varies among children; this is illustrated by the fact that most children with autism develop mirror self-recognition at a later age than other children. Similarly, it would be expected that self-awareness in Homo erectus would have developed slowly and would have fluctuated in its early stages. The neuroanatomist Bud Craig, of Arizona State University, has defined self-awareness as “knowing that I exist” and “the feeling that ‘I am’”: in other words, a sense of our own being. We need to be able to experience our own existence before we can experience the existence of anything else in the environment; without an “I” there can be no “you”. By developing an awareness of self, Homo erectus would have developed an awareness of others and, therefore, been able to initiate co-operative activities. Such an awareness of others would not have included a “theory of mind”, but would have been more like that found in animals that hunt jointly, such as wolves: they are aware of one another, and so able to co-operate, without understanding what the other is thinking. Self-awareness is also not unique to humans. Charles Darwin, while visiting a zoo, held a mirror up to an orangutan and carefully observed the ape’s reaction as it made a series of facial expressions. In fact, the majority of chimpanzees in studies learn to use a mirror to explore body areas they could not otherwise see, such as their teeth and ears, yet monkeys tested in the same way show no such signs of self-recognition. The capacity for self-recognition, it seems, does not extend below humans and the great apes.

However, it is with Archaic Homo sapiens, beginning about 200,000 years ago, that the development of an awareness of others’ thoughts, commonly referred to as a “theory of mind”, may be seen. In terms of longevity, Homo erectus was the most successful hominin species to inhabit this planet, surviving 15 times longer than our own species has so far.
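Those comparative figures are easy to verify; a minimal sketch in Python follows. The 100,000-year tenure for our own species is an assumption on my part, consistent with the “about 100,000 years ago” date given later in this chapter for early Homo sapiens (with the 200,000-year date for Archaic Homo sapiens, the ratio halves to 7.5):

```python
# Quick checks of the comparative figures quoted above.

ERECTUS_FIRST = 1_800_000   # years ago
ERECTUS_LAST = 300_000      # years ago
SAPIENS_SO_FAR = 100_000    # years (assumed; see lead-in above)

erectus_span = ERECTUS_FIRST - ERECTUS_LAST
print(erectus_span)                    # 1,500,000 years
print(erectus_span / SAPIENS_SO_FAR)   # 15.0 -> "15 times longer"

# The brain-size comparison is also self-consistent: an average of
# ~1,000 cc that is "about 60 percent larger" than Homo habilis implies
# a habilis brain of ~625 cc, which fits the usual estimates.
print(1_000 / 1.6)                     # 625.0 cc
```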
Given its success and broad geographical distribution, it is not surprising that, around 700,000 years ago, Homo erectus began evolving into several other hominin species, commonly grouped together and designated “Archaic Homo sapiens”. Some members of this group apparently developed a major new cognitive advance that would be essential for becoming modern Homo sapiens. Depending on where they lived geographically, these hominins developed as Homo heidelbergensis and Homo neanderthalensis in Europe (specimens from Spain, dated to approximately 430,000 years ago, display features of both); Homo rhodesiensis in Africa; Homo floresiensis in Indonesia; and, in Siberia, the Denisovans, a sister group to the Neandertals, whom they apparently outnumbered and with whom they interbred. Neandertals lived in Europe from 230,000 to 40,000 years ago. The largest concentration lived in what is now southern France, but they were widely distributed, from Wales in the west to Uzbekistan in the east. There is no evidence that Neandertals ever migrated to China or Indonesia, as Homo erectus had done, or that they ever lived in Africa. The most striking physical characteristic of the Neandertals was their large brain, which averaged 1,480 cubic centimetres, larger than the average for modern humans.
Their short, stocky frames, similar to those of modern-day Inuit, were suited to the cold European climate. They followed animal herds in the summer and spent winters in a home base, often a cave. They made extensive use of fire and animal skins for warmth, and were excellent hunters. They made both stone and bone tools, and weapons significantly more sophisticated than those of Homo erectus. Their spears were as elegantly balanced as Olympic javelins and were used to hunt herd animals, the source of their largely protein diet. Much of the hunting was done in groups, and there is evidence of co-ordination, such as driving herds of bison and mammoths over a cliff. They also fished and trapped birds. Despite having large brains and sophisticated hunting techniques, however, the culture of the Neandertals was remarkably static. There were no innovations, just a narrow repertoire of ancient technologies that sustained them for thousands of years. They never invented the harpoon, the bow and arrow, or other such weapons, despite hunting large animals for almost 200,000 years. Based on their brain size alone, Neandertals should have built computers and flown to the moon; the discrepancy between their brain size and lifestyle has been characterised as a brain-culture mismatch. That said, crosshatched lines, 39,000 years old, carved onto the rock of a cave on Gibraltar, have been found which are suggestive of early art (crudely painted shells and collections of feathers from large birds, such as eagles, have also been found). However, one clear difference from their hominin predecessors is established: for the first time in history, there are suggestions of caring for other members of the group. The remains of nine Neandertals, who died 60,000 to 80,000 years ago, have been found in caves in Iraq. Amongst them was one older man who showed evidence of having had severe injuries, with multiple fractures, many years before his death, including trauma to his right arm and leg that would have crippled him, as well as a blow to his head that left him blind in one eye. Such a hominin would not have survived for long on his own, and it is clear that others provided care for him for many years. Another example of caring among Neandertals was their practice of burying their dead. Dating from between 75,000 and 35,000 years ago, at least 59 intentional Neandertal burials at 20 sites have been found, mostly in southwestern France. Most were placed in a tightly flexed foetal position, which may have had a symbolic significance. Neandertal burials may suggest belief in some sort of afterlife but, at the very least, they show a strength of attachment between individuals that transcends anything seen previously: a gesture toward the dead that was far from obligatory for any but emotional reasons.

Providing care for another suggests that you are able to share their emotional perspective, which is to say, to empathise with them, to get into the mind of the other, to know what they are thinking and feeling. Psychologists refer to this as having a theory of mind: an understanding that the behaviour of others is motivated by thoughts, emotions, and beliefs. It is not merely being aware of the physical presence and intentions of another; early hominins all had that ability, as do many animals, as when wolves submit to an alpha male. A theory of mind, however, involves actually putting yourself into the other person’s mind.
We read the minds of others not only by listening to what they say but also by observing their facial expressions, gaze, posture, and movements. By definition, an awareness of others cannot develop until an awareness of self has first developed, since you cannot understand the thoughts and emotions of another unless you are aware of your own, which serve as your point of reference. It was at this point that humans became aware of themselves as one amongst many, no longer as the centre of the universe.

It is very unlikely, however, that Neandertals believed in gods. Although they apparently had acquired a theory of mind, they had not yet acquired the second-order theory of mind that would allow them to think about what the other was thinking about them. Nor had they acquired an ability to fully project themselves into the past and future and to use their past experiences to plan their future. They were not yet cognitively mature enough to prehend and honour the gods.

By 100,000 years ago, hominins had been separated from their primate ancestors for about 5.9 million years: 99 percent of the time from the point of separation to the present day. What were the odds, in the remaining mere 100,000 years, that hominins would write Handel’s Messiah, split the atom and fly to the moon, let alone build monuments such as Angkor Wat and Chartres Cathedral to honour gods? Something remarkable was about to happen.

Breakthrough

About 100,000 years ago, early Homo sapiens developed an introspective ability to reflect on their own thoughts. Thus they could think not only about what others were thinking but also about what others were thinking about them, and about their own reaction to such thoughts.

There is archaeological evidence from South Africa of cave dwellers (who also made crude rock shelters where there were no caves) who ate a varied diet including seafood and game, lived a reasonably settled existence, used bedding made from various grasses and plants (including some plants with insecticidal properties against, for example, mosquitoes), and made herbal medicines. In these caves, which centre on the Blombos Cave in the Southern Cape and are dated to 77,000 years ago, were also discovered seashells, covered with red ochre, which had been deliberately perforated, allowing them to be strung together as necklaces or bracelets. A 100,000-year-old red ochre processing workshop was recently uncovered in these same caves. Ochre can be used for tanning skins, as an insect repellent when applied to the skin, and for hafting stone tools onto wooden handles, but also as body decoration. While it is not possible to say definitely what the ochre was being used for 100,000 years ago, the fact that it was applied to perforated shells suggests that it was, at least occasionally, being used for decoration. Altogether, five different kinds of shells have been identified, and among the shell beads it was found that “beads from two separate archeological layers displayed patterns of wear distinct from one another suggesting that they had been re-strung and worn differently at different times.” This is the first evidence of evolving hominin fashion consciousness. Also in the South African caves, 15 pieces of ochre were found that had been modified by scraping and grinding, and then deliberately engraved with straight-line designs. On one, for example, the cross-hatching consists of two sets of six and eight lines partly intercepted by a longer line. Some of the engraved pieces of ochre have been dated to approximately 99,000 years ago.
Elsewhere, in what is now Botswana, a six-metre-long rock, deliberately carved as a piece of public art to resemble the head of a snake, has been dated to 70,000 years ago. There is also evidence that the inhabitants of the South African caves were wearing fitted clothing at this time. Animal skins had been worn for warmth for thousands of years by members of Homo erectus and Archaic Homo sapiens living in the colder climates of Europe and Asia. However, about 72,000 years ago, modern humans began to wear clothing that was more tailored than simple animal-skin capes, even in hotter climates.

Sophisticated tools, pierced shells, fitted clothing, engravings on ochre, rocks carved to resemble animals: a new kind of hominin had clearly emerged. The behaviour of these hominins was so at variance with the behaviour of their predecessors that we designate this group Homo sapiens, “wise man”. We assume that such individuals must have made some kind of major cognitive leap forward, since wearing shell jewellery, decorating one’s body, and wearing fitted clothing all suggest that early Homo sapiens had become not merely self-aware, but aware of what others were thinking about them. Self-adornment is a means of advertising family relationships, social class, group allegiance, and even sexual availability. It is intended to send a message to observers. Self-adornment has been used by Homo sapiens in every known culture, often involving extraordinary investments of time and resources. At the heart of self-adornment is one Homo sapiens thinking about what another Homo sapiens is thinking about him or her. This is the second-order theory of mind: thinking about what one person thinks another person is thinking. Its acquisition requires the person to view the self as an object. It is not merely looking in a mirror and recognising the self, but rather being able to think about what you look like to other people, how they see you, and what you think about how they see you. It includes being able to think about yourself thinking about yourself. It is, in short, the introspective self. The fact that early Homo sapiens were apparently decorating themselves and wearing fitted clothing suggests that they were now thinking about themselves and how they appeared to others. The evolution of an introspective self provided early Homo sapiens with major advantages over other hominins, especially in social interactions, as they were able to predict others’ behaviour. It would have greatly facilitated group activities such as group hunting, and put Homo sapiens at a significant advantage in warfare against other hominins who did not possess this cognitive skill. Zygmunt Bauman wrote: “Unlike other animals, we not only know; we know that we know. We are aware of being aware, conscious of ‘having’ consciousness, of being conscious. Our knowledge is itself an object of knowledge: we can gaze at our thoughts the same way we look at our hands and feet and at the ‘things’ which surround our bodies not being part of them.” The evolution of the introspective self may be the defining moment in the development of human cognition. Thought is born.

In the Bible this emergence of an introspective self is symbolised by the Genesis myth of Adam and Eve, who eat fruit from the forbidden tree in the Garden of Eden and, for the first time, become aware of themselves and their nakedness.
This is most likely drawn from the experience of childhood, when at an early age we become aware of ourselves, rather than being a commentary on the development of humanity, but the story powerfully addresses both. We experience differentiation and, with it, alienation, something both good and bad. It is, ultimately, the source of the existential longing and loss which forms the basis of religion; but it is not so much a “fall” as a great leap forward.

The concept of gods would not have occurred to hominins prior to about 100,000 to 77,000 years ago, and would probably not have fully developed before about 10,000 years ago, with the rise of settled agrarian society. The human brain, and thus the self-aware human world, would not have been ready for them before that time. However, somewhere between 100,000 and 10,000 years ago, between Homo sapiens’ awareness of others and the appearance of the pantheons of gods which characterise settled agrarian communities, something happened. That something was the prehension of the numinous, of the wholly other: that which transformed Homo sapiens from a self-aware shell decorator into a worshipper.