Chapter 1 - A Contemporary Trinity
The theory of evolution by natural selection –independently arrived at by Charles Darwin and Alfred Russel Wallace towards the middle of the 19th century– is a powerful philosophical concept. Nevertheless, as emphasized by Ernst Mayr, it is amazing that nobody thought of it during the more than two thousand years of European philosophy, starting with the Greeks and including minds of the caliber of Descartes, Hume and Kant, until the beginning of the 19th century (Mayr, 2000). It is a brilliant idea that combines great simplicity with huge explanatory power. It associates the notion of survival of the fittest members of a population with the fact of limited resources, resulting in the unavoidable consequence of the progressive elimination of those less able to deal with constraining environmental conditions. Assuming a mechanism to generate diversity from one generation to the next, there are bound to be changes tending toward segregation. With time, the diversified sub-populations become separate species, each adapted to a particular environmental niche. The Aristotelian notion of scala naturae –a grandiose discontinuous march toward ever more perfect creatures– was overthrown, replaced by the idea of a gradual evolution lacking pre-established goals.
Our new century, thanks to the mutually complementary work of many notable scientists and thinkers of the previous two centuries, finds the biological sciences integrated within the grand paradigm of the theory of evolution by natural selection. Among the great contributors we must mention, aside from Gregor Mendel, the early discoverer of the four laws of heredity,1 Hugo Marie de Vries, the discoverer of genetic mutations,2 and Francis Crick and James Watson, joint discoverers of the molecular basis of heredity. We will discuss their contributions in more detail further on.3
As usually happens with every active scientific paradigm, there are still discussions concerning the theory of evolution, all of a strictly technical character. The adversaries of the paradigm, mainly religious fundamentalists, can find no comfort in them. None threatens the essence of the “grand synthesis,” which rests upon the concepts of natural selection, heredity through recombination of genes, and fortuitous mutations, together with the physicochemical molecular mechanisms which thoroughly explain all those natural processes.
The taxonomy of living beings, popular in pre-Darwin times, descended directly, through Aristotelian articulation, from Plato’s essentialist conceptions. The very word “species” was coined in classical times as the standard translation of eidos, the word used by Plato to mean “form” or “idea.” This Greek philosopher considered ideas the only authentic reality, the sole self-standing beings, enduring in an ethereal universe different from our own; things, in contrast, were mere appearances and shadows of those eternal ideas. An important consequence of this bizarre conception was the essentially discrete nature of reality: each of the ideas, identical only to itself, was essentially different from all others. Chiaroscuros were allowed only in appearances, in the world of the senses, not in the world of the truly real, the rational concepts of interest to science.
The application of this intellectual prejudice to the investigation of nature prescribed canvassing the world in search of the imprints of pure essences, reflections of the Platonic ideas; transitions between definite forms were interpreted as deceits of the senses, sheer appearances, or God-sent monstrosities in punishment for our sins. These prejudices were overcome early by the Renaissance physicists and astronomers, but remained entrenched in the biological sciences for several more centuries. Still in the 18th century, the different species of organisms were considered eternal, atemporal forms, in homology with the perfect triangles and circles of Euclidean geometry. It was not until the first half of the 19th century that the development of geology and archeology began to soften this aberrant essentialist position.
Thomas Malthus' Essay on the Principle of Population (1798) deeply influenced Darwin’s thinking. Malthus had theorized that in a world populated by organisms that reproduced themselves abundantly, it was mathematically unavoidable that, sooner or later, the population would end up being disproportionately large with respect to the available resources. At that point, many organisms would begin to die before reaching the age of reproduction.
Darwin would add two points to this sensible perspective. First, if there were meaningful variations –however small– in a population, any relative advantage in some individuals would slant reproduction towards them as soon as shortages of food or space began to occur. Second, as a consequence of the first point, differences would tend to increase with time, resulting in a drift of the population toward specimens with advantageous variations. As he stated in his work, if more individuals were born than could survive, a grain of sand in the balance would be enough to determine which individual would live and which would die, which variety or species would increase and which would decrease or ultimately become extinct (Darwin, 1859). The gist of the natural selection process must be considered as a two-stage sequence: first, the blind generation of variants, through reproduction with occasional differences among the offspring; second, the selective filtering of those variants, through the differential survival and reproduction, under conditions of scarcity, of the better adapted.
It is easy to see the correspondence of this process with the generate-and-test method widely applied in artificial-intelligence algorithms,4 as much as with the research and development strategy applied by large contemporary firms –e.g., pharmaceutical companies– to design new products.
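To make the parallel tangible, here is a minimal Python sketch of that two-stage cycle –blind generation of variants, followed by selective elimination under a fixed carrying capacity. All names and parameters (the target string, the mutation rate, the capacity of fifty survivors) are illustrative choices of ours, not part of the theory itself.

```python
import random

random.seed(0)

TARGET = "METHINKS IT IS LIKE A WEASEL"   # an illustrative "environment"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "
CAPACITY = 50          # limited resources: only 50 individuals survive
MUTATION_RATE = 0.02   # occasional copying errors per character

def fitness(individual):
    """Stage two criterion: how well a variant copes with its environment."""
    return sum(a == b for a, b in zip(individual, TARGET))

def replicate(parent):
    """Stage one: blind copying with occasional random variation."""
    return "".join(
        random.choice(ALPHABET) if random.random() < MUTATION_RATE else c
        for c in parent
    )

# start from a population of entirely random strings
population = ["".join(random.choices(ALPHABET, k=len(TARGET)))
              for _ in range(CAPACITY)]

for generation in range(1000):
    # stage one: every survivor leaves several offspring (overproduction)
    offspring = [replicate(p) for p in population for _ in range(4)]
    # stage two: scarcity eliminates all but the fittest CAPACITY individuals
    population = sorted(offspring, key=fitness, reverse=True)[:CAPACITY]
    if population[0] == TARGET:
        break

print(generation, population[0])
```

Neither stage "knows" anything about the target; yet their mechanical combination drifts the population toward it, exactly as the two-stage sequence above predicts.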
Before the publication of The Origin of Species, in 1859, as we are reminded by Ernst Mayr, almost all European scientists and philosophers were Christians. The world had been created by God, with wise laws that assured the adaptation of organisms to each other and to their environment. Darwin proceeded to dispose of supernatural phenomena and causes, since his theory of evolution by natural selection explained world diversity in a completely materialistic way. There was no longer any need for a God designer of the universe, since all features of the marvelous design in nature,6 acclaimed by the believing scientists who preceded him, could be explained by natural selection. It is remarkable that this mechanistic explanation of nature disposed of the infamous problem of evil in the world, which theologians were unable to explain in previous ages. The new conception predicts, with great simplicity, that the automatic design of natural selection does not always favor human beings, as clearly demonstrated by diseases, most of them attributable to other biological beings –viruses and bacteria– which restlessly and ferociously compete with us as ecological rivals. Evil ceases to be a problem, becoming rather confirmatory evidence, within the new conception.
Ever since the times of the classical Greek philosophers, European thinkers had emphasized world invariance and stability. The sole variety considered real was that separating natural classes, each one uniform in itself by virtue of its essence, even if its individuals differed significantly by their accidents. The characteristics of each class were considered fixed and immutable, clearly distinguishable from those of the other classes. The difficulty that essentialism had in dealing with variation among living beings still permeates today the equivocal concept of “human races.” To an essentialist, Caucasians, Africans, Asians, or Inuit were types conspicuously different from each other, a way of thinking that inevitably led to racism. Darwin rejected essentialism, proposing instead population thinking: all groups of living organisms, humans included, constitute populations of individuals so different from each other that each one is, in fact, unique. There are no two identical human beings, or any other living beings for that matter. Populations vary not because of different essences or accidents, worn-out mythological philosophical concepts, but by virtue of their statistical differences. By rejecting the immutability of populations and the concept of natural classes, Darwin fully introduced history into scientific thinking, laying down the foundation for contemporary scientific humanism (Mayr, 2000).
The evidence in favor of the theory of evolution has not ceased to accumulate since Darwin’s times. Some arose in the fields of geology, paleontology, biogeography, or anatomy, the main sources used by Darwin. More recently, proofs from molecular biology and other biological sciences, such as ecology and virology, have appeared. The demonstrative force of the contributions from all these sciences is overwhelming; so much so that we may say, joining Daniel Dennett (1995), that today anyone doubting that the diversity of life on this planet was produced by a process of evolution is simply an ignoramus. We have collected some of the more impressive evidence produced in those fields in favor of evolution in one of our appendixes.7 To that evidence should be added the massive confirmation contributed by the recent and ongoing sequencing of the genomes of many different species, the human one included.8 Many homologous genes have been found, coincident across different species, even in pairs as distant from each other as the fruit fly and the human being, or plants and mammals. It is no wonder that scientists have accepted natural selection as the essential explanatory principle of the biological sciences. Despite the pretensions of American creationists, evolution by natural selection has long since ceased to be a theory in the derogatory sense, tantamount to a proposition with little support. It is a theory only in the epistemological sense of a very general proposition with sufficient explanatory power to give meaning to a whole field of scientific research. In fact, the theory of natural selection has become the fundamental statement of all contemporary biological sciences. That is exactly what historians of science call a scientific paradigm (Kuhn, 1962).
This paradigm has been confirmed, clarified and quantified over the years, proving stronger and stronger as it has solved every puzzle and overcome every challenge with which it has been confronted. As Karl Popper would reason, a false idea would have succumbed to the unmerciful and concerted attacks coming from so many quarters throughout decades of scientific inquiry and testing (Popper, 1962).
Natural selection, understood as the survival of the fittest (the one more likely to survive and have offspring), is the cornerstone of contemporary biology. Notice, however, the strictly formal character of the natural selection principle. Although it is able to explain all kinds of biological phenomena, its rigorous formulation does not even need to consider life itself; it is enough that it refer to beings capable of replicating themselves. This allows us to establish a connection with another of the powerful ideas of our “contemporary trinity”: the computer algorithm. In order to see this more clearly, let us specify the components of the natural selection principle in rigorous logical terms: (1) there is a population of constituents capable of replicating themselves; (2) replication is generally faithful, but occasional variations appear and are transmitted to subsequent copies; (3) resources are limited, so that not every constituent succeeds in replicating; (4) the probability that a constituent replicates depends on how its particular variations fare under the constraining conditions of the environment.
As you can see, there is no mention here of living beings, only of constituents capable of replication, whatever their nature might be. They could, for instance, be industrial companies, giving grounds for the competition theory of economics. Or they could be computer programs –written in an appropriate language– producing the digital-evolution phenomena studied by a brand-new discipline, artificial life.9
At the beginning of this section, we shared Mayr’s amazement at the fact that the idea of evolution by natural selection, so powerfully explanatory and of such compelling simplicity, had taken so long to appear in European thought. We may wonder about the reasons for such a delay, in contrast with, for instance, the atomic theory, one of the paradigms of contemporary physics, already conceived by Leucippus of Miletus and Democritus of Abdera in the 5th century BC. An explanation that comes immediately to mind is humans’ resistance to losing their mythically overestimated status with respect to other species, manifest in almost all civilizations. This would explain why the idea was not easily accepted, but not why it was not even conceived by some of the most inquisitive minds that humankind has produced. The heliocentric theory was vigorously defended several centuries ago, in clear opposition to Aristotelian doctrines dogmatically supported by the Catholic Church. However, no one conceived the idea of natural selection before Darwin’s time. In my opinion, there probably was something more involved here. I believe it had to do with the difficulty of directly perceiving the splendid simplicity of this algorithm. In higher –multi-cellular– organisms, selection and its consequent evolution operate through a two-level mechanism, which acts as a concealment device capable of misleading even outstandingly sharp minds. The biological replicators of multicellular beings –we call them genomes– are conceptually and physically distinct from their bearers –what we call organisms.10 What directly evolves through the generations is the genome, although the element that must survive until the age of reproduction is the genome carrier, the individual organism. The evolving genome, although present in each cell, is not easily detected and, before Mendel, no one even had an inkling of its existence.
A complicated process of construction and maintenance mediates between genome and body, namely the differential expression of genes according to the bodily region where each specific cell is rooted.11 It was this mediation that constituted, in my opinion, an impediment to an earlier discovery of the fundamental mechanisms of evolution in higher organisms.
The genome-organism distinction came late in the history of life. In the earlier stages of evolution, at the beginning of what specialists call the RNA world,12 replicators were simply threads of nucleic acid, instead of being carried by a body formed through their mediation. In other words, at that time the algorithmic conditions were directly fulfilled, without any intervention of gene expression or of the sexual mixing of paternal and maternal genomes, as would happen later, when the world of DNA13 came into existence. Nowadays natural selection acts indirectly. What is reproduced –identically or with occasional mutations– is the genome; but what survives (or not) until reproduction is the organism carrying the genes, better or worse adapted to the environment. For primitive replicators, evolution was controlled by what we should call primary natural selection, on account of its being the first that existed and also because of its literal correspondence to the logically simple algorithm described in the previous section. Our own mode of replication, in which the distinction between genome and carrier holds, should be called secondary natural selection. It is the one controlling the evolution of higher organisms, more complex and appearing much later than those evolved through primary natural selection. The great merit of the discoverers of evolution by natural selection was to have been able to intuit, under the cover of the secondary type, the algorithmic nature and power of the primary one. Further on we will see that cultural evolution corresponds to the first type of natural selection, not to the second.14
Without detracting from the remarkable shrewdness of the discoverers of natural selection, Darwin and Wallace, the fact that the discovery occurred precisely in the 19th century must have been propitiated by the chronological precedence of Adam Smith’s economic thought, with which both scientists were well acquainted. As a matter of fact, the natural selection implied by the “invisible hand,” postulated by this thinker to explain the magic of resource optimization in a free market, is decidedly of the first type. As this kind of natural selection is simpler, its surfacing in the cultural ambiance of the 19th century must have facilitated the perception of a similar but more complex mechanism in the biological realm. The Baconian methodological principle –what nature hides in the complex it manifests in the simple– might have been in operation here.
We understand by algorithm a reliable and mechanistic procedure that achieves a specific result, especially in mathematics and computer science; the concept is also applicable in other fields, such as economics, engineering and, as we will see with more rigor, biology itself. The term ‘algorithm’, through several translations and corruptions, comes from the name of the Persian mathematician Mūsā al-Khwārizmī, whose book on arithmetic processes, written around the year 835, was translated into Latin in the 12th century. The use of the word in the sense explained above has been current for several centuries; it reached notoriety, however, in our own time, thanks to the meta-mathematical15 work of thinkers such as Hilbert, Gödel, Church, and Turing, who revolutionized the mathematical sciences during the first third of the 20th century. The transfer of the concept from mathematics to computer science is due to the last mentioned of these distinguished mathematicians. He proposed to his colleagues to define ‘algorithm’ formally as whatever a particular machine –now bearing his name– does. The machine, as described by him, can materialize infinitely many different configurations (Gutierrez, 1993). According to his proposal, gladly accepted by his colleagues, it became settled that something is an algorithm if, and only if, it is a process with the following characteristics: (1) it consists of a finite sequence of instructions; (2) each instruction is simple enough to be executed mechanically, without insight or interpretation; (3) at every step, the next instruction is completely determined by the current state of the process; (4) the process terminates after a finite number of steps, delivering a definite result.
The penultimate condition forbids probability or chance; for instance, tossing a coin to decide whether or not to perform the next instruction. The last requisite is mandatory only in cases where a specific result is expected, such as the value of a function (for instance, dividing 7834 by 555 to a specific number of decimals), or the preparation of a certain number of food servings or of a synthetic drug. However, never-ending algorithms are allowed when the intention of the process, uninterrupted or intermittent, is not a result or value but a collateral effect. For instance, the algorithm that supports a pacemaker should never end, since its function is precisely to permanently regulate the beating of an ailing heart. The algorithm of an operating system such as Windows or Linux should not terminate either, since its function is to sustain the indefinite functioning of a variety of programs within a particular electronic environment. These algorithms, which in principle should never end, are referred to as irregular or partial algorithms.
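The contrast can be sketched in a few lines of Python (an illustrative example of ours, not part of the original definitions): a terminating algorithm that delivers a value –the division of 7834 by 555 mentioned above– beside a deliberately endless control loop of the pacemaker kind, whose whole point is its collateral effect rather than a final result.

```python
def divide(dividend, divisor, decimals):
    """A terminating algorithm: long division to a fixed number of decimals."""
    quotient, remainder = divmod(dividend, divisor)
    digits = []
    for _ in range(decimals):
        remainder *= 10
        d, remainder = divmod(remainder, divisor)
        digits.append(str(d))
    return f"{quotient}." + "".join(digits)

def regulate(read_heart_rate, stimulate, target=60):
    """A 'partial' algorithm: an endless control loop whose value lies in
    its side effect (keeping the rate up), not in any final result.
    The two callbacks are hypothetical sensor and actuator hooks."""
    while True:                      # by design, this loop never terminates
        if read_heart_rate() < target:
            stimulate()

print(divide(7834, 555, 4))   # the example from the text: 7834 / 555
```

The first function satisfies all four characteristics, the last one included; the second deliberately drops termination while remaining perfectly mechanical and deterministic.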
Before the advent of digital computers, algorithms were generally conceived as pertaining to the exclusive turf of mathematicians, since an algorithm was essential to calculate a function. Once the concept of algorithm was equated to the operation of a Turing machine, electrical engineers began to build real machines embodying –to the extent possible– that mathematical abstraction. Those machines were the predecessors of the graceful contraptions which have taken over our desks in the form of personal computers, such as the one I am using now to write this book. Currently, we associate ‘algorithm’ with a broad range of virtual objects as diverse as spreadsheets, word processors, databases, electronic calendars, all kinds of digital games and, let us not forget, the operating systems themselves which, impeccably and vertiginously, run all those programs for us.
It is important to point out that the concept of algorithm does not by itself specify the materials required for its construction. In fact, algorithms do not have to be constructed of any stuff in particular, except for the stuff our dreams are made of. To be effective in the world, of course, they do have to “incarnate” in some material: our own body, when we brush our teeth or do body building; transistors and cables, in the printed circuits of electronic gadgets. Far more encompassing is their incarnation in a general-purpose machine, such as the one created by electrical engineers after the image of Turing’s: the computer.
The difference between hardware (wiring) and software (programs) is by now proverbial. It is what distinguishes a computer –an algorithm that performs very general functions– from the specific program it runs, which transforms it into a much more specific approximate16 Turing machine, for example a word processor or a spreadsheet. Our older readers may remember that, in the 70’s, “word processors” were very expensive typewriters which allowed one to do wonders (for that time) while preparing documents. They were real algorithms, but not the soft ones incarnated in electromagnetic media that we nowadays download to a computer; they were hard ones, firmly incarnated as permanent electrical circuits.
The irrelevance of building materials is important for the purposes of this work since –in agreement with Daniel Dennett– one of its main theses is that evolution by natural selection, the generator of all life on Earth, is an algorithm, in fact the oldest and most powerful of them all.
Having lived through a great part of the 20th century, I am able to contemplate a broad panorama of algorithms intertwined with experiences of my own life: first, those that I found when I went to school; later on, those that befell us all –for some, as manna from heaven to simplify their work; for others, as the anguish of being left behind halfway through life. I remember my admiration, as a child in the 30’s, at the effectiveness of the rules of addition, subtraction, multiplication and division, taught to me by my mother and my school teacher. In the 40’s, the unbounded joy of discovering those of basic algebra and geometry. As a law student in the 50’s, I wondered at the procedural rules that ensured results in court if and only if all legal steps were faithfully followed to their minimal, sometimes ridiculous, details. Later on, as a graduate student at The University of Chicago, I discovered the magic of economic algorithms, especially the ones regarding supply and demand which, under appropriate conditions, automatically ensure the best possible social provision of goods and services. During my doctoral studies, I also learned that logic and mathematics were the same thing, as argued by Peano and Bertrand Russell, among other famous creators of the algorithms of contemporary mathematical logic.
But those were old algorithms, some of them even centuries old. It was not until the 60’s that real novelty began to appear, with the commercialization of computers. Algorithms now had the surprising feature of not only explaining action, but also executing it by themselves. These algorithms, computer algorithms properly so called, were ultimately equivalent to the mathematical ones. However, they came forth with an autonomy that gave them the capacity to gradually change the very structure of society and –as a result of our reaction to them– also our own conception of human nature and the world. An intriguing dichotomy emerged, whose depth and transcendence we are still pondering. On the one hand, the incredible exploit of making machines perform rational processes, considered until then the reserved turf of humans, underlined more than ever our extraordinary capacity for conceiving and handling symbols, an exclusive characteristic of our genus within the animal kingdom, as Aristotle stated. On the other hand, the great achievement of a machine being able to perform symbolic activities pressed on us a profound introspection regarding the nature and scope of our “specific difference” as human beings. After our long struggle to overcome ancestral atavisms, which liken us to the wildest animals, we now perceive that rationality –with which we glorify ourselves so much in theory but which we exert so little in practice– likens us in some ironic sense to the new machines. So much so, that the mechanistic paradigm clearly reigning in the biological sciences points unequivocally in the direction of considering our organism (a machine after all) the finest and most delicate of all mechanisms, including our brain and all its portentous processes.
As humankind was receiving this introspective impact, it was also collecting a plethora of new goods from the computer revolution. They could not even be counted, since more were continuously appearing, at an accelerating speed. Algorithms went wild through the world, transforming each and every human activity, without exception, from the most material to the most spiritual. In the 60’s and 70’s, we witnessed the transformation of bureaucracies and feared that computers would become powerful weapons for monstrous totalitarian regimes that could put an end to our liberty and privacy. Fortunately, at the beginning of the next decade, a revolution within a revolution arose from a Californian garage, and the personal computer instantly began to take over the world. It became a major democratizing instrument that would later be exceeded only by the Internet17 –a revolution within a revolution within a revolution– in the 90’s.
Let us mention some of the impressive consequences of these spectacular recursive revolutions that characterize the computer era, still in full swing today:
Very early, the word processor transformed forever the practice of writing, introducing all the marvelous small advantages which make it superior to paper and pen or the typewriter as an instrument for document production.
Shortly afterwards, the electronic spreadsheet relieved the intellectual labor pains of millions of accountants (professional or domestic). Those who did not live before this outstanding invention have no idea of the incredible torture involved in changing a single datum on a book-keeping sheet with pencil and eraser, given the enormity of its repercussions on the rest of the sheet.
Immediately following, relational databases appeared: prodigious swarms of logical and automatic interconnections which forever relieved filing clerks and allowed managers and the general public to do relevant, instantaneous searches in seas of information.
Electronic games and “virtual reality” appeared, opening all of a sudden a new world of possibilities to human expression, amusement and communication, creating in passing the foundations for new interactive forms of education.
Banking operations were totally transformed, from one day to the next, becoming more efficient, inexpensive, and user friendly.
The just-in-time strategy was introduced in commerce and industry, reducing world inventories to a minimum and cutting prices on countless items for millions of people.
The introduction of robots allowed the world production of automobiles and electrical appliances to meet the increasing demand, far beyond what population growth would have permitted without this technological development.
Flexible manufacture was introduced, allowing the redirection of plant production toward customized specifications by the simple change of some parameters in a computer program. This transformation allowed industries to adapt to the preferences of small groups of consumers in fields as diverse as the machine-tool industry and book publishing, overcoming the inconveniences inherent in the mass production inherited from the Industrial Revolution.
Multiple new work-at-home opportunities were generated, in fields as diverse as consulting, stock-market brokering, translation, or outsourcing of regular office and business activities. Thanks to electronic networks, those outsourcings could spread beyond frontiers and oceans, pari passu with the globalization of the economy.18
The introduction of digital imagery made the human body practically transparent for medical diagnosis, with the resulting benefits for the health of world populations.
More recently, human genome sequencing, with the consequent explosion of the biological sciences, is about to produce custom-made medication for each human genotype, making traditional pharmacology and medicine all but obsolete.
A digital communication protocol, created in the 70’s by scientific researchers to exchange results at almost instantaneous speed –what would later become the Internet– explosively took over commerce and daily life during the 90’s. It has introduced incredible improvements in the quality of human life and set the stage for long-distance, personalized and all but free self-education on a global scale.
To this impressive list we must add a remarkable contribution of the 21st century: an extension of the Internet algorithm has begun to drastically revolutionize world telephony by allowing voice communication over it. This phenomenon is driving the price of national and international telephone calls toward the cost of an Internet connection, insignificant when compared to traditional telephony. So much so that the latter has begun to be rapidly replaced by the new technology.
Let us dwell for a moment on the transforming influence which the Internet protocols are having on current and near-future education. We are dealing here with nothing less than the overcoming of the educational systems inherited from the Industrial Revolution, when traditional schooling was created as a complement to the large production plants and in their likeness. Today –more than two centuries later– we are still enduring that educational model practically all around the world: a system characterized by the massive concentration of students in separate and arbitrary “grades,” under rigid study programs practically identical for all students, with little or no attention to individual needs. Personal computers, at lower and lower prices, connected 24 hours a day to the world networks, capable of transmitting sound and image at extraordinary speed, are making it possible, and even unavoidable, to overcome such an archaic and ineffectual educational model. If we add the emergence of Google as a handy means of obtaining instant information about practically anything, from leading sources and in many different languages, we can visualize the distinct possibility of first-class personalized education for all humankind, comparable only to the one received by royal children during the Renaissance or the Enlightenment. Furthermore, this type of personalized education could in principle continue for life, adapting people to rapidly changing technology, early retirement, and cultural and economic globalization.
ALGORITHM AND PHILOSOPHY
Aside from their relevance for mathematicians, computer professionals and, through their practical applications, the general public, algorithms are food for thought for professional philosophers and other abstract thinkers. Besides the meta-mathematical works, of great philosophical interest in relation to the ultimate limits of knowledge,19 several general topics of considerable philosophical interest related to algorithms have surfaced in our days. Let us outline some of the most interesting.
The first and foremost of those topics entered the scene at the dawn of the computer era, with an article by Alan Turing regarding the possibility of machines being capable of rational thought (Turing, 1950).
Although Turing was not a professional in philosophy, he was indeed a philosopher in the profound sense that characterized the European Renaissance and the Enlightenment, when great physicists, mathematicians and distinguished politicians wrote about meaningful subjects connected with their fields. Such were, for instance, the cases of Francis Bacon philosophizing about the “idols” obstructing human knowledge, or Isaac Newton’s disquisitions equating absolute space with God’s sensory apparatus (sensorium Dei). In fact, Turing inaugurated with his article a brand-new science, currently called artificial intelligence, which would attract many philosophers (this author included) during the following decades, inspired by Thomas Hobbes’ dictum that the best way of understanding something (in this case, human intelligence) is to attempt to build it.
Artificial intelligence (AI for short), as John McCarthy would define it later, consists of the attempt to make machines do things which, if performed by a human being, we would say require intelligence. This attempt is carried out by providing the machine in question with computer algorithms capable of solving problems similar to those we confront, routinely or in specific situations: playing chess or other board games, proving theorems, solving riddles, understanding or producing language, recognizing objects visually, making medical diagnoses, preparing a travel itinerary, avoiding obstacles while walking, etc. The achievements of this enterprise have been considerable, considering its difficulty. It is interesting to note that success has been proportional to the degree to which the skill fell within the realm of expertise (such as playing chess or making medical diagnoses), and not of general, non-specialized knowledge or what we normally call common sense (small talk, discussing politics, posing philosophical problems).
AI, however, has its skeptics. One of them, the outstanding physicist Roger Penrose (1989), insists on the contrast between the mechanical or blind character of algorithm execution and the creative character of human intelligence, non-deterministic and subject to errors. That contrast drives him to deny that artificial intelligence is genuine intelligence and that human intelligence might be based on algorithms. However, he makes no effort to clarify the creative power of human intelligence and pays no attention to the achievements of the artificial one. For our part, we think the conundrum can be resolved in a different way. Of course, AI programs are blind and mechanical in some sense, since they are executed by machines; but so are the firing of our neurons and their synaptic biochemical connections. The paradox of intelligence, be it artificial or human, is resolved when we realize that the strategy of its programs combines two distinct algorithms, each performing its function mechanically: one is in charge of generating, somewhat arbitrarily, a number of hypotheses that could possibly resolve the problem in question; the other –complementary but necessary– is responsible for testing those hypotheses, by the application of a heuristic function, to choose the most promising one according to the nature of the problem. This function is not infallible, in spite of its application being totally deterministic.20 The interesting thing is that, through the combined action of these two mechanical, non-creative processes, some creative intelligence may be achieved –fallible for humans and machines alike. As magic does not exist, any deficiency that the creativity of our intelligent machines may show for the moment, as compared to human intelligence, is attributable to the simple fact that we have not yet finished our work of designing their respective algorithms, if we ever can. Period.
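As an illustration of this two-algorithm strategy, the following Python fragment (a hypothetical sketch, not any actual AI system) pairs a blind generator of hypotheses with a deterministic but fallible heuristic tester:

```python
import random

def generate_and_test(generate, heuristic, trials=1000):
    """Blindly generate candidate hypotheses and mechanically keep
    the one the heuristic function scores highest."""
    best, best_score = None, float("-inf")
    for _ in range(trials):
        candidate = generate()        # first algorithm: arbitrary generation
        score = heuristic(candidate)  # second algorithm: deterministic test
        if score > best_score:
            best, best_score = candidate, score
    return best

# Toy problem: propose random numbers and keep the one whose square
# is closest to 2 -- a crude, fallible approximation of sqrt(2).
guess = generate_and_test(
    generate=lambda: random.uniform(0, 2),
    heuristic=lambda x: -abs(x * x - 2),
)
print(round(guess, 3))
```

Neither component is creative by itself, yet their combination closes in on a solution; the quality of the result depends entirely on how many hypotheses are tried and how relevant the heuristic is to the problem.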
The strategy of generate and test is also present –under the name of genetic algorithm– in relation to a second important philosophical topic. It has to do with a new discipline, which emerged in the 1980s and culminated in the 1990s, referred to as artificial life. This discipline exploits the principle, already discussed, that an algorithm is indifferent to the materials in which it is embodied. With a clear understanding that natural selection is an algorithm, it takes seriously the assertion that there is no reason why there cannot be an evolution of replicators made of non-biological materials. They could be, for instance, digital sequences confined in a computer's memory, able to mutate over many generations within that particular ecological landscape. In other publications I have described the most conspicuous experiment so far in that field, underlining its value as a direct empirical proof of the validity of the theory of evolution by natural selection, regardless of its particular incarnation (Gutierrez, 1993 and 1999b).
Natural Selection as an Algorithm
Once one acquires a new notion, if it is at all worthwhile, an unavoidable work of internal elaboration takes place to incorporate it into our system of beliefs, through interaction with our standing ideas. This sort of dialectic compels us to rethink all that we know in the light of a new discovery or concept. It would seem to be an inescapable law of intelligence, whose possible foundation must be related to the organic connectivity of our neurons. I am inclined to accept, in that spirit, Daniel Dennett's insight (1995), which most appropriately reinterprets the theory of evolution by natural selection proposed by Darwin as an algorithm, in the same sense the word is used in contemporary computer science or mathematics. In accordance with that theory, duly complemented with the biological discoveries of the 20th century, the reproduction of a population produces in each generation a sufficient variety of individuals, from which environmental constraints mechanically select the best adapted to the available ecological niches. Taking into account the abstract character of algorithms, regardless of the material in which they are embodied, there is no impediment to accepting natural selection as a genuine algorithm, and one of old ancestry. And ancestry it has, having functioned throughout our planet for billions of years, populating seas and continents; more than enough to be declared the most distinguished of them all.
In order to better perceive the algorithmic character of natural selection, let us express it in the form of a flowchart, similar to those used by computer professionals to represent their programs.
1. A population of elements capable of replicating themselves is produced.21
2. The elements of the new generation confront environmental constraints and are supported by the resources of the environment.
3. Some of the elements of the new generation (those less equipped to confront the environment) are discarded, and the cycle returns to step 1.
The fact that the program may not end under normal conditions of species survival technically makes this an irregular algorithm, with characteristics similar to those of operating systems, as previously described.
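The steps of the flowchart can also be written as a loop in Python (a schematic sketch; `replicate` and `fit` stand for whatever mechanisms a particular embodiment supplies). Note that the loop's only exit is extinction, which is what makes the algorithm irregular:

```python
import random

def natural_selection(population, replicate, fit):
    """Run the selection cycle as a generator: replication with
    variation, then environmental filtering. The loop halts only
    if the population goes extinct."""
    while population:
        offspring = [replicate(p) for p in population]  # step 1: replication
        population = [o for o in offspring if fit(o)]   # steps 2-3: the environment filters
        yield population

# Toy run: numeric "organisms" drift randomly; the environment
# discards any that stray too far from the viable value 10.
generations = natural_selection(
    [10.0] * 20,
    replicate=lambda x: x + random.gauss(0, 0.5),
    fit=lambda x: abs(x - 10) < 3,
)
for _, survivors in zip(range(5), generations):
    print(len(survivors), "survivors")
```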
Algorithms and Natural Selection
The action of the generate-and-test method did not end with the production of humankind. At a certain moment of our lineage's evolution, our brain began to have so much connectivity that the capacity to conceive, transmit and interpret symbols emerged. The possibility of the natural selection algorithm being applied to something other than genes sprang forth, giving birth to human culture. This new kind of evolution has already been in course for more than two million years, from the first moment hominids communicated among themselves through non-oral symbols. The new capacity developed rapidly, through the inventions of oral and written languages, accounting, versification, secret codes, Greek temples, Gothic cathedrals, mathematics and the experimental sciences, all the way to current Homo sapiens specimens communicating via the Internet. In our times, it has produced something as impressive as symbol systems that can generate, transmit and interpret themselves, independently of human minds, in automatic machines created through the conscious process of accumulation of design: computers and computer networks. It might seem that we are living –for good or bad– if not the end of the history that began with the first stammerings of symbolic systems, then the beginning of one of its most remarkable and unprecedented stages, to be known perhaps by our remote descendants as the age of disembodied symbols.
The 19th and 20th centuries were the period during which technology, thanks to both the industrial and the computer revolutions, reached its peak as a peculiarly human form of adaptation to the environment. Its roots, though, lie in Antiquity and even in prehistoric times. "Prehistory" is an inappropriate name. It was coined by historians to identify history based not on documents but on vestiges of a material kind. It constitutes an inadequate classification of human time, based on pragmatic decisions of the professionals specialized in humankind's past, concerned with facilitating their work: no documents, no history! Although misleading to the general public, it carried a grain of truth: the beginning of history must be related to the invention of symbols. However, symbols appeared in the human lineage long before their written versions, at precisely the time when our ancestors began to express themselves with language, whether spoken, gestural, or of any other kind; that moment was the true dawn of history. I join whoever might have proposed that the birth of the first symbols, as rudimentary as they might have been –and definitely not the invention of writing– should be considered the authentic beginning of history.
The new century we are entering could not help having been built upon the previous centuries, and upon it, in turn, the future ones will be built. The end of history is not on the horizon. On the contrary, symbol systems and other means of expressing thought are becoming the most elaborate and meaningful ever. Human design capabilities will be in this century –no doubt about it– the highest achieved so far by the species and, through it, by evolution itself. The fundamental difference between this century's technology and that of the two previous ones consists in the crushing dominance of intellectual components over physical ones, of conceptual design over factory chimneys and the physical effort of manual workers. Non-intellectual aspects will continue to dwindle in relative terms in the value of goods. Engineering, in all its branches –including computer science, biotechnology and nanotechnology– will place more and more emphasis on ingenuity, the essential note in the etymology of the word. As to the products of these technologies, they will tend to encompass progressively more areas of human society, becoming second nature to a point and level undreamed of by previous generations. Human intellect will continue to materialize in "intelligence techniques," ever more remote from those which have predominated since the invention of writing, or even of the printing press (Lévy, 1990).
When we contemplate the monumental technological achievements of the past centuries, we cannot but marvel at the creative capacity of our species. At the same time, we should be overwhelmed with modesty, since even the individual most conspicuous in his contribution to this process surely knows he has only been a small link in a millenarian chain of human research and development. In this great movement, which began in true prehistory and has continued increasing day by day, the growing human presence in the natural world has proceeded according to a great ruling principle: the accumulation of design. This great construction emerges through the collaborative action, extended in time and space, of an immense number of small individual designers. Such a grandiose movement is nothing different in essence from the functioning of the supreme algorithm of natural selection –the same one that produced our biological constitution but which, since the humanization of humans, has doubled itself as a conscious endeavor, giving rise to the vast swell of history. The ideas produced by our brains now compete for a place in human minds, over and above the deep waters where biological species continue to compete for the planet's ecological niches. In contrast with the duration of the old enterprise of biological evolution, nature's experiment with the evolution of ideas22 has existed for only a comparatively short time. Nevertheless, it has already produced Picasso, Gandhi, Shakespeare, M. L. King, Aristotle, Marie Curie, Jefferson, Simone de Beauvoir, Chopin, Einstein, Edison, Nelson Mandela, and Darwin, among other magnificent creatures.
Charles Darwin wrote The Origin of Species to solve the relatively modest biological problem expressed in that title. However, as he did so, he was carried by his observational skills and analytical capacities to discover and describe a creative process novel to the human mind, which he called "natural selection" (Mayr, 2000). It consists of a blind or mechanical sequence of events devoid of purpose; in fact incapable of conceiving any idea of finality. From the philosophical point of view, this idea of natural selection turned out to be the answer to an even more profound question: how design –the capacity to produce devices that accomplish a function and consequently pursue goals, consciously or not– could have been created (Dennett, 1995). With this he was not only clearing up the origin of species, his purported intention, but also, without even realizing it, the problem of how human culture entered this world.
The argument from design has been a proverbial way of reasoning to demonstrate God's existence. Apologists of all denominations argue that the presence of design in nature –especially the marvels of the functioning of the human body and brain– demands as its cause the existence of a universal designer. Daniel Dennett, in his account of natural selection, points out that Darwin accepts rather than denies the premise of this argument. Right: biological nature and humankind are products of a great design. It could not have been otherwise. Simple randomness could not have formed the wonders flaunted by organisms in their splendid harmonious assembly of parts impeccably working together to assure their subsistence and reproduction. However, at the same time –and this is the revolutionary aspect of the Darwinian approach– thanks to the natural selection algorithm, design may have been produced spontaneously, without the intervention of a conscious purpose, by the simple accumulation of small fortuitous adaptations which, once achieved, restrict and condition the future ones within ranges where their advent is anything but implausible. Sufficiently long stretches of time and genetic instability take care of the rest. It is a process not too different from the one which explains the development of human technology, from the most primitive stone-age carvings to the wonders of contemporary industry. Although now supported by the contribution of conscious planning, technological development does not proceed any differently from biological evolution: existing instruments and routines at once condition and facilitate the creation of new ones; the generation of diversity and the success of the best alternatives take care of the rest.
Before Darwin, the only model we had of a process involving research and development necessarily implied the participation of an intelligent artificer. Darwin's genius is revealed by his having found and analyzed another, essentially equivalent process that obtains the same results by ways not necessarily conscious. In this procedure, work is distributed over enormous stretches of time, much vaster than the periods in which human technology would later emerge. However, at every step taken, nature preserved, with the attitude of a prudent saver, all the micro-designs already obtained, which consequently would not need to be achieved again from scratch. From this perspective, biological evolution appears in fundamental continuity with the research and development that would happen later, beginning in Paleolithic caves. It is in fact its obvious precursor and model. We are dealing in both cases with the application of the same algorithm, the game of competition, which takes for granted, and builds upon, at each cycle, the achievements already obtained.
We are well aware that, by using the term 'research' to designate the marrow of a design process in which no intelligent researcher (in the usual sense of the word) intervenes, we are straining the term beyond its normal application. Nevertheless, the history of philosophy and science is studded with proposals of term redefinition, which expand or change in some way a term's field of application. In exchange, the symbols gain clarity and usefulness for their delicate function of representing reality. A relevant example is Turing's proposal to replace the informal, vague term "algorithm" of the old mathematicians with a precise and formal one, which makes it equivalent to the operation of the abstract machine that bears his name (Gutierrez, 1993). This proposal, accepted by the mathematicians of the time, not only revolutionized the mathematical sciences but served as the basis for the invention of computers, unleashing the computer revolution. Good proposals of concept redefinition are those which, without affecting previous usages, solve major problems still lacking a solution, producing crops abundant in conceptual and practical benefits for humankind.
In the present case, the proposal is for the terms "design," "research," "development," "invention" and other related ones, which currently assume a mind behind the designated process, to be replaced by others with the same appearance and meaning, except that no conscious mind is assumed to be present behind the process in every case. The main benefit of this linguistic reform lies in the fact that, with the help of the newly defined terms, living beings and their characteristics become easily explainable, barring the need either to label them as incomprehensible or to invoke supernatural or mythic causes to explain their existence. Furthermore, it produces a neat logical continuity between historical processes and biological ones, instrumental in better understanding both types of phenomena. As a bonus, it allows the unification of biology and engineering within the classificatory framework of the sciences, as will be shown below.
The principle of accumulation of design means that, once a device, a tool, or a method to do something has been invented, further design processes, automatic or conscious, will be simultaneously constrained and facilitated by those previous achievements. One of those constraints is the drastic limitation of the fortuitous mutations which can occur starting from a given state, since the organism is adapted to an environment and that environment acts at each evolutionary step as a filter, reducing the number of further possible adaptations. This circumstance sharply contradicts creationist arguments assuming that evolution is a game of chance similar to throwing into the air a large number of type-mold letters in the hope that they will form, upon falling, García Márquez's novel One Hundred Years of Solitude.
At the beginning of a chess game, all possibilities are open. Once a player has moved a pawn or a knight, the possible continuations for both players are drastically reduced. This seems a limitation, but it is in fact what makes the game playable. In like manner, the decision to have perpendicular avenues and streets in a city, or rather some transversal boulevards, determines the set of possibilities with which an urban planner may play. Similarly, with each new invention of evolution, the feasibility of a new branch of life opens up, e.g., organisms with an exoskeleton (like insects) or with an internal skeleton (like vertebrates). These "body plans"23 involve very primitive and conserved genes that determine the ulterior variations which could eventually occur. The phenomenon is likewise present in engineering: the invention of the lever drove military technicians to invent catapults; a body plan for basilicas or cathedrals predisposed architects for several centuries to erect either Romanesque or Gothic buildings. In music –the engineering of time– the invention of the sonata or the symphony restricted later composers to exploring within limits the further combinatorial possibilities of their art. Here again, not all moves are possible, and it is delimitation that makes chords and melodies possible.
Once one has thought about it, the unification of biology and engineering, as two forms of application of the same intellectual discipline, seems unavoidable. There is nothing essentially different between trying to discover the blueprint with which a captured enemy submarine was built (a concern for a naval engineer) and trying to discover the blueprint of a squid just found at the bottom of the ocean (a concern for a marine biologist). In both cases, it is a question of making a precise inventory of the device's parts and determining the relations between them that produce their complex and efficient operation. Skeptics of this unification could argue that, even conceding that both objects may be products of design, important differences will always exist between the automatic and blind design of nature and the conscious and voluntary design of an engineer. One of them, which I know well enough from having practiced structured programming for several years, in accordance with software-engineering canons, is the circumstance that, in engineering, one follows the golden rule of a unique purpose for each device part (or computer function). This strongly contrasts with the typical multipurpose character of organic parts, due to the opportunistic nature of biological design: evolution plows with the oxen it has, creating for instance a single duct for swallowing, breathing and generating sounds, merrily running the risk of suffocation. However, perhaps this was the only possible way to produce the marvelous phenomenon of human language, in time for us to talk about this matter. Besides, it may even be questionable that multipurpose design is a disadvantage in itself. One may even say that its opposite, structured programming, is advantageous only because of the human brain's tendency to make mistakes.
Evolution can afford not to follow those rigid canons since, by definition, it works primarily by committing mistakes, purging them afterwards by natural selection. A software production company, in contrast, operates within a much more restricted time frame; it has to apply zero tolerance to errors in order to fulfill orders and survive competition.
Another difference between engineering and biology is the parallelism prevailing in biological processes, in contrast to the predominantly serial programming of engineering. Here again another deficiency of human design is exemplified: we are forced to think of only one thing at a time.24 With the advent of parallel computers, an attempt has been made to bridge this gap, although within large restrictions: when all is said and done, parallel processes have to coincide at critical points –the bottlenecks– where the fastest must wait for the slowest. The biological method is infinitely superior in this respect: diffusion of amino acids or other elements in the intracellular soup gives polymerases simultaneous opportunities to synthesize several points of a DNA chain, without having to wait for instructions from a central processor. Besides, their physical shapes allow (and at the same time force) them to do what they have to do.
We are led to the conclusion that the differences analyzed imply only that there exists a plurality of design strategies, revealed mostly by the contrast between living beings and artificial devices. However, if the criterion for defining something as a work of engineering is the presence of design –and I see no other legitimate criterion– we have no choice but to accept the essential homogeneity of engineering and biology. Nonetheless, biology will qualify for the time being just as reverse engineering,25 which is true engineering even if it navigates up the logical stream, instead of flowing from design to construction. Nothing guarantees that, in the near future, biotechnology will not take biology downriver, practicing direct engineering in addition to the reverse kind, by producing organisms not by natural but by artificial design.
This grand unification of biology and engineering is one of many discipline unifications which have occurred throughout the history of science, thanks to some concept redefinition or generalization. Examples are the unification of terrestrial mechanics and astronomy with the advent of Newton's physics, and that of thermodynamics, reduced to a particular case of statistical mechanics. The consequences of such unifications are most of the time of great import. The unification of celestial and terrestrial mechanics made space technology possible; the reduction of thermodynamics to statistical mechanics opened the door to the revolutionary ideas of quantum mechanics. In the same vein, one may anticipate two possible important projections of generalizing the concept of design accumulation.
First of all, it is possible to think of a social engineering that does not incur the deviations denounced by Karl Popper's severe analysis (1957). I am not referring, of course, to a direct social engineering (equivalent to the chaos of State planning and totalitarian dictatorship) but rather, if you will, to a genuine biology (in our new sense) of society. Such would be a program of study devoted to considering society as a large device produced by (cultural) evolution, and social facts as endowed with multiple functionalities, like diffusive propagation or others of the many constants of biological organization. We should begin to look for counterparts of all biological properties and processes in the functioning of human societies. From the adoption of this perspective, sociology cannot but benefit greatly, in rigor and in breadth of horizon, with no trace of old-style positivist reductionism. I visualize here ample space for new research styles and programs, regarding for instance the origins and development of law and its social effectiveness, or the origin, development, and methods of appropriation of natural languages. The application to education of considering symbol systems as living quasi-biological species could be vast, for instance in our understanding of the methods of transmission of moral values or of language learning, two orders of things with many important parallels between them.
Besides the social one, another type of design deserves careful exploration: the design of design. How is a conscious design produced in the mind of an engineer? By 'engineer' I mean not only the professional engineer but any human being who applies his wit to finding a solution to a problem affecting human life, or to creating products aimed at human enjoyment; this includes architecture, industrial design, music, the visual arts, and even theater or any other literary field. For that matter, how is a good book designed? A good general theory of design must be applicable to all the different classes of creative activity. Let us assume that we are mature enough to have outlived the inclination to invoke mysterious or unclear mental faculties, such as "intuition" or "inspiration," to explain natural enigmas. I see two converging approaches which hold promise to solve the general problem of explaining conscious design without appealing to non-natural causes.
In the first place, we have Daniel Dennett's demystifying analysis, which dethrones the homunculus as the agent of consciousness and the inner theater as the screen where the flow of consciousness gets projected. He replaces both with the "editorial model" of mental life, picturesquely branded pandemonium. According to this rich metaphor, there would always develop within the mind a (mostly unconscious) struggle between different tentative solutions or rehearsals (design sketches) for the question that occupies our current interest. That struggle would exhibit essentially all the characteristics of a natural-selection algorithm, with the peculiarity of taking place in the innards of the brain rather than in a tropical rain forest or in the world's markets. The principle of adaptation to the environment –constituted in this case by the thousands of neural constellations simultaneously activated within the privacy of our mental universe– will decide which of all those outlines or sketches will finally emerge to officially represent the contents of consciousness (Dennett, 1991).
The other promising contribution constitutes one of the most exciting research programs in the neurological field. It corresponds to attempts like Gerald Edelman's to explain intelligence through natural-selection processes acting on patterns fortuitously produced in the "associative" areas of the neocortex (Edelman, 1992). Although a large part of the human cerebral cortex is engaged in processing sense data and controlling body motions –functions which monopolize almost the totality of a chimpanzee's cortex– roughly the remaining three quarters of it are "vacant," except for the relatively small areas of language understanding and language articulation. The appropriate occupation of that remainder, enormous in neuron numbers, seems to be simply to be available as a blackboard on which to outline, in the format of neural constellations, interior designs, more or less random, more or less constrained by patterns already achieved. If this research avenue proves successful, the greatest achievement of the natural-selection algorithm will be confirmed, namely its own internalization in the human mind, through a design of design, as conscious intelligence.
In conclusion, the work performed by natural selection may be understood as "research and development" of the most authentic kind, and biology as essentially related to engineering. In fact, biology is not merely similar to engineering; it is full engineering. It constitutes an application –in reverse and in rich and varied styles– of actual engineering methods, directed to the discovery of the design, structure and functioning of all living beings. This conclusion may be resisted out of a groundless fear of its possible implications; nevertheless, it sheds abundant light on some of the acutest philosophical perplexities. It opens up as well the possibility of productively incorporating engineering methods into both biology and different fields of the social sciences and the arts, and, equally, of many of those same fields –and technology itself– receiving the invigorating influx of the biological perspective.
Note 1: See Appendix A: MENDELIAN INHERITANCE.
Note 2: See Appendix C: MUTATIONS AND GENETIC DRIFT.
Note 3: See Chapter 3, THE EMERGENCE OF MOLECULAR BIOLOGY.
Note 4: See this chapter, ALGORITHM AND PHILOSOPHY.
Note 5: Word derived from the Greek 'telos,' meaning ‘end’, ‘objective’, or ‘purpose’.
Note 6: See this chapter, THE THEOLOGICAL ARGUMENT OF DESIGN.
Note 7: See Appendix D: EVIDENCE FOR THE EVOLUTION BY NATURAL SELECTION.
Note 8: See Chapter 5, THE SEQUENCING OF THE HUMAN GENOME.
Note 9: See this chapter, ALGORITHM AND PHILOSOPHY.
Note 10: Another technical pair of terms with similar corresponding meanings is genotype and phenotype. See Part III.
Note 11: See Chapter 5, THE CONSTRUCTION OF THE CELL.
Note 12: Ribonucleic acid. See FIRST STEPS: THE RNA WORLD.
Note 13: Deoxyribonucleic acid.
Note 14: See Chapter 8, IS THE CULTURAL EVOLUTION LAMARCKIAN?.
Note 15: No, this is not a typographical error. The word 'meta-mathematics' does exist and refers to a "second-exponent" science which consists of applying mathematics (or logic) to the examination of mathematics itself. It deals with such things as the internal consistency of its systems, the decidability of its propositions (the question of whether or not they are mechanically decidable), the completeness of its axioms, and other abstract subjects.
Note 16: One has to call it 'approximate' for the reason that a genuine (abstract) Turing machine must have an indefinitely large memory; any real-life computer cannot avoid having a limited one.
Note 17: See Appendix Q: A MEME CALLED INTERNET.
Note 18: See Chapter 14: THE AMERICAN WORLD UNIFICATION.
Note 19: For instance, Gödel's proofs that mathematics are ultimately either incomplete or inconsistent, which rings a note of profound intellectual humility and proclaims the endless nature of the quest for knowledge.
Note 20: An example of a rough heuristic function to choose the best move in a chess game would be to count, for each alternative position, the squares controlled by the player and those controlled by his opponent, and subtract the opponent's number from one's own. The preferred move would be the one with the highest score. In any event, the function should be relevant to the problem in question, and be based upon some –not infallible but well-confirmed– expertise (human or mechanical).
Note 21: I have preferred not to use imperatives here, which is the usual form of expressing commands in computer programming, to avoid the impression that these “instructions” require a conscious being giving orders and another receiving them. Rather, we are dealing here with an automatic situation where natural laws perform the action, without consciousness or dialog between parties, natural chaining of causes and effects reigning serene. In the case of computer science, the natural laws implied are those of electricity, since computers currently built are electromagnetic mechanisms. In the case of biological evolution, the laws implied are those regulating carbon-based biochemical phenomena, foundation of every type of life known to us.
Note 22: See Chapter 8, NON GENETIC EVOLUTION.
Note 23: A translation of the German 'Bauplan': essentially the way an organism is laid out –its symmetry, number of body segments and limbs. The body plans of a starfish and a shark are different; so are those of a palm tree and a fern.
Note 24: In the brain at large, in spite of conscious attention being serial, the consensus of neurologists is that the majority of important things happen in parallel.
Note 25: See Appendix B: REVERSE ENGINEERING.