Dr. Pangloss's nose: In evolution, cause, correlation, and effect are not always identical
2011; Wiley; Volume: 20; Issue: 1; Language: English
DOI: 10.1002/evan.20285
ISSN: 1520-6505
Authors: Kenneth M. Weiss, Holly Dunsworth
Topic(s): Medical and Biological Sciences
Voltaire's novel Candide1 is a rather chaotic mix of political satire, fantasy, and confusion. Voltaire himself didn't think much of his odd little book, but it none the less has been received as a classic for centuries (Fig. 1). The hero Candide's tutor, Dr. Pangloss, became an archetype of pompous academic pedants, “the most profound metaphysician in Germany,” then and now. He entrances audiences with lectures on “metaphysico-theologo-cosmolonigology,” which we're sure would earn him an endowed chair at an august university today (naming no names).

Figure 1. Voltaire (1694-1778). Painting by Catherine Lusurier, Wikimedia public domain.

Academic pedants are still with us (or are us?), but sometimes what we say is right on the money. Although Candide was published in 1759, before Darwin's time, Pangloss makes prescient pronouncements for our own day. He is perhaps most famous for teaching, or perhaps preaching, that we live in the best of all possible worlds, where “there is no effect without a cause,” and “things cannot be otherwise than they are; for all things have been created for some end… They must necessarily be created for the best end.” Some of these ends are so obvious it would be foolish to deny them. As Pangloss notes, “The legs are visibly designed for stockings; accordingly we wear stockings.” And even more obvious is that “the nose is formed for spectacles.”

Explanations of this kind, if sometimes less silly, are common in evolutionary biology, and they've been called Panglossian before.2 Alfred Wallace drove Charles Darwin up the wall with his ideas about how human mental capacities prove the existence of God.3 Wallace's reasoning was perhaps less convincing than Pangloss's, but went like this: our ancestors had no need for things like higher mathematics or complex musical scales, so how could natural selection have given us those abilities if they were not divinely installed? In 1869, Darwin wrote a letter to Wallace to say that he “hoped you have not murdered too completely your own and my child,”3 that is, evolution by natural selection.

Panglossian arguments carry more than a whiff of predestination, but it is easy to find careless rhetoric that equates a trait's function today with what that trait evolved “for”: our pharynx evolved for language, our thumbs for tool use. That may sound like invoking foresight, but what we mean is that precursor pharynxes and thumbs conferred reproductive advantage for some reason back then, and their presence made them available for a new use. We cannot and do not suggest that proto-thumbs and gurgling pharynxes were striving to become dexterous orators at some time in the future. In the Panglossian context, of course, spectacles as we know them require a resting spot. But nothing about our beaky noses indicates that spectacles were predestined to be here, much less that in days of yore we were sniffing our way to some future optician's office.

This does, however, raise the question of directionality or even predictability in evolution. Natural selection as we understand it today is a probabilistic process involving differential success of randomly arising genetic variation that responds to external environments. That is inherently not predictable. But biological traits are complex, affected by many genes working together. Once a trait has evolved, it may be essentially impossible to change too fast or too much.
There would simply be too many ways for mutations to break it. The trait may still diverge and diversify in descendant species, but the options may be limited or “canalized” by the very complexity that the past had established. A relevant image may be the notion of a lodestone (naturally magnetic rock) as described by Socrates in Plato's dialogue Ion, in which he discusses chains of influence: a lodestone can induce magnetism along a chain of iron rings even if only one ring is touching the lodestone. Likewise, one evolutionary link entrains the next, and while adaptation today is not directed by the past, the future is not entirely independent of the past. If a genetic lodestone effect is important, as some evolutionary thinking goes, future directions may to some extent be predictable after all. Future vertebrates are likely to have more, or fewer, or modified vertebrae, but they will still have vertebrae and a central support down their backs. You simply won't make it if you don't have the backbone to stand up for yourself.

This doesn't mean that a trait cannot arise for one reason long before it finds some other use in a subsequent environment. Darwin was wary of any notion of “nascent” traits that are ready to become something else in the future. In a letter to Charles Lyell he wrote, “Here is a bold prophecy! To admit prophetic germs is tantamount to rejecting theory of Natural Selection.” He further noted that “A nascent organ, though little developed, as it has to be developed, must be useful in every stage of development. As we cannot prophecy we cannot tell what organs are now nascent….”17 What Darwin called nascent traits have been called exaptations.4 Eyes that can resolve visual images may be the distant lodestone consequence of an ancient exaptation used simply to detect light, as Darwin suggested in The Origin of Species.5

Today we cannot tell whether a trait of ours, say, our nose, is ready to become something other than a spectacle rack in the future. A deeper meaning is that we cannot claim that about a trait in fossils either. Whether the trait was there for some reason or by chance, we cannot assume that its function was related to its use by descendants today. We scientists choose how to define a “trait,” but nature makes the only definitions that count. An organism is an integrated assemblage that does or does not succeed, but each part reflects its own evolutionary lodestone effect. The degree of correlation or “morphological integration” of a trait, such as among the parts of the skull, depends on the nature of those components and the environment at any given time along the evolutionary path. As long as the overall trait adapts, the role of any of its parts depends on the relevant genetic variation that happens to be there at that time, which is all that selection or drift can affect. To some extent, at least, each part evolves on its own. This is how evolution is mosaic.

Even if we can identify a trait's uses today, we may not be able to reconstruct its adaptive history, because the trait and its history may not be as unitary as our reconstructions or definitions. It may seem heretical to suggest this, but a prominent nose may have arisen not in relation to spectacles (though that does seem to be the most obvious explanation), but in relation to smelling, or even as an indirect consequence of a reduced face related to diet, or because large canines were replaced by hand-held tools.
Bizarre as it may seem, the latter possibility would imply that our spectacle bridge is the consequence of our opposable thumbs. Furthermore, spectacles evolved first as monocles, meaning that our eye sockets may have evolved not just for protecting eyes but to hold lenses (Fig. 2). (Of course Mr. Peanut is only a cartoon, since real peanut shells have pits but not eye sockets.) And when did lorgnettes come into the picture, and necks for holding their straps so the glasses didn't fall onto the ground where they'd be stepped on (a definite fitness handicap)? A related example of morphological integration would be if our external ears (pinnae), often viewed as a legacy of primate hearing, really evolved jointly with noses to support spectacles.

Figure 2. The evolution of visual-support anatomy. Left: the Supportus monoculus (rostral view). Right: the Supportus spectacleus apparatus in humans (lateral view). Sources: left, Mr. Peanut, logo of Planters Peanuts, a division of Kraft Foods; right, from Gray's Anatomy, Wikimedia public domain.

Darwin spent eight arduous years studying barnacles. It required four volumes for him to describe every minute detail.6 Besides their inherent interest to biologists at the time, one purpose he had in studying these tiny crustaceans was to account for the pattern of trait-sharing among species in a way that related to the stunning evolutionary theory that he would only announce years later. His idea was that shared traits were not just reflections of some ideal barnacle in God-the-Creator's mind, but reflected actual common ancestry. Darwin was especially interested in the number and use of body segments, and in the sex organs. In these respects and others, barnacle species are complex trait mosaics. Some use their antennae to glue themselves upside down on rocks and ship hulls. They display every imaginable sexual morphology and behavior, ranging from complete hermaphroditism to two separate sexes. Some are females whose bodies house tiny male barnacles. In others, these embedded guests are little more than rudimentary sperm pouches that the host uses for fertilization. An observation of critical importance to Darwin's theory was that every trait in every species varied; as he put it, “Systematic work wd be easy were it not for this confounded variation.”18 The idea of the stability of species, which systematists relied on and which was often cited by opponents of evolutionary concepts, was an illusion caused by the glacially slow pace of evolution.

An important implication of Darwin's theory of evolution was that if every trait differed between species, every trait must also vary within species, as the source of the adaptations that would lead to the divergence of future species.5 If every creature that ever lived were laid out on your (very large) lab table, how many really discrete categories could you put them into? Probably very few, as Darwin himself knew. The fact that variation is ubiquitous is often seen as the very heart of our understanding of adaptive evolution. In turn, this implies that everything at every time can be viewed as a potential exaptation for something new. Darwin barely mentions barnacles in The Origin of Species. (Such restraint from self-citation, after eight years and four volumes of work!)
But afterwards, in both Descent of Man and The Expression of the Emotions in Man and Animals, he used similar reasoning about shared traits to show that humans of all races were a single species with a single common origin. But he went further, and in a way that put him on shakier ground relative to the observation that all traits varied in all species. If he saw a partial trait in species A, while species B had that trait to its full modern extent, he assumed that cross-species variation reflected the longitudinal trend that was taking place within species A, which was on the way to becoming like species B. Thus, he viewed the presence in different species of hermaphroditism or of females hosting rudimentary males as a cross-species reflection of within-species evolutionary directions. But this reasoning amounts to the very anathema of evolutionary biology, the idea that the present does predict the future. How could Darwin, of all people, have come to use such reasoning?

In Jean-Baptiste Lamarck's view of evolution,7 an organism strives toward certain ends in confronting its environment. The results of that striving are molded into its body parts and shed into the gonads to be transmitted directly to offspring. Darwin rejected any idea of distant foresight, but he accepted the inheritance of acquired characteristics, calling the shed elements “gemmules.”8 Thus, an organism trying to stand up and throw rocks at prey will simultaneously mold all of its relevant gemmules in that direction. In this theory, morphological and behavioral integration could be expected, since all aspects of the organism's nature would be molded in a unitary way from its life experiences. Yet even in his barnacle work Darwin recognized the mosaicism of many combinations of trait variants.

There are traps everywhere. Darwin reasoned that vestigial traits that seem useless today must reflect ancestrally functional traits. If what you are is what your parents did, and your survival indicates that they did it well enough to transmit the results to you, then everything has to have a selective explanation. But what about men's nipples? Using Darwin's reasoning, they would have to be vestiges of some past function, whereas our modern interpretation is that they are incidental artifacts of mammalian development. Of course, men's nipples could be viewed as exaptations for piercing and decorative rings. Similarly, if Pangloss's view of noses is correct, it could imply that eye sockets were exaptations for monocles, making monocles exaptations for spectacles.

That is analogous to other questions that might be asked about the fossil evidence for human evolution. For example, did bipedalism evolve for some other reason but then become the basis of savanna hunting, or did early attempts at savanna hunting lead to selection for bipedal posture? Did weapons evolve first and enable bipedal savanna hunting, or did hunting stimulate the invention of hand-axes? The causal reasons are challenging enough to guess at, but impossible to know without first knowing the polarity, or historical order, of the appearance of traits.
But their composite nature and mosaic evolution mean that we must know the relative order of appearance not just of our trait of interest but, in what is something of an endless evolutionary regress, of the trait's subtraits, and their subtraits, and so on ad infinitum down to the level of DNA sequence variants. Today's traits have been built up over time with whatever subcomponents were present at each particular time. Darwin stressed this. Since evolution is basically a continuous process, viewing a present trait as the result of a staircase-like succession of separate exaptational steps is a misleading simplification.

To observers at the time of any fossil's existence, its traits and their uses would have seemed as final as ours do today. For example, ancient hominins were able to walk habitually upright before they had the anatomy that we display today9, 10 and were good enough at it to avoid being eaten before they passed on their hodge-podge bipedal traits to the next generation. Conversely, our traits today are no more “final” than any fossil's. For example, selection might modify our hips to increase the survival of mothers and offspring at childbirth without significantly affecting locomotion, or to attract mates by their curvaceous appearance, or both. These real-life scenarios, like Pangloss's fictional ones, illustrate the difficulty in predicting the outcome of adaptive evolution and whether it will occur at all.

The fact that our premodern relatives were bipedal underlines the difficulty in using a complex trait to discern phylogenetic history unless, of course, that history is a single linear sequence of speciation events. Genetic data can be useful in establishing phylogenetic history because the accumulation of DNA sequence differences in separated lineages over time is roughly clock-like.19 But there is so much statistical variation in the process of genetic change over time that a single gene does not tell the phylogenetic tale. One must look at a substantial amount of sequence data before taxonomic relationships can be reliably inferred (a point illustrated in the brief sketch below). However, since genes affect traits through complex overlapping and interconnected functional networks, and since we know relatively little about the effects of individual genetic variants or the time order of their mutational origin, we are on shakier ground when it comes to using genetic data to reconstruct adaptive functional history.

The evolution of human bipedalism has involved the barnacle equivalent of changes in many parts, including foot, ankle, knee, femur, and hip structure, as well as cranial orientation and spinal shape. These traits perhaps co-evolved causally with our visual apparatus, shoulders, and hands, and, as some might argue, our development of language. Upright posture may not have evolved as the unitary trait we may view it as today, and its contributory parts would not be expected each to have evolved in the same way or at the same time.
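As a minimal illustration of the clock-like point above, the following sketch (ours, not from the original column; the substitution rate, divergence time, and gene length are made-up numbers chosen only for illustration) simulates why a divergence estimate based on a single gene is statistically noisy, while pooling many loci tightens it.

```python
import random

# Illustrative sketch only: assumed, hypothetical parameters, not data from the article.
RATE = 1e-3          # assumed substitutions per site per million years
TRUE_SPLIT_MY = 7.0  # assumed true divergence time, in millions of years
SITES_PER_GENE = 1000

def estimated_split_time(n_genes):
    """Estimate divergence time (My) from substitutions observed across n_genes loci."""
    sites = n_genes * SITES_PER_GENE
    p = 2 * RATE * TRUE_SPLIT_MY          # per-site chance of differing (changes accumulate on both lineages)
    observed = sum(random.random() < p for _ in range(sites))
    return observed / (2 * RATE * sites)  # invert the clock: time = differences / (2 * rate * sites)

for n in (1, 10, 100):
    estimates = [round(estimated_split_time(n), 2) for _ in range(5)]
    print(f"{n:>3} gene(s): estimated split times (My) = {estimates}")
```

With these toy numbers, single-gene estimates scatter widely around the true value while the pooled estimates cluster tightly, which is all the sketch is meant to show about why substantial amounts of sequence data are needed.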
The 4.4-million-year-old Ardipithecus ramidus fossils that were recently described illustrate these issues. (See the entire issue of Science, 2 October 2009, particularly White and coworkers11.) “Ardi” was described as the earliest ancestor in the human-specific lineage, after we split from common ancestry with chimps. If Ardi is such a direct ancestral specimen, we should, in principle, be able to place it in proper phylogenetic context based on our considerable knowledge of comparative anatomy, other fossils, and genetic data on the human-chimp separation time. But dissension immediately arose, and the disputes ranged from head to toe.

By the same kind of reasoning Darwin applied to barnacles, human bipedalism was a split from the ancestral primate mode of locomotion. We might have expected to find a progression of increasingly refined aspects of today's upright walking in our fossil ancestors. But for the reasons we've described, in that march towards the present not all aspects of modern bipedalism need have arisen in lock-step. As shown in Figure 3, parts of Ardi's reconstructed hip seem ape-like, others human-like. It is not necessarily clear which of those is primitive and which is derived. Moreover, the reconstructed morphology is not an easy sell in the first place, considering how fragmentary the original material is. If Ardi was really an early hominin, then as expected under mosaic evolution of complex adaptations, it was on a shakedown cruise of posture-related traits that eventually resolved in the human direction (as opposed to the chimpanzee direction or, alternatively, along a dead-end lineage). The discoverers can be read as suggesting that, despite morphological differences, Ardi locomoted as we do, even assuming no disputes about the reconstructions of the specimens themselves.

Figure 3. Hip-hopping through time. The pelves of various hominids, including the reconstruction of the new Ardipithecus, showing mosaic changes in pelvic subtraits. Lower arrows show a hamstring attachment suitable for bipedality, while upper right arrows show a sciatic notch, absent in chimps. Upper left arrows show the anterior inferior iliac spine, also absent in chimps. Courtesy Tim White.12

Ardi's finders argue that its forest habitat was not open savanna, yet if Ardi was ‘becoming’ bipedal, that is, was an early biped, then the longstanding view that bipedality arose as an adaptation to open-country ecology (a hypothesis that traditionally included hunting) is wrong. Yet dissenters reply that this was not a forest but a mixed open-country and forested environment,13 leaving the old standby account intact, or at least unfalsified, after all: early hominins specialized in coming down from and moving terrestrially among trees. It has further been objected that Ardi was not even in the human-specific lineage.14 That criticism was based on morphological traits that were argued to be present and/or to have evolved independently in various taxa, not just hominins, and so are not definitive in separating the chimp from the human lineage. It has also been said that Ardi's date was, in any case, before the ape-human split. The molecular evidence19 seems to contradict this claim, but if it is correct then Ardi simply showed a mosaic menu of traits before they were sorted out by different adaptations in each lineage.
The discoverers of Ardi respond that this alternative reasoning is far from parsimonious,15 invoking too many evolutionary pathways in comparison to their own simpler interpretation. Perhaps that is right, but taking the simplest path is manifestly not a characteristic of evolution. The parsimony principle is only an assumption that comes from medieval philosophy. Whether the parsimony argument persuades you is a purely subjective judgment. However, even if one accepts that Ardi was on the human-specific lineage, and hence that we can tabulate our lineage's early morphological trait set, we can't necessarily make statements about the polarity, or order of appearance, of those traits, or about how variable they were across the species' range or at different times over millions of years. Ardi was not the very first individual to possess any, much less every, one of its subtraits. These subtraits are probably not due to single genes. The relevant mutations in contributing genes probably arose at very different times or locations, and gained frequency either just due to chance or with some selective assistance as part of whatever locomotory or other behavioral complex was present along the way—or maybe even for some other reason we haven't guessed. Ardi's discoverers' arguments seem to verge on invoking a lodestone-like linear evolution of early hominin begetting: Ardipithecus ramidus kadabba begets Ardipithecus ramidus ramidus begets Australopithecus anamensis begets Australopithecus afarensis. That may be correct, but it is obviously hard to prove.

In making adaptive explanations, we have to go beyond description and assert actual fitness differences. That is not as easy as it may seem, especially if we guess wrong about function. A convincing adaptive scenario has to go beyond a blandly plausible PG rating, because we're talking about sex here, and cannot just glibly say that ape-men with slightly different earlier versions of some present-day trait were more likely to be successful runarounds (the ape-women of the time would know whether or why that was true, but their lips are sealed). Candide's mate choice was named Cunégonde. It may have been a rash choice, because by the end of the story he thought that the trials of life had made her “very ugly.” However, saying “I am a man of honor,” he agreed to marry her anyway. Beauty is in the eye (or spectacles) of the beholder, and we cannot make confident assertions about what traits Ardis would have thought attractive, or whether they were honorable enough to marry despite their intendeds' failings, until the fossil record has become much richer across time and space.

That these are relevant issues can be seen in the fact that, in August 2010, as this essay was being written, a report appeared that contained evidence of carnivorous behavior in 3.4-My-old hominins, most likely Australopithecus afarensis.16 This claim, which made the cover of Nature, was misleadingly titled “The First Cut,” though it was no more the first cut than Ardi was the first hominin, even if they are the currently first-known ones. The evidence consists of cut marks on the bones of ungulate prey, which suggest the use of stone tools for butchering and even for cracking marrow out of long bones, and it pushes back the earliest known occurrence of such behavior some 800,000 years.
This interpretation is already being disputed in the popular press (see Notes) because it could affect explanations of the reasons for, or nature of, human origins and, perhaps, because it could affect the interpretation of the behavior of the million-year-older Ardi. If stone-tool use and the eating of mammals can be pushed back to a point earlier along bipedalism's evolutionary journey, then it is not so easy to dismiss the savanna hypothesis, nor, therefore, particular interpretations of anatomy and/or paleoenvironmental reconstructions.

To respond to critics of his theory, Darwin inserted a new Chapter VII into the sixth edition of The Origin of Species, in which he addressed the different time origins of the components of a complex trait. In that discussion, he was taking a unitary view of a trait's evolution. But at the same time he clearly recognized the fundamental problem of “correlated variation” that could include things unrelated to an adaptation of interest. As he wrote, “An organic being is a microcosm—a little universe, formed of a host of self-propagating organisms, inconceivably minute and numerous as the stars in heaven.”8 These same issues are reflected in the genes that produce the traits we see in fossils, because many genes, if not most, are pleiotropic—they have many unrelated functions. This is a Panglossian problem! Ears, nose, and neck can evolve together without implying that they were adapting to spectacles. The same interpretational challenge applies to more pedestrian traits like locomotion. To say that the evolution of bipedalism was an adaptation to savanna foraging suggests a tacit assumption that its functional evolution was not mosaic, even though we know that it was. The sciatic notch or the hamstring attachment (Fig. 3) need not have evolved their Ardi state for reasons related to locomotion or savanna foraging.

In the 150 years since Darwin, we still have not got past this interpretational challenge. Knowing what truly happened in the past would edify our attempt to understand what we are today. By combining comparative, evolutionary, and genetic data we know a lot that seems reliable. But perhaps we must accept the limitations of the data that could even conceivably become available. Thus, to evaluate Pangloss's assertion seriously: even if we found spectacles at a fossil site, rusty and bent, we would not know that they were the first spectacles or what held them on, much less what had held the first ones on. The same goes for why we have legs, besides their obvious main use for holding stockings. We might agree with Pangloss that every effect must have a cause and everything a reason. But that doesn't imply that we're always going to be able to identify that reason. We have to disagree that “things cannot be otherwise than they are,” because ancestors different from ourselves obviously were successful in the past, or we wouldn't be here to argue over their remains. However, we could ironically choose to agree with Pangloss that ours really is the best of all possible worlds in one sense: it is the only one that actually succeeded. We can look back at evolution and argue about whether or how other worlds might have made it instead, but such arguments can never be settled. This is true to the extent that invoking function, much less adaptive reasons, for traits we see either in the fragmentary fossil record of the past or in comparative genomics today is problematic.
As Voltaire's famous last line in Candide goes, when we get right down to it, we may just have to “cultivate our gardens,” focusing on the more mundane and even fallible understanding we can hope to have about the way life is today.

At the time of writing this piece, controversy over the oldest cut-marked bones was still informal; comments by professional anthropologists can be seen in the New York Times of August 11, 2010, and in the August 13, 2010, episode of Science Friday. Since then, a formal challenge to McPherron et al. has been published in the November 15, 2010, issue of PNAS by Dominguez-Rodrigo et al.

We welcome comments on this column: kenweiss@psu.edu. KMW co-authors a blog on relevant topics at EcoDevoEvo.blogspot.com. We thank Anne Buchanan and John Fleagle for critically reading this manuscript.