Review · Peer-reviewed

Bottom, Crowned with an Ass's Head: Does an inversion in thinking portend the end of Enlightenment science?

2012; Wiley; Volume: 21; Issue: 1; Language: English

10.1002/evan.20325

ISSN

1520-6505

Authors

Kenneth M. Weiss

Topic(s)

Philosophy and History of Science

Abstract

A recurring theme among writers during the classical period of Greece and Rome was metamorphosis. Many enjoyable, fanciful stories were told in which various people or gods were transformed into other creatures or objects. These changes were disguises, rewards, punishments, and trials. Among famous examples is Apuleius' second-century AD novel Metamorphoses, more popularly known as The Golden Ass, the adventures of a nobleman, Lucius, who, tinkering with magic, took the wrong potion and transformed himself into a jackass. His adventures were interesting because, as an ass, the people who owned him didn't realize he could hear and understand the schemes they were up to. Another important example is Ovid's first-century AD Metamorphoses, which told numerous tales about individuals who underwent transformations for various reasons. Many writers picked up these stories. They bear a resemblance to stories in the Islamic tradition, including Scheherazade's 1001 Arabian Nights, written about 1000 AD. And these classics were the basis of one of Shakespeare's most beloved plays, A Midsummer Night's Dream, written around 1595, a fantasy outdoor bedroom farce borrowed directly from Ovid and at least indirectly also from Apuleius.

In Midsummer Night, a group of tradesmen plan a play to stage at the Duke of Athens' wedding. One of the performers, a weaver named Bottom, asks that a prologue be written to explain what might otherwise be too scary a play for the audience. Here's my prologue to this essay: The Duke orders a friend's daughter, Hermia, to marry Demetrius, but she loves Lysander instead. Since the price of defiance is death or exile, she and Lysander elope through the forest to safety. Helena also loves Demetrius, but he jilted her, so in anger she informs him of the escape plan and they pursue the elopers into the dense wood. There, the fairy king Oberon and his queen Titania are squabbling. He tells his impish servant Puck to sprinkle juice from the love-in-idleness flower (wild pansy) in Titania's eyes when she sleeps, which will make her fall hopelessly in love with the first creature she sees upon waking. Puck is instructed to enchant Demetrius, too, so he'll fall in love with Helena and stop chasing Hermia. But chaos follows. Also in the wood, the tradesmen are rehearsing their play. Puck transforms Bottom's head into that of a jackass (Fig. 1) and, by misadventure, Titania awakens to the sight of Bottom, crowned with an ass's head. She proceeds to make an ass of herself by mooning over Bottom. Entangled mistaken identities lead both Demetrius and a mistakenly enchanted Lysander to pursue and fight over Helena, which leads Helena and Hermia to quarrel jealously over who's pursuing or betraying whom.

“…the blots of nature's hand / Shall not in their issue stand. / Never mole, harelip, nor scar, / Nor mark prodigious such as are / Despised in nativity / Shall upon their children be.” (Act 5, Scene 2)

We can use this tale of nested plots within plots and magical transformations to ask whether, in genetics and evolution, ideas that may amount to fantasy are being taken as fact. Are we falling in love with the first explanation we come across? Or are we learning to discriminate better among the forest of possibilities that we confront? The tradesmen in Midsummer Night are transformations of the actors, and when they perform their play within a play they are engaging in a nested metamorphosis of their basic, real selves.
The classical transformations, and those in Midsummer, happen at a single stroke, imposed from without. If life is about anything, it, too, is about metamorphosis: changing form, literally at all levels. There, too, transformations may disguise what lies beneath. But life's transformations do not happen at a stroke. They are imposed gradually from within, and therein lies a big difference. Evolution has transformed primal soup into the species that populate the earth. And organisms, like you, are figuratively, at least, transformations of the information carried by DNA in an original egg or seed through the processes of embryological development.

We can study the means of this metamorphosis from the causal top—a specific gene of interest—down to the trait it produces. Alternatively, we can work from the bottom—the entire collection of genes—up to the trait we're interested in, to see which parts of the genome are involved in producing the trait. Science has recently been taking both top-down and bottom-up approaches as I have defined them here. Issues related to these opposite approaches are at the heart of current views of genetic causation and its evolution.

Top-down studies test the hypothesis that some known candidate gene causes the trait in question. Such studies use the classical approach that has developed as the foundation of modern science since its origin 400 years ago during the Enlightenment.1 This approach rests on the belief that there are laws of nature and that what we call the scientific method is the way to understand them. Conclusions are based on formal tests of hypotheses in light of such laws. In this way of science, correct theory should be able to predict new things not previously observed. Moreover, we should be able to confirm our findings by replicating them in a new study.

When initial findings are based on observations of existing nature, as in human or primate traits, they are checked by a couple of strategies. Suppose we find that a genetic variant is statistically associated with some human or primate trait such as a disease, large face length, or tooth shape. We can compare corresponding sequences of that gene among different individuals or different species to see if the genome region in which our candidate variant occurs is highly conserved; if it is, the gene may have been adaptively important, suggesting that our interpretation of the variant's effect may be correct. Alternatively, although we cannot do experiments on humans or primates, we can sometimes test our idea in an animal model, such as a laboratory mouse. We know where the ‘same’ gene is in mice because primates and rodents share a common ancestor that lived millions of years ago, and we have already determined these species' genomic DNA sequences. Once the gene is identified, a variety of molecular tricks can alter the mouse gene to carry the same suspect causal variant found in humans. Then we can examine the ‘transgenic’ mouse to see if the variant has similar effects. Mice are not humans, but we can at least see if we may be on the right track.

That's top-down analysis, and its power is what one expects from science: the ability to predict a specific outcome from a specific cause. But there's a problem: what if the result does not support your favored hypothesis? Since you're limited to the intentionally focused conditions of your experiment, even a negative result doesn't necessarily overturn your hypothesis.
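To make that top-down logic concrete, here is a minimal sketch in Python of a candidate-variant association test of the kind just described. It is an illustration under invented assumptions, not anything from this article: the 2x2 carrier counts, the sample sizes, and the choice of Fisher's exact test are all hypothetical.

```python
# Top-down sketch: test ONE pre-chosen candidate variant against a trait.
# All counts below are invented for illustration.
from scipy.stats import fisher_exact

# 2x2 table of carrier status by disease status:
#                  carrier  non-carrier
contingency = [[48, 152],   # cases
               [30, 170]]   # controls

odds_ratio, p_value = fisher_exact(contingency)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")

# A low p-value supports the focused hypothesis. But note the essay's caveat:
# a negative result here does not overturn the hypothesis, since the effect
# may depend on other genes, environments, or the particular sample drawn.
```

Fisher's exact test is only one conventional choice here; the essential feature of the top-down approach is that the hypothesis names a single suspect in advance.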
This inability of a negative result to settle the matter is contrary to a widespread belief that falsifiability is a definitive criterion for scientific decision-making: your experiment could have been misunderstood, poorly designed, or done badly. Other factors, including the effects of other genes, could be important but not taken into account. So the hypothesis-driven scientific method leaves much to be desired, especially at a stage when one doesn't understand the trait well enough to have a focused hypothesis in the first place. One can easily become lost in the woods. But what can be done about it?

Not so long ago, little could be done. Because of the fundamental theory that genes are the “it” of life, one often assumed that a trait must be genetic in some important sense. But there was little way to prove it, except perhaps to show that relatives resembled each other in a pattern consistent with Mendelian transmission. But inheritance patterns are probabilistic and rarely precise. With billions of DNA base pairs, tens of thousands of protein-coding genes, and huge but unknown numbers of other functional elements in our genome, what could one think, much less afford, to do?

Fortunately, technology enabling very large-scale ‘high throughput’ DNA sequencing or genotyping, and other similar kinds of exhaustive enumeration, has burgeoned. A new approach has become not only possible, but has taken center stage in much of contemporary biology. This is the bottom-up approach. Rather than looking first at individual hypothesized candidate genes, we look at entire genomes at once, indiscriminately on purpose, to see what we can find that may cause a trait of interest. This agnostic approach was given the name genomics. Genomic approaches are widely called “hypothesis free,” but that is a subtle disguise. What the investigator is free of is the obligation to state any specific hypothesis about what is going on, except that genes have some causal role in it—and geneticists usually think in terms of major causal roles of individual genes. That assumption is a version of the overall theory that your genes are what you are. In the bottom-up approach, samples of individuals are measured for some trait, such as stature or a particular disease, and the entire genome is scanned to find those chromosome locations in which variation is differentially associated with the trait.

Unabashed trolling expeditions were long ridiculed as “Victorian beetle collecting”; that is, merely as descriptive collections of details without analytical content or guiding theory. Today, however, scientists are defiantly proud of this approach, whereby “ignorance is bliss” is taken to the extreme. But how can such agnosticism ever have become acceptable, much less de rigueur? If we're talking science rather than scientology, biological traits simply must have material causes, assumed to lie hidden somewhere, somehow in the genome. Indeed, after about 25 years of increasingly intense genomic approaches, we're getting a pretty clear picture of nature. But it's not what many had hoped for. That's because what has been found by bottom-up analysis is that most traits are affected by a host of variants across the genome, not by just one or a few genes. False trails are everywhere, as effects show up in one study but disappear in others. Mistaken identity is rife.
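The bottom-up scan, and the false trails it generates, can be sketched in the same spirit. In this illustrative Python fragment (mine, not the article's), genotypes are simulated with no true genetic signal at all, so every nominal 'hit' is a mirage of exactly the kind that shows up in one study and disappears in the next; the sample sizes and the simple carrier-versus-non-carrier t-test are hypothetical choices.

```python
# Bottom-up sketch: scan every variant agnostically, then pay the
# multiple-testing price. Data are simulated with NO real genetic signal.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2012)
n_people, n_variants = 1_000, 10_000
genotypes = rng.integers(0, 3, size=(n_people, n_variants))  # 0/1/2 copies
trait = rng.normal(size=n_people)                            # pure noise

pvals = np.array([
    ttest_ind(trait[genotypes[:, j] > 0],          # carriers
              trait[genotypes[:, j] == 0]).pvalue  # non-carriers
    for j in range(n_variants)
])

print("nominal hits at p < 0.05:", (pvals < 0.05).sum())            # ~500 false trails
print("hits after Bonferroni:", (pvals < 0.05 / n_variants).sum())  # almost surely 0
```

Real genome-wide scans face millions of tests rather than ten thousand, which only deepens the multiple-testing problem sketched here.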
Some variants approximate classically strong Mendelian, highly deterministic effects, but for most variants it's nearly, or even essentially, impossible to show with any statistical significance that they really are causal. Essentially, they work only in aggregate. Unfortunately, that aggregate effect often dwarfs the rarer strong effects that are easy to find. Such results present one of the leading challenges to genetics and evolutionary biology. (I've written some previous discussions2-4 that may help you through what has become a thicket of literature.)

You might say that it's now time for good old-fashioned top-down experimental study of the suspects uncovered by genomic analysis. But how could you do such studies? You can't do experiments on people and, because different populations have different histories, you often can't even get new samples that will reliably replicate most of the findings. You can't do adequately large experiments to test very tiny effects in mice, even if you assume (inaccurately) that mice are just furry people. At present, with hundreds of candidate genome locations revealed by bottom-up science for interesting traits, top-down science is an impracticably tough slog, and literally impossible if the causal agents have their effects only in aggregate combinations. So both approaches leave us with uncontrolled and hence unknown levels of ignorance, especially if the causal question involves adaptive evolution that took place in the deep, largely unobservable past. However, besides curiosity and the availability of sexy new technology, the very reason for the move to bottom-up science is that life has more complexity than single causation much of the time, so that while a priori genetic hypotheses were hard to come by, bottom-up findings at least leave all the fish in the pond. That means mountains of raw data to sort through and try to make sense of, but at least if the fish are there, there ought to be a way to catch them.

Can the difference in approach explain why things seem consistently complex to some investigators, while others argue that specific genes have major effects through which we can understand genetic causation and evolution? Yes and no. There is no formal theory for the relative frequency of a given strength of effect of an allele (genetic variant). Extensive data suggest that a small fraction of effects are substantial; these are nature's teasers, which are responsible for the occasional ‘Mendelian’ appearance of the trait in family members. They're easy to find and have driven much of the history of genetics. But for most alleles, including many of the strong ones, their effects are only relative, depending on the variation in the rest of the genome and any environmental influences in each individual who carries the allele. What we identify by comparing genotypes with their associated traits depends on the sample and situation. Effects differ among individuals and populations and over time, to the extent that variants that are essentially lethal in one species or individual can even be normal in others.

It's similar in regard to evolution. Intense selection may quickly raise the frequency of strongly positive genetic variants, but it will also raise the frequencies of all other variants in the genome that contribute, even weakly, in the positive direction. It may often be that we can find the specific strong signal, but err if we assume it was the only, or perhaps even the major, gene contributing to the adaptive evolution.
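The aggregate-effects point invites a small simulation. In the hypothetical Python sketch below (all allele frequencies, effect sizes, and sample sizes are invented), thousands of tiny allelic effects sum to a trait that the aggregate genotype predicts quite well, while even the strongest single variant is statistically nearly invisible.

```python
# Polygenic sketch: many tiny effects that work only in aggregate.
# Frequencies, effect sizes, and sample sizes are all invented.
import numpy as np

rng = np.random.default_rng(21)
n_people, n_variants = 2_000, 5_000
freqs = rng.uniform(0.05, 0.5, n_variants)                   # allele frequencies
genotypes = rng.binomial(2, freqs, size=(n_people, n_variants))
effects = rng.normal(0.0, 0.02, n_variants)                  # thousands of tiny effects

genetic_value = genotypes @ effects
# Environmental noise matched to the genetic variance (~50% "heritable"):
trait = genetic_value + rng.normal(0.0, genetic_value.std(), n_people)

# The aggregate predicts the trait well...
print("aggregate r:", round(np.corrcoef(genetic_value, trait)[0, 1], 2))  # ~0.7
# ...but the best single variant barely registers.
best = max(abs(np.corrcoef(genotypes[:, j], trait)[0, 1]) for j in range(n_variants))
print("best single-variant r:", round(best, 3))              # typically < 0.1
```

Under these assumptions the single-variant correlations are swamped by sampling noise, which is the point: the variants work only in aggregate, and a top-down study sized to confirm any one of them would be impractical.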
This shows why genomics approaches have merit. The bottom-up approach can be highly informative if we take advantage of all the information it provides, not just to fish for the big catches that yield statistically significant evidence, but to try to understand the bulk of the remaining weak effects. The implication is that new function largely arises not out of the lucky blessing of a single mutational event, but by the nudging in a given direction of variation in large numbers of genes. And even for a stable trait, the mix of contributing variation changes as new contributing mutations arise and existing variation drifts out of existence. So this supports the usefulness of the agnostic bottom-up approach. On the other hand….

In our professional rush to anything that might boost our careers in research, no matter by what thin thread of hope, scientists behave like the proverbial lemmings, frantically scouring the landscape for any advantage. We rush to any fad or fashion that will dress our papers or proposals in modern costume. Add to this the excitement of having new technology right in our own labs, and there's a problem. But the problem is cultural, not scientific: the uncontrolled and not very critical diffusion of a way of thinking, relying on high-throughput technology and deferring the hard challenge of hypothesis formation. Well, we're only human, and we have to make a living.

‘Omics’ is a totemic suffix today. No matter what you are doing, if you want to be au courant and to secure resources, you have to ‘omicize’ it. First in my recollection came genomics. But the whole genome is a big pond to fish in, so it was then suggested that all mutations worth a trophy must be in protein-coding regions. (There is no good reason to think that way, but nonetheless…) Such coding mutations are only a few percent of the genome, so it's more practical to sequence all of them rather than the entire genome. Since these regions are called exons, we got exome sequencing. But others argued that DNA sequence is really inefficient because DNA only does something important when its protein code is transformed into actual protein: a corporal body rather than just a gleam in the genomic eye. Thus proteomics, the cataloguing of all proteins in a system under study, took the stage: beetle collecting in new costume.

Wait, wait, don't tell me! We now know that many different parts of DNA are transcribed into RNA in addition to that which codes for protein, so we have transcriptomics, or catalogs of all RNA found in a sample of cells. And what about epigenomics, the cataloguing of every region where a person's DNA is chemically modified, because such modifications affect which regions are transcribed in the first place? We now also have interactomics (physiological networks), connectomics (documentation of every connection between every brain cell), diseaseomics (every gene changed in every disease), metabolomics (every gene in every metabolic pathway), cancer genomics (complete “omics” of cancer cells), nutrigenomics (every change in gene expression responding to a given food), cistromics (interactions along the chromosomes in the nucleus), microbiomics (DNA sequence of every microbe in or on your body or in a flea's gut), and museomics (sequence, don't just look at, the stuffed dodos). One might predict this will all culminate in a muddlenomics nightmare, with everyone desperately seeking recognition by marketing new terms the way trees in a tropical forest reach for the light.
Scientists are now trained to strategize as much through appearances and advertising as through critical, focused science, and that will remain so unless we respond to what's actually being learned. The ‘omics’ age reveals the scale of the problem we face in trying to understand biological causation beyond data collection. Genomes contain several billion nucleotides to sort through, including 20 to 120 thousand “genes,” depending on how and what you count. Most of the genome may be transcribed into RNA to serve many functions, some of which are totally unknown, and there are millions of other elements in the genome, and in the environment, that are likely important in determining the traits each of us bears.

Despite being in such current fashion, the hope that major causes will jump out like yellow peas is a mistaken conceptual heritage going back to Mendel himself. What we're learning is that the ground state of genetic causation seems to be minor rather than major effects, which are the exception. “Omics” is showing us why evolution generally works gradually on aggregates of small causal variation, as Darwin insisted, rather than by lucky jumps due to an individual gene or allele. This is what bottom-up approaches can look at and why top-down ones are often frustrated. The drive to enumerate all causes is a legacy of Mendel's carefully designed experiments and the reductionist thinking of the Enlightenment.1 “Omics” constitutes a confession, though often couched in braggadocio, that we don't have good enough hypotheses.

It's fun to mock the herd-like nature of scientists, following whatever is in fashion. That's an accurate characterization, as candid scientists are well aware. But if we want to observe the genome, or the brain, or a bone, to make inferences about some causal aspect of its structure or evolution, and we're up against the limits of current resolution, we create a natural market for technological innovation. This produces new knowledge that, despite its shallow faddishness, may signal something that could be truly profound: the end of four centuries of a particular way of understanding nature.1 Perhaps the experiment-based top-down approach to science is yielding to what will engender entirely new ways of thinking. When specific facts and theories don't provide sufficient explanations, going bottom up may not be as asinine as it may seem.

The transformations in life are not due to the magic wand of a single gene, but are complex. In an organism, they occur from the genomic bottom up. If the nature of genomic causation is more aggregate and fluid than in standard theory, perhaps for many purposes it is sufficient to know about the resulting trait, the player on the stage of life. We may not need to enumerate the underlying traits of the actor in the costume in any given performance. We can't infer the actor's traits from the role, nor can we predict the role by knowing the actor. And we usually can't predict an organism's traits from its genome, either.

Bottom's head was transformed into that of an ass, and the bewitched Titania awakened and called him to bed to caress his amiable cheeks (Fig. 2). Is our infatuation with ‘omics’ leading us to wander through an enchanted forest of wishful thinking and mistaken identity? The answer isn't clear: top-down approaches were unrealistically simplified, but bottom-up approaches may be unmanageably complex until new conceptual approaches are developed. It may be fun, but unfair, to judge bottom-up approaches on the faddish way the scientific herd uses and invents them.
In reality, they are not wholly “c-omical.”

Figure 1. Bottom in disguise. “Sleep thou and I will wind thee in my arms” (Titania, 4.1.37). By Arthur Rackham, 1908. Public domain.

Figure 2. Titania cuddling her Bottom. Painting by Johann Heinrich Füssli. Public domain.

This isn't entirely new. The foundation of modern science, back at least to Galileo and Hooke, was largely driven by new technology in the form of telescopes and microscopes, among other gadgetry. More indirectly, reliable clocks and astrolabes led to convenient worldwide travel, and in turn to “omical” specimen collection, which led to the striking insights of Darwin and Wallace. Galileo could begin transforming science with little more than a cheap child's toy, as the original telescopes were, but today the high cost of new technologies helps drive the frenetically careerist nature of science, making cost itself an issue.

The technologically driven metamorphosis of raw data into theory certainly involves disguises, rewards, punishments, and trials. We haven't a woodland fairy king to save us from genetic disease at a stroke by simple proclamation; instead, the resulting bottom-up analysis is revealing a complex causal reality that our top-down scientific heritage makes us reluctant to accept or digest. If we're just in love with technology, or stubbornly cling to dreamy illusions of simplicity in nature, we may one day come to our senses, saying, as Titania did (4.1.76), “Methinks I was enamored of an ass.” Or, if we take new findings seriously, they may show us which end is up.

I welcome comments on this column: kenweiss@psu.edu. I co-author a blog on relevant topics at EcoDevoEvo.blogspot.com. I thank Anne Buchanan and John Fleagle for critically reading this manuscript. My work receives financial assistance from the Penn State Evan Pugh professors fund.

Reference(s)